What is the accuracy of Dragonfly?
Author: Otilia · Comments: 0 · Views: 27 · Date: 25-12-19 04:11
Dragonfly is Onit's cutting-edge computer-vision indoor localization technology based on visual SLAM that provides accurate indoor position and orientation to forklifts, automated guided vehicles (AGVs), autonomous mobile robots (AMRs), drones, and any other moving vehicle or asset. Dragonfly enables RTLS solutions for analytics, productivity, and safety in GPS-denied environments such as warehouses, production plants, and factories. Dragonfly delivers the X-Y-Z coordinates and 3D orientation of any moving device with centimeter-level accuracy by analyzing, in real time, the video stream coming from a standard wide-angle camera connected to a small computing unit. Dragonfly represents the state of the art in indoor localization for mobile devices in places where GPS/GNSS cannot be used, and it is far more competitive than other indoor localization technologies based on LiDAR, Ultra-Wideband (UWB), Wi-Fi, or Bluetooth received signal strength.

HOW DOES IT WORK?

During the system setup phase, the wide-angle camera sends the video feed of its surroundings to the computing unit.
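Concretely, each localization update can be thought of as a timestamped 6-DoF sample: a position in the geo-referenced map frame plus an orientation. A minimal sketch of such a record in Python (the class and field names are illustrative assumptions, not Dragonfly's actual API):

```python
from dataclasses import dataclass

@dataclass
class PoseUpdate:
    """One hypothetical 6-DoF localization sample: position in meters,
    orientation as a unit quaternion (w, x, y, z)."""
    timestamp: float  # seconds
    x: float          # meters, in the geo-referenced map frame
    y: float
    z: float
    qw: float = 1.0   # identity orientation by default
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0

# At 30 Hz, consecutive samples arrive roughly 33 ms apart:
sample = PoseUpdate(timestamp=0.033, x=12.41, y=7.08, z=1.95)
```

A consumer (an RTLS dashboard, a safety module) would subscribe to this stream and react to each sample as it arrives.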
The computing unit extracts the features of the environment in each of the frames and creates a 3D map of the surroundings (which is geo-referenced using a DWG file of the area). During use in production, the wide-angle camera sends the real-time video feed of its surroundings to the computing unit. The computing unit extracts the features of the environment in each frame and compares them with those in the previously created 3D map. This process allows Dragonfly to calculate, at more than 30 Hz, the X-Y-Z position and orientation in 3D space of the camera (and thus of the mobile asset on which it is mounted). Dragonfly is an accurate indoor location system based on computer vision: the location is computed in real time by our computer-vision algorithm, using only an on-board camera and a computing unit on the device being tracked. Computer vision, odometry, and artificial intelligence are combined to deliver a precise location for multiple applications.
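The matching step described above can be illustrated with a toy nearest-neighbour search: each feature descriptor observed in the current frame is compared against the descriptors stored in the pre-built map, and the best matches anchor the pose estimate. This is a deliberately simplified stand-in for a real SLAM pipeline, using made-up 16-bit binary descriptors rather than Dragonfly's actual internals:

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary descriptors packed as ints."""
    return bin(a ^ b).count("1")

def match_features(observed, map_descriptors, max_dist=10):
    """For each observed descriptor, find the closest map descriptor.
    Returns (observed_index, map_index) pairs under the distance threshold."""
    matches = []
    for i, d in enumerate(observed):
        best_j, best_desc = min(enumerate(map_descriptors),
                                key=lambda kv: hamming(d, kv[1]))
        if hamming(d, best_desc) <= max_dist:
            matches.append((i, best_j))
    return matches

# Toy 16-bit descriptors: the observed one differs from map_desc[0] by one bit.
map_desc = [0b1010101010101010, 0b1111000011110000]
observed = [0b1010101010101011]
print(match_features(observed, map_desc))  # [(0, 0)]
```

In a real system the matched 2D features and their known 3D map positions would then feed a pose solver (e.g. PnP with outlier rejection) to produce the X-Y-Z position and orientation.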
It is an excellent solution for the precise indoor tracking of forklifts, AGVs, AMRs, robots, and drones (in 3D space). Dragonfly is far more competitive than LiDAR, UWB, and other radio-signal-based technologies, for which an ad-hoc infrastructure must be designed, set up, calibrated, and maintained for each specific venue. No receivers, no RFID tags, no antennas, no nodes, no magnetic stripes: nothing has to be deployed throughout the venue. You need only a camera and a computing unit on board your mobile vehicles. No technical skills required, no difficult instructions, and no need for the error-prone and time-consuming calibration of an ad-hoc UWB infrastructure. SLAM technology is far more robust to environmental changes than LiDAR, which struggles to maintain accuracy in environments where obstacles change over time. Dragonfly cameras are easier to calibrate and more robust to changes in the environment. Dragonfly's distributed architecture makes the solution reliable by eliminating the central server that would otherwise be a single point of failure (SPOF).
This also means that Dragonfly can scale with the size and growth of your fleet of moving vehicles. Dragonfly can work completely offline, either on a computing unit on board the forklift, AGV, AMR, drone, or robot, or on an on-premise server. Dragonfly allows you to optimize your operations, increasing the productivity and effectiveness of the tracked devices; together with its competitive price, this makes the ROI higher than with any other technology currently on the market. Improve operations thanks to real-time visibility of the actual usage and paths of your mobile vehicles (such as forklifts), avoiding under- and over-utilization and maximizing fleet performance. Know the location of every moving asset in real time to prevent accidents between human-driven vehicles (such as forklifts) inside warehouses and production facilities, enabling V2V (vehicle-to-vehicle) and V2P (vehicle-to-pedestrian) collision-avoidance applications. Speed up productivity by tracking the location of each moving asset to indirectly know the position of each handling unit on the floor, racks, and shelves.
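As a rough illustration of how real-time positions enable V2V/V2P collision warnings, one can compute pairwise distances between tracked assets and flag any pair closer than a safety threshold. The asset names and the 3-meter threshold below are invented for the example; a production system would use predictive (velocity-aware) logic rather than this instantaneous check:

```python
import math

def proximity_alerts(positions, min_gap=3.0):
    """Return pairs of asset IDs whose 3D distance is below min_gap (meters)."""
    ids = list(positions)
    alerts = []
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            if math.dist(positions[ids[i]], positions[ids[j]]) < min_gap:
                alerts.append((ids[i], ids[j]))
    return alerts

# Hypothetical snapshot of forklift and pedestrian-tag positions (x, y, z, meters):
snapshot = {
    "forklift-1": (10.0, 5.0, 0.0),
    "forklift-2": (30.0, 5.0, 0.0),
    "pedestrian-7": (11.5, 5.5, 0.0),
}
print(proximity_alerts(snapshot))  # [('forklift-1', 'pedestrian-7')]
```

Because every asset publishes its own pose, this kind of check can run either on a shared on-premise server or peer-to-peer between vehicles.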