Oriented FAST and Rotated BRIEF (ORB) Feature Detection Speeds Up Visual SLAM
In the realm of smart edge devices, signal processing and AI inferencing are intertwined. Sensing can require intense computation to filter out the most significant data for inferencing. Algorithms for simultaneous localization and mapping (SLAM), a navigation technique, entail complex image processing at the sensing stage to detect features. Because these algorithms execute in real time, they must perform feature detection quickly. Introduced in 2011, Oriented FAST and Rotated BRIEF (ORB) is one of the faster feature-detection algorithms. Nonetheless, cost and power constraints mean the underlying hardware must also be efficient, and the software must be optimized and, for the sake of developer productivity, easy to use.
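To make the speed claim concrete, the detector at the heart of ORB is FAST, which decides whether a pixel is a corner using only comparisons against a small ring of neighboring pixels. The sketch below is a simplified, hypothetical illustration of the FAST-9 segment test, not the article's or any library's implementation; production code (e.g., OpenCV's ORB) adds image pyramids, orientation, and BRIEF descriptors on top.

```python
# Simplified FAST-9 segment test: a pixel is a corner if at least n
# contiguous pixels on a radius-3 circle around it are all brighter
# than center + t, or all darker than center - t.

# The 16 offsets of a Bresenham circle of radius 3.
CIRCLE = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
          (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_fast_corner(img, x, y, t=50, n=9):
    """Return True if (x, y) passes the FAST-n segment test on grayscale img."""
    p = img[y][x]
    ring = [img[y + dy][x + dx] for dx, dy in CIRCLE]
    # Check for a contiguous arc of "brighter" pixels, then "darker" ones.
    for labels in ([v > p + t for v in ring], [v < p - t for v in ring]):
        run, best = 0, 0
        for flag in labels + labels:  # doubled list handles wrap-around
            run = run + 1 if flag else 0
            best = max(best, run)
        if best >= n:
            return True
    return False

# Synthetic test image: a bright 10x10 square on a dark background.
img = [[200 if 5 <= y <= 14 and 5 <= x <= 14 else 0 for x in range(20)]
       for y in range(20)]

print(is_fast_corner(img, 5, 5))    # square corner -> True
print(is_fast_corner(img, 10, 10))  # flat interior -> False
```

Because the test is a handful of integer comparisons per candidate pixel, it maps well onto the constrained edge hardware the article describes.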
SLAM estimates where a moving agent is while simultaneously mapping its environment. Self-driving cars employ the technique, and so can other devices, such as autonomous mobile robots (AMRs), the warehouse vehicles that move materials. The AMR is a successor to the automated guided vehicle (AGV), an older technology that followed a fixed path marked, for example, by a painted stripe. Like a self-driving car, an AMR can employ cameras and other sensors that feed a SLAM system.