Oriented FAST and Rotated BRIEF (ORB) Feature Detection Speeds Up Visual SLAM
In the realm of smart edge devices, signal processing and AI inferencing are intertwined. Sensing can require intense computation to filter out the most significant data for inferencing. Algorithms for simultaneous localization and mapping (SLAM), a navigation technique, entail complex image processing at the sensing stage to detect features. Because these algorithms run in real time, feature detection must be fast. Introduced in 2011, Oriented FAST and Rotated BRIEF (ORB) is one of the faster feature-detection algorithms. Nonetheless, cost and power constraints mean the underlying hardware must also be efficient, and software must be optimized and, for the sake of developer productivity, easy to use.
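To make ORB's role concrete, the sketch below runs OpenCV's ORB detector on a single grayscale camera frame; the file name and the nfeatures value are illustrative assumptions rather than details from the article.

```python
import cv2

# Load one camera frame in grayscale; ORB works on intensity data.
# "frame.png" is a placeholder path for this example.
image = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

# Create an ORB detector. nfeatures caps how many keypoints are kept,
# a typical knob when trading accuracy against a real-time budget.
orb = cv2.ORB_create(nfeatures=500)

# Detect FAST-based keypoints (with orientation) and compute rotated
# BRIEF descriptors in a single call.
keypoints, descriptors = orb.detectAndCompute(image, None)

print(f"Detected {len(keypoints)} keypoints; descriptor shape: "
      f"{descriptors.shape if descriptors is not None else None}")
```

In a SLAM front end, descriptors like these would be matched across successive frames to track camera motion and landmark positions.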
SLAM assesses where a moving agent is while also mapping the environment. Self-driving cars employ the technique, and so can other devices, such as autonomous mobile robots (AMRs), which are warehouse vehicles that move materials. The AMR is a successor to the automated guided vehicle (AGV), an older technology that followed a fixed path outlined, for example, by a painted stripe. Like a self-driving car, an AMR can employ cameras and other sensors that feed a SLAM system.