Automotive vision processing moving into the camera
Advanced driver assistance systems (ADAS) automate, adapt, or enhance vehicle functions to increase safety and improve the driving experience. These systems use sensors such as radar, ultrasound, and especially standard digital cameras to capture their surroundings. Video analytics of the captured images provides the driver with extra information and warnings, or can even autonomously adapt the vehicle's speed and steering.
Market research firm ABI Research forecasts that the ADAS market will grow from US$11.1 billion in 2014 to US$91.9 billion by 2020, passing the US$200 billion mark by 2024. ADAS packages have long been available as optional extras on luxury and executive vehicles, but in recent years the more popular systems have made their way into affordable family cars.