AI ASICs Exposed!
Artificial intelligence, or AI, is really heating up these days. The technology has been around for decades, but lately it has become a major focus for applications such as data center analytics, autonomous vehicles, and augmented reality. Why the rebirth? The trend appears to be driven by two forces: the availability of data to train these systems and new technology that dramatically speeds up the training process. Let’s take a look at both of these trends.
Regarding the data, it is really the currency of AI. Without massive amounts of known results, machine learning and inference aren’t possible. Thanks to the huge global footprint and ubiquitous nature of a few key players, data stores are being built every day. Google has amassed a huge amount of empirical data associated with autonomous vehicle behavior. So have Tesla, Detroit, and every other car manufacturer, for that matter. Audi appears to be pulling ahead with the planned introduction of Level 3 capability (“no feet, hands or eyes”) for its flagship A8. Look here to learn more about this development.
To read the full article, click here