Startup Claims AI Design Wins
Gyrfalcon's inference chip has broad targets
Rick Merritt, EETimes
1/25/2018 00:01 AM EST
SAN JOSE, Calif. — Startup Gyrfalcon is moving fast with a chip for inferencing on deep neural networks, but it faces an increasingly crowded market for AI silicon. A year after securing its first funding, the company is showing a working chip and claiming design wins in smartphones, security cameras and industrial automation equipment.
Data centers typically train deep neural networks and run inference tasks on them using banks of servers. Increasingly, client and embedded systems, from cars to handsets, are adopting accelerators to speed those inference jobs.
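As a rough illustration of that training-versus-inference split, the minimal NumPy sketch below (not Gyrfalcon's design or software) contrasts a training step, which needs a gradient and a weight update and is usually run on data-center servers, with the forward-only pass an edge inference accelerator is built to speed up, often using quantized 8-bit weights. The layer size, learning rate and quantization scheme are all illustrative assumptions.

```python
# Minimal sketch: training step vs. forward-only inference with int8 weights.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8)).astype(np.float32)      # weights of one dense layer
x = rng.standard_normal((1, 4)).astype(np.float32)      # a single input sample
y_true = rng.standard_normal((1, 8)).astype(np.float32) # target for the training step

# --- Training step (data center): forward pass, gradient, weight update ---
y_pred = x @ W
grad_W = x.T @ (2.0 * (y_pred - y_true))                 # gradient of squared error w.r.t. W
W -= 0.01 * grad_W                                       # SGD update

# --- Inference (edge accelerator): forward pass only, weights quantized to 8 bits ---
scale = np.abs(W).max() / 127.0                          # simple symmetric quantization scale
W_int8 = np.round(W / scale).astype(np.int8)             # 8-bit weights cut memory and power
y_infer = x @ (W_int8.astype(np.float32) * scale)        # dequantize-and-multiply for clarity
print(y_infer.shape)                                     # (1, 8)
```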
Apple, Google and Huawei are already shipping smartphones with inferencing blocks in their custom SoCs. Google and Microsoft built inference accelerators for their data centers.