Startup Claims AI Design Wins
Gyrfalcon's inference chip has broad targets
Rick Merritt, EETimes
1/25/2018 12:01 AM EST
SAN JOSE, Calif. — Startup Gyrfalcon is moving fast with a chip for inferencing on deep neural networks, but it faces an increasingly crowded market in AI silicon. A year after it got its first funding, the company is showing a working chip and claiming design wins in smartphones, security cameras and industrial automation equipment.
Data centers typically train deep neural networks and run inference tasks on them using banks of servers. Increasingly, client and embedded systems, from cars to handsets, are adopting accelerators to speed those inference jobs.
Apple, Google and Huawei are already shipping smartphones with inferencing blocks in their custom SoCs. Google and Microsoft built inference accelerators for their data centers.
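The workflow the article describes, training in the data center and inference on a device-side accelerator, can be illustrated with a short, generic sketch. This is not Gyrfalcon's toolchain; it assumes TensorFlow as an arbitrary framework, a toy model, random placeholder data, and a hypothetical output file name.

```python
# Minimal sketch of the train-in-data-center / infer-on-device split.
# Model, data, and file name are illustrative placeholders.
import numpy as np
import tensorflow as tf

# 1) Training side: a toy image classifier standing in for a full DNN
#    training job that would normally run on banks of servers.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
x = np.random.rand(64, 32, 32, 3).astype("float32")
y = np.random.randint(0, 10, size=(64,))
model.fit(x, y, epochs=1, verbose=0)

# 2) Deployment side: convert the trained model into a compact, quantized
#    format that an embedded inference engine can run without the training
#    framework present.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable quantization
tflite_model = converter.convert()
with open("classifier_int8.tflite", "wb") as f:  # hypothetical file name
    f.write(tflite_model)
```

The general design point is that edge inference parts favor compact, integer-quantized models, which is why deployment flows like the one sketched above separate heavyweight training from the lightweight artifact that actually ships to the device.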