Startup Claims AI Design Wins
Gyrfalcon's inference chip has broad targets
Rick Merritt, EETimes
1/25/2018 00:01 AM EST
SAN JOSE, Calif. — Startup Gyrfalcon is moving fast with a chip for inferencing on deep neural networks, but it faces an increasingly crowded market for AI silicon. A year after receiving its first funding, the company is showing a working chip and claiming design wins in smartphones, security cameras and industrial automation equipment.
Data centers typically train deep neural networks and run inference tasks on them using banks of servers. Increasingly, client and embedded systems, from cars to handsets, are adopting accelerators to speed those inferencing jobs.
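For illustration, the sketch below shows that split in PyTorch (an assumption for this example; it is not Gyrfalcon's toolchain or software stack). A toy network is trained the way it would be on server hardware, then frozen and exported so an on-device inference accelerator only needs to execute the forward pass; the model shape, data and file name are placeholders.

import torch
import torch.nn as nn

# Toy model standing in for a deep neural network.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training loop; in practice this runs on banks of servers with real data.
for _ in range(100):
    x = torch.randn(32, 64)          # placeholder input batch
    y = torch.randint(0, 10, (32,))  # placeholder labels
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()

# Freeze and export the trained graph; an embedded runtime or inference
# accelerator loads this artifact and runs only the forward pass.
model.eval()
scripted = torch.jit.trace(model, torch.randn(1, 64))
scripted.save("edge_model.pt")

In a real deployment the exported model would typically also be quantized and compiled for the target accelerator before it ships on the device.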
Apple, Google and Huawei are already shipping smartphones with inferencing blocks in their custom SoCs. Google and Microsoft built inference accelerators for their data centers.