Startup Claims AI Design Wins
Gyrfalcon's inference chip has broad targets
Rick Merritt, EETimes
1/25/2018 12:01 AM EST
SAN JOSE, Calif. — Startup Gyrfalcon is moving fast with a chip for running inference on deep neural networks, but it faces an increasingly crowded market for AI silicon. A year after receiving its first funding, the company is showing a working chip and claiming design wins in smartphones, security cameras, and industrial automation equipment.
Data centers typically train deep neural networks and run inference on them using banks of servers. Increasingly, client and embedded systems, from cars to handsets, are adopting dedicated accelerators to speed those inference jobs.
Apple, Google, and Huawei are already shipping smartphones with inference blocks in their custom SoCs, and Google and Microsoft have built inference accelerators for their data centers.
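To make the division of labor concrete, here is a minimal, illustrative sketch — not Gyrfalcon's architecture or software stack — of the kind of work an on-device inference accelerator offloads: weights are trained in floating point in the data center, quantized to 8-bit integers for deployment, and the device-side chip then performs the integer multiply-accumulate operations.

```python
# Illustrative only: the low-precision multiply-accumulate work that
# edge inference accelerators are built to handle. Not Gyrfalcon-specific.
import numpy as np

def quantize_int8(x):
    """Map float values to int8 with a simple symmetric scale (for illustration)."""
    max_abs = np.abs(x).max()
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    return np.round(x / scale).astype(np.int8), scale

# Pretend these weights came out of data-center training.
weights = np.random.randn(128, 256).astype(np.float32)
activations = np.random.randn(256).astype(np.float32)

w_q, w_scale = quantize_int8(weights)
a_q, a_scale = quantize_int8(activations)

# The accelerator's core job: integer multiply-accumulate, held in int32.
acc = w_q.astype(np.int32) @ a_q.astype(np.int32)

# Rescale back to float for the next layer (or stay quantized end to end).
output = acc.astype(np.float32) * (w_scale * a_scale)
print(output.shape)  # (128,)
```

Running these integer dot products in dedicated hardware, rather than on a general-purpose CPU, is what lets handsets and cameras execute trained networks within tight power budgets.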
To read the full article, click here