Startup Claims AI Design Wins
Gyrfalcon's inference chip has broad targets
Rick Merritt, EETimes
1/25/2018 00:01 AM EST
SAN JOSE, Calif. — Startup Gyrfalcon is moving fast with a chip for inferencing on deep neural networks, but it faces an increasingly crowded market in AI silicon. A year after securing its first funding, the company is showing a working chip and claiming design wins in smartphones, security cameras, and industrial-automation equipment.
Data centers typically train deep neural networks and run inference tasks on them using banks of servers. Increasingly, client and embedded systems from cars to handsets are adopting accelerators to speed inferencing jobs.
Apple, Google and Huawei are already shipping smartphones with inferencing blocks in their custom SoCs. Google and Microsoft built inference accelerators for their data centers.