Startup Runs AI in Novel SRAM
Areanna claims 100 TOPS/W in simulation
By Rick Merritt, EETimes
July 22, 2019
SAN JOSE — In his spare time, an engineer at Tektronix sketched out a novel deep-learning accelerator, and now his two-person startup is the latest example of the groundswell of enthusiasm that deep learning is generating.
Behdad Youssefi defined an SRAM with specialized cells that can handle the matrix multiplication, quantization, storage and other jobs needed for an inference processor. After four years of solo work on the concept, originally planned as a PhD thesis, he formed the startup Areanna with a colleague from Tektronix and a Berkeley professor as an advisor.
In SPICE simulations, the design delivers more than 100 tera-operations per second per watt when recognizing handwritten digits using 8-bit integer math. Youssefi claims it could beat Google’s TPU in computational density by an order of magnitude.
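For readers unfamiliar with the arithmetic involved, the sketch below illustrates the kind of 8-bit integer matrix-vector multiply with requantization that the article says Areanna's modified SRAM cells perform in place. The shapes, scale factor, and NumPy implementation are purely illustrative assumptions, not details of the actual design.

```python
# Illustrative sketch only: INT8 matrix-vector multiply with a wide accumulate
# and requantization back to INT8 -- the class of operation an in-SRAM
# inference engine would fold into its memory array.
import numpy as np

def int8_matvec_requant(weights_q, activations_q, scale, zero_point=0):
    """Multiply INT8 weights by INT8 activations, accumulate in INT32,
    then rescale and saturate the result back to INT8."""
    acc = weights_q.astype(np.int32) @ activations_q.astype(np.int32)  # wide accumulate
    out = np.round(acc * scale) + zero_point                           # requantize
    return np.clip(out, -128, 127).astype(np.int8)                     # saturate to INT8

# Example: a 4x8 weight tile applied to an 8-element activation vector.
rng = np.random.default_rng(0)
w = rng.integers(-128, 128, size=(4, 8), dtype=np.int8)
x = rng.integers(-128, 128, size=8, dtype=np.int8)
print(int8_matvec_requant(w, x, scale=0.002))
```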