Startup Runs AI in Novel SRAM
Areanna claims 100 TOPS/W in simulation
By Rick Merritt, EETimes
July 22, 2019
SAN JOSE — In his spare time, an engineer at Tektronix sketched out a novel deep-learning accelerator, and now his two-person startup is the latest example of the groundswell of enthusiasm that deep learning is generating.
Behdad Youssefi defined an SRAM with specialized cells that can handle the matrix multiplication, quantization, storage, and other jobs needed for an inference processor. After four years of solo work on the concept, originally planned as a Ph.D. thesis, he formed the startup Areanna with a colleague at Tektronix and a Berkeley professor as an advisor.
In SPICE simulations, the design delivers more than 100 tera-operations per second per watt (TOPS/W) when recognizing handwritten digits using 8-bit integer math. Youssefi claims it could beat Google’s TPU in computational density by an order of magnitude.
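The article does not describe Areanna’s circuits, but the operations it names — 8-bit matrix multiplication followed by quantization — correspond to the standard INT8 inference flow that in-SRAM accelerators implement in analog or mixed-signal form. A minimal NumPy sketch of that flow, with shapes and scale factors chosen purely for illustration:

```python
import numpy as np

# Illustrative INT8 inference arithmetic: the multiply-accumulate and
# requantize steps an in-memory accelerator performs per layer.
# Sizes and scale values here are arbitrary examples, not Areanna's design.

rng = np.random.default_rng(0)

# 8-bit weights and activations, as they would be stored in the array.
weights = rng.integers(-128, 128, size=(64, 64), dtype=np.int8)
activations = rng.integers(-128, 128, size=(64,), dtype=np.int8)

# Multiply-accumulate in a wider (32-bit) accumulator to avoid overflow.
acc = weights.astype(np.int32) @ activations.astype(np.int32)

# Requantize the accumulator back to 8 bits for the next layer.
scale = 1 / 512.0  # example requantization scale
out = np.clip(np.round(acc * scale), -128, 127).astype(np.int8)

print(out[:8])
```

In a compute-in-memory design, the multiply-accumulate step happens inside the memory array itself rather than in a separate datapath, which is where the claimed efficiency and density advantages come from.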