Startup Runs AI in Novel SRAM
Areanna claims 100 TOPS/W in simulation
By Rick Merritt, EETimes
July 22, 2019
SAN JOSE — In his spare time, an engineer at Tektronix sketched out a novel deep-learning accelerator, and now his two-person startup is the latest example of the groundswell of enthusiasm that deep learning is generating.
Behdad Youssefi defined an SRAM with specialized cells that can handle the matrix multiplication, quantization, storage, and other jobs needed for an inference processor. After four years of solo work on the concept, originally planned as a PhD thesis, he formed the startup Areanna with a colleague at Tektronix, with a Berkeley professor as an advisor.
In Spice simulations, the design delivers more than 100 tera-operations/second/watt (TOPS/W) when recognizing handwritten digits using 8-bit integer math. Youssefi claims it could beat Google’s TPU in computational density by an order of magnitude.
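For readers unfamiliar with the workload, the sketch below shows the kind of arithmetic being described: an 8-bit integer matrix multiply followed by re-quantization, here written in NumPy as a functional stand-in for what a compute-in-SRAM array would evaluate. The layer sizes, scale factor, and random data are illustrative assumptions, not details of Areanna's design.

```python
import numpy as np

rng = np.random.default_rng(0)

# One fully connected layer of an MNIST-style classifier: 784 pixels -> 10 classes.
x_q = rng.integers(-128, 128, size=784, dtype=np.int8)        # quantized input activations
w_q = rng.integers(-128, 128, size=(10, 784), dtype=np.int8)  # quantized weights

# Matrix multiply accumulated in wider integer precision, as a MAC array would.
acc = w_q.astype(np.int32) @ x_q.astype(np.int32)             # int32 accumulators

# Re-quantize accumulators back to int8 (scale chosen arbitrarily for this sketch).
scale = 2 ** -10
y_q = np.clip(np.round(acc * scale), -128, 127).astype(np.int8)

print("predicted digit:", int(np.argmax(y_q)))
```

The point of the sketch is only to show why an inference accelerator needs the multiply, accumulate, and quantization steps the article lists; the novelty claimed for Areanna's approach is doing those steps inside the SRAM array itself.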
To read the full article, click here