Cool AI chips are green
When chips get hot, thermal management quickly becomes difficult. The chips need more complex power grids, more expensive packages, and active cooling fans, for instance. Many GPU-based systems even need water cooling. And even if you can get away with passive cooling alone, a little heat still strongly affects the housing and mechanical design: there needs to be enough surface material to dissipate the heat, which makes the final device larger, heavier, and more expensive. In addition, if your power comes from a battery, that battery needs more capacity too, again adding cost and bulk. That's a lot of reasons to keep power consumption low.
And chips do burn a lot of power, especially when they're running compute-intensive deep learning workloads. Deep learning algorithms require many tera-ops of multiplications per second and move massive amounts of data. At videantis, we designed our processor architecture from the ground up to consume as little energy as possible. Optimizing for low power has always been in our DNA at videantis, with our history of designing for battery-operated mobile phone applications.
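To get a feel for why deep learning workloads reach tera-op rates, here is a back-of-envelope sketch. The layer dimensions and frame rate below are illustrative assumptions, not figures from any specific videantis workload; a real network stacks dozens of such layers, which is how the total climbs into the tera-ops-per-second range.

```python
def conv_macs(h, w, c_in, c_out, k):
    """Multiply-accumulate count for one stride-1 convolutional layer
    over an h x w feature map with a k x k kernel."""
    return h * w * c_in * c_out * k * k

# Assumed example layer: 224x224 feature map, 64 -> 64 channels, 3x3 kernel
macs_per_frame = conv_macs(224, 224, 64, 64, 3)

fps = 30  # assumed real-time frame rate
macs_per_second = macs_per_frame * fps

print(f"MACs per frame (one layer): {macs_per_frame / 1e9:.2f} G")
print(f"MACs per second at {fps} fps: {macs_per_second / 1e9:.1f} G")
```

One such layer already costs roughly 1.85 billion multiply-accumulates per frame; multiplied across a full network and a real-time frame rate, the arithmetic (and the data movement feeding it) dominates the chip's power budget.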
To read the full article, click here