Can Sub-Arctic Temperature Circuits Solve the AI Energy Challenge?
The AI era is fostering unprecedented innovation across industries while creating acute challenges for the high-performance computing (HPC) industry, which must support exponential growth in power and performance demands. AI alone is poised to increase data center power demand by 160 percent by 2030, as queries from applications like ChatGPT require nearly 10X the electricity of a Google search. The HPC ecosystem is exploring new semiconductor designs to unlock next-generation infrastructures that deliver greater performance and energy efficiency. One promising area of semiconductor research looks to answer the question: can sub-arctic-cold, microscopic circuits add up to a more energy-efficient AI data center? Enter: cryogenic CMOS.
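As a rough illustration of the per-query gap cited above, the following back-of-envelope sketch compares the energy of a conventional search with an LLM query and scales the difference to a day of traffic. The specific per-query figures (0.3 Wh and 2.9 Wh) and the daily query volume are assumed, commonly cited estimates used here purely for illustration; they are not taken from the article itself.

```python
# Back-of-envelope sketch of the per-query energy gap.
# All numeric values below are assumed illustrative estimates,
# not figures from the article.

GOOGLE_SEARCH_WH = 0.3   # assumed average energy per conventional search, in Wh
CHATGPT_QUERY_WH = 2.9   # assumed average energy per LLM query, in Wh

# Ratio of LLM query energy to search energy (~10X, as the article notes).
ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"An LLM query uses roughly {ratio:.1f}x the electricity of a search")

# Scaling the same gap to a day of traffic shows why aggregate data center
# power demand climbs quickly as query volume shifts toward AI workloads.
QUERIES_PER_DAY = 1e9    # illustrative daily query volume
extra_mwh_per_day = (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH) * QUERIES_PER_DAY / 1e6
print(f"At {QUERIES_PER_DAY:.0e} queries/day, that is ~{extra_mwh_per_day:,.0f} MWh/day of extra demand")
```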
To read the full article, click here