Ampere to Acquire OnSpecta to Accelerate AI Inference on Cloud-Native Applications

Boosts AI performance of Ampere® Altra® family across cloud and edge infrastructure

Santa Clara, Calif. – July 28, 2021 – Ampere® Computing today announced it has agreed to acquire AI technology startup OnSpecta, strengthening Ampere® Altra® performance on AI inference applications. The OnSpecta Deep Learning Software (DLS) AI optimization engine can deliver significant performance enhancements over commonly used CPU-based machine learning (ML) frameworks. The companies have already been collaborating and have demonstrated over 4x acceleration on Ampere-based instances running popular AI inference workloads. The acquisition will include an optimized model zoo with object detection, video processing and recommendation engines. Terms were not disclosed, and the acquisition is expected to close in August, subject to customary closing conditions.

“We are excited to welcome the talented OnSpecta team to Ampere,” said Renee James, founder, chairman and CEO of Ampere Computing. “The addition of deep learning expertise will enable Ampere to deliver a more robust platform for inference task processing with lower power, higher performance and better predictability than ever. This acquisition underscores our commitment to delivering a truly differentiated cloud native computing platform for our customers in both cloud and edge deployments.”

According to IDC Research, the AI server market is expected to exceed $26B by 2024, with an annual growth rate of 13.7%. Ampere customers are seeking ways to manage the costs and growing requirements of AI inference tasks in both centralized and edge-based infrastructure scenarios. DLS is a seamless binary drop-in library for many popular AI frameworks and will significantly accelerate inference on Ampere Altra. It enables use of the Altra-native FP16 data format, which can double performance over FP32 without significant accuracy loss or model retraining.
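To make the FP16-versus-FP32 point concrete, the framework-agnostic sketch below (illustrative only; it does not use the DLS library, whose interfaces are not described in this release) casts FP32-trained weights to FP16 for a single dense layer and compares the results: accuracy is preserved without retraining, and the weight footprint is halved, which is where much of the bandwidth-bound speedup on a CPU comes from.

```python
import numpy as np

# Toy "layer": a single dense matrix multiply with weights trained in FP32.
# This is a generic illustration, not the OnSpecta DLS API.
rng = np.random.default_rng(0)
weights_fp32 = rng.standard_normal((512, 512), dtype=np.float32)
activations = rng.standard_normal((1, 512), dtype=np.float32)

# FP32 reference result.
out_fp32 = activations @ weights_fp32

# Cast the same trained weights and activations to FP16; no retraining needed.
out_fp16 = activations.astype(np.float16) @ weights_fp32.astype(np.float16)

# Accuracy: the FP16 result tracks the FP32 result closely.
abs_err = np.abs(out_fp16.astype(np.float32) - out_fp32)
print(f"typical output magnitude: {np.abs(out_fp32).mean():.2f}")
print(f"max absolute error:       {abs_err.max():.4f}")

# Footprint: FP16 halves the bytes moved per weight.
print(weights_fp32.nbytes, weights_fp32.astype(np.float16).nbytes)
```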

“This is a natural progression of the strong collaboration we already have with Ampere,” said Indra Mohan, co-founder and CEO of OnSpecta. “Our team will greatly benefit from being a part of Ampere as we help build upon the great success of Ampere Altra and provide critical support to customers as they apply the Altra product family to a wide variety of AI inference use cases.”

“We have already seen the powerful performance and ease-of-use of Ampere Altra and OnSpecta on the Oracle OCI Ampere A1 instance,” said Clay Magouyrk, executive vice president of Oracle Cloud Infrastructure. “With DLS compatibility on all major open source AI frameworks including TensorFlow, PyTorch and ONNX, and the predictable performance of Ampere Altra, we expect to see continued innovation on OCI Ampere A1 shapes for AI inference workloads.”

About Ampere

Ampere is designing the future of hyperscale cloud and edge computing with the world’s first cloud native processor. Built for the cloud with a modern 64-bit Arm server-based architecture, Ampere gives customers the freedom to accelerate the delivery of all cloud computing applications. With industry-leading cloud performance, power efficiency and scalability, Ampere processors are tailored for the continued growth of cloud and edge computing.

About OnSpecta

OnSpecta’s software, DLS, significantly accelerates AI inference workloads in the cloud and at the edge. The company is headquartered in Redwood City, CA, and is led by its founders Indra Mohan (CEO) and Victor Jakubiuk (CTO). Its investors include Sage Hill Capital Partners, WestWave Capital, BMNT and GoingVC Partners.