Disruption Ahead for Ethernet Chips
The network industry is decoupling software from hardware, enabling new opportunities in the Ethernet switching market.
Data centers keep demanding more performance because the titans of cloud computing--Apple, Google, Facebook, and Amazon--are constantly starved for faster connections, and old design methods are not keeping pace. Disaggregation is about to hit the Ethernet chip market.
Looking at the numbers, the market has seen a substantial shift toward merchant silicon and demand for higher-speed ports. In the data center, 100 Gbps revenue surpassed 10 Gbps revenue for the first time in 2Q18. Furthermore, 25 Gbps SerDes lanes are now more common in that space than any previous lane rate. These figures signal where the market is heading.
The way forward for chip design is increasingly defined by the transition from monolithic network switch ASICs--dependent mainly on TSMC's process node technology--to multi-chip or "chiplet" architectures. This evolution will shake up big silicon vendors such as Broadcom and Cisco, which have done business in much the same way for years.
To read the full article, click here
Related Semiconductor IP
- 100G MAC/PCS Ultra Ethernet
- Complete 1.6T Ultra Ethernet IP Solution
- Software solution targeting the configuration of Ethernet switches and endpoints in TSN networks
- Verification IP for Ethernet
- Multi-channel, multi-rate Ethernet aggregator - 10G to 400G AX (e.g., AI)
Related Blogs
- What Lies Ahead for the Automotive Industry in 2024
- Why Is Interoperability So Important for Ethernet Connectivity?
- How PCIe 7.0 is Boosting Bandwidth for AI Chips
- Ultra Ethernet Consortium Set to Enable Scaling of Networking Interconnects for AI and HPC
Latest Blogs
- Upgrade the Raspberry Pi for AI with a Neuromorphic Processor
- Securing The Road Ahead: MACsec Compliant For Automotive Use
- Beyond design automation: How we manage processor IP variants with Codasip Studio
- Cadence Extends Support for Automotive Solutions on Arm Zena Compute Subsystems
- The Role of GPU in AI: Tech Impact & Imagination Technologies