Last-Time Buy Notifications For Your ASICs? How To Make the Most of It
By Enrique Martinez, EnSilica
This article explores the history of the 600 nm process, the reasons behind its phase-out, and what the industry has to look forward to as smaller, more efficient chips become the new standard.
The semiconductor industry is continuously evolving, and while evolution is a good thing, it also means that some well-established technologies are inevitably reaching end of life. As those in the industry will know, many foundries have issued last-time buy notifications for their 600 nm ASICs as they pivot toward more efficient, smaller geometry nodes.
This will force companies in the automotive, aerospace, defense, telecommunications, and consumer electronics sectors to rethink their semiconductor designs. However, as we’ll discuss, that isn’t necessarily bad news.
Reviewing the Semiconductor Evolution
The journey of semiconductor technology has been marked by significant milestones. The 10 µm process developed in 1971 gave rise to the Intel 4004, which contained 2300 transistors and was the world’s first commercially produced microprocessor.
The pace of growth following this breakthrough was swift, with 6 µm and 3 µm processes following in 1974 and 1977, respectively. Each iteration improved on the last. By this time, Moore's Law had been established, predicting that the number of transistors on a chip would roughly double every two years as semiconductor technology advanced exponentially.
At the start of the 1980s, transistors were down to 1 µm in size, and companies were commonly packing more than 100,000 onto a chip. This number reached more than 1 million by the 1990s, and this is where the 600 nm process made its debut.
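The doubling trajectory above can be checked with simple arithmetic. The sketch below projects transistor counts under Moore's Law, assuming a doubling every two years from the Intel 4004's 2,300 transistors in 1971 (the starting figures come from the article; the exact doubling cadence is the classic rule of thumb, not a precise industry measurement):

```python
def projected_transistors(year, base_year=1971, base_count=2300):
    """Moore's Law projection: count doubles every two years from base_year."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# Projections roughly track the milestones in the text:
# early 1980s -> ~100,000 transistors; early 1990s -> several million.
for year in (1971, 1982, 1993):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Running this gives roughly 104,000 transistors for 1982 and 4.7 million for 1993, consistent with the milestones quoted above.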
By now, the term “sub-micron” had become a reality, and the 600 nm process played a central role. This era saw the introduction of CMOS technologies that could integrate up to 4 million bits of SRAM and 16 million bits of DRAM—a substantial leap forward at the time. Notably, during this period, Intel transitioned from the 486 processor, which utilized 600 nm technology, to the first generation of Pentium processors, which used a more advanced 350 nm process.
During the 600 nm era, 8-inch (200 mm) wafers became the industry standard, and 5 V logic levels were commonly used. This period also saw the development of various process variants, including BiCMOS and BCD technologies, which enabled mixed-signal applications that featured analog and digital signals. These innovations allowed products developed on 600 nm processes to remain in production for many years, used in everything from consumer electronics to industrial equipment.