Why is Analog increasingly important in the Digital Era?
By Chipus Microelectronics
Since its invention in the '60s, the integrated circuit has advanced at an aggressive, cyclic pace, each cycle pushed by a specific disruptive application: military/aerospace, mainframe computers, minicomputers, personal computers, networking, mobile and, more recently, smartphones. The traditional approach has always been to make the most of digital and support it, where needed, with analog submodules. The result is "Small-A Big-D" devices, which profit the most from Moore's Law: highest-performance computing on cutting-edge process nodes, low cost per function and high production volume, to name a few benefits. Some current and future hot applications will continue to be pushed and enabled by Moore's Law, such as Augmented/Virtual Reality (AR/VR), Artificial Intelligence (AI, spanning Machine Learning and Deep Learning) and Big Data. However, the most discussed topic today, especially within the semiconductor community, differs a bit: the Internet of Things (IoT) is the next big wave, the one that will drive the semiconductor industry to the next level. And what will make this revolution possible is "More than Moore" rather than Moore's Law. In fact, IoT edge nodes will invariably consist of "Big-A Small-D" devices, and there are good reasons for it.
Internet of Things (IoT)
The Internet of Things is not a single market but a collection of markets and applications that aim to connect 'things' to the cloud. It may involve end users, industrial equipment or environmental monitoring, for instance, but all applications share some common technical and business requirements. A complete IoT solution will, in general, require a cloud structure of servers and services, a large number of edge nodes, and a gateway that connects those nodes to the cloud. Fortunately, these three elements can be developed independently once standards are in place (and standards have been under discussion for some time now), which enriches the design ecosystem and creates more opportunity for small and medium-sized businesses. Among the three, the most interesting element for hardware companies is definitely the IoT edge node, as it is both pervasive and the most open to innovation.
To make a long technical story short, an IoT edge node must be able to read something from the environment, process that information, store it and communicate it to the cloud. Moreover, everything must be executed securely and with minimal power consumption. These requirements demand broad expertise in IC design, spanning digital, analog and RF. Ultra-low-power design capability, in particular, becomes crucial for the long-lifespan edge nodes that most applications require.
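The read-process-store-communicate loop described above, with the radio kept off most of the time to save power, can be sketched in Python. This is a minimal illustrative model only; the class, method names and batch-uplink policy are assumptions for the sketch, not any particular product's firmware:

```python
from collections import deque

class EdgeNode:
    """Hypothetical duty-cycled IoT edge node: sense, process,
    buffer locally, and uplink in batches so the radio (usually the
    dominant power consumer) is energized as rarely as possible."""

    def __init__(self, batch_size=4):
        self.buffer = deque()          # local storage between uplinks
        self.batch_size = batch_size   # transmit only when a batch is ready

    def sense(self):
        # Placeholder for an ADC read of an environmental sensor.
        return 21.54

    def process(self, raw):
        # Minimal on-node processing: quantize to shrink the payload.
        return round(raw, 1)

    def transmit(self, batch):
        # Placeholder for a secure uplink (e.g. over a low-power radio).
        # Returns the number of samples sent.
        return len(batch)

    def tick(self):
        # One wake/sleep cycle; between ticks the node would sleep.
        self.buffer.append(self.process(self.sense()))
        if len(self.buffer) >= self.batch_size:
            sent = self.transmit(list(self.buffer))
            self.buffer.clear()
            return sent
        return 0
```

Even this toy version shows why the edge node is a mixed-signal problem: `sense` is analog, `process` and the buffer are digital, and `transmit` is RF, all under one power budget.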