How to Create Efficient IP Standards and Why You Should Care
Saverio Fazzari, Cadence Design Systems, Columbia, MD USA
Abstract:
The use of intellectual property (IP) is critical in modern system-on-chip (SoC) designs. Design teams must be able to select and bring together the right IP blocks to meet functional and time-to-market requirements. These blocks must interface and communicate effectively and efficiently. Standards are designed to make communication between design blocks more effective and to reduce the chance for errors that can lead to poor design quality. Should we still care about standards when working with IP? Can an efficient standard be created? What would that standard look like? This paper covers the design issues associated specifically with semiconductor IP components.
Why Care About Standards?
One definition of efficient IP standards is, “standards that are usable and communicate.” Both vendors and users must participate in the creation of these standards or face significant cost increases in terms of product development or lack of interoperability. If only the vendors participated, the resulting standards would be technically acceptable, but they might lack the interoperability between devices and applications that users require; that is, they would fail to communicate. If only the users participated, the resulting standards would span the relevant ecosystem, but they might lack the specifications needed for efficient interoperability; that is, they would fail usability.
Design challenges are growing dramatically, and product requirements dictate support for many capabilities within projects. These factors have driven the development of various industry standards throughout the world. For example, some video standards provide support for the appropriate technology needs, such as performance or protection of intellectual property rights. These standards are created with the promise that they will speed industry adoption and product development, foster innovation of more advanced products, and reduce consumer costs over time as the technology becomes commoditized. The success of the USB standard is a perfect example. It went from being an incremental replacement for the various interfaces on a PC to becoming the standard connection tool; for example, USB is now used to connect cameras directly to printers. The success and speed of adoption of this standard can in part be attributed to the wide participation of both vendors and users in its creation.
Consumers are constantly upgrading to new integrated devices that play games, take pictures and communicate. This “digital convergence” is what many companies are striving towards: a single device that serves multiple purposes, such as an integrated camera, MP3 player and phone. This increase in required functionality is typically coupled with a shorter design cycle. Moreover, people do not want to pay extra for items they believe should be part of the standard product. So a design company is stuck with the unpleasant task of adding expected capability along with unique features, while still keeping the price competitive.
Companies eventually must make a decision about their core competency and in-house expertise, which may lead to acquisition of other items that are necessary to create the product, and bring all of the pieces together. In this case, they must allocate additional time in the development cycle to do the integration and revalidation. As an IP consumer, one needs to be aware of all the features in the IP and what their impact is on the design.
In order to select a model, companies must evaluate the available solutions. Companies typically look at the model, the development process and the IP provider. This process can be time-consuming and costly depending on the various blocks needed. A company may not be able to devote enough time to do more than one or two evaluations during this process. It would be desirable to have an evaluation process that is straightforward and addresses all of the important concerns. Finding standards that allow the vendor to communicate information about the IP, and its applicability to the environment in which it will be used, is critical.
The goal of implementing IP standards is to facilitate quick time-to-market introduction, while also maintaining cost and quality constraints. Customers want access to functional IP blocks that can be used and reused effectively in their product development environment, and the IP providers want to deliver the models in the most effective manner. The tool providers want to accurately integrate all of the various IP components to produce an effective design. These are the goals that industry standards must strive to meet.
Standards make it easier to improve efficiency and increase the number and quality of available IP solutions. For example, the IP’s RTL model should work on multiple tools to allow greater flexibility. The model could be tested against a hardware standard to show functionality. Finally, it could use a library for implementation that supports advanced verification. With only a few tests, the IP can be leveraged throughout the design flow with confidence.
In our industry’s sphere of interest, we have various standards for IP behavior protocols, Electronic Design Automation (EDA) models, IP interchange and EDA tool interfaces. However, there are no current standards for measuring quality.
The practical nature of standards leads to some less than perfect solutions. Many factors impact the features of standards and how they will be implemented. For instance, a for-profit company must balance its support of standards with acquiring and maintaining a competitive edge in its industry, and some may want to make it easier for others to support their approach. Still others may want to leverage standards where they provide a prototype example for the latest technology. Companies can create divisions by blocking their competitors during the definition of standards, which can lead to multiple or competing standards for various capabilities. In the modern design chain, standards are a fact of life. The salient issue is to understand how they impact the ability of projects to be successful and cost-effective, while still maintaining quality.
This paper explores the various standards that can impact the creation of a design. Starting at the system level where an environment is created for the chip, the various models must be selected and then implemented in the design, and libraries must be made available to take advantage of the tool features that allow for effective implementations. This paper also explores the compelling reasons why full EDA, IP provider and user participation leads to more efficient standards.
IP Interchange
The process by which IP is bought, sold and integrated into designs is of paramount importance. Various groups have come together to create and modify approaches for sharing relevant information. The goal is to create an interchange that is usable, secure and effective. Without a proper interchange system, the IP may become unusable because of confusion between the requirements and the deliverables. There is also a risk that the IP may become insecure, costing a company legitimate revenue from illegal or untraceable usage. An ineffective business process could create substantial costs that may damage a company’s ability to engage in this industry.
In this arena, communication about the IP and verification of the IP are key. A crucial part of IP usage is the ability for users to select the correct blocks for their needs. Here is where the VSIA QIP metric can be effectively leveraged. QIP is a tool from the Virtual Socket Interface Alliance (VSIA) that allows communication of the quality of the IP. In the design environment, the SPIRIT consortium is driving a method for managing the flow of the IP data needed in an SoC verification and design environment.
IP quality can be determined objectively by scoring the IP with VSIA’s QIP worksheet. Scores can be compared between suppliers for a relative measure of quality. However, a stronger indication of quality can be obtained by verifying that the IP has been tested with the EDA supplier, foundry and IP developer in a flow that closely matches the one used by the chip development team. Confidence increases when the IP quality metrics align with the desired use.
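To make the idea of a relative quality comparison concrete, here is a minimal sketch in Python. The attribute names, weights and scores below are purely illustrative assumptions, not the actual VSIA QIP worksheet categories or scoring rules:

```python
# Hypothetical sketch of comparing IP suppliers using weighted quality
# attributes, loosely inspired by the QIP idea. Attribute names, weights
# and scores are illustrative assumptions, not the VSIA worksheet.

# Per-supplier scores (0-10) for each quality attribute.
scores = {
    "SupplierA": {"documentation": 8, "modeling": 7,
                  "interoperability": 9, "verification": 6},
    "SupplierB": {"documentation": 6, "modeling": 9,
                  "interoperability": 7, "verification": 7},
}

# Weights reflect how much each attribute matters to this project.
weights = {"documentation": 0.2, "modeling": 0.3,
           "interoperability": 0.3, "verification": 0.2}

def weighted_score(attrs: dict) -> float:
    """Combine attribute scores into a single weighted quality number."""
    return sum(attrs[k] * w for k, w in weights.items())

# Rank suppliers by weighted score, highest first.
ranked = sorted(scores, key=lambda s: weighted_score(scores[s]), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_score(scores[name]):.2f}")
```

The point of the sketch is that a single comparable number only emerges once the attributes and weights are agreed upon, which is exactly the role a standard worksheet plays.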
Design Tools and Methodology
There are many competing file formats in this area: Hercules vs. Calibre decks; TLF vs. Liberty; LEF vs. Milkyway FRAM; and Verilog vs. VHDL. Having competing tools in the EDA industry is good, but every IP supplier and tool vendor must finance the development and maintenance of all the parsers and interfaces. Often the syntax is easily specified, but the semantics may differ between tools. It costs time, money and human resources to effectively create these interfaces. That cost is compounded by the fact that it does not provide a competitive advantage to any company involved. It is simply the cost of doing business.
Traditionally, EDA companies have developed and maintained these parsers and interfaces in an inwardly focused manner. They have customer requirements for new capabilities that must be addressed. Customers are looking for features and not pushing for standards support. There never seems to be enough time to develop an industry standard first, so that the customer can be served by having only one format. IP providers must undertake return on investment (ROI) discussions to properly prioritize their development. EDA companies are generally driven by customer demands. The value of the standards must be demonstrated, communicated and promoted.
One recent example is Cadence’s Effective Current Source Model (ECSM) and the recent development of Synopsys’ Composite Current Source (CCS) model. Both models offer improved accuracy within a few percent of SPICE. Both work on leading edge tools that many customers request. However, failing to quickly develop an industry standard can divide the industry and customers, and create an increased IP and EDA development cost that must be passed on to the end-customer. The time-to-market cost or increased risk of failure also adds to the penalty paid. This is an example of how competition can move the industry forward, but it would be in everyone’s best interest for there to be a reconciliation of the two formats into a single industry standard. A good place for this to occur is within the OMC, which we discuss later.
In order to provide an industry-serving standard that achieves the promise of new innovation, fostering new technology and better pricing through competition, the industry must rethink how it is developing technology in all areas. The IP and EDA companies are dependent on the SoC developers who purchase the tools, IP, and foundry services. Traditionally these customers have been vocal only to suppliers with their needs. Tool and modeling requirements are often generated from these mostly private conversations.
In order to achieve the quickest and most efficient standards, there must be a gathering of industry-wide requirements. Those in academia are frequently working on these areas several years in advance of widespread need. Too often, requirements gathering occurs only after the leading-edge tools have been produced. Si2, Accellera and Sematech all do a good job of bringing the stakeholders to the table. The end results are often a well-thought-out set of requirements that show a common understanding of the solution space. Unfortunately, the value of this effort is usually degraded by the time it takes to arrive at a usable standard.
Even the best standards fall short when the industry is driven hard to evolve, and often a quick, practical and extensible standard is not available for industry use and implementation. It can take years to get a formal IEEE standard approved. No modern technology industry can wait for final ratification because of the high cost of lost opportunities. Again, for-profit companies cannot wait and must develop standards that may not be unified.
By improving communication between all parties on the base technology requirements, the efficiency gains could be used for product differentiation and general R&D spending, and returned to the customer in terms of lower prices or simply kept as profit. The technology standard, driven by the end-customer, is the key starting point.
Current State of EDA Standards
EDA standards can be useful, as they provide a foundation of trust that the IP is usable and works within the design methodology. Further trust can be developed through support for the design standards that operate between different tools.
IP representation provides another level of trust. Providing EDA views as part of the design kit can provide efficient integration. Standard cell and I/O libraries are shipped with static models that have been tested against a wide variety of tools and flows. Memory models are more difficult. They are usually generated dynamically at the customer site based on user configuration. In either case, it is helpful to the IP consumer to have ready-made views for use with all of the relevant tools. If the IP consumer must translate models, additional costs for model generation and for testing must be accounted for. Although translators are often available to ease integration, the verification cost can be high and ultimately reduces trust.
Finally, there are the standard interfaces to EDA tools. These represent inputs to the base analytical and design capabilities needed by today’s SoC designers. Currently there are well known and used standards for physical data exchange, PDEF, GDSII, etc.; functional representation with Verilog/VHDL; and timing information with Liberty, ECSM, DCL and TLF. As new capabilities are added, the speed in which standards are established will determine the efficiency of how IP and EDA capabilities are delivered.
A standards body that is serving the IP development industry is the Open Modeling Coalition (OMC), sponsored by Si2. The OMC is building on the success of the Open Access Coalition to bring together the users and providers of the IP to define some of the infrastructure needed to be successful. The coalition is attempting to standardize the various interfaces in IP development and usage. The requirements for advanced device characterization and model data at the newer process nodes are also critical for success. It is important to have all the members of IP ecosystem involved in defining the standards as early as possible to get them widely adopted. One example is the OMC facilitating ECSM evolution by gathering input from industry users and suppliers.
Typical Design Flows
- Cell Library Development Flow
Once characterization is completed with Virage Logic’s proprietary solution, the next step is to generate the various EDA views that compose the customer packages. Each EDA view has a software module to generate that particular view. Once all the views have been built and assembled, they are verified for consistency with each other and are correlated back to the original SPICE and GDS files to ensure the design was not altered during development.
Each view and vendor toolset is used to verify the models. This extensive testing matrix can be oppressive in terms of license, CPU and disk space requirements. It can also be difficult to evaluate the consistency of the models when there are occasional semantic differences.
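The multiplicative growth of this testing matrix can be illustrated with a short sketch; the view and toolset names below are hypothetical placeholders, not an actual flow:

```python
from itertools import product

# Hypothetical EDA views and vendor toolsets; a real IP provider's
# matrix is far larger, and these names are illustrative only.
views = ["liberty_timing", "verilog_functional", "lef_physical"]
toolsets = ["vendorA_sim", "vendorB_synth", "vendorC_pnr"]

def run_check(view: str, tool: str) -> bool:
    """Placeholder for launching a real verification job; each cell in
    the matrix consumes licenses, CPU time and disk space."""
    return True  # assume pass for the purposes of this sketch

# Every view must be exercised in every toolset, so the number of jobs
# is the product of the two lists -- this is what makes the testing
# burden oppressive as views and tools are added.
results = {(v, t): run_check(v, t) for v, t in product(views, toolsets)}
print(f"{len(results)} view/tool combinations verified")
```

Adding one new view or one new supported toolset adds a whole row or column of jobs, which is why each new format carries a recurring verification cost.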
- IP Development Flow
- Chip Design Flow
Heterogeneous flows usually require disparate views based on each tool’s capability and the authoring company. One traditional flow example uses Synopsys’ Design Compiler with Liberty for synthesis and Cadence TLF in Silicon Ensemble/Pearl for finalizing the chip’s physical design. Great care is needed to ensure the timing is accurate at all design stages to avoid manual rework due to tool inconsistencies.
In much the same way, external hard/firm IP kits may fall victim to different model and tool limitations. Real world testing in this situation is required to gain confidence in accuracy and reliability concerns. This is another added cost of using certain types of external IP.
Development Process Inefficiencies
There are several development-side issues in developing IP without a consistent standard. There is a cost associated with providing multiple formats with similar information. With each new EDA tool or supported view, the specific requirements are checked against all the information stored in the IP design database. Any requirements missing from the design database must be added, the characterization and modeling systems must be updated, and the verification must be enhanced to check for accuracy and completeness of the new features.
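The gap-checking step just described amounts to a set difference between what a new tool or view requires and what the IP design database already stores. A minimal sketch, with hypothetical attribute names:

```python
# What the IP design database currently stores per cell
# (attribute names are hypothetical, for illustration only).
database_attrs = {"cell_area", "pin_capacitance",
                  "nldm_timing", "leakage_power"}

# What a newly supported EDA view requires.
new_view_requires = {"cell_area", "pin_capacitance",
                     "ccs_timing", "dynamic_power"}

# Anything in the requirements but not in the database must be added,
# driving updates to characterization, modeling and verification.
missing = new_view_requires - database_attrs
print("Must add to database:", sorted(missing))
```

Each item in the resulting gap translates into new characterization runs, model-generation changes and verification work, which is the cost the text describes.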
- Format Interpretation
Consistency of usage across EDA tools and models is also a risk. Active driver models, such as ECSM and CCS models, can be used in a myriad of ways in various tools to analyze different behaviors. There is no guarantee that results from these models will match, especially from one tool to another.
- Language Impact
With the evolution of Verilog, a similar pattern is emerging. SystemVerilog represents the next stage of the language’s development. Many software-centric features are proposed to enable advanced verification techniques and better simulation performance. The problem is that the standard does not show a clear path for support by the tools that are used in the design chain. For example, is code that works in the synthesis tool also supported by the simulator? IP vendors are left with the uncertain task of trying to figure out which subset is truly reusable across the various tools. The user suffers because the language features are not available in all the tools or the available IP.
As an offshoot of this issue, the use of external files to provide additional constraints to the tools, in order to help control performance, is growing. These files start as proprietary formats that are then made available to the industry once they have been established. The issue then becomes what information belongs in the language versus the external files. The IP must now address what information is needed by the different tools.
Why do you care? What’s next?
There is extensive work currently ongoing in standards relevant to the development, acquisition, and utilization of IP. SPIRIT, VSIA QIP, and OMC are examples of such important industry standards efforts. Experience has shown that early and deep involvement by key stakeholders (EDA vendors, IP developers, and end users) is critical to developing effective IP standards.
All evidence points to earlier and more active involvement by the interested parties being the key to developing efficient IP standards.
VSIA QIP
http://www.vsia.org/pillars/IP_Quality_Pillar.htm
The VSIA QIP defines the key quality attributes of any IP core required to make it properly functional and efficiently reusable in SoCs. It is a metric designed to motivate the use and measure the presence of these quality attributes, which include scores for documentation, modeling content, interoperability and verification. The average IP consumer should care because these are excellent starting points for understanding the value of each piece of IP. As the QIP standard matures and adoption becomes more widespread, the IP consumer will have an advantage in quickly surveying what is available with reasonable certainty. Secondly, a higher QIP score will typically reduce integration costs.
SPIRIT
http://www.spiritconsortium.org
SPIRIT is an industry-level consortium developing standards for IP description, as well as tools that raise automation levels, reduce costs, improve ease-of-use, and increase flexibility in IP selection and integration. It is focused on addressing a clear problem and producing useful technology. The original concept was driven by customers who care about the integration issues associated with their SoC designs. Two of the initial founders were Philips and ST Microelectronics, two large consumers of both commercially available IP and EDA tools. They, along with ARM, were able to get all of the major EDA providers to cooperate in the creation of the standard.
The SPIRIT standard focused on leveraging existing technology to address issues. There was a required commitment from members to actively enable creation of the standard, meaning that members could not just listen in and not participate. It also helped to have aggressive deadlines and timely delivery of releases. In fact, the first release, which focused on RTL integration, is already used in production today. This created an ecosystem and a basic set of examples that make it easier for people to reuse the technology in other tools and designs.
OMC
http://www.si2.org
The Open Modeling Coalition, sponsored by Si2, has been created to address the accuracy needs of future technology nodes, such as 65nm, for IP development and usage. There are several parts of the development flow that will be standardized: device characterization, device modeling, and interfaces with the various tools. It will support key aspects of IP development. The following description is taken from the OMC group charter:
“The goals of the OMC are to:
- Define a consistent modeling and characterization environment for standard cells, IO cells, macro blocks and memory model libraries, etc. for predominantly digital design
- Enable the cost effective creation of more technically capable library models than those that currently exist
- Develop model characterization and generation capabilities to supply more accurate and consistent models.”
The OMC focuses on creating useful standards that all companies in the design ecosystem, both partners and competitors, can benefit from. This provides the technical competency to create a truly usable standard that companies are interested in.
The average IP consumer should care because standards that do not provide integration and accuracy throughout the flow lead to poor and inconsistent results. This results in increased design iterations, increased integration costs for model regeneration and higher prices from IP providers and EDA tool suppliers. Innovation will happen when we agree to use a common infrastructure and compete solely on the basis of algorithms and price, not retaining customers through transition costs.
What is needed going forward, in our opinion? Certainly, greater customer involvement in defining accuracy and performance targets in the design flow. More importantly, customers must join technical development groups, backed by commitments and donations from the EDA vendors and IP developers of technology that eases the integration transition and proves out the approach; this will help everyone attain accuracy and performance goals.
Companies want standards to help them be more successful. These standards must be able to support the requirements of all members in the design chain. The tool providers, IP users, and IP providers must be able to use their partners’ technology. For example, SystemVerilog models should work in a Cadence simulator and Synopsys synthesis tool flow, as well as in a Mentor Graphics simulator and Cadence synthesis tool flow. Today, side files are often needed to configure tools and models to work together properly. This is hindering widespread adoption. If the standard, as well as its usable portion, is clearly defined, it can be successful.
Conclusions
Standards are a powerful tool to improve the effectiveness of IP. They can also help throughout the development process. Standards allow information to be communicated in a manner that is well understood by all parties. The IP and tool providers must be able to understand the format and interpret the results correctly. The key driver in all of this is the end customer. Customers must be able to share their requirements and see the standards through to deployment. The value proposition must be there, and the provider companies would do everyone a service by working together on the base technology as they work to advance the industry with new innovation. Standards are an effective tool when they enable usage and communication of the data.
Saverio Fazzari is the Technical Marketing Director responsible for compliance testing in the Cadence OpenChoice IP program. He can be reached at Saverio at cadence dot com
Dan Moritz is the QA Program Manager for Virage Logic Corporation’s Logic and I/O products. He can be reached via email at Dan.Moritz at viragelogic dot com