SoC designers describe their 'best practices'

By EE Times
April 8, 2002 (3:01 p.m. EST)
URL: http://www.eetimes.com/story/OEG20020405S0071

Bleeding-edge designers today confront 20 million-gate, six-level-metal designs, in projects that may be divided among teams of 40 or 50 engineers. It's not a job for the faint of heart. Yet despite the seemingly impossible mission, these designs are getting done and out the door, thanks to developers' assiduous attention to best design practices.

Ideally, best practices are clearly identified, scrupulously documented procedures for facilitating high-quality system-on-chip design. They may take the form of design requirements; intellectual property (IP) reuse rules; RTL-coding conventions; verification plans; and strategic decisions regarding methodologies, staffing and resources. One contentious question is whether best practices evolve based on current EDA tool capabilities and limitations, or in spite of them.

For this special report, editors at EE Times and sister publication ISD polled design managers in Europe and the United States about their best practices for complex SoC design. The semiconductor and OEM design managers told us how their companies are tackling such issues as IP and tool build-or-buy decisions as the industry scouts the best paths to best practices.

"Success is much more than just doing your bit correctly: It is also making sure that at every stage in the product introduction process where your bit has influence, it does so uneventfully." That's how ARM Ltd.'s Ian Phillips identified the challenge for creating today's best design practices at a panel discussion on "How to Choose Semiconductor IP" at the Design Automation and Test in Europe (DATE) conference in Paris.

But in many cases the "bits" aren't yet stitched into a coherent piece, Mentor Graphics Corp.'s Pierre Bricaud said in an interview with EE Times. "At the moment, we have islands of automation of different design tasks, each based on a different language. The approaches are powerful, so nobody will give up the language or the approach used for their own particular task for the sake of the overall flow," Bricaud said.

Eventually the logjam will break, he predicted, and some variant of C/C++ will become the "lingua franca" for design and test applied to hardware and software.

Broadcom Corp.'s Sophie Wilson argued that good design is not just a matter of what you do or the EDA tools you do it with. In Wilson's view, best practices come from hiring and working beside the best people. The practices and EDA tools, she noted, are always in flux and secondary to the engineers.

Wilson outlined three trends that loop back to affect processor and SoC design. First, "nowadays, [workstation] machines are about 500 times more powerful than in the early 1980s, and we can afford more of them, so we can do much more modeling and simulation.

"Second, there's much more information and training about how to do things: The body of knowledge of what works well when designing a processor or an SoC is much larg er now. It was virtually nonexistent when the ARM1 was designed." Third, Wilson identified the burgeoning of software programmability and niche acceleration: "Now there is a trend toward chips' getting more 'specialized' on top of their basic capability to do something."

As the task of design has become a more expansive and farther-flung activity, there is increasing evidence that the processor or SoC directly reflects the engineering organization applied to it. In addition, increasingly larger design teams dictate more formal interactions among the architecture, microarchitecture, logic and physical design activities. That adds time to the process.

Are functional verification and test keeping up with the complexity of systems-on-chip? "Only when aided by the very best verification people," said Wilson with emphasis.

There are, of course, some tasks that commercial EDA tools do not address well. "Timing closure is particularly difficult. The PLL is our own design, and it was not designed in a commercial environment. And we have a whole set of physical tools to prepare a design for its process target," she said.

Tools: help or hindrance?

What can EDA vendors do to support best practices more effectively?

Sun's Sunil Joshi considers best practices "as orthogonal to the tools. To me, best practices are a design discipline, making sure that you're doing the right things, in spite of the tools. I see those as two orthogonal issues."

Nvidia Corp.'s Chris Malachowsky both agreed and disagreed with Joshi: "It's really the context and where you're applying it. I think our best practices would involve a lot less internal CAD development if we could have a single, industry-supported way of specifying whatever that 'something' is."

"Without a doubt, the tool capabilities force the design methodologies that we use, and based on those design methodologies we've come up with our best practices," said Texas Instruments Inc.'s Mike Fazeli. He called for EDA vendors to standardize on in terfaces: "I think we need to have that capability so that we'd be able to be more flexible on our use of the tools and the design methodology that we put around them."

Also on Fazeli's wish list: "being able to be more flexible as far as bringing point solutions into the overall design environment and integrating them, and having more flexibility on the choices of the tools that we have for any given solution."

Malachowsky believes the one overriding best practice is, "Don't let anything stand in the way of success. If you think of a way to move ahead, to overcome whatever the obstacle is, you're going to make the investment, whatever it is. But you want to succeed in as highly leveraged a way as you can."

Invest in success

"If that means buying tools, I go out and buy tools; I'm not cheap. In the procurement of tools, you don't want to bear an expense; but . . . if it's an investment in my future and my success, then it's justified," Malachowsky said.

"We capture as much as we can in automated flows, which are documented and posted on the Web," he continued. "Design requirements that enable the flow of best practices are captured with as much automation as possible. That's what my group does a lot of: trying to capture what has been determined as a best practice, or a best result, into an automated set of flow progressions, tool progressions or checks to ensure adherence to certain things."

As for transferable IP, one challenge is the mushrooming amounts of information required to integrate IP blocks into successful systems-on-chip, said Sumit DasGupta of Motorola Inc.

"The 'must-have' set of models and files to accompany the transfer of a synthesizable-from-RTL hardware IP block is getting larger as design extends upward in abstraction and final systems-on-chip become more complex," said DasGupta.

DasGupta's list includes VHDL or Verilog files, synthesis pragmas, testbenches, test vectors, bus-functional models and "black box" interactivity models. Those are accompanied by a wealth of design documentation, prescribed target information, and software drivers and software monitors that wrap around the intellectual property.
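DasGupta's deliverables list is, in effect, a completeness checklist for an IP hand-off. A minimal sketch of such a check, assuming an illustrative directory layout and file names rather than Motorola's actual reuse standard, might look like this:

    # Hypothetical IP-transfer manifest check, loosely following DasGupta's list of
    # deliverables. Category names and file patterns are illustrative assumptions.
    import glob
    import os

    REQUIRED_DELIVERABLES = {
        "rtl":           ["rtl/*.v", "rtl/*.vhd"],       # Verilog or VHDL source
        "synthesis":     ["syn/constraints.sdc"],        # constraints and pragmas
        "verification":  ["tb/*.sv", "tb/vectors/*"],    # testbenches and test vectors
        "models":        ["models/bfm*.sv"],             # bus-functional / black-box models
        "documentation": ["doc/*.pdf"],                  # integration and design documents
        "software":      ["sw/driver*.c"],               # drivers and monitors wrapping the IP
    }

    def missing_deliverables(ip_root: str) -> list:
        """Return the deliverable categories with no matching files in the IP package."""
        missing = []
        for category, patterns in REQUIRED_DELIVERABLES.items():
            if not any(glob.glob(os.path.join(ip_root, p)) for p in patterns):
                missing.append(category)
        return missing

    # Usage: missing_deliverables("/ip/uart_v2") -> e.g. ["documentation", "software"]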

"Given the complexity of the business we are involved in, we do see benefits in the development of generally applicable IP. We are raising it to the platform level," said DasGupta. "We have created a library where IP can reside, and it does allow you to speed up the development of SoCs. It's gaining strength as an idea."

Motorola is organized roughly into library builders and SoC integrators, a structure that DasGupta calls an organizational challenge. "We do not want to create a wall between IP creation and SoC integration, so often we let an engineer move with the IP into becoming an SoC integrator," he said.

Support for standards

"Internally we have our own semiconductor reuse standards," DasGupta added. "But standards are of no use if they are not used, so we have donated some of them to the verification cha pter of the Virtual Socket Interface Alliance."

Meanwhile, he said, "our internal standards are not being handed down by executive fiat from a central group. It is a broad group, engaged in a democratic process, which makes adherence more likely. In platforms, there is a need for application understanding and for discipline. That realization, I think, is dawning among all engineers."

Keith Diefendorff of MIPS Technologies Inc. defines a modern best design practice as using high-performance cores and putting as much of an application in software as possible. "We need to try and develop the architectures today that will still be valid in 10 years' time. We need headroom," he said.

Diefendorff pointed out that software can be written before a chip design is started and that features can be added after the chip is finished — even after it is in customers' hands — with Internet-downloadable software.

"Under this scheme, the number of IP core types reduces to just two: the high-perfo rmance processor and memory," he said. "Increasingly, in the future, we are going to see multiprocessor SoC devices and multithreading cores."

Diefendorff argues against heavily application-specific systems-on-chip that still resemble the ASICs from which they have evolved. "The fixed costs of an SoC of 50 million to 100 million transistors are enormous, so we are going to see a move away from custom hardware and a return to standard chips that can go in multiple machines," he said. "Going to large volumes is the only way to amortize the cost of producing the chips in the first place." Those standard multiprocessor SoC devices would then be customized in software to tune them to particular applications.
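The economics behind that argument are simple amortization of nonrecurring cost over volume; the figures below are purely illustrative assumptions, not numbers from the article.

    # Illustrative amortization arithmetic (all figures are hypothetical).
    fixed_cost = 20_000_000   # assumed nonrecurring cost: design, verification, masks ($)
    unit_cost = 15            # assumed per-die manufacturing cost ($)

    for volume in (100_000, 1_000_000, 10_000_000):
        per_chip = unit_cost + fixed_cost / volume
        print(f"{volume:>10,} units -> ${per_chip:,.2f} per chip")

    # Output:
    #    100,000 units -> $215.00 per chip
    #  1,000,000 units -> $35.00 per chip
    # 10,000,000 units -> $17.00 per chip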

"The MIPS architecture has remained very clean since its inception and is a good platform from which to add support for parallelism," said Diefendorff.

Bricaud's view of platforms differs. "Platforms today are all very processor-driven, often around the ARM RISC processor cores, or in Adelante's case, around DSP."

But if platforms are going to be beneficial they are going to have to be application-specific, rather than company- or processor-specific, he said.

Today's platforms, Bricaud noted, typically have been developed internally by the major semiconductor or systems houses. He cited Philips, Motorola, Toshiba and Infineon as being among the leaders in developing proprietary "platforms" based on a mixture of internally developed and licensed IP.

Nvidia's Malachowsky said that when selecting external IP, "there are things that we've learned to ask for and look at over the years [that] have been captured in checklists and decision-making forms or flows. For us, the acquisition of IP is a matter of really understanding how well the IP, as designed and verified by the vendor, will integrate. Will it not only solve our functional need but integrate well with our design flow and our existing tools?"

Sun, Joshi said, has a formal process, called Sun Craft, for making tool-acquisition decisions. "As well as the technical evaluations, we make sure that we know up-front what we are planning to do, what we are going to assess, how long it's going to take, what resources we will put on it and what the decision criteria are. We apply it rigorously, and that has worked really well for us in buying the best-of-breed tools that are out there."

Test plans

But no matter how good engineers are, without a comprehensive test plan the chip cannot get out the door. TI's Fazeli said test plans are a critical part of the peripheral IP development strategy: "We basically have processes in place where they get reviewed by cross-functional teams of application product engineers and designers before we pass them on to the verification teams.

"I would say that our objective is 100 percent coverage of the test plan. I think the issue is going to be, 'Is your test plan going to be adequate enough?' We're debating that, but we believe that we have a good process in place right now. It's been proven on a number of different projects that we just went through, and so far samples and all the units are functional, so we haven't missed anything.

"In my opinion," Fazeli said, "the issue is going to be how good you do your test plan, and then measuring against 100 percent coverage of that test plan."

A full transcript of the interview with Joshi, Fazeli and Malachowsky is available at www.EEdesign.com.
