Until recently, the integration and verification of SoC hardware and software began in the late stages of the design cycle, after the first silicon arrived from the foundry. Designers debugged the hardware and software components serially, with work on software following work on hardware. Software teams customarily validated code by running simulations of high-level models of the design that executed far below true system speed. Hardware, on the other hand, was described at the register transfer level (RTL) in traditional hardware description languages and validated with test benches.
Full verification with these traditional tools can take months to validate system performance and uncover bugs, a timeline that is untenable given today’s time-to-market constraints.
SoCs are a key technology enabling the growth of the Internet of Things and mobile computing. Multi-core SoCs often contain more than a billion logic gates, run millions of lines of code, and integrate multiple interface protocols and hundreds of clock domains and IP blocks. So while electronic products are shrinking, the systems that enable them are getting larger and more complex, adding another layer of difficulty to the design process.
In addition to managing increasing complexity, semiconductor makers must also contend with shrinking design cycles, driven by consumer demand for new products every few months. Because it is impossible to develop new components for each design and meet the abbreviated timetables, electronics makers increasingly reuse hardware designs and implement functions with embedded software. Ironically, this approach essentially plants the seeds for even more complexity in the integration of hardware and software and the verification of system-level performance.
“Re-use of designs increases the complexity of SoC development,” says Frank Schirrmeister, the leader of the product management team for Cadence’s System Development Suite. “Even though block functionality is well established, the interaction of the IP must be verified at new levels. The verification challenge shifts to the integration of the hardware-software stack, which includes the various IP blocks, interconnects, low-level drivers, operating system, and middleware.”
Another major challenge for SoC developers is the growing role of embedded software, combined with the traditional practice of developing that software only after silicon is available. Teams must create and debug ever-larger amounts of code and verify that it behaves as expected in the hardware environment, all within shorter design cycles, and traditional design methodologies leave them poorly equipped to do so.
As a result, SoC developers are turning to emulation systems, which provide a virtual transaction-based test environment and full visibility into the design for both hardware and software debugging. Many of today’s emulation systems operate at 3 MHz and can test millions of lines of code and run billions of execution cycles in a matter of minutes and hours instead of weeks and months.
Fig. 1 - Hardware emulation systems have become a critical element in the system-on-chip developer’s toolbox, providing transaction-based verification that enables early architecture validation and hardware-software co-verification. (Courtesy of Synopsys.)
Emulation systems map the SoC hardware onto FPGAs or other programmable devices, which execute the design at much higher speeds than software simulation. With these resources, the emulator can boot a Linux operating system in under 20 min. “Emulators get into the range where it is practical to boot a real-time operating system and run device drivers,” says Jim Kenney, director of marketing for emulation at Mentor Graphics.
With the ability to enable hardware-software co-verification in significantly reduced development cycles, today’s hardware emulators are becoming a universal verification tool. This, however, does not mean that traditional design tools have become obsolete. “Emulation is not a panacea by any means,” says Cadence’s Schirrmeister. “You need virtual prototyping, simulation, FPGA-based prototyping, and emulation all working together.”
Fig. 2 - Despite the value brought to the SoC development process by hardware emulation systems, traditional tools such as simulation, virtual prototyping, and FPGA-based prototyping still play important roles in the verification process. (Courtesy of Cadence Design Systems.)
Islands of Development
Prior to hardware emulation’s move into mainstream SoC design, the developer’s toolbox was limited to a few types of system development platforms: virtual prototyping, RTL simulation, and FPGA prototyping. Each has its own strengths and weaknesses.
SoC designers use virtual prototyping primarily for early software development and architecture evaluation before hardware for RTL simulation is available. This platform runs simulations of transaction-level models at or near real-time speeds. Unfortunately, the high-level models used in these simulations do not provide the timing and power information required for detailed architectural optimization, block-level verification, and multi-core software system design. The platform also keeps hardware and software debugging in separate environments, requiring time-consuming hand-offs between the hardware and software teams.
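To make the idea of a transaction-level model concrete, here is a minimal sketch assuming the open-source Accellera SystemC/TLM-2.0 library rather than any particular vendor’s tooling. An initiator performs a whole bus write as a single blocking function call to a memory model; no pins or clock edges are simulated, which is why this style of model is fast but, as noted above, carries only approximate timing and no power data.

```cpp
// Minimal transaction-level model sketch (Accellera SystemC/TLM-2.0 assumed).
// A whole read or write is served as one function call; the annotated delay
// is a rough latency estimate, not detailed timing.
#include <cstdint>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/simple_target_socket.h>
using namespace sc_core;

SC_MODULE(SimpleMemory) {
    tlm_utils::simple_target_socket<SimpleMemory> tsock;
    uint32_t storage[1024] = {0};

    SC_CTOR(SimpleMemory) : tsock("tsock") {
        tsock.register_b_transport(this, &SimpleMemory::b_transport);
    }

    // Serve an entire read or write transaction in a single call.
    void b_transport(tlm::tlm_generic_payload& tr, sc_time& delay) {
        uint32_t* data = reinterpret_cast<uint32_t*>(tr.get_data_ptr());
        uint64_t  idx  = tr.get_address() >> 2;      // word addressing
        if (tr.is_write()) storage[idx] = *data;
        else               *data = storage[idx];
        delay += sc_time(10, SC_NS);                 // rough latency estimate only
        tr.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};

SC_MODULE(Initiator) {
    tlm_utils::simple_initiator_socket<Initiator> isock;
    SC_CTOR(Initiator) : isock("isock") { SC_THREAD(run); }

    void run() {
        uint32_t value = 42;
        tlm::tlm_generic_payload tr;
        sc_time delay = SC_ZERO_TIME;
        tr.set_command(tlm::TLM_WRITE_COMMAND);
        tr.set_address(0x100);
        tr.set_data_ptr(reinterpret_cast<unsigned char*>(&value));
        tr.set_data_length(4);
        isock->b_transport(tr, delay);               // one call, not many clock cycles
    }
};

int sc_main(int, char*[]) {
    Initiator    cpu("cpu");
    SimpleMemory mem("mem");
    cpu.isock.bind(mem.tsock);
    sc_start();
    return 0;
}
```

The 10 ns annotation is a placeholder, which is exactly the limitation described above: fast execution, but without the detailed timing and power data needed for architectural optimization.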
RTL simulation, on the other hand, focuses on the SoC’s hardware. This platform accurately verifies RTL hardware, and the turn-around time for hardware description changes can be fast. Advanced RTL simulators follow an executable verification plan, tracking coverage and verification flow. The simulations, however, run in the kilohertz range, a speed that limits their value for software execution. In addition, their speeds slow as the size of the design increases.
Designers rely on FPGA-based prototyping primarily to perform post-RTL software development and regressions, capitalizing on the technology’s fast execution speeds (in the tens of megahertz range). The downside is that the platform offers limited hardware debugging. Designers can typically observe only a limited number of signals, and depending on the debug tools available, the signals are represented at the netlist level, which makes it difficult to match them to the RTL signals in the design.
While these platforms begin to address the design challenges faced by SoC development teams, no single platform provides the means to perform all of the hardware-software integration and system-level performance verification required to ensure optimal performance. The missing element is a platform that can scale for larger SoC designs and enable hardware-software co-verification and debugging within the constraints of shorter development cycles.
Beyond Emulation
Emulation has proven its value as an essential tool in SoC development, executing design models 10,000 times faster than traditional hardware description language simulators and running multi-billion-cycle tests in hours instead of days or weeks. The latest generation of high-performance emulators pushes execution speeds even further, into the multi-megahertz range. But as SoCs continue to push the limits of size, complexity, and software content, the demands on development teams grow. Designers need accelerated architecture validation so that critical decisions can be made sooner. Software teams, under pressure to develop more and more code, call for high-speed models earlier in the design cycle to facilitate application development. And verification teams press for faster execution platforms simply to keep up with the growing complexity of systems.
“So how can development teams start hardware-software integration and verification earlier in the development process?” asks Tom Borgstrom, director of marketing for the Verification Group at Synopsys. “Increasingly we see a lot of interest in the use of what we call hybrid emulation. This is where you combine a virtual prototype and an emulator to get even higher performance and earlier availability of a pre-silicon platform that has enough performance to do the hardware-software co-development and co-bring-up.”
Hybrid emulation focuses on three key use cases. First, the platform enables early architecture validation by running RTL blocks or subsystems in the emulator as a high-performance, cycle-accurate model to optimize the architecture. This approach has the accuracy of an RTL simulation, but it can execute the number of cycles required to validate and optimize the architecture much faster.
Next, hybrid emulators provide software developers with high-speed models earlier in the design cycle. The hybrid’s virtual prototypes enable the software team to use SystemC/C++ processor models and SystemC TLM-2.0 models to begin developing and testing system software before silicon is available, leveraging the platform’s superior speed.
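What a “processor model” amounts to can be illustrated with a toy instruction-set simulator in plain C++. The four-instruction ISA below is invented purely for illustration; commercial models are far more elaborate and are typically wrapped in SystemC so they can plug into the virtual prototype. The point is the purely functional fetch-decode-execute loop, which is what lets this class of model run orders of magnitude faster than cycle-accurate RTL.

```cpp
// Toy instruction-set simulator (ISS), for illustration only. The ISA,
// opcodes, and encoding are invented; real processor models are far more
// elaborate. No pipelines and no clock, hence very high simulation speed.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

enum Opcode : uint8_t { LOADI, ADD, STORE, HALT };   // invented four-instruction ISA

struct Insn { Opcode op; uint8_t rd, rs1, rs2; uint32_t imm; };

struct ToyIss {
    uint32_t regs[8] = {0};                          // architectural registers
    std::vector<uint32_t> mem = std::vector<uint32_t>(256, 0);

    void run(const std::vector<Insn>& program) {
        std::size_t pc = 0;                          // program counter
        while (pc < program.size()) {
            const Insn& i = program[pc++];           // fetch
            switch (i.op) {                          // decode and execute
                case LOADI: regs[i.rd] = i.imm;                     break;
                case ADD:   regs[i.rd] = regs[i.rs1] + regs[i.rs2]; break;
                case STORE: mem[i.imm] = regs[i.rs1];               break;
                case HALT:  return;
            }
        }
    }
};

int main() {
    // r1 = 2; r2 = 40; r3 = r1 + r2; mem[5] = r3
    std::vector<Insn> prog = {
        {LOADI, 1, 0, 0, 2}, {LOADI, 2, 0, 0, 40},
        {ADD,   3, 1, 2, 0}, {STORE, 0, 3, 0, 5}, {HALT, 0, 0, 0, 0}
    };
    ToyIss cpu;
    cpu.run(prog);
    std::printf("mem[5] = %u\n", cpu.mem[5]);        // prints 42
    return 0;
}
```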
The third use case is software-driven verification. Here, designers model the processors in the virtual prototype environment using SystemC models, simultaneously modeling the other major pieces of the design in the emulator. Design teams connect the subsystems and processors to the emulated RTL blocks through transactors for bus-level communications. They also debug the software using traditional software development tools or tools from the virtual prototype environment.
“So with hybrid emulation, you can take the processor model out of the emulator and have that run as part of the virtual prototype in an instruction set simulator,” says Borgstrom. “You get a performance boost by running the software-driven test from the fast model, and run the rest of the verification in the emulator. This has the added benefit of augmenting the capacity of the emulator because you take some of the design out of the emulator.”
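As a rough sketch of what such a transactor does, the following example again assumes the Accellera SystemC/TLM-2.0 library; the pin-level bus protocol, the signal names, and the RegFileRtl block are invented for illustration. A loosely timed transaction arriving on the TLM side is converted into one cycle of pin-level activity on the other side, which in a real hybrid flow would be driven into RTL inside the emulator through the vendor’s transactor library rather than into the small stand-in block used here.

```cpp
// Hypothetical transactor sketch (Accellera SystemC/TLM-2.0 assumed). The
// invented pin-level bus and the RegFileRtl module stand in for RTL running
// inside an emulator; only the bridging idea is the point.
#include <cstdint>
#include <iostream>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/simple_target_socket.h>
using namespace sc_core;

SC_MODULE(RegFileRtl) {                              // stand-in for an emulated RTL block
    sc_in<bool> clk, req, wr_en;
    sc_in<sc_dt::sc_uint<8>>   addr;
    sc_in<sc_dt::sc_uint<32>>  wdata;
    sc_out<sc_dt::sc_uint<32>> rdata;
    sc_dt::sc_uint<32> regs[256];

    void step() {                                    // sampled on every rising clock edge
        if (req.read()) {
            if (wr_en.read()) regs[addr.read()] = wdata.read();
            else              rdata.write(regs[addr.read()]);
        }
    }
    SC_CTOR(RegFileRtl) { SC_METHOD(step); sensitive << clk.pos(); }
};

SC_MODULE(BusTransactor) {                           // TLM transactions in, pin activity out
    tlm_utils::simple_target_socket<BusTransactor> tsock;
    sc_in<bool>  clk;
    sc_out<bool> req, wr_en;
    sc_out<sc_dt::sc_uint<8>>  addr;
    sc_out<sc_dt::sc_uint<32>> wdata;
    sc_in<sc_dt::sc_uint<32>>  rdata;

    SC_CTOR(BusTransactor) : tsock("tsock") {
        tsock.register_b_transport(this, &BusTransactor::b_transport);
    }

    void b_transport(tlm::tlm_generic_payload& tr, sc_time&) {
        uint32_t* data = reinterpret_cast<uint32_t*>(tr.get_data_ptr());
        wait(clk.posedge_event());                   // align to the bus clock
        addr.write(tr.get_address() & 0xFF);
        wr_en.write(tr.is_write());
        if (tr.is_write()) wdata.write(*data);
        req.write(true);
        wait(clk.posedge_event());                   // one bus cycle per transfer
        req.write(false);
        if (tr.is_read()) { wait(SC_ZERO_TIME); *data = rdata.read().to_uint(); }
        tr.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};

SC_MODULE(SoftwareSide) {                            // plays the role of the fast processor model
    tlm_utils::simple_initiator_socket<SoftwareSide> isock;
    SC_CTOR(SoftwareSide) : isock("isock") { SC_THREAD(run); }

    void access(bool write, uint64_t a, uint32_t& d) {
        tlm::tlm_generic_payload tr;
        sc_time delay = SC_ZERO_TIME;
        tr.set_command(write ? tlm::TLM_WRITE_COMMAND : tlm::TLM_READ_COMMAND);
        tr.set_address(a);
        tr.set_data_ptr(reinterpret_cast<unsigned char*>(&d));
        tr.set_data_length(4);
        isock->b_transport(tr, delay);
    }
    void run() {
        uint32_t v = 0xCAFE; access(true, 0x10, v);  // "driver" writes a register
        uint32_t r = 0;      access(false, 0x10, r); // and reads it back
        std::cout << "read back 0x" << std::hex << r << std::endl;
    }
};

int sc_main(int, char*[]) {
    sc_clock clk("clk", 10, SC_NS);
    sc_signal<bool> req, wr_en;
    sc_signal<sc_dt::sc_uint<8>>  addr;
    sc_signal<sc_dt::sc_uint<32>> wdata, rdata;
    SoftwareSide sw("sw"); BusTransactor tx("tx"); RegFileRtl rtl("rtl");
    sw.isock.bind(tx.tsock);
    tx.clk(clk);  tx.req(req);  tx.wr_en(wr_en);  tx.addr(addr);  tx.wdata(wdata);  tx.rdata(rdata);
    rtl.clk(clk); rtl.req(req); rtl.wr_en(wr_en); rtl.addr(addr); rtl.wdata(wdata); rtl.rdata(rdata);
    sc_start();
    return 0;
}
```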
Convergence
Industry watchers foresee a future where it becomes easier and faster to move among the different phases and aspects of verification, from simulation and virtual prototyping to emulation and FPGA prototyping. When design teams find an issue in one model of the design at a given abstraction level, they will be able to drop to the next lower level of abstraction for more detailed debugging and verification.
The one constant is that all of these systems will continue to evolve, each helping to push the boundaries of verification. “The verification challenge is huge, and it continues to grow,” says Borgstrom. “I think most designers would say that verification is never really complete. There is always more that you can verify.”