Hardware Emulation: A Revolution in the Making

28 October 2014

Almost three decades ago, management consulting firm McKinsey and Co. postulated that late market entry in highly competitive markets with short market windows has a devastating effect on profits. Applying this wisdom to the semiconductor industry, where almost all markets are highly competitive, three factors contribute to disastrous results: engineering cost overruns, respin costs and revenue lost to competition by being late to market.

Missing a product delivery window by three months can cost more than a quarter of potential revenue. Worse yet, being late to market can be the final blow for any semiconductor company.

Of course, executives at every semiconductor company look for ways to mitigate these factors and accelerate time to market (TTM), all while balancing budget, affordability and return on investment (ROI). They often look to electronic design automation (EDA) companies to develop innovative tools that improve TTM and offer justifiable ROI. One such tool gaining popularity in all segments of the industry is hardware emulation.

A recent article posted on Electronics360 covered the technical reasons why hardware emulation is becoming a must-have verification tool in the design and verification flow. This commentary analyzes its ROI and makes the case that the payoff from investing in hardware emulation to get the product out the door on time can outweigh the investment cost.

Justifiable ROI

While hardware emulation has been around for more than 20 years, only recently has it become suitable for complex system-on-chip (SoC) designs and come to be considered a universal verification tool used across the entire SoC development cycle. Hardware emulation is versatile enough for testing the hardware of an SoC design and validating its embedded software. It is used to verify the integration of the hardware and the embedded software as well.

As a universal verification tool, its ROI is justifiable. Hardware and software designers can share the same system and design representations because it offers combined hardware and software views of the design, something neither simulation nor field programmable gate array (FPGA) prototyping can do. Simulation is essentially focused on hardware debug and not adequate for processing the billions of verification cycles necessary for embedded software validation. Conversely, FPGA prototyping outpaces all other verification tools when vast numbers of verification cycles are required, but falls short when hardware debugging is needed. With emulation, designers are able to work together to debug hardware and software interactions, tracking a design problem across the embedded software and the hardware until they find it.

An emerging trend is the move toward emulation design datacenters filled with emulation enterprise servers. Three capabilities justify their appeal. They can be accessed remotely, support several concurrent users and can handle multiple large-capacity designs or any combination of large and small designs. Obviously, the ability to remotely support entire teams of verification engineers and software developers worldwide strengthens emulation's case for both TTM and ROI. The ROI increases further since a centralized team of emulation experts can now support a multitude of users dispersed across geographies and time zones.

Getting the ICE out

Allow me to veer off topic for a paragraph. Most design teams use hardware emulation in in-circuit emulation (ICE) mode, which rules out remote access, so it is worth describing how remote access becomes doable.

With ICE, the design under test (DUT) is mapped inside the emulator and connected to the target system where the actual chip would reside. The connection is based on speed adapters, complex hardware circuitry that implements interface protocols and adapts the fast speed of the target system to the relatively slow speed of the emulator. Changing the DUT, or just changing the type of target testing, requires on-site assistance to swap speed adapters, defeating remote access. A newer technique known as transaction-based verification enables remote access without a staff of technicians to manage speed adapters whenever a user swaps designs or another user logs in. Transactors, mapped inside the emulator, are software-based protocol interfaces that replace speed adapters and perform at emulation speed. Designers do have to create testbenches, but these are written at a higher level of abstraction, where they are less prone to mistakes and far faster to execute.
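To make the transactor idea concrete, here is a minimal sketch in Python of the concept: an untimed testbench issues high-level transactions, and a transactor-style routine expands each one into the cycle-by-cycle pin activity the emulated DUT would see. All names, signals and the two-cycle protocol are hypothetical, chosen only to illustrate the abstraction gap, not any vendor's actual interface.

```python
# Toy illustration of a transactor: one high-level bus-write
# transaction is expanded into pin-level cycles for the DUT.
# The signal names and protocol are hypothetical.

def write_transaction(addr, data):
    """Expand one untimed 'write' into pin-level cycles."""
    cycles = []
    # Address/data phase: drive the bus for one cycle.
    cycles.append({"valid": 1, "rw": 1, "addr": addr, "data": data})
    # Idle phase: release the bus.
    cycles.append({"valid": 0, "rw": 0, "addr": 0, "data": 0})
    return cycles

def testbench():
    """Untimed testbench: it thinks in transactions, not clock edges."""
    pin_activity = []
    for addr, data in [(0x10, 0xAB), (0x14, 0xCD)]:
        pin_activity.extend(write_transaction(addr, data))
    return pin_activity

if __name__ == "__main__":
    for cycle in testbench():
        print(cycle)
```

The testbench never mentions clocks or pins; only the transactor does. That separation is what lets the testbench run on a remote workstation while the pin-level half executes inside the emulator at emulation speed.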

A reliable tool

An often overlooked aspect of an emulation system is its reliability. Unlike a software-based verification tool that, once it has reached maturity, is fairly bug-free and not subject to downtime, an emulation system may suffer hardware failures that critically affect its uptime. The higher the reliability of an emulation system, measured by its mean time between failures (MTBF), the longer its uptime. Needless to say, an uptime of several weeks or a few months increases the ROI.
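The relationship between MTBF and uptime can be made concrete with the standard availability formula, MTBF / (MTBF + MTTR), where MTTR is the mean time to repair. The figures below are illustrative assumptions, not vendor data:

```python
# Back-of-the-envelope availability from MTBF and MTTR.
# The numbers are illustrative assumptions, not vendor data.

def availability(mtbf_hours, mttr_hours):
    """Fraction of time the emulator is up: MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Example: a failure every ~8 weeks on average, one day to repair.
mtbf = 8 * 7 * 24   # 1,344 hours between failures
mttr = 24           # 24 hours to repair
print(f"Availability: {availability(mtbf, mttr):.1%}")
```

Under these assumed numbers the emulator is available about 98% of the time; doubling MTBF or halving MTTR pushes that figure, and the ROI, higher.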

With design costs escalating, hardware emulation looks like a modest expense next to mask costs. At the 28nm technology node, an average mask set costs more than $3 million. If an emulator finds a hardware bug that would otherwise have forced a silicon respin after tapeout, many millions of dollars are saved. The fact is, hardware emulation is eliminating respins.
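The break-even arithmetic is simple. The $3 million mask-set figure comes from the paragraph above; the emulator purchase price is a hypothetical placeholder:

```python
# Rough ROI arithmetic for the respin scenario in the text.
# The $3M mask-set cost is from the article; the emulator
# price is a hypothetical assumption for illustration.

mask_set_cost = 3_000_000   # 28nm mask set, per the article
emulator_cost = 1_500_000   # assumed purchase price
respins_avoided = 1         # bugs caught before tapeout

savings = respins_avoided * mask_set_cost - emulator_cost
print(f"Net savings after one avoided respin: ${savings:,}")
```

Even this understates the case, since a real respin also burns months of schedule and the lost revenue that goes with it, on top of the mask cost.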

Hardware emulation is a capital expense, and the purchase decision is driven by project teams. Until recently, the cost was prohibitive and the machines, mainly used in ICE mode, were hard to use, limiting adoption to large companies with graphics and microprocessor designs and sizeable budgets. Modern emulators are powerful, flexible and deployable in several operational modes that address a variety of verification objectives. What is even more remarkable is that prices have dropped from a plateau of a dollar or more per gate to pennies per gate.

The semiconductor industry doesn't need a McKinsey and Co. report to learn that being late to market is disastrous, and the risk keeps executives reviewing ways to hit their targets. While hardware emulation has been around for 20 years or more, the current commercially available versions have become viable tools that offer strong ROI.

Lauro Rizzatti is a verification consultant. He was formerly GM of EVE-USA and VP of marketing before Synopsys acquired EVE.
