by Dr. Lauro Rizzatti, Rizzatti LLC
About 30 years ago, when computers revolutionized the semiconductor design process, a new verification technology appeared on the horizon under the name of hardware emulation. It was implemented in a big box and billed as being able to verify and debug chip designs.
Truth be told, there were precursors to this design verification approach. IBM, for one, had already experimented with hardware to accelerate design verification in the Yorktown Simulation Engine (YSE) and in the Engineering Verification Engine (EVE). In the early-to-mid '80s, Daisy Systems, Valid Logic, Teradyne and others also introduced hardware-based verification engines. The fact is, all of these were simulation accelerators: special-purpose computers, some implemented in custom ASICs, others with commercial parts, that verified designs described in software languages. They ran faster than simulators, but not fast enough to apply real-life stimulus to the design-under-test (DUT).
Hardware emulators belong to a different class of engines. They all use reprogrammable hardware, typically FPGAs, configured to mimic or "emulate" the behavior of a DUT. They are the closest representation of the design in silicon prior to tapeout, and because of this, their speed of execution approached real-life performance.
The forerunners include Mentor Graphics[1], PiE Design Systems, Quickturn Design Systems[2], IKOS Systems and Zycad Corporation. Several newcomers followed later.
Looking at the history of hardware emulation, one can divide its evolution into three broad periods: the early years, roughly from the late '80s to the mid-'90s; the Middle Ages, from the mid-'90s to the mid-2000s; and the last decade, from the mid-2000s to today.
Let’s look closely at these three periods with special consideration to the characteristics of the hardware emulators and how they evolved. Let’s also take a look at what’s behind hardware emulation’s rise to become a leading verification tool and why it is essential for every semiconductor company today.
THE FIRST DECADE
All of the early hardware emulators were built with vast arrays of commercial FPGAs, often numbering in the thousands, mounted on large boards. They were encased in bulky cabinets that accommodated the FPGA boards, interconnected via active and complex backplanes that were also populated by FPGAs.
The input/output access to the emulator was implemented through a bulky bundle of cables plugged into a target system where, ultimately, the DUT would reside once released by the foundry. This deployment scheme became known as in-circuit-emulation or ICE.
Back then, ICE was the only way to use an emulator and was the essential reason for its existence. The attractiveness of ICE rested on the ability to apply real-life stimulus to the DUT, impossible to realize with a software simulator. This capability was appealing enough to overshadow all the negative considerations that affected such deployment.
ICE was, and still is to this day, cumbersome and unfriendly to use, and rather unreliable due to significant hardware dependencies, including electrical, magnetic and mechanical effects. A case in point was, and is, the need to insert speed adapters between the target system and the emulator. These bridge the fast clock rate of the former, in the ballpark of 100 megahertz (MHz), and the relatively slow clock speed of the latter, at most one MHz. The supporting software was limited essentially to a compiler and a basic debugger.
The compiler read the gate-level netlist of the DUT and generated the FPGA programming bitstreams to emulate the DUT. The simplicity of this statement does not do justice to the complexity of the task. Just consider that the industry coined the expression “time-to-emulation” (TTE) to measure the time elapsed between entering the design description into the compiler and the moment the emulator was configured and ready for deployment. TTE was typically measured in months, long enough that the emulator was often ready for use only after the foundry had released silicon, defeating the purpose of the tool.
One of the gravest issues in an array of FPGAs was, and still is today, the limited number of input/output pins per FPGA, which prevented full utilization of the FPGA resources. Different companies experimented with various schemes to alleviate the bottleneck. Among them, Quickturn invented the partial crossbar to address the drawback. Virtual Machine Works (VMW), founded in October 1993 by a team of MIT scientists, devised an FPGA compiler technology called Virtual Wire that eased the bottleneck of limited FPGA I/Os by transmitting several I/O signals over a single pin under the control of a timing-scheduling process[3]. IKOS acquired VMW in 1996 and used the Virtual Wire technology in all of its emulation platforms.
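The pin-multiplexing idea behind Virtual Wire can be sketched in miniature. The Python toy below (the function names and data layout are my own illustration, not VMW's actual compiler technology) serializes several logical inter-FPGA signals onto one physical pin according to a fixed time-slot schedule, then de-multiplexes them on the receiving side:

```python
# Hypothetical sketch of time-multiplexing several logical signals over
# one physical pin. Each "virtual wire" owns one slot in a fixed schedule;
# the sender shifts one value out per internal clock tick, and the receiver
# latches each value back into the corresponding logical signal.

def send_over_pin(signal_values, schedule):
    """Serialize logical signal values onto one pin, one value per time slot."""
    return [signal_values[name] for name in schedule]

def receive_from_pin(pin_stream, schedule):
    """De-multiplex the serial pin stream back into named logical signals."""
    return {name: bit for name, bit in zip(schedule, pin_stream)}

# Four logical wires share a single physical pin over four time slots.
schedule = ["a", "b", "c", "d"]
values = {"a": 1, "b": 0, "c": 1, "d": 1}

stream = send_over_pin(values, schedule)        # serialized: [1, 0, 1, 1]
recovered = receive_from_pin(stream, schedule)  # round-trips to the original
assert recovered == values
```

The trade-off this illustrates is the one the article describes: one physical pin now carries four logical signals, at the cost of taking four internal clock ticks per transfer, which is part of why emulator clock speeds lagged so far behind target-system clock rates.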
Another source of trouble concerned the routing of clock trees. The unpredictable layout of the DUT onto the fabric of the FPGAs typically yielded large numbers of timing violations. Countless bright minds devised a myriad of clever schemes to tame this issue.
The debugger was rather rudimentary. The lack of visibility into the commercial FPGAs of the time was addressed by compiling a limited number of probes into the design. Whenever a missing probe was needed to trace a bug, which was virtually always, a full recompilation of the design was required, stretching design-iteration times (DIT) to days.
With a bit of exaggeration, it was said that hardware emulation required an army of experienced application engineers to support its deployment. And if not an army, certainly a squad of supporting technicians was mandatory to keep the equipment in operation, working off a rigorous maintenance schedule. Reliability was admittedly poor, with a mean time between failures (MTBF) of less than a week, if not a day.
The early emulators were single-user resources, complex and unfriendly to use, and rather unreliable. All of this added up to a high cost of ownership. In fact, these emulators were more expensive than most engineering budgets could bear. As a result, only the most challenging designs, such as graphics chips and processors, benefited from these machines.
Table 1: Characteristics of early hardware emulators, circa 1995.

Technology: Array of commercial FPGAs
Design Capacity: Up to ~2 million gates
Speed of Emulation: A few hundred kHz
Deployment: Single user only
Physical Dimensions: Similar to industrial refrigerators
MTBF: Less than one week
1. Strictly speaking, Mentor never launched an emulation product, but it was developing one under the name of Mentor Realizer when management elected to sell the technology and patents to Quickturn in 1992.
2. Quickturn acquired PiE in 1993.
3. This should not be confused with an asynchronous multiplexing scheme used by Quickturn and others.