by Tom Fitzpatrick, Editor and Verification Technologist
 "On a recent visit to the Evergreen Aviation & Space Museum in Oregon, I had an opportunity to see some great examples of what, for their time, were incredibly complex pieces of engineering. ...those successes were the result of early failures where engineers learned the hard way..."
—Tom Fitzpatrick
Welcome back to Verification Horizons! And for those of you reading this at the 49th Design Automation Conference, welcome to San Francisco! We're looking forward to a great show.
On a recent visit to the Evergreen Aviation & Space Museum in Oregon, I had an opportunity to see some great examples of what, for their time, were incredibly complex pieces of engineering. From a replica of the Wright brothers' Flyer, to the World War I Sopwith Camel (sure to be recognized by you Snoopy fans out there), to various World War II fighter planes, to the famous Spruce Goose itself, the aviation side of the museum does a great job showing how the technology of flight evolved in just over 40 years. The space side of the museum has a marvelous collection of jet airplanes and rockets, including replicas of the Gemini and Apollo spacecraft, the Lunar Module, and pieces of a Saturn V. When you think about the effort and risk people took to get from Kitty Hawk, North Carolina to the moon in under 70 years, it gives a whole new perspective on the idea of verification.
A lot of those successes resulted from early failures where engineers learned the hard way what not to do. That era of flight testing also gave us Murphy's Law, which states that "if anything can go wrong, it will." Compared to our historical colleagues, our lives as verification engineers today are much easier: we have simulation and other verification tools to try things out virtually before we actually build them. But in a bow to Murphy, we know it's critical to make sure everything gets tested. A corollary to Murphy's Law for verification could easily be "if it isn't covered, it doesn't work."
And that's why this issue of Verification Horizons is dedicated to the topic of coverage. There are, of course, many types of coverage and many tools that generate coverage data. The key to being able to measure all these different types of coverage is to have a common database to hold all of the data so it can be analyzed and correlated. Our first article this month, "Mentor Has Accellera's Latest Standard Covered," describes the development of Accellera's new Unified Coverage Interoperability Standard (UCIS) and how the Questa® Unified Coverage Database (UCDB) provides the necessary infrastructure to realize the kinds of analysis envisioned by the standard.
The notion of coverage itself has also changed over the years. Code coverage was a useful metric because it was easy to automate, but it didn't give you enough information to know whether your verification was actually complete. As the use of constrained-random stimulus took off, functional coverage became necessary as an application-specific way to measure whether particular scenarios were actually exercised. The problem with functional coverage is that, as the scenarios get more complex, it becomes harder to specify exactly what you want to measure. This is where testbench automation comes in, and we have a trio of articles that show how Questa inFact Intelligent Testbench Automation can help you reach your coverage goals.
In "Is Intelligent Testbench Automation For You?" my colleague (and fellow MIT alum) Mark Olen explains the applications best suited to using inFact and where the tool will provide optimal results. If you see yourself in any of these use cases, you'll definitely want to check out inFact. In "Automated Generation of Functional Coverage Metrics for Input Stimulus," we explore how inFact can automatically create SystemVerilog covergroup definitions targeted to your stimulus. As anyone who has tried to define cross coverage of complex protocols knows, it's often difficult to define your coverage properly to achieve 100% coverage of all possible scenarios. This is often due to the fact that some scenarios are actually not possible. Because inFact is based on a graphical representation of the protocols, it is possible to automatically generate the exclusion conditions, allowing you to reach 100% coverage without repetition.
Input scenario coverage is only part of the problem. There are many times when you need to define input scenarios that put the DUT into a specific internal state in order to reach a desired coverage goal. "Targeting Internal-State Scenarios in an Uncertain World" outlines the difficulty constrained-random testing typically has in setting up the preconditions required to hit these cases, and shows how inFact can target them more efficiently. By taking information from elsewhere in the testbench and feeding it into the stimulus rule graph, inFact can automatically generate reactive stimulus sequences that achieve the desired coverage.
Emulation gives us the ability to verify large designs that would be impractical to test with simulation alone. But the benefits of emulation have mostly been restricted to the small teams with access to the physical emulator in the lab, where peripherals are connected via "in-circuit emulation" (ICE). Our next article, "Virtualization Delivers Total Verification of SoC Hardware, Software, and Interfaces," shows how Mentor's VirtuaLAB library of virtual peripherals can be used with Veloce to provide an easily configurable emulation environment that supports total verification of the hardware, software and peripheral interfaces for teams around the world.
Switching gears a little, our next article comes from my colleague Mark Peryer, who explains how to handle "On the Fly Reset" in a UVM testbench. While the Accellera UVM committee is still working on a phasing solution, Mark offers some concrete guidelines on how a well-planned testbench can handle this common scenario, which is one of the very issues phasing is intended to address. Since the solution is independent of the new run-time phases in UVM, it can be used in OVM testbenches as well. I encourage you to take a look.
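To give a flavor of what a "reset-aware" testbench component looks like, here is a minimal sketch of a driver whose run loop can be interrupted by a mid-simulation reset; the class, transaction and interface names are hypothetical, and this is one common pattern rather than the specific guidelines from Mark's article:

    // Hypothetical reset-aware driver; my_txn and my_if are placeholder types.
    class my_driver extends uvm_driver #(my_txn);
      `uvm_component_utils(my_driver)

      virtual my_if vif;   // assumed to carry an active-low reset, rst_n

      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction

      task run_phase(uvm_phase phase);
        forever begin
          wait (vif.rst_n === 1'b1);     // stay idle while reset is asserted
          fork
            drive_transactions();        // normal driving loop
            @(negedge vif.rst_n);        // returns if reset hits on the fly
          join_any
          disable fork;                  // stop driving immediately on reset
          // any item left in flight would be retired here so the sequencer
          // isn't left waiting for item_done()
        end
      endtask

      task drive_transactions();
        forever begin
          seq_item_port.get_next_item(req);
          // ... drive req onto vif ...
          seq_item_port.item_done();
        end
      endtask
    endclass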
In our Partners' Corner, we begin with a paper co-authored by our friends at Xilinx in Longmont, Colorado that was presented at this year's DVCon. In "Relieving the Parameterized Coverage Headache," the authors discuss the problems associated with gathering coverage on a design that may itself be parameterized, which requires the coverage collection to be modified based on the parameter values used to configure the design. Using UVM (or OVM) configuration to pass information on these parameter values lets you simplify the definition of covergroups in your testbench, thus customizing your coverage collection to match the DUT configuration.
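As a rough illustration of the idea (the component, transaction and field names below are hypothetical, not the authors' code), a coverage collector can pull a DUT parameter out of the configuration database and use it to shape its bins, so one covergroup definition tracks whichever configuration was actually built:

    // Hypothetical coverage collector whose bins follow a DUT parameter.
    class mem_cov extends uvm_subscriber #(mem_txn);
      `uvm_component_utils(mem_cov)

      int unsigned addr_width;   // mirrors the DUT's address-width parameter

      covergroup addr_cg (int unsigned width) with function sample (bit [31:0] addr);
        cp_addr : coverpoint addr {
          bins lower_half = { [0 : (1 << (width-1)) - 1] };
          bins upper_half = { [(1 << (width-1)) : (1 << width) - 1] };
        }
      endgroup

      function new(string name, uvm_component parent);
        super.new(name, parent);
        // the test sets "addr_width" to match the parameter value used to
        // configure this instance of the design
        if (!uvm_config_db #(int unsigned)::get(this, "", "addr_width", addr_width))
          addr_width = 16;   // fall back to a default configuration
        addr_cg = new(addr_width);
      endfunction

      function void write(mem_txn t);
        addr_cg.sample(t.addr);
      endfunction
    endclass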
FPGA prototyping is another tried-and-true lab-based approach to verification. The next article, from our friends at MathWorks, offers "Four Best Practices for Prototyping MATLAB and Simulink Algorithms on FPGAs" that let you reuse your high-level models as you move toward implementation. Tightly linked to Questa (and ModelSim), their HDL co-simulation step lets you analyze system-level behavior in terms of your original model, closing the loop back to your design goals.
Finally, as a special bonus, we're including the runner-up Best Paper from this year's DVCon, "Better Living Through Better Class-Based SystemVerilog Debug," by my Mentor colleagues Rich Edelman, Raghu Ardeishar and John Amouroux. Congratulations, guys!
One more thought before I leave you: One of the jet airplanes they had at the Evergreen Museum was a Lear 24 from 1963. I hope you'll permit me a bit of filial pride in pointing out that my dad designed several of the cockpit instruments, including the altimeter, artificial horizon and airspeed gauge. That got me thinking about the other electronics on display in everything from the Spruce Goose to the SR-71 Blackbird to the Lunar Module (including the TV camera, which, I must point out, Dave Rich's dad helped design). Take a moment to consider how these instruments worked so well and accomplished so much without the "high tech" tools we sometimes take for granted today, such as those that are sure to be featured in the DAC exhibit hall. If you are attending, as always, please stop by the Mentor Graphics and Verification Academy booths and say hi.
Respectfully submitted,
Tom Fitzpatrick
Editor, Verification Horizons