Verification Horizons Articles:
by Tom Fitzpatrick, Editor and Verification Technologist, Mentor Graphics Corporation
On a recent visit to the Evergreen Aviation & Space Museum in Oregon, I had an opportunity to see some great examples of what, for their time, were incredibly complex pieces of engineering. From a replica of the Wright brothers' Flyer, to the World War I Sopwith Camel (sure to be recognized by you Snoopy fans out there), to various World War II fighter planes, as well as the famous Spruce Goose flying boat, the aviation side of the museum does a great job showing how the technology of flight evolved in just over 40 years. The space side of the museum has a marvelous collection of jet airplanes and rockets, such as replicas of the Gemini and Apollo spacecraft, including the Lunar Module, and pieces of a Saturn V. When you think about the effort and risk people took to advance from Kitty Hawk, North Carolina to the moon in under 70 years, it gives a whole new perspective on the idea of verification. 
by Darron May, Product Marketing Manager, Abigail Moorhouse, R & D Manager, Mentor Graphics Corporation
If you can't measure something, you can't improve it. For years, verification engineers have used "coverage" as a way to measure the completeness of the verification effort. Of course, there are many types of coverage, from the various forms of code coverage to functional coverage, as well as many tools, both dynamic and static, that provide coverage information. Simply put, coverage is a way to count interesting things that happen during verification, and measuring coverage means correlating those things that happened back to the list of things you wanted to happen (also called a verification plan). Meaningful analysis requires a standard method of storing coverage data from multiple tools and languages, which is what Accellera's Unified Coverage Interoperability Standard (UCIS) finally delivers. 
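To make "counting interesting things" concrete, a minimal SystemVerilog functional-coverage sketch might look like the following. The bus fields, bin names, and package are hypothetical illustrations, not taken from the article:

```systemverilog
// Hypothetical sketch: a covergroup that counts "interesting things" --
// here, transaction kinds and address ranges on an imaginary bus.
// A verification plan would map each bin back to the requirement
// it was intended to exercise.
package bus_cov_pkg;

  covergroup bus_cg with function sample(bit [1:0] kind, bit [7:0] addr);
    cp_kind : coverpoint kind {
      bins read  = {0};   // plan item: read transfers observed
      bins write = {1};   // plan item: write transfers observed
      bins idle  = {2};   // plan item: idle cycles observed
    }
    cp_addr : coverpoint addr {
      bins low  = {[0:127]};
      bins high = {[128:255]};
    }
    // Cross coverage: every kind must be seen in every address range.
    kind_x_addr : cross cp_kind, cp_addr;
  endgroup

endpackage
```

A testbench would instantiate the group with `bus_cg cg = new();` and call `cg.sample(kind, addr)` on each observed transaction; the simulator's coverage database (which UCIS standardizes for interchange) then records which bins were hit.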
by Mark Olen, Product Marketing Manager, Mentor Graphics Corporation
Intelligent Testbench Automation (iTBA) is being successfully adopted by more verification teams every day. Multiple technical papers have demonstrated successful verification applications, and panel sessions have compared its merits to those of both Constrained Random Testing (CRT) and Directed Testing (DT) methods. Technical conferences, including DAC and DVCon, have drawn audiences interested in better understanding this new technology. An entire course curriculum is available at the Verification Academy, and many articles have been published in various technical journals, including this and previous editions of Verification Horizons. With all of this activity, how do verification teams separate the signal from the noise? 
by Mike Andrews, Verification Technologist, Mentor Graphics Corporation
Verification teams are always under pressure to meet their project schedules, while at the same time the consequences of not adequately verifying the design can be severe. This puts the team between a rock and a hard place, as they say. When coverage metrics are used to determine the 'completeness' of the verification project, the main value of Questa® inFact is that it generates the tests needed to meet those coverage goals more efficiently and more predictably, helping the team meet its schedule requirements. 
by Matthew Ballance, Verification Technologist, Mentor Graphics Corporation
The challenges inherent in verifying today's complex designs are widely understood. Just identifying and exercising all the operating modes of one of these designs can be difficult, and creating tests that exercise all of those input cases is likewise labor-intensive. With a directed-test methodology, the engineering effort needed to design, implement, and manage the test suite makes it extremely hard to create sufficiently comprehensive tests to ensure design quality. 
by Emulation Division, Mentor Graphics Corporation
With the majority of designs today containing one or more embedded processors, the verification landscape is transforming as more companies grapple with the limitations of traditional verification tools. Comprehensive verification of multi-core SoCs cannot be accomplished without including the software that will run on the hardware. Emulation has the speed and capacity to do this before the investment is made in prototypes or silicon. 
by Mark Peryer, Verification Methodologist, Mentor Graphics Corporation
A common verification requirement is to reset a design part of the way through a simulation to check that it will come out of reset correctly and that any non-volatile settings survive the process. Almost all testbenches are designed to go through some form of reset and initialization process at their beginning, but applying reset at a mid-point in the simulation can be problematic. The Accellera UVM phasing subcommittee has been trying to resolve how to handle resets for a long time and has yet to reach a conclusion. 
by Christine Lovett, Bryan Ramirez & Stacey Secatch, Xilinx, Inc.; Michael Horn, Mentor Graphics Corporation
Modern FPGA and ASIC verification environments use coverage metrics to help determine how thorough the verification effort has been. Practices for creating, collecting, merging and analyzing this coverage information are well documented for designs that operate in a single configuration only. However, complications arise when parameters are introduced into the design, especially when creating customizable IP.
This article will discuss the coverage-related pitfalls and solutions when dealing with parameterized designs. 
by Stephan van Beek, Sudhir Sharma, and Sudeepa Prakash, MathWorks
Using FPGAs to process large test data sets enables engineers to rapidly evaluate algorithm and architecture tradeoffs. They can also test designs under real-world scenarios without incurring the heavy time penalty associated with HDL simulators. System-level design and verification tools such as MATLAB® and Simulink® help engineers realize these benefits by rapidly prototyping algorithms on FPGAs. 
by Rich Edelman, Raghu Ardeishar, John Amouroux, Mentor Graphics Corporation
This article covers simulator-independent debugging techniques that can be adopted as aids during testbench debug. Each technique is a small step toward better debug, but taken together they can compensate for tool limitations, staff training gaps, or other issues encountered when adopting the SystemVerilog language and verification libraries. 