by Tom Fitzpatrick, Editor and Verification Technologist
"If you're at DAC this year, please stop by the Verification Academy booth (#2408) to say "hi." "
Now that it's May here in New England, I'm sure you'll all be happy to know that, even after our record-setting snowfall this past winter, there is, in fact, no more snow on my lawn. When this statement became a fact is still a source of amazement, but I'm just glad the snow is gone.
Of course, that means it's time to get the lawn in shape, and I'm faced with the unfortunate situation that my 17-year-old son is too busy (with homework and standardized tests) to be of much help these days. Still, I'd rather ride the tractor than shovel snow any day, so I can't really complain. Speaking of greener grass, the lawn is lush on this issue's side of the fence.
We start out this issue with two articles about verifying airborne electronic hardware under the DO-254 standard. The first article comes from our friends at eInfochips, who focus on an assertion-based approach in "Verifying Airborne Electronics Hardware: Automating the Capture of Assertion Verification Results for DO-254." In particular, this article deals with how to maintain requirements traceability—a key part of the DO-254 process—when using assertions. The next article, from Verisense, shows how to apply a UVM-based testbench with a hardware tester validation platform to achieve "DO-254 Testing of High Speed FPGA Interfaces." As you'll see, the key is to reuse UVM-based simulation stimuli and results while relying on the hardware to ensure maximum flexibility for "robustness testing," another critical element of DO-254 compliance.
We continue with the theme of assertion-based verification in "Formal and Assertion-Based Verification of MBIST MCPs," a joint article from FishTail Design Automation and Mentor Graphics. Here we see how to apply formal verification to confirm the validity of multi-cycle paths in MBIST controllers. We will also learn how generating assertions for failing paths can help identify issues and improve the robustness of what is usually a very complex operation.
We follow this up with an article from our friends at Oski who show us the importance of "Starting Formal Right from Formal Test Planning." Many of us have talked for years about the importance of verification planning, and rightly so. This article does a great job of extending the ideas typically used in a simulation-based verification plan and applying them specifically to the job of planning formal verification. If formal verification is in your future, you'll find this article very interesting and useful.
In our next article, our friends at MathWorks show us how to "Reuse MATLAB® Functions and Simulink Models in UVM Environments with Automatic SystemVerilog DPI Component Generation." They also explain how their HDL Verifier™ facilitates the co-simulation of MATLAB models in UVM environments. The trick is that HDL Verifier™ can now generate SystemVerilog DPI components directly, which are then integrated into UVM components. In effect, it lets you embed the specification (in the form of the algorithmic MATLAB model) directly into your testbench, improving the reliability of verification.
Staying in the UVM arena for a bit, we have an article from our friends at Codasip®, who share their thoughts on "Intelligent Testbench Automation with UVM and Questa®."
This involves automatically generating the HDL representation of an application-specific instruction-set processor (ASIP) along with a UVM-based verification environment—including the ASIP reference model. A genetic algorithm is used to evolve the stimulus and optimize coverage.
We're all familiar with the concept of unit testing, but usually we only think about it when applied to the DUT. Our next article, from Neil Johnson of XtremeEDA and Mark Glasser of NVIDIA, shows how to expand on this by "Unit Testing Your Way to a Reliable Testbench." As with design-unit testing, the idea of verifying each unit of the eventual testbench by itself—rather than waiting until everything gets thrown together and iterating between "testbench bugs" and "design bugs" (a distinction that isn't always clear)—makes a lot of sense.
Next, we bring back Dr. Lauro Rizzatti with part two of his three-part article on "Hardware Emulation: Three Decades of Evolution." This section covers the expansion of emulation beyond verifying graphics and processors and the growth of FPGA-based emulators.
We close out our partner articles with "Accelerating RTL Simulation Techniques" from our friends at Marvell Semiconductor. I think you'll find this an extremely practical "how-to" article that identifies some subtle performance-sapping coding styles that unfortunately are all too common. It's always important to remember that the fastest simulator in the world can be slowed down by poorly written code, so please pay attention.
We wrap up this DAC edition of Verification Horizons with two articles from my colleagues here at Mentor Graphics. In "Emulation Based Approach to ISO 26262 Compliant Processors Design," the author shows us how to apply fault-injection to processor verification in automotive applications—a critical element in any notion of a self-driving car (at least any such car I'd consider buying). And last but not least, we have "Resolving the Limitations of a Traditional VIP for PHY Verification," in which we see how we can assemble a protocol-specific kit of verification components and stimulus to ensure that the PHY verification is self-contained and won't take away from your system verification when it's part of your SoC.
As always, if you're at DAC this year, please stop by the Verification Academy booth (#2408) to say "hi." It's always gratifying to hear from so many of you about how helpful you find both the Verification Academy and this newsletter. I'm proud to be able to help bring both of them to you.
Editor, Verification Horizons