by Tom Fitzpatrick, Editor and Verification Technologist
"I think it's important every once in a while to take a step back and appreciate the creativity, ingenuity and effort that has transformed the technology we experience in our everyday lives."
Hello, and welcome to the annual Super-Sized DAC Edition of Verification Horizons.
My 16-year-old son just came downstairs to my office (I'm blessed to work from home) to reboot our home network router so that our television could connect to the internet. This activity, which is probably somewhat mundane to most of you out there, hit me like the proverbial bolt of lightning as I stood at my desk (yes, I have a standing desk, which I highly recommend) trying to write this introduction. I am old enough to remember assisting my dad in replacing burned-out vacuum tubes to fix our television when I was a kid. He had a tube tester in his toolbox, which included a mirror so he could see the picture from behind the television to tell if everything was working properly. And then I looked around my office and saw my two flat-screen monitors, with a bigger and sharper picture than I could have imagined back then, not to mention my iPhone, Kindle Fire and laptop.
I think it's important every once in a while to take a step back and appreciate the creativity, ingenuity and effort that has transformed the technology we experience in our everyday lives. It's easy for us to get complacent about such things, and even take the next new release of our favorite gadget in stride. But since we know what goes into making these things, we should never lose that sense of wonder. I'm reminded of that wonder every time my kids show their grandparents a new iPhone app or show them a PowerPoint presentation they did for homework, and it energizes me to know that I've contributed, in even a small way, to that sense of wonder. I hope that thought and these articles will help you appreciate the magic¹ of what we do.
Our first article in this expanded DAC issue comes from two of my colleagues at Mentor Graphics, Josh Rensch and John Boone. In "Best Practices for FPGA and ASIC Development," Josh and John remind us of some key requirements, activities and deliverables for each phase of your development process. You can think of this article as some of the "incantations" necessary to make the magic happen.
I'm pleased to have the next article from another Mentor colleague (and one of my favorite presenters), my friend Rich Edelman. In "Visualizer™: Class-based Testbench Debugging using a New School Debugger," Rich talks about some of the challenges inherent in debugging class-based testbenches using UVM and SystemVerilog, and he shows you some of the "magic" available in Mentor's new Visualizer Debug Environment. We try not to be too product focused here at Verification Horizons, but every once in a while there's a new tool that I just have to share with you, and Visualizer is one of them.
Continuing with the idea of debug, our next article, from my colleague Russ Klein, shows us a debug-centric view of "Optimizing Emulator Utilization." Emulation-based debug always presents a tradeoff between interactivity and performance, particularly when the emulator is a shared resource for multiple groups in your company. In this article, you'll see how Codelink enables "trace-based debugging," which gives you the visibility you need while keeping the emulator running as fast as possible for everyone in your organization.
Next we have "MIPI LLI Verification Using Questa Verification IP," which explains how to take advantage of pre-verified verification components to verify complex protocols as part of your system. Using the LLI protocol as an example, the article shows the essential pieces of a UVM-based verification component and how you can utilize these pre-packaged components to get the most from your verification environment.
We round out the Mentor contributions in this edition with a follow-up from our "AMS Verification Dude," Martin Vlach. In "Stories of an AMS Verification Dude: Model Shmodel," Martin explains some of the different types of models used to represent analog blocks in a larger design. As with digital simulation, the amount of detail you actually model and simulate is inversely proportional to your performance, so a tradeoff must be made. In the case of analog functionality, since SPICE can be so slow, this tradeoff is even more important.
When it comes to knowing when you've reached your verification goals, tracking coverage using SystemVerilog covergroups can be extremely helpful. However, since no single test can reach all coverage goals in a reasonable amount of time, it is necessary to merge coverage results across multiple simulation runs to get an accurate picture of how you're doing. In "Merging SystemVerilog Covergroups by Example," our friends at Micron Technology show us how to set up covergroups, and where best to put them in our environment to make it easier to understand what's going on when results are merged. By taking you through several examples, exploring different options and approaches, this article does a great job of explaining how you can best utilize covergroups to ensure your verification success.

We start off our Partners' Corner section with an article from our friends at eInfoChips, "Functional Coverage Development Tips: Do's and Don'ts." In this article, we get into specific discussions of how to specify your covergroups and take advantage of different language features to maximize the efficiency of your coverage data collection. Then, in "Increasing Verification Productivity Through Functional Coverage Management Automation" from our friends at Arrow Devices, we see a new script-based approach to automate the customization of covergroups by generating the actual SystemVerilog code from a common specification, including the ability to override bins and other aspects of your covergroups.
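To make the covergroup idea concrete, here is a minimal, purely illustrative sketch (the class and signal names are invented, not drawn from any of the articles) of a covergroup embedded in a monitor-style class. Keeping the coverage model in one shared component, rather than scattering it across tests, is what lets merged results from many simulation runs line up bin-for-bin.

```systemverilog
// Illustrative sketch only: a covergroup embedded in a hypothetical
// monitor class, so every test samples the same coverage model and
// merged results from multiple runs align bin-for-bin.
class bus_monitor;
  bit [7:0] addr;

  covergroup addr_cg;
    option.per_instance = 0;  // accumulate coverage across instances
    cp_addr : coverpoint addr {
      bins low  = {[8'h00:8'h7F]};
      bins high = {[8'h80:8'hFF]};
    }
  endgroup

  function new();
    addr_cg = new();          // embedded covergroups are built in new()
  endfunction

  function void observe(bit [7:0] a);
    addr = a;
    addr_cg.sample();         // sample on each observed transaction
  endfunction
endclass
```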
Of course, you have to be able to generate interesting stimulus to reach the goals specified in your covergroups. In "Use of Iterative Weight-Age Constraint to Implement Dynamic Verification," our friends at TrueChip explore some interesting ways to combine loops and variables in SystemVerilog constraints to generate interesting stimulus.

Rounding out this issue, our friends at Frobas leave us with a case study, "UVM Testbench Structure and Coverage Improvement in a Mixed Signal Verification Environment," in which they use a script-based approach to automate the generation of a UVM verification environment. Once the environment is properly set up, the article explores the use of inFact intelligent testbench automation to achieve better coverage results.
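The loop-plus-variable constraint idea mentioned above can be sketched in a few lines. This is a hypothetical example of my own, not code from the article: a `dist` constraint may reference a non-rand class variable, so a test can adjust the weight between `randomize()` calls inside a loop to steer the stimulus distribution dynamically.

```systemverilog
// Illustrative sketch only: names are invented. The dist weights
// reference a non-rand variable that the test updates each iteration.
class packet;
  rand bit [3:0] len;
  int short_weight = 90;  // adjusted by the test between iterations

  constraint len_dist_c {
    len dist { [0:7]  := short_weight,
               [8:15] := 100 - short_weight };
  }
endclass

// Usage: gradually shift the distribution toward long packets
// packet p = new();
// for (int i = 0; i < 10; i++) begin
//   p.short_weight = 90 - i * 8;
//   void'(p.randomize());
// end
```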
I'd like to thank the authors who contributed to this edition of Verification Horizons, and thank you for reading. If you're at DAC, please stop by the Mentor Graphics or Verification Academy booth to say hi. And remember, every once in a while, to take a step back and appreciate the opportunity we get to do a little magic.
Editor, Verification Horizons
¹ “Any sufficiently advanced technology is indistinguishable from magic.” — Arthur C. Clarke