by Tom Fitzpatrick - Mentor, A Siemens Business
As many of you who have been long-time readers of Verification Horizons know, I usually like to spend a little time in these Editor’s Notes relating an anecdote about my family and tying it back to a trend or theme in our industry related to the articles in that issue. But this time around, I feel moved to write about two members of our EDA family who were taken from us too soon this year.
Stu Sutherland passed away this summer, shortly after DAC. I first met Stu over twenty years ago when we were both working on what was then Verilog-1995. Since that time, we continued to collaborate on standards, including IEEE 1364 and IEEE 1800. In addition to being a key technical contributor to these standards, I’m not sure that many people outside of that process know just how much work Stu actually did as technical editor in merging 1364 and SystemVerilog 3.0 from Accellera into IEEE 1800. The word “Herculean” might come close to describing it. His efforts continued as editor through the most recent release of IEEE 1800. Stu was one of the premier independent Verilog educators in our industry as well as a prolific author of books and technical papers. I had the honor of co-authoring a few papers with Stu over the years. To all of these accomplishments, Stu always brought an unfailing sense of decency, humility and serenity that one could not help but treasure. To be able to have called Stu a friend as well as a colleague is truly one of the highlights of my career.
by Harry Foster - Mentor, A Siemens Business
There have been multiple studies on IC/ASIC functional verification trends published over the years. However, there are no published studies specifically focused on Field-Programmable Gate Array (FPGA) verification trends. To address this dearth of information, this article highlights a few key FPGA findings from the 2018 Wilson Research Group Functional Verification Study. The findings from this study provide invaluable insight into the state of functional verification in today's FPGA market.
IC/ASIC projects have often used the metric “number of required spins before production” as a benchmark to assess a project’s verification effectiveness. Historically, about 30% of IC/ASIC projects are able to achieve first silicon success, and most successful designs are productized on the second silicon spin. Unfortunately, FPGA projects have no equivalent metric. As an alternative to IC/ASIC spins, our study asked the FPGA participants “how many non-trivial bugs escaped into production?” The results shown in Fig. 1 are somewhat disturbing. In 2018, only 16% of all FPGA projects were able to achieve production with no bug escapes, a worse outcome than the IC/ASIC first silicon success rate. And for some market segments, the cost of repairing an FPGA in the field can be significant.
by Matthew Ballance - Mentor, A Siemens Business
When using the Universal Verification Methodology (UVM), sequences are the primary mechanism by which stimulus is generated in the testbench. Sequences come in two flavors: simple sequences for driving a single interface, and virtual sequences that control more complex behavior. Simple sequences tend to work with a single sequence item, while virtual sequences often spawn off multiple sub-sequences to accomplish their intended task. Good virtual sequences are challenging to create, and even more challenging to reuse in a way not explicitly intended by the original author. Portable stimulus can make creating virtual sequences easier, increase the verification value achieved by running these virtual sequences, and enable more reuse of the description used to create the virtual sequence. This article will walk through an example showing how portable stimulus applies to creating scenarios for a DMA engine.
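The virtual-sequence pattern described above can be sketched in a few lines of UVM SystemVerilog. All names here (`dma_vseq`, `mem2mem_seq`, `reg_item`, `reg_sqr`) are hypothetical illustrations, not identifiers from the article's actual example:

```systemverilog
// Minimal sketch of a virtual sequence that spawns a sub-sequence.
// Class, item, and sequencer names are hypothetical.
class dma_vseq extends uvm_sequence;
  `uvm_object_utils(dma_vseq)

  // Handle to the register-interface sequencer, set by the test
  uvm_sequencer #(reg_item) reg_sqr;

  function new(string name = "dma_vseq");
    super.new(name);
  endfunction

  virtual task body();
    mem2mem_seq seq = mem2mem_seq::type_id::create("seq");
    // Spawn the sub-sequence on the register-interface sequencer;
    // a more realistic virtual sequence would fork several of these
    seq.start(reg_sqr);
  endtask
endclass
```

The coordination logic lives entirely in `body()`, which is what makes virtual sequences both powerful and hard to reuse: the scenario is hard-coded procedurally rather than described declaratively, the gap portable stimulus aims to close.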
The DMA engine used in this article is a relatively simple 8-channel DMA engine. It has a register interface for programming DMA transfers, and two master interfaces for the DMA engine to use for transferring data. Each DMA channel can either perform direct memory-to-memory transfers, or can use peripheral handshake signals to transfer data from memory to a peripheral device, or from a peripheral device to memory.
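A simple memory-to-memory transfer on such a device might be driven by a register sequence along these lines. The register offsets, field layout, and channel stride below are invented for illustration; the article does not give the device's actual register map:

```systemverilog
// Sketch of programming one channel of an 8-channel DMA engine
// for a memory-to-memory transfer. All offsets are hypothetical.
class dma_mem2mem_seq extends uvm_sequence #(reg_item);
  `uvm_object_utils(dma_mem2mem_seq)

  rand bit [2:0]  channel;   // one of the 8 DMA channels
  rand bit [31:0] src, dst;  // source / destination addresses
  rand bit [15:0] len;       // transfer length

  function new(string name = "dma_mem2mem_seq");
    super.new(name);
  endfunction

  virtual task body();
    // Hypothetical per-channel register block at stride 'h20
    write_reg(channel*'h20 + 'h00, src);    // source address
    write_reg(channel*'h20 + 'h04, dst);    // destination address
    write_reg(channel*'h20 + 'h08, len);    // transfer length
    write_reg(channel*'h20 + 'h0C, 32'h1);  // ctrl: mem-to-mem, start
  endtask

  // Helper that creates and sends one register-write item
  task write_reg(bit [31:0] addr, bit [31:0] data);
    reg_item item = reg_item::type_id::create("item");
    start_item(item);
    item.addr  = addr;
    item.data  = data;
    item.write = 1;
    finish_item(item);
  endtask
endclass
```

For a peripheral transfer the same sequence would also need to coordinate the handshake signals, which is exactly the kind of cross-interface scenario that motivates a virtual sequence, or a portable stimulus description, on top of it.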
by Madhur Bhargava and Awashesh Kumar - Mentor, A Siemens Business
The effective verification of low-power designs has been a challenge for many years now. The IEEE Std 1801-2015 Unified Power Format (UPF) standard for modeling low-power objects and concepts is continuously evolving to address the low-power challenges of today’s complex designs. One of the main challenges for low-power verification engineers has been the disconnect between traditional RTL objects and low-power objects: users cannot access and manipulate low-power objects the same way they do RTL objects. Low-power concepts are abstract, and complexities arise because multiple sources, such as UPF, HDL, and Liberty files, all contribute power intent to a low-power design. It has also been seen that the majority of verification time is spent debugging complex low-power issues, and users have few ways to make their designs self-checking. Because low-power architectures are complex and designs use many power domains, selective reporting on a part of a design is needed. The lack of an industry standard in this regard has resulted in inconsistencies in the different ad-hoc approaches adopted by different tool vendors.
To keep pace with the increasing complexity of low-power architectures, the IEEE 1801 standard is expanding its gamut of constructs and commands to cover more low-power verification and implementation scenarios. In this article we will present some innovative ways of writing Power Aware Apps using the UPF 3.0 information model HDL package functions and Tcl query functions. The article also demonstrates how these Power Aware Apps can help in reporting, debugging and self-checking low-power designs. We will also highlight how these apps offer an efficient way to significantly reduce verification effort and time.
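As a taste of what a testbench-side Power Aware App looks like, the sketch below drives a power-cycle scenario using the `supply_on`/`supply_off` functions of the IEEE 1801 UPF HDL package. The supply-net path and timing are hypothetical; the information-model query functions the article describes would then let a checker read back and verify the affected domain's state:

```systemverilog
// Minimal power-cycle sketch using the IEEE 1801 UPF package.
// The supply-net path "/tb/dut/VDD_ao" and delays are hypothetical.
import UPF::*;

module pa_power_cycle;
  initial begin
    supply_on ("/tb/dut/VDD_ao", 1.0);  // power up the domain supply
    #100;
    supply_off("/tb/dut/VDD_ao");       // power it down mid-test
    #100;
    supply_on ("/tb/dut/VDD_ao", 1.0);  // power it back up
    // A self-checking app would now query the information model to
    // confirm retention and isolation behaved as specified.
  end
endmodule
```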
by Sumit Vishwakarma - Mentor, A Siemens Business
The level of interaction between the analog and digital components in today’s mixed-signal SoCs is vastly more complex than it was in the past. The interplay between these domains is so integral to the functionality of the IC that it is no longer adequate to simulate analog and digital subsystems separately in a “divide and conquer” approach. Designers must simulate these two domains together, utilizing an array of advanced mixed-signal verification strategies to obtain the coverage closure required for first-pass silicon success.
While designers must simulate the analog and digital subsystems together, the simulation algorithms are fundamentally different. High-precision circuits, in many cases, require very accurate SPICE simulation to ensure proper operation, while digital circuits can rely on HDL simulators that run much faster. As a result, the analog simulation will typically dominate the overall system simulation time.