Verification Horizons Articles:
by Tom Fitzpatrick, Siemens EDA
Welcome to the DVConUS 2022 issue of Verification Horizons. As I may have mentioned previously, I am an avid golfer. Unfortunately, since I live in Massachusetts, my golfing opportunities this time of year are somewhat limited. I will admit to having a putting mat in my home office that I use regularly, but I don’t have room to swing any of my other clubs. As a result, it usually takes me some time at the beginning of each season to get my swing back in a groove. On top of that, I was unable to improve my scores last year as much as I had hoped, so this winter, I’ve adopted a new strategy: simulation!
That’s right. I’ve actually figured out how to relate my golf game to functional verification. As we know, if you can’t detect a problem, you can’t fix it. And if you can’t measure something, you can’t improve it. So, just as we have simulators and other tools for functional verification, I started using a simulator and taking lessons to improve my golf game. A new indoor golf facility just opened near my house, so now I can not only hit balls every weekend, but the simulator tracks and measures every aspect of my swing, so I know what I need to improve. I have learned two very important lessons. The first is that I am doing things of which I was unaware. The second is that there are some things I thought I was doing, but it turned out that I was doing something else. I won’t bore you with the details, but I have managed to add distance and accuracy (when I swing correctly) to my shots, which is the whole point. Being able to hit regularly over the winter and see my progress has me excited for the coming season.
by Dhruv Garg, Siemens EDA
In this era of digitalization, we can manage most of our personal affairs online, such as handling bank transactions, ordering clothes, and booking cab rides. The COVID-19 pandemic has pushed us even closer to digitalization: we now order groceries online and entertain ourselves through the plethora of content available on various streaming platforms. Inevitably, there has been significant growth in both the amount of data available online and the number of users consuming it. This growth makes it vital to look ahead for a storage mechanism that can hold this data without degrading the user experience.
NVMe has come to dominate the storage interface in recent years by addressing these consumer needs. Its advantages include low latency, scalability, and high performance. These advantages stem from NVMe's paired submission/completion queuing mechanism. Unlike SATA's AHCI interface, which offers a single command queue of 32 entries, NVMe supports up to 65,535 I/O queues, where each queue can hold up to 65,535 commands. This massive parallelism dramatically increases achievable throughput. NVMe also provides non-contiguous queues, metadata that can be used for end-to-end data protection, arbitration mechanisms, and much more.
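The paired-queue flow described above can be sketched in a few lines of Python. This is a hypothetical illustration of the command flow only (the `QueuePair` class and its methods are invented for this sketch); real NVMe queues live in host memory and are signaled through doorbell registers, which are not modeled here.

```python
from collections import deque

# NVMe allows up to 65,535 entries per I/O queue.
MAX_QUEUE_DEPTH = 65_535

class QueuePair:
    """Toy model of one NVMe submission/completion queue pair."""

    def __init__(self, qid, depth=MAX_QUEUE_DEPTH):
        self.qid = qid
        self.depth = depth
        self.sq = deque()  # submission queue: host -> device commands
        self.cq = deque()  # completion queue: device -> host results

    def submit(self, command):
        """Host side: place a command on the submission queue."""
        if len(self.sq) >= self.depth:
            raise RuntimeError(f"submission queue {self.qid} is full")
        self.sq.append(command)

    def process(self):
        """Device side: drain the SQ, posting one completion per command."""
        while self.sq:
            cmd = self.sq.popleft()
            self.cq.append({"cid": cmd["cid"], "status": "success"})

    def reap(self):
        """Host side: collect and clear all posted completions."""
        done = list(self.cq)
        self.cq.clear()
        return done

qp = QueuePair(qid=1)
qp.submit({"cid": 1, "opcode": "read"})
qp.submit({"cid": 2, "opcode": "write"})
qp.process()
print([c["cid"] for c in qp.reap()])  # [1, 2]
```

Because each queue pair is independent, a host can dedicate one pair per CPU core and submit commands without cross-core locking, which is where much of NVMe's parallelism comes from.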
by Rich Edelman, Siemens EDA
The Visualizer Debug Environment is the common user interface for debug, analysis, and verification across all of our functional verification tools.
Visualizer is first and foremost a waveform debugger. It also includes source code debug, transaction debug, C debug, driver tracing, X tracing, schematics, glitch debug, low power debug, and coverage analysis and debug – all supporting Verilog, SystemVerilog, VHDL, SystemC, and C/C++. It supports debugging simulation with Questa, emulation with Veloce, and prototyping with VPS.
Visualizer supports live simulation debug and post-simulation debug.
by Eric Cigan, MathWorks with Jake Wiltgen, Siemens EDA
Engineers can use Model-Based Design for requirements analysis, algorithm design, automatic HDL code generation, and verification to produce airborne electronic hardware that adheres to the DO-254 standard. The proposed Model-Based Design approach for DO-254 combines tools from MathWorks® and Siemens EDA for both design and verification. This workflow supports development phases from concept through implementation, streamlining development and reducing costs.
Simulink® from MathWorks is the starting point to enable Model-Based Design within this process. Simulink allows engineers to manage requirements and test sets, develop architectural and behavioral models, perform formal verification, ensure conformance to modeling standards, and generate design and verification code for VHDL and Verilog.
by Michelle Lange and Tammy Reeve, Patmos Engineering Services with Jacob Wiltgen, Siemens EDA
Tools used in the design and verification of electronics have played a massive role in the dramatic evolution of these devices over the past few decades. After all, there is a limit to the amount of work and detail that even a good aerospace engineer can handle, but add the use of tools, and the sky (pun intended) is the limit.
While the use of state-of-the-art development tools has led to ever-increasing design complexity, the use of modern verification tools has, at the same time, made these complex designs more reliable. In addition, lifecycle management tools have facilitated the management of both the development process and data. All these types of tools have been essential in modern avionics development.
While tools make amazing designs possible, what happens when you need to “Qualify” these tools? What does that even mean? How much work is it? Is it worth it? These are common questions asked by tool users subject to RTCA/DO-254 compliance. Companies like Siemens, who provide tools that are of great benefit to the goal of safety (such as in the aerospace domain), must understand and support their tools in the context of these programs. This article describes the terminology and requirements related to tool qualification specific to the safety-critical programs governed by DO-254 compliance. It also provides practical examples of tool qualification processes and strategies for commonly used tools.
by Ben Cohen, SystemVerilog Assertions Expert
In my years of contributing to the Verification Academy SystemVerilog Forum, I have seen trends in real users' difficulties: applying assertions, expressing requirements, choosing angles of attack for verification, misunderstanding how SVA works, and confusion as to which SVA option to use.
In this first episode in Verification Horizons, I address two aspects of users' difficulties with SVA: 1) expressing requirements as assertions; and 2) SVA concepts concerning terminology, threads, and vacuity. In future episodes, I'll address topics related to: 1) the usage of the four relationship operators throughout, until, intersect, and implies; 2) workarounds for illegal SVA operations such as dynamic and range delays and repeats, dynamic $past of clock ranges, and stability of signals with no clocking events; 3) the verification of uniqueness in each attempted assertion; 4) the understanding of the sampling regions and their impact on expressing immediate and concurrent assertions and their action blocks; 5) approaches to testing assertions; and 6) the key dos and don'ts of writing assertions for formal verification.
by Vicente Bergas, Andrew Robertson, and Marco Denicolai, Bitec
Co-simulating systems that include both RTL and software can require excessive computation time if a cycle-accurate CPU model is used. However, many co-simulation exercises do not actually require a precise CPU model and may benefit from the solution proposed here.
This article presents a little-used method of co-simulation that needs no cycle-accurate CPU simulation model and reduces simulation time while still allowing functional testing of software and RTL. A newly designed library uses the VPI hooks offered by Questa to bridge the DUT's RTL and the software, allowing the whole system to be emulated without the overhead of a CPU model. We demonstrate the proposed technique using the Bitec DisplayPort IP and its API software library.
by Espen Tallaksen, EmLogic
Verification consumes half of a typical FPGA project's development time. It is possible to reduce this time significantly, with only minor adjustments and no extra cost, while dramatically increasing the reusability of testbench components. An FPGA design's architecture – from the top level down to the microarchitecture – is critical to both FPGA quality and development time. The same is true of the testbench. For example, the open-source UVVM (Universal VHDL Verification Methodology) reduces verification time significantly while improving testbench reuse and product quality. UVVM provides the best VHDL testbench approach possible.
Its straightforward and powerful architecture allows designers to build test harnesses and test cases significantly faster. Equally important, UVVM has a unique reuse structure, making it a game-changer for efficient FPGA development. The FPGA community has rapidly adopted UVVM. According to the Wilson Research Group Functional Verification Study from September 2020, UVVM was the world’s number one and fastest-growing VHDL verification methodology. UVVM is also the leading VHDL verification methodology for ASIC development. This article will introduce you to UVVM and show you how simple it is to understand and how easy it is to use. We will also show you how UVVM helps you write better VHDL testbenches while reducing your workload.