Simulation
Simulation plays a pivotal role in the digital design and verification process. Its primary purpose is to validate whether the design being created functions according to the specified requirements. By running simulations early in the design phase, potential issues can be identified, thus minimizing the need for extensive code revisions. Simulations can be performed at different levels of abstraction and at various stages throughout the design process.
Simulation Sessions
Simulating AMD’s Next-gen Versal Adaptive SoC Devices using QuestaSim
This session provides an in-depth guide to running simulation flows for a Versal Adaptive SoC. It also delves into QEMU, the open-source system emulator, and its co-simulation interface with Questa, and demonstrates how to conduct a system simulation of a Versal example design, showcasing Questa’s support for system simulation of Versal designs based on the Vitis™ hardware emulation flow.
Win the Tick to Trade Race by Root Causing Bugs Faster with QuestaSim
Root causing RTL design or simulation testbench bugs can be a tedious process, especially when relying solely on traditional waveform viewing and debug. It can also be costly if more sophisticated debug ties up precious simulation resources during the debug process.
Productivity in the Questa Simulation Flow
In this session, you will learn how every step of the Questa simulation-based verification flow has been optimized and accelerated, from regression management to incremental compilation and elaboration to debug and coverage.
UVM Simulation of MathWorks® Designs at Block, Subsystem, and Chip Level
This session is a customer presentation on their experience using the UVMF and MathWorks® integration in block-, subsystem-, and chip-level simulations.
Reduce Gate-level Simulation Bring-up Time with Semi-formal X Verification
This session describes a reliable formal-based method to manage Xs in GLS, centered on the use of the Siemens Avery SimXACT solution alongside your preferred simulator.
Enterprise Debug for Simulation
In this session, you will learn more about common debug challenges and modern debug solutions.
Questa Simulation - Power Aware
In this demo, you will learn about the UPF-based Power Aware features available in Questa PASim.
Questa Simulation
This session will demonstrate how assertions can be used in simulation.
Simulation Overview
From an abstraction perspective, simulations can be performed at the register-transfer level (RTL) and at the gate level. Gate-level simulation can be further subdivided into timing and zero-delay simulation, depending on whether timing information is considered during the simulation process.
In the course of electronic system design, simulations are performed at different scopes: block level and system level. Block-level simulations are relatively short in runtime and less intricate than system-level simulations, which encompass the entire system under test and typically involve a verification framework to drive the inputs and observe the outputs.
Simulation Metrics
Looking only at the outputs of the Design Under Test (DUT) reveals little about the quality of the code written. Some type of metric is needed to determine whether the DUT performs according to the specification and whether there are unforeseen issues in the code.
Code Coverage
One of the main and most commonly used metrics is code coverage. Code coverage is the measure of how much of the code written is executed during the simulation. There are various types of code coverage, and each of them gives a different perspective on which parts of the code are exercised during the simulation; a short annotated example follows the list below.
Statement coverage shows whether a line in the code is executed or not.
Branch coverage shows whether all the possible paths through a branch (if-else or case statements) have been executed during the simulation.
Expression coverage shows whether all the possible ways an expression can be assigned have been exercised.
Toggle coverage shows whether all the bits of a bus have toggled between 0 and 1.
Extending on this, state machine coverage provides metrics for the state machines in the design, offering information on the states visited, the transitions between states, and other relevant aspects.
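To make these types concrete, here is a small, hypothetical SystemVerilog module (alu_lite and its ports are illustrative only), annotated with the question each coverage type asks about it:

module alu_lite (
  input  logic       clk,
  input  logic [1:0] op,
  input  logic [7:0] a, b,
  output logic [7:0] y
);
  always_ff @(posedge clk) begin
    case (op)              // branch coverage: were all four arms taken?
      2'b00: y <= a + b;   // statement coverage: was this line executed?
      2'b01: y <= a - b;
      2'b10: y <= a & b;   // expression coverage: were all the ways each
      2'b11: y <= a | b;   //   assignment can evaluate exercised?
    endcase                // toggle coverage: did every bit of y flip 0<->1?
  end
endmodule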
Functional Coverage
Functional coverage plays a crucial role in the verification of electronic systems. Unlike code coverage, which focuses on the lines of code being executed, functional coverage is a user-defined metric that assesses how much of the design specification is covered during simulation. Verification engineers find functional coverage particularly valuable because it allows them to explicitly define coverage models. By doing so, they can verify the exact behavior of the design and ensure all expected values are observed during simulation.
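As a minimal sketch of what such a user-defined coverage model can look like in SystemVerilog – the packet fields, bin names, and value ranges here are hypothetical, not taken from any particular specification:

class packet;
  rand bit [7:0] len;   // payload length
  rand bit [1:0] kind;  // packet type
endclass

// Coverage model: which lengths and kinds were observed, and in which combinations?
covergroup pkt_cov with function sample (bit [7:0] len, bit [1:0] kind);
  len_cp : coverpoint len {
    bins empty     = {0};
    bins short_pkt = {[1:15]};
    bins long_pkt  = {[16:255]};
  }
  kind_cp    : coverpoint kind;           // all four kinds seen?
  len_x_kind : cross len_cp, kind_cp;     // every length/kind combination seen?
endgroup

module tb;
  initial begin
    packet  p   = new();
    pkt_cov cov = new();
    repeat (100) begin
      void'(p.randomize());
      cov.sample(p.len, p.kind);  // record the values this packet carried
    end
  end
endmodule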
Assertions
Assertions provide a powerful means to verify and monitor the behavior of a design. By incorporating assertions into the design, one can express specific rules based on the design specification. These rules act as checks that the simulation tool evaluates during simulation runs. If an assertion fails – meaning the specified condition is not met – the tool reports the violation. This feedback allows one to investigate why the design is not functioning as intended.
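As a brief illustration, the SystemVerilog assertion below encodes the hypothetical rule “every request must be granted within one to four clock cycles”; the signal names are assumptions, not from a real design:

module req_gnt_checker (input logic clk, rst_n, req, gnt);
  property p_req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] gnt;   // after req, gnt must arrive within 1 to 4 cycles
  endproperty

  // The simulator evaluates the property on every clock and reports violations.
  a_req_gets_gnt : assert property (p_req_gets_gnt)
    else $error("req was not granted within 4 cycles");
endmodule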
Constrained-Random Stimulus
The quality of the verification of the DUT relies heavily on the type of stimulus driving the inputs. Manually toggling inputs through all possible combinations and sequences is impractical due to the sheer complexity of modern designs. Therefore, there needs to be a practical approach to drive input ports.
Traditionally, verification engineers wrote directed tests – tests that explicitly set input values and change them at specific times. This approach becomes very cumbersome as the design grows more complex. To address this, random number generation was introduced. Randomly cycling through all the possible input combinations seems like a solution, but it quickly becomes impractical: the number of possible combinations is astronomically large, leaving verification engineers running long simulations and hoping to catch corner cases.
A good middle ground between directed tests and random stimulus is constrained-random stimulus generation. By constraining random value generation to the values the specification allows for each input, we can significantly reduce the number of input combinations and thereby the total simulation time.
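A minimal SystemVerilog sketch of the idea, assuming a hypothetical bus transaction whose specification permits only word-aligned accesses of 1, 2, or 4 bytes within a 4 KB address window:

class bus_txn;
  rand bit [31:0]   addr;
  rand int unsigned nbytes;

  constraint c_window  { addr inside {[32'h0000_1000:32'h0000_1FFF]}; }  // legal window
  constraint c_aligned { addr[1:0] == 2'b00; }                           // word aligned
  constraint c_size    { nbytes inside {1, 2, 4}; }                      // legal sizes
endclass

module tb;
  initial begin
    bus_txn t = new();
    repeat (10) begin
      if (!t.randomize()) $fatal(1, "randomization failed");
      $display("addr=%h nbytes=%0d", t.addr, t.nbytes);  // only legal stimulus appears
    end
  end
endmodule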
Verification Methodologies
In today’s complex digital designs, a good verification methodology plays a vital role in determining how effectively and efficiently your simulations run. Different verification methodologies exist, tailored to the programming languages you are most comfortable with.
The Universal Verification Methodology (UVM) is the go-to choice for SystemVerilog users. It is widely adopted and offers guidelines, libraries, and tools for creating reusable and scalable test environments.
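For a sense of the structure UVM imposes, here is a minimal, hypothetical test – just enough to compile against the standard UVM library and print a message during the run phase:

`include "uvm_macros.svh"
import uvm_pkg::*;

class smoke_test extends uvm_test;
  `uvm_component_utils(smoke_test)   // register the test with the UVM factory

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  virtual task run_phase(uvm_phase phase);
    phase.raise_objection(this);     // keep the simulation alive
    `uvm_info("SMOKE", "Hello from a minimal UVM test", UVM_LOW)
    phase.drop_objection(this);      // allow the simulation to end
  endtask
endclass

module tb;
  initial run_test("smoke_test");    // UVM selects and runs the named test
endmodule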
VHDL users can leverage UVM as well. By integrating UVM with VHDL, they benefit from established practices while staying within their preferred design language. For those committed to exclusive use of VHDL, there are UVVM (Universal VHDL Verification Methodology) and OSVVM (Open Source VHDL Verification Methodology).
Python enthusiasts can harness cocotb – a coroutine-based co-simulation testbench environment that uses a simulator’s Programming Language Interface (PLI) to drive stimulus from a Python environment to the underlying SystemVerilog or VHDL DUT.