by Vipul Patel, ASIC Engineer, eInfochips
This article focuses on the Assertion-Based Verification (ABV) methodology and discusses automation techniques for capturing verification results to accelerate the verification process. It also shows how requirement traceability can be maintained using assertions to comply with the mandatory DO-254 standard.
WHAT IS REQUIREMENT TRACEABILITY AND ITS IMPORTANCE FOR DO-254 COMPLIANCE?
When developing airborne electronics hardware (AEH), documenting the evidence linking verification results to requirements is an important compliance criterion for satisfying regulatory guidance for certification. Section 6.2 of RTCA/DO-254, “Design Assurance Guidance for Airborne Electronic Hardware,” defines the verification processes and the objectives that are required for safety assurance and correctness, as given below:
- Section 6.2.1(1) – Evidence is provided that the hardware implementation meets the requirements.
- Section 6.2.1(2) – Traceability is established between hardware requirements, the implementation, and the verification procedures and results.
- Section 6.2.1(3) – Acceptance test criteria are identified, can be implemented, and are consistent with the hardware design assurance levels of the hardware functions.
Assertion-Based Verification (ABV) methodology is increasingly used to handle the complexity of present-day AEH designs in the avionics industry. Requirements tracing using the ABV methodology can be accomplished by associating targeted functionality from the requirements with assertion execution results. The complete flow involves the simulation log, the assertion waveform, and assertion coverage, through several intermediate steps as shown below in Figure 1.
A “valid” should be followed by a “ready” signal occurring no more than three clocks after “valid” is asserted.
However, the effort spent in manual tracing can be significant and error-prone, which makes it a good target for automation. In this article, we describe a script that automatically captures the results of executing SystemVerilog assertions that are aligned to the requirements. Mentor’s Questa simulator stores the simulation results in simulation log files as well as in waveform (.wlf) files; the latter allow issues to be debugged quickly and visually in a waveform window. The automation script identifies the SystemVerilog assertions in the assertion implementation file, interprets user-defined arguments such as the assertion pass instance to capture and the width and height resolution of the assertion waveform snapshot (image file), and then prepares the assertion results. The script can also create an assertion snapshot image file and a smaller, targeted version of the assertion wave file, which together serve as evidence for traceability. Compared to manually capturing a large number of assertion results, the automated script saves a significant amount of time and reduces operator errors.
IMPORTANCE OF ASSERTIONS AND THEIR USABILITY AND TRACEABILITY IN AEH DEVICES
Assertion-Based Verification (ABV) is a methodology, where the designer and verification engineer can utilize “assertions” to verify design functionality and improve the overall verification productivity.
An assertion is a statement about a specific functional characteristic or property that is expected to hold for the design being verified. Generally, assertions are developed using the SystemVerilog assert, property, and sequence constructs. For example, consider a device with the requirement quoted above.
The following assertion can be developed to verify the above requirement:
// Property Specification
property p_ready_valid;
  @(posedge clk) disable iff (reset)
    valid |-> ##[1:3] ready;
endproperty

// Assertion Directive
a_ready_valid : assert property (p_ready_valid)
  $display("@%0dns Assertion Passed", $time);
else
  $display("@%0dns Assertion Failed", $time);
a_ready_valid is the assertion statement, which uses “assert” to verify the property. The name given before the “assert” keyword is the assertion label. This assertion uses the p_ready_valid property declared before it.
p_ready_valid is a property that samples the valid signal on each positive edge of the clock. Whenever valid is detected as asserted, the property verifies the requirement by checking the ready signal over the next one to three clock cycles. If the assertion finds the ready signal asserted within three clock cycles, it reports a PASS message in the log file and dumps the assertion checking states into the .wlf file, to be viewed in the waveform window.
If the assertion does not find the ready signal asserted within three clock cycles, it reports a FAIL message in the log file and likewise dumps the assertion checking states into the .wlf file for later viewing in the waveform window.
In general, a requirement can be verified by declaring an assertion using the “assert”, “property”, and “sequence” constructs of the language. A sequence describes an ordered series of signal assertions and de-assertions that must be followed, depending on the requirement. A property can combine multiple sequences, and the assert directive uses the property to verify the requirement and report the result.
QUESTA® TOOL OUTPUTS (SV ASSERTIONS, LOG MESSAGES, WAVEFORM DUMPS) AND THEIR USE FOR TRACEABILITY
The Questa tool outputs the assertion results in a simulation log file, a waveform file, and a UCDB file that records assertion coverage. The log message contains the final PASS or FAIL status of the assertion along with the user-defined debug message. The waveform file contains the assertion states (ACTIVE, INACTIVE, PASS, FAIL, etc.) along with the signals used in the assertion; it is loaded into the waveform window to help the verification engineer further analyze and debug device behavior. The UCDB file contains the assertion coverage information, which is viewed in the coverage report; this report includes details such as the assertion hit count and pass count. Together, these output files can be used to provide traceability of the requirement.
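As a sketch of how the log file can feed traceability evidence, the following Python fragment tallies pass and fail instances from a simulation log. The message format is assumed to match the $display strings in the sample assertion directive above; it is illustrative, not a fixed Questa format.

```python
import re

# Matches messages such as "@100ns Assertion Passed" emitted by the
# $display calls in the sample assertion directive (format assumed).
MSG_RE = re.compile(r"@(\d+)ns Assertion (Passed|Failed)")

def tally_assertion_results(log_lines):
    """Return {'Passed': [times...], 'Failed': [times...]} from log lines."""
    results = {"Passed": [], "Failed": []}
    for line in log_lines:
        m = MSG_RE.search(line)
        if m:
            results[m.group(2)].append(int(m.group(1)))
    return results

log = [
    "# @100ns Assertion Passed",
    "# @250ns Assertion Failed",
    "# @400ns Assertion Passed",
]
print(tally_assertion_results(log))
# {'Passed': [100, 400], 'Failed': [250]}
```

The recorded simulation times are exactly what the automation script described below needs in order to locate a specific pass instance in the waveform.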
Figure 2: Assertion snapshot for pass assertion
Figure 3: Assertion snapshot for fail assertion
Figure 4: Automation Script Input-Output
AUTO CAPTURE OF ASSERTION RESULTS
Developing and implementing an automated script is a step-by-step process.
Step 1: Develop Script
First, a generic assertion-capture automation script is created to parse an assertion input file. A scripting language such as Perl, shell, or Tcl can be used to develop the script.
Step 2: Prepare Assertion Input File
Create an assertion input file and include information such as the clock period, reset period, assertion module name, assertion ID, assertion implementation file name (.sv file), simulation log file name, and simulation .wlf file name (in case the log and .wlf files have different names), as well as the assertion pass count. What is the assertion pass count?
An assertion typically triggers multiple times during simulation. The assertion pass count therefore lets the user obtain the snapshot for a specific pass (or fail) instance of the assertion, as illustrated below in Figure 5.
Figure 5: Sample of Assertion Input file showing pass count
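The exact layout of the assertion input file is up to the script author. As a sketch, assuming a simple `key = value` format carrying the fields listed in Step 2 (all field names here are illustrative), a Python parser might look like:

```python
def parse_assertion_input(text):
    """Parse a simple 'key = value' assertion input file (format assumed)."""
    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        key, _, value = line.partition("=")
        cfg[key.strip()] = value.strip()
    return cfg

sample = """
# hypothetical assertion input file
clock_period = 10ns
assertion_id = a_ready_valid
impl_file    = checker.sv
sim_log      = sim.log
wlf_file     = vsim.wlf
pass_count   = 2
"""
cfg = parse_assertion_input(sample)
print(cfg["assertion_id"], cfg["pass_count"])  # a_ready_valid 2
```

A flat key/value file keeps the script tool-agnostic; the same parser works whether the file is written by hand or generated from a requirements database.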
Step 3: Parsing Input file by Script
The script uses the assertion ID to search for the corresponding assertion label (refer to the sample assertion code above for what an assertion label is) in the assertion implementation file. This label is subsequently used for further processing. The script can also be designed to accept either a single assertion ID or multiple assertion IDs.
If an assertion ID does not match any assertion label in the implementation file, the script notifies the user with an error message. In this way, the script also checks that the assertions have actually been implemented.
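The label lookup can be sketched with a regular expression over the implementation file. The pattern below assumes labels follow the `label : assert property (...)` style of the sample code; real code may need a more tolerant pattern.

```python
import re

# Matches "some_label : assert property" at the start of a line (style assumed).
LABEL_RE = re.compile(r"^\s*(\w+)\s*:\s*assert\s+property", re.MULTILINE)

def find_assertion_label(sv_text, assertion_id):
    """Return the label matching assertion_id, or raise if not implemented."""
    labels = LABEL_RE.findall(sv_text)
    if assertion_id not in labels:
        raise ValueError(f"assertion ID '{assertion_id}' not found "
                         f"(labels present: {labels})")
    return assertion_id

sv = """
a_ready_valid : assert property (p_ready_valid)
  $display("@%0dns Assertion Passed", $time);
"""
print(find_assertion_label(sv, "a_ready_valid"))  # a_ready_valid
```

Raising an error on a missing label is what gives the script its secondary role as an implementation check: an ID present in the input file but absent from the .sv file is flagged immediately.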
Once the assertion labels are identified, the simulation log file is searched for pass or fail messages, selected according to the assertion pass count. The script greps the simulation time from the matching message; this time is used to determine the width and height resolution of the assertion waveform snapshot (image file).
The script automatically chooses an appropriate snapshot height based on the number of signals used in the assertion implementation, and an appropriate width based on the simulation time extracted from the log file. If the user requires extra width or height, it can be specified in the assertion input file.
Using this information (the computed width and height, the signal list, etc.), the script captures the assertion waveform snapshot (image file).
The next step is to generate a small, targeted version of the assertion results as a .wlf file. The full simulation .wlf file is the input, and the assertion start and end times are used to extract the trimmed assertion .wlf file.
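Trimming the .wlf file can be driven from the script by invoking Questa's wlfman utility, which the article mentions later as a script dependency. The sketch below only builds the command line; the flag names shown are assumptions and should be checked against the wlfman documentation for the tool version in use.

```python
def wlfman_trim_cmd(in_wlf, out_wlf, start_time, end_time):
    """Build a wlfman command to extract a time window from a .wlf file.
    Flag names are illustrative; consult the Questa wlfman docs."""
    return ["wlfman", "filter",
            "-b", str(start_time),  # assumed begin-time flag
            "-e", str(end_time),    # assumed end-time flag
            "-o", out_wlf, in_wlf]

cmd = wlfman_trim_cmd("vsim.wlf", "a_ready_valid.wlf", "100ns", "130ns")
print(" ".join(cmd))
# wlfman filter -b 100ns -e 130ns -o a_ready_valid.wlf vsim.wlf
```

Building the command as a list (rather than a shell string) lets the script hand it straight to subprocess.run without quoting problems, and makes it easy to adapt when the tool's option names change.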
HOW TO APPLY A SCRIPT?
Once regression is complete, an assertion input file is created containing the information specified in Step 2: Prepare Assertion Input File.
On running the script, it processes the input attributes and captures the results in an image file (.bmp) and a waveform file (.wlf). The resulting snapshots are used to create the verification result document and to provide traceability to requirements, as shown in Figure 1: Steps in Requirements Tracing for DO-254 using Assertions.
Figure 6: Example screenshot of Assertion Result .bmp file (image file)
BENEFITS AND LIMITATIONS OF THE AUTOMATION SCRIPT
- Automates the tedious process of capturing assertion results
- Saves time and manual effort
- Drastically reduces the probability of errors
For example, misidentifying the active and pass points in the result, hiding signals, or assessing the snapshot size inaccurately.
- Assertion capture script is reusable in different verification projects
- Changes in the simulation tool version or its features will require corresponding changes in the script
For example, the script relies on simulation tool commands such as wlfman; if such a command is removed or altered, the script must be updated to accommodate the change. There are many such simulator commands whose modification would likewise require altering the script.