
Addressing VHDL Verification Challenges with OSVVM

Verification Horizons - July 2020, by Jim Lewis, SynthWorks Design, Inc.

INTRODUCTION

Most people don't think of VHDL as a verification language. However, with the Open Source VHDL Verification Methodology (OSVVM) utility and verification component libraries, it is. Using OSVVM, we can create readable, powerful, and concise VHDL verification environments (testbenches) whose capabilities are similar to those of other verification languages, such as SystemVerilog and UVM.

This article covers the basics of using OSVVM's transaction-based test approach to write directed tests, write constrained random tests, use OSVVM's generic scoreboard, add functional coverage, add protocol and parameter checks, add message filtering, and add test-wide reporting.

WHY VHDL? WHY OSVVM?

According to the 2018 Wilson Research Group Functional Verification Study [1]:

  • 62% of FPGA designs worldwide use VHDL
  • 17% of FPGA verification projects worldwide use OSVVM (or 38% of VHDL FPGA verification projects)
  • For Europe, 30% of FPGA verification projects use OSVVM while only 20% use UVM

This makes OSVVM the #1 VHDL FPGA verification methodology worldwide and the #1 FPGA verification methodology in Europe.

BENEFITS OF OSVVM

For the VHDL community, OSVVM is a clear win. We can write tests in the same language we already know and re-use components, tests, and testbenches from other projects. More importantly, OSVVM's transaction-based approach simplifies creating readable and reviewable tests (an important metric in the safety-critical community). In addition, OSVVM uses the same component/model-based approach used in RTL design. Hence, not only can RTL designers read tests and verification components, they can write them. While having independent design and verification teams is important, it is also important to be able to deploy engineers to either a design or a verification role on a project-by-project basis.

WHAT ARE TRANSACTIONS?

A transaction is an abstract representation of an interface operation (such as UART transmit) or directive (such as get transaction count). In OSVVM, a transaction is initiated with a procedure call. In the OSVVM verification component approach, the procedure places the transaction information into a record and passes it to the verification component. The component in turn executes the transaction and provides stimulus to the device under test (DUT).
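As a rough sketch of that mechanism (the record fields and handshake shown here are illustrative, not OSVVM's exact implementation):

procedure Send (
  signal   TransRec : inout UartRecType ;
  constant Data     : in    std_logic_vector
) is
begin
  TransRec.Operation   <= SEND ;   -- which transaction to run (illustrative field)
  TransRec.DataToModel <= Data ;   -- transaction arguments (illustrative field)
  -- Handshake with the verification component, which executes the
  -- transaction and drives the DUT interface (RequestTransaction is
  -- from OSVVM's TbUtilPkg).
  RequestTransaction(Rdy => TransRec.Rdy, Ack => TransRec.Ack) ;
end procedure Send ;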

The code and Figure 1 below show two calls to the Send procedure and the corresponding waveform produced by the UartTx verification component.

UartTbTxProc : process
begin
  WaitForBarrier(StartTest) ;
  Send(UartTxRec, X"4A") ;
  Send(UartTxRec, X"4B") ;
  . . .
end process UartTbTxProc ;

Figure 1 - Two calls to the Send transaction and the resulting waveform


Each verification component in the OSVVM library implements a set of model independent transactions. Table 1 gives a brief summary.

Table 1 - OSVVM Standard Transactions
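A representative subset of these transactions, written as calls (drawn from the OSVVM documentation; the record and parameter names are illustrative):

-- Streaming interfaces (UartTx, UartRx, AxiStream):
Send  (TransRec, Data) ;                    -- transmit Data
Get   (TransRec, Data) ;                    -- receive a value into Data
Check (TransRec, ExpectedData) ;            -- receive and check a value

-- Address-bus interfaces (Axi4Lite master):
Write     (TransRec, Addr, Data) ;          -- write Data to Addr
Read      (TransRec, Addr, Data) ;          -- read from Addr into Data
ReadCheck (TransRec, Addr, ExpectedData) ;  -- read and check

-- Directives (no interface activity):
WaitForClock        (TransRec, 2) ;         -- idle for two clocks
GetTransactionCount (TransRec, Count) ;     -- retrieve transaction count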


THE OSVVM TESTBENCH FRAMEWORK

The OSVVM testbench framework is structured the same way as frameworks in other languages, including SystemVerilog. It includes verification components (AxiMaster, UartRx, and UartTx) and TestCtrl (the test sequencer), as shown in Figure 2. The top level of the testbench connects the components together (using the same methods as in RTL design) and is often called a test harness; a sketch of one follows the figure. Connections between the verification components and TestCtrl use VHDL records as an interface. Connections between the verification components and the DUT are the DUT interfaces. Tests are written by calling transactions in TestCtrl. Separate tests are separate architectures of TestCtrl.

Figure 2 - OSVVM Testbench Framework
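A minimal test harness might look like the following (a sketch; the entity, signal, and port names are illustrative and follow Figure 2):

architecture TestHarness of TbUart is
  -- Transaction interfaces: VHDL records connecting TestCtrl
  -- to the verification components.
  signal UartTxRec, UartRxRec : UartRecType ;
  -- DUT interface signals.
  signal SerialData : std_logic ;
  . . .
begin
  -- Verification components drive and monitor the DUT interfaces.
  UartTx_1 : UartTx
    port map ( TransRec => UartTxRec, SerialDataOut => SerialData ) ;
  UartRx_1 : UartRx
    port map ( TransRec => UartRxRec, SerialDataIn  => SerialData ) ;
  -- The test sequencer; each test is a separate architecture of TestCtrl.
  TestCtrl_1 : TestCtrl
    port map ( UartTxRec => UartTxRec, UartRxRec => UartRxRec ) ;
end TestHarness ;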


The rest of this article focuses on writing tests in TestCtrl.

TESTCTRL, THE OSVVM TEST SEQUENCER

The TestCtrl architecture consists of a control process plus one process per independent interface; see the code block in Figure 3. The control process handles test initialization and finalization. Each test process creates interface waveform sequences by calling the transaction procedures (Write, Send, …).

Each architecture of TestCtrl creates a separate test in the test suite. Hence, a single test is visible in a single file, improving readability. A specific test is then selected for simulation, typically with a configuration declaration, as sketched below.
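A sketch of such a configuration (the architecture name matches Figure 3; the harness and label names are illustrative):

configuration TbUart_UartTx1 of TbUart is
  for TestHarness
    for TestCtrl_1 : TestCtrl
      use entity work.TestCtrl(UartTx1) ;  -- select the UartTx1 test
    end for ;
  end for ;
end TbUart_UartTx1 ;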

Since the processes are independent of each other, synchronization is required to create coordinated events on the different interfaces. This is accomplished by using synchronization primitives, such as WaitForBarrier (from TbUtilPkg in the OSVVM library).

Figure 3 - TestCtrl Architecture

architecture UartTx1 of TestCtrl is
  . . .
begin
  ControlProc : process
  begin
    . . .
    WaitForBarrier(TestDone, 5 ms) ;
    ReportAlerts ; 
    std.env.stop;
  end process ; 
  CpuTestProc : process
  begin
    WaitForBarrier(TestInit) ;
    Write(. . .) ;
    WaitForBarrier(DutInit);
    . . .
    WaitForBarrier(TestDone) ;
  end process ; 
  TxProc : process
  begin
    WaitForBarrier(DutInit);
    Send(. . .) ;
    . . .
    WaitForBarrier(TestDone) ;
  end process ; 
  . . . 

TEST INITIALIZATION

The ControlProc both initializes and finalizes a test. Test initialization is shown in Figure 4. SetAlertLogName sets the test name. Each verification component calls GetAlertLogID to allocate an ID that allows it to accumulate errors separately within the AlertLog data structure. Accessing the IDs here allows the test to control the message filtering of a verification component. WaitForBarrier suspends ControlProc until the test is complete.

Figure 4 - Test Initialization

ControlProc : process
begin
  SetAlertLogName("Test_UartRx_1");
  TBID <= GetAlertLogID("TB");
  RxID <= GetAlertLogID("UartRx_1");
  SB.SetAlertLogId("UART_SB") ; 
  SetLogEnable(PASSED, TRUE) ;
  SetLogEnable(RxID, INFO, TRUE) ;  
  WaitForBarrier(TestDone, 5 ms) ; 
  . . .

A SIMPLE DIRECTED TEST

A simple test can be created by transmitting (Send) a value on one interface and receiving (Get) and checking (AffirmIfEqual) it on another interface. This is shown in Figure 5.

Figure 5 - A Simple Directed Test

TxProc : process
begin
  Send (TRec, X"10") ;
  Send (TRec, X"11") ;
  . . . 
end process TxProc ;
RxProc : process
  variable RxD : ByteType;
begin
  Get(RRec, RxD) ; 
  AffirmIfEqual(TBID, RxD, X"10");
 
  Get(RRec, RxD) ; 
  AffirmIfEqual(TBID, RxD, X"11");
  . . .
end process RxProc ;

AffirmIfEqual checks its two parameters. It produces a log "PASSED" message if they are equal and an alert "ERROR" message otherwise. Both are shown in Figure 6.

Figure 6 - Messaging from AffirmIfEqual

%% Alert ERROR  In TB, Received: 08 /= Expected: 10 at 2150 ns 
%% Log   PASSED In TB, Received: 11 at 3150 ns

USING RANDOMIZATION

Constrained random testing randomly selects test values, modes, operations, and sequences of transactions. In general, randomization works well when there is a large variety of similar items to test.

The OSVVM package RandomPkg provides a library of randomization utilities. A subset of these is shown in Figure 7.

Figure 7 - A subset of OSVVM's random library

-- Random Range:  randomly pick a value within a range
Data_slv8 := RV.RandSlv(Min => 0, Max => 15, Size => 8) ;
 
-- Random Set:  randomly pick a value within a set
Data1     := RV.RandInt( (1,2,3,5,7,11) ) ;
 
-- Weighted distribution:  randomly pick a value between 0 and N-1,
-- where N is the number of values in the argument;
-- the likelihood of each value = value / (sum of all values)
Data2     := RV.DistInt( (70, 10, 10, 5, 5) ) ;

An OSVVM constrained random test consists of randomization plus code patterns plus transaction calls. For example, the code in Figure 8 generates a UART test with normal transactions 70% of the time, parity errors 10% of the time, stop errors 10% of the time, parity and stop errors 5% of the time, and break errors 5% of the time.

Figure 8 - An OSVVM Constrained Random Test

TxProc : process
  variable RV : RandomPType ;
  . . .
begin
  for I in 1 to 10000 loop
    case RV.DistInt( (70, 10, 10, 5, 5) ) is
      when 0  =>    -- Nominal case   70%
        ErrorMode := UARTTB_NO_ERROR ;
        TxD := RV.RandSlv(0, 255, TxD'length) ;
      when 1  =>    -- Parity Error   10%
        ErrorMode := UARTTB_PARITY_ERROR ;
        TxD := RV.RandSlv(0, 255, TxD'length) ;
      when  . . .   -- (2, 3, and 4)
    end case ;
    Send(UartTxRec, TxD, ErrorMode) ;
  end loop ;

Hence, creating constrained random tests in OSVVM is simply a matter of learning the patterns. The entire pattern is written directly in the code and is therefore visible for review.

Constrained random introduces two issues to our testing. First, how do we self-check the test? Previously we recreated the transmit pattern on the receive side; with the added complexity here, that would be tedious and error-prone. In the next section, we solve this problem by using OSVVM's generic scoreboards.

Second, how do we prove the test actually did something useful? We solve this problem by using OSVVM's functional coverage.

OSVVM'S SCOREBOARDS

A scoreboard facilitates checking data when there is latency in the system. A scoreboard receives the expected value from the stimulus generation process and checks the value when it is received by the checking process, as shown in Figure 9.

Figure 9 - Scoreboard Block Diagram


The OSVVM scoreboard supports small data transformations, out-of-order execution, and dropped values. It uses package generics to allow the expected type and the actual type to differ. The "match" function that determines whether the expected and actual values match is also a package generic. The FIFO-like data structure of the scoreboard is created inside a protected type.
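For instance, the UART scoreboard package used below can be created with a VHDL-2008 package instantiation along these lines (a sketch; the stimulus type and the Match/to_string subprograms for the UART data-plus-error-mode pair are assumed to exist):

package ScoreboardPkg_uart is new osvvm.ScoreboardGenericPkg
  generic map (
    ExpectedType       => UartStimType,  -- assumed (data, error mode) type
    ActualType         => UartStimType,
    Match              => Match,         -- true when actual matches expected
    expected_to_string => to_string,     -- used in alert/log messages
    actual_to_string   => to_string
  ) ;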

The use model for OSVVM's scoreboard is shown in Figure 10. The scoreboard instance is created using a shared variable declaration. On the transmit side (TxProc), the expected value is pushed into the scoreboard (SB.Push), and then a transaction is transmitted (Send). On the receive side (RxProc), the transaction is received (Get), and then the received value is checked against the scoreboard (SB.Check). This greatly simplifies RxProc since it no longer reproduces what the transmit side did. Scoreboards can also be used to simplify checking in directed tests.

Figure 10 - OSVVM Scoreboard Use Model

use work.ScoreboardPkg_uart.all ;
shared variable SB : work.ScoreboardPkg_uart.ScoreboardPType ;
TxProc : process
  . . .
begin
  for I in 1 to 10000 loop
    case RV.DistInt((. . .)) is
       . . . 
    end case ; 
    SB.Push((TxD, ErrorMode));
    Send(TRec, TxD, ErrorMode);
  end loop ;
  . . .
RxProc : process
  variable RxD : ByteType ;
begin
  for I in 1 to 10000 loop
    Get(RRec, RxD, RxErrorMode); 
    SB.Check((RxD, RxErrorMode));
  end loop ;
  . . .

ADDING FUNCTIONAL COVERAGE

Functional coverage is code that tracks items in the test plan. As such it tracks requirements, features, and boundary conditions.

If a test uses constrained random, functional coverage is needed to determine whether the test did something useful. Going further, as design complexity increases, functional coverage is recommended to ensure that even a directed test actually did everything that was intended.

There are two categories of functional coverage: item (also known as point) coverage and cross coverage. Item coverage tracks relationships within a single object. For a UART: were transfers with no errors, parity errors, stop-bit errors, parity and stop errors, and break errors all seen?

Cross coverage tracks relationships between multiple objects. For a simple ALU: has each set of registers for input 1 been used with each set of registers for input 2? A sketch of such a cross follows.
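A sketch of cross coverage for the ALU example (the coverage object and register ranges are illustrative):

shared variable ACov : CovPType ;
. . .
-- One bin per (input 1 register, input 2 register) pair: 8 x 8 = 64 bins.
ACov.AddCross( GenBin(0, 7), GenBin(0, 7) ) ;
. . .
ACov.ICover( (Reg1, Reg2) ) ;  -- record the register pair just used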

Why not just use the code coverage provided with a simulator? Code coverage only tracks code execution. Hence, code coverage cannot track the examples above, since the information is not in the code. On the other hand, if a design's code coverage does not reach 100%, then there are untested items and testing is not done. Hence, both code coverage and functional coverage are needed to determine when testing is done.

Functional coverage in OSVVM is implemented as a data structure within a protected type.

Figure 11 continues with RxProc from the constrained random test and adds functional coverage. First, an instance of the coverage object (RxCov) is created using a shared variable. Next, "RxCov.AddBins(GenBin(N))" is called to construct the functional coverage model. The value "N" corresponds to the integer representation of the UART status bits for Break, Stop, Parity, and Data Available. The calls to AddBins all complete at time 0, before any stimulus is generated or checked. Next, after the received stimulus has been retrieved (using Get), RxCov.ICover(RxErrorMode) is called to record the coverage. At the end of the test, RxCov.WriteBin prints the coverage results.

Figure 11 - UART RxProc with functional coverage added

architecture CR_1 of TestCtrl is
  shared variable RxCov : CovPType ;      -- define coverage object
  . . . 
begin
. . .
RxProc : process
  . . .
begin
                                          -- Define coverage model
  RxCov.AddBins( GenBin(1) ) ;            -- Normal
  RxCov.AddBins( GenBin(3) ) ;            -- Parity Error
  RxCov.AddBins( GenBin(5) ) ;            -- Stop Error
  RxCov.AddBins( GenBin(7) ) ;            -- Parity + Stop
  RxCov.AddBins( GenBin(9, 15, 1) ) ;     -- Break
  for I in 1 to 10000 loop
    Get(RRec, RxD, RxErrorMode); 
    SB.Check((RxD, RxErrorMode));
    RxCov.ICover(RxErrorMode) ;           -- Collect functional coverage
  end loop ;
  . . .
  RxCov.WriteBin ;                        -- Print coverage results

ADDING PROTOCOL AND PARAMETER CHECKERS

OSVVM alerts are used to check for invalid conditions on an interface or in a library subprogram. Alerts both report and count errors. Alerts have the levels FAILURE, ERROR (the default), and WARNING. By default, FAILURE-level alerts cause a simulation to stop, while ERROR and WARNING do not. When a test completes, all errors reported by Alert (and AffirmIf) can be reported using ReportAlerts.

Figure 12 shows a protocol checker used in a memory model to detect whether write enable (iWE) and read enable (iOE) are active simultaneously while the memory is selected (iCE). Parameter checkers are similar to protocol checkers; they check for invalid parameters to library subprograms.

Figure 12 - Memory Model Protocol Checker

SimultaneousAccessCheck: process 
begin
  wait on iCE, iWE, iOE ; 
  AlertIf(SramAlertID, (iCE and iWE and iOE) = '1',
      "nCE, nWE, and nOE are all active") ;
end process SimultaneousAccessCheck ;

Alerts can be enabled (the default) or disabled via a call to SetAlertEnable. The stopping behavior of the alert levels can be changed with SetAlertStopCount. Figure 13 shows the usage of both.

Figure 13 - Usage of SetAlertEnable and SetAlertStopCount

-- Turn off Warnings for a verification component
SetAlertEnable(UartRxAlertLogID, WARNING, FALSE) ;  
 
-- Stop the test after 20 ERROR-level alerts
SetAlertStopCount(ERROR, 20) ;

ADDING MESSAGE FILTERING

OSVVM logs allow messaging to be turned on or off based on settings in the test, either globally or for a specific verification component. Logs have the levels ALWAYS, DEBUG, INFO, and PASSED. Logs print when enabled. Log ALWAYS is always enabled; the other log levels are disabled by default. Figure 14 shows the usage of Log and its output.

Figure 14 - Log and its output

Log(TbID, "Test 1 Starting") ;
%% Log   ALWAYS   In Testbench, Test 1 Starting at 1770 ns

Logs are enabled or disabled using SetLogEnable. Figure 15 shows "PASSED" being enabled for the entire testbench and "INFO" being enabled only for UartRx. Generally this is done in ControlProc at test initialization.

Figure 15 - SetLogEnable Usage

SetLogEnable(PASSED, TRUE) ;       -- Turn on PASSED for all models
SetLogEnable(RxID, INFO, TRUE) ;   -- Turn on INFO for UartRx_1 only

TEST FINALIZATION

Test finalization is the error checking and reporting done in ControlProc after test completion. This is shown in Figure 16. Finalization starts when "WaitForBarrier(TestDone, 5 ms)" resumes. This happens either when all of the test processes have called their corresponding WaitForBarrier (normal completion) or when 5 ms passes. The 5 ms is a test timeout (watchdog) that activates if one of the test processes did not complete properly. The sequence of calls to AlertIf checks for proper test-finish conditions. ReportAlerts prints the test results (see Test-Wide Reporting).

Figure 16 - Test Finalization

ControlProc : process
begin
  . . .  
  WaitForBarrier(TestDone, 5 ms) ; 
  AlertIf(TBID, NOW >= 5 ms,  "Test timed out") ;  
  AlertIf(TBID, not SB.Empty, "Scoreboard not empty") ;
  AlertIf(TBID, GetAffirmCount < 1, "Checked < 1 items") ; 
  ReportAlerts ; 
  wait ; 
end process ControlProc ;

TEST-WIDE REPORTING

The AlertLog data structure tracks FAILURE, ERROR, and WARNING counts for the entire test as well as for each AlertLogID (see GetAlertLogID). ReportAlerts prints a test completion message using this information. If GetAlertLogID was not called during the test, ReportAlerts prints the simple PASSED or FAILED message shown in Figure 17.

Figure 17 - Simple PASSED and FAILED messages from ReportAlerts

%% DONE  PASSED  Test_UartRx_1  Passed: 48  Affirmations Checked: 48  at 100000 ns
%% DONE  FAILED  Test_UartRx_1  Total Error(s) = 10  Failures: 0  Errors: 1  Warnings: 1  Passed: 48  Affirmations Checked: 48  at 100000 ns

If GetAlertLogID was called during the test, ReportAlerts also includes error and passed counts for each AlertLogID, as shown in Figure 18.

Figure 18 - ReportAlerts for each AlertLogID

%% DONE  PASSED  Test_UartRx_1  Passed: 48  Affirmations Checked: 48  at 100000 ns
%% DONE  FAILED  Test_UartRx_1  Total Error(s) = 7  Failures: 0  Errors: 7  Warnings: 0  Passed: 48  Affirmations Checked: 48  at 100000 ns 
%%    Default                    Failures: 0    Errors: 0    Warnings: 0    Passed: 0
%%    OSVVM                      Failures: 0    Errors: 0    Warnings: 0    Passed: 0
%%    TB                         Failures: 0    Errors: 0    Warnings: 0    Passed: 0
%%    UART_SB                    Failures: 0    Errors: 0    Warnings: 0    Passed: 0
%%    Cpu_1                      Failures: 0    Errors: 7    Warnings: 0    Passed: 41
%%    Cpu_1 Data Error           Failures: 0    Errors: 7    Warnings: 0    Passed: 0
%%    Cpu_1 Protocol Error       Failures: 0    Errors: 0    Warnings: 0    Passed: 0
%%    UartRx_1                   Failures: 0    Errors: 0    Warnings: 0    Passed: 0

INCLUDING THE OSVVM LIBRARY

OSVVM provides context declarations (VHDL-2008) that allow the utility library and each verification component library to be referenced with a single context reference rather than multiple use clauses. This is shown in Figure 19.

Figure 19 - OSVVM Context References

library osvvm ;                      -- Utility Library
  context osvvm.OsvvmContext ;
library osvvm_axi4 ;                 -- AXI4 and AXI Stream
  context osvvm_axi4.Axi4LiteContext ;
  context osvvm_axi4.AxiStreamContext ;
library osvvm_uart ;                 -- UART
  context osvvm_uart.UartContext ;

GETTING AND RUNNING OSVVM

OSVVM is available on GitHub at https://github.com/OSVVM. Retrieve it using git, as shown in Figure 20.

Figure 20 - Retrieving OSVVM from GitHub

git clone https://github.com/OSVVM/OSVVM.git
git clone --recursive https://github.com/OSVVM/VerificationIP.git

The Axi4Lite, AxiStream, and UART verification components come with OSVVM-style testbenches. Figure 21 shows how to compile and run the tests for the AxiStream verification component. The tests for the UART and Axi4Lite verification components are run in the same manner.

Figure 21 - Compiling and running OSVVM

cd VerificationIP/Scripts
do startup.tcl
build ../osvvm
build ../AXI4/AxiStream/RunTests.pro

SUMMARY

OSVVM goes well beyond the basics shown in this article. To learn more, see the documentation on the GitHub site or take SynthWorks' Advanced VHDL Testbenches and Verification class.

REFERENCES

  [1] 2018 Wilson Research Group Functional Verification Study
