Verification plan methodology

I am a little confused about some of the terms thrown around involving verification, and how to apply them; especially after watching the video series referenced in: Verification Flow Details | Verification Academy

Answer one, or answer as many as you like; your opinion is greatly appreciated. Yes, I tend to overthink things!

  1. A “verification plan” is the same as a “test plan” is the same as “coverage model” ? (The spreadsheet linked to covergroups, SVA, etc.)

  2. This “verification plan” should contain “verification requirements” ?

  3. These verification requirements should be extracted from the design specification, NOT from the design requirements ? ( So there is redundancy )

  4. The “design requirements” should be free of implementation? *NASA System Engineering handbook

  5. The “verification requirements” should link to design requirements ?

  6. All design requirements should be covered by a corresponding “verification requirement” ?

  7. The “verification requirements” should also link to specific verification constructs, such as tests, coverage groups, etc ?

  8. The “verification requirements” can include checks to implementation (which is not listed in design requirements, because it’s implementation.) ?

  1. A “verification plan” is the same as a “test plan” is the same as “coverage model” ? (The spreadsheet linked to covergroups, SVA, etc.)
    From above link:

[Ben] There is a difference between a verification plan and a test plan, but people do tend to intermix them. The main difference is:

  • verification plan addresses the items to be verified, but without addressing the methodologies. Thus, for example, a verification plan for a CPU will address the items to be verified, including the ISA, the IOs, environment (e.g., ISA mix, memory types (fast/slow), application software written in X language, etc).
  • Test plan addresses how the items that need to be verified will be checked. For the CPU example, the test methodology may include simulation, emulation, use of assertions, if simulation, use of UVM, constrained random tests, types of mixes, test application code, etc.
  2. This “verification plan” should contain “verification requirements” ?

YES

  3. These verification requirements should be extracted from the design specification, NOT from the design requirements ? ( So there is redundancy )

Maybe a bit of both. Design requirements address the desired functions, whereas design specs may address the technology (e.g., speed), IPs, and possibly other internal stuff including testability, test signature, etc.

  4. The “design requirements” should be free of implementation? *NASA System Engineering handbook

YES

  5. The “verification requirements” should link to design requirements ?

Generally agree, can you elaborate more on this?

  6. All design requirements should be covered by a corresponding “verification requirement” ?

What exactly is your definition of “design requirements”?

  7. The “verification requirements” should also link to specific verification constructs, such as tests, coverage groups, etc ?

Not necessarily. The verification requirements are more about what needs to be verified. The test plan is more about the how.

  8. The “verification requirements” can include checks to implementation (which is not listed in design requirements, because it’s implementation.) ?

Isn’t the verification plan independent of implementation?

Ben Cohen http://www.systemverilog.us/

  • SystemVerilog Assertions Handbook, 3rd Edition, 2013
  • A Pragmatic Approach to VMM Adoption
  • Using PSL/SUGAR … 2nd Edition
  • Real Chip Design and Verification
  • Cmpt Design by Example
  • VHDL books

Ben,

Thank you for your input.

It sounds as though the verification plan is the low-level, bread-and-butter spreadsheet that allows the environment builder to create the appropriate coverage collectors and the test writer to know which tests to generate, etc. Great description of test plan. It sounds as though everyone has a slightly different impression of the names.

  5. The “verification requirements” should link to design requirements ?
    Generally agree, can you elaborate more on this?

The design requirements, i.e., the functionality recipe (spreadsheet) that the design engineer is using to create the RTL implementation. Every line item from this design requirement spreadsheet should be mentioned at least once somewhere in my verification plan (in the form of a number, e.g., 1.2). Agree or disagree?

Lastly, my impression was that the verification plan is NOT free of implementation. The design team has the liberty to implement RTL free of stipulations (design requirements shall be free of implementation), but the verification team will be handed a selected implementation. Much like the UART “test plan” from Mentor’s coverage cookbook, if I am verifying a DUT with a FIFO, I may add SVA constructs to check behavior of the FIFO when it is written while full, etc. If I include these design assertions in my verification, they are in the implementation category. Thoughts?
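As a sketch of the kind of implementation-level SVA check being described, a write-while-full assertion might look like the following (the module, parameter, and signal names `clk`, `rst_n`, `wr_en`, and `full` are assumptions about the FIFO’s interface, not taken from any particular design):

```systemverilog
// Hypothetical checker for the FIFO discussed above.
// Assumption: the design spec forbids asserting wr_en while the FIFO is full.
module fifo_sva_checker (
  input logic clk,
  input logic rst_n,
  input logic wr_en,
  input logic full
);
  property p_no_write_when_full;
    @(posedge clk) disable iff (!rst_n)
      full |-> !wr_en;  // a write must never be attempted while full
  endproperty

  a_no_write_when_full: assert property (p_no_write_when_full)
    else $error("FIFO written while full");
endmodule
```

Whether a check like this belongs in the verification plan or the test plan is exactly the question at issue: it verifies behavior of a specific implementation artifact (the FIFO), not a black-box design requirement.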

In reply to bmorris:

It sounds as though the verification plan is the low-level, bread-and-butter spreadsheet that allows the environment builder to create the appropriate coverage collectors and the test writer to know which tests to generate, etc. Great description of test plan. It sounds as though everyone has a slightly different impression of the names.

Agree; this is because they both deal with verification.

Quote:

  5. The “verification requirements” should link to design requirements ?
    Generally agree, can you elaborate more on this?
    The design requirements, i.e., the functionality recipe (spreadsheet) that the design engineer is using to create the RTL implementation. Every line item from this design requirement spreadsheet should be mentioned at least once somewhere in my verification plan (in the form of a number, e.g., 1.2). Agree or disagree?

Agree

Lastly, my impression was that the verification plan is NOT free of implementation. The design team has the liberty to implement RTL free of stipulations (design requirements shall be free of implementation), but the verification team will be handed a selected implementation. Much like the UART “test plan” from Mentor’s coverage cookbook, if I am verifying a DUT with a FIFO, I may add SVA constructs to check behavior of the FIFO when it is written while full, etc. If I include these design assertions in my verification, they are in the implementation category. Thoughts?

But isn’t the verification plan a black-box plan that deals with the requirements, whereas a test plan is a way to test the item, and that may include the hierarchical black boxes? Thus, if the implementation has an IP that does the FFT, or a UART, the test plan should include a line that states that all integrated IPs should have been tested prior to incorporation into the black boxes. The test planner could take the vendor’s promise at face value, or he could add additional tests. The test plan should also include tests that verify the interfaces and interactions with the rest of the design.

Years ago, I had the bad experience of purchasing a GM truck that turned out to be a lemon, as modular parts started to fail at a rapid pace (oil pump, generator, starter, water pump, …). The first night I had the truck, oil dripped profusely all over the garage, and that fix entailed a major replacement of a part (3 days in the shop)!
Obviously, GM was OK in the requirements (what parts are needed to make a truck for that model year), but the test plan and test implementation were pretty much f**ked up!
Ben Cohen SystemVerilog.us

In reply to ben@SystemVerilog.us:

But isn’t the verification plan a black-box plan that deals with the requirements, whereas a test plan is a way to test the item, and that may include the hierarchical black boxes?

I follow what you are saying. Design requirements are free of implementation; a verification plan that covers those design line items would consequently be implementation-free. Basically saying “hit this spectrum/profile of input stimuli, while making sure your checkers don’t flag any incorrect behavior”.

However, touching on my earlier example again… if the design team handed me some custom IP, and told me it had a FIFO (which I’ll treat as a buried hierarchical IP); also, this FIFO is a critical data bottleneck, and thus is checked by SVA (aka it’s a very simple “embedded IP”, and therefore SVA is all that is required to make sure it’s working correctly). Are you saying the SVA constructs should be itemized out in the test plan, and not mentioned in the verification plan? I would expect to see something like that in spreadsheet form somewhere.

Let’s say it was a complicated IP, and had its own verification test plan. This would perhaps fit into the test plan… and NOT the verification plan?

In reply to ben@SystemVerilog.us:

Alright; right on. Thanks, Ben.

In reply to bmorris:

But isn’t the verification plan a black-box plan that deals with the requirements, whereas a test plan is a way to test the item, and that may include the hierarchical black boxes?

[Ben] Agree

I follow what you are saying. Design requirements are free of implementation; a verification plan that covers those design line items would consequently be implementation-free. Basically saying “hit this spectrum/profile of input stimuli, while making sure your checkers don’t flag any incorrect behavior”.

Agree

However, touching on my earlier example again… if the design team handed me some custom IP, and told me it had a FIFO (which I’ll treat as a buried hierarchical IP); also, this FIFO is a critical data bottleneck, and thus is checked by SVA (aka it’s a very simple “embedded IP”, and therefore SVA is all that is required to make sure it’s working correctly). Are you saying the SVA constructs should be itemized out in the test plan, and not mentioned in the verification plan? I would expect to see something like that in spreadsheet form somewhere.

I wouldn’t say “SVA constructs”, but rather address the test methodology; in this case, I would mention an SVA checker or module bound to the FIFO to verify that device’s requirements, based on its verification plan. This SVA set of tests is used to verify the FIFO in the DUT’s environment. The actual code or constructs are not included here.
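As an illustration of what “a checker or module bound to the FIFO” can mean in practice, SystemVerilog’s `bind` construct attaches a checker module to a design instance without modifying the RTL source (the module names `fifo` and `fifo_sva_checker` and the port names are hypothetical):

```systemverilog
// Hypothetical bind: attach the checker to every instance of module 'fifo'
// in the design, keeping verification code separate from the RTL.
bind fifo fifo_sva_checker u_fifo_chk (
  .clk   (clk),
  .rst_n (rst_n),
  .wr_en (wr_en),
  .full  (full)
);
```

This separation supports Ben’s point: the test plan names the methodology (an SVA checker bound to the FIFO), while the actual constructs live in the verification code, not in the plan itself.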

Let’s say it was a complicated IP, and had its own verification test plan. This would perhaps fit into the test plan… and NOT the verification plan?

Agree. Though that IP may be verified with its own test plan, it is important to define in the test plan of the DUT how that IP is tested in the DUT’s environment; meaning how it is integrated with the correct configurations. Though IPs are tested by the vendor (if you can fully trust him), many issues have occurred in the integration of an IP into the system because of misunderstandings in the interfaces and/or configurations.
Ben Cohen SystemVerilog.us