How to do Error Injection in UVM?

Hello,

I am a UVM novice. I am verifying the 10GE MAC Core in UVM. So far, I have been able to verify the normal operation of the DUT, which is working in loopback mode. I would now like to inject errors into the DUT and then collect coverage, but I don't know how to do that. A few scenarios I have in mind are:

  1. Multiple Start of Packet (SOP)
  2. No SOP
  3. Multiple End of Packet (EOP)
  4. No EOP

Do I create a base test class that takes care of the correct working of the DUT, and then keep extending that base class for each error case? (I am using a virtual sequence which calls the SOP, data, and EOP sequences.)
Or do I create a base class and only one extended class that takes care of all the error scenarios?
Or do I create a base class and one extended class, then create a master sequence and override the slave sequences?

Any pointers will be very helpful.

In reply to prithvinkumble:

I assume that for each normal mode you have at least one sequence defined. You can extend these sequences to model corrupted data; this is common practice in OOP.
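As a rough sketch of that extension pattern (all class and field names here are hypothetical, not taken from the 10GE MAC environment), the error sequence reuses the normal flow and then corrupts the generated item:

```systemverilog
// Minimal, hypothetical packet item and sequences for illustration only;
// your real 10GE MAC item and sequences will look different.
`include "uvm_macros.svh"
import uvm_pkg::*;

class mac_pkt_item extends uvm_sequence_item;
  rand bit [7:0] payload[];
  constraint c_len { payload.size() inside {[64:256]}; }
  `uvm_object_utils(mac_pkt_item)
  function new(string name = "mac_pkt_item"); super.new(name); endfunction
endclass

// Normal sequence: sends one well-formed packet
class mac_pkt_seq extends uvm_sequence #(mac_pkt_item);
  `uvm_object_utils(mac_pkt_seq)
  function new(string name = "mac_pkt_seq"); super.new(name); endfunction
  virtual task body();
    req = mac_pkt_item::type_id::create("req");
    start_item(req);
    if (!req.randomize()) `uvm_error(get_type_name(), "randomize failed")
    finish_item(req);
  endtask
endclass

// Error sequence: extends the normal one and corrupts the data after randomization
class mac_bad_data_seq extends mac_pkt_seq;
  `uvm_object_utils(mac_bad_data_seq)
  function new(string name = "mac_bad_data_seq"); super.new(name); endfunction
  virtual task body();
    req = mac_pkt_item::type_id::create("req");
    start_item(req);
    if (!req.randomize()) `uvm_error(get_type_name(), "randomize failed")
    req.payload[0] = ~req.payload[0];  // flip the first byte to model wrong data
    finish_item(req);
  endtask
endclass
```

Because the error sequence is type-compatible with the normal one, your error tests can swap it in with a factory override (e.g. set_type_override_by_type) instead of rewriting the virtual sequence.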

In reply to prithvinkumble:

If you are modeling transactions, then SOP, data, and EOP all belong to a single uvm_sequence_item (not separate sequences or sequence items). Error generation is not really a UVM question; it depends on the driver's architecture. The sequence could insert errors into the sequence item during randomization, or the sequence item could have fields that instruct the driver to create errors while transmitting the packet.
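To illustrate the second option (the field names and driver helpers below are hypothetical, not from any particular 10GE MAC VIP): the packet item carries "error knob" fields that default to a clean packet, and the driver consults them when it puts the packet on the interface:

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// One sequence item models the whole packet (SOP, data, EOP) plus error knobs
class mac_pkt_item extends uvm_sequence_item;
  rand bit [7:0] payload[];
  rand bit       drop_sop,  extra_sop;   // suppress / duplicate the SOP
  rand bit       drop_eop,  extra_eop;   // suppress / duplicate the EOP

  constraint c_len   { payload.size() inside {[64:1518]}; }
  // Clean packet by default; error sequences override these soft constraints
  constraint c_clean { soft drop_sop == 0; soft extra_sop == 0;
                       soft drop_eop == 0; soft extra_eop == 0; }

  `uvm_object_utils(mac_pkt_item)
  function new(string name = "mac_pkt_item"); super.new(name); endfunction
endclass

// Inside a hypothetical mac_driver, logic along these lines would consume the knobs
// (drive_sop/drive_byte/drive_eop are assumed pin-level helper tasks):
//
//   if (!req.drop_sop)  drive_sop();
//   if ( req.extra_sop) drive_sop();                    // multiple-SOP error
//   foreach (req.payload[i]) drive_byte(req.payload[i]);
//   if (!req.drop_eop)  drive_eop();
//   if ( req.extra_eop) drive_eop();                    // multiple-EOP error
```

With this style, a sequence only has to call something like `item.randomize() with { drop_eop == 1; }` to hit the "No EOP" scenario, and your functional coverage can sample the same knob fields.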