Communication between UVM monitor and Sequence Library

Hi All,

I have a scenario where a packet field decides for how long the driver has to send a sequence of patterns on the interface lines. This field is set in the sequence library. How do I get this information into the monitor so that the monitor checks for the pattern only for the duration that was set?

I have another, similar situation: I would like to enable error_insert for some duration while the simulation is in progress, and then turn error_insert off again mid-simulation. How does the monitor learn that the error_insert packet field is enabled, so that it can turn off its pattern checking?

Also, please suggest a suitable place to declare these fields, duration and error_insert. I feel it is best to declare them as packet fields; if there is a better approach, kindly advise.

Thanks in advance.

In reply to maximus:

I think both problems can be solved using a configuration object. Typically a config object is instantiated in the base test or env and passed to the UVC using uvm_config_db.
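A minimal sketch of that flow, assuming a hypothetical pattern_cfg class and hypothetical component paths (adapt the names to your env):

```systemverilog
// Hypothetical config object holding the two knobs discussed in this thread.
class pattern_cfg extends uvm_object;
  `uvm_object_utils(pattern_cfg)
  int unsigned duration;     // how long the driver sends the pattern
  bit          error_insert; // checking is suppressed while this is 1
  function new(string name = "pattern_cfg");
    super.new(name);
  endfunction
endclass

// In the base test's build_phase: create once, publish to the whole agent.
//   pattern_cfg cfg = pattern_cfg::type_id::create("cfg");
//   uvm_config_db#(pattern_cfg)::set(this, "env.agent*", "cfg", cfg);

// In the monitor's (and driver's) build_phase: fetch the shared handle.
//   if (!uvm_config_db#(pattern_cfg)::get(this, "", "cfg", cfg))
//     `uvm_fatal("NOCFG", "pattern_cfg not found in config_db")
```

Because set and get pass the same object handle, every component that did a get sees later changes to the object's fields.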

In reply to maximus:

Without knowing what you are testing, I would suggest it’s generally not a good idea to turn off monitors. If the DUT has predictable behavior when stimulated with errors, then the monitor and coverage collector only need to recognize (via sampling) that there is an error at the input pins. This could apply to your first statement as well (i.e. the number of bus transactions generated from a sequence or sequence_item could be accounted for in coverage collection).
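One way to account for error stimulus in the coverage collector rather than switching the monitor off; this is a rough sketch, assuming a hypothetical pattern_item transaction that carries an error_insert flag observed at the pins:

```systemverilog
// Hypothetical coverage collector: sample whether each transaction was
// observed during error injection, instead of disabling the monitor.
class pattern_cov extends uvm_subscriber #(pattern_item);
  `uvm_component_utils(pattern_cov)

  bit error_seen;

  covergroup pattern_cg;
    cp_error: coverpoint error_seen; // hit both error and error-free bins
  endgroup

  function new(string name, uvm_component parent);
    super.new(name, parent);
    pattern_cg = new();
  endfunction

  function void write(pattern_item t);
    error_seen = t.error_insert; // flag recovered from the observed item
    pattern_cg.sample();
  endfunction
endclass
```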

In reply to rohitk:

Thank you for the reply, but my understanding is that uvm_config_db set and get are done only in the build phase, right? I would like to enable/disable the error_insert signal in the middle of the simulation. For the duration, I agree it is set only once and can be done in the build_phase using uvm_config_db.

Please suggest.

In reply to dhserfer:

Hi dhserfer,

The behavior during error insertion is not predictable, and hence turning off the check is required. Basically it is a PRBS sequence generator, so mid-simulation I insert errors (that was required by the designer); for now they are not worried about the behavior when an error is inserted, and they would like to turn off the check in the monitor.

I could still keep the check enabled in the monitor and instead verify that, while error insertion is active, the output does not match the expected value; but my question remains: how does the monitor know when the error insertion logic is enabled?

Thank you.

In reply to maximus:

config_db is just a mechanism for transferring objects and variables. What I meant was to define a config object for your monitor and driver, for example:

class cfg extends uvm_object;
  `uvm_object_utils(cfg)
  bit error_insert;
endclass

Now the error test can set this bit to 1 at run time, and the monitor needs to check this bit before flagging an error. Ideally this config object is declared and instantiated in the base test and set in the config_db during build_phase, and all agent components call get in their respective build_phase.
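Since the build_phase get returns a handle to the same object the test created, a field flipped by the test at run time is immediately visible in the monitor; there is no need to call set/get again. A rough sketch, with hypothetical delays and message names:

```systemverilog
// In the error test's run_phase: toggle the shared knob mid-simulation.
task run_phase(uvm_phase phase);
  phase.raise_objection(this);
  #10us;
  cfg.error_insert = 1;   // monitor stops flagging mismatches from here on
  #5us;
  cfg.error_insert = 0;   // pattern checking resumes
  phase.drop_objection(this);
endtask

// In the monitor's checking code: consult the flag before reporting.
//   if (!cfg.error_insert && (observed != expected))
//     `uvm_error("PRBS_MISMATCH", "pattern check failed")
```

The same config object can also carry the duration field from the first question, so the monitor knows how long to keep checking.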

Thanks,
Rohit