I add #2000ns in my class (actually it's a UVM test sequence), but it does not delay 2000 ns. I print the time before and after this delay and find that the actual delay is 200 ns. When I add a $printtimescale task in this class, my log prints:
"TimeScale of $unit is 100 ps / 100 ps"
My questions are:
1. I add a timescale in my tb_top.sv: `timescale 1ns/1ps, and I pass the "-timescale=1ns/1ps" option to my simulation tool. Why is the timescale in my class 100 ps / 100 ps?
2. When I write #2000ns in the class, why is 2000 ns not delayed? It seems the tool treats the statement as "#2000" with a unit of 100 ps, so the result is 2000 * 100 ps = 200 ns.
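A minimal sketch of how this can happen (file and class names here are hypothetical, not from the original post): a class compiled from its own file lands in the $unit compilation scope, and `timescale directives do not carry across compilation units, so $unit can fall back to a tool-specific default.

```systemverilog
// tb_top.sv -- the timescale directive applies to THIS compilation unit
`timescale 1ns/1ps
module tb_top;
endmodule

// my_seq.sv -- compiled separately, no timescale directive of its own,
// so the class sits in $unit with the tool's default timescale
class my_seq;
  task run();
    $printtimescale($unit);  // may report e.g. 100 ps / 100 ps
    #2000ns;                 // interpretation depends on the local time unit
  endtask
endclass
```

Whether the two files share a compilation unit depends on how the simulator is invoked, which is why the same source can behave differently across tools and command lines.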
Output:
TimeScale of $unit is 1 ps / 1 ps
kranthi1 0.000ns
kranthi2 0.010ns
kranthi3 0.010ns
kranthi4 0.011ns
kranthi5 0.011ns
kranthi6 0.012ns
kranthi7 0.012ns
kranthi8 0.012ns
kranthi9 10.012ns
VCS Simulation Report
Time: 10012 ps
#10ns works differently in a class than in a module. I'm not sure what the use of timeunit and timeprecision is; I thought hardcoded delays were not dependent on the timescale. Is there a simple solution for this?
It is not good programming practice to declare code outside of any module or package, or to have code using delays without specifying any timescale or time units. Although the SystemVerilog LRM says the tool chooses a tool-specific default timescale in that case, I think it should really be an error.
Write the code this way and you will see no issues.
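One way to follow that advice (a sketch; the package and class names are hypothetical): put the class inside a package and declare its time unit and precision explicitly with timeunit/timeprecision, so the delays no longer depend on compilation order or tool defaults.

```systemverilog
package my_seq_pkg;
  // Explicit time unit and precision for everything in this package;
  // delays here no longer depend on $unit or on simulator options.
  timeunit 1ns;
  timeprecision 1ps;

  class my_seq;
    task run();
      #2000ns;   // unambiguously 2000 ns in this scope
    endtask
  endclass
endpackage
```

Importing the class from the package (rather than `include-ing it into $unit) keeps every use of it under the package's declared time unit.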