In reply to dave_59:
Hi Dave,
When %t is used without first calling $timeformat, the default is to scale the formatted time to the global time precision, which in that case is 1 ns.
In the code above, since we have only the `timescale compiler directive, the global time precision is 1 ns.
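For reference, here is a minimal sketch of the case I mean (my own reconstruction, assuming the earlier code has only the `timescale directive and no timeunit in the module; the module name is just for illustration):
`timescale 10 ns / 1 ns
module time_test_ts_only;
  integer a = 0;
  initial begin
    // No $timeformat anywhere, so %t falls back to the global time precision,
    // which here is simply the 1 ns precision argument of `timescale.
    #1.53 a = 6; $display($realtime,," a == 6 @ T == %0t ", $realtime);
  end
endmodule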
However, if I were to add a timeunit declaration within the module:
`timescale 10 ns / 1 ns
module time_test;
  timeunit 1ns / 1ps;  // Takes priority over `timescale
  integer a = 0, b = 0, c = 0;
  initial begin
    #1.53  a = 6; $display($realtime,," a == 6 @ T == %0t ", $realtime);
    #2.56  b = 9; $display($realtime,," b == 9 @ T == %0t ", $realtime);
    #1.547 c = 4; $display($realtime,," c == 4 @ T == %0t ", $realtime);
  end
endmodule
// In this case, the global time precision (a.k.a. the simulation time unit) is 1 ps.
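As a side note (a sketch only, not something from my original run), the %t formatting can be pinned to nanoseconds with $timeformat, whose arguments are units_number, precision_number, suffix_string and minimum_field_width:
`timescale 10 ns / 1 ns
module time_test_fmt;  // hypothetical variant of the module above
  timeunit 1ns / 1ps;
  integer a = 0;
  initial begin
    // Pin %t to 1 ns units with 3 fractional digits, independent of the
    // 1 ps global time precision.
    $timeformat(-9, 3, " ns", 12);
    #1.53 a = 6; $display($realtime,," a == 6 @ T == %0t ", $realtime);
  end
endmodule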
Table 20-3 of the LRM gives the default for the units_number argument as:
"The smallest time precision argument of all the `timescale compiler directives in the source description"
Although the smallest time precision argument of all the `timescale compiler directives is 1 ns, why does %t use the global time precision of 1 ps?
The output of %t is different from what Table 20-3 states.
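In case it helps the discussion, one way I could double-check the effective time unit and precision of the module is $printtimescale (again just a sketch; the module name is only for illustration):
`timescale 10 ns / 1 ns
module time_test_check;
  timeunit 1ns / 1ps;
  // Reports this module's time unit and precision, which should show
  // 1 ns / 1 ps because the timeunit declaration overrides `timescale.
  initial $printtimescale(time_test_check);
endmodule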