[Q] What's the difference between the input skews #1step and #1?
The LRM describes #1step as: "A 1step input skew allows input signals to sample their steady-state values in the time step immediately before the clock event (i.e., in the preceding Postponed region)."
#1step therefore samples the same value as the Preponed region of the same time step (since values cannot change between the Postponed region of the previous time step and the Preponed region of the current one).
Is there a case where the value sampled for clockvar a would differ between the two clocking blocks?
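For concreteness, the kind of pair being asked about might look like this minimal sketch (the module, clocking block, and signal names are my own, not from the original question):

```systemverilog
module tb;
  timeunit 1ns; timeprecision 1ps;
  bit   clk;
  logic a;

  // Samples a at the Preponed region of the clock's time step
  clocking cb_step @(posedge clk);
    input #1step a;
  endclocking

  // Samples a one local time unit (1ns here) before the clock edge
  clocking cb_one @(posedge clk);
    input #1 a;
  endclocking
endmodule
```

If a changes inside the 1ns window just before the edge, cb_one.a and cb_step.a would see different values.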
#1step is a delay representing the global time precision. If #1 is being used in a timescale that is the same as that global precision, then they represent the same delay. But that is rarely the case.
LRM 3.14.3, Simulation time unit:
"The global time precision, also called the simulation time unit, is the minimum of all the timeprecision statements, all the time precision arguments to timeunit declarations, and the smallest time precision argument of all the `timescale compiler directives in the design. The step time unit is equal to the global time precision."
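A sketch of that point: with the (assumed) `timescale below, and assuming 1ps is the smallest precision anywhere in the design, the local time unit (1ns) differs from the global precision (1ps), so #1 and #1step are different delays:

```systemverilog
`timescale 1ns/1ps
module m;
  bit   clk;
  logic d;

  clocking cb @(posedge clk);
    input #1step d;  // skew = global time precision (1ps here)
    // input #1 d;   // would mean 1ns: the local time unit, not the precision
  endclocking
endmodule
```

Only if the design's smallest precision were also 1ns would #1 and #1step coincide.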
Dave,
If a user defines both a `timescale and a timeunit, which one takes priority?
`timescale 1ns / 10ps
module A (...);
timeunit 100ps;
timeprecision 10fs;
// What's the time unit and precision used within module A ?
...
endmodule
module B (...);
...
endmodule
If the files were compiled in the order A, B, then module B would simulate with time units in nanoseconds.
LRM 3.14.2.3: If a timeunit is not specified within a module, program, package, or interface definition, then the time unit shall be determined using the following rules of precedence:
a) If the module or interface definition is nested, then the time unit shall be inherited from the enclosing module or interface (programs and packages cannot be nested).
b) Else, if a `timescale directive has been previously specified (within the compilation unit), then the time unit shall be set to the units of the last `timescale directive.
c) Else, if the compilation-unit scope specifies a time unit (outside all other declarations), then the time unit shall be set to the time units of the compilation unit.
d) Else, the default time unit shall be used.
The time unit of the compilation-unit scope can only be set by a timeunit declaration, not a `timescale directive. If it is not specified, then the default time unit shall be used.
So if a timeunit is declared inside the module, it takes precedence over the `timescale directive.
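Annotating the earlier snippet with how those precedence rules apply (the initial blocks are my own additions for illustration):

```systemverilog
`timescale 1ns / 10ps
module A;
  timeunit 100ps;       // local timeunit wins over the `timescale: unit = 100ps
  timeprecision 10fs;   // local precision = 10fs
  initial #1 $display("A: %0t", $time);  // #1 here means 100ps
endmodule

module B;
  // No local timeunit, so rule (b) applies: B takes the units of the
  // last `timescale directive, i.e. 1ns / 10ps.
  initial #1 $display("B: %0t", $time);  // #1 here means 1ns
endmodule
```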
(a) As you mentioned, 1step is the minimum of all the timeprecision statements, all the time precision arguments to timeunit declarations, and the smallest time precision argument of all the `timescale compiler directives in the design.
[Q1] If I were to write #1, would it use the time_precision or the time_unit (or a combination of both) of the scope where the clocking block is defined?
(b) I then tried an example using real numbers as skews, mainly to check whether they are valid.