Using system functions $realtime and $time as arguments to format specification %t

In reply to dave_59:

Section 3.14.3 Simulation time unit says:
The global time precision, also called the simulation time unit, is the minimum of all the timeprecision statements, all the time precision arguments to timeunit declarations, and the smallest time precision argument of all the `timescale compiler directives in the design.

So the global time precision takes into account `timescale directives as well as the timeunit and timeprecision constructs, and uses the minimum of all of them.
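As a minimal sketch of that rule (the module names and values here are hypothetical, just to illustrate how the minimum is taken):

```systemverilog
`timescale 1ns/100ps   // precision argument: 100ps

module a;              // inherits `timescale 1ns/100ps
endmodule

module b;
  timeunit 1ns;
  timeprecision 1ps;   // smallest precision anywhere in the design
endmodule

// Per 3.14.3, the global time precision (simulation time unit) is the
// minimum of all precisions above, i.e. 1ps in this sketch.
```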

However, in the LRM I can’t find the relation between the global time precision and the default value of the units_number argument.

My understanding was that the global time precision is related to the #1step of a clocking block, whereas the format specification %t would use the default value of the units_number argument (in the absence of a $timeformat call).

Section 20.4.2 $timeformat simply says the following about the default of units_number:

The smallest time precision argument of all the `timescale compiler directives in the source description

Currently the LRM doesn’t say that the default value of the units_number argument uses the global time precision or the timeunit / timeprecision constructs.
It only says that, out of all the `timescale compiler directives present, the minimum time precision is used.

My interpretation is therefore that the default value of units_number is independent of the timeunit and timeprecision constructs.
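A small sketch of how this could be checked in a simulator (the delay and format arguments are arbitrary, chosen only to make the scaling visible):

```systemverilog
`timescale 1ns/1ps

module tb;
  initial begin
    #5.5;
    // Without a prior $timeformat call, %t falls back to the default
    // units_number, which 20.4.2 ties to the smallest `timescale
    // precision in the source (1ps here) -- not, per my reading, to
    // any timeunit/timeprecision construct.
    $display("default  : %t", $realtime);

    // With an explicit $timeformat, %t uses those settings instead:
    // units = 10^-9 s (ns), 3 fractional digits, " ns" suffix, width 10.
    $timeformat(-9, 3, " ns", 10);
    $display("formatted: %t", $realtime);
  end
endmodule
```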

Am I missing something in the LRM?