In reply to Tudor Timi:
Thanks Tudor! I can attest that a command-line arg can change random stability, and it’s quite painful when trying to close coverage.
Consider the following case(s):
A 1000-seed regression runs in optimized (non-debug) mode for speed with low UVM verbosity, and one seed fails. You re-run the failing seed in debug mode (for breakpoints and signal visibility), and the failure is now gone due to a change in random stability. I have even observed cases where just changing the UVM verbosity was enough to change random stability.
Stability change detection mechanism idea:
At the end of each simulation, print a randomized number to a file, one per seed in the regression. When the same seed is re-run, read the previous run’s number from the file and compare it to the current one; if they differ, random stability has changed. Ideally the vendors would have a built-in way… maybe they do… Mentor?
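For concreteness, here’s a minimal sketch of the idea, assuming a UVM testbench. The class name, the canary_<seed>.txt naming, and the +canary_seed plusarg are all made up for illustration; the regression script would pass the seed back in, since there’s no standard way to query it from inside the simulation:

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

class stability_canary_test extends uvm_test;
  `uvm_component_utils(stability_canary_test)

  function new(string name = "stability_canary_test", uvm_component parent = null);
    super.new(name, parent);
  endfunction

  task run_phase(uvm_phase phase);
    int unsigned canary, previous;
    int unsigned seed;
    int fd;
    string fname;

    phase.raise_objection(this);

    // ... normal test stimulus would run here ...

    // Draw the canary from this process's RNG after all stimulus: it reflects
    // every random call made along the way, so any upstream change to random
    // stability changes its value.
    canary = $urandom();

    // One record file per seed; the +canary_seed plusarg is a made-up
    // convention for the regression script to hand the seed back in.
    if (!$value$plusargs("canary_seed=%d", seed)) seed = 0;
    fname = $sformatf("canary_%0d.txt", seed);

    // Compare against the previous run of this seed, if one was recorded.
    fd = $fopen(fname, "r");
    if (fd != 0) begin
      void'($fscanf(fd, "%d", previous));
      $fclose(fd);
      if (previous != canary)
        `uvm_warning("CANARY", $sformatf(
          "Random stability changed for seed %0d: was %0d, now %0d",
          seed, previous, canary))
    end

    // Record the current value for the next run of this seed.
    fd = $fopen(fname, "w");
    $fdisplay(fd, "%0d", canary);
    $fclose(fd);

    phase.drop_objection(this);
  endtask
endclass
```

One subtlety: the canary has to be drawn from a process whose RNG the test actually exercised. Since SystemVerilog random stability is per-thread, a draw in a separate initial or final block would have its own independently seeded RNG and could stay stable even when the rest of the testbench drifts.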