Hi community,
The idea is to have one of the sequence item's parameters be an asynchronous delay, in picoseconds, between when the sequence starts and when one of its signals is driven onto the port.
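Roughly what I have in mind for the item (a sketch, assuming a UVM-style sequence item; my_item, data and delay_ps are just placeholder names):

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// Sequence item carrying the payload plus the requested asynchronous delay
class my_item extends uvm_sequence_item;
  rand bit [31:0]    data;      // placeholder payload
  rand int unsigned  delay_ps;  // asynchronous delay in picoseconds before driving

  constraint c_delay { delay_ps inside {[0:10_000]}; }  // example range

  `uvm_object_utils_begin(my_item)
    `uvm_field_int(data,     UVM_ALL_ON)
    `uvm_field_int(delay_ps, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "my_item");
    super.new(name);
  endfunction
endclass
```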
I successfully implemented a synchronous version of the delay, where the driver triggers on each clock edge and checks the number of cycles elapsed since receiving the transaction to know whether it can drive the port signals yet.
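A simplified sketch of that synchronous version, assuming a UVM driver with a virtual interface vif and a delay_cycles field on the item (names are placeholders):

```systemverilog
// Inside a uvm_driver #(my_item); the delay is expressed in clock cycles here
task run_phase(uvm_phase phase);
  my_item      item;
  int unsigned elapsed;
  forever begin
    seq_item_port.get_next_item(item);
    elapsed = 0;
    // trigger on every clock edge and count cycles elapsed since the item arrived
    while (elapsed < item.delay_cycles) begin
      @(posedge vif.clk);
      elapsed++;
    end
    vif.data <= item.data;  // drive the port signal once the delay has expired
    seq_item_port.item_done();
  end
endtask
```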
Doing the same for an asynchronous delay would mean using a driver loop that triggers every picosecond, which quickly makes simulation runtime explode for larger designs.
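For illustration, the naive asynchronous equivalent would look something like the sketch below, which is exactly what I want to avoid (assumes a time precision of 1ps or finer):

```systemverilog
// Naive approach: wake up every picosecond and count -- millions of
// scheduler events per transaction, so runtime explodes on large designs
task run_phase(uvm_phase phase);
  my_item      item;
  int unsigned elapsed_ps;
  forever begin
    seq_item_port.get_next_item(item);
    elapsed_ps = 0;
    while (elapsed_ps < item.delay_ps) begin
      #1ps;          // one wakeup per picosecond
      elapsed_ps++;
    end
    vif.data <= item.data;
    seq_item_port.item_done();
  end
endtask
```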
I'm using SystemVerilog.
Suggestions on how to proceed?