I’ve recently discovered that adding a clocking block to an interface, and having that clocking block drive an output, is akin to adding another driver to that interface signal.
LRM 14.6: "For each clocking block output whose target is a net, a driver on that net shall be created. The driver so created shall have (strong1, strong0) drive strength and shall be updated as if by a continuous assignment from a variable inside the clocking block."
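For concreteness, here’s a minimal sketch of the shape I mean (the names bus_if, valid, and data are made up, not from a real design). Per the LRM text above, each signal listed as a clocking block output picks up its own driver:

```systemverilog
interface bus_if (input logic clk);
  logic       valid;
  logic [7:0] data;

  // Simulation-only: kept out of synthesis with translate pragmas.
  // synthesis translate_off
  clocking cb @(posedge clk);
    default input #1step output #2;
    output valid, data;   // each output listed here creates a driver on the signal
  endclocking
  // synthesis translate_on
endinterface
```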
If you are just content with using interfaces as top-level test harness things with a single modport direction… great. But this is super frustrating if you want to use the interface internally for synthesis, BECAUSE…
The driver created by the clocking block now conflicts with the drivers in the synthesizable code, and you get a (suppressible) error.
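For example, here’s a hypothetical module hooked up to the bus_if sketch above. The RTL assignment is perfectly legal on its own, but it now collides with the clocking block’s driver:

```systemverilog
module producer (bus_if bus);
  // Legal RTL by itself, but bus.valid and bus.data also have the driver
  // created by bus_if's clocking block, so the simulator reports a
  // (suppressible) multiple-driver conflict.
  always_ff @(posedge bus.clk) begin
    bus.valid <= 1'b1;
    bus.data  <= bus.data + 8'd1;
  end
endmodule
```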
Things are worse if you want to use a single interface with a modport to declare the direction, as now you might want to drive every pin from a clocking block. AFAIK there is no way to have a clocking block take its directions from a modport: you can list a clocking block inside a modport, but the clocking block’s own directions are fixed where it is declared.
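A sketch of what I mean (same made-up names as above). Listing the clocking block in a modport only exposes it through that modport; the directions still have to be spelled out twice:

```systemverilog
interface bus_if (input logic clk);
  logic       valid;
  logic [7:0] data;

  clocking cb @(posedge clk);
    output valid, data;       // directions fixed here...
  endclocking

  modport tb_mp  (clocking cb, input clk);             // cb can be listed...
  modport dut_mp (input valid, input data, input clk); // ...but the RTL-facing
                                                       // directions are repeated
endinterface
```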
Now, this does only produce a suppressible error… so you can still use the clocking block… but YUCK.
I dealt with this by just using the clocking block as a monitor (everything an input) and then manually adding the output delays everywhere my driver drives an output. As a designer this feels quite frustrating, because this is exactly the convenience clocking blocks were supposed to provide.
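Concretely, the workaround looks something like this (same made-up names; vif is assumed to be a virtual bus_if handle inside the driver class). In the interface, the clocking block only samples, so no extra drivers are created:

```systemverilog
// Monitor-only clocking block: inputs create no drivers, so the RTL
// keeps sole ownership of every signal.
clocking mon_cb @(posedge clk);
  default input #1step;
  input valid, data;
endclocking
```

And the driver re-creates by hand the skew that `output #2` would have given me:

```systemverilog
// In the driver class: the #2 is the hand-rolled output skew that the
// clocking block's "output #2" would otherwise have applied.
task drive_beat (logic [7:0] d);
  @(posedge vif.clk);
  #2;                  // manual output skew
  vif.data  = d;
  vif.valid = 1'b1;
endtask
```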
I’m guessing that this requirement comes from the simulator’s need to optimize: it wants simple rules about who is a driver, so it can remove intermediate steps?
But I gather I’m not the only one using interfaces to connect blocks inside the design. It’s so handy to have the interfaces inside the design, so I can just point to them. Still, if I have to choose between the convenience of clocking-block drivers and using interfaces to connect my synthesizable blocks, I’m going to choose connecting my synthesizable blocks. It just feels like an artificial choice.

Clocking blocks are a purely simulation-only construct; I keep mine inside synthesis translate_off/translate_on pragmas. So why aren’t clocking block drives treated like any other assignment from a class to an interface signal, i.e. a deposit, instead of a full-on driver? Is it because counting how many depositors there might be on an interface member is difficult, which makes optimization difficult?
Back to my title question: how do other people deal with this?
(1) Do you just not use clocking block drivers on internal interfaces?
(2) Do you just suppress the error?
(3) Is there a magical workaround to this issue I don’t know about? (hopeful!)