Easy question regarding the blocking clock-event syntax (inside a UVM driver in this case, although this applies in many other places). Basically, I want to know the reason for using the semicolon ';' after the '@(posedge ...)' clock event:
task run_phase(uvm_phase phase);
  forever
    begin
      seq_item_port.get_next_item(req);
      // Wiggle pins of DUT
      @(posedge dut_vi.clock); // <---- HERE : Note the ';'
      dut_vi.cmd  <= req.cmd;
      dut_vi.addr <= req.addr;
      dut_vi.data <= req.data;
      seq_item_port.item_done();
    end
endtask
I have the suspicion that finishing the clock event with a semicolon might move the simulator into a different event time region, i.e. just after the clock has risen. Is this correct? What is the reasoning behind this?
The SystemVerilog grammar allows zero or more timing controls in front of any statement, and a lone ';' by itself is the null statement. So in your example, @(posedge dut_vi.clock); is a timing control followed by the null statement. It has nothing to do with event regions; the process resumes in exactly the same region whether the semicolon is there or not.
Since this code is within a begin/end block, all these statements execute sequentially, so there is no functional difference between using the semicolon or not (a sketch illustrating this follows). There would be a difference within a fork/join.
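For instance, here is a minimal self-contained sketch (a hypothetical module with a free-running clock, no UVM needed) showing that both forms behave identically in sequential code:

module seq_demo;
  bit clk;
  always #5 clk = ~clk;   // posedges at t = 5, 15, 25, ...

  initial begin
    // Form 1: event control followed by the null statement ';'.
    // The process blocks until the edge, then continues.
    @(posedge clk);
    $display("[%0t] form 1: runs after the first edge", $time);    // t = 5

    // Form 2: the same event control prefixing the next statement.
    // Behavior is identical in sequential code.
    @(posedge clk)
      $display("[%0t] form 2: runs after the second edge", $time); // t = 15

    $finish;
  end
endmodule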
Some people do prefer to add the semicolon simply because of the way some editors indent code. Without the semicolon, the next line becomes a continuation of a statement and gets indented.
Would you mind elaborating on the specific case with the fork? Would a ';' (or the lack of one) make a difference between two forked, timing-controlled procedural statements?
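For illustration, here is a minimal sketch (again a hypothetical module and clock, not from the thread) of the difference the follow-up asks about. Inside a fork/join, each top-level statement becomes its own parallel branch, so @(posedge clk); followed by another statement creates two branches, while @(posedge clk) prefixing that statement creates one:

module fork_demo;
  bit clk;
  always #5 clk = ~clk;   // posedges at t = 5, 15, 25, ...

  initial begin
    // With the ';', the fork contains TWO branches: the event control
    // (branch 1) and the $display (branch 2). They start in parallel,
    // so the $display prints immediately at t = 0.
    fork
      @(posedge clk);
      $display("[%0t] with ';'  : prints at time 0", $time);
    join                  // join still waits for the edge branch (t = 5)

    // Without the ';', the event control prefixes the $display, making
    // ONE branch that waits for the edge before printing (t = 15).
    fork
      @(posedge clk)
        $display("[%0t] without ';': prints after the edge", $time);
    join

    $finish;
  end
endmodule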