Constraining dynamic array size with reusable policy class

Hello,

We are using policy classes as described in the 2015 DVCon US paper, “SystemVerilog Constraint Layering via Reusable Randomization Policy Classes” by John Dickol.

They work great. However, we are running into a problem when sizing a dynamic array.

https://www.edaplayground.com/x/EPL_

class policy_base#(type ITEM=int);
  
  ITEM item;
  
  virtual function void set_item(ITEM item);
    this.item = item;
  endfunction
  
endclass

class txn;

  rand bit [31:0] addr;
  rand logic [31:0] data[];
  
  rand policy_base#(txn) policy[$];

  function void pre_randomize;
    foreach(policy[i]) policy[i].set_item(this);
  endfunction

endclass

class txn_policy extends policy_base#(txn);

  constraint c_data {
    item.data.size == 1;
  }

endclass

module top;
  
  initial begin
    txn t = new();
    txn_policy p = new();
    t.policy.push_back(p);
    if (!t.randomize()) $error("randomize failed");
    $display("txn.data.size=%0d", t.data.size());
  end
  
endmodule

The following error is reported on Questa:

testbench.sv(38): randomize() failed due to conflicts between the following constraints:

testbench.sv(27): policy[0].c_data { (data.size() == 1); }

Where:

data.size() = 0

Am I doing something wrong?

Thanks.

Before calling t.randomize(), you need to assign t.data = new[1], which will resolve this constraint error. The constraint failure occurred because the data size in t was zero, while txn_policy expects it to be 1.
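To illustrate, a minimal sketch of that suggestion, reusing the names from the original example (this is my paraphrase of the workaround, not tested code from the thread):

```systemverilog
txn t = new();
txn_policy p = new();
t.policy.push_back(p);
t.data = new[1];  // pre-size the dynamic array so item.data.size == 1 is satisfiable
if (!t.randomize()) $error("randomize failed");
```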

Thanks,
Juhi Patel.
https://www.linkedin.com/in/juhi-patel-455535149

In reply to Juhi_Patel:

Hi Juhi,

Thanks for your reply. I don’t think that’s it though. SV LRM section 18.4 states that an array will be “resized according to the size constraint”. The usefulness of the size constraint would be limited if you had to know the size of the array and call new before randomizing.

Plus, here’s an example of dynamic array resizing via constraint from verificationguide.com. Their example does not use new: https://www.edaplayground.com/x/5mEp
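For comparison, here is a minimal standalone sketch of that pattern (my own reconstruction, not the linked code), where the size constraint alone resizes the array with no call to new[]:

```systemverilog
// A dynamic array sized directly by a constraint in the same class.
class pkt;
  rand byte payload[];
  constraint c_size { payload.size() inside {[1:4]}; }
endclass

module tb;
  initial begin
    pkt p = new();
    if (!p.randomize()) $error("randomize failed");
    $display("payload.size=%0d", p.payload.size());
  end
endmodule
```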

Thanks,
Ben

In reply to ben:

Yes, resizing is possible, but it seems to work only within the class itself or its derived classes, not when the handle is passed into another class.

Also, I tried adding this soft constraint in the txn class instead of calling new on the dynamic array, and it worked for me:

constraint c_data {
  soft data.size() == 2;
}

Thanks,
Juhi Patel.
https://www.linkedin.com/in/juhi-patel-455535149

In reply to Juhi_Patel:

You need to declare item as a rand variable.

Also it is a good idea to place a constraint on the size of the array in case no policies get added. You can either use a soft constraint as suggested, or a maximum hard constraint.
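A sketch of those two suggestions applied to the original example (soft default shown; a hard upper bound such as data.size() <= 16 would work similarly — this is my reading of the advice, not code from the thread):

```systemverilog
class policy_base#(type ITEM=int);
  rand ITEM item;  // declared rand so the policy's constraints join the solve

  virtual function void set_item(ITEM item);
    this.item = item;
  endfunction
endclass

class txn;
  rand logic [31:0] data[];
  rand policy_base#(txn) policy[$];

  // fallback in case no policy constrains the array size
  constraint c_data_dflt { soft data.size() == 0; }

  function void pre_randomize;
    foreach (policy[i]) policy[i].set_item(this);
  endfunction
endclass
```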

In reply to dave_59:

Hi Juhi/Dave,

I temporarily updated the example with rand on the item as follows:

class policy_base#(type ITEM=int);
  
  rand ITEM item;
  
  virtual function void set_item(ITEM item);
    this.item = item;
  endfunction
  
endclass

The same error persists, however. This may be because it is the item that is being randomized, not the policy object; the policy object does not randomize the item, only its own rand variables. I should mention that other rand variables can be constrained without rand on item. It is just the size constraints that fail.

But as Juhi mentioned, there is a workaround. It seems the simulator requires a soft or upper-bound size constraint in the item itself. I've updated the example (https://www.edaplayground.com/x/EPL_) with a new c_data constraint as follows, and the simulation runs without error:

class txn;

  rand logic [31:0] addr;
  rand logic [31:0] data[];
  
  rand policy_base#(txn) policy[$];

  constraint c_data {
    data.size < 4;
  }

  function void pre_randomize;
    foreach(policy[i]) policy[i].set_item(this);
  endfunction

endclass

It seems odd that the simulator requires this constraint. It is certainly good practice anyway, but since the policy object is applied in this case, I'm surprised the constraint solver couldn't solve it.

Thanks to both of you.

In reply to ben:

I think there may be some ambiguities in the LRM. There seems to be a lack of consensus among the tools on EDA Playground. You may want to discuss this with your tool vendor, as this forum is not for discussing tool-specific issues.