Hey, good find. That's pretty neat. My guess is that this is a compile/elaboration ordering issue: at compile time the compiler may not know the type and size of a system function's input, so it sizes the literal argument as a single bit by default. Later, at elaboration or simulation time, the simulator sees that the input is actually 32 bits, but the literal has already been fixed at 1 bit. Dave will know the right answer, though.
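Without the original snippet I'm only guessing at the mechanism, but here's a minimal sketch (the module name and the shift example are mine, not from the thread) of why a literal's inferred size matters inside a system task or function argument, where expressions are self-determined:

```systemverilog
// Minimal sketch: system task/function arguments are self-determined
// contexts, so the literal's own size governs the result width.
module literal_size_demo;
  initial begin
    $display("%h", 1 << 31);    // bare 1 is a 32-bit int literal: prints 80000000
    $display("%h", 1'b1 << 31); // 1-bit literal: the set bit shifts out, prints 0
  end
endmodule
```

If the tool really is sizing the literal to 1 bit at compile time, you'd see the same kind of truncation as on the second line.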
For improvements, see: