1. Introduction

    In today’s semiconductor industry, design engineers face mounting pressure to deliver increasingly complex digital designs while meeting aggressive time-to-market demands. The exponential growth in design complexity, coupled with traditional compilation and simulation workflows, often creates bottlenecks that significantly impact development timelines. These bottlenecks are usually attributed to the simulation runtime of each iteration, but the growing time it takes to set up simulation runs for complex designs is also a factor. The setup overhead, multiplied by the number of verification iterations, accounts for a significant part of the overall verification process.

    While simulation runtime does consume a large share of the verification process, the compilation and elaboration performed for each iteration cumulatively amounts to a significant chunk of the overall verification cycle. Any method that alleviates that overhead, accumulated over the whole verification cycle, translates into significant savings in time and compute resources.
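    To make the scale of that accumulated overhead concrete, here is a minimal back-of-the-envelope model. The per-iteration hours and iteration counts below are illustrative assumptions for the sketch, not measurements from any tool or project:

    ```python
    # Illustrative cost model: setup overhead accumulated over a verification cycle.
    # All numbers are hypothetical assumptions chosen for the sketch.

    def total_setup_hours(setup_hours_per_iter: float, iterations: int) -> float:
        """Setup (compile + optimize + elaborate) cost summed over all iterations."""
        return setup_hours_per_iter * iterations

    # Assume 2 hours of compile/optimize/elaborate per iteration and 300
    # iterations over a project: 600 engineer-hours spent just on setup.
    full_flow = total_setup_hours(2.0, 300)      # 600.0 hours

    # If an incremental flow cut per-iteration setup to 15 minutes:
    incremental = total_setup_hours(0.25, 300)   # 75.0 hours

    print(f"saved: {full_flow - incremental} hours")  # -> saved: 525.0 hours
    ```

    Even with conservative assumptions, the product of setup time and iteration count dominates quickly, which is why shaving minutes off each setup pass matters.
    
    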

    In light of these challenges, SmartCompile addresses this critical problem by introducing an innovative set of tools that strategically combines new capabilities with optimized coding-style improvements. This comprehensive solution streamlines the entire design flow, from initial compilation through optimization to final simulation, enabling engineering teams to achieve substantial reductions in turnaround time while maintaining design integrity. By revamping the traditional approach to design compilation and simulation, SmartCompile empowers digital design and verification engineers to accelerate their development cycles and gain a competitive edge in today’s fast-paced digital marketplace.

    Traditional Simulation Process

    In a typical verification environment, whenever a change is made to the design or the verification environment, the modified files require recompilation. Fortunately, most tools already offer incremental compilation, recompiling only the modified files and leaving unmodified files untouched. In a traditional flow, however, the same cannot be said for the optimization and elaboration phases. Even a very minor change in the design or testbench forces a re-optimization and re-elaboration of the whole design. This may not be a big concern for smaller designs, but with the ever-increasing complexity of designs, the time taken to set up the simulation is becoming a bottleneck for many design and verification engineers.

    If you have to wait half a day for a compile-optimize-elaborate cycle, you are limited to only about two or three design changes per day, with most of the time spent waiting for the simulation to be set up rather than doing any meaningful debug of the design. The problem is exacerbated in a continuous integration (CI) and continuous deployment (CD) flow, where check-ins happen at a higher frequency and a check-in might trigger a rerun of the whole flow. In such scenarios you are not only spending time waiting for the compilation process to complete, but also consuming compute resources which, whether on-premises or cloud-based, can add up to a significant cost over the overall verification cycle.
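    The way setup latency caps daily debug throughput can be sketched with simple arithmetic. The setup, debug, and workday durations below are illustrative assumptions, not measured figures:

    ```python
    # How setup latency limits debug iterations per working day.
    # Durations are illustrative assumptions, not tool measurements.

    def iterations_per_day(setup_hours: float, debug_hours: float,
                           workday_hours: float = 10.0) -> int:
        """Number of full iterations (setup + debug) that fit in one working day."""
        return int(workday_hours // (setup_hours + debug_hours))

    # A 4-hour compile-optimize-elaborate cycle plus 1 hour of debug per
    # iteration leaves room for only 2 design changes in a 10-hour day.
    print(iterations_per_day(4.0, 1.0))   # -> 2

    # Cutting setup to 30 minutes raises that to 6 iterations per day.
    print(iterations_per_day(0.5, 1.0))   # -> 6
    ```

    The iteration count is dominated by setup time whenever setup is long relative to debug, which is exactly the regime the traditional flow puts large designs in.
    
    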
