Redefining Static and Formal Verification
Recent studies show that first-time silicon success is at an all-time low, so it is fair to question whether verification technologies and methodologies are delivering the solutions that teams require. At the same time, new technologies such as AI open exciting opportunities to redefine verification methodologies. Siemens EDA delivers a transformative application of static and formal technologies, empowered by AI and new forms of automation.

Introduction
As long as electronic systems have been designed, manufactured, and used, whether based on discrete transistors, FPGAs, SoCs, SiPs, or 3DICs, their developers have perceived daunting complexity and a high cost of failure. These perceptions were correct, and they led to a constant concern that the next project, the next generation of silicon, would become unattainable, its cost of failure too high. Yet these issues were kept at bay by Moore's law, globalization, and advances in development methodology. In addition, ample supplies of engineers entering the workforce fueled the machine, and while large investments in verification technologies and hiring did not improve first-time silicon yields, the situation remained largely status quo for years.
According to the Wilson Research Group Functional Verification Studies, the rate of first-silicon success among surveyed engineers held between 30 and 33 percent from the 2010 survey through 2020 (with one outlier at 26 percent in 2018). So while the situation was always challenging and the cost of respins kept rising, the industry did not lose ground, and the aggregate cost of failure remained directly correlated with the cost of silicon respins.
Something fundamental has changed. Since 2020, first-time silicon success has fallen by more than half, from 32 percent in 2020 to just 14 percent in 2024, a rate of failure unprecedented in the Wilson survey. Only one project in seven now achieves first-silicon success, down from one in three in 2020. This decline alters the industry's aggregate cost of failure: it is no longer simply a function of the rising cost of a respin.
For FPGA designs, the news is no better. In the same 2024 Wilson Research survey, only 13 percent of FPGA projects escaped into production with nothing worse than trivial bugs. The other 87 percent shipped with at least one non-trivial bug, and 60 percent with two or more.
Indeed, in his white paper "Breaking the bottleneck – Overcoming the Verification Productivity Gap 2.0," Harry Foster, Siemens EDA Chief Scientist of Verification, identified that the industry is encountering a new productivity gap, driven by trends such as 3DIC and software-defined architectures, rising security and reliability requirements, and more rigorous power-management requirements. At the same time that these requirements are stressing development teams like never before, other factors strain the development process even further: resource constraints brought on by an aging engineering workforce, the retreat from globalization, and new engineers entering the workforce at a pace not predicted to meet demand. In the November 2022 report "The Growing Challenge of Semiconductor Design Leadership," Boston Consulting Group and the SIA projected that in North America, demand for design engineers would increase 50 percent by 2030, yet the graduation rate of design engineers would increase only 1 percent over the same period.
While it is not time to panic just yet, these factors point to a future in which the complexity demanded of a strained workforce, amid a prohibitively high cost of failure, becomes untenable.
An adage often misattributed to Einstein holds true nonetheless: continuing to do verification the same way while hoping for different results is not rational. Change demands change. So verification needs to change.