- Darron May - Mentor, A Siemens Business
Big Data is a term that has been around for many years. The list of applications for Big Data is endless, but the process stays the same: capture, process, and analyze. So why shouldn’t this technology help improve your verification process efficiency and predict your next chip sign-off? By providing a Big Data infrastructure, built with state-of-the-art technologies, within the verification environment, the combination of all verification metrics allows resources to be used as efficiently as possible and enables process improvements through predictive analysis. This paper will cover the technology, the metrics, and the process, and it will explore a number of techniques enabled by such an infrastructure.
Big Data is a term that has been around for more than 20 years. It was initially defined as data sets captured, managed, and processed in a tolerable period of time beyond the ability of normal software tools. The only constant in Big Data’s size over this time is that it has been a moving target, driven by improvements in parallel processing power and cheaper storage capacity. Today most of the industry uses the 3V model, which frames the challenges and opportunities of Big Data along three dimensions: volume, velocity, and variety. Lately the definition has been expanded to include machine learning and digital footprints. The list of applications is endless, but the process is the same: capture, process, and analyze. So why shouldn’t this technology help improve your verification process efficiency and predict your next chip sign-off?
Today’s verification environments have to be collaborative because of the size of devices, geographically dispersed teams, and time-to-market pressure. That collaboration demands the efficient use of every cycle and careful management of hardware, software, and personnel resources.
This paper will define the typical verification environment and the data that it often leaves uncaptured over the course of a project. It will show how the capture, process, and analyze cycle can be applied to improve the predictability and efficiency of the whole verification process. This requires a flexible infrastructure that allows data to be extracted from the multiple systems that make up the typical verification flow. There must be a central repository that stores the data in a common way, so the data can be kept clean and relevant, not only over the project’s duration but also into the future, enabling comparisons and predictions on other and new projects. This paper will explain how new web-ready technologies can be applied to the hardware development flow to provide a plug-and-play infrastructure for collaboration. It will also highlight some of the types of analysis and insights made possible by combining common coverage metrics with data that is normally lost, as well as the inter-relationships between those metrics.
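A central repository that stores data from many tools "in a common way" implies some shared record format. As a minimal sketch, assuming a hypothetical schema (the field names below are illustrative, not from any specific tool), each system could emit timestamped records like this:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical common record for a central metrics repository.
# Field names are illustrative assumptions, not a published schema.
@dataclass
class MetricRecord:
    project: str
    source: str     # originating system, e.g. "regression", "bug_tracker", "vcs"
    metric: str     # e.g. "functional_coverage", "open_bugs", "lines_changed"
    value: float
    timestamp: str  # ISO-8601, so records from different tools align over time

def to_json(record: MetricRecord) -> str:
    """Serialize a record for storage in the shared repository."""
    return json.dumps(asdict(record), sort_keys=True)

record = MetricRecord("soc_a", "regression", "functional_coverage", 72.5,
                      "2024-01-15T08:00:00")
print(to_json(record))
```

Because every source tags its records with the same project, metric, and timestamp fields, coverage, bug, and source-control data can later be joined on those keys for cross-analytics.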
The ability to see gathered metrics over time can provide great insight into the process. Historical coverage data trended over time can, on its own, indicate how much more time is needed to complete sign-off, and plotting several single metrics together on the same graph surfaces information that is often lost. This paper will also show, with examples, other insights that can be gained from cross-analytics between bug closure rates and source code churn, which, when combined with coverage metrics, can help predict progress towards sign-off. It will show how historical data can be used to spot similar patterns of events, and how recording a little more meta-data within existing systems allows cross-analytics to determine, for example, how effective a new methodology or tool has been based on past projects. It also allows further metrics to be calculated from the data, such as mean time between bug fixes, mean time between regression failures, the last time a test passed or failed, and tool license use by specific users — allowing us to answer and predict questions like “Will we have enough formal licenses for our peak usage on the next project?”
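Two of the derived metrics mentioned above are simple to compute once the raw events are captured. The sketch below, under assumed inputs (a list of failure timestamps and a list of (day, coverage %) points), computes mean time between regression failures and a naive linear extrapolation of coverage trend to a sign-off target; both function names are hypothetical:

```python
from datetime import datetime

def mean_time_between(events):
    """Mean gap in hours between consecutive timestamped events
    (e.g. regression failures or bug fixes)."""
    times = sorted(datetime.fromisoformat(t) for t in events)
    gaps = [(b - a).total_seconds() / 3600
            for a, b in zip(times, times[1:])]
    return sum(gaps) / len(gaps)

def days_to_target(history, target=100.0):
    """Naive linear extrapolation of (day, coverage %) points
    to estimate remaining days until a coverage target."""
    (d0, c0), (d1, c1) = history[0], history[-1]
    rate = (c1 - c0) / (d1 - d0)   # coverage % gained per day
    return (target - c1) / rate

failures = ["2024-01-01T00:00:00", "2024-01-02T12:00:00", "2024-01-04T00:00:00"]
print(mean_time_between(failures))              # 36.0 hours between failures
print(days_to_target([(0, 40.0), (30, 70.0)]))  # 30.0 more days at 1 %/day
```

A straight-line fit is the crudest possible predictor; in practice coverage closure flattens near the end, so a production system would fit a saturating curve, but the principle of projecting trended historical data forward is the same.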
View & Download:
Read the entire Improving Verification Predictability and Efficiency Using Big Data technical paper.