Verification Horizons Articles:
by Tom Fitzpatrick, Mentor, A Siemens Business
Welcome once again to our DVCon-US edition of Verification Horizons. I must confess that it feels strange not to be able to write about how the New England Patriots did in the Super Bowl this year since, after three consecutive appearances (winning two, by the way), the Patriots did not make it to the championship game this season. The bigger topic of football conversation around here is whether Tom Brady, our Greatest-of-all-Time quarterback for the past 20 seasons, will sign a new contract and return to the Patriots or will instead decide to sign with another team. Given that he’s made over $200 million in his career, and is married to one of the highest-paid super models in the world, I’m pretty sure his decision won’t be based just on money.
In addition to having played only for the Patriots, Brady also has the distinction of having played for the same head coach, Bill Belichick (also considered the Greatest-of-all-Time) for his entire career. As a result, there has been an ongoing discussion among football fans as to whether Brady’s success is due more to his own talent and work ethic or to Belichick’s coaching. If Brady signs with another team and isn’t as successful – which is common for 43-year-old quarterbacks – then everyone will judge that Belichick was the main reason for his success. Similarly, if Brady is successful with another team, then it could be argued that Belichick owes his success to Brady. It seems to me that they each have incentive to stay together. Perhaps they would both benefit from the advice of John Wooden, legendary UCLA basketball coach, who said, “It is amazing how much can be accomplished if no one cares who gets the credit.”
by Srinivasan Venkataramanan and Ajeetha Kumari, VerifWorks
Design verification is a field that requires a lot of thinking and an equal amount of coding. Ever-tighter time-to-market adds schedule pressure to the teams coding those testbenches and test cases.
The advent of UVM (Universal Verification Methodology) as the standard framework has helped the industry make good progress toward structured testbenches. One of the primary objectives of UVM is to build robust, reusable testbenches. UVM provides a set of guidelines on how to build reusable components, along with a Base Class Library (BCL) that implements the basic infrastructure.
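As a minimal sketch of what building on the BCL looks like, consider a trivial transaction and driver. The class names here (my_item, my_driver) are invented for illustration and do not come from the article:

```systemverilog
// A reusable sequence item and driver built on the UVM Base Class Library.
import uvm_pkg::*;
`include "uvm_macros.svh"

class my_item extends uvm_sequence_item;
  rand bit [7:0] data;
  `uvm_object_utils(my_item)
  function new(string name = "my_item");
    super.new(name);
  endfunction
endclass

class my_driver extends uvm_driver #(my_item);
  `uvm_component_utils(my_driver)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  task run_phase(uvm_phase phase);
    forever begin
      seq_item_port.get_next_item(req);  // pull the next transaction
      // ... drive req.data onto the bus interface here ...
      seq_item_port.item_done();         // signal completion to the sequencer
    end
  endtask
endclass
```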
As with any code development, verifying the code for compliance, quality, and, of course, functionality is a tedious but much-needed task. In this article we share our experience helping customers use an innovative technology named Karuta to perform rule-checking on their UVM code base.
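To give a feel for what such rule-checking targets, here is one common UVM guideline violation: constructing a component directly with new() instead of through the factory, which silently disables factory overrides. This is illustrative only, not a statement about Karuta's actual rule set; my_driver is the sketch above.

```systemverilog
class my_env extends uvm_env;
  `uvm_component_utils(my_env)
  my_driver drv;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    drv = new("drv", this);  // guideline violation: bypasses the UVM factory
    // Compliant form: drv = my_driver::type_id::create("drv", this);
  endfunction
endclass
```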
by Tomáš Vaňák, Codasip
Questa® SLEC, the formal analysis app from Mentor, was designed to automatically compare a block of code ("specification" RTL) with a slightly modified functional equivalent ("implementation" RTL), helping design teams save considerable time and resources. Codasip, the leading provider of configurable RISC-V® IP, has come up with a new use for this tool: its verification team uses it to compare fully UVM-verified HDL code, for example in Verilog, with a newly generated HDL output, such as SystemVerilog or VHDL, making sure the two are functionally identical in a fraction of the time needed before. The total time required for full verification of a new processor design is then reduced by up to 66%, depending on the desired number of HDL outputs.
SLEC, or Sequential Logic Equivalence Checking, formally verifies that two designs that differ in their sequential implementation but are supposed to be functionally equivalent really are: that they produce the same outputs for the same inputs at all times. SLEC is typically used when a design has been modified in a small yet operationally critical way and the new code needs to be verified. RTL simulation can also be used for this purpose, but it cannot exercise every corner case, so the verification is incomplete and therefore not entirely reliable. RTL simulation is also very time-consuming.
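As a toy illustration of the question SLEC answers, here are two descriptions of the same function and a miter asserting that their outputs agree for every input. This is a purely combinational example with invented module names; an actual SLEC tool builds and proves the comparison internally and, unlike this sketch, also handles designs whose cycle-level timing differs:

```systemverilog
module spec_mul3 (input logic [7:0] a, output logic [9:0] y);
  assign y = a * 3;             // "specification" RTL
endmodule

module impl_mul3 (input logic [7:0] a, output logic [9:0] y);
  assign y = (a << 1) + a;      // "implementation" RTL: shift-add rewrite
endmodule

// Miter: a formal tool can prove the outputs match for all input values.
module miter (input logic [7:0] a);
  logic [9:0] y_spec, y_impl;
  spec_mul3 u_spec (.a(a), .y(y_spec));
  impl_mul3 u_impl (.a(a), .y(y_impl));
  always_comb assert (y_spec == y_impl);
endmodule
```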
by Asif Ahmad and Abhishek Chauhan, Agnisys
In this era of automation, significant advantages can be gained by automatically generating verification and validation sequences from natural language text using artificial intelligence (AI) based sequence-detection techniques, and then using those sequences in C/UVM code. This article describes the current state of development in this area and offers ideas on how you can implement your own solution to achieve true specification-driven software development.
With the continuing advancement of AI and machine learning (ML), their application has spread across diverse high-technology fields, such as face detection, face warping, object detection and classification, language translation, chatbots, spam detection, and data scraping. Rule-based applications have taken a back seat as algorithms have emerged that can effectively define their own rules, using models such as linear regression, logistic regression, decision trees, and SVMs. Along with the algorithms themselves, what really matters is the data used to train these models.
In EDA, the application of ML and deep learning techniques enables modeling and simulation with unprecedented levels of insight. One can therefore expect greater efficiency and accuracy from design tools, which will translate into shorter turnaround times and greater flexibility in analysis and simulation coverage, and thus encourage broader automation. Machine learning can help identify patterns to optimize designs, allowing designers and testers to model more complex designs in a simpler way and in less time. This makes designs more efficient in multiple aspects of automation and design generation, as well as in verification and validation, whether by using assertions or by generating sequences for special registers [2] that provide full test reporting to accelerate the entire design process. ML-generated models can also give designers and engineers better feedback by indicating whether the design will live up to the expected performance at each step of the development process.
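To make the idea concrete, here is the kind of UVM sequence such a flow might emit for a specification sentence like "enable the block by writing 1 to CTRL, then poll STATUS until READY is set." The register names and the register-model lookups are assumptions for illustration, not output of any actual tool:

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

class enable_and_poll_seq extends uvm_sequence;
  `uvm_object_utils(enable_and_poll_seq)

  uvm_reg_block regmodel;  // register model handle, set by the test

  function new(string name = "enable_and_poll_seq");
    super.new(name);
  endfunction

  task body();
    uvm_status_e   status;
    uvm_reg_data_t data;
    uvm_reg ctrl = regmodel.get_reg_by_name("CTRL");
    uvm_reg stat = regmodel.get_reg_by_name("STATUS");
    ctrl.write(status, 32'h1);     // "write 1 to CTRL"
    do stat.read(status, data);    // "poll STATUS"
    while (data[0] == 1'b0);       // "until READY (bit 0) is set"
  endtask
endclass
```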
by Vishal Patel and Manoj Pandey, Arastu Systems
Big Data technology has evolved to handle both the volume and the velocity of data now being generated by chip design and verification activities. However, we feel that the core challenge of effective data management, and hence actionable insight generation, has still not been truly solved for the industry. Connecting the data islands created by various tools, in various formats, across digital design and verification workflows into a Unified Data Lake is an important missing piece. In addition, we believe there is no open tool or framework available to attach clean operational data to product data as it is generated, adding rich context and experience to the Data Lake.
We propose a unified, open-ended, industry-ready tool/framework to fill this vacuum. The tool extends a core value proposition around data aggregation, analysis, and insight generation throughout ASIC design and verification workflows without requiring changes to existing tools and methodologies.
by Dr. Nicole Fern, Tortuga Logic
Modern electronic systems are complex, and economics dictate that the design, manufacturing, testing, integration, and deployment of Application Specific Integrated Circuits (ASICs), Systems on Chip (SoCs), and Field Programmable Gate Arrays (FPGAs) span companies and countries. Establishing security and trust in this diverse landscape of third-party IP providers, processor vendors, SoC integrators, and fabrication facilities is challenging and introduces security risks. The obstacles to building secure systems include, but are not limited to, complex supply chains, reverse engineering, counterfeiting, physical tampering, and side-channel attacks.
Security vulnerabilities can be introduced throughout the design lifecycle, starting at the architectural level, where fundamental flaws in the security architecture, such as storing implicitly trusted boot code in unprotected writable SPI flash [1], can open systems to attack. A flawed microarchitectural design decision can also expose hardware to vulnerabilities (e.g., Meltdown [2] and Foreshadow [3]). Vulnerabilities can likewise be introduced during RTL design, such as unintentional backdoors in test and debug circuitry [4], and through errors in the configuration and use of hardware by low-level firmware and software [5-7].
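The RTL-level category is easy to picture. Below is a hypothetical fragment of the kind described in [4]: a debug enable that bypasses the lock bit guarding a key register, creating an unintentional test-mode backdoor. All names are invented for illustration:

```systemverilog
module key_reg (
  input  logic        clk, rst_n,
  input  logic        lock,      // set once by secure boot, never cleared
  input  logic        dbg_en,    // debug/test access enable
  input  logic        wr_en,
  input  logic [31:0] wr_data,
  output logic [31:0] key
);
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n)
      key <= '0;
    else if (wr_en && (!lock || dbg_en))  // BUG: dbg_en overrides the lock,
      key <= wr_data;                     // so debug mode can rewrite the key
  end
endmodule
```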
by Ashish Darbari, Axiomise
The verification of modern-day processors is a non-trivial exercise, and RISC-V® is no exception. In this article, we present a formal verification methodology for verifying a family of RISC-V® “low-power” processors. Our methodology is new and unique in the way it addresses verification challenges beyond just functional verification: we cover architectural verification, lockstep verification (part of functional safety), X-issues arising from low-power design, and security. Our approach finds bugs in previously verified and taped-out cores and also establishes bug absence through exhaustive proofs of correctness for end-to-end checks.
To address the problem of architectural verification, at Axiomise we have designed a new RISC-V® ISA Formal Proof Kit®, so far covering the RV32IC subset of the RISC-V® ISA. It finds architectural bugs (ISA instructions not executed correctly in the microarchitecture) and also proves bug absence exhaustively when the ISA has been implemented correctly. Using Questa® PropCheck, we have so far found several bugs covering architectural violations, X-propagation, and a potential security vulnerability on zeroriscy, besides seeing multiple failures with deadlock checks. On ibex, a new 32-bit core under development, we have so far discovered several failures on architectural checks, eight violations in lockstep verification, and X-issues. Our work defines a new milestone in exhaustive formal verification of microprocessors, as it proposes a new way of addressing several verification challenges: combining our formal-tool-vendor-neutral ISA Formal Proof Kit® with lockstep verification, deadlock checking, X-checking, and security analysis.
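To illustrate the flavor of an end-to-end architectural check (this is our own minimal sketch, not the actual Proof Kit, and the retire-interface signal names are assumptions), here is an SVA property stating that whenever an RV32I ADDI instruction retires, the write-back value must equal the ISA-defined result:

```systemverilog
module addi_check (
  input logic        clk,
  input logic        retire_valid,  // an instruction retires this cycle
  input logic [31:0] retire_insn,   // the retired instruction word
  input logic [31:0] rs1_val,       // architectural rs1 value at retire
  input logic [31:0] rf_wdata       // value written back to rd
);
  // ADDI: opcode 0010011, funct3 000; immediate is sign-extended insn[31:20]
  wire        is_addi = (retire_insn[6:0]   == 7'b0010011) &&
                        (retire_insn[14:12] == 3'b000);
  wire [31:0] imm     = {{20{retire_insn[31]}}, retire_insn[31:20]};

  assert property (@(posedge clk)
    retire_valid && is_addi |-> rf_wdata == rs1_val + imm);
endmodule
```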