Verification Horizons Articles:
by Tom Fitzpatrick, Editor and Verification Technologist, Mentor Graphics Corporation
As verification engineers, we have to be able to accurately forecast the completion of our projects and also be able to cope with problems that arise along the way. Unfortunately, there are severe consequences when we get it wrong. And the stakes keep getting higher. We've spoken for years about designs getting more complex, but it's not just the number of gates anymore. The last few years have shown a continuous trend towards more embedded processors in designs, which brings software increasingly into the verification process. These systems-on-chip (SoCs) also tend to have large numbers of clock domains, which require additional verification. On top of this, we add multiple power domains, so we need to verify not only the basic functionality of an SoC, but also that the functionality remains correct when the power circuitry and control logic (much of which is software, by the way) is layered on top of the design.
by Paul B. Egan, Rockwell Automation
Traditionally, connectivity verification at the block level has been completed using dynamic simulation, and typically involves writing directed tests to toggle the top-level signals, then debugging why signal values did not transition as expected. For modules with a high number of wires and many modes of operation, the number of required tests quickly becomes unmanageable. We were searching for a better approach. This article explains how we applied formal analysis at the block level, how we extended it to the full chip, and how we significantly reduced verification time at both the block and chip level. Just as a block and tackle provides a mechanical advantage, the formal connectivity flow provides a verification advantage.
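To make the idea concrete, here is a minimal sketch, not taken from the article itself, of how a single connectivity requirement can be expressed as a SystemVerilog assertion that a formal tool can prove exhaustively instead of relying on directed toggle tests. All signal and module names are invented for illustration.

    // Hypothetical pin-mux connectivity check: when mode_sel selects the
    // SPI function, the top-level pad must follow the block-level output.
    module connectivity_checks (
      input logic       clk,
      input logic [1:0] mode_sel,     // pin-mux mode select
      input logic       spi_mosi_blk, // block-level source signal
      input logic       pad_out       // top-level destination pad
    );
      property p_spi_mosi_connected;
        @(posedge clk) (mode_sel == 2'b01) |-> (pad_out == spi_mosi_blk);
      endproperty

      a_spi_mosi_connected: assert property (p_spi_mosi_connected)
        else $error("pad_out not driven by spi_mosi_blk in SPI mode");
    endmodule

One such property per connection and mode replaces an entire directed test that would otherwise have to toggle the source and observe the destination.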
by Hemant Sharma and Hans van der Schoot; Mentor Graphics
The standard practice of developing RTL verification and validation platforms as separate flows forgoes significant opportunities to improve productivity and quality through the sharing of modules and methods between the two. Bringing these two flows together saves an immense amount of duplicated effort and time while reducing the introduction of errors, because less code needs to be developed and maintained.
A unified flow for RTL verification and pre-silicon validation of hardware/software integration is accomplished by combining a mainstream, transaction-level verification methodology – the Universal Verification Methodology (UVM) – with a hardware-assisted simulation acceleration platform (also known as co-emulation). Necessary testbench modifications to enable this combination are generally nonintrusive and require no third-party class libraries; thus, verification components from customer environments are readily reusable for pure simulation environments, different designs using the same block, and different verification groups.
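As a hedged sketch of the pattern this implies, assuming the common two-domain partitioning used in co-emulation flows (all names below are invented): timed, synthesizable bus-functional code moves into an HDL-side interface, while the UVM driver becomes an untimed proxy that simply delegates to it. The same testbench can then run in pure simulation or with the HDL side accelerated.

    `include "uvm_macros.svh"
    import uvm_pkg::*;

    // HDL side: a synthesizable bus-functional model (BFM). All timed
    // activity lives here, so under acceleration it runs on the emulator.
    interface bus_bfm (input logic clk);
      logic [31:0] addr, data;
      logic        valid;

      task write(input logic [31:0] a, input logic [31:0] d);
        @(posedge clk);
        addr  <= a;
        data  <= d;
        valid <= 1'b1;
        @(posedge clk);
        valid <= 1'b0;
      endtask
    endinterface

    // HVL side: a transaction and an untimed UVM driver acting as a
    // proxy that delegates all timing to the BFM.
    class bus_item extends uvm_sequence_item;
      rand bit [31:0] addr, data;
      `uvm_object_utils(bus_item)
      function new(string name = "bus_item");
        super.new(name);
      endfunction
    endclass

    class bus_driver extends uvm_driver #(bus_item);
      `uvm_component_utils(bus_driver)
      virtual bus_bfm bfm; // handle to the HDL-side BFM

      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction

      task run_phase(uvm_phase phase);
        forever begin
          seq_item_port.get_next_item(req);
          bfm.write(req.addr, req.data); // no timing in the proxy itself
          seq_item_port.item_done();
        end
      endtask
    endclass

Because the proxy contains no timing, the change stays localized to the driver/BFM boundary, which is consistent with the nonintrusive modifications the article mentions.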
by Ken P. McCarty, Mentor Graphics
System Level Code Coverage gives architects and designers the opportunity to validate that their verification strategy is comprehensive and that the design is optimal. It is important to understand that System Level Code Coverage is a forward-looking strategy, approached from an architectural perspective; it is performed even before the first line of RTL is ever written.
by Erich Marschner, Product Manager, Questa Power Aware Simulation, Mentor Graphics
Usage of the Unified Power Format (UPF) is growing rapidly as low power design and verification becomes increasingly necessary. In parallel, the UPF standard has continued to evolve. A previous article [1] described and compared the initial UPF standard, defined by Accellera, and the more recent IEEE 1801-2009 UPF standard, also known as UPF 2.0. The IEEE definition of UPF is the current version of the standard, at least for now, but that is about to change. The next version, UPF 2.1, is scheduled for review by the IEEE Standards Review Committee in early March. When it is approved, UPF 2.1 will become the current version of the UPF standard.
UPF 2.1 is an incremental update of UPF 2.0, not a major revision. That said, UPF 2.1 contains a large number of small changes, ranging from subtle refinements of existing commands to improve usability, to new concepts that help ensure accurate modeling of power management effects. This article describes some of the more interesting enhancements and refinements coming soon in UPF 2.1.
by Ajeetha Kumari, Srinivasan Venkataramanan, CVC Pvt. Ltd.
Power management is a critical aspect of chip design today. This is especially true for chips designed for portable consumer electronics applications such as cell phones and laptop computers, but even non-portable systems are increasingly optimizing power usage to minimize operating costs and infrastructure requirements. Power management requirements must be considered right from the beginning, and the design and implementation of power management must occur throughout the flow, from early RTL design through physical implementation. Verification of the power management logic is also essential, to ensure that a device operates correctly even when the power to various subsystems or components is turned off or varied to optimally meet operating requirements.
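As a minimal sketch of what such verification can check, assuming invented signal names and a single switchable domain, the assertions below capture two typical power-management requirements: isolation must be engaged before power is removed, and a powered-down domain's output must stay at its clamp value.

    // Illustrative power-management checks; names are not from the article.
    module pm_checks (
      input logic clk,
      input logic pwr_en,  // domain power enable
      input logic iso_en,  // isolation enable from the power controller
      input logic dom_out  // isolated output of the switchable domain
    );
      // Isolation must already be active when power is removed.
      a_iso_before_off: assert property (
        @(posedge clk) $fell(pwr_en) |-> iso_en
      );

      // While the domain is off, its output stays at the clamp value.
      a_clamped_when_off: assert property (
        @(posedge clk) !pwr_en |-> (dom_out == 1'b0)
      );
    endmodule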
by Ben Cohen, author, consultant, and trainer.
Verification can be defined as checking that the design meets its requirements. How can this be achieved? Many verification approaches have been used over the years, and they are not necessarily independent, but often complementary. For example, simulation may be performed on some partitions while emulation is used on others. The verification process in simulation has evolved from simple visual (and very painful) examination of waveforms, with the DUT driven by a driver using directed tests, to transaction-based pseudo-random tests and checker modules. This has led to the development of class-based frameworks (e.g., e, VMM, OVM, UVM) that separate the tests from the environment, are capable of generating various concurrent scenarios with randomness and constraints, and are capable of easily transferring data to class-based subscribers for analysis and verification. In parallel, designers and verification engineers have improved the detection and localization of bugs in their design code using assertion-based languages (e.g., PSL, SVA).
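A minimal SVA sketch, with a protocol and names invented for illustration, shows the kind of assertion that localizes a bug to the exact signal and cycle where a rule is first broken:

    // Simple request/acknowledge handshake rules.
    module handshake_sva (input logic clk, rst_n, req, ack);
      // Once raised, req must hold until ack is seen.
      a_req_stable: assert property (
        @(posedge clk) disable iff (!rst_n)
        req && !ack |=> req
      );

      // ack must arrive within 1 to 4 cycles of req rising.
      a_ack_latency: assert property (
        @(posedge clk) disable iff (!rst_n)
        $rose(req) |-> ##[1:4] ack
      );
    endmodule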
by Ajay Daga, CEO, FishTail Design Automation Inc.
There are two approaches to the verification of design constraints: formal verification and structural analysis. Structural analysis refers to the type of analysis performed by a static timing tool, where timing paths either do or do not exist based on constant settings and constant propagation. Formal verification, on the other hand, establishes the condition under which a timing path exists, based on the propagation requirements for the path. These path propagation requirements are then used to prove or disprove constraint properties through a formal analysis of the design. Structural analysis is fast because it is simple. Formal verification, however, is more complete and less noisy. Formal verification allows engineers to guarantee that their design is safe from silicon issues that result from an incorrect constraint specification. Structural analysis cannot make this claim because it cannot independently establish the correctness of a variety of design constraints.
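A small sketch, with names invented for illustration, shows why propagation conditions matter. The path from fast_d to q below propagates only when mode is 1, so a false-path constraint through that input is safe only if mode can never be 1 in the constrained operating condition. Formal constraint verification derives and checks exactly that condition; structural analysis can only see it if mode happens to be tied to a constant.

    // The fast_d -> q path is sensitized only when mode == 1.
    module mode_mux (
      input  logic clk,
      input  logic mode,    // operating-mode select
      input  logic fast_d,  // timing-critical source
      input  logic slow_d,
      output logic q
    );
      always_ff @(posedge clk)
        q <= mode ? fast_d : slow_d;
    endmodule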
by Mark Litterick, Verification Consultant, Verilab GmbH
While it is generally accepted that the Universal Verification Methodology (UVM) is the way forward for modern SystemVerilog-based verification environments, many companies have an extensive legacy of verification components and environments based on its predecessor, the Open Verification Methodology (OVM). Furthermore, a single-push migration of all block-level and top-level environments between the two methodologies may be considered too disruptive for most companies, taking into account families of complex derivatives under development and scheduled for the near future. Several of our major clients are in the transition phase of initiating new product developments in UVM while having to maintain ongoing development in OVM. Moreover, it is imperative for the new development that we migrate the vast majority of existing verification components from OVM to UVM, even though the projects providing the VIP remain with OVM for the time being.
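For components that avoid deprecated OVM features, the core of such a migration is largely a mechanical renaming, as the hedged before-and-after sketch below illustrates (the component and the my_item transaction type are invented, and the two versions would live in the OVM and UVM variants of the same source file):

    // Before: an OVM component (my_item assumed defined elsewhere).
    class my_monitor extends ovm_monitor;
      `ovm_component_utils(my_monitor)
      ovm_analysis_port #(my_item) ap;
      function new(string name, ovm_component parent);
        super.new(name, parent);
        ap = new("ap", this);
      endfunction
    endclass

    // After: the same component migrated to UVM; the change is the
    // ovm_ -> uvm_ prefix in base classes, macros, and TLM ports.
    class my_monitor extends uvm_monitor;
      `uvm_component_utils(my_monitor)
      uvm_analysis_port #(my_item) ap;
      function new(string name, uvm_component parent);
        super.new(name, parent);
        ap = new("ap", this);
      endfunction
    endclass

In practice, the harder cases are components that rely on methodology-specific features without a one-to-one mapping, which is why an incremental migration strategy matters.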