Verification Horizons Articles:
by Tom Fitzpatrick, Editor and Verification Technologist, Mentor Graphics Corporation
As I write this note, it's February here in Massachusetts, which in 2015 means that there's about five feet of snow on the ground, with predictions of another foot of snow coming our way this weekend. And after that, the temperatures will drop to the single digits (°F), so even though my children will be on vacation from school next week, it's not really going to be an enjoyable week. Oh well, at least we got to enjoy the New England Patriots winning Super Bowl XLIX on a tremendous last-second interception.
The Patriots' Super Bowl win gave me an excellent opportunity to reflect on the passage of time. For those of you who don't follow (American) football, this was the Patriots' sixth Super Bowl appearance and fourth championship since 2002. Having become an avid fan since I moved to Massachusetts, this has been quite a run. I can remember each game, and my memories include watching my son grow up with each passing season. Watching him deal with the disappointment of heartbreaking losses and being able to celebrate so many exciting victories with him has truly been a blessing. And it doesn't hurt that he's now big enough to handle most of the snow shoveling, when we're not watching a game.
by Harry D. Foster, Mentor Graphics
In 2002 and 2004, Collett International Research, Inc. conducted its well-known ASIC/IC functional verification studies, which provided invaluable insight into the state of the electronic industry and its trends in design and verification at that point in time. However, after the 2004 study, no additional Collett studies were conducted, which left a void in identifying industry trends. To address this dearth of knowledge, over the years Mentor Graphics has commissioned multiple world-wide, double-blind, functional verification studies, covering all electronic industry market segments. In this article, we present a few highlights from our most recent study, and try to address the question: "Does design size influence the likelihood of achieving first silicon success?"
by Adnan Khan, John Biggs & Eamonn Quigley, ARM® Ltd., and Erich Marschner, Mentor Graphics
Managing power consumption is now one of the key drivers of design and implementation of Systems on Chip, whether for extending battery life in mobile devices or constraining power envelopes to manage thermal dissipation. As a result, many more IC design houses and IC IP suppliers are becoming much more serious about defining complex power control strategies for their products. This is driving widespread adoption of IEEE 1801 UPF as a means of describing these strategies, from high-level power intent right down to the details of how the power control strategies are implemented.
UPF was originally developed as an Accellera standard [1]. That version of UPF was then updated and released as IEEE Std 1801™-2009 UPF [2], and it has continued to evolve as an IEEE standard [3,4]. The initial version of the UPF standard focused primarily on implementation detail, but IEEE 1801 UPF and its more recent updates have widened the focus to cover more strategic and abstract aspects of power control strategies.
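As a concrete illustration of what such power intent looks like, a UPF file might define a switchable power domain together with its isolation and retention strategies. The sketch below is hypothetical and highly simplified; the domain, instance, and net names (PD_core, u_core, u_pmu, VDD, and so on) are illustrative assumptions, not taken from the article.

```tcl
# Hypothetical UPF sketch: an always-on top domain plus a switchable
# core domain. All names here are illustrative.
create_power_domain PD_top
create_power_domain PD_core -elements {u_core}

create_supply_port VDD
create_supply_net  VDD
create_supply_net  VDD_core
connect_supply_net VDD -ports {VDD}

# Power switch gates the core supply under control of the power controller
create_power_switch sw_core -domain PD_core \
    -input_supply_port  {in VDD} \
    -output_supply_port {out VDD_core} \
    -control_port       {ctrl u_pmu/core_pwr_en} \
    -on_state           {on in {ctrl}}

# Clamp the outputs of the switchable domain while it is powered down
set_isolation iso_core -domain PD_core \
    -applies_to outputs -clamp_value 0
set_isolation_control iso_core -domain PD_core \
    -isolation_signal u_pmu/core_iso_en -isolation_sense high

# Retain register state across shutdown using the always-on supply
set_retention ret_core -domain PD_core -retention_power_net VDD
set_retention_control ret_core -domain PD_core \
    -save_signal    {u_pmu/core_save high} \
    -restore_signal {u_pmu/core_restore high}
```

In a power-aware simulation flow, a UPF file along these lines is read in alongside the RTL, so the tool can model power-down corruption, isolation clamping, and retention behavior without any changes to the RTL itself.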
by Gayathri SN and Badrinath Ramachandra, L&T Technology Services Limited
Power management is a major concern throughout the chip design flow, from architectural design to RTL implementation and physical design. Multi-power-domain SoCs are complex and present new integration and verification challenges, because many blocks have different operating modes at different voltages, and each block has its own clock period and its own duty cycle of being awake, asleep, or in shutdown mode.
Mobile devices that use standards like USB3 have raised the need for longer battery life. This means that USB3 designs need to be implemented with a PowerAware architecture using low-power techniques.
PowerAware verification for USB3 is a challenging task, as there are numerous rules and conditions defined in the USB 3.0 specification for low-power, active, and idle power management to ensure that USB3 designs are power efficient. Verification must also ensure that the design-specific power management circuitry functions as expected, and that the overall chip functionality is not corrupted by the power intent described in the UPF descriptions.
by Dr. Lauro Rizzatti, Rizzatti LLC
About 30 years ago, when computers revolutionized the semiconductor design process, a new verification technology appeared on the horizon under the name of hardware emulation. It was implemented in a big box and billed as being able to verify and debug chip designs.
Truth be told, there were precursors to this design verification approach. IBM, for one, had already experimented with hardware to accelerate design verification in the Yorktown Simulation Engine (YSE) and the Engineering Verification Engine (EVE). In the early-to-mid 80s, Daisy Systems, Valid Logic, Teradyne, and others also introduced hardware-based verification engines. The fact is, all of these were simulation accelerators: special-purpose computers, some implemented in custom ASICs, others with commercial parts, that verified designs described in software languages. They ran faster than simulators, but not fast enough to apply real-life stimulus to the design-under-test (DUT).
by Ram Narayan, Oracle
Project RAPID is a hardware-software co-design initiative in Oracle Labs that uses a heterogeneous hardware architecture combined with architecture-conscious software to improve the energy efficiency of database-processing systems. This article, adapted from a paper we presented at DVCon 2014, describes how formal methods went from being used opportunistically to occupying a central place in the verification methodology of the RAPID SoC. Though not without a learning curve, formal did help the Oracle team achieve its verification goals of finishing on schedule and achieving first-pass silicon success.
The initial RAPID verification plan largely relied on a constrained-random simulation environment developed using UVM. Most newly designed units had their own verification environment to thoroughly verify the unit prior to integration into the RAPID SoC. The SoC verification was largely used to verify the interaction between the different units under various operating modes.
by Ashley Winn, Sondrel IC Design Services
In any verification environment, it takes a significant amount of work to keep all the tests running and to ensure that each test continues to be effective. To make this job easier, tests need to be kept as short as possible and should be written at the highest level of abstraction possible for the feature being tested. In UVM, sequences provide an ideal structure for creating test procedures at the right layer of abstraction for a particular feature. I'd like to recommend a strategy that uses default virtual sequences running on an environment-level virtual sequencer. This setup makes it easy to define and start sequences within a test, simplifies the end-of-test logic, and allows you to move reused test procedures into a sequence library. It also helps keep your test files manageably small.
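To make the strategy concrete, here is a hypothetical SystemVerilog/UVM sketch of an environment-level virtual sequencer and a default virtual sequence; the class and handle names (env_vsequencer, bus_sqr, bus_config_seq, and so on) are illustrative assumptions, not code from the article.

```systemverilog
// Environment-level virtual sequencer: holds handles to the agent
// sequencers, which the environment assigns during connect_phase.
class env_vsequencer extends uvm_sequencer;
  `uvm_component_utils(env_vsequencer)
  uvm_sequencer #(bus_item) bus_sqr;  // bus agent's sequencer
  uvm_sequencer #(irq_item) irq_sqr;  // interrupt agent's sequencer
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

// A default virtual sequence: coordinates reusable sub-sequences on
// the agent sequencers, so the test itself stays small.
class default_vseq extends uvm_sequence;
  `uvm_object_utils(default_vseq)
  `uvm_declare_p_sequencer(env_vsequencer)
  function new(string name = "default_vseq");
    super.new(name);
  endfunction
  task body();
    bus_config_seq cfg = bus_config_seq::type_id::create("cfg");
    cfg.start(p_sequencer.bus_sqr);  // reused procedure from a library
    // ... start further sub-sequences; when body() returns, the
    // test's end-of-test logic can simply drop its objection.
  endtask
endclass
```

A test can then swap in a different virtual sequence via the factory or the configuration database, keeping each test file down to a few lines.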
by Samrat Patel & Vipul Patel, eInfochips
The fundamental goal of a verification engineer is to ensure that the Device Under Test (DUT) behaves correctly in its verification environment. As chip designs grow larger and more complex with thousands of possible states and transitions, a comprehensive verification environment must be created that minimizes development effort. To minimize effort, functional coverage is used as a guide for directing verification resources by identifying tested and untested portions of the design. The approach should give any verification engineer confidence in DUT functionality.
Functional coverage is a user-defined metric that maps each functionality to be tested, as defined in the test plan, to a coverpoint. Whenever the functionality under test is exercised in simulation, the corresponding coverage point is automatically updated. A functional coverage report can then be generated that summarizes how many coverage points were hit, and these metrics can be used to measure the progress of the verification effort.
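As a hypothetical illustration of mapping test-plan items to coverpoints, a SystemVerilog covergroup for a packet interface might look like the sketch below; the transaction fields and bin ranges (length, ptype, and so on) are invented for illustration, not taken from the article.

```systemverilog
// Hypothetical coverage collector: each coverpoint corresponds to a
// test-plan item; sample() is called for every observed transaction.
class pkt_coverage;
  int unsigned length;   // packet length observed by the monitor
  bit [1:0]    ptype;    // packet type observed by the monitor

  covergroup pkt_cg;
    // Test-plan item: exercise small, medium, and large packets
    cp_len : coverpoint length {
      bins small  = {[1:64]};
      bins medium = {[65:512]};
      bins large  = {[513:1500]};
    }
    // Test-plan item: see every packet type
    cp_type : coverpoint ptype;
    // Test-plan item: every type at every length range
    len_x_type : cross cp_len, cp_type;
  endgroup

  function new();
    pkt_cg = new();
  endfunction

  // Called by the monitor for each observed packet
  function void sample_pkt(int unsigned len, bit [1:0] typ);
    length = len;
    ptype  = typ;
    pkt_cg.sample();
  endfunction
endclass
```

At the end of simulation, the tool's coverage report shows which bins were hit and which were missed, directly identifying the test-plan items that remain untested.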