Verification Horizons Articles:
by Tom Fitzpatrick, Editor and Verification Technologist, Mentor Graphics Corporation
Ah, metrics! There is no substitute for clear, objective data when evaluating a player, or a process for that matter. A productive quarterback is measured most of all on how many times he wins, especially in the Super Bowl (advantage: Brady), but there are other metrics that can be used as well. For verification, productivity really comes down to determining, as reliably and efficiently as possible, whether your chip will run correctly. Just as the quarterback with the better completion percentage doesn't always win, verification productivity is about much more than just how fast your simulator runs a particular test. We must think about the full spectrum of verification tasks and the tools and technology available to apply to them. Just as a football game is ultimately determined by the final score, all that matters to us as verification engineers is how well we can determine the correctness of our design within the schedule allotted. Keep these thoughts in mind as you read this month's issue of Verification Horizons.
by Rachida El IDRISSI, ST-Ericsson
As is true in many engineering projects, the most difficult step in verification is knowing when you are done. The challenge of reaching verification closure stems mostly from increasing complexity. Recall the general rule of thumb that verification cycles rise as some multiple of the number of gates in a design, which is enough to give one pause in the age of billion-gate designs. Other obstacles on the way to closure include new verification methodologies and their sometimes-steep learning curves, aggressive milestones that sometimes force verification teams to truncate their efforts, and the difficulty of reusing verification components.
by Roger Sabbagh, Mentor Graphics
It's no secret. Silicon development teams are increasingly adopting formal verification to complement their verification flow in key areas. Formal verification statically analyzes a design's behavior with respect to a given set of properties. Traditional formal verification comes in the form of model checking, which requires hand-coded properties along with design constraints. While some design groups certainly continue to be successful with that approach, the techniques gaining more widespread adoption are the automatic ones, which require much less manual setup. Let's take a look at the top five applications being used across the industry today.
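By way of illustration, the following is a minimal sketch of the kind of hand-coded property and constraint that model checking relies on, written here as PSL embedded in VHDL-2008; the arbiter_props entity and its clk/req/gnt handshake are hypothetical names used only for this example, not something taken from the article.

library ieee;
use ieee.std_logic_1164.all;

entity arbiter_props is
  port (clk, req, gnt : in std_logic);
end entity;

architecture psl of arbiter_props is
  -- VHDL-2008 allows PSL declarations directly in the declarative part
  default clock is rising_edge(clk);
begin
  -- Hand-coded property: every request must be granted within 1 to 4 cycles
  assert always ((req = '1') -> next_e[1 to 4] (gnt = '1'));
  -- Hand-coded environment constraint: a pending request is held until granted
  assume always ((req = '1' and gnt = '0') -> next (req = '1'));
end architecture;

Writing, debugging, and constraining dozens of such properties is exactly the manual setup that the automatic applications aim to avoid.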
by Shabtay Matalon and Mark Peryer, Mentor Graphics
Developing an SoC is a risky business: getting it right means coping with the technical complexity involved, managing the mixture of hardware and software design disciplines, and finding an optimal trade-off between design performance and power. One way to reduce these risks is to use a design and verification flow that is scalable enough to handle the complexity and flexible enough to explore architectural alternatives early in the design cycle, before implementation starts.
Mentor's System Design and Verification flow encapsulates a range of design and verification technologies, including embedded software development, SystemC platform design and validation tools, and HDL simulation tools, which can be used in various combinations through common interfaces. The flow originally came out of pilot work for the TSMC ESL reference flow and has since matured into a methodology that allows Mentor's Sourcery™ CodeBench, the Vista™ ESL platform, and the Questa® platform to be used together.
by Erich Marschner, Product Manager, Questa Power Aware Simulation, Mentor Graphics
Power management is a critical aspect of chip design today. This is especially true for chips designed for portable consumer electronics applications such as cell phones and laptop computers, but even non-portable systems are increasingly optimizing power usage to minimize operation costs and infrastructure requirements. Power management requirements must be considered right from the beginning, and the design and implementation of power management must occur throughout the flow, from early RTL design on through physical implementation. Verification of the power management logic is also essential, to ensure that a device operates correctly even when the power to various subsystems or components is turned off or varied to optimally meet operating requirements.
by Ahmed Eisawy, Mentor Graphics
Nearly all of today's chips contain analog/mixed-signal circuits. Although these often constitute only 25% of the total die, they may be 100% of the product differentiation and also, unfortunately, 80% of the problems in actually getting the chip to market in a cost-effective and timely way. With growing complexity and shrinking time-to-market, mixed-signal verification is becoming an enormous challenge for designers, and improving mixed-signal verification performance and quality is critical for today's complex designs.
The challenges in mixed-signal verification stem from two opposing forces: time-to-market constraints and shrinking process technologies.
by Jim Lewis, SynthWorks VHDL Training
VHDL-2008 (IEEE 1076-2008) is here! It is time to start using the new language features to simplify your
RTL coding and facilitate the creation of advanced verification environments.
VHDL-2008 is the largest change to VHDL since 1993. An abbreviated list of changes includes:
• Enhanced Generics = better reuse
• Assertion language (PSL) = better verification
• Fixed and floating point packages = better math
• Composite types with elements that are unconstrained arrays = better data structures
• Hierarchical reference = easier verification
• Simplified sensitivity lists = fewer errors and less work
• Simplified conditionals (if, ...) = less work
• Simplified case statements = less work
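Several of these conveniences can be seen together in the brief sketch below; the entity, port, and signal names are illustrative only and are not drawn from the article.

library ieee;
use ieee.std_logic_1164.all;

entity mux_reg is
  port (
    clk, rst, sel : in  std_logic;
    a, b          : in  std_logic_vector(7 downto 0);
    q             : out std_logic_vector(7 downto 0)
  );
end entity;

architecture rtl of mux_reg is
  -- Composite type whose element is an unconstrained array (VHDL-2008);
  -- the element width is supplied where the signal is declared
  type byte_array_t is array (natural range <>) of std_logic_vector;
  signal pipe : byte_array_t(0 to 1)(7 downto 0);
  signal y    : std_logic_vector(7 downto 0);
begin
  -- Simplified sensitivity list: process(all) is sensitive to every signal it reads
  comb : process (all)
  begin
    if sel then                -- simplified conditional: std_logic used directly
      y <= a;
    else
      y <= b;
    end if;
  end process;

  reg : process (clk)
  begin
    if rising_edge(clk) then
      if rst then              -- simplified conditional on the reset as well
        pipe(0) <= (others => '0');
        pipe(1) <= (others => '0');
      else
        pipe(0) <= y;
        pipe(1) <= pipe(0);
      end if;
    end if;
  end process;

  q <= pipe(1);
end architecture;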