Search Results


83 Results

  • Part 2: Power Aware Static Verification – From Power Intent to Microarchitectural Checks of Low-Power Designs

    Part I of this article provided a consolidated approach to understanding verification tools and methodologies that apply a set of pre-defined power aware (PA) or multi-voltage (MV) rules, derived from the power requirements, statically to the structures of the design.

  • The RISC-V Verification Interface (RVVI) – Test Infrastructure and Methodology Guidelines

    The open standard ISA of RISC-V is at the forefront of a new wave of design innovation. The flexibility to configure and optimize a processor for the unique target application requirements has a lot of appeal in emerging and established markets alike. RISC-V can address the full range of compute requirements, from an entry-level microcontroller or a support processor right up to state-of-the-art processor arrays with vector extensions for advanced AI applications and HPC.

  • Automation and Reuse in RISC-V Verification Flow

    The open RISC-V Instruction Set Architecture (ISA), managed by the RISC-V foundation and backed by an ever-increasing number of the who's who in the semiconductor and systems world, provides an alternative to legacy proprietary ISAs. It delivers a high level of flexibility, allowing the development of highly effective, application-optimized processors targeted at domains that require high performance, low area, or low power.

  • Simulation

    Simulation plays a pivotal role in the digital design and verification process. Its primary purpose is to validate whether the design being created functions according to the specified requirements. By running simulations early in the design phase, potential issues can be identified, thus minimizing the need for extensive code revisions. Simulations can be performed at different levels of abstraction and at various stages throughout the design process.

  • Questa CDC-FX: Metastability Effects Delay Modeling

    In this paper, we survey traditional metastability effect models and discuss the shortcomings of each of them. We then present the model used by Questa CDC-FX, from Siemens EDA, and describe why it is a more accurate and complete metastability-effects model.

  • Three Steps to Unified SoC Design and Verification

    Developing an SoC is a risky business: getting it right means handling the technical complexity involved, managing the mixture of hardware and software design disciplines, and finding an optimal trade-off between design performance and power. One way to reduce these risks is to use a design and verification flow that is scalable enough to handle the complexity and flexible enough to explore architectural alternatives early in the design cycle, before implementation starts.

  • Monitors, Monitors Everywhere – Who Is Monitoring the Monitors

    The reader of this article should be interested in predicting or monitoring the behavior of their hardware. This article reviews phase-level monitoring, transaction-level monitoring, general monitoring, and in-order and out-of-order transaction-level monitors. A protocol-specific AXI monitor written at the transaction level of abstraction will be demonstrated. Under certain AXI usages, problems arise.

  • First Time Unit Testing Experience Report with SVUnit

    Verification teams don’t typically verify testbench components. But this Qualcomm Technologies IP team realized the necessity of unit testing a critical testbench component and the corresponding debug time and frustration it could prevent for downstream IP and chip teams.

  • Memory Softmodels - The Foundation of Validation Accuracy

    As always, we must continue to reduce the time-to-market of SoCs and complex systems. An FPGA prototype implementation of these systems can be used as a basis for early software or firmware development, hardware-software co-verification and system validation, and all this can be achieved before actual silicon is available.

  • Best Practices for FPGA and ASIC Development

    This is an overview of best practices for FPGA or ASIC design, assuming a traditional waterfall development process. There are four development phases: PLAN, EXECUTE, VERIFY and SUPPORT. A review step is recommended between each phase, as prescribed by DO-254. These concepts can be used in alternative methodologies, like Agile.

  • "Hug the Debug" – Before It’s Too Late

    Though the term “shift-left” originated in the software industry, its importance is often cited in the hardware (semiconductor) industry where the end-product (chip) costs are skyrocketing. The increase in cost is driven by the global chip shortage, especially in the automotive industry.

  • Enabling Model-Based Design for DO-254 Certification Compliance

    Engineers can use Model-Based Design for requirements analysis, algorithm design, automatic HDL code generation, and verification to produce airborne electronic hardware that adheres to the DO-254 standard. The proposed Model-Based Design approach for DO-254 combines tools from MathWorks® and Siemens EDA for both design and verification. This workflow supports development phases from concept through implementation, streamlining development, and reducing costs.

  • RISC-V Design Verification Strategy

    As the RISC-V architecture becomes increasingly popular, it is being adopted across a diverse range of products. From the development of in-house cores with specialized instructions, to functionally safe SoCs and security processors for a variety of verticals – RISC-V adoption brings several verification challenges that are discussed in this article, along with potential approaches and solutions.

  • The Democratization of Digital Methodologies for AMS Verification

    A mixed-signal design is a combination of tightly interlaced analog and digital circuitry. Next-generation automotive, imaging, IoT, 5G, computing, and storage markets are driving the strong demand for increasing mixed-signal content in modern systems on chips (SoCs). There are two critical reasons for this trend.

  • What Siemens’ Acquisition of OneSpin Means for Formal Verification – and You

    Preface: in May 2021 Siemens EDA acquired OneSpin Solutions, combining Siemens' Questa Formal products and expertise (with roots and team members from 0-In) with OneSpin’s “apps first” approach to key growth markets including Trust&Security, Safety, RISC-V, and FPGAs. The combination adds to a cohesive Siemens EDA verification solution spanning simulation, formal, emulation, and prototyping.

  • Verification Cookbook Glossary

    This page is an index to the glossary of various terms defined and used in the Cookbook.

  • Robustness Verification of ARINC708’s Manchester Codes in a DO-254 Project

    In this article, we will discuss the Display Data Bus of ARINC-708. The bus plays a critical role from the pilot's point of view. It requires bi-phase Manchester encoding and decoding. For both the Manchester coding and the bus protocol, examples of possible error types are considered.

  • Hardware-Assisted Verification Through the Years

    A quick glance into today's design verification toolbox reveals a variety of point tools supporting the latest system-on-chip (SoC) design development. When combined and reinforced by effective verification methodologies, these tools trace even the most hard-to-find bugs, whether in software or in hardware. Delivering a tightly woven integration between complementary tools is a strategic focus at Siemens EDA.

  • Celebrating 10 Years of the UVM

    Version 1.0 of the UVM class library was released by Accellera at the end of February 2011, the result of a unique collaborative effort between fierce competitors (Siemens EDA, formerly Mentor Graphics, Cadence, and Synopsys) and a small number of activist user companies. The objective was to provide an industry-standard, SystemVerilog-based verification methodology.

  • Smoothing the Path to Software-Driven Verification with Portable Stimulus

    Designs are becoming more complex and increasingly include a processor – and often multiple processors. Because the processor is an integral part of the design, it's important to verify the interactions between software running on the processor and the rest of the design.

  • On-Chip Debug – Reducing Overall ASIC Development Schedule Risk

    With ASIC complexity on the increase and unrelenting time-to-market pressure, many silicon design teams still face serious schedule risk from unplanned spins and long post-silicon debug cycles. However, there are opportunities on both the pre-silicon and post-silicon sides that can be systematically improved using on-chip debug solutions.

  • Is Intelligent Testbench Automation For You?

    Intelligent Testbench Automation (iTBA) is being successfully adopted by more verification teams every day. There have been multiple technical papers demonstrating successful verification applications and panel sessions comparing the merits to both Constrained Random Testing (CRT) and Directed Testing (DT) methods. Technical conferences including DAC, DVCon, and others have joined those interested in better understanding this new technology.

  • Evolving the Use of Formal Model Checking in SoC Design Verification

    Project RAPID is a hardware-software co-design initiative in Oracle Labs that uses a heterogeneous hardware architecture combined with architecture-conscious software to improve the energy efficiency of database-processing systems.

  • Caching in on Analysis

    The on-chip bus interconnect has become a critical subsystem of a System On a Chip (SoC). Its function is to route data between different parts of the system at a rate that allows the system to meet its performance goals.

  • How Do You “Qualify” Tools for DO-254 Programs?

    Tools used in the design and verification of electronics have played a massive role in the dramatic evolution of these devices over the past few decades. After all, there is a limit to the amount of work and detail that even a good aerospace engineer can handle, but add the use of tools, and the sky (pun intended) is the limit.