Search Results

83 Results

  • Formal Etiquette for Code Coverage Closure

    Coverage closure is a key step in any IP design verification project. Code coverage is a much-needed metric in most modern-day IP designs. It helps teams ensure that all RTL code written is indeed exercised and verified prior to tape-out. Without such a guarantee, a semiconductor design house may well be risking millions of dollars on a potential bug escape to silicon. Because code coverage collection is well automated, it is widely used in the industry.
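
    As a concrete illustration (a hypothetical snippet, not from the article), branch coverage would flag the default arm below as unhit if no test ever drives mode to 2'b11:

    ```systemverilog
    // Hypothetical decoder: code coverage reveals untested branches.
    module mode_decode (
      input  logic [1:0] mode,
      output logic [3:0] sel
    );
      always_comb begin
        case (mode)
          2'b00:   sel = 4'b0001;
          2'b01:   sel = 4'b0010;
          2'b10:   sel = 4'b0100;
          default: sel = 4'b1000; // reported unhit unless a test drives mode == 2'b11
        endcase
      end
    endmodule
    ```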

  • Design for Test Verification

    As semiconductor devices become increasingly complex and diverse, spanning automotive, AI/ML, 5G, and heterogeneous 3D-IC designs, Design-for-Test (DFT) verification plays a crucial role in ensuring not only high test quality but also seamless integration with system-level requirements. While test insertion flows such as scan insertion, BIST/MBIST integration, and boundary scan logic have matured to deliver cost-effective, scalable test solutions, DFT verification remains a bottleneck that demands significant attention.
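
    For readers new to scan insertion, here is a minimal sketch (not from the article) of the mux-D scan cell that scan insertion flows substitute for ordinary flops:

    ```systemverilog
    // Hypothetical mux-D scan cell: in scan mode the flop captures scan_in,
    // stitching all flops into a chain that test equipment can load and unload.
    module scan_ff (
      input  logic clk, scan_en, scan_in, d,
      output logic q
    );
      always_ff @(posedge clk)
        q <= scan_en ? scan_in : d;
    endmodule
    ```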

  • The Six Steps Of RISC-V Processor Verification Including Vector Extensions

    The open standard ISA of RISC-V allows SoC developers to build or modify a processor core optimized for the application requirements. SoC verification tasks are adapting to address the significant increases in complexity. This article covers the six key components of RISC-V processor verification: the DV plan, RTL DUT, testbench, tests, reference model, and the Siemens EDA Questa SystemVerilog simulation environment.
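
    A minimal sketch of the testbench-versus-reference-model comparison step (hypothetical names, illustrative only, not the article's environment):

    ```systemverilog
    // Hypothetical scoreboard: compare register writebacks retired by the DUT
    // against those predicted by the reference model.
    class rv_scoreboard;
      logic [31:0] dut_wb[$]; // writeback values observed on the DUT trace
      logic [31:0] ref_wb[$]; // writeback values predicted by the reference model

      function void check();
        while (dut_wb.size() > 0 && ref_wb.size() > 0) begin
          logic [31:0] d = dut_wb.pop_front();
          logic [31:0] r = ref_wb.pop_front();
          if (d !== r)
            $error("Writeback mismatch: DUT %h vs reference %h", d, r);
        end
      endfunction
    endclass
    ```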

  • Formal and Assertion-Based Verification of MBIST MCPs

    Built-In Self-Test (BIST) is widely used to test embedded memories. This is necessary because of the large number of embedded memories in a circuit, which can run into the thousands or even tens of thousands. It is impractical to provide access to all of these memories and apply a high-quality test.
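
    One way such a multicycle-path (MCP) constraint can be captured as an assertion (a minimal sketch with hypothetical signal names, not the paper's actual properties): data launched by start must remain stable across the relaxed capture window.

    ```systemverilog
    // Hypothetical 2-cycle MCP check: the launched data must hold steady
    // for the extra cycle that the multicycle path is allowed to take.
    module mcp_check (
      input logic       clk,
      input logic       start, // launch event of the multicycle transfer
      input logic [7:0] data   // payload constrained by the MCP
    );
      a_mcp_stable : assert property (
        @(posedge clk) $rose(start) |=> $stable(data)
      );
    endmodule
    ```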

  • Extending SoC Design Verification Methods for RISC-V Processor DV

    As SoC developers adopt RISC-V and the design freedoms that an open ISA (Instruction Set Architecture) offers, DV teams will need to address the new verification challenges of RISC-V based SoCs. The established SoC verification tasks and methods are well proven, yet they depend on the industry-wide assumption of ‘known good processor IP’ based on the quality expectations associated with IP providers such as Arm or MIPS Technologies.

  • Top Five Reasons Why Every DV Engineer Will Love the Latest SystemVerilog 2012 Features

    SystemVerilog has become the most widely deployed verification language over the last several years. Starting with the early Accellera release of the 3.1a standard, the first IEEE 1800-2005 standard fueled widespread adoption across tools and the user base. Since 2005 there has been no looking back for this "all-encompassing" standard, which strives to satisfy and do more for RTL designers and verification engineers alike.
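
    Two of the 1800-2012 additions the article alludes to, sketched briefly with hypothetical class names: interface classes and soft constraints.

    ```systemverilog
    // Interface class (new in 1800-2012): a contract a class can implement
    // alongside its base class, giving multiple-inheritance-style reuse.
    interface class printable;
      pure virtual function void print();
    endclass

    class packet implements printable;
      rand bit [7:0] len;
      // Soft constraint (new in 1800-2012): a default that a test can
      // override with an ordinary (hard) constraint, without editing the class.
      constraint c_len { soft len inside {[16:64]}; }
      virtual function void print();
        $display("packet len=%0d", len);
      endfunction
    endclass
    ```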

  • Debug

    Debugging is a critical aspect of the digital design and integrated circuit design process. It ensures that the designed system or chip functions as intended, identifying and rectifying errors, glitches, and unforeseen issues. Effective debugging saves time and resources, preventing costly mistakes from reaching the final product. It also enhances product reliability, crucial in safety-critical applications, and reduces post-production maintenance. Debugging tools and techniques are essential for engineers to pinpoint and address issues, making the design process more efficient and leading to the creation of high-quality, dependable digital systems and integrated circuits.

  • Improving FPGA Debugging with Assertions

    Here’s one reason why FPGA design starts dwarf ASIC design starts: choosing flexible, inexpensive and readily available FPGAs is one fairly obvious way to reduce risk when designing complex SoCs for everything from mobile devices and smartphones to automobile electronics.

  • Ten Rules to Successfully Deploy Formal

    About four years ago I gave a couple of talks on the myths surrounding formal. Although formal has seen more adoption since then, we have a long way to go before it is recognized as a mainstream technology used throughout design and verification. I still see some of these myths clouding the judgment of end users and their managers.

  • Formal Apps Take the Bias Out of Functional Verification

    When we spend hours, days, or even weeks putting our hearts and minds into creating something, we have a tendency to emphasize its strengths and minimize its weaknesses. This is why verification engineers have a blind spot for their own verification platforms. This blind spot, or bias, often leads to overlooking those areas where bugs may lurk, only to emerge at the worst possible time, when errors are most costly and take the longest to fix.

  • Sequential Logic Equivalence Checking

    In this track, you will be introduced to the concept of sequential logic equivalence checking and its common applications. You will also learn how to start with Questa® SLEC to verify design optimization, bug fix/ECOs, low power clock gating logic, and safety mechanisms.
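
    As a flavor of the clock-gating use case (a hypothetical pair of modules, not Questa SLEC syntax): an SLEC tool proves that the gated implementation below behaves identically to the enable-recirculating original.

    ```systemverilog
    // Original: enable implemented as a recirculating register.
    module reg_plain (
      input  logic       clk, en,
      input  logic [7:0] d,
      output logic [7:0] q
    );
      always_ff @(posedge clk) if (en) q <= d;
    endmodule

    // Optimized: latch-based clock gate saves power; sequentially equivalent.
    module reg_gated (
      input  logic       clk, en,
      input  logic [7:0] d,
      output logic [7:0] q
    );
      logic en_lat, gclk;
      always_latch if (!clk) en_lat <= en; // hold enable while clock is high
      assign gclk = clk & en_lat;
      always_ff @(posedge gclk) q <= d;
    endmodule
    ```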

  • Veloce Hardware-Assisted Verification – Complete, Unified, and Progressive

    Despite abundant rumors predicting the end of life for Moore’s Law (the axiom stating transistor density doubles every 24 months), semiconductor design sizes continue to grow exponentially with no end in sight. In the process, design sizes push costs through the roof. According to market research firm International Business Strategies (IBS), the total cost of designing a state-of-the-art system on chip (SoC) at the 5nm process technology node exceeds half a billion dollars.

  • Deploying HLS in a DO-254/ED-80 Workflow

    The adoption of tools into safety-critical workflows is often challenging, as these new technologies must demonstrate that they are sufficiently safe to use before being deployed in production environments. The demand for High-Level Synthesis capabilities within DO-254 projects is growing, and this paper describes the requirements and considerations for successfully using High-Level Synthesis within a DO-254 workflow.

  • Introduction to the Verification Academy

    This session provides a common framework for all advanced functional verification tracks contained within the Verification Academy.

  • Verify Thy Verifyer

    Design verification is a field that requires a lot of thinking and equally a lot of coding. Tighter time-to-market adds a lot of schedule pressure to the teams coding those testbenches and test cases. The advent of UVM (Universal Verification Methodology) as the standard framework has helped the industry make good progress in terms of structured testbenches. One of the primary objectives of UVM is to build robust, reusable testbenches.
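
    A minimal sketch of the structure UVM standardizes (hypothetical component names, trimmed to the registration and build boilerplate that makes testbenches reusable):

    ```systemverilog
    import uvm_pkg::*;
    `include "uvm_macros.svh"

    class my_env extends uvm_env;
      `uvm_component_utils(my_env) // factory registration enables reuse/override
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
    endclass

    class base_test extends uvm_test;
      `uvm_component_utils(base_test)
      my_env env;
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        // Factory creation lets derived tests swap the env without editing it.
        env = my_env::type_id::create("env", this);
      endfunction
    endclass
    ```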

  • FPGA Verification

    FPGA (Field-Programmable Gate Array) verification, including methods like simulation and formal verification, is invaluable for ironing out design issues before deploying hardware in the lab. Simulation allows engineers to comprehensively test the FPGA design under various conditions, helping detect and rectify potential bugs and ensuring functionality. By conducting thorough FPGA verification, costly and time-consuming hardware iterations are minimized, significantly reducing the risk of errors and shortening the time-to-lab phase. This approach ultimately leads to more efficient development, lower expenses, and a faster path to achieving operational hardware.

  • Four Best Practices for Prototyping MATLAB and Simulink Algorithms on FPGAs

  • Delivering First Silicon Success for Your Next SoC or 3DIC

    In this session, you will learn about the protocol and memory verification solutions needed for your next silicon verification project whether in Datacenter, Storage, 3DIC, Networking, Automotive or Mil/Aero applications.

  • Marking Milestones: In Life and in Technology

  • Equivalence Checking for FPGA

    In this session, you will learn the need and methodologies to apply Equivalence Checking for FPGAs, plus the advantages and challenges of stepwise netlist verification.

  • Overcoming Today’s Verification, Supply Chain, and Legacy Technology Challenges Associated with FPGA-based Designs

    In this session, you will gain an understanding of the core challenges facing designers of FPGA-based devices, from ensuring functionality to dealing with FPGA supply chain issues to extending the life of legacy designs powered by old or obsolete FPGAs.

  • System Level Functional Coverage Example

    System level functional verification can take full advantage of the fact that the entire design is a self-contained unit that will be used by customers, and thus has some logical use model that the customer will follow. Also, being a system, it is often made up of trusted IP, and the verification focus is aimed more at the block interconnect and any new functionality.
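
    A small sketch of what such system-level coverage can look like (hypothetical signals modeling a customer use model):

    ```systemverilog
    // Hypothetical system-level covergroup: cross the operating mode with the
    // active bus master to confirm every realistic use-model combination is hit.
    module sys_coverage (
      input logic       clk,
      input logic [1:0] cfg_mode,     // 0: boot, 1: normal, 2: low power
      input logic       active_master // 0: CPU, 1: DMA
    );
      covergroup sys_usage_cg @(posedge clk);
        mode_cp   : coverpoint cfg_mode { bins boot = {0}; bins normal = {1}; bins low_power = {2}; }
        master_cp : coverpoint active_master { bins cpu = {0}; bins dma = {1}; }
        mode_x_master : cross mode_cp, master_cp;
      endgroup
      sys_usage_cg cg = new();
    endmodule
    ```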

  • QVIP Provides Thoroughness in Verification

    Present-day designs use standard interfaces for the connection and management of functional blocks in Systems on Chip (SoCs). These interface protocols are so complex that creating in-house VIPs can take a lot of engineering development time. A fully verified interface should include all the complex protocol compliance checking, plus generation and application of different test case scenarios.
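
    The kind of protocol rule such VIP encodes, sketched as a standalone assertion (an AXI-style handshake with hypothetical signal names, not actual QVIP code):

    ```systemverilog
    // Hypothetical handshake check: once VALID is raised it must stay high,
    // with a stable payload, until READY accepts the transfer.
    module handshake_check (
      input logic        clk, rst_n,
      input logic        valid, ready,
      input logic [31:0] payload
    );
      a_valid_stable : assert property (
        @(posedge clk) disable iff (!rst_n)
        valid && !ready |=> valid && $stable(payload)
      );
    endmodule
    ```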

  • A Formal-based Approach for Efficient RISC-V Processor Verification

    The openness of RISC-V allows customizing and extending the architecture and microarchitecture of a RISC-V based core to meet specific requirements. This appetite for more design freedom is also shifting the verification responsibility to a growing community of developers. Processor verification, however, is never easy. The very novelty and flexibility of the new specification result in new functionality that inadvertently introduces specification and design bugs.
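
    As a taste of the formal style (a hypothetical property, assuming the core's register file is exposed to a bound checker): the RISC-V rule that register x0 is hardwired to zero is a one-line safety property.

    ```systemverilog
    // Hypothetical formal check: x0 must read as zero in every cycle.
    module rv_x0_check (
      input logic        clk,
      input logic [31:0] regfile [32] // bound to the core's architectural registers
    );
      a_x0_zero : assert property (@(posedge clk) regfile[0] == 32'h0);
    endmodule
    ```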

  • Effective Validation Method of Safety Mechanism Compliant with ISO 26262

    The metrics used to measure the effectiveness of safety mechanisms include code coverage rate, SPFM (single-point fault metric), and LFM (latent fault metric). For SPFM and LFM especially, if the specified value is not reached during gate-level fault injection simulation at the end of verification, iterations result, causing a significant increase in time and cost compared to consumer LSIs.
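
    For reference, the ISO 26262-5 metric definitions these targets come from, as commonly stated (the λ terms are failure rates of the safety-related hardware elements):

    ```latex
    % SPFM: fraction of the total failure rate NOT due to single-point or residual faults.
    \mathrm{SPFM} = 1 - \frac{\sum \left( \lambda_{\mathrm{SPF}} + \lambda_{\mathrm{RF}} \right)}{\sum \lambda_{\mathrm{total}}}
    % LFM: fraction of the remaining failure rate NOT due to latent multiple-point faults.
    \mathrm{LFM} = 1 - \frac{\sum \lambda_{\mathrm{MPF,latent}}}{\sum \left( \lambda_{\mathrm{total}} - \lambda_{\mathrm{SPF}} - \lambda_{\mathrm{RF}} \right)}
    ```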