Here's a consolidated set of questions asked during the live web seminar, along with Harry's answers. If you don't see your question here, or need further clarification, you can always ask on the Verification Academy Forums.
Are you seeing regional biases in the adoption of various verification languages?
Since we experienced a large increase in participation from Europe, combined with a decline in North American participation, we believe that there could be a small regional bias associated with the adoption (or lack of adoption) of various languages. Realistically, this bias is for the most part minimal.
Are you seeing regional biases in the adoption of various functional verification techniques?
We are not seeing any regional biases associated with the adoption of various functional verification techniques.
Harry, thank you for the great presentation. Just curious, do you think that the large increase in the percentage of ASIC projects that use ISO 26262 may be attributable to the fact that you got more responses from Europe and fewer from North America?
In general, we have seen the adoption of ISO 26262 higher for ASICs versus FPGAs in our previous studies. So, no, I don’t think there is any regional bias here.
As far as ASIC vs. FPGA is concerned, are there trends to outline compared to a few years ago: increased complexity, high-end technology nodes used in FPGAs, kind of design, flexibility, reuse, and life cycle?
In general, FPGAs are following a complexity curve similar to the one that ASICs (and specifically SoC-class designs) experienced about 15 years ago. Up until recently, we saw a lag of about five years in the complexity curve, but it has shortened recently. Nonetheless, you can observe in the data the impact of these complexity curves in terms of increasing resources on both ASIC and FPGA projects.
Design engineers' verification time: is it exclusively unit-level testing and some formal tool flow like CDC checks, or does it include debugging bugs found by the verification team?
The time a design engineer spends on verification tasks includes their own simulation, formal, and CDC work, as well as debugging in association with the verification team.
Is there a split between "creating test" and "running simulation?"
This has not been split out in previous studies, but it is something we might consider in the future.
In slide 29, what kind of work is related to testbench development?
Creating drivers, checkers, scoreboards, and coverage models: any type of verification component.
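To make that concrete, here is a minimal sketch of two such components in a UVM-style testbench. The names (simple_item, simple_driver, simple_scoreboard) and the 8-bit data field are hypothetical placeholders for illustration, not anything from the study:

```systemverilog
// Minimal sketch of typical testbench components (hypothetical names).
import uvm_pkg::*;
`include "uvm_macros.svh"

// Transaction passed between testbench components.
class simple_item extends uvm_sequence_item;
  rand bit [7:0] data;
  `uvm_object_utils(simple_item)
  function new(string name = "simple_item");
    super.new(name);
  endfunction
endclass

// Driver: pulls items from the sequencer and drives the DUT interface.
class simple_driver extends uvm_driver #(simple_item);
  `uvm_component_utils(simple_driver)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  task run_phase(uvm_phase phase);
    forever begin
      seq_item_port.get_next_item(req);
      // ... drive req.data onto the DUT pins here ...
      seq_item_port.item_done();
    end
  endtask
endclass

// Scoreboard: checks observed transactions against expected behavior.
class simple_scoreboard extends uvm_scoreboard;
  `uvm_component_utils(simple_scoreboard)
  uvm_analysis_imp #(simple_item, simple_scoreboard) analysis_export;
  function new(string name, uvm_component parent);
    super.new(name, parent);
    analysis_export = new("analysis_export", this);
  endfunction
  function void write(simple_item t);
    // ... compare t against a reference model here ...
  endfunction
endclass
```

Monitors and coverage collectors follow the same component pattern.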
Respin: it would be interesting to compare the verification flow and the bug tracking and fixing process.
Agreed; this is something we can look at in a future analysis of the data.
FPGA design language adoption: Strange that HLS didn't take off significantly.
We are beginning to see interest in HLS. Keep in mind that the current study is focused on RTL, so HLS is likely not well represented in the pool of participants.
Is Python an augmentation on top of SystemVerilog, or does it mean that teams are using Python-based simulators?
The adoption represents both. It wasn’t split out.
For ASIC methodologies and testbenches, does Accellera UVM include IEEE UVM?
Yes.
Interesting to see that property checking is well adopted on complex designs (by gate count). I would have believed that it normally would not be practicable.
There is higher adoption of property checking on larger designs due to the cost of bug escapes. That doesn’t mean that properties were proven at the top level of the large design. Typically, property checking was applied to high-risk individual blocks within the larger design.
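As an illustration of what block-level property checking looks like, here is a minimal sketch for a hypothetical FIFO block; the signal names (clk, rst_n, push, pop, full, empty) are assumptions for the example, not taken from the study:

```systemverilog
// Minimal block-level assertion checker for a hypothetical FIFO.
module fifo_props (
  input logic clk,
  input logic rst_n,
  input logic push,
  input logic pop,
  input logic full,
  input logic empty
);
  // Invariant: never accept a push while the FIFO is full.
  assert property (@(posedge clk) disable iff (!rst_n) full |-> !push)
    else $error("push while full");

  // Invariant: never accept a pop while the FIFO is empty.
  assert property (@(posedge clk) disable iff (!rst_n) empty |-> !pop)
    else $error("pop while empty");
endmodule

// Typically attached non-intrusively to the high-risk block, e.g.:
// bind fifo fifo_props u_fifo_props (.*);
```

The same properties can be proven with a formal tool on the block or simply simulated as checkers in the larger design.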
It would be interesting to know what kinds of bug escapes these are: hard to cover, time consuming, lack of spec...
We did track the root causes of bug escapes into production, which are discussed on separate slides.
What is meant by the item "knowing my verification coverage" mentioned in the verification challenges?
There are a number of projects that ask, "What is the correct type of coverage required to have confidence in the verification process?" This often goes beyond simple code coverage or even functional coverage.
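As an example of coverage beyond code coverage, a functional coverage model expresses design intent that line or toggle coverage cannot see. Here is a minimal sketch, assuming a hypothetical packet interface with pkt_kind and pkt_len fields:

```systemverilog
// Minimal functional coverage model for a hypothetical packet interface.
module pkt_coverage (
  input logic        clk,
  input logic        pkt_kind,  // 0 = control, 1 = data
  input logic [10:0] pkt_len
);
  covergroup pkt_cg @(posedge clk);
    cp_kind : coverpoint pkt_kind {
      bins ctrl = {0};
      bins data = {1};
    }
    cp_len : coverpoint pkt_len {
      bins small = {[1:64]};
      bins large = {[65:1500]};
    }
    // Crosses capture combinations (e.g., large control packets) that
    // code coverage alone can never report.
    kind_x_len : cross cp_kind, cp_len;
  endgroup

  pkt_cg cg = new();
endmodule
```

Even a model like this only answers part of the question; knowing that the bins themselves are the right ones is the harder problem projects are pointing to.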
How exactly can an FPGA be verified? I'm accustomed to ASIC verification, where we can add hundreds of signals to the simulation, but with an FPGA we cannot do that...
Actually, building a simulation environment to test an FPGA is no different than building one for an ASIC.
Biases related to the number of responses from different regions were discussed. Have you normalized the data to the *actual* number of users in the different regions? Is there any data on how all users are distributed over the world?
Yes, we have ongoing analysis to identify potential biases. At the moment we are not finding any biases associated with verification technique adoption. We did find some minor biases associated with the adoption (e.g., VHDL) or lack of adoption of certain languages.
Hi, do you have data on how many bug escapes there were in FPGA projects WITHOUT any verification at all?
We don’t have sufficient data on this to conclude anything with confidence.
What will be the role of machine learning in verification in the future?
In general, improving coverage, reducing debug effort, and optimizing the entire process.
So, if design engineers spend 50% of their time on verification, and half the total team is verification, then 75% of total work is verification - correct?
It is hard to put an absolute figure on this, which is why we look at multiple data points to assess verification effort. In general, about 70% of the effort is spent in verification, but this can vary significantly across market segments and designs.
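For what it's worth, the questioner's arithmetic does hold under its own assumptions (an even headcount split, with design engineers spending half their time on verification):

```latex
\underbrace{0.5 \times 1.0}_{\text{verification engineers}}
  + \underbrace{0.5 \times 0.5}_{\text{design engineers}}
  = 0.75
```

The roughly 70% figure above comes from looking across multiple data points in the survey rather than from this idealized split.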
I was looking for any early numbers on the adoption of PSS and DSLs. Has there been any study done on that?
We have just started looking at PSS and DSL. We really can’t conclude any trends. It is too early.
What do you think accounts for lack of uptake of OVL?
The maturing of SVA. OVL was developed before there were any industry standards.
What sort of criteria are involved in "First Pass Success"? Does this include the usage of debugging or chicken switches?
It can. The answer was entirely left up to the participant.
Do you think we will have a better verification methodology than UVM in the near future? In the next 5 years or 10 years?
Verification methodologies will continue to evolve to address growing design complexity. Hence, yes.
Do we have data available on ICs specific to safety and security?
Yes, the reports have some specific findings related to safety and security. In addition, I have done some deeper analysis by certain market segments (e.g., ISO 26262 or DO-254).
Is there any standardization activity for a Python-based testbench methodology?
None in terms of Accellera or IEEE. This will likely occur as Python testbench methodologies (e.g., cocotb) mature.
Is there a split in terms of verification resources toward formal versus simulation-based approaches?
This is dependent on the team, and to some extent market segments. For example, the processor guys (who are typically more mature in their adoption of formal) have dedicated formal engineers. In terms of percentages, this is not something we studied.
Do you anticipate Python eating into the use of SystemVerilog and UVM for verification over time (ASIC and/or FPGA)?
I believe Python will complement SystemVerilog over time.
Do you think FPGA adoption will take hold and increase in the automotive industry?
I don’t have sufficient data to draw that conclusion. The current trends don’t support that, but it could change.
What are your key takeaways from all the data? How would you recommend applying these findings to improve the average verification flow?
A company could use this data to benchmark itself against the industry as a whole and determine whether there are opportunities to improve its process.
Which tools, other than cocotb, are included in the Python data?
The study did not enumerate the Python methodology options.
Over the years, how well do people's predictions for the next year match the actual data collected in the next study?
Historically, the predictions for the next year have been optimistic.
How long does it take front-end verification engineers to adopt emulation or to become emulation engineers?
This is not something we studied. I suggest you reach out to your emulation provider.
What is the biggest verification challenge for engineers working on complex multi-million-gate ASICs and IP cores, and how do we solve it?
Historically, the biggest challenges have been associated with either debug or coverage. For example, defining a coverage model, closing coverage, etc.
Maybe I missed it, but what is the percentage of verification engineers that use Specman/SystemVerilog/Python/C/...?
For ASICs, this is what we saw:
- Specman e - 4%
- SystemVerilog - 75%
- C/C++ - 41%
Obviously, the numbers don’t sum to 100% since a project can be using multiple languages (e.g., SystemVerilog and C/C++). For all the languages we studied, see the reports.
How much of the verification process does emulation take up compared to simulation?
This is dependent on the design size.
Can we get the schedule time split between design and verification, like the mean peak resource comparison?
The question is not clear. We did study the time spent in design versus verification by a design engineer.
I see an increase in C/C++ and Python-based verification. Is it just a regional bias, or do you see it trending globally in times to come?
It is a global trend, although for Python, we are seeing more methodology adoption in Europe.
Your research does not talk about PSS. Can you share whether there is a significant trend for PSS?
I did cover PSS in the web seminar and in the reports (3% ASIC adoption and <1% FPGA adoption). Realistically, it is too soon to identify any trends.