Taken for Granted

ESL, embedded processors, and more

The Myths of EDA – the 70% Rule

Filed under: Uncategorized — November 27, 2008 @ 11:58 pm

I have long been interested in some of the clichés that have gained credence in the EDA and design communities – statements made and repeated many times, and accepted as facts.  One that has intrigued me is the statement “70% of the effort in product development is spent on verification” (or variations thereof – it could be just hardware, IC design only, etc.).  This is such a venerable statement that a Google search turned up, just on the first two pages, references to it from 1998 (DAI), 2002 (Mentor), 2004 (Virginia Tech), 2005 (Denali), 2006 (nVidia), and 2008 (Jasper), indicating how constant repetition has turned the statement into a “fact”.  And indeed, I repeated this claim myself in 2001, in the foreword I wrote to the book System-on-a-chip Verification by Prakash Rashinkar, Peter Paterson and Leena Singh (Kluwer/Springer, 2001).

But is it a fact, or is it a myth?  What is this claim based on?  In fact, I am hardly the first to wonder about this.  Most notably, Richard Goering, writing for EETimes in 2004 (“Is verification really 70 percent?”), commented:

The oft-quoted statistic that functional verification takes 70 percent of the chip design cycle may be more myth than science, according to a new EDA user survey by EE Times. But verification is clearly a significant chunk of a design cycle that’s lengthening.

But I am interested in the roots of the statement.  So, in the spirit of the Mythbusters on TV:

Adam Savage and Jamie Hyneman, TV’s “Mythbusters” (Discovery Channel)

I will do a little digging into where this statement comes from.

Ironically, the earliest reference I can find to measuring verification effort in design projects comes from colleagues of mine who worked at BNR/Nortel in Ottawa, Canada in the 1980s and 1990s.   The CAD and Silicon Design groups at BNR/Nortel made a serious effort to measure things like use of EDA tools, time spent in design projects, productivity of design, etc.   At DAC 1998, a paper entitled “Functional Verification of Large ASICs”, by Adrian Evans, Allan Silburt, Gary Vrckovnik, Thane Brown, Mario Dufresne, Geoffrey Hall, Tung Ho, and Ying Liu, described verification methodologies in use at Nortel and gave an excellent breakdown of the time spent in various design tasks:

Figure 3 from DAC 1998 paper by Evans et al.

Although they don’t sum up the verification effort, eyeballing the chart suggests to me that verification accounts for 60% of the effort, or perhaps a little more.  Note from the paper that this figure covers three chip design projects.

I believe people from Nortel reported on much of this information at least a year earlier.  Allan Silburt is listed in the DBLP bibliography server as having presented a paper at CHARME 1997: “Allan Silburt: ASIC/system hardware verification at Nortel: a view from the trenches. CHARME 1997: 1”.  I could not find the paper itself, but they were either reporting the effort results or gathering the data at that time.

Later, it appears that Ron Collett and his colleagues at Collett International Research used their surveys of ASIC design teams to extract updated information about the amount of effort dedicated to verification.  In his 2008 DVCon keynote, “Ending Endless Verification”, Mentor Graphics’ Wally Rhines showed the following graphic and attributed it to “Collett International Research 2004 & Farwest Research 2008 IC/ASIC Functional Verification Study” (Far West Research has as a partner Dominic Lusinchi, who was general manager of research for Collett International).

Although Collett International Research does not seem to exist anymore (at least via an internet search), Ron Collett heads the company Numetrics, and it is possible some of these reports still exist.  Unfortunately, they were rather expensive reports and I have never seen them.

Nevertheless, based on the graphic, verification comes to about 58% of the total effort: the 35% listed as “Verification and Test”, plus the 46% of the 51% “Design” share that design engineers reported spending on verification (35% + 0.46 × 51% ≈ 58%).  Collett International had an extensive database of design teams that it surveyed, across many companies, so its results are no doubt more representative.
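For readers who want to check that arithmetic, here is a quick sketch – note that reading the 46% as the fraction of the “Design” share spent on verification is my interpretation of the graphic, not a number spelled out in the report:

```python
# Sanity check of the verification-share arithmetic from the Rhines/Collett graphic.
# Assumption: the 46% figure is the fraction of the "Design" effort spent verifying.
verification_and_test = 35.0        # % of total effort labeled "Verification and Test"
design_share = 51.0                 # % of total effort labeled "Design"
design_fraction_verifying = 0.46    # assumed fraction of design effort spent on verification

total_verification = verification_and_test + design_fraction_verifying * design_share
print(round(total_verification))    # prints 58
```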

And that’s it!  This is the only quantitative data I could find, other than the 2004 EETimes survey, which included IC, PCB and FPGA designers and found functional verification accounting for only 22% of the effort (of course, FPGA and PCB designers probably did very little functional verification).  As Richard Goering said in 2004:

… it’s easy to throw around a vague statistic to make a point, and if enough people hear it enough times, no one questions it.

But I would be loath to call this myth “busted”, because verification clearly accounts for a lot of effort in design projects, and that share seems to still be growing.  So rather than calling it busted, and rather than trying to ascribe an exact number to the claim, I would restate it as:

A large and continually growing part of the effort in product development is spent on verification

With this restatement, and in true Mythbusters spirit, I would rate this version of the claim as:

Plausible

What do you think?  Comments very welcome!  Are there other “myths” of EDA and semiconductor design that should be analysed?


  1. Observations from Uppsala » Blog Archive » Grant Martin on the “Verification is 70% of the Effort” Claim:

    [...] at Taken for Granted, Grant Martin just did a very good write-up on the “accepted fact” that verification is seventy percent of a chip design effort. It is not exactly easy to prove this point, but is it really just an urban myth that has gained [...]

  2. Jakob Engblom:

    Nice work! Looking deep into “accepted truth” can often show that there is very little substance to the fact… but here there definitely is some.

    What would be good to know is how much verification accounts for designs of different types, including completely-new designs as opposed to building from existing designs and making a few tweaks. But it seems to me that this type of data is as elusive as good data on how software development projects work out…


  3. Brian Bailey:

    When I used to give a lot of seminars, I would often use the 70% number as well. It kind of makes sense that it should be at least 50%, since both teams have to create a model (and we can argue somewhere else about which model is easier to create), and the verification team is often the group charged with actually running the verification and doing at least part of the debug. So I would often ask, by a show of hands, how many people thought they spent more than 70% on verification, and how many less than that. When I called on a few of those people I would get widely differing numbers, ranging from 20% to 90%.

    Then I would start asking some questions about what they included in the task of verification, and that is when things just got too difficult to control. For some it included some very high-level steps that others would call design, while others only considered it verification after the designers had passed it off as working. So part of the problem here is the definition of the verification task, and that depends on the design type, design style, group make-up and many other factors. Quite simply – I just don’t think you can put a number on it, except to say that it is significant and, without some verification breakthroughs, getting larger.

  4. Grant Martin:

    Brian, thanks very much for the additional anecdotal evidence, which indicates, as Jakob also said in his comment, that the amount of effort will vary a lot depending on design type, style, group makeup and lots of other factors. So your summary that “it is significant and without some verification breakthroughs – getting larger” fits nicely with my “plausible” conclusion on the reformulated hypothesis above.

    If anyone else has additional anecdotal or statistical evidence to add, please add a comment.

  5. Ross Dickson:

    It seems that Lauro Rizzatti of EVE is also using the 70% number over at the DACeZine http://www.dac.com/newsletter/shownewsletter.aspx?newsid=63

    “By consensus within the electronics industry and designers worldwide, verification consumes 70 percent or more of the development cycle.”

    I wonder how many significant digits can be extracted from a number based on “By consensus within the electronics industry”.

  6. Chris Wilson:

    Hi Grant,

    I posted on this subject a couple of weeks ago in my blog.
    You may want to take a look at it.

    I don’t think the overall number is as important as the trend.
    It is my belief that the percentage of verification doesn’t grow with design complexity. I think it is essentially a constant that depends on the methodology and culture of a particular project. If you have 2 verification engineers for each designer on your current project, you probably will have the same number on the next project even if it is a more complex design.

    I do agree that statements to the effect that verification is the bottleneck come from vendors or other entities with a vested interest. We have asked a lot of companies about how much effort they spend on verification. Many of them say, “we have two or three verification engineers per designer.” This is good enough for me to say they are spending 70% of their effort on verification.



  7. Grant Martin:

    Many thanks for your comment, your anecdotal way of assessing verification effort, and for your blog. I just went there and added it to my list of blog subscriptions and see there is a lot of food for thought there. I hope others who read this thread of comments will also check out your blog.
    Thanks, Grant

  8. Shalom Bresticker:

    Michael Bartley related to this claim in a recent paper at SNUG Europe 2008 called “Lies, Damn Lies and Hardware Verification”.

    I was the Technical Committee member assigned to that paper.

    Shalom Bresticker, Intel Jerusalem

  9. Grant Martin:

    Shalom, thanks very much. The paper and presentation are listed for access at
    However, you have to be a Synopsys customer with a Synopsys account to access papers here. It would be nice to see this paper out in the public domain especially if Michael has some background that can fill in some of the gaps on the claim.
    Best, Grant

  10. Justin:

    I suppose that “70%”, or any other number, is a subjective belief about the amount of design effort more or less tagged with the property called “verification”. It does not make much sense to require an accuracy within, say, 10 percentage points.

    Though subjective, this fraction is a fair and universal estimate of the verification effort required for today’s designs. A striking fact is that “verification” now requires more effort than “implementation” does, which is qualitatively understandable. Implementation of a component-based SoC has been practically reduced to integrating pre-designed components. The HDL code size grows only linearly with the number of components; however, the emerging properties caused by the integration are implicit and hidden, and must be explicitly controlled/observed by new code in TB construction (or formal temporal logic), in addition to the existing TB code at the component level. If there were a magical synthesis tool that could convert a TB composed in any HDL/HVL constructs to gates, the TB gates would be guaranteed to outnumber the gates in its DUT. The number of emerging properties, and therefore the effort spent verifying them, will simply grow ever faster as the level of design integration rises.

    Hence, 70%, if not already hit, will eventually be hit and soon become an under-estimate – unless a real methodological breakthrough happens in verification. Don’t we already have methodologies? Not really, if we understand “methodology” not in the way EDA vendors do. Current verification practices overall are still officially described as “ad hoc and experimental” (ITRS 2007, Design chapter). The mainstream verification approach, namely constructing sophisticated TBs, is actually leading us into a verification black hole. The first article in the HLDVT 2008 proceedings, which was written by me, is an attempt (at a higher level of abstraction) to turn attention away from the TB approach.

  11. Frank Schirrmeister:

    Trust us Germans. Some of us measure everything… I ran across a set of measured data on development effort, created between 1991 and 1994 in four projects in the video encoding and decoding domain in which I participated.

    I am not sure how to post a figure here in a comment, so I have put it into a post on my blog (linked below in the comment). My data from that time shows the effort distribution across four chip development projects. The first four elements are architecture definition, specification, RTL development and schematic entry. After the bar come the verification tasks: manual code inspections, validation by simulation and validation against a software reference (for only one chip). If we look at this as design vs. verification, then the verification effort even for these relatively small chips – they were all between 90K and 200K gates – was 54%, 44%, 46% and 45%.

    Scaling these data up to more complex designs can easily add 20%. In addition, the data was dominated by design and verification of the IP blocks themselves. If one now takes into account the system integration, i.e. connecting the IP blocks to make up a system or a system on chip, then verifying that integration will easily add some more verification complexity.

    Bottom line: my data seems to confirm that 70% effort for verification is a realistic number.

    Here is the Blog post with the figure in it: http://www.synopsysoc.org/viewfromtop/?p=59 .

  12. Larry Drenan:

    I used to make charts like those for Western Digital, and also control who could use the Zycads, and my experience bears out this 60-70% claim – at least back then. Intuitively, the fraction grows as designs get more complex.

    But there is another angle here. The full statement should be: “Verification takes x amount of time, and however much x is, it is usually not enough.” That is, designers have traditionally stopped verifying because time ran out, not because they thought they were done.

    Hopefully, newer methodologies with increased emphasis on formal verification and more integrated verification strategies will change this.

  13. Grant Martin:

    New evidence from Frank Schirrmeister. A Synopsys-supported analysis of design projects, carried out by IBS, reviewed the effort and elapsed time for 12 projects, and Frank summarises the findings in an article you can find online, “Increasing Verification Efficiency using Virtualization and Reuse of System-level models”, at URL http://www.synopsys.com/Tools/Verification/Newsletter/Pages/IncreasingVerification-art4.aspx

    Thanks to Frank for the pointer to new information in this area.

  14. Taken for Granted » New evidence for the Verification Takes X% of the effort ….:

    [...] of those who read this blog may recall a post I wrote last November, entitled “The Myths of EDA – the 70% rule”, about the oft-stated claim that Verification takes 70% of the effort in a design project. I was at [...]
