
Posts Tagged ‘EDA’


One EDA Company Embraces IP in an Extreme Way

Tuesday, June 7th, 2016

Silvaco’s acquisition of IPextreme points to the increasing importance of IP in EDA.

By John Blyler, Editorial Director

One of the most promising directions for future electronic design automation (EDA) growth lies in semiconductor intellectual property (IP) technologies, noted Laurie Balch in her pre-DAC analysis of the EDA market (the forecast formerly delivered by Gary Smith). As if to confirm this observation, EDA tool provider Silvaco has just announced the acquisition of IPextreme.

At first glance, this merger may seem like an odd match. Why would an EDA tool vendor that specializes in the highly technical analog and mixed-signal chip design space want to acquire an IP discovery, management, and security company? The answer lies in the past.

According to Warren Savage, former CEO of IPextreme, the first inklings of a foundation for the future merger began at DAC 2015. The company had a suite of tools and an ecosystem that enabled IP discovery, commercialization, and management. What it lacked was a strong sales channel and supporting infrastructure.

Conversely, Silvaco’s EDA tools were used by other companies to create customized analog chip IP. This has been the business model for most of the EDA industry, where EDA companies engineer and market their own IP. Only a small portion of the IP created by this model has been made commercially available to all.

According to David Dutton, the CEO of Silvaco, the acquisition of IPextreme’s tools and technology will allow Silvaco to unlock its IP assets and deliver this underused IP to the market. Further, this strategic acquisition is part of Silvaco’s three-year plan to double its revenues by focusing – in part – on strengthening its IP offerings in the IoT and automotive vertical markets.

Savage will now lead the IP business for Silvaco. The primary assets from IPextreme will now be part of Silvaco, including:

  • Xena – A platform for managing both the business and technical aspects of semiconductor IP.
  • Constellations – A collective of independent, likeminded IP companies and industry partners that collaborate at both the marketing and engineering levels.
  • ColdFire processor IP and various interface cores.
  • “IP Fingerprinting” – A package that allows IP owners to “fingerprint” their IP so that they and their customers can easily discover it in chip designs using “DNA analysis” software, without the need for GDSII tags.

The merger should benefit both companies. For example, IPextreme and its Constellation partners will now have access to a worldwide sales force and associated infrastructure resources.

On the other hand, Silvaco will gain the tools and expertise to commercialize their untapped IP cores. Additionally, this will complement the existing efforts of customers who use Silvaco tools to make their own IP.

As the use of IP grows, so will the need for security. To date, it has been difficult for companies to tell the brand and type of IP in their chip designs. This problem can arise when engineers unknowingly “copy and paste” IP from one project to another. The “IP fingerprinting” technology developed by IPextreme creates a digital representation of all the files in a particular IP package. This representation is entered into a Core Store that can then be used by other semiconductor companies to discover what internal and third-party IP is contained in their chip designs. This provides a way for companies to protect against the accidental reuse of their IP.
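To make the mechanism concrete, here is a minimal, purely illustrative sketch of file-level fingerprinting in Python: it hashes every file in an IP package and later checks a design tree against a store of known fingerprints. The function names, directory layout, and the simple one-hash-per-file scheme are assumptions for illustration only; the article does not describe IPextreme’s actual “DNA analysis” algorithms.

```python
import hashlib
from pathlib import Path

def fingerprint_package(package_dir):
    """Hash every file in an IP package (an illustrative stand-in for real fingerprinting)."""
    prints = {}
    for path in sorted(Path(package_dir).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            prints[digest] = str(path.relative_to(package_dir))
    return prints

def scan_design(design_dir, core_store):
    """Report which known IP packages appear (by matching file hashes) in a chip-design tree.

    core_store maps an IP name to the fingerprint dict produced by fingerprint_package().
    """
    design_hashes = set(fingerprint_package(design_dir))
    hits = {}
    for ip_name, prints in core_store.items():
        matched = design_hashes & set(prints)
        if matched:
            hits[ip_name] = len(matched)
    return hits

if __name__ == "__main__":
    # Placeholder paths; point these at a real IP package and design directory to try it.
    store = {"vendor_uart_v2": fingerprint_package("ip/vendor_uart_v2")}
    print(scan_design("designs/chip_a", store))
```

Note that a scheme of this kind only ever exchanges hashes, which is consistent with Savage’s point below that the design itself cannot be reconstructed from the fingerprint data.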

According to Savage, there is no way to reverse engineer a chip design from the fingerprinted digital representation.

Many companies have a disconnect between their engineering, legal, and business sides. This disconnect becomes a problem when engineers use IP without any idea of the licensing agreements attached to that IP.

“The problem is gaining the attention of big IP providers who are worried about the accidental reuse of third-party IP,” notes Savage. “Specifically, it represents a liability exposure problem.”

For smaller IP providers, having their IP fingerprints in the Core Store could potentially mean increased revenue as more instances of their IP become discoverable.

In the past, IP security measures have been implemented with limited success using hard and soft tags (see “Long Standards, Twinkie IP, Macro Trends, and Patent Trolls”). But tagging chip designs in this way was never really implemented in the major EDA place-and-route tools, like Synopsys’s IC Compiler. According to Savage, even fabs like TSMC don’t follow the Accellera tagging system, but have instead created their own security mechanisms.

For added security, IPextreme’s IP Fingerprinting technology does support the tagging information, notes Savage.

The Dangers of Code Cut-Paste Techniques

Wednesday, January 20th, 2016

A common coding practice can boost development efficiency but can also lead to software maintenance nightmares and technical debt.

By John Blyler, Editorial Director

At a recent EDA event, Semi-IP Systems talked with Cristian Amitroaie, the CEO of AMIQ, about the good and bad sides of developing software from code that is copied from another program. What follows is a paraphrased version of that video interview (AMIQ_DAC2015_Pt1). – JB

Cristian Amitroaie (left), the CEO of AMIQ, is interviewed by John Blyler (right), editorial director of JB Systems.

Blyler: I noticed that you recently released a copy-paste detection technology. Why is that important?

Amitroaie: We introduced these capabilities in Verissimo Testbench Linter. We’ve had it in mind for some time, both because it is very useful – especially when you accumulate large amounts of code – but also because it is fun to see how much copy-paste code exists in your program. There are all sorts of tools that detect copy-paste in the software development world, so why not include that capability in the chip design and verification space? Also, the topic of copy-paste code development is very interesting.
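As a rough illustration of how copy-paste (duplicate-code) detection can work in general – this is not a description of Verissimo’s actual algorithm – the sketch below hashes normalized windows of consecutive lines and flags any window that appears in more than one location. The window size, the comment-stripping rule, and all names are assumptions chosen for brevity.

```python
import hashlib
from collections import defaultdict

WINDOW = 6  # number of consecutive lines compared at a time (an illustrative choice)

def normalize(line):
    """Strip trailing '//' comments and whitespace so trivial edits don't hide duplicates."""
    return line.split("//")[0].strip()

def find_duplicates(files):
    """Map each repeated line-window hash to all (file, line) locations where it occurs.

    'files' maps a file name to its source text; only windows seen in two or more
    places are reported, which is the copy-paste signal a linter would flag.
    """
    seen = defaultdict(list)
    for name, text in files.items():
        lines = [normalize(l) for l in text.splitlines()]
        for i in range(len(lines) - WINDOW + 1):
            chunk = "\n".join(lines[i:i + WINDOW])
            if chunk.strip():
                key = hashlib.md5(chunk.encode()).hexdigest()
                seen[key].append((name, i + 1))
    return {k: v for k, v in seen.items() if len(v) > 1}
```

Production tools add token-level normalization and near-miss matching, but even this crude version shows why duplicate detection is a natural feature to bolt onto a linter.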

Blyler: Is the capability to copy and paste a good or a bad thing for developers?

Amitroaie: To answer that question, you first have to see how copying and pasting are being used. While it is not a fundamental pattern used by software engineers, it is a common technique. When engineers want to build something new or solve an existing problem, they usually start from something that is working or basically doing what they need. They take what exists to use directly or to enhance it for another application. Copying and pasting is a means to start something or enhance it. In software, it is very easy to do.

It may happen that you don’t know if what you need is already available. For example, you may work in a big company where similar activities are being done in parallel that you don’t know about. So you develop the same thing again. Now you have code duplication in the overall code base.

Another reason to use a copy-paste approach is that junior engineers lack the senior-level skills or experience to start from scratch. They copy and then build upon existing solutions.

Whatever the reason, in time most companies will have duplicate code. The fact that you use copy-paste to solve problems isn’t bad, because you take something that works. You don’t have to start from scratch, so you save time. You tweak the existing code and use it to solve a new problem. After all, engineering is about making things work. It’s not about finding the best, most ideal solution.

Blyler: We are talking about software programmers who prefer elegant solutions, aren’t we?

Amitroaie: Yes, but today you have lots of time pressures. Plus you often don’t have enough resources to get the best solution. But you need to get a solution within the market window. So elegance tends not to be the highest priority. The top thing is to make it work. In this sense, copying and pasting is a practice that makes sense. It is also unavoidable.

The bad thing is that, as time goes by, you accumulate and duplicate more code.  When a mistake is detected, you must now go to several places to fix it in the original code – if there is such a thing. It’s an interesting question: In this copy-paste world, is there such a thing as the original code? But that’s another matter.

Fixing or enhancing the code is problematic. For example, if you want to enhance an algorithm or functionality, you must remember where all the duplications are located. Many times, when you duplicate code, you don’t understand the intention of the original program. You think that you understand so you copy and paste it, adding a few tweaks. But maybe you really didn’t understand the intentions or implications of the original programmer and unknowingly insert a bug.

In this sense, copying and pasting is bad. As the code base grows, you can accumulate what is called “technical debt” from the copy-paste activity. Technical debt results from code that hasn’t been cleaned up. We never have time to clean up the code. We say that we’ll do it later but never do. If you go to your manager with a request for code clean-up, he or she will say “no.” Who approves time for code clean-up? Very few. They all talk about it, but I’ve never seen it happen. Even though we are in the EDA market, we are still software developers and have the same challenges when trying to improve our code. I know how hard it is for a team leader to approach code clean-up. This is why it is known as technical debt: it is analogous to the interest that accumulates on a financial loan. You add more and more, the clean-up debt accumulates, and that adds to higher maintenance costs over time. You can end up with huge piles of code where no one knows where it starts or ends, or how much is duplicated. That makes it tough to redesign or make the code more compact. It will blow up in your face at the worst possible moment.

Blyler: The debt comes due when you are least able to afford it.

Amitroaie: Yes, it’s unavoidable. It is similar to software entropy in that it keeps accumulating. At some point, it will become more cost effective to rewrite the code from scratch than to maintain it.

The good side of copying and pasting is that it is a fundamental way of getting code developed quickly. It helps programmers advance in an efficient way, at least from a results-oriented perspective. The bad side is that you accumulate technical debt that can lead to maintenance nightmares.

Blyler: Thank you.

For more information: AMIQ EDA Introduces Duplicate Code Detection in Its Verissimo SystemVerilog Testbench Linter

Originally posted on Chipestimate.com “IP Insider”

Long Standards, Twinkie IP, Macro Trends, and Patent Trolls

Friday, May 10th, 2013

In Part II, IPextreme’s Savage reveals why IP standards take so long, while discussing brand values, macro trends, and changes wrought by patent trolls.

Blyler: Last time we talked, we covered the ongoing development of a soft IP standard. Should we expect an update in the near future?

Savage: The standard is in draft form and being reviewed among the technical contributors at Accellera. It’s just a matter of getting consensus within the EDA community and with the equipment manufacturers, who will need a mechanism to read the soft IP. What helps a lot is that this standard is based on the existing one for hard IP tagging. There are just a few extra things that needed to be added (to the soft IP standard).

Blyler: Please tell us more about those “extra things.”

Savage: One of the extra things proposed in the soft IP standard is the inclusion of export control information in the tag. That’s important in IP. For example, if the semiconductor IP has an Export Control Classification Number (ECCN), that information could be placed in the tag. It could then be discovered later on an actual device – perhaps in a (geographic) location where it shouldn’t be.
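As a purely hypothetical illustration – the article does not spell out the Accellera tag format, so every field name below is an assumption – the kind of metadata such a tag might carry could look something like this:

```python
# Hypothetical soft-IP tag record, for illustration only (not the Accellera format).
# The point is simply that export-control data travels with the IP so it can be
# discovered later on a finished device.
ip_tag = {
    "vendor": "ExampleIP Inc.",        # IP supplier (fictitious)
    "core_name": "example_usb2_phy",   # which block this tag identifies
    "version": "2.3",
    "license_id": "EXIP-2016-0042",    # ties the instance back to a license agreement
    "eccn": "3E001",                   # illustrative Export Control Classification Number
}
```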

Blyler: Some have complained that the soft IP standard is taking too long to ratify. Any comments?

Savage: Things just move slowly in the semiconductor IP world – especially when you have interoperability with the EDA community. The challenge is that you need a handshake between the IP developers, the EDA companies who create the tools (that will need to make the [soft IP] machine-readable), and finally the semiconductor companies (who will actually be using the tools). You have to get all of those constituencies lined up. Like any standard, it takes a number of years before everyone agrees on the details and then gets the standard into widespread industry use.

Blyler: Has that process been made easier with all of the consolidations taking place in the EDA community? Do things move faster now because there are fewer players?

Savage: Surprisingly, the consolidation probably works against that. The problem is assessing a dollar value gain (to the soft IP). How much more can I charge if I support this standard? If you can’t answer that question, there is not a lot of motivation for EDA companies to invest in these things – especially in comparison to developing features for which people will pay extra money.

Blyler: Any other trends that you see?

Savage: We work with many companies to help create external channels for their internal IP. Lots of semiconductor companies talk with us about how to efficiently manage both their internal and external IP.

There is a nice video that Kevin Kline from Freescale did for us at our recent user event. One of his key points was that the value of the internal IP is worth more than the market cap of the company itself. It is analogous to Hostess Twinkies, in that the value of the brand is worth far more than the Hostess factories. It’s similar with IP at large semiconductor companies.

I’ve had analysts call me to ask about the value of a specific company IP portfolio in relation to the competition. It seems that an increasing number of semiconductor companies are taking a more strategic view of their IP – beyond just the raw material and resources point of view. Within the next five years, I think that companies will think completely differently about their IP. This is a big macro trend – a new way of looking at IP.

Blyler: How do companies determine the real marketplace value of their IP? Is there an accepted benchmark or other means of open comparison?

Savage: The situation is very fluid. Look at the activities of Google and Motorola, where companies were being bought just for their IP. But their IP became a lot more valuable once Apple and Google started fighting it out in the marketplace (i.e., iOS vs. Android). A company’s IP may not have much value until something happens in the market. Then it becomes extraordinarily valuable. The big problem facing most companies is that they don’t know what IP they have. They might have this big opportunity because the market shifts and they are suddenly sitting on a treasure trove that they didn’t know they had.

Blyler: What about patent and IP trolls? I’ve seen companies that announced partnerships with certain patent houses and then, a month later, sued a competitor for patent infringement. I’m wondering what effect that has on innovation. Do you think patent trolls slow down innovation in favor of quick financial returns?

Savage: Most people have a pretty negative view of patent trolls – like the modern version of the highwayman. The troll analogy is quite good, as they seem to wait for someone interesting to appear. Then they pop out and ask for “your money or your life.” Inevitably, the industry will be heading for some kind of legislation to put some brakes on that activity – especially since there are a lot of people trying to get rich quick by specifically setting up practices to do patent trolling. It’s an extremely negative thing. But that is another reason why companies need to be on top of what IP they have. In these situations, you might have cross-licensing and such.

Blyler: Thank you.



9 Issues Face Today’s Semiconductor Supply Chain

Friday, January 25th, 2013

While the Global Semiconductor Alliance (GSA) report focuses on China, the challenges discussed apply to the global IC-supply-chain market.

The GSA recently released its “State of China IC Design Industry 2012” report. While primarily focused on China, the report characterizes global challenges facing the semiconductor industry.

To understand these challenges, it’s important to understand today’s IC-supply-chain ecosystem, from EDA design tools and IP reuse to manufacturing and packaging processes. The report notes that fabless companies, which comprise most of the IC design space, rely on IP cores, libraries, design services, software, and embedded operating systems (OSs).

Once produced, most ICs (e.g., ASICs, FPGAs, etc.) are sold to system manufacturers to become part of a larger electronic system before entering the end market as a complete product.

Source: Dr. Wei’s presentation at GSA SLFT 2012

The GSA report lists nine major changes facing the supply-chain process:

  1. Planar CMOS Comes to an End
  2. Application-Driven Innovation
  3. Innovative Business Model
  4. Software Becomes a Must
  5. Knowledge about Process Technology
  6. Few Foundry Resources
  7. Foundry’s Support Capability Lowers
  8. Intention of Investment
  9. A New Relationship between Fabless and Foundry

I’ve covered most of these changes in past stories. But the cumulative impact of all nine warrants a fresh look at each. We’ll start with the first one next time.




SoC Costs Cut by Multi-Platform Design

Friday, June 1st, 2012

Upward SoC cost trend blunted as designers reused software, pre-verified IP, and fewer blocks, reports long-time EDA analyst Gary Smith.

During last year’s Design Automation Conference (DAC), EDA-veteran analyst Gary Smith predicted that it cost slightly over $75 million to design the average high-end System-on-Chip (SoC). This was way over the $50 million targeted by IDM-fabless companies and even further from the $25 million start-up level preferred by funding institutions.

Shortly after that prediction, several companies reported building SoCs at around the $40 million level. How did they beat the expectation? First, they used previously developed software. Second, they used IP that came with verification suites. Lastly, these companies significantly decreased the number of SoC blocks – below the preferred five core blocks. Taken together, these three factors constituted a methodology nicknamed the Multi-Platform Based Design approach.

In essence, this approach was based on the integration of existing platforms enhanced with a new application level to add competitive advantage. The greatest cost savings was realized from the reduction of new core designs.

The multi-platform based design approach has three levels: functional, foundation, and application. The functional level represents the core of the SoC design and is the broadest of the three platforms. It typically comes from a third party – e.g., an ARM Cortex-A9 processing system – and is not geared to a specific industry or product. If it comes from an in-house design, then it consists of all reused cores. This level provides no competitive advantage, since it uses third-party cores or IP.

The foundation platform, also usually from a third-party vendor, provides only a slight industry or market differentiation. Most foundation cores are focused on the mobile and consumer electronic markets, e.g., Nvidia’s Tegra 3, TI’s OMAP, and Qualcomm’s Snapdragon platforms. While enabling differentiation for a particular market segment – often the mobile or consumer electronic markets – foundation cores still provide only a small competitive advantage. Together, the functional and foundation platforms make up between 75 and 90 percent of the total gates in the SoC design.

At the top of the multi-platform based design is the application level, which provides the most market differentiation. This level consists of in-house or proprietary designs, e.g. IP or software from car-maker Audi’s navigation and infotainment systems. The drawback is that this level has the shortest product life cycle.

Popular applications can move from the application level to the foundation level, as in the case of GPS and GPU SoCs. Foundation suppliers then begin to include these popular IPs in their regular offerings. If the application involves processing – like a GPU – then it may even evolve into the functional level.

Those companies that create a popular application offering have a sustainable advantage, which becomes very hard for competitors to surpass. Smith cited the example of the PC market. IBM developed the original PC, but within a decade Intel had taken over the market thanks to its platform approach. Now, as processing has shifted to low-power mobile devices, Intel’s platform has been surpassed by ARM’s.

Smith suggested that the good news for DAC is that the platform companies will find welcome business for their IP in the evolving system-level EDA market.

Points-of-Interest in the “DAC Zone”

Thursday, May 31st, 2012

If I had my choice, these are the papers and events that I would attend at the upcoming Design Automation Conference (DAC).

As Sean “Rod Serling” O’Kane intones: “… you’re moving into a land of both substance and possibilities … You’ve just crossed over into the DAC Zone.”

  

In that same spirit, I’ve scoured the upcoming DAC schedule to find the papers and events of both substance and possibilities. What follows is my list of activities that grabbed my attention – my DAC “must-sees.”

There is just one problem: I’m not the captain of my fate at trade shows. Typically, my schedule is decided by others. But if your fate is freer, then I humbly submit these entries for your consideration in “the DAC Zone.”

++++++++++++++++

Sunday (June 3, 2012)

7pm – Come hear the 24th annual update on the state of EDA by Gary Smith.

This year’s talk will focus on multi-platform designs and how these platforms are dramatically cutting the cost of design. (Location: Marriott Hotel, Salon 6) 

+++++++++++++++++

Monday (June 4, 2012)

8:30am – System-Level Exploration of Power, Temperature, Performance, and Area for Multicore Architectures

Summary: With the proliferation of multicore architectures, system designers critically need simulation tools to perform early design space exploration of different architectural configurations. Designers typically need to evaluate the effect of different applications on power, performance, temperature, area and reliability of multicore architectures. (Location: 305, Tutorial repeats at 11:30am and 3:30pm)

11:30 am – Dr. John Heinlein of ARM will present the “IP Talks!” keynote. ( Chipestimate.com booth #1202) 

12:15 pm – A celebration of the 10th Anniversary of OpenAccess – Si2 Open Luncheon (Location: 303)

1:00 pm – Xilinx’s Tim Vanevenhoven will probably talk about the challenges of FPGA IP integration. Tim is an engaging speaker. Be sure to ask him about his recent cart-athlon experience. (Chipestimate.com Booth 1202)

3:15pm - Pavilion Panel: The Mechanics of Creativity

What does it take to be an idea machine? Design is an inherently creative process, but how can we be creative on demand? How can we rise above mundane tasks with flashes of brilliance? Discover secrets of technical and business creativity and calculated risk taking, and share stories of innovation. (Location: Booth #310)

Moderator: Karen Bartleson from Synopsys, Inc.

Speakers: Dee McCrorey from Risktaking for Success LLC; Sherry Hess from AWR Corp.;    Lillian Kvitko from Oracle

+++++++++++++++++++++

Tuesday (June 5, 2012)

8:30 am - Keynote: Scaling for 2020 Solutions 

Comparing the original ARM design of 1985 to today’s latest microprocessors, ARM’s Mike Muller will look at how far design has come, what EDA has contributed to enabling these advances in systems, hardware, operating systems, and applications, and how business models have evolved over 25 years. He will then speculate on the needs for scaling designs into solutions for 2020, from tiny embedded sensors through to cloud-based servers, which together enable the Internet of Things. He will look at the major challenges that need to be addressed to design and manufacture these systems and propose some solutions. (Location: 102/103)

10am – Pavilion Panel: Hogan’s Heroes: Learning from Apple

Apple. We admire their devices, worship their creators and praise their stock in our portfolios. Apple is synonymous with creative thinking, new opportunities, perseverance and wild success. Along the road, Apple set new technical and business standards. But how much has the electronics industry, in particular EDA, “where electronics begins,” learned from Apple? It depends. (Location: Booth #310)

Moderator: Jim Hogan from Tela Innovation, Inc.

Speakers: Jack Guedjf from Tensilica, Inc.; Tom Collopy from Aggios, Inc.; and Jan Rabaey – Univ. of California, Berkeley

 

(Why did the DAC committee schedule these two powerful talks at the same time?)

10am – Software and Firmware Engineering for Complex SoCs

Summary: Early software development is crucial for today’s complex SoCs, where the overall software effort typically eclipses the hardware effort. Further, delays in software directly impact the time to market of the end product. The presentations in this session explore how to architect ASIPs for wireless applications, how to bridge RTL and firmware development, and approaches in pre-silicon software development. (Location: 106)

Speakers from IMEC, Marvell, and Intel

11am – (Research Paper) Design Automation for Things Wet, Small, Spooky, and Tamable - Realizing Reversible Circuits Using a New Class of Quantum Gates

Summary: The future of design automation may well be in novel technologies and in new opportunities. This session begins with design techniques that in the past may have applied exclusively to electronic design automation, but now are applied to the wet (microfluidics), the small (nanoelectronics), and the spooky (quantum). The papers cover routing and placement, pin assignment, cell design, and technology mapping applied to microfluidics biochips, quantum gates, and silicon nanowire transistors. (Location: 300)

1:30pm – Can EDA Combat the Rise of Electronic Counterfeiting?

Summary: The Semiconductor Industry Association (SIA) estimates that counterfeiting costs the US semiconductor companies $7.5B in lost revenue, and this is indeed a growing global problem. Repackaging the old ICs, selling the failed test parts, as well as gray marketing, are the most dominant counterfeiting practices. Can technology do a better job than lawyers? What are the technical challenges to be addressed? What EDA technologies will work: embedding IP protection measures in the design phase, developing rapid post-silicon certification, or counterfeit detection tools and methods? (Location: 304)

– I’ve been discussing this area with growing interest:

1:30pm – 9.1: Physics Matters: Statistical Aging Prediction under Trapping/Detrapping

With shrinking device sizes and increasing design complexity, reliability has become a critical issue. Besides traditional reliability issues for power delivery networks and clock signals, new challenges are emerging. This session presents papers that cover a wide spectrum of reliability issues including long-term device aging, verification of power and 3-D ICs, and high-integrity, low-power clock networks. (Location: 300)

 

2pm – Stephen Maneatis of True Circuits will undoubtedly highlight trends in low node PLL and DLL IP, a critical element in all ICs.

 

4pm – Self-Aware and Adaptive Technologies: The Future of Computing Systems? — 14.1: Self-Aware Computing in the Angstrom Processor

Summary: This session will present contributions from industry and universities toward the realization of next-generation computing systems based on Self-Aware computing. Self-Aware computing is an emerging system design paradigm aimed at overcoming the exponentially increasing complexity of modern computing systems and improving performance, utilization, reliability, and programmability. In a departure from current systems which are based on design abstractions that have persisted since the 1960s which place significant burden on programmers and chip designers, Self-Aware systems mitigate complexity by observing their own runtime behavior, learning, and taking actions to optimize behaviors automatically. (Location: 304)

 

 

+++++++++++++++++++

Wednesday (June 6, 2012)

 

9:15am – Dark Side of Moore’s Law

Semiconductor companies double transistor counts every 22 months, yet device prices stay relatively the same. This has been a windfall for customers but not for chip makers, who have exponentially increasing design costs every new cycle. Venture capitalist Lucio Lanza and panelists will discuss what it will take to bring design costs and profitability back into harmony with Moore’s Law. (Location: Booth #310)

Moderator: Lucio Lanza – Lanza TechVentures

Speakers: John Chilton from Synopsys, Consultant Behrooz Abdi and Steve Glaser from Xilinx

 

 

 


 

 

 

9:30am – Low-Power Design and Power Analysis –  22.2: On the Exploitation of the Inherent Error Resilience of Wireless Systems under Unreliable Silicon

For some applications, it is sometimes worth giving up a limited amount of precision or reliability if that leads to significant power savings. Similarly, being able to operate “off the grid” means one needs to give up the certainty of traditional power sources to enable power harvesting opportunities. The papers in this session illustrate the trade-offs inherent in operating in extreme low-power regimes. (Location: 306)

 

10:45am – Keynote: Designing High Performance Systems-on-Chip

Experience state-of-the-art design through the eyes of two experts who help shape these advanced chips! In this unique dual keynote, IBM’s Joshua Friedrich and Intel’s Brad Heaney will discuss the design process at two leading companies. The speakers will cover key challenges, engineering decisions, and design methodologies to achieve top performance and turnaround time. The presentations describe where EDA meets practice at the most advanced nodes, so they will be of key interest to designers and EDA professionals alike. (Location: 102/103)

 

1:30pm – Design Challenges and EDA Solutions for Wireless Sensor Networks

The good folks at CEA-LETI, Grenoble, France, aim to present a complete overview of the state-of-the-art technologies and key research challenges for the design and optimization of wireless sensor networks (WSN). Thus, it will specifically cover ultra-low-power (ULP) computing architectures and circuits, system-level design methods, power management, and energy-scavenging mechanisms for WSN. A key aspect of this special session is the interdisciplinary nature of the discussed challenges in WSN conception, which go from basic hardware components to software conception, which requires an active engagement of both academic and industrial professionals in the EDA field, computer and electrical engineering, computer science, and telecommunication engineering. (Location: 304)

 

3pm – Synopsys’s John Swanson speaks on verification IP. Afterward, Cadence’s Susan Peterson will talk on the same topic. It might be worth listening to see how the two EDA giants differentiate themselves from one another. (Chipestimate.com Booth 1202)

 

3:30pm – Cadence’s Susan Peterson will address the audience on verification IP. You’ll probably want to catch the prior Synopsys presentation, too.

 

3:30pm – Pavilion Panel: Teens Talk Tech

High school students tell us how they use the latest tech gadgets, and what they expect to be using in three to five years. They give insights into the next killer applications and what they would like to see in the next generation of hot new electronics products that we should be designing now. (Location: Booth #310)

Moderator: Kathryn Kranen from Jasper Design Automation

Speakers: Students from Menlo High School, Atherton, CA

 

4pm – Breaking out of EDA: How to Apply EDA Techniques to Broader Applications

Throughout its history, myriads of innovations in EDA (Electronic Design Automation) have enabled high performance semiconductor products with leading edge technology. Lately we have observed several research activities where EDA innovations have been applied to broader applications with complex nature and the large scale of data sets. The session provides some tangible results of these multi-disciplinary works where non-traditional EDA problems directly benefit from the innovation of EDA research. The examples of non-EDA applications vary from bio-medical applications to smart water to human computing. (Location: 304)

 

4:30pm – Pavilion Panel: Hardware-Assisted Prototyping and Verification: Make vs. Buy?

As ASIC and ASSP designs reach billions of gates, hardware-assisted verification and/or prototyping is becoming essential, but what is the best approach? Should you buy an off-the-shelf system or build your own? What criteria – time-to-market, cost, performance, resources, quality, ease of use – are most important? Panelists will share their real world design trade-offs. (Location: Booth #310)

Moderator: Gabe Moretti from Gabe on EDA

Speakers: Albert Camilleri from Qualcomm, Inc.; Austin Lesea from Xilinx, Inc.; and Mike Dini from The Dini Group, Inc.

 

 

 +++++++++++++++++

Thursday (June 7, 2012)

 

11am – Keynote: My First Design Automation Conference – 1982

C. L. Liu talks about his first DAC experience: It was June 1982 that I had my first technical paper in the EDA area presented at the 19th Design Automation Conference. It was exactly 20 years after I completed my doctoral study and exactly 30 years ago from today. I would like to share with the audience how my prior educational experience prepared me to enter the EDA field and how my EDA experience prepared me for the other aspects of my professional life.

 

1:30pm – It’s the Software, Stupid! Truth or Myth?

It’s tough to differentiate products with hardware. Everyone uses the same processors, third party IP and foundries; now it’s all about software.  But, is this true?  Since user response, power consumption and support of standards rely on hardware, one camp claims software is only as good as the hardware it sits on. Opponents argue that software differentiates mediocre products from great ones. A third view says only exceptional design of both hardware and software creates great products – and the tradeoffs make great designers. Watch industry experts debate whether it’s really all about software. (Location: 305)

Chair: Chris Edwards from the Tech Design Forum

Speakers: Serge Leef from Mentor Graphics Corp.; Chris Rowen from Tensilica, Inc.; Debashis Bhattacharya from FutureWei Technologies, Inc.; Kathryn S. McKinley from Microsoft Research, Univ. of Texas; and Eli Savransky from NVIDIA Corp.

 

3:30pm – Parallelization and Software Development: Hope, Hype, or Horror?

With the fear that the death of scaling is imminent, hope is widespread that parallelism will save us. Many EDA applications are described as “embarrassingly parallel,” and parallel approaches have certainly been effectively applied in many areas. Before the panel begins, come hear perspective on software development and the challenges associated with writing good software that are only exacerbated by the growing need to write robust, testable, and efficient parallel applications. Then watch the panelists debate future productive directions and dead ends to developing and deploying parallel algorithms. Find out if claims to super speedups are exaggerated and if the investment in parallel algorithms is worth the high development cost. (Location: 305)

Chair: Igor Markov from the Univ. of Michigan

Speakers: Anirudh Devgan from Cadence Design Systems, Inc.; Kunle Olukotun from Stanford Univ.; Daniel Beece from IBM Research; Joao Geada from CLK Design Automation, Inc.; and Alan J. Hu from the Univ. of British Columbia

 

3:30pm – Research Paper: Wild And Crazy Ideas

It cannot get any crazier! Your friends on Facebook verify your designs. Your sister is eavesdropping on your specification. Do not take “no” for implication. Build satisfying circuits with noise. Let spin-based synapses make your head spin. Use parasitics to build 3-D brains. (Location: 308)

– 53.1: CrowdMine: Towards Crowdsourced Human-Assisted Verification

Chair:   Farinaz Koushanfar from Rice Univ.

Speakers: Wenchao Li from the Univ. of California, Berkeley; Sanjit A. Seshia from the Univ. of California, Berkeley; and Somesh Jha from the Univ. of Wisconsin

 

+++++++++++++++++

Works in progress

 

55.18 — Using a Hardware Description Language as an Alternative to Printed Circuit Board Schematic Capture

This paper proposes using hardware description languages (HDLs) for PC board schematic entry. Doing so provides benefits already known to ASIC and FPGA designers, including the ability to design using standard and open languages, the ability to edit designs using familiar text editors, the availability of source code control systems for collaboration and for tracking and managing design changes, and the use of IDEs to help in the design entry process. This talk will introduce PHDL – an HDL specifically developed for PC board design capture – and describe examples of its initial use for PC board designs.

Speakers from Brigham Young Univ.

55.21 — TinySPICE: A Parallel SPICE Simulator on GPU for Massively Repeated Small Circuit Simulations

Nowadays variation-aware IC designs require many thousands or even millions of repeated SPICE simulations for relatively small nonlinear circuits. In this work, we present a massively parallel SPICE simulator on GPU, TinySPICE, for efficiently analyzing small nonlinear circuits, such as standard cell designs, SRAMs, etc. Our GPU implementation allows for a large number of small circuit simulations in GPU’s shared memory that involve novel circuit linearization and matrix solution techniques, and eliminates most of the GPU device memory accesses during the Newton-Raphson iterations, which thereby enables extremely high-throughput SPICE simulations on GPU. Compared with CPU-based SPICE simulations, TinySPICE achieves up to 264X speedups for SRAM yield analysis without loss of accuracy.

Speakers from Michigan Technological University

 

+++++++++

Originally published on Chipestimate.com – “IP Insider”

Blacker Boxes Lie Ahead

Thursday, June 30th, 2011

Few pundits have addressed the systems engineering implications of the recent move by EDA and semiconductor companies toward platforms that include both chip hardware and associated firmware.

Niche industries are notoriously myopic. They have to be, since excelling in a highly specific market segment usually requires a sharp focus on low-level details. A good example of a niche market is the semiconductor Electronic Design Automation (EDA) tools industry, that fine group of highly educated professionals who create the tools that allow today’s atom-sized transistors to be designed and manufactured.

The EDA industry has long talked about the importance of software (mostly firmware) as a critical complement to the design of processor-intensive hardware ASICs. While the acknowledgement of the importance of software is nothing new, it has only been in the last few years that actual hardware-software platforms have been forthcoming from the industry.

What does this trend – the move to include firmware (device drivers) with System-on-Chip integrated circuits (ICs) – really mean? To date, the result is that companies offer a platform that contains both the SoC hardware and the accompanying firmware. In some cases, like Mentor’s, the platform also includes a Real-Time Operating System (RTOS) and embedded code optimization and analysis capabilities.

One could argue that this move to include software with the chips is an inevitable step in upward abstraction, driven by the commoditization of processor chips. Others argue that end users are demanding it, as time-to-market windows shrink in the consumer market.

But rather than follow the EDA viewpoint, let’s approach this trend from the standpoint of the end-user. I define the end-user as the Systems Engineer who is responsible for the integration of all the hardware and software into a workable end-system or final product (see figure). Note the big “S” in SE, meaning the system beyond the hardware or software subsystems.

The integration phase of the typical Systems Engineering V diagram is just as critical as the design phase for hardware-software systems.

What is the end-system or final product? It might be a digital camera or tablet; or perhaps a heads-up display for commercial or military aircraft; or even a radiation detector for a homeland security device. Regardless of the end system or product, the role of the Systems Engineer is changing as he/she receives software-supported ICs from the chip supplier, courtesy of the EDA industry. In essence, the “black box” that traditionally consisted of a black-packaged chip just got a bit blacker.

Some might say that the systems engineer now has less to worry about. No longer will the SE have to manage the hardware and software co-design and co-verification of the chip. Traditionally, that would mean long meetings and significant time spent in “discussions” with the chip designers and the firmware developers over interface issues. Today, that job has effectively been done by the EDA company and the chip supplier, as the latest generation of chips comes with the needed firmware, e.g., the first offering from Cadence’s multi-staged EDA360 strategy.

On the embedded side, the chip-firmware package might also include an RTOS and tools for software developers to optimize and analyze their code. Mentor is leading this area among EDA tool suppliers.

But how does this happy union of chip hardware and firmware affect the work of a module or product level SE? Does it make his/her job easier? That is certainly the goal, e.g., to greatly reduce co-design and co-verification issues between the silicon development and associated software while including hooks into upper level application development. Now, companies claim that many of these issues have been taken care of for a variety of processor systems.

One should note that these chip hardware-software platforms don’t yet really extend to the analog side of the business. This is hardly surprising, since the software requirement is far smaller than on the digital processor side. Still, software is needed for such things as communication protocol stacks (think PHY and MAC layers).

Yet, even on the digital side of the platform space, important considerations remain. How does hardware and software intellectual property (IP) fit into all of this? Has the new, higher abstracted blacker box that SEs receive been fully verified? The answer to this question might be partially addressed by the emergence of IP subsystems (“IP Subsystem – Milestone or Flashback“).

Other questions remain. How will open system code and tools benefit or hinder the hardware chip and firmware platforms? From an open systems angle, the black box may be less black but is still opaque to the System Engineer.

What will be the new roles and responsibilities for systems engineers during design and – perhaps more importantly – during the hardware and software integration phase? Will he/she have to re-verify the work of the chip hardware-software vendors, just to be sure that everything is working as required? Will the lower-level SE, formerly tasked with integrating chip hardware and firmware, now be out of a job?

If history is any indication, then we might look back to the early days of RTL synthesis for clues. With the move to include chip hardware and firmware, the industry might expect a shifting of job responsibilities. Also, look for a slew of new interface and process standards to deal with issues of integration and verification. New tool suites will probably emerge.

How the new chip hardware and firmware platforms will affect the integration Systems Engineer is not yet certain. But SEs are very adaptable. A black box – even a blacker one – still has inputs and outputs that must be managed between various teams throughout the product life cycle. At least that activity won’t change.

Mentor or EDA Industry – Who is to Blame?

Friday, April 1st, 2011

Mentor’s woes with Carl Icahn may stem from a misunderstanding of the complicated EDA market rather than the company’s financial condition.

The local paper – the Oregonian – has been dutifully reporting on Carl Icahn’s noisy financial challenges to Mentor Graphics: Casablanca Capital sides with Carl Icahn, castigates Mentor Graphics

An important element in this ongoing challenge is that, until yesterday, Icahn never really offered to buy Mentor at $17/share – a point that Wally Rhines recently confirmed. Rather, Icahn said that “if” he were to buy Mentor, then he would pay that amount. This minor clarification is important since no one has stepped forward to buy Mentor, which means that Icahn has no quick way of exiting the Mentor mess.

I would argue that, in terms of value to the investors, Mentor is no better or worse than Cadence or Synopsys. The challenge to investors is the EDA market, which is a very different beast from, say, the Internet or video rental businesses. (Both Yahoo and Blockbuster were past Icahn purchases.) This is the mistake that I believe Icahn is making, i.e., not understanding the market.

[This is part of an ongoing discussion on Facebook.]

Mentor-Icahn and the Outside Acquisition of EDA

Friday, July 16th, 2010

Mike Rogoway of the Portland-based Oregonian recently wrote a good summary of investor Carl Icahn’s potential reasons for increasing his stake in Mentor Graphics: Mentor Graphics rebounds, but Carl Icahn casts a shadow

In talking with Mike for his story, I began to ponder the actual possibility of an Enterprise Resource Planning (ERP) or Product Lifecycle Management (PLM) company acquiring an EDA company.

Over the last several years, many – including myself – have speculated that a PLM company or a supply chain vendor company might buy an EDA tool vendor. While special cases of the reverse have happened – consider Mentor’s recent acquisition of Valor in the PCB space – I no longer think that an outside business will acquire any EDA company.

My change in opinion came from a discussion with a business-savvy expert. He pointed out that the merger of a PLM or ERP company with an EDA vendor would ultimately result in one CEO managing both companies. This would be a serious problem, since a PLM or ERP CEO would have little or no understanding of the technology or business needs of the EDA world.

As further proof of the natural break point between the two worlds, I ask this question: Can anyone remember the last time a business outside of EDA came into our chip industry to acquire an EDA company?

This doesn’t mean the EDA suppliers are in trouble, i.e., that no one will acquire them. Indeed, it suggests that a natural break point exists between EDA and PLM-ERP companies.

If you accept this reasoning, then extend it slightly to a more natural and obvious business and technology break point – namely, between hardware and software systems. Although hardware is becoming a commodity and software a differentiator, does it really make sense for hardware companies to acquire software vendors, e.g., Intel and Wind River? Have such unions ever been successful in the past? Now there is something to think about.

Hardware “Software” is not Software “Software”

Thursday, May 13th, 2010

As EDA and semiconductor communities venture more fully into the realm of software development and applications, they must remember that software has a multitude of meanings.

Suddenly, everyone is talking about software. The great “software” epiphany has been heralded by almost all levels of hardware developers. But are they all referring to the same “software”? For example, do they mean the EDA software that is used to automate the extremely complex task of creating integrated circuits at submicron geometries? Perhaps some are referring to the hardware-intensive firmware drivers that are embedded inside the chip or board. Maybe others mean the operating system that communicates with the high-level systems on the board? Or does software mean the protocol stacks that are necessary for most of today’s wired and wireless interfaces? Do you suppose they mean the user interfaces or application programs, such as those written for the iPhone? Could it be they are referring to higher-level (non-embedded) desktop or server-level applications? Or scripting for the latest Flash or website program?

But even these semantic ponderances are too hardware-focused. Looking from the software side, where open systems and inexpensive apps – like those for the iPhone – are becoming the mainstay, programmers see a different picture. The world of software development – beyond hardware-centric firmware and operating system – is also changing in profound ways. Even the sci-fi community has picked up on this trend (see my interview with Lou Anders).

In the past, hardware-centric companies have not done well in the software world. Most fail to understand the real differences between hardware and software technology, as well as the differing business models that govern each discipline. In addition to these challenges, hardware companies must be diligent in watching trends on both the hardware and software sides of the business. This is no easy feat, but success will depend upon their intimate semantic understanding of “software” – be it hardware “software” or software “software.”
