
Posts Tagged ‘Mentor Graphics’


World of Sensors Highlights Pacific NW Semiconductor Industry

Tuesday, October 25th, 2016

Line-up of semiconductor and embedded IoT experts to talk at the SEMI Pacific NW “World of Sensors” event.

The Pacific NW Chapter of SEMI will hold its Fall 2016 event highlighting the world of sensors. Mentor Graphics will host the event on Friday, October 28, 2016, from 7:30 to 11:30 am.

The event will gather experts from the sensor field who will share their visions of the future and the impact sensors may have on the overall semiconductor industry. Here’s a brief list of the speaker line-up:

  • Design for the IoT Edge—Mentor Graphics
  • Image Sensors for IoT—ON Semiconductor
  • Next Growth Engine for Semiconductors—PricewaterhouseCoopers
  • Expanding Capabilities of MEMS Sensors through Advanced Manufacturing—Rogue Valley Microdevices
  • Engineering Biosensors for Cell Biology Research and Drug Discovery—Thermo Fisher Scientific

Register today to meet and network with industry peers from companies including Applied Materials, ASM America, Brewer Science, Cascade Microtech, Delphon Industries, FEI Company, Kanto, Microchip Technology, SSOE Group, VALQUA America, and many more.

See the full agenda and register today.

Increasing Power Density of Electric Motors Challenges IGBT Makers

Tuesday, August 23rd, 2016

Mentor Graphics answers questions about failure modes and simulation/testing for IGBT and MOSFET power electronics in electric and hybrid-electric vehicles (EV/HEV).

By John Blyler, Editorial Director

Most news about electric and hybrid vehicle (EV/HEV) electronics focuses on the processor-based engine control and the passenger infotainment systems. Of equal importance are the power electronics that support and control the actual vehicle motors. On-road EVs and HEVs run on either AC induction or permanent magnet (PM) motors. These high-torque motors must operate over a wide range of temperatures and in often electrically noisy environments. The motors are driven by converters that generally contain a main IGBT or power MOSFET inverter.

The constant power cycling that occurs during the operation of the vehicle significantly affects the reliability of these inverters. Design and reliability engineers must simulate and test the power electronics for thermal reliability and lifecycle performance.

To understand more about the causes of inverter failures and the tests that reveal these failures, I presented the following questions to Andras Vass-Varnai, Senior Product Manager for the MicReD Power Tester 600A in Mentor Graphics’ Mechanical Analysis Division. What follows is a portion of his responses. – JB

 

Blyler: What are some of the root causes of failure for power devices in EV/HEVs today, namely for insulated gate bipolar transistors (IGBTs), power MOSFETs, and chargers?

Vass-Varnai: As the chip and module sizes of power devices keep shrinking while the required power dissipation stays the same or even increases, the power density in these devices increases, too. Rising power densities require careful thermal design and management. The majority of failures are thermally related: the temperature differences between the material layers within an IGBT or MOSFET structure, combined with the differences in the coefficients of thermal expansion (CTE) of those layers, lead to thermo-mechanical stress.
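[For rough intuition – my gloss, not part of Vass-Varnai’s answer – the thermal mismatch strain at a bonded interface scales with the CTE difference times the temperature swing:

$$ \varepsilon \approx (\alpha_1 - \alpha_2)\,\Delta T $$

so a 10 ppm/K mismatch cycled over an 80 K swing imposes roughly 0.08% strain on every power cycle. – JB]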

Failures ultimately develop at these layer boundaries or interconnects, such as the bond wires, die attach, and base-plate solder (see Figure 1). Our technology can induce the failure mechanisms using active power cycling and can track a failure while it develops using high-resolution electrical tests, from which we derive thermal and structural information.

Figure 1: Cross-section of an IGBT module.

Blyler: Reliability testing during power cycling improves the reliability of these devices. How was this testing done in the past? What new technology is Mentor bringing to the testing approach?

Vass-Varnai: The way we see it, these tests were traditionally done in a very simplified way. Companies used tools to stress the devices with power cycles, but those technologies were not combined with in-progress characterization. They started the tests, stopped to see whether any failure had occurred (using X-ray microscopy, ultrasonic microscopy, sometimes dissection), then continued the power cycling. Testing this way took much more time and user interaction, and there was a chance that the device would fail before one had a chance to take a closer look at the failure. In some more sophisticated cases, companies tried to combine the tests with basic electrical characterization, but none of these approaches were as sophisticated and complete as those offered by today’s power testers. One major advantage of today’s technology is the high-resolution (about 0.01 °C) temperature measurement and the structure-function technology, which help users precisely identify the structural layer in which the failure develops and its effect on the thermal resistance, all embedded in the power-cycling process.
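[To make the in-progress electrical characterization concrete, here is a minimal, hypothetical Python sketch – mine, not Mentor’s implementation – of the standard approach: calibrate the device’s forward voltage (a temperature-sensitive parameter, roughly -2 mV/°C for a silicon junction) at known temperatures, then map voltage readings taken during power cycling back to junction temperature. All numbers are invented. – JB]

    # Fit the calibration line V_f = slope*T + intercept (the slope is the
    # so-called K-factor) from forward-voltage readings at known temperatures.
    def fit_k_factor(voltages, temps):
        n = len(temps)
        t_bar = sum(temps) / n
        v_bar = sum(voltages) / n
        slope = (sum((t - t_bar) * (v - v_bar) for t, v in zip(temps, voltages))
                 / sum((t - t_bar) ** 2 for t in temps))
        return slope, v_bar - slope * t_bar

    # Map a forward-voltage reading taken mid-test back to junction temperature.
    def junction_temp(v_f, slope, intercept):
        return (v_f - intercept) / slope

    # Made-up calibration points at 25, 75, and 125 degC (K-factor = -2 mV/degC):
    slope, intercept = fit_k_factor([0.650, 0.550, 0.450], [25.0, 75.0, 125.0])
    print(junction_temp(0.520, slope, intercept))  # -> 90.0 degC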

The combination with simulation is also unique. To calculate lifetime for a given vehicle mission profile, one needs to simulate very precisely the temperature changes in an IGBT. To do this, the simulation model has to behave exactly like the real device for both steady-state and transient excitations. The thermal simulation and testing system must therefore be capable of taking real measurement data and calibrating the simulation model for precise behavior.
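[For readers unfamiliar with such compact thermal models: they are commonly expressed as Foster RC networks whose step response is fitted to a measured transient and then converted to a physical (Cauer) ladder via structure functions. A short illustrative Python sketch with invented element values – not Mentor’s model – follows. – JB]

    import math

    # Hypothetical Foster stages as (R_th [K/W], tau [s]) pairs, loosely
    # standing in for die, die-attach, and base-plate contributions.
    FOSTER_STAGES = [(0.05, 1e-3), (0.15, 5e-2), (0.30, 1.0)]

    # Thermal impedance seen at the junction, t seconds after a power step.
    def z_th(t):
        return sum(r * (1.0 - math.exp(-t / tau)) for r, tau in FOSTER_STAGES)

    # Junction temperature rise for a 100 W power step, 10 s in:
    print(100.0 * z_th(10.0))  # -> ~50 K (approaches 100 W x 0.5 K/W)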

Blyler: Can this tester be used for both (non-destructive) power-cycle stress screening as well as (destructive) testing the device all the way to failure? I assume the former is the wider application in EV/HEV reliability testing.

Vass-Varnai: The system can be used for non-destructive thermal metrics measurements (junction temperature, thermal resistance) and also for active power cycling (which is a stress test), and it can automatically track the development of the failure (see Figure 2).

Figure 2: Device voltage change during power cycling for three tested devices in the Mentor Graphics MicReD Power Tester 1500A.

Blyler: How do you make IGBT thermal lifetime failure estimations?

Vass-Varnai: We use a combination of thermal software simulation and hardware testing developed specifically for the EV/HEV market. Thermal models are created using computational fluid dynamics based on the material properties of the IGBT under test. These models accurately simulate the real temperature response to the EV/HEV’s dynamic power input.
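[As background – this is a common empirical form in the power-cycling literature, not necessarily Mentor’s specific model – lifetime estimates typically follow a Coffin-Manson relation with an Arrhenius temperature term:

$$ N_f = A\,(\Delta T_j)^{-n}\exp\!\left(\frac{E_a}{k_B\,T_{j,\mathrm{max}}}\right) $$

where $N_f$ is the number of cycles to failure, $\Delta T_j$ the junction temperature swing per cycle, $T_{j,\mathrm{max}}$ the peak junction temperature, and $A$, $n$, and $E_a$ constants fitted from power-cycling test data. – JB]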

Blyler: Thank you.

For more information, see the following: “Mentor Graphics Launches Unique MicReD Power Tester 600A Solution for Electric and Hybrid Vehicle IGBT Thermal Reliability.”

Bio: Andras Vass-Varnai obtained his MSc degree in electrical engineering in 2007 at the Budapest University of Technology and Economics. He started his professional career at the MicReD group of Mentor Graphics as an application engineer. Currently, he works as a product manager responsible for the Mentor Graphics thermal transient testing hardware solutions, including the T3Ster product. His main topics of interest include thermal management of electric systems, advanced applications of thermal transient testing, characterization of TIM materials, and reliability testing of high power semiconductor devices.

 

Trends in Hyper-Spectral Imaging, Cyber-Security and Auto Safety

Monday, April 25th, 2016

Highlights from SPIE Photonics, Accellera’s DVCon, and an automotive panel focus on the semiconductor industry’s changing role in emerging markets.

By John Blyler, Editorial Director

Publisher John Blyler talks with Chipestimate.TV executive director Sean O’Kane during the monthly travelogue of the semiconductor and embedded systems industries. In this episode, Blyler shares his coverage of two major conferences: SPIE Photonics and Accellera’s Design and Verification Conference (DVCon). He concludes with the emphasis on risk in automotive electronics from a recent market panel. Please note that what follows is not a verbatim transcription of the interview. Instead, it has been edited and expanded for readability. Cheers — JB

O’Kane: Earlier this year, you were at the SPIE Photonic show in San Francisco. Did you see any cool tech?

Blyler: As always, there was a lot to see at the show, which covers photonic and optical semiconductor-related technologies. One thing that caught my attention was the continuing development of hyperspectral cameras. For example, start-up SCiO has prototyped a pocket-sized molecular scanner based on spectral imaging that tells you everything about your food.

Figure 1: SCiO Molecular scanner based on spectral imaging technology.

O’Kane: That sounds like the Star Trek Tricorder. Mr. Spock would be proud.

Blyler: Very much so. I talked with Imec’s Andy Lambrechts at the Photonics show. They have developed a process that allows them to deposit spectral filter banks for both the visible and near-infrared ranges on the same CMOS sensor. That’s the key innovation for shrinking the size and, in some cases, the power consumption. It’s very useful for quickly determining the health of agricultural crops. And all thanks to semiconductor technology.

 

Figure 2: Imec Hyperspectral imaging technology for agricultural crop markets.

O’Kane: Recently, you attended the Design and Verification Conference (DVCon). This year, it was Mentor Graphics’ turn to give the keynote. What did CEO Wally Rhines talk about?

Blyler: His presentations are always rich in data and trend slides. What caught my eye were his comments about cyber security.

Figure 3: Wally Rhines, CEO of Mentor Graphics, giving the DVCon 2016 keynote.

O’Kane: Did he mention Beckstrom’s law?

Blyler: You’re right! Soon, the Internet of Things (IoT) will expand the security need to almost everything we do, which is why Beckstrom’s law is important:

Beckstrom’s Laws of Cyber Security:

  1. Everything that is connected to the Internet can be hacked.
  2. Everything is being connected to the Internet.
  3. Everything else follows from the first two laws.

Naturally, the semiconductor supply chain wants some assurance that chips are resistant to hacking. That’s why chip designers need to pay attention to three levels of security breaches: Side-Channel Attacks (on-chip countermeasures); Counterfeit Chips (supply-chain security); and Malicious Logic Inside the Chip (Trojan detection).

EDA tools will become the core of the security framework, but not without changes. For example, verification will move from its traditional role to an emerging one:

  • Traditional role: Verifying that a chip does what it is supposed to do
  • Emerging role: Verifying that a chip does nothing it is not supposed to do

This is a nice lead-in to safety-critical design and verification. Safety-critical design requires that both the product development process and the related software tools introduce no potentially harmful effects into the system, the product, or its operators and users. One example of this is the emerging certification standards in the automotive electronics space, namely ISO 26262.

O’Kane: How does this safety standard impact engineers developing electronics in this space?

Blyler: Recently, I put that question to a panel of experts from automotive, semiconductor, and systems companies (see Figure 4). During our discussion, I noted that the focus on functional safety seems like yet another “Design-for-X” methodology, where “X” is the activity that you did poorly during the last product iteration, like requirements, testing, etc. But ISO 26262 is a risk-based functional-safety standard for future automobile systems – not a passing fad.

 

Figure 4: Panel on design of automotive electronics hosted by Jama Software – including experts from Daimler, Mentor Graphics, Jama and Synopsys.

Mike Bucala from Daimler put it this way: “The ISO standard is different than other risk standards because it focuses on hazards to persons that result from the malfunctioning behavior of EE systems – as opposed to the risk of failure of a product. For purposes of liability and due care, reducing that risk implies a certain rigor in documentation that has never been there before.”

O’Kane: Connected cars are getting closer to becoming a reality. Safety will be a critical issue for regulatory approval.

Blyler: Indeed. Achieving that approval will encompass all aspects of connectivity, from connected systems within the automobile to other drivers, roadway infrastructure, and the cloud. I think many consumers tend to focus on only the self-driving and parking aspects of the evolving autonomous vehicles.

Figure 5: CES2016 BMW self-parking connected car.

It’s interesting to note that connected-car technology is nothing new. It has been used in the racing industry for years at places like Sonoma Raceway near San Francisco, CA. High-performance race cars constantly collect, condition, and send data throughout different parts of the car, to the driver, and finally to the telemetry-based control centers where the pit crews reside. This is quite a bit different from the self-driving and parking aspects of consumer autonomous vehicles.

Figure 6: Indy car race at Sonoma Raceway.


Our Day at DAC – Day 1 (Monday)

Monday, June 2nd, 2014

Here are brief observations on noteworthy presentations, cool demonstrations, and hallway chats from the editorial staff covering Day 1 at DAC 2014 – John Blyler, Gabe Moretti, and Hamilton Carter.

++++++++++++

DAC Report from Hamilton Carter:

Puuurrrple, so much purple!  The stage at the packed Synopsys, Samsung, ARM briefing this morning was backed by ceiling-to-floor Synopsys-purple curtains.  The Samsung vision video played on the two large screens on either side of the stage.  To steal a phrase from “Love Actually”, Samsung’s vision is that “touch-screens are… everywhere”.  Among the envisioned apps were a touch-screen floor for your kids’ room, complete with planetarium app; a touch-screen window for your town car so you can adjust the thermostat as your driver taxis you to your destination; and finally a touch-screen gadget for the kitchen that, when laid flat, weighs the food on its cutting-board touch screen and registers the number of calories in the amount you’ve sliced off; displays the recipe you’re using when upright; and finally, get ready for it… checks the ‘safety’ of your food, displaying an all-clear icon complete with a rad-safe emblem.  Apparently the future isn’t completely utopian!

Phil Dworsky, director of strategic alliances for Synopsys, introduced the three featured speakers – Kelvin Low of Samsung, Glenn Dukes of Synopsys, and Rob Aitken of ARM – and things got under way.  The key message of the presentation was that the Samsung/Synopsys/ARM collaboration on 14-nm 3D FinFET technology is ready to go.  The technology has been rolled out on 30 test chips and 5 customer chips that are going into production.

Most of the emphasis was on the 14-nm process node, but the speakers were also quick to point out that the 28-nm node isn’t going away anytime soon.  With its single patterning and reduced power consumption, it’s seen as a perfect fit for mobile devices that don’t yet need cutting-edge performance.

Interesting bits:

  • It was nice to visit with Sanjay Gupta, previously of IBM Austin, who is now at Qualcomm, San Diego.
  • While smart phones have been outshipping PCs for a while, tablets are now predicted to outship PCs starting in 2015.
  • Brian Bailey of verification fame was one of the raffle winners.  He’s now a part of the IoT!
  • IoT predictions are still in the Carl Sagan range: there will be ‘billions and billions’.
  • Samsung’s foundry partner GLOBALFOUNDRIES has a fab, Fab 8, in Saratoga County, NY.
  • Last year’s buzzword was ‘metric driven’; this year’s, so far, is ‘ecosystem’.  The vision being plugged is collaborations of companies and/or tools that work as a ‘seamless, [goes without saying], ecosystem’.

Catching up with Amiq

I got to catch up with Christian from Amiq this morning.  Since they’re planted squarely in the IDE business, Amiq gets the fun job of working directly with silicon design and verification engineers.  Their products on display this year include their Eclipse-based work environment with support for e and SystemVerilog built in, their verification-code-centric linting tool Verissimo, and their documentation generation system Specador.

IC Manage

I’m always drawn in by a good ‘wrap a measurable, or at least documentable, flow around your design process’ story, so I dropped by the IC Manage booth this morning.

Their product encapsulates many of the vagaries of the IC development flow into a configuration management tool.  The backbone of the tool can be customized to the customer’s specific flow via scripts, and it provides a real-time updated HTML based view of what engineers are up to as project development unfolds.

++++++++++++++++

DAC Report from Gabe Moretti:

Power Management and IP

Moscone South is all about IP and low power.  This is the 51st DAC and my 34th.  Time flies.  The most intimidating thing is that the Apple Developers Forum is going on at the same time, and they have TV trucks and live interviews on the street.  We, of course, do not.  It was nice to hear Antun Domic as one of the two keynote speakers this morning.  His discussion of how the latest EDA tools are used to produce designs fabricated with processes as old as 180 nanometers was refreshing.  In general, people equate the latest EDA tools with the latest semiconductor process.  Yet one needs to manage power even at 180 nanometers.

Chipestimate.com runs a series of talks from IP developers in its booth.  I listened to Peter McGuinness of Imagination Technologies talk about advances in image processing.  It was interesting to hear him describe lane-departure warning as an automotive feature employing such technology.  Now I know how it works in one of my cars.  On the other hand, hearing how the retail industry plans to use facial recognition to choose for me what I should be interested in purchasing is not so reassuring.  But its use in robotics applications is fascinating.

++++++++++++++++++++

DAC Report from John Blyler:

I. IP Panel: The founders of several successful private IP companies shared their experiences with an audience of nearly 50 attendees. The panelists included CAST, IPExtreme, Methods2Business, and Recore Systems. The main takeaways were that starting an IP company takes passion and a plan.  But neither will work if you don’t have a product to offer and a few key relationships in the industry. (Warren said you need three key customers to start.) I’ll write more about this panel later. Here’s a link to the panelists’ pre-DAC position statements.

II. NI and Cadence – The Best of Both Worlds

George Zafiropoulos, VP of Solutions Marketing at National Instruments (NI)-AWR, has brought his many years of chip design and verification experience from the EDA industry to NI. He spoke at the DAC Cadence Theater about combining post- and pre-silicon verification as the best of both worlds. Those worlds consist of NI, which has traditionally been used for post-silicon verification testing, and Cadence, which is known for pre-silicon design and verification. George has proposed the use of NI test hardware and software for pre-silicon verification in combination with Cadence’s emulation tools, i.e., Palladium. This proposed combination elicited many questions from an audience more familiar with the pre-silicon tools than the post-silicon testers. Verification languages were an issue for those who had never used Mindstorms or other NI graphical tool suites. I’m sure we’ll learn more about this potential partnership between the NI and Cadence tool suites.

III. Visionary Talk by Wally Rhines, CEO, Mentor Graphics (prior to the afternoon keynote):

The title said it all: “EDA Grows by Solving New Problems.” Wally’s vision focused on how the EDA industry will grow even with the constraint of its relatively flat revenue. As he noted back in the 2004 DAC keynote, the largest growth in EDA tools is associated with the adoption of new methodologies, e.g., ESL, DFM, and FPGAs. Further, tools that support new methodologies have been the main drivers of growth in the PCB and semiconductor worlds.

“EDA needs to tap into new budgets … for emulation, embedded software … and in new markets,” explained Rhines. “The automotive industry is at the same stage of development as the chip design industry was in the 1970s. Its development process will have to be automated with new tools.”

Another growth market will be hardware cyber security.

Passion Project for Engineers

Friday, May 16th, 2014

Engineers are a creative bunch but what are their passions outside of work? You might be surprised.

Sonia Harrison, Senior PR Manager at Mentor Graphics, recently pinged me about a contest that will culminate at the upcoming Design Automation Conference (DAC). It’s called the “Passion Project,” and its goal is to celebrate engineering design creativity.  Engineers are encouraged to show what they love to do outside of work. “We know they are very creative in the workplace, but we can bet they are the same during their spare time,” notes Harrison.

Mentor is asking these creative technologists to share their passion with other engineers and those attending DAC. They can do this by submitting a photo and a brief description (150 characters) of their hobby. The contest ends on June 3 at 11:59 pm PST, and the winner, who will be chosen at random, will be awarded a $300 prize on the Wednesday of DAC, June 4, at 3:30 pm at Mentor booth #1733.

Even though I’m also an engineer, I was glad to learn that media editors were also encouraged to participate. Accordingly, I’ll submit my “passion” post in the near future. – JB

Xpedition Awaits for PCB Designers

Friday, March 21st, 2014

Mentor Graphics announces improvements to and the re-branding of its well-known Expedition line of printed circuit board (PCB) design and manufacturing tools.

By John Blyler, Chief Content Officer

Earlier this week, Mentor Graphics announced the launch of a new PCB design platform called Xpedition xPCB. The release marks the first phase in an ongoing update of the company’s board-level design and manufacturing tool suites.

The company is integrating its former Expedition PCB suite of tools under the new name. “The first release of the re-branded Xpedition platform aims to greatly improve board-level design productivity,” explained David Wiens, Product Marketing Manager at Mentor.

Today’s board designers face challenges ranging from increasing design complexity and workforce dynamics to the handling of larger systems. Like chips, board designs are growing in complexity, with high-speed (e.g., 28-Gbit/s) signals, shrinking board sizes with more layers, and increasing board densities.

Greater design complexity means that today’s PCB design effort is shifting from drafters to board and systems engineers. Unfortunately, there are fewer layout designers, in part due to the decline in engineering graduation rates. Adding to these workforce challenges is the move from standalone PCB projects to more complete systems that include electrical, mechanical, and software sub-systems.

Many of these challenges can be met with improvements to design productivity, such as streamlining the design process. Additionally, systematic component placement and planning should be available throughout the development process. The quality and speed of automated routing also should be improved. Finally, PCB development should support both 2D and 3D design, as well as integrate both electronic and mechanical systems.

Xpedition xPCB claims to address all of these challenges. For example, to help streamline the design activity, the platform provides a more consistent and logical user interface, personalized layout toolbars, features that shorten the learning curve, and more.

With increasing circuit complexity comes intricate topologies that require more in-depth planning, not to mention the careful placement of thousands of board components. xPCB provides planning and placement features for all of these parts throughout the entire design process, from schematic capture to layout.

Once the design is complete, the board must be routed. The routing environment of xPCB addresses design scenarios including digital and analog subsystems, high-speed signaling, and flex and rigid-flex board implementations.

Routing is tricky. “While automatic routing engines can help, they often add too many vias,” said Charles Pfeil, Engineering Director at Mentor. “They also tend to meander. Results from auto routers can take longer than designers doing it themselves.”

Mentor claims that its new routing environment makes it easier for users to get good results via several specific features: sketch router, dynamic router, differential signals, and curved routing. The sketch router matches the quality of manual routing but lets the user manage the location of traces, with options for routing styles. It also enables the selection of via patterns. A surprising result is that an optimized, efficiently routed selection is also pleasing to the eye, much like a work of art.

Conversely, differential-pair routing may not look as good to the casual observer, but it is critical for today’s high-speed signals. The challenge with differential-pair routing is to maintain symmetrical pad entry as well as trace-length and phase matching. “Phase match tuning is all about noise management,” noted Pfeil. “Signals, and hence traces, that are kept in phase with less impedance mismatch are less susceptible to EMI noise.”

To appreciate the benefits of automating the task of differential pair routing – e.g., quickly moving along rule areas and via pads – check out this video.
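As a back-of-the-envelope illustration of why those matching rules are so tight (my numbers, an assumption-laden sketch rather than anything from Mentor): intra-pair length mismatch converts to timing skew through the trace propagation velocity.

    # Convert a differential pair's intra-pair length mismatch into timing
    # skew, assuming a rule-of-thumb FR-4 stripline propagation velocity of
    # ~152 mm/ns (roughly c / sqrt(Er_eff) with Er_eff about 3.9).
    PROP_VELOCITY_MM_PER_PS = 0.152

    def intra_pair_skew_ps(length_mismatch_mm):
        return length_mismatch_mm / PROP_VELOCITY_MM_PER_PS

    # A 1 mm mismatch is ~6.6 ps of skew -- already a third of a 20 ps edge,
    # which is why multi-gigabit phase-match budgets are so small.
    print(round(intra_pair_skew_ps(1.0), 1))  # -> 6.6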

Another routing feature of the xPCB tool is the way it handles curved routing, which is needed for BGAs or connectors with staggered pins. The traces for such pins cannot run at the usual 45 degrees but instead must be arcs. With high-speed signals (around 20 GHz), the arcs result in less noise from signal reflections than the more angular 45-degree traces.

The final feature to help increase PCB productivity is the use of 3D design. Why use 3D techniques in PCB design? Perhaps the biggest benefit is that 3D approaches reduce the errors, and resulting iterations, between PCB and MCAD designs. Tighter collaboration between the electronic and mechanical board domains is essential given today’s need for full system development.

The other driver for 3D design is that many applications require a flexible PCB in addition to the more traditional rigid boards.

Xpedition xPCB claims to support true 3D layout instead of just a 2D design interface. The tool boasts a true parametric 3D mechanical kernel, which supports one environment for both 2D and 3D development. The 3D layout designer provides the same features as the 2D version, including a generous library of models; component planning, placement, and manipulation; constraints and design-rule checks; and spatial measurement capabilities. Naturally, the 3D tool has additional features such as board flipping.

In summary, the Xpedition layout platform boasts an automated router that provides hand-routed quality in a shorter design time. In addition, the tool supports component placement and planning, 3D design and validation for minimal MCAD respins, and a better user experience via a more intuitive interface.

 

This is the first of many announcements under the new Xpedition brand that should make the development and manufacturing of PCBs considerably easier for designers.

Software-Hardware Integration of Automotive Electronics

Friday, October 11th, 2013

My SAE book collects and expands on expert papers covering automotive hardware-software electronic integration at the chip, package, and vehicle-network levels.

My latest book – more of a mini-book – is now available for pre-order from the Society of Automotive Engineers. This time, I explore the technical challenges in the hardware-software integration of automotive electronics. (Can you say “systems engineering?”)  I selected this topic to serve as a series of case studies for my related course at Portland State University. This work includes quotes from Dassault Systemes and Mentor Graphics.

 

Software-Hardware Integration in Automotive Product Development

Coming Soon – Pre-order Now!

Software-Hardware Integration in Automotive Product Development brings together a must-read set of technical papers on one of the most talked-about subjects among industry experts.

The carefully selected content of this book demonstrates how leading companies, universities, and organizations have developed methodologies, tools, and technologies to integrate, verify, and validate hardware and software systems. The automotive industry is no different, with the future of its product development lying in the timely integration of these chiefly electronic and mechanical systems….

 

DAC – Video Latency; Platform as a Service; ISO 26262; and ARM Cortex-A12

Tuesday, June 4th, 2013

My Tuesday at DAC involved CAST IP, Mentor Graphics, Dassault Systemes, Chipestimate.com, and Globalfoundries-ARM. 

Here are but a few of the companies, hallway discussions, and presentations that I enjoyed during Tuesday at DAC:

> Performance is a function of latency and power, as Gary Smith noted in his pre-DAC EDA and IP trends presentation. One example of the need to balance latency and power is in the application of real-time video streaming (e.g., H.264 video encoders). Latency is the delay that occurs between the processing and transmission of live video. A simple way to initially gauge latency is by waving your hand quickly in front of the camera and watching for blurring of the image on the display. I saw none during my demo.

 

Other news from CAST highlighted a joint announcement with IP company Beyond Semiconductor concerning an ultra-low-power, 32-bit BA21 embedded processor.


> Hallway chat with Mentor’s M&A expert, Serge Leef:

Software as a Service (SaaS) for EDA cloud-based applications seems passé. Platform as a Service (PaaS) is the new “black.” The key driver of this change seems to be the push by next-generation chip designers for a more robust user experience (UE; see “Experience Required,” http://chipdesignmag.com/sld/blog/2013/05/30/experience-required/). Serge sees the trend toward user-experience design as essential to the evolution of EDA tools. He even believes such designs could be a source of revenue in terms of a micro-business model.

 

> Dassault Systemes offered several interesting technology demos. While their Netvibes product provides for intelligent dashboarding, Tuscany’s PinPoint enables tracking progress from synthesis to GDSII.

http://www.tuscanyda.com/

> IP protection and management includes the synchronization of databases and documentation. In this way, a close partnership with Magillem is proving very useful. (More about this in the near future.)

> Simulation Lifecycle Management (SLM) for semiconductor verification and validation (V&V) flows may evolve quickly into a framework. The effort in the automotive industry via ISO 26262 may establish a working model for the EDA industry.

 

> Globalfoundries presentation at Chipestimate.com’s “IP Talks” – Subi Kengeri, VP of the Advanced Technology Division, talked briefly about many things, mostly centering on the need to offer a combination of device-technology design and SoC manufacturing expertise.  But this need is fraught with challenges. (Reference: “Modular FinFET Increases Planar-to-Non-Planar IP Reuse,” http://www.chipestimate.com/blogs/IPInsider/?p=1264)

He noted that Globalfoundries was the first fab to optimize for the newly announced ARM Cortex-A12 CPU – POP IP combined with Globalfoundries’ 28-SLP process. Kengeri also emphasized the success of fully depleted SOI (FD-SOI) technology at 28 nm, saying that for designers it is pretty much like bulk CMOS. STMicro is their partner in FD-SOI. The technology has enabled 0.63 V at 1-GHz performance in a dual Cortex-A9 implementation.


Supply Chains, Big Data, and Point-of-Sale for EDA and IP

Wednesday, April 24th, 2013

These issues were addressed by supply-chain, product-lifecycle-management, board-design, and chip-design services companies.

We live in a tumultuous world in terms of disruptive technologies, natural disasters, and global politics. Do chip designers need to worry about such seemingly external influences, as manifested by the global semiconductor manufacturing and supply chain? What help will come from “Big Data” analytics? Will EDA/IP (chip companies) ever be this tightly coupled with end-product manufacturers? I asked these questions of professionals in the manufacturing-supply-chain, product-lifecycle-management (PLM), and board- and chip-design services industries, respectively: Geoff Annesley, CTO at Serus; Brian Haacke, High Tech Industry Sales Director, Dassault Systemes; Michael Ford, Marketing Development Manager, Mentor Graphics–Valor; and Naveed Sherwani, President and CEO of Open Silicon. What follows is a portion of their remarks. –JB  

Blyler: Do chip designers really need to worry about the seemingly external influences of the global semiconductor manufacturing and supply chains?

Haacke: Designers do care about manufacturing, with a primary focus on the impact of design rules provided by the foundries. The more design rules they comply with, the more flexible they can be when choosing a foundry – and in mitigating risk if some natural disaster impacts one foundry over another. Regarding supply-chain influence, there are many aspects to consider. Designers would not be impacted by material-supply disruptions because they typically do not “design in” any of the materials used in manufacturing. However, closed-loop feedback to designers on manufacturing test results can improve responsiveness to design-related issues impacting yield ramp-up – especially if that feedback is tied to requirements and design intelligence.

Sherwani: It doesn’t require an earthquake or other natural disaster. In the coming move from traditional single-die chips to the era of 2.5-dimensional (2.5D) stacked dies, everything changes. With 2.5D, naked dies have to be tested, placed on interposers, and then positioned into a single package. The industry has never tested or sold anything like this before. I think it will disrupt the normal supply chain and its well-understood chain of command.

Annesley: Design needs to be linked to execution in the global market. You need a feedback mechanism for companies to decide the best price and combination of packaging and manufacturing processes that result in the lowest-cost chip. That is a good example of tying back execution data to the design process and vice versa. For example, you have the material information for your design – be it chip or board. You may have alternates that you need to use (e.g., due to natural disasters). It’s important for companies to track what actual alternates were picked for every component build. Then they will have traceability and accountability with respect to the specifications.

Ford: Designers are motivated to create a product that meets the criteria set in terms of technologies, materials, costs, quality, life expectancy, etc. There is significant influence on this from the manufacturing-production side, which – if not known by the designer – can result in product variations and the product not living up to expectations. Designing a product with some knowledge of the materials to be used and the actual production environment would allow the designer to design-in features that promoted better production quality, lower manufacturing cost, or reduced variation. Typically, though, this does not happen except in rare cases, as the technologies of material choice and manufacturing capability are not visible in a way that designers can understand. This is a clear opportunity for improvement.

Blyler: One supply-chain trend is the increased use of “Big Data” analytics to allow companies to connect between very different databases. In doing so, they can discover clues to improve supply-chain performance. Comments?

Haacke: “Big Data” analytics isn’t just a good idea. Nor is it just about connecting disparate data sources.  To be competitive, companies must be able to have visibility into their supply-chain data and make informed decisions based on the intelligent correlation of requirements, design, simulation, test results, and yield data. Connecting data sets is a start. Yet it is the marriage of operational and design intelligence that enables effective analytics to improve traceability, root-cause analysis, and time-to-yield ramp-up.

Ford: This can be useful. The real issue today is that the end-product distribution chain is shrinking, due to Internet sales and quickly changing fashionable technology products. This leads to many product variations – the changing demand profile of which comes closer to the factory than in the past. Factories are then asked to be agile, supplying different quantities of products with short notice of changes. This really puts pressure on their supply chain to source materials more quickly and effectively. Otherwise, there is a large increase of inventory at the factory, which cripples the operation on costs. Managing the changing demand from the customer and translating it into short-term raw-material availability is a growing issue today.

Annesley: Data mining and analytics are necessary to do predictive analysis (e.g., to foresee shortages in the supply chain). The resulting operational metrics include such things as yield, test, cycle-time, and on-time delivery trending – all the actuals on how you are performing. The real-time metrics and calculations can be used to do alert notifications (e.g., when you are drifting from your inventory targets). Then there is the longer term, where we collect the statistics on how the supply chain is used.

Blyler: Another trend is the use of point-of-sale (POS) data from retailers to adjust supply chain and manufacturing. Will EDA/IP (chip companies) ever be this tightly coupled with the end-product manufacturers?

Haacke: This is a good question – one that I’ve gone back and forth on. Ultimately, I don’t see much relevance to POS [as related to direct business-to-consumer (B2C) chip sales] being of any significant source of demand input to EDA/IP companies. The coordination required to track this data through every device – using a given chip – would be an enormous effort. So I don’t think there is any near-term future in which they are tightly coupled. However, I do see other possibilities for these companies to anticipate the demands in the marketplace by monitoring the end-consumer “experience” with products that contain their chips and/or IP. This data could be used to anticipate how consumers and competitors will act in the future.

Today, I think “social listening” may not be obvious to companies – especially the further down the supply chain they are from the end consumer. Still, with the right tools in place, EDA/IP companies can add the thoughts and ideas of their customers and competitors to their pool of “Big Data.” This data could then be part of their analytics and correlation of cause-and-effect events that drive effective decision-making and produce competitive advantages.

Ford: I am not sure about chips themselves. But ultimately, the answer would be “yes.” Still, the issue comes down to agility and the resistance to making changes. In printed-circuit-board (PCB) production – with good management tools – we can manage the changes to schedules and allocation of operations to work orders as demands change. For the chip areas, I think it will depend on how agile the processes are to be able to adjust volumes (move to alternate machines or reassign production cells). 

Blyler: Thank you.


More SI, Less EDA at DesignCon 2012

Wednesday, February 8th, 2012

This year’s DesignCon show focused more on board-level signal integrity and testing issues than on chip design and verification.

DesignCon has changed over the years. It started as a board-level interface show. In recent years, a large chip-level Electronic Design Automation (EDA) and verification component was added (see references for past coverage of the show). This year, the EDA tools component was greatly diminished as the show returned to its roots, although with a much stronger emphasis on board-level testing, debug, and signal- and power-integrity issues.

As usual for an editor, I spent more time in meetings than actually walking the show floor. Still, there was plenty to catch my eye when I did wander into the exhibition hall. Here is a brief summary of my meetings and show-floor highlights from DesignCon 2012.

Read the full story at “IP Insider.”
