
Archive for November, 2012

Semiconductor GPU IP Faces Cloud Division

Thursday, November 29th, 2012

Cloud computing poses many challenges, including the division of graphics-processing IP and rendering tasks between the mobile device and cloud-based servers.

Cloud computing has created new challenges for system designers in terms of the division of functionality between the mobile device and the cloud. This division will directly affect the design of system-on-a-chip (SoC) processors and graphics processing units (GPUs). Autodesk Media & Entertainment Tech Innovators asked the experts at Imagination Technologies and AMD to address the question of how graphics processing and rendering should be partitioned between mobile devices and cloud-based servers. Here’s a portion of their responses. – JB

David Harold, Director of PR, Imagination Technologies

The answer, as is so often the case, depends on the use.

By and large, we think that the cost, silicon area, and power budget of our GPUs make them efficient enough to be in the device rather than the cloud. This is especially true of the mobile space and most consumer devices.

But of course, there are exceptions. Most creative professionals are, in our experience, already running fairly slow, demanding apps, and they are very intolerant of any additional lag or performance hit. But there can be cases, say, a final render of a static image, where they may be more willing to send the job to the cloud.

In the more mainstream arena, the applications are less demanding. But still, gamers tend to be intolerant of lag too – especially when playing against each other online.

For office, presentation, and similar applications, there may well be a case for dumb terminals with rendering (and everything else) done in the cloud. However, the cost needs to be notably lower than that of conventional devices, which I’m not sure has proven to be the case so far.

Because our technologies are designed for mobile applications first and foremost, they are very efficient and low power, which also makes them very suitable for render-cloud applications. When enough of them are brought together, there is little in the GPU or GPU-compute space that they cannot achieve. And application programming interfaces (APIs) like OpenCL make the interface between app and cloud relatively simple to program for.
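
[JB: To make the OpenCL point concrete, here is a minimal, hypothetical host-side sketch written in Python with the third-party pyopencl and numpy packages. This is my own illustration, not Imagination’s code; the interesting part is that the same dispatch code is largely indifferent to whether the OpenCL device is a handset GPU or one sitting in a render cloud.]

```python
# Hypothetical sketch: dispatch a trivial brightness kernel to whatever OpenCL
# device is available (a handset GPU or a cloud render node alike).
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()            # pick any available OpenCL device
queue = cl.CommandQueue(ctx)

program = cl.Program(ctx, """
__kernel void brighten(__global float *pixels, const float gain) {
    int i = get_global_id(0);
    pixels[i] *= gain;
}
""").build()

pixels = np.random.rand(1920 * 1080).astype(np.float32)   # dummy framebuffer
mf = cl.mem_flags
buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pixels)

program.brighten(queue, pixels.shape, None, buf, np.float32(1.2))
cl.enqueue_copy(queue, pixels, buf)       # read the brightened image back
```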

Rendering in the cloud is, of course, not just rendering. One also requires technologies to control the elements of the system, divide the work, and so on. Really, it is a CPU/GPU combination, for which we have very suitable technologies in our Meta CPU, PowerVR GPU, and ray-tracing unit (RTU) products.

For those willing to move ray tracing to the cloud, the power to access it is now available from an iPad. In the secretive world of high-stakes film and game production, though, many users aren’t ready or willing to send projects to the cloud. For these users, the challenge remains scaling high-end ray-tracing technology down to the laptop. This is a work in progress, but something we can enable with our RTU technology.

It may well be that in the future, the ideal solution will be a combination: local rendering at your main workplace (or gaming space), where performance matters – but with documents stored in the cloud. A cloud-rendering solution will be available as a backup for when you’re in meetings or on the road.

++++++++++

Bahman Dara, WS worldwide product marketing, AMD

We believe cloud-based services (including remote rendering) will begin to grow exponentially as technology and pricing/licensing models continue to be refined. Several approaches are being developed, including ones for both interactive design review and final rendering.

The first method, which is expected to be used mostly for design-review purposes, is server-side rendering in conjunction with client-side user interaction and input on PC/mobile devices. In this scenario, a rendered image is generated in the “cloud,” compressed, and then sent over the network (or the Internet) to a mobile or other device, such as a PC. This scenario is similar to video streaming, except the user is able to interactively control camera and object parameters.
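
[JB: Here is a rough, hypothetical sketch of this first method in Python. The renderer and the transport are stand-ins of my own, not AMD’s software; the point is simply the loop: apply the client’s camera input, render server-side, compress, and send.]

```python
# Hypothetical sketch of server-side rendering with client-side interaction:
# each client camera update produces a freshly rendered, compressed frame.
import json
import zlib

def render_frame(camera, width=1280, height=720):
    """Stand-in for the cloud GPU renderer; returns a dummy RGB frame."""
    return bytes(width * height * 3)

def stream_design_review(camera_updates):
    """Apply each interactive update, render in the cloud, compress (as in
    video streaming), and yield the payload to send to the PC/mobile client."""
    camera = {"position": [0.0, 0.0, 5.0], "target": [0.0, 0.0, 0.0]}
    for update in camera_updates:
        camera.update(json.loads(update))            # client-side input
        yield zlib.compress(render_frame(camera))    # server-side render + compress

# Simulated client input: the user orbits the model.
updates = [json.dumps({"position": [x, 0.0, 5.0]}) for x in (0.0, 0.5, 1.0)]
for payload in stream_design_review(updates):
    print(f"sending {len(payload)} compressed bytes over the network")
```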

A second cloud-enabled method for design-review visualization, which is being developed by several companies, will make use of hybrid server/client rendering. The user experience is similar to that of the previous method. However, in this scenario, the cloud server does the heavy lifting, handling the harder lighting and procedural calculations. The “solved” scene is then converted into lightweight data, which can be transmitted quickly to a PC or handheld device, where the image is rendered in 3D. Powerful, low-power, and low-cost technology (such as AMD APUs) will almost certainly accelerate the 3D rendering capabilities of mobile devices in this scenario.
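
[JB: And a comparable sketch of the hybrid split, again with made-up function names rather than any vendor’s API: the server bakes the expensive lighting into lightweight data, and the client turns that data into pixels locally.]

```python
# Hypothetical sketch of hybrid server/client rendering: heavy lighting work in
# the cloud, lightweight "solved" scene data shipped to the device for local 3D.
import json
import zlib

def bake_lighting(scene):
    """Stand-in for the expensive global-illumination / procedural solve."""
    return {mesh: "lightmap-bytes" for mesh in scene["meshes"]}

def server_solve_scene(scene):
    """Cloud side: do the heavy lifting and pack a small, transmittable result."""
    solved = {"meshes": scene["meshes"], "baked_lightmaps": bake_lighting(scene)}
    return zlib.compress(json.dumps(solved).encode())

def client_render(packet, camera):
    """Device side: unpack the pre-solved scene and rasterize it locally (stubbed)."""
    solved = json.loads(zlib.decompress(packet))
    return f"drawing {len(solved['meshes'])} pre-lit meshes from camera {camera}"

packet = server_solve_scene({"meshes": ["chassis", "wheel", "interior"]})
print(client_render(packet, camera=[0.0, 1.0, 4.0]))
```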

For final high-quality, “cinematic” rendering of HD-level stills and animations, cloud-based render farms have been around for a good many years. We see this type of service continuing to be relevant, and even expanding to include near-instantaneous (or at least very fast) high-resolution GPU-based rendering for a range of purposes.

In all of these scenarios (and any that we missed here), there are likely some limiting factors, such as network bandwidth, which will continue to be problematic. Images cannot be processed/rendered until all of the raw content is uploaded to the cloud server. In many cases, there will likely be a considerable delay as large datasets and high-resolution raw assets are transmitted to the “cloud” before any rendering can commence.

Examples include large texture and image maps, video content, and other “big” files that are sometimes required for “cinematic”-style rendering.
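
[JB: A quick back-of-the-envelope illustration of that upload delay, with made-up numbers of my own choosing rather than anything measured by AMD.]

```python
# Hypothetical numbers: nothing can render until the raw assets reach the cloud.
def upload_minutes(asset_gigabytes, uplink_megabits_per_sec):
    megabits = asset_gigabytes * 8 * 1000           # GB -> megabits (decimal units)
    return megabits / uplink_megabits_per_sec / 60

# e.g. 40 GB of textures, image maps, and video over a 50 Mb/s uplink
print(f"{upload_minutes(40, 50):.0f} minutes before rendering can even start")
```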


Originally posted on “IP Insider.” 

System Simulation Moves from Goods and Services to Experiences

Thursday, November 8th, 2012

This is the first of two stories about Dassault Systemes’ move into the experience-based economy and the world of semiconductor development.

User-group events are typically a balance of corporate marketing and real-world user experiences. Still, if done properly, they can be both interesting and educational. That was my impression from the recent Dassault Systemes 3DExperience, the company’s high-level user-group event. So much took place at the show that all I can do, for now, is highlight the sessions and panels that I had time to attend.

Al Bunshaft – Managing Director, North America, Dassault Systemes

Welcome and Introductions

Highlights:

  • Al Bunshaft hosted the morning session’s C-level presentations. “Experience” was the key word and recurring theme for the entire event. The 3DExperience brand and concept extend the company’s flagship product lifecycle management (PLM) products into the next stage of business evolution; I’ll elaborate on this point shortly. Each speaker’s goal was to help the audience understand and appreciate the importance of customized experiences.
  • Major acquisitions in 2012 further support the company’s goals of harmonizing product, nature, and life: 1) Netvibes: software for discovering useful information within company databases and from external public sources
    2) Gemcom: software that helps mining companies make decisions about the excavation of precious materials
  • Bunshaft introduced a new hire formerly from MatrixOne – Patricia Megowan, Business Transformation Leader for NA Operations. Do you remember MatrixOne, which Dassault Systemes acquired in 2006 to create the next generation of its ENOVIA brand? MatrixOne had a close partnership with Cadence Design Systems to develop PLM tools based on both companies’ products.

B. Joseph Pine II, acclaimed author, speaker, and management advisor

The Experience Economy: Work is Theatre and Every Business a Stage

Highlights:

  • Pine, a motivational speaker and book author, used the gumball machine as a clever example of how an experience may supersede and even supplant the actual product. With a gumball machine, kids enjoy watching the purchased gum travel down a spiral column to reach the delivery chute. Pine suggested that the adult version of this was the Autostadt’s car vending machine.
  • The evolution of human business activity has moved from agrarian to goods and then to services. The Internet has commoditized goods (i.e., price comparisons are easy). Now, services are being commoditized in the same way. What is the next stage beyond services? Experiences!
  • Experiences are customizations aimed at the individual. Companies need to innovate experiences to maintain profitability. One example is REI, which provides a climbing wall in its stores so customers can try out equipment before purchasing. Such experiences lead to greater product sales, and the experiences themselves can bring in revenue.
  • Pine: “If you customize a good, it becomes a service. Customization is a great differentiator. Customization is the antidote to commoditization.” [Personal note: This idea of customization addressing the shortcomings of commoditization was an eye-opener for me. In the semiconductor-EDA-electronics space, we all know that hardware has become a commodity. How can businesses and engineers still find value in hardware design? Today, hardware customization, even including FPGAs, is done via software. But something more is needed. How do we, as engineers, participate in designing the experience? Intel and others have asked, and tried to answer, this same question.]
  • “Customers don’t want too many choices. Businesses and designers must offer what the customer wants. Ford and Chevy were experts at mass production. Now, Tesla is becoming an expert in mass customization.” [Personal note: How does the designer figure out what the customer wants? Via simulations and prototypes.]
  • Digital information can augment real-world experiences (e.g., Google Glass).
  • How does one stage a digital 3D experience? Pine explained that physicists describe our experiences as bounded by time, matter, and space. But through digital experiences, we can go to “no-time” (manipulating our sense of time by simulating the past or future). We can experience “no-matter” because digital experiences are built of digital substance; we are moving from atoms (matter) to bits (no-matter). We can experience “no-space” in the digital virtual arena, creating things that are not physically possible. [Personal note: When motivational speakers talk about science, it always gets interesting. Perhaps Pine was trying to shake the audience up by relating concepts in physics to key marketing elements. His comments prompted me to send out this tweet: J. Pine: No-matter is digital substance, not atoms but bits. So SW is no matter? Interesting. @Dassault3DS @3DXForum]

Bernard Charles, President and CEO, Dassault Systemes

Dassault Systemes Opens New Horizons With 3DExperience

Highlights:

  • Charles began by explaining the meaning of the compass, a symbol designed to position the company’s brands and show how they work together to deliver 3DExperiences.

North – connecting people

West – the world with 3D as a medium, not just tools

South – virtual plus real, connecting the virtual with the real world

East – information intelligence; discover needed internal and external information

Middle – This is the experience.

  • A systems approach is needed to deal with the challenges facing humanity (e.g., urbanization, resource management, global health, food supply, education, and globalization).
  • “It’s easier to find a good answer if we ask the right question.” [Personal note: That’s why systems engineers spend so much time and energy defining the problem early in the system life cycle.]
  • Dassault Systemes is the seventh-largest software application company in the world. Can you guess which firm (based in Redmond, WA) is the first?
  • The future of simulation will come from indexed information.
  • Several years ago, scientist Georges Mougin suggested towing icebergs from the South Pole as a source of fresh water for southern Africa. Charles showcased this idea as a good example of how to evaluate feasibility using social-media platforms and system-based 3D simulation. One of the biggest challenges was to limit the amount of melting while moving the iceberg to Africa. Many professionals freely offered their advice through Netvibes online discussion rooms and 3DVia virtual system simulations. Published material was gathered on meteorology, global current flows, instrumented navigation data, and more. The conclusion was that the melt rate of the iceberg would be very low. It was determined that sea currents (and perhaps wind, via large parachutes) could move the iceberg, but a steering mechanism would be needed. Furthermore, the momentum of the iceberg could even bring energy to the African coast. [Personal note: Using the Internet to discover useful data and connect multidomain experts is not a new concept. Indeed, Dassault’s implementation of this approach is reminiscent – but on a much grander scale – of James Burke’s “Knowledge Web” project from the last decade. See “It's the End of the World as We Know It!”]
  • System modeling is no longer the domain of experts with powerful processing hardware. Charles used the powerful Catia modeling application running on the latest Apple iPad to do a significant modeling task. This version of Catia is free from Dassault – for now.
  • Another cool simulation is the virtual modeling of Paris from today to the past.  I wonder if this could be used to reconstruct specific moments in history, like the construction of Apollo 11 or the first transistor?

Monica Menghini, EVP, Industry, MarCom, Dassault Systemes

From Product Experience to Business Experience: The New Social Industry Era

  • Product innovation by itself is misleading, as it doesn’t include the experience that the product can enable. The Internet expanded the power of the consumer and reshaped industry.
  • Like it or not, engineers may one day become comfortable with social media. Menghini noted that the next generation is already at ease with social applications, with most kids now using touchscreens instead of a mouse on a PC. Social apps are also gaining favor with non-engineers as a way to get useful technical information.
  • “Consumers buy experiences. Experiences are bigger than products.” She cited the example of a coffee machine, where you smell the coffee aroma before you purchase the actual product. Starbucks has grown beyond a goods (commodity) company to both a service and experience vendor. For product lifecycle management to grow, it must be extended to include experiences via 3D simulations (more on that later).
  •  Menghini presented this interesting mapping of activities from today to tomorrow:

From PLM to engineering business experience

From discipline collaboration to social industry world

From product modeling to business modeling

From document management to experience management

From search to dashboard intelligence

From product attributes to consumer experience

Michel Tellier, VP Aerospace and Defense, Dassault Systemes

Live 3DExperience A&D Demo – Introducing the A&D Solution Experience: Winning Program

Highlights:

  • Knowledge retirement is a main concern in the aerospace and defense (A&D) industries, as 40% of employees are eligible for retirement in the next three years. There is an urgent need to retain this experience by capturing project requirements with modern PLM systems.  [Personal note: This was also a problem in the late 1980s, when I was with the Department of Defense (DoD). Back then, it was the retirement of engineers who worked on the early space systems. We tried to document our systems engineering process, but it was a labor-intensive task and databases were less sophisticated. Today, technology has greatly improved.]
  • The Vee-Diagram – You can’t escape it if you want to do system-level engineering.
  • Case study: two proposals for a defense drone project. The winning contract used 3D simulations in addition to engineering drawings. These simulations addressed all aspects of the project, from build through deployment and delivery. Simulations are key for mission-critical problems, such as flight testing.
  • I’d forgotten the defense industry’s propensity for odd terms: “source of truth” and “experience of judgment combined with the creativity process.” The latter was meant to describe a risk-management process.
  • Aerospace now uses behavioral models to create mature designs. The semiconductor EDA industry’s electronic-system-level (ESL) design and verification communities can sympathize with the challenges of creating behavioral models.
  • Flight systems need to simulate both design and operation (e.g., landing on an aircraft-carrier flight deck). Today, that operational simulation includes the entire carrier – including sailors’ movements on the deck. This level of detail was helpful to understand the blast pattern of drones during takeoff.

Glenn Isbell Jr., Director, System Engineering and Operations, Bell Helicopter Textron Inc.


Highlights:

  • A bill of materials (BOM) exists for every group (e.g., engineering, manufacturing, planning, and so on). The problem is that most of these BOMs exist separately from one another.
  • Not everyone welcomes the move from paper-based to online systems. Bell migrated a ton of systems documents covering requirements generation through implementation and build via the Enovia PLM tool. Legacy electrical and mechanical CAD data was migrated to online databases using Catia.
  • Organizational change management is important. It is easier to install a system than it is to change human behavior. At Bell, a single PLM platform helped bring siloed organizations together. 3D modeling helped different disciplines visualize issues.
  • When you improve (expose) data, you get better visibility and you’ll see behavioral shifts.

Laura Wilber, Solution Analyst, Exalead, Dassault Systemes

Big Data and Innovation: Product-in-Life Intelligence from Machine Data

Highlights:

  • Big data is estimated to reach zettabyte (ZB) scale by 2015 (1 ZB = 1 trillion GB). The Large Hadron Collider generates 1 petabyte of data every second.
  • ERP and CAD data represent structured data. Machine and human data (e.g., web content and social media) are unstructured. Unstructured data should be modeled using statistical and semantic processing instead of traditional structured, relational-database techniques. (Semantic processing refers to the manipulation of data based on its meaning; see the small sketch after this list.)
  • Product-in-Life intelligence: gathering machine and human (e.g., social media and email) data about a product once it is out in the world. Human data requires natural language processing to extract the meaningful information.
  • Case study: exploratory investigations using the Exalead discovery engine on embedded device data in the French postal service. This data – from a sorting machine including OCR and video-coding systems – was going unused. After analysis of the unused data, the postal service gained end-to-end visibility of the letter flow system. This visibility presented new ways to track letters and provide revenue-generating services like mail-to-email (auto PDF), virtual mailbox, SMS push, and other personalized services for consumers.
  • IP issues with data collection and usage? No personal data is involved, just aggregations, so it wasn’t a problem. But some collectors of machine data have tried to sell data as a service. In response, several open-source systems have arisen.
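
As a toy illustration of the statistical-processing point above (my own sketch, not Exalead’s technology, with invented sample text), even counting term frequencies in free-form feedback starts to surface meaning that a fixed relational schema would miss:

```python
# Toy example: crude statistical processing of unstructured, hypothetical text.
import re
from collections import Counter

posts = [
    "The tracking page never updates for my parcel",
    "Great service, the parcel arrived a day early",
    "Parcel tracking is broken again, no updates",
]

def term_frequencies(texts):
    """Tokenize free text and count terms; no schema is imposed up front."""
    tokens = []
    for text in texts:
        tokens += re.findall(r"[a-z]+", text.lower())
    return Counter(tokens)

print(term_frequencies(posts).most_common(5))
```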

Ralph Jacobson – Global Consumer Products Industry Marketing Leader, IBM

Leveraging Social Media for New and Collaborative Product Development

Highlights:

  • Consumers seeking advice on the Internet:
    1) About 70% trust “independent” sites like Yelp
    2) About 18%  trust what they read on brand sites, including retail and corporations
  •  New product development: Where will it be in the next few years? Here are a few examples:
  1. 3D online shopping  (on Dassault site)
  2. Dollar Shave Club – the (non) future of packaging – a subscription that sends you shaving razors. (JB: recall Inside Secure and its packaging sensor.)
  3. Quirky – Using social media (crowdsourcing) for product development

Chip Design Enters the Third Dimension

Friday, November 2nd, 2012

How will stacked die affect the IP supply chain? FinFET transistor structures will require new SPICE models. But what else?

Today’s post is a mix of media, all dealing with the coming 3D die and chip challenges:

1. My colleague and editor-in-chief of the Low-Power High-Performance portal – Ed Sperling – has written a three-part IP-supply-chain series based upon interviews with the following: Jim Hogan, an independent VC; Jack Brown, senior vice president at Sonics; Mike Gianfagna, vice president of marketing at Atrenta; Paul Hollingworth, vice president of strategic accounts at eSilicon; and Warren Savage, CEO of IPextreme. Here’s the last of that series: Experts At The Table: The Business Of IP.

- Notable quote from Warren Savage, CEO of IPextreme, responding to a question about the future of IP on stacked die:

“It also means that IP can live longer because it can stay in older nodes longer. Foundries producing 0.18 and 0.13 can continue to make it. IP already has an extremely long life.” – Warren Savage

2. In this short video, Sean O’Kane from Chipestimate.TV and I talk with ARM’s John Heinlein about the coming challenges faced by IP designers using FinFET transistor structures.

- TSMC OIP 2012 – John Heinlein interview

 

 

 

Originally posted on Chipestimate.TV’s “IP Insider.”