Archive for June, 2013

Augmented Tools Design Reality

Friday, June 28th, 2013

Augmented reality appears in consumer applications and in virtual-prototyping designs for the early validation of everything from tablets to refrigerators.

Augmented reality (AR) continues to make the news. At the recent Electronic Entertainment Expo (E3) video-game show, Sony’s PlayStation division showcased its latest AR and motion-control camera technology in the PlayStation 4.

But augmented reality is more than just a game. Many consumers first encountered AR through Google Maps, with its direction arrows superimposed over the actual road. Other implementations abound: IBM has created technology that marries augmented reality with comparison shopping. Imec, the Flemish government’s nanotech R&D center, has created augmented-reality contact lenses for medical and cosmetic applications. And Metaio’s Junaio application turns iOS- and Android-based cell phones into AR-enabled devices (see Figure 1).

Figure 1: The latest augmented-reality browser uses ordinary objects as markers to get virtual information. (Courtesy of Metaio)

These varied implementations of augmented-reality technology make it difficult to calculate the total market value for the semiconductor industry. One reason is that, with just one software development kit (SDK), AR can be implemented on a wide range of devices. It is a market that is decentralized, open source, and hardware independent.

In forecasting the augmented-reality market, analysts at Semico Research considered the varied types of products that implement the technology, weighing their popularity and the other factors driving interest in these products. The result is that the AR market is expected to reach almost $620 billion by 2016 (see Figure 2). This analysis is part of a comprehensive and informative report published in October 2012.

Figure 2: Shown are forecasts for the total augmented-reality hardware market. (Courtesy of Semico Research)

The need for specific hardware and software tools that create AR experiences is also growing. One such tool suite from Dassault Systemes was used by designers to create a consumer application that transforms a comic book into a 3D augmented-reality experience, while another application accurately reconstructs the creation of the city of Paris.

While these AR implementations are educational, the semiconductor and embedded electronic-design community may scoff at them as mere entertainment. But this same technology is finding use in early design-validation “virtual prototypes” of everything from cell phones and tablets to refrigerators. 

Augmented Versus Virtual Reality

To understand the benefit that virtual-prototyping platforms can offer designers, one must first appreciate the relationship between augmented and virtual reality.

“In a virtual-reality environment, you see virtual objects,” explains Vincent Merlino, High-Tech Industry Solutions Leader at Dassault Systemes. “But with augmented reality, you see augmentations in the real world.”

“Virtual reality defines a completely immersed digital environment in which reality is replicated for the user,” elaborates Trak Lord, marketing and media relations at Metaio. “Augmented reality instead utilizes reality itself as an anchor for virtual content that can be experienced as part of the real world.” Whereas virtual reality inserts the user into an entirely new and simulated reality, augmented reality instead inserts virtual elements into the user’s reality. 

The electronic-design community is using augmented reality both to design end-user applications and to improve the conceptual design of future products. One example of the former comes from the automotive market, where Audi uses image recognition and Metaio’s AR software to power its Interactive Manual application. Users point their smartphones at different surfaces and objects in the car to get instant feedback on the identity, function, and quick-start instructions for that specific feature. For example, pointing a smartphone toward the windshield wipers yields a “Windshield Wiper” identification, with the ability to swipe through to a brief, animated tutorial on how to use the wipers.

Designers are also using augmented reality to improve the conceptual design and early validation of future products. Here, product teams use the technology to create a virtual (as opposed to physical) prototype to gain critical end-user feedback before the design is realized in hardware or software. Consider the design of a new mobile phone or tablet (see Figure 3). Augmented reality could be used to validate the usefulness of a new form factor (size, shape, and even “feel”) of the future product. This would result in significant cost and time savings over traditional in-person focus-group meetings.

Figure 3: The editor takes a picture of a virtual prototype of a future tablet at a recent trade show.

The cost and time benefits of AR-based virtual-reality prototypes are dependent upon the complexity and size of the end-user product or system. “The bigger (or more complex) the product, the bigger the savings over a physical prototype,” remarks Merlino. “If you want a physical prototype of a large refrigerator with one color and configuration, you may spend over a million yen (or about $10k). With a virtual-reality prototype, you only spend money on the software with the advantage of many configurations and global distribution.” Augmented reality allows you to compare the virtual refrigerator alongside a real (competitive) offering (see Figure 4).

Figure 4: By comparing a competitor’s product with one created using augmented reality, high-tech companies can validate design issues and user experiences before committing time and money to the actual product. (Courtesy of Dassault Systemes)

Almost any product could be validated with augmented reality, including the electrical and mechanical subsystems on both chips and circuit boards. Naturally, cost will ultimately determine the use of augmented reality over an actual physical prototype. Once a designer decides to use augmented reality in the virtual prototyping of a design, however, what is needed in terms of hardware and software?

Designing For Augmented Reality

To insert virtual elements into a user’s reality, AR systems use camera technology to identify and recognize real-world images and objects. Digital and virtual content is then added to them in real time.
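
As a concrete illustration of that recognize-then-overlay loop, here is a minimal sketch using OpenCV feature matching, not any particular vendor’s SDK. The file name reference.jpg and all thresholds are hypothetical placeholders; a known reference image is located in the camera feed and a simple virtual label is anchored to it.

    import cv2
    import numpy as np

    # Recognize a known reference image in the live camera feed and anchor
    # virtual content to it. "reference.jpg" is a hypothetical marker image.
    reference = cv2.imread("reference.jpg", cv2.IMREAD_GRAYSCALE)
    assert reference is not None, "provide a reference/marker image"
    orb = cv2.ORB_create(nfeatures=1000)
    ref_kp, ref_desc = orb.detectAndCompute(reference, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        kp, desc = orb.detectAndCompute(gray, None)
        if desc is not None:
            matches = sorted(matcher.match(ref_desc, desc),
                             key=lambda m: m.distance)[:40]
            if len(matches) >= 10:
                src = np.float32([ref_kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
                dst = np.float32([kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
                H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
                if H is not None:
                    h, w = reference.shape
                    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
                    box = cv2.perspectiveTransform(corners, H)
                    # The "augmentation": virtual content registered to the object.
                    cv2.polylines(frame, [np.int32(box)], True, (0, 255, 0), 3)
                    cv2.putText(frame, "Recognized object",
                                (int(box[0, 0, 0]), int(box[0, 0, 1]) - 10),
                                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("AR overlay", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()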

Designers of end-user applications need to consider both the software and hardware aspects of their AR implementations. Most vendors provide augmented-reality software development kits (SDKs) that work on the majority of iOS and Android platforms. “Beyond the basic needs of a front-facing camera and reasonable performance, many of the newer platforms offer new compute resources, such as programmable image processors that promise improved computer-vision capabilities,” says Lord. “The ongoing improvement in graphics-processing-unit (GPU) and general-purpose-GPU (GP-GPU) processing also provides more opportunity to improve augmented-reality user experiences.”

In addition to improved performance, hardware must provide more power-efficient depth-of-field imaging sensors and greater ease of programming for synchronized, multi-sensor data streams.

High-performance, low-power GPUs and associated computing engines are a critical part of the design of AR systems. Companies like Metaio offer dedicated hardware image processors for accelerating augmented-reality experiences. Dubbed the “AREngine,” the acceleration chip works by taking on much of the processing required to run AR experiences from the general CPU. The company claims a drastic reduction in battery power consumption and an increase in initialization speeds.

Many designers use the compute power of the existing mobile-device GPU to enhance performance and minimize power for their AR applications. This requires careful integration of the GPU, video, and camera-vision processing to ensure the best performance.

What is the difference between image and graphics processors? Image processing deals with the manipulation of images acquired through some device, such as a camera. The emphasis is on the analysis and enhancement of the image, and today’s popular computer-vision systems depend on that analysis.

Conversely, graphics processing deals with synthesizing images based upon geometry, lighting, materials, and textures.
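
A toy contrast makes the distinction concrete. The sketch below, assuming a hypothetical input file photo.jpg, analyzes captured pixels (image processing) and then generates pixels from a geometric description (graphics processing).

    import cv2
    import numpy as np

    # Image processing: analyze a captured picture (pixels in, structure out).
    photo = cv2.imread("photo.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
    edges = cv2.Canny(photo, threshold1=100, threshold2=200)
    cv2.imwrite("edges.png", edges)

    # Graphics processing: synthesize a picture (geometry in, pixels out).
    canvas = np.zeros((480, 640, 3), dtype=np.uint8)
    triangle = np.array([[320, 80], [120, 400], [520, 400]], dtype=np.int32)
    cv2.fillConvexPoly(canvas, triangle, color=(0, 200, 255))
    cv2.imwrite("render.png", canvas)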

“Augmented-reality applications usually blend live video with computer images, where 3D graphics rendering is performed using OpenGL software APIs,” explains David Harold, Senior Director of Marketing Communications at Imagination Technologies. “Powerful cores can provide high-quality 3D graphics rendering, which can then be blended into the real-world camera capture. Also, by implementing features like camera image texture streaming, GPUs are capable of processing camera images as textures to enable 3D and reality integration with minimal CPU loading.” Efficient integration of camera images into the 3D rendering flow is essential for good performance and efficiency in augmented-reality designs (see Figure 5).

Figure 5: Harold shows a tablet connected to a screen to demonstrate the real-time computer power of GPUs.
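
The compositing step Harold describes can be approximated on the CPU with OpenCV, as in the sketch below. This is only a stand-in for the GPU path he outlines, where each camera frame would instead be streamed to OpenGL as a texture and blended with a 3D rendering pass; the circle here merely plays the role of that rendered layer.

    import cv2
    import numpy as np

    # CPU stand-in for GPU compositing: in a real AR app the camera frame is
    # streamed to the GPU as a texture and blended with a 3D rendering pass.
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        h, w = frame.shape[:2]
        overlay = np.zeros_like(frame)  # stand-in for a rendered 3D layer
        cv2.circle(overlay, (w // 2, h // 2), 80, (255, 128, 0), -1)
        alpha = 0.6                     # weight of the synthetic layer
        blended = cv2.addWeighted(frame, 1 - alpha, overlay, alpha, 0)
        mask = overlay.any(axis=2)      # composite only where virtual content exists
        frame[mask] = blended[mask]
        cv2.imshow("camera + graphics", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()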

 

What Does The Future Hold For Designers?

One tantalizing future would be the complete simulation of both hardware and software in the virtual prototype. For example, while looking at a virtual prototype of a tablet, the designer or end user could be running an application on the tablet (like a game) or using the camera.

Another near-horizon goal is the greater incorporation of social-media input into the design process. Using AR-based prototypes, a company could include stakeholders or a larger social network to provide valuable feedback as part of a crowd-sourcing team. In many cases, this would be much easier than developing a physical prototype.

Finally, companies are in the early stages of developing image-processing techniques for gesture-recognizing augmented reality. In one implementation, an application superimposes imagery over the screen of a smartphone or tablet, allowing users to interact with it via hand gestures. Another implementation leverages Wi-Fi signals to detect specific hand gestures without the need for sensors on the body or cameras.

Augmented reality has quickly moved beyond gaming to the wider consumer market. The technology has created a new set of hardware and software applications that allow designers to create high-performance and low-power augmented-reality experiences. That’s the reality.

“Reality: What a concept?!” – Robin Williams





DAC 2013 Pictures

Friday, June 7th, 2013

DAC – Video Latency; Platform as a Service; ISO 26262; and ARM Cortex-A12

Tuesday, June 4th, 2013

My Tuesday at DAC involved CAST IP, Mentor Graphics, Dassault Systemes, Chipestimate.com, and GlobalFoundries-ARM.

Here are but a few of the companies, hallway discussions, and presentations that I enjoyed during Tuesday at DAC:

> Performance is a function of latency and power, as Gary Smith noted in his pre-DAC EDA and IP trends presentation. One example of the need to balance latency and power is real-time video streaming (e.g., H.264 video encoders). Latency is the delay between the capture of live video and its appearance on the display. A simple way to gauge it initially is to wave your hand quickly in front of the camera and watch for lag in the image on the display. I saw none during my demo. (A rough software version of that eyeball test is sketched below.)
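
For the curious, the hand-wave test can be roughly quantified in software. The sketch below times each frame from the moment capture returns to the moment the frame is handed to the display; it assumes the in-application pipeline dominates, and true "glass-to-glass" latency would also include sensor exposure and display refresh, which this misses.

    import time
    import cv2

    # Rough probe of in-application video latency: time each frame from the
    # moment capture returns to the moment it is handed to the display.
    cap = cv2.VideoCapture(0)
    samples = []
    while len(samples) < 300:
        t0 = time.perf_counter()
        ok, frame = cap.read()
        if not ok:
            break
        # ... an encode/decode stage (e.g., H.264) would sit here ...
        cv2.imshow("latency probe", frame)
        cv2.waitKey(1)
        samples.append((time.perf_counter() - t0) * 1000.0)
    cap.release()
    cv2.destroyAllWindows()
    if samples:
        samples.sort()
        print(f"median per-frame pipeline time: {samples[len(samples) // 2]:.1f} ms")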

 

Other news from CAST highlighted a joint announcement with IP company Beyond Semiconductor concerning an ultra-low-power, 32-bit BA21 embedded processor.


> Hallway chat with Mentor’s M&A expert, Serge Leef:

Software as a Service (SaaS) for EDA cloud-based applications seems passé; Platform as a Service (PaaS) is the new “black.” The key driver of this change seems to be the push by next-generation chip designers for a more robust user experience (UE; see “Experience Required,” http://chipdesignmag.com/sld/blog/2013/05/30/experience-required/). Serge sees the trend toward user-experience design as essential to the evolution of EDA tools. He even believes it could become a source of revenue under a micro-business model.

 

> Dassault Systemes offered several interesting technology demos. Their Netvibes product provides intelligent dashboarding, while Tuscany’s PinPoint enables tracking of progress from synthesis to GDSII.

http://www.tuscanyda.com/

> IP protection and management include the synchronization of databases and documentation. Here, a close partnership with Magillem is proving very useful. (More about this in the near future.)

> Simulation Lifecycle Management (SLM) for semiconductor verification and validation (V&V) flows may evolve quickly into a framework. The effort in the automotive industry around ISO 26262 may establish a working model for the EDA industry.

 

> GlobalFoundries presentation at Chipestimate.com’s “IP Talks”: Subi Kengeri, VP of the Advanced Technology Division, talked briefly about many things, mostly centering on the need to offer a combination of device-technology design and SoC manufacturing expertise. But this need is fraught with challenges. (Reference: “Modular FinFET Increases Planar-to-Non-Planar IP Reuse”)

http://www.chipestimate.com/blogs/IPInsider/?p=1264

He noted that GlobalFoundries was the first fab to optimize for the newly announced ARM Cortex-A12 CPU, with ARM’s POP IP combined with GlobalFoundries’ 28-SLP process. Kengeri also emphasized the success of fully depleted SOI (FD-SOI) technology at 28 nm, saying that it behaves much like bulk CMOS for designers. STMicro is their partner in FD-SOI. This technology has enabled 0.63-V operation at 1-GHz performance in a dual Cortex-A9 implementation.