
Archive for October, 2013

Will a New DSP-based IP Subsystem Emerge from Rumors?

Monday, October 28th, 2013

Two rumors about Qualcomm, Arteris, and DSP architectures lead to tantalizing speculation about a new type of DSP-based IP subsystem.

Rumors are tricky things – hard to prove but sometimes harder to disprove. Experience has taught me that the best way to judge technology-based rumors is by looking for convergence. Here’s a case in point: John Cooley at DeepChip has posted emails suggesting that Qualcomm may be engaged in an asset buy with network-on-a-chip (NoC) company Arteris.

This rumor must be weighed in the context of today’s semiconductor intellectual-property (IP) environment. It isn’t unusual for smaller companies to be acquired by larger ones – ostensibly for their IP. Conversely (or to encourage such an acquisition), smaller companies are licensing more and more of their in-house IP to outside customers. Why? Kevin Kline from Freescale has noted that a smaller company’s internal IP can be worth more than the company’s own market cap.

But legal IP guru Jonathan Kaplan (see his latest blog) once told me that, in general, IP holdings are more valuable to smaller companies than to very large ones. If that’s true, why would an IP giant like Qualcomm seek to acquire Arteris?

To answer that question, consider another rumor: that Qualcomm may license its digital-signal-processing (DSP) architecture. If that supposition proves accurate, the company will join a trend among other multicore-processor giants (e.g., IBM licensing its Power CPUs, Nvidia licensing its GPU cores, and Imagination Technologies licensing MIPS).

Does Qualcomm’s rumored interest in Arteris – plus the rumor that Qualcomm may license its DSP architecture – provide evidence of a convergence? That is hard to say, since we have no way to weight the credibility of either rumor. Still, the two do seem to point in the same direction. Acquiring Arteris’s chip-level interconnect IP might make it easier to integrate Qualcomm’s DSP IP with third-party cores on a heterogeneous system-on-a-chip (SoC). This is only speculation, and Arteris’s competitors (e.g., Sonics) may well dispute the ease-of-integration claim.

Still, if these two rumors are true (and that’s a big “if”), chip designers may see the emergence of a new DSP-based IP subsystem. Or so it’s rumored.

Originally posted on Chipestimate.com’s IP Insider

History Lesson from Acorn to ARM for IoT

Wednesday, October 23rd, 2013

Do you remember developing on the 6502 processor? How about using Acorn’s BBC Micro? Well, surely you’ve heard of ARM. If you answer “no” to any of the above and are interested in developing for the Internet-of-Things (IoT), this video will help.

Dominic Pajak, ARM's Embedded Strategist, reminisces with John Blyler, Content Officer at Extension Media, about hardware and software progress from the days of the 6502 processor. The context for this discussion is today's Internet of Things (IoT) design challenges.

Learn more about Internet-of-Things development by attending the IoT Kickstarter event at ARM Tech Con on October 31.


Complexity Spreading vs. Intelligent Embedded in the IoT

Friday, October 18th, 2013

Which phrase better characterizes the Internet-of-Things depends upon your viewpoint – namely, whether you approach it from the process side or the product side.

This week, my attention was on the Internet-of-Things (IoT). Earlier in the week, several experts from ARM, Cadence, and other companies discussed the challenges of integrating essential analog/mixed-signal and RF technologies into digitally based IoT devices. At mid-week, I interviewed ARM’s Embedded Strategist, Dominic Pajak. (Look for that video next week.) Pajak, a former microcontroller designer, brings a seasoned viewpoint to the embedded side of the IoT equation.

This system-level perspective – chip, embedded, and beyond – helped me overcome a bias I had against the “IoT” phrase. To be honest, I had thought that IoT was just the latest marketing ploy to rehash the old computation-communication convergence “wars” of the last decade. (See “Are Mobile Communication and Computer RF Technologies on a Collision Course?,” June 2000.)

Certainly, there are similarities between these two concepts. For example, both require the ongoing convergence of analog (mixed-signal) and digital systems. But today’s IoT adds the element of sensor systems to the mix. Many engineers may view this as a trivial addition to the overall design equation, as sensors, instrumentation, and data-acquisition technologies have been understood since the ’50s.

What is new is the scale of integration, which adds a new level (network?) of complexity. This is not the deep complexity of a new transistor device structure like a FinFET or the associated 3D manufacturing challenges. Instead, the IoT requires a spreading out of reasonably well-understood technologies, which is a different kind of complexity. The end result is a complexity spreading or even smearing-out process. (From a device or product perspective, this is known as smart sensors or intelligent embedded.)

This is not a new phrase. In the world of cellular code-division-multiple-access (CDMA) design, the term “complexity spreading” often refers to various algorithmic designs related to spread-spectrum systems. In terms of smart-sensor technology (sensors with some processor intelligence), however, I would argue that complexity spreading refers to pushing that intelligence out as far as possible – in this case, to the interface with the analog physical world experienced by the sensor. Whenever “intelligence” is pushed outward, so too is the technology needed to enable it – namely, digital microcontrollers, processors, memory, and interface (wired or wireless) systems.

By nature, complexity spreading requires low-power systems. Sensors are often located in remote regions or, if not remote, in very restricted areas. This necessitates extreme power conservation, which is why ARM’s processors – already a dominant force in mobile wireless technology (e.g., mobile phones) – are well positioned for the complexity spreading required by IoT.
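
To make “intelligent embedded” a bit more concrete, here is a minimal, vendor-neutral sketch of a smart-sensor node in C. It is illustrative only – the threshold, the filter constant, and the read_sensor(), radio_send(), and sleep_ms() names are hypothetical stand-ins, not any particular board’s HAL – but it captures the pattern: filter locally at the analog interface, and wake the power-hungry radio only when there is something worth reporting.

/* Minimal smart-sensor sketch: local filtering, event-driven reporting.
 * All names and constants are hypothetical and for illustration only. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define THRESHOLD    512u   /* report only smoothed readings above this level */
#define FILTER_SHIFT 3      /* exponential moving average with alpha = 1/8    */

/* Host-side stand-ins so the sketch compiles anywhere; on real hardware
 * these would be the board's ADC, radio, and low-power-sleep driver calls. */
static uint16_t read_sensor(void)      { return (uint16_t)(rand() & 0x3FF); }
static void     radio_send(uint16_t v) { printf("radio_send: %u\n", v); }
static void     sleep_ms(uint32_t ms)  { (void)ms; }

int main(void)
{
    uint32_t filtered = read_sensor();

    /* A real node would loop forever; bounded here for the host demo. */
    for (int i = 0; i < 20; i++) {
        uint16_t raw = read_sensor();

        /* Local "intelligence": smooth the raw reading on the node itself. */
        filtered = filtered - (filtered >> FILTER_SHIFT)
                            + ((uint32_t)raw >> FILTER_SHIFT);

        /* Transmit only on events, keeping the radio off most of the time. */
        if (filtered > THRESHOLD)
            radio_send((uint16_t)filtered);

        sleep_ms(1000);   /* spend most of the duty cycle asleep */
    }
    return 0;
}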

Want to learn more? Then you should attend the “Kick-Starting IoT Session” at the ARM Tech Con:

Date: Thursday, October 31, 2013

Location: Santa Clara Convention Center (Mission City Ballroom B1)

Kick-Starting IoT Session Agenda:

09.00 – 10.00      Keynote: John Cornish, ARM – Enabling the Internet of Things (Main Ballroom)

10.20 – 11.00      Simon Ford, ARM – Connecting ARM Cortex-M to the Cloud

11.00 – 12.00      Panel Session: Zach Shelby, ARM; Michael Finnegan, Sprint

12.00 – 12.30      Matt Webb, Berg – Experiences behind Innovative Product Design

12.30 – 13.00      Includes lunch

13.30                   IoT Developer Hands-On Lab

Flashback to June 2000: “Are Mobile Communication and Computer RF Technologies on a Collision Course?”

Friday, October 18th, 2013

Originally posted in Penton’s Wireless Systems Design magazine, June 2000

By John Blyler, Senior Technology/West Coast Editor, Wireless Systems Design

I was chatting with a few colleagues over pizza the other day when someone asked about multiple protocol designs in wireless devices. I assumed he meant multi-mode RF designs in communication devices, like cellphones, that will support both GSM and CDMA. As I started to talk about multi-mode issues, though, his raised eyebrow and slight smile told me I was off base. He continued to look at me oddly for a few seconds before explaining that his concern was about a two-protocol notebook that would run both 802.11b for wireless-LAN connections and Bluetooth for access to other devices on a personal area network (PAN).

It occurred to me that here was yet another challenge to be addressed before the much-touted convergence of communication and computational devices would become a reality. Communication devices dealt with transmission protocols such as GSM and CDMA, which traditionally required a separate set of RF circuits for each competing protocol. Computational devices, however, needed to connect to a wired network infrastructure using protocols like 802.11b or one of the PAN standards, such as Bluetooth. That also required separate RF subsystems, but the implementation was quite different.

The most common practice for supporting different wireless-access technologies in mobile computing devices, like laptops and PDAs, is to integrate 802.11 support into the device while providing Bluetooth access via an interface card. Several companies, such as Compaq and Toshiba, are beginning production of wireless-enabled notebooks with integrated 802.11b capabilities; Bluetooth would then be supported via a PC card. This approach acknowledges both the maturity of 802.11b and the growing support for Bluetooth.

The operational coexistence of these two RF sources is not without its share of problems. Because 802.11b and Bluetooth both occupy the 2.4-GHz ISM band, interference between them, when they operate simultaneously on a single device, may mean that the two radios cannot work effectively together. Even when they can, degradation in speed or performance may result. For example, while Bluetooth is receiving or transmitting, 802.11b may operate at a reduced rate.

Designers of mobile computing devices are working on several approaches to minimize the interference between these two transmission technologies. Since RF interference diminishes as the distance between competing sources increases, one solution is to position the two RF subsystems as far apart as possible. Admittedly, the achievable separation between RF sources in a laptop or PDA is small, but every little bit helps.
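
As a rough sanity check on the “every little bit helps” point (my addition, not part of the original article), the standard free-space path-loss formula suggests that tripling the antenna separation buys roughly 9.5 dB of isolation at 2.4 GHz. The 5-cm and 15-cm spacings below are hypothetical laptop antenna separations, and real isolation inside a chassis also depends on near-field coupling, shielding, and antenna patterns, so treat the numbers as purely illustrative.

/* Back-of-the-envelope sketch: free-space path loss between two radios,
 * FSPL(dB) = 20*log10(4*pi*d/lambda). Distances are hypothetical. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

static double fspl_db(double distance_m, double freq_hz)
{
    const double c = 299792458.0;        /* speed of light, m/s */
    const double lambda = c / freq_hz;   /* wavelength, m       */
    return 20.0 * log10(4.0 * M_PI * distance_m / lambda);
}

int main(void)
{
    const double f = 2.4e9;   /* both 802.11b and Bluetooth sit near 2.4 GHz */
    printf("5 cm separation : %.1f dB\n", fspl_db(0.05, f));
    printf("15 cm separation: %.1f dB\n", fspl_db(0.15, f));
    /* Tripling the spacing adds roughly 20*log10(3) = 9.5 dB of isolation. */
    return 0;
}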

Another consideration is the set of design issues associated with the performance, type, and position of the antenna. (See the June issue of Wireless Systems Design magazine.)

Perhaps a better approach is to design the system with both RF sources in mind from the start – that is, to take a systems approach. Such a design would allow switching back and forth between the two protocols in a more or less seamless fashion.

OK, so much for the competing RF problems on the mobile computing side. How about similar issues in mobile communication designs that support, for example, both GSM and CDMA? A traditional technique is to implement multimode RF subsystems through distinct transceiver chains – i.e., a separate set of circuits for each mode. Not surprisingly, this is a costly approach that increases the chip count in the final product, thus increasing the size, weight, and complexity of the device.

RF designers have long known that eliminating the intermediate-frequency (IF) conversion stage of the receiving subsystem could decrease power consumption while reducing the form factor of the circuit. The catch has been that, in a traditional superheterodyne radio architecture, most of the IF conversion is performed with analog circuits. While effective, analog components often consume more power and are larger than their digital counterparts. Companies like Analog Devices and Texas Instruments have developed high-performance DSP chips that offer a digital solution for handling the IF conversion. Recent advances in reconfigurable hardware (see the June 12 DSP Alert) offer yet another path to digitizing traditionally analog RF functionality.
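
For readers who have not worked with a digital IF stage, here is a minimal sketch of what moving the IF conversion into the digital domain amounts to: quadrature mixing with a numerically generated local oscillator, followed by low-pass filtering, all done in software. The sample rate, IF frequency, and filter choice are assumptions for illustration only; they do not describe any particular vendor’s part, and a real receiver would also decimate the filtered output.

/* Minimal digital IF downconversion sketch: quadrature mixing plus a
 * simple moving-average low-pass filter. All parameters are hypothetical. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define N        4096
#define FS       1.0e6     /* sample rate of the IF ADC, Hz (assumed) */
#define F_IF     250.0e3   /* intermediate frequency, Hz (assumed)    */
#define LPF_TAPS 16        /* moving-average filter length            */

int main(void)
{
    static double if_in[N], i_bb[N], q_bb[N];

    /* Stand-in for ADC samples: a test tone offset 1 kHz from the IF. */
    for (int n = 0; n < N; n++)
        if_in[n] = cos(2.0 * M_PI * (F_IF + 1.0e3) * n / FS);

    /* Quadrature mix: multiply by a digital local oscillator at F_IF,
     * shifting the signal of interest down to (near) baseband. */
    for (int n = 0; n < N; n++) {
        i_bb[n] = if_in[n] *  cos(2.0 * M_PI * F_IF * n / FS);
        q_bb[n] = if_in[n] * -sin(2.0 * M_PI * F_IF * n / FS);
    }

    /* Moving-average low-pass filter removes the 2*F_IF mixing image.
     * Processing from the end backward keeps earlier samples unfiltered
     * until they are consumed, so the in-place filter stays causal. */
    for (int n = N - 1; n >= LPF_TAPS; n--) {
        double i_sum = 0.0, q_sum = 0.0;
        for (int k = 0; k < LPF_TAPS; k++) {
            i_sum += i_bb[n - k];
            q_sum += q_bb[n - k];
        }
        i_bb[n] = i_sum / LPF_TAPS;
        q_bb[n] = q_sum / LPF_TAPS;
    }

    printf("last baseband I/Q sample: %f %f\n", i_bb[N - 1], q_bb[N - 1]);
    return 0;
}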

A newer approach eliminates the need for IF conversion altogether. ParkerVision’s RF transceiver implements a hardware-software solution, through its Direct2Data (D2D) technology, that removes the need for IF conversion in many applications.

In terms of actual ICs, several vendors are beginning to offer products that provide multi-mode RF solutions. Tropian Inc., for example, has introduced TimeStar, the first multi-mode RF transmitter in a single chipset. New products from many vendors will follow quickly in the coming months and years as the rollout of 2.5G and 3G technologies becomes a reality.

But the question of interoperability between multi-mode RF communication systems and multi-protocol mobile computing devices remains largely unanswered. Nevertheless, the convergence of these technologies will require a comprehensive answer in the near future. We’ll keep you posted.

Software-Hardware Integration of Automotive Electronics

Friday, October 11th, 2013

My SAE book collects and builds upon expert papers on the hardware-software integration of automotive electronics at the chip, package, and vehicle-network levels.

My latest book – more of a mini-book – is now available for pre-order from the Society of Automotive Engineers. This time, I explore the technical challenges in the hardware-software integration of automotive electronics. (Can you say “systems engineering?”) I selected this topic to serve as a series of case studies for my related course at Portland State University. The work includes quotes from Dassault Systemes and Mentor Graphics.

 

Software-Hardware Integration in Automotive Product Development

Coming Soon – Pre-order Now!

Software-Hardware Integration in Automotive Product Development brings together a must-read set of technical papers on one of the most talked-about subjects among industry experts.

The carefully selected content of this book demonstrates how leading companies, universities, and organizations have developed methodologies, tools, and technologies to integrate, verify, and validate hardware and software systems. The automotive industry is no different, with the future of its product development lying in the timely integration of these chiefly electronic and mechanical systems….

 

NIST Is Closed – Try the French Government

Friday, October 4th, 2013

A casual Google search on nanotechnology results in embarrassment over the unnecessary US shutdown and a desire for French croissants.

This is crazy! Earlier today, I was listening to a presentation at the IEF in Dublin, Ireland, given by the French research institute CEA-Leti on scaling its CMOS technology down to the 7-nm node. My interest was piqued by one aspect of the talk, so I searched Google for stacked nanowires. But the top Google result reminded me that the US government was closed for business: “NIST Closed, NIST and Affiliated Web Sites Not Available.”

“Due to a lapse in government funding, the National Institute of Standards and Technology (NIST) is closed and most NIST and affiliated web sites are unavailable until further notice. We sincerely regret the inconvenience.”

Thank you, members of Congress (you know who you are), for embarrassing the US in front of the world. I think I’ll enjoy a croissant while conducting my research with a government that is open for business.