This week I was invited to speak at an Intel-sponsored symposium at the Technion, Haifa, Israel. The theme of the symposium was “Challenges and Opportunities in System-Level Design and Verification” (see here for an outline of the symposium). I spoke on the theme “Software-Defined Everything: The impact on high-level design and validation”.
There were several interesting speakers, most from universities talking about their advanced research, including David Harel, whom I had the chance to see give a talk for the first time.
The idea for the talk came from the work I have been doing on baseband processors and systems over the last couple of years, and from the term “Software-Defined Radio”, which has fallen somewhat out of fashion recently. I think this is because people take the words a little too literally. To quote the Wikipedia definition:
A software-defined radio system, or SDR, is a radio communication system where components that have been typically implemented in hardware (e.g. mixers, filters, amplifiers, modulators/demodulators, detectors, etc.) are instead implemented by means of software on a personal computer or embedded computing devices
While an interesting concept, this definition takes an “all or nothing” position in which everything that was done hard (in hardware) is now done soft (as software running on processors). This is a little too absolutist. I think in modern embedded systems, the key phrase is “software-defined“, but not necessarily “100% software implemented“. That is, software running on embedded processors defines the functionality of the system, and will certainly be the implementation vehicle for much of that functionality, but not necessarily all of it. Where functions continue to be delivered via hardware blocks, software, and the processors it runs on, will control them, shape them, and thus define what they are.
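The distinction between “software-defined” and “software-implemented” can be sketched with a toy model (all names here are hypothetical, purely for illustration): a fixed-function FIR filter block whose datapath is hardware, but whose behaviour is defined by coefficient registers written by driver software.

```python
# Toy model of "software-defined, hardware-implemented": the datapath
# (the FIR multiply-accumulate) is fixed, as it would be in silicon, but
# software defines its behaviour by programming coefficient registers.

class HardwareFIRBlock:
    """Stand-in for a fixed-function FIR filter peripheral."""

    def __init__(self, num_taps):
        self.coeff_regs = [0.0] * num_taps   # software-writable registers
        self.delay_line = [0.0] * num_taps   # internal hardware state

    def write_coeff(self, index, value):
        """A register write, as issued by driver software."""
        self.coeff_regs[index] = value

    def process_sample(self, sample):
        """The fixed datapath: software cannot change this, only shape it."""
        self.delay_line = [sample] + self.delay_line[:-1]
        return sum(c * x for c, x in zip(self.coeff_regs, self.delay_line))


# The "driver" software defines the function: here, a 4-tap moving average.
fir = HardwareFIRBlock(num_taps=4)
for i in range(4):
    fir.write_coeff(i, 0.25)

# A step input settles to 1.0 once the delay line fills.
out = [fir.process_sample(1.0) for _ in range(4)]
print(out)  # [0.25, 0.5, 0.75, 1.0]
```

Reprogramming the same registers turns the same silicon into a different filter; the hardware implements, the software defines.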
So moving on from Radio (which is, of course, a vital part of cellphones, smart or not, of tablets, and indeed of untethered computing devices of all kinds) to Everything: there is a major shift in product architectures, in which many more embedded processors are used to deliver major parts of the product function and to define the rest. It goes way beyond cell phones to include almost every consumer product. It was a long time ago that I first heard of simple microcontrollers used in appliances such as microwave ovens and washing machines. Now, in theory and in practice, most appliances have software-defined functions, and it is hard to conceive of any electronics-based device without a software-defining component, and a growing one at that.
Of course, from my perspective, many of these products are increasingly using application-specific instruction set processors (ASIPs) as well as fixed-ISA processors to do this.
Despite John Bruggeman’s recent departure from Cadence, the focus he brought in 2010 to the idea of EDA360 and “App-driven design” was a manifestation of this trend. While the white paper Cadence produced did not contain many new ideas, it did represent a synthesis of ideas that had been circulating in the industry for a while, and it was interesting to watch Cadence’s product line evolve, some of it clearly tied to the software-driven approach they espoused. It is not completely clear how this will go further now that John Bruggeman is no longer there, although Cadence says it is still a key part of its vision.
So what is the impact of “Software-defined everything” on high level design and verification? It is pretty profound.
- It re-emphasises the concept of platform-based design
- It opens up room for new types of processing engines, usually derived from ASIP design flows
- It emphasises the need for sophisticated up-front design space exploration and architectural analysis: that is, part of ESL (electronic system level design)
- It requires highly software- and processor-centric verification methodologies and tools, thus (with the above) leading to virtual prototyping and virtual platforms
- It makes room in the design methodology for high-level synthesis, for those blocks that still need to be mapped to digital hardware and designed rapidly.
- It changes the nature of hardware prototyping, as processors map to FPGA devices somewhat differently than digital hardware blocks designed directly at the RTL level.
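The virtual-platform idea above can be sketched in a few lines (this is a deliberately toy model, not any real tool or ISA; all opcodes and register names are invented for illustration): an instruction-set simulator plus one memory-mapped peripheral model is already enough to run and check “firmware” long before RTL or silicon exists.

```python
# A minimal virtual-platform sketch: a toy instruction-set simulator plus
# one memory-mapped peripheral, enough to execute and verify "firmware"
# in a software-centric flow, with no hardware in the loop.

UART_TX = 0x100  # hypothetical memory-mapped transmit register

class ToyPlatform:
    def __init__(self, program):
        self.program = program      # list of (opcode, *operands) tuples
        self.regs = [0] * 4         # tiny register file
        self.pc = 0
        self.uart_log = []          # what the peripheral model "transmitted"

    def store(self, addr, value):
        if addr == UART_TX:         # writes here reach the peripheral model
            self.uart_log.append(value)

    def run(self):
        while self.pc < len(self.program):
            op, *args = self.program[self.pc]
            self.pc += 1
            if op == "li":          # load immediate: li rd, imm
                self.regs[args[0]] = args[1]
            elif op == "add":       # add rd, rs1, rs2
                self.regs[args[0]] = self.regs[args[1]] + self.regs[args[2]]
            elif op == "sw":        # store: sw rs, addr
                self.store(args[1], self.regs[args[0]])

# "Firmware": compute 2 + 3 and write the result to the UART.
firmware = [("li", 0, 2), ("li", 1, 3), ("add", 2, 0, 1), ("sw", 2, UART_TX)]
vp = ToyPlatform(firmware)
vp.run()
print(vp.uart_log)  # [5]
```

Real virtual platforms (for example those built on SystemC/TLM) are far richer, but the structure is the same: processor models executing the actual software against behavioural models of the hardware blocks, which is what makes software-centric verification possible before the hardware exists.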
It also exposes some key areas for future tool and technology development:
- Debug with multiple heterogeneous processors and engines becomes more complex, and current single-focus debug methods must evolve into more of a system-level debug concept
- Verification technology must move to support the use of “multi-core” hosts to verify “multi-processor” designs; techniques for this are still evolving
- As systems continue to grow in complexity there is an opportunity to reconsider an old idea – true system-level synthesis.
Lots of opportunities exist for future innovation and research. As always, I would welcome your comments.