How many engineers think of their career as being spent on a treadmill? It is a larger portion than most realize.
The majority of electrical engineers work on products that are performance-based. A fundamental, often overriding, characteristic of their product’s value is its speed of operation. When one version of the product is finished, development starts on the next, more powerful version.
This is true in the computer industry, the telecommunications industry, and certain portions of the consumer electronics industry. Semiconductor sales into these industries suggest that between two-thirds and three-quarters of electronics engineers work on an ongoing performance treadmill.
Searching the Internet, I found several articles claiming either that the treadmill was slowing down or that it was continuing unabated. There were also articles bemoaning, from the consumer’s side, products that were becoming obsolete too fast; their authors wanted to get off the treadmill. A couple of interesting articles are referenced below.
A technology treadmill and the effects on the vision of an industry.
The quarterly business treadmill and the effects on project management.
However, I did not find any articles that described how this affects the life of the engineer developing the products, or that gave advice on how to cope with being on the treadmill.
The Predictive Value of Moore’s Law
It has been said that the value of Moore’s Law (http://special-sarfunshafi.blogspot.com/2007/11/meeting-man-behind-moores-law.html) was that it allowed Intel to plan the features to incorporate into each new microprocessor. The Law allowed Intel to estimate the number of transistors that could be incorporated into each generation of the microprocessor, and therefore to determine when it would become feasible to add new features, such as floating-point arithmetic, to their devices.
There were also versions for other industries. One I was familiar with was Bill Joy’s Law of workstation performance. It was also exponential, but not at the same rate as Moore’s Law. This law was critical for planning a simulation accelerator product in the Electronic Design Automation (EDA) industry. The key feature of a simulation accelerator was how much faster it was than a standard workstation. The law allowed us to project how the product would compare to a workstation at introduction, and to estimate how long the product would remain viable.
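The planning value of such a law comes down to simple exponential arithmetic. As a minimal sketch (the doubling period below is an illustrative assumption, not a measured rate), consider a fixed-speed accelerator shipping with a given speedup over workstations whose performance doubles on a Joy’s-Law-style schedule:

```python
import math

def months_until_parity(speedup_at_launch, host_doubling_months=12.0):
    """Months until an improving workstation matches a fixed-speed
    accelerator that ships with a `speedup_at_launch`-times advantage.

    Assumes workstation performance doubles every `host_doubling_months`
    months (a hypothetical rate chosen for illustration). Parity occurs
    when 2**(t / host_doubling_months) == speedup_at_launch, so:
    """
    return host_doubling_months * math.log2(speedup_at_launch)

# An accelerator launched with an 8x advantage, against workstations
# doubling yearly, is matched after 36 months.
print(months_until_parity(8))    # 36.0
```

With a hypothetical 10x speedup at launch and the same yearly doubling, the advantage would erode to parity in roughly 40 months, which bounds the product’s viable life before the next, faster version must ship.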
As the treadmill progresses, some methods may break down or need to evolve. In the systems design class at MIT, we were taught that changes of an order of magnitude in size will cause unexpected components to break, or will require a fundamentally new solution. The example I remember involved a new plane that could descend more rapidly. On early-morning flights into Tokyo, the plane kept being landed in Tokyo Bay. The rapid descent did not give the pilots’ eyes time to adjust to the early-morning sunlight on the approach; they were in effect flying blind until their eyes adapted.
The progression of the software in-circuit emulator described below is illustrative.
In the 70s through the early 80s, the preferred embedded software development tool was the in-circuit emulator (ICE). This tool would plug into the socket for the microprocessor with a probe, acting in place of the device in the target design while adding debugging capabilities. However, this approach became infeasible and uneconomical as design speeds increased. The cost to develop an ICE was escalating, and supportability became questionable as the probe method of access proved increasingly unreliable. The method effectively broke down at higher speeds.
In the mid-80s, microprocessor vendors started adding debug features, such as breakpoints, into the device silicon to aid the embedded software developer. This allowed a fundamentally different approach to building the software tool. It proved an effective alternative to the probe and worked well with increasing device speeds. Today, virtually every microprocessor supports silicon-based access to debug features.
Focus on the Core
Another aspect of being on the performance treadmill is that you become acutely aware of the portions of the design that drive performance. These become the focus of a tremendous amount of engineering effort. Just as the critical path of a schedule gets additional attention, so does the performance core of a design.
The portions of the design that are not performance critical are often somewhat neglected. They get the time that is left over after the performance critical portions are under control.
In the EDA industry this can be seen in how the tools are continually re-built for the geometry of the latest semiconductor process. The base tools get completed before any other tools receive much attention. This explains why it took so many years for timing analysis and other non-critical tools to mature into complete products.
One strategy for these non-critical features is to out-source them. Once a particular function becomes large enough, it may support a third party developing it and making a business of taking over the problem. Since it is a less critical portion of the design, the risk of being dependent on a third party may be acceptable. There are many examples where this has proven successful.
Thriving on the Treadmill
In summary, here are three suggestions for thriving on the treadmill.
One, step back and look at the implications of the performance treadmill. Knowing the speed of the treadmill is important to getting out ahead of potential issues.
Two, watch for predictable places where the system could break down, especially when attempting a change that is an order of magnitude in size.
Three, know which core parts of the design drive performance, and assess whether a third party can take over the non-critical portions.
If you know of other guidelines that you would recommend, please add a comment.
Packet Plus™, Inc.