By Jack Ganssle

Examining Your Most Important Tools

Published in Embedded Systems Design, July, 2009

What is your favorite tool?

I'm particularly fond of my wood lathe, but for this publication let's narrow the question to tools used by embedded developers.

I vote for the oscilloscope. It's the universal electronic engineering tool, beloved by hardware and software people. It's also the most underused tool in the firmware realm.

A recent Embedded Market Survey conducted by TechInsights, this magazine's publisher, showed that just 37% of us rated the scope as our "favorite/most important hardware/software tool." Number one, at 55%, is the compiler and assembler.

"Favorite" and "most important" probably don't go well together. The compiler is certainly one of the most important tools we use since it's impossible to build any bit of firmware without it. But it's hardly my favorite. In fact, the ideal compiler should be so non-intrusive, should do its quite boring work so well, that we'd hardly notice it was there. It's like a bilge pump that silently and constantly does a hugely important job, but is unnoticed and under-appreciated.

Why did the scope fare so poorly? According to http://www.electronics.ca/presscenter/articles/802/1/Global-Oscilloscopes-Market-Projected-to-Inch-up-to-125-Billion-by-2010/Page1.html the global market for scopes will be about $1.25 billion in 2010. That's pretty small compared to many categories of products, but is huge compared to the market size for embedded tools.

Various studies peg the entire embedded tool market at around $3 billion, a number I just don't believe. Often analysts inscrutably lump RTOS sales in their tool figures, which makes little sense. But summing up sales of the biggest players in the industry, with RTOSes and all, gives a number of well under $1B. So $1.25B in oscilloscope sales overwhelms the entire size of the embedded tool market. One would think a large percentage of those scope sales would be to us, the embedded hardware and software designers.

This industry went through a radical tools transformation over the last 15 years. In the 80s and 90s most of us used in-circuit emulators. These tools let us debug in the procedural domain - single step, examine variables, and so on - as well as the time domain. The ICE supported the latter via real-time trace, performance analysis, timers, and other features that are essential to managing microseconds.

But processor speeds increased to rates which made it impractical to bring signals to an ICE pod. Surface mount technology shrank packages to sizes that couldn't be probed. And sophisticated on-chip features like caches, pipelines and MMUs removed any correlation between what the CPU was doing, and the signals on the pins. The ICE market all but disappeared, though a few companies, for instance Lauterbach and Signum, continue to provide such tools.

BDM and JTAG debuggers replaced the ICE in most development shops. Cheap and easy to set up, these devices used logic on the target CPU to move debugging data to a PC. They worked regardless of target clock rate, and used a simple dedicated connector, which sidestepped the surface mount probing issues entirely.

But BDM and JTAG debuggers provided nothing for handling the time domain. They gave us a Visual Studio-like interface. Scopes therefore became much more important. Twiddle a bit in the code and the scope will instantly show execution time, latency, and pretty much anything else you need to see.
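The technique costs almost nothing. Here's a minimal sketch of the idea; the register addresses and the routine being timed are made up, so substitute whatever your part's header file and your own code provide:

#include <stdint.h>

/* Hypothetical memory-mapped set/clear registers for a spare GPIO;
   use your processor's real definitions instead. */
#define GPIO_SET   (*(volatile uint32_t *)0x40020018u)
#define GPIO_CLR   (*(volatile uint32_t *)0x4002001cu)
#define TIMING_PIN (1u << 3)

void filter_and_scale(void);      /* stand-in for the code being timed */

void process_sample(void)
{
    GPIO_SET = TIMING_PIN;        /* pin high: scope sees the rising edge */
    filter_and_scale();
    GPIO_CLR = TIMING_PIN;        /* pin low: pulse width is the execution time */
}

Trigger the scope on the rising edge and the pulse width is the routine's execution time; trigger on the interrupt request line instead and the delay to that rising edge is your latency.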

The good news is that the BDM/JTAG tools have improved. More vendors are throwing transistors at the debugging problem, adding all sorts of wonderful on-chip resources like hardware breakpoints, trace, and more. Real-time debugging returned.

Ironically, people involved with designing CPUs tell me they are being squeezed by management to remove as many of these capabilities as possible. The boss wants to reduce transistor counts, or devote them to cranking out more performance or to adding peripherals. Just as in Jules Verne's From the Earth to the Moon, in which there was an essential tension between armor makers and cannon builders, in the embedded world there's a never-ending tension between debugging resources and reducing the cost of goods. Alas, few follow the implications of that debate to the logical conclusion: cruddier debuggers lead inescapably to higher engineering costs. Higher engineering costs drive the selling price up, since those NRE dollars must be amortized over the number of units shipped. At least, if one wishes to make a profit over the long term. But perhaps long-term thinking is merely a quaint notion that has grown obsolete.

Looking more deeply at the study's results on debugging tools, we see 53% of us call the debugger our favorite/most important tool. Presumably "the debugger" is the software application that we use to drive the hardware tools. I agree that this is a hugely important tool. Most projects consume 50% of the development effort in debug and test, so we're slaving away in front of this application for months or years on end. Important? You betcha. Favorite? I sure hope so, and I hope it is so well designed that it seamlessly enables our work and doesn't hinder our efforts.

Of the top seven rankings in the study, all but two were for debugging tools: debugger, scope, JTAG/BDM, logic analyzer and ICE. This result shows just how hard it is to get firmware right. It also suggests just how broken our development strategies are. Why spend so much time debugging? Another philosophy, one that actually works, is to not inject the defects in the first place. Companies that embrace that approach get higher quality code on a shorter schedule. In examining 4000 software projects, Capers Jones found defects to be the biggest contributor to late schedules (Jones, Capers. Assessment and Control of Software Risks. Englewood Cliffs, N.J., Yourdon Press, 1994).

Consider, for instance, the SPARK approach, also called "Correctness by Construction." (See http://praxis-his.com/). Projects done with SPARK have almost unmeasurable error rates. Even better, the tools are now free.

Other Tools

In the study the IDE came in at 29%. No separate entry existed for the editor, so I presume most rolled that into their IDE response. Certainly the subject of editors spurs more passion than just about any other among firmware people. The vi folks will never be swayed by the mad bayings of, well, let's just say I don't plan to open either the seventh seal of hell or the editor war.

The IDE complements and contains the debugger, and it is the environment that is open on the desktop more than any other tool. We have too many IDEs: every tool vendor seems to supply their own. Switching processors usually means changing the development toolchain, which means learning an entirely new set of procedures. That's great for vendors who can exploit this to lock customers in to their products, but not so wonderful for engineers trying to make reasonable technical decisions.

One of the most hopeful changes in the tool landscape was the appearance of Eclipse, a sort of universal open-source IDE that can handle vendor plug-ins. Yeah, it needs a lot of resources to run well, but resources are cheaper every year. Interestingly, the Wikipedia entry (http://en.wikipedia.org/wiki/Eclipse_(software)#History) claims the name was chosen to "eclipse" Microsoft's Visual Studio IDE. That has not happened. It's hard to get decent data on the use of Eclipse in the embedded world, but judging from the number of press releases on embedded.com from vendors offering compatible products, it's increasing. I suspect few vendors will release proprietary IDEs anymore, though.

Configuration management tools garnered just 13% of the favorite/most important vote. They're boring. But CM tools are as essential as the Ethernet cabling that interconnects computers and routers. Nothing happens without them, or at least nothing reliable. Some companies unbelievably continue to use sneakernet or a pile of disks and directories to manage version control. As Gregory Wilson wrote in "Where's the Real Bottleneck in Scientific Computing?" (American Scientist, January/February 2006), ignorance of version control is computational illiteracy.

Every other category of tool ranked under 10%. Source code analysis got 8% of the votes. Amazing but true. I assume that includes Lint, a syntax checker on steroids that has been around since the 70s. Lint is an imperfect and at times frustrating tool. But it finds bugs. Consider this snippet:

int do_something(int end)
{
    int i, j=0;
    int r=0;
    for(i=0; i< end; + + i)j != toggle_bit();
    r=(j + i) & 0xff;
    if(j < end)r=end;
    return r;
}


What's wrong with the code?

Actually, there's plenty that's problematic with this, not least the uninformative variable names and the use of C's poorly-defined "int" type. But look deeper. See anything?

I bet you stared at that for a while before noticing the spaces around the increment operator. Lint finds this error in a millisecond. This stuff is hard, and anything that can be automated, must be.
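For reference, here's the snippet as it was presumably intended. The space-separated plus signs become a real pre-increment; the do-nothing comparison, which Lint also flags as a statement with no effect, is rewritten here as an accumulation, though that part of the original's intent is a guess on my part:

extern int toggle_bit(void);      /* declared elsewhere in the project */

int do_something(int end)
{
    int i, j = 0;
    int r = 0;

    for (i = 0; i < end; ++i)     /* ++i, not "+ + i" */
        j += toggle_bit();        /* assumed intent; "!=" silently discarded the result */

    r = (j + i) & 0xff;
    if (j < end)
        r = end;
    return r;
}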

Other static analyzers include the tools from Polyspace, Klocwork, and many others, which look for execution-time errors without actually running the code. They'll determine what values a variable can assume, for instance, and look for out-of-range problems. My informal surveys of embedded developers using static analyzers have almost universally found the engineers are satisfied with these tools. Everyone is unhappy with the cost, which can be surprisingly hard to pin down as salesmen do the old "investment, not cost" two-step. But I've little doubt costs will come down and the capabilities will improve.
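A contrived example of my own (not from any vendor's literature) shows the kind of defect value-range analysis catches without a single test run:

#include <stdint.h>

#define NUM_CHANNELS 8

static int16_t readings[NUM_CHANNELS];

/* "channel" comes from a four-bit field, so the analyzer knows it can
   range from 0 to 15 - and flags the possible writes past the end of
   the eight-element array, all without executing the code. */
void log_reading(uint8_t status_byte, int16_t value)
{
    uint8_t channel = status_byte & 0x0f;   /* range: 0..15 */
    readings[channel] = value;              /* but valid indexes are only 0..7 */
}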

The study found the least popular or least important tools are those that help with software test. That's an astounding finding, considering the importance of test in creating great products. Perhaps the result stems from the poor selection of testing tools currently on the market. But some amazing products do exist. One of the challenges we face is moving test from the end of the project (since things run late, test is inevitably cut) to an on-going process that starts as soon as the first line of code appears. For instance, Virtutech, VaST and others help you create an entire simulation of your system, including all of the peripherals and the external world with which they interact. I've played with some of these and have been amazed at the accuracy of the simulations and their speed.

They cost money, though, serious money in absolute terms. But on a large project, figure on tens of cents per line of code in your firmware. That's roughly two orders of magnitude less than the cost of developing that code, which typically runs to tens of dollars per line. Viewed in this light, the costs are reasonable and will be recouped by earlier verification and validation.

For my entire career tool costs have been a problem for software developers. The hardware crowd routinely gets funding for $50k logic analyzers while firmware people struggle to justify a decent compiler. Something like 80% of the cost of engineering an embedded project is in the software development; to me, spending money there to get the product to market faster and cheaper is a no-brainer. Management seems to disagree. Sometimes the fault lies with the tool vendors, who do a poor job of demonstrating the benefits to the bottom line.

We work within an entire ecosystem of tools that work together synergistically. My lathe is tremendous fun, but is worthless without an extensive complement of jaws, polishers, and turning tools. I have two sets of the latter: cheap ones made of poor steel, and top-of-the-line gouges and chisels. The good ones cost a bundle, but they stay sharp longer, chatter less, and are more fun to use.

Good tools lead to increased productivity. In this flat world, to compete we need everything that helps us do more with less. No matter what a tool costs, that price pales compared to our salaries.