Religious Wars
How do we select a programming language? Often in quite dysfunctional ways.
Published April 1997 in Embedded Systems Programming
By Jack Ganssle
The robed and turbaned assassin stands confidently, legs planted firmly, a strong stance sure to keep him standing in the coming battle. With a practiced flourish he spins a pair of dazzling scimitars overhead, ready to deal the mortal blow.
Skill-less, without even taking careful aim, Indiana Jones draws his pistol and, with a casual pull of the trigger, dispatches his assailant from 20 meters.
Technology replaces skill. Our tools let us move beyond a well-known paradigm that defines a society. Jones's pistol lets him be effective without spending a lifetime studying the art of swordplay. This freed-up time lets him pursue archeology and other interests.
We see this pattern repeated time after time. Simple but fierce weapons replaced the knights of yore with the citizen soldiers of today. Kodachrome turns even children into effective portrait painters. The flick of the wrist that starts a diesel substitutes for the sailor's skills of handing, reefing and steering.
Our tools, our technology, make us savants, wizards wielding amazing capabilities with barely a thought. In the course of an average day we harness the fruits of progress to effectively do the work that once required dozens of specialized experts.
A hundred years ago Mark Twain described how the (by today's standards) poor abilities of the 19th century made his protagonist in "A Connecticut Yankee in King Arthur's Court" a magician, one who made even Merlin seem but a simple conjurer.
Yet these tools, these new abilities, are always born in controversy. A counterpoint of disdain from the priests of the status quo slows and confuses our acceptance of the "new". The useless sword persisted for hundreds of years, horse cavalry lasted even into World War II, and the Gutenberg Bible was painstakingly hand-illuminated - because "that's the way books are always made."
Sometimes a healthy skepticism forces us to look at change and try to understand it, to weigh each innovation and measure its real value and costs. Now in the communications age we have the unprecedented ability to debate the impact of change. Even the most pro-technology-at-any-cost folks need to temper their enthusiasm with a larger view of the impact of what we create. It's important to understand other viewpoints, even when we disagree with them.
Sometimes, though, the controversy persists as simply a futile effort to preserve a familiar context, or for fear of change, or even due to a reluctance to learn the new approach. Look at the debate that rages over audio components: many audiophiles complain that the demise of the vacuum tube amplifier has diminished the quality of sound. Similarly, the CD is seen by some as a poor substitute for the LP. Some even debate the sound quality of speaker cables. Maybe they are right; the purist perhaps can sense some difference in sound quality. Most of us, I suspect, just aren't sophisticated enough to notice the difference. My rock 'n' roll-abused ears think a well-recorded CD played through ICs and power transistors sounds awfully good. Though it pained me to abandon a collection of LPs, the lack of scratches - and not having to yell at the kids to stop jumping and making the needle skip - more than makes up for the loss.
A similar area, one sure to raise the ire of many ESP readers, is that of bloatware. In the PC world it's clear that big software has won. The 28K WordStars of the past are history. Period. The newest version of Microsoft Office more or less fills an entire CD-ROM - but when a 1 GB disk costs $200, who cares? The cost of the technology is periodic computer upgrades. The benefits are, in my opinion, enormous.
Perhaps some of the frustration comes not from bloatware itself, but from a refusal to acknowledge the cost of desktop computers. I talk to many engineers saddled with two-generation-old computers at the office. Few businesses recognize that to effectively use computers they simply must have a continuous budget for new machines. That nifty 200 MHz Pentium will be junk in two or three years, so we've got to plan and budget for new machines in that time frame.
Language Wars
The most recent IEEE Annals of the History of Computing (Volume 19, No. 1, 1997, pg 69) has an interesting commentary on the invention of Fortran by John Backus. He started developing Fortran in 1953 for the IBM-704, as the cost of writing code, even then, was very high. The first version appeared in 1957.
I'm fascinated by the reaction of the computer community to the new language. "The major objection from experienced programmers was that the compiler could not possibly turn out object code as good as theirs."
I read this while getting frustrated with yet another battle on USENET about the viability of C in the embedded world. Hundreds of heated messages debated both sides of this issue, using words almost identical to those above. Those who don't learn from history are doomed to repeat it.
Part of my frustration arises from the tendency for any well-reasoned discussion on USENET to degenerate into personal attacks. A reasonable question or statement, one worthy of thoughtful responses, becomes a battleground. Observers jump in from the sidelines transmitting occasional bits of wisdom, which are usually obscured by the insults hurled by the vocal minority. Jeez, whatever happened to manners?
How can a subject as apparently sterile as embedded systems be so contentious? Are moderated groups the only kind that preserve basic decency? For an example of how a net discussion can preserve respect for all sides of an argument, check out Steve Talbot's Netfuture newsletter (www.ora.com/people/staff/stevet/netfuture/index.html). Though I disagree with most of the positions taken, the arguments are so well reasoned that I find the newsletter a delight to read and quite thought-provoking. I wish we had a similar medium for the embedded world.
So, being fortunate to have this forum for broadcasting my opinions, I feel compelled to make a few points about the Great Language Debate.
An argument I've heard for years comes from assembly programmers who fancy themselves (rightly or wrongly) as being coding demons. "I can write assembly as fast as anyone else can write C," they tell us. Maybe they can; however, like the pistol in the Indiana Jones story, a high level language lets the average programmer outcode all but the very few very highly skilled assembly programmers. A high level language, exactly like that pistol, changes the paradigm of the battle.
The language shifts the rules of the playing field. It elevates the average person - the one lacking the tremendous insight into computers that was so common years ago - into a very effective programmer. It turns the wizard into one who can balance the best offered by C and by assembly, creating the best possible product in the shortest possible time. The tool elevates the abilities of the entire industry.
Coding speed is only a small part of a much bigger problem, though. Neglecting the software life cycle is as silly as ignoring the fact that you've got to continuously budget for new computers. Someone else will have to make sense of your code. It's pretty clear that abstraction promotes understanding. It's possible to make sense of a .GIF file by viewing the binary, but it is far easier to see it as a picture. Assembly makes a lot more sense than the machine code. Well written C is easier to understand than well written assembly. The details of moving data between registers obscure a program's important functionality, no matter how carefully documented. C masks these details.
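To make the abstraction point concrete, here's a hypothetical fragment - the function, the constants, and the assembly sketch are mine, invented for illustration, not taken from any real project or compiler. The C version states the intent in a single expression, while a hand-rendered assembly version of the same arithmetic, for no particular processor, spends most of its lines on register bookkeeping.

    /* Hypothetical example: convert a raw 10-bit ADC reading into
     * millivolts, assuming a 3.3 volt reference. The intent is visible
     * at a glance. */
    unsigned int adc_to_millivolts(unsigned int raw)
    {
        return (unsigned int)(((unsigned long)raw * 3300UL) / 1023UL);
    }

    /* A representative assembly rendering of the same routine (generic
     * pseudo-assembly, not any compiler's actual output) might read:
     *
     *    mov   r1, raw      ; load the argument
     *    mov   r2, #3300    ; scale factor
     *    call  mul16        ; 16x16 -> 32 bit multiply, result in r3:r2
     *    mov   r4, #1023    ; divisor
     *    call  div32        ; 32/16 divide, quotient in r2
     *    mov   result, r2   ; store the return value
     *    ret
     *
     * The arithmetic is identical; the purpose is buried in the
     * bookkeeping. */

A one-line function body versus a screenful of moves and calls is the whole argument in miniature.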
The holy grail of the assembly versus C debate is always performance, something that is indeed critical in real time systems. Code size and speed debates are meaningless out of context, though. Yes, C has an impact on performance. The impact varies tremendously between compilers and even more between processor architectures.
A stackless machine like an 8051 does do an abysmal job of managing automatic variables, though the compiler vendors have some elegant solutions. It's shortsighted to disdain C due to a stack problem unless you've looked at the options provided by these vendors, who have spent years inventing ways around the problem.
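To sketch what "stackless" means in practice - the function below is hypothetical, and the allocation behavior described is typical of 8051 toolchains rather than a documented guarantee from any particular vendor - the 8051's tiny internal stack can't reasonably hold parameters and locals, so compilers commonly give each function's automatic variables fixed addresses in internal RAM, overlaying the locals of functions that call-tree analysis shows can never be active at the same time.

    /* Hypothetical routine with a couple of automatic variables. On a
     * typical 8051 compiler these locals are not pushed onto a hardware
     * stack; they get fixed internal-RAM addresses, shared (overlaid)
     * with the locals of other functions that the linker knows can
     * never be live at the same time. */
    unsigned char checksum(const unsigned char *buf, unsigned char len)
    {
        unsigned char i;    /* statically allocated, overlaid */
        unsigned char sum;  /* statically allocated, overlaid */

        sum = 0;
        for (i = 0; i < len; i++)
            sum += buf[i];
        return sum;
    }

The cost of the scheme shows up only when a function must be reentrant - called from both mainline and interrupt code, or recursively - in which case the compiler falls back to a slow simulated stack. The usual embedded answer is to avoid recursion and to keep interrupt handlers from sharing such functions, which is exactly the kind of workaround the vendors have refined over the years.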
Though an 8-bit processor is surely limited in its computational abilities, I see hundreds of successful C applications every year on these small machines. Recently I had the, ah, thrill, of working on an 8080 emulator (of all things), and was shocked to see that a rather large C demo program runs - apparently - as fast on this ancient 2 MHz clunker as on a 40 MHz 16-bitter. The program is a controller, and responds mostly at human speeds, as many controllers do. The difference between infinite speed and 8080 speeds was simply irrelevant, as is often the case with small embedded systems.
Indiana Jones's gun ran out of performance after six pulls of the trigger. The sword suffers no such limitation. Each is appropriate for its own application. So it is with C or any other language. A wise designer analyzes his or her application in the cold light of realism, and applies the right tool, the one that will make him or her maximally effective.
A Question of Balance
The religious wars waged over questions like language selection, processor appropriateness, and other technical issues take on, much like a bureaucracy, a life of their own. The debate, though, is indeed useful when tempered by rational thought. "Is C better than assembly? Is C++ better than C?" is a bit like asking "are 10-penny nails better than 4-penny nails for house construction?" Each has its place. The trick is being astute enough to find where a particular language or technology is appropriate.
We need to tone down the hysteria and start to accommodate the new realities into our lives and projects. What negative impact will C, C++ or Java really have on your project? What's the upside? Where's the balance?
This goes for all of the technology we're faced with. Do we agree that bloatware is the future of the desktop? What's the downside? If it is the future, what must we change in ourselves to deal with it? What about the net - must it be in every classroom? Or, is there a balance here, where the net is merely one of many useful tools, but not the sine qua non?
The pace of change is such that we're forced to become experimenters just to survive. No amount of debate will tell us the number of angels that fit on the head of a pin, or the perfect lingo for a particular application. Try things, see what happens, construct small experiments, with measurable results, that can fail without causing a disaster. For example, I'm not enthusiastic about C++ in small embedded systems, but only a fool wouldn't be playing with it, learning about it, and making some sort of reasonable assessment about its use in today's or tomorrow's projects. Ditto for Java.
I fear that the pace of change is such that there's no longer room for any sort of failure. Concurrent engineering means the damn thing simply must work right, on time, on budget, with no room for error. Too few of us have the opportunity to actively learn things, as the engineering environment is simply too demanding to permit time off for any reason.
Our challenge in the 90s and beyond is to find a way to learn new things despite the awful time-to-market pressures that drive our careers. Having watched so many programmers and engineers move from newbie to seasoned developer, I'm struck by the difficulty of learning these skills. It's all on-the-job experience; we have not moved much beyond the teaching techniques of the old-time apprentices and journeymen. You learn a language by fiddling with it a bit; you only become proficient at it by painstakingly writing thousands of lines of code.
There seems to be no easy way to develop new skills. Somehow we've got to find the time and the motivation to make this painful learning process a part of our everyday existence. The world of embedded systems is surely moving ahead at a dizzying pace; if we don't do the same, we'll be left behind, maybe to become the pathetic manager Scott Adams delights in humiliating in his Dilbert strip.
Continual Improvement
I remember well when the shift from assembly to C began in the 80s. So many projects were absolute disasters! The compilers, the debuggers, and most of all the people all contributed to a swarm of troubles. All too many systems started as pure-C projects, only to get delivered as amalgams of cruddy C peppered with #asms accompanied by lots of assembly modules.
But we learned. The tools improved. Programmers mastered the art of C programming, something that is not the same as programming in assembly. A new way of thinking was needed. Helping the situation were the colleges, which churned out new graduates who had never been polluted by thinking in Dartmouth Basic, and whose first language was C. Now it's rare to find a system coded even mostly in assembly. Yes, they are there, and often there's a great reason for using assembly, but the bulk of embedded development is now done in C.
We've learned to put up with its shortcomings and exploit its strengths. It's like the 6-penny nail that shrinks and stretches a bit to cover a broad array of applications, some better than others.
Now we have other technologies clamoring for our attention. I think it's time to swipe some ideas from the advocates of TQM (Total Quality Management), and adopt a philosophy of continual improvement. Look for new concepts that might solve engineering problems or ease the development process and give them a try. Look for lots of small gains all of the time, rather than one silver bullet whose success is usually doubtful and whose risks are large.
Never forget that engineering is a process. Yes, a critical goal is to get something delivered to the customer. Just as important is creating an evolving system that produces these products; one that uses reasonable technology, one that is open to new ideas, and one that actively, constantly, looks for ways to be more efficient and effective.
Above all, learn, and be open-minded. Politeness is not a bad idea either - especially in your flames to me!