Embedded Muse 214 Copyright 2011 TGG October 4, 2011
You may redistribute this newsletter for noncommercial purposes. For commercial use contact jack@ganssle.com. To subscribe or unsubscribe go to https://www.ganssle.com/tem-subunsub.html or drop Jack an email at jack@ganssle.com.
EDITOR: Jack Ganssle, jack@ganssle.com
Contents:
- Editor's Notes
- Quotes and Thoughts
- Talking to Management
- C Sucks
- C++11
- The Dumbest Thing I Did
- Jobs!
- Joke for the Week
- About The Embedded Muse
Editor's Notes
Lower your bug rates.
And get the product out faster.
Sounds too good to be true, but that's the essence of the quality movement of the last few decades. Alas, the firmware community didn't get the memo, and by and large we're still like Detroit in the 1960s.
It IS possible to accurately schedule a project, meet the deadline, and drastically reduce bugs. Learn how at my Better Firmware Faster class, presented at your facility. See https://www.ganssle.com/onsite.htm .
I'll be conducting public versions of this class in Chicago on October 21 and in London on October 28. Why not rev up your team? There's more info here: https://www.ganssle.com/classes.htm . Act now - there's a discount until September 21 for Chicago and the 28th for London.
Quotes and Thoughts
We want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only secondarily for machines to execute. - Harold Abelson and Gerald Jay Sussman
Talking to Management
Carol Duffy sent the following last week: "Since you taught your firmware class here last year we've rethought our entire approach to engineering our systems. Some of us were shocked by the number you mentioned, from a study of 4000 late software projects that found the biggest reason they were late was bugs (who did that study? We can't remember).
"We kept no real metrics prior to the class, so we're lacking in scientific numbers, but the feeling is that by implementing just your suggestions about managing bugs we've shortened schedules quite a bit. What we do know, though, is that our shipped bug rates are down by about a third. We're tracking DRE (defect removal efficiency), for instance, and each quarter continues to show an improvement.
"But what is your take on dealing with management? They resist every change we implement. We're getting a ton of pushback on doing the peopleware thing."
As I told Carol, the first step is to realize that revolution rarely works; think evolution instead. Teams go wrong in two ways: some try to slam in a complete revamping of their processes when the culture just will not allow such radical change. Others feel that change is too hard, or will face so much resistance, that they do nothing; they give up and continue the unhappy status quo.
Second, by keeping metrics Carol's group is creating a potent weapon. Most of the time the boss is an engineer, and engineers love the quantitative. It's easy to dismiss a vague feeling, but nearly impossible to argue with carefully measured numbers.
Implement the changes you can and document the results. Share the numbers with management. It's fascinating to visit a highly-disciplined production line. Most of the time the walls are lined with charts showing the line's quality and production rates. Yet engineering rarely does the same.
I've worked with a lot of groups that had agonizing battles with management as they tried to change things, but later found the same bosses proudly pointing out the measured improvements in engineering to visitors. It takes time, courage, and metrics, but generally management will be very supportive of proven change.
(Oh, and that study about late software projects was done by Capers Jones).
C Sucks
C, the most popular of all embedded languages, is an utter disaster, a bizarre hodgepodge meant to give the programmer far too much control over the computer. The language is designed to provide infinite flexibility, to let the developer do anything that can be done on the computer.
Don't get me wrong - I do like programming in C. Assembly is even more fun, which proves I'm some sort of computer gearhead, more fascinated with the system's internals than with actually delivering products.
But no language should allow stupid mistakes like buffer overruns or undetected array overflows. When compute cycles were in short supply it was logical for a language to not check results, but those days are largely behind us.
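A contrived fragment (not from any real product) makes the point. Every line is legal C; the standard doesn't require the compiler to say a word about either overrun, and at run time the damage lands silently on whatever happens to live next to the arrays:

  #include <string.h>

  void log_name(const char *name)
  {
      char buf[8];
      strcpy(buf, name);    /* no length check - a long name runs right past buf */
  }

  int main(void)
  {
      int table[4];

      table[4] = 42;        /* one element past the end; C writes it anyway */
      log_name("this string is far too long for an 8 byte buffer");
      return 0;
  }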
Geodesic claims that 99% of all PC programs (most written in C and C++ of course) have memory leaks, all caused by poor use of malloc() and free(). Though these constructs are less common in the embedded world, an awful lot of firmware does use dynamic memory allocation. The language should simply not permit leaks; checks to guarantee memory integrity are essential. The cost is minimal. (Check out mem.txt at snippets.org, a simple bit of code you can link into your embedded app to detect all sorts of malloc()-related problems.)
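I won't reproduce mem.txt here, but the idea behind such tools is simple enough to sketch. The following is my own contrived illustration - not the snippets.org code, and the names (my_malloc and friends) are invented for the example - showing how a thin wrapper around malloc() and free() makes a leak at least show up in the numbers:

  #include <stdio.h>
  #include <stdlib.h>

  static long allocs;   /* how many blocks we've handed out */
  static long frees;    /* how many have come back */

  void *my_malloc(size_t n)
  {
      void *p = malloc(n);
      if (p != NULL)
          allocs++;
      return p;
  }

  void my_free(void *p)
  {
      if (p != NULL)
          frees++;
      free(p);
  }

  void report_heap(void)
  {
      printf("allocs=%ld frees=%ld outstanding=%ld\n",
             allocs, frees, allocs - frees);
  }

A real tool also records the caller's file and line with each block so the report names the offender, but the principle is the same: instrument the allocator and the leaks have nowhere to hide.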
Pointers are utterly unbounded in C. Want to dereference a null pointer? Go ahead! The language cares not a whit. Feel free to do any sort of math to any pointer. It's fun!
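Another contrived example - nothing here comes from a real product. Every line below is legal C, and no diagnostic is required for any of it. On a desktop OS the null dereference usually earns a crash; on many bare-metal targets it quietly reads address zero and the code marches on:

  #include <stdint.h>

  int main(void)
  {
      int *p = 0;                     /* a null pointer */
      int x = *p;                     /* dereference it - undefined behavior, no complaint */

      uint8_t *q = (uint8_t *)0x1234; /* point anywhere you like... */
      q += 100000;                    /* ...do any math you like... */
      *q = 0x55;                      /* ...and write through the result */

      return x;
  }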
Here's a C hint that will improve your job security: embrace double indirection. Even better, try triple. Real programmers use quadruple. The only limit to the number of asterisks in front of a pointer is the size of your cojones, or how adventurous you feel.
Exception handlers are totally optional in C. Sure, they're nice to have, but the language itself does nothing to force us to write such handlers, or to write them in a way that's likely to work.
Even something as simple as integer math produces unexpected results. 20,000 + 20,000 may be a negative number. Is this cool or what! Even better, the fundamental concept "int" has no meaning at all. It depends on the CPU, the compiler, and the wind direction.
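Try it. This fragment assumes a 16-bit int, as on many small micros; the exact garbage you get is the compiler's business, since signed overflow is undefined, though most machines simply wrap:

  #include <stdio.h>

  int main(void)
  {
      int a = 20000;
      int b = 20000;
      int sum = a + b;   /* 40000 won't fit in a 16-bit int; typically wraps to -25536 */

      printf("20000 + 20000 = %d\n", sum);
      printf("sizeof(int) = %u bytes here\n", (unsigned)sizeof(int));
      return 0;
  }

Build the same code for a 32-bit int and it cheerfully prints 40000 - which is exactly the point: "int" means whatever this particular compiler says it means.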
C has no formatting rules. It's easy and usual to write source in astonishingly cryptic ways. C is more an encryption tool than an aid to creating reliable and maintainable code.
No other language has an obfuscated code contest. Win by writing code that works but that's so convoluted no C expert can understand why it works. Most of the entries look like a two-year-old hit a few thousand random keys. And no, I'm not putting the URL of the contest here; these people are code terrorists who should be hunted down and shot.
A great programming language should encourage users to create perfect code. It must bound our options, limit our freedom, remove the degrees of freedom that lead to stupid bugs. Ada did this. It was so syntactically strict that the rule of thumb was "if you can make the damn thing compile at all it'll probably work." The language itself served as a straitjacket that led to reliable firmware. So of course Ada more or less failed, now barely appearing as a blip on language surveys.
C sucks. Sure it's fun and efficient. Debugging is much more of a kick than taming an angry syntax checker that insists on correctness. But it's time for the entire firmware community to either embrace a restrictive dialect of the language, or to adopt an Ada-like lingo that implicitly leads to correct code.
But C is like COBOL - it'll never go away. The best we can hope for is the use of standards, like MISRA, that constrain the language.
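To give the flavor - and this is a paraphrase of the sort of restrictions MISRA pushes you toward, not a quotation of any particular rule, with a function invented purely for illustration - a rule-checked shop ends up writing code more like this:

  #include <stdint.h>

  #define BUF_SIZE 8U

  static uint8_t buf[BUF_SIZE];            /* fixed-size, static storage - no malloc() */

  int32_t put_byte(uint32_t index, uint8_t value)
  {
      int32_t status = -1;                 /* explicit status, single exit point */

      if (index < BUF_SIZE)                /* every array access bounds-checked */
      {
          buf[index] = value;
          status = 0;
      }

      return status;
  }

Fixed-width types, no dynamic allocation, checked indices, one way out of each function: it's duller to write, and a great deal duller to debug at 2 AM.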
What do you think?
C++11
Andy Stromme had some thoughts about C++ and the new standard: "I'm looking forward to support for C++11. In general, I far prefer C++ to C - the extra options C++ provides allow me to write considerably more robust embedded software. From the existing C++ features that don't exist in C - classes, object creation and automatic destruction as things come in and out of scope (RAII, even with no exceptions), namespaces, simple templates to avoid macro badness, specialized templates to catch compile-time errors, and operator overloading of mathy types - to the new features like uniform initialization, type inference (making auto useful), lambda functions, explicit overrides to catch more compile-time errors, and constant expressions... there's all sorts of goodness coming.
"Sure, there's more one can misuse, but that comes with any tool. Taking some time to understand which features incur what overhead (be it run time, memory, or code space) is quite enlightening. I find that the "overhead" of many of these features ultimately means I write less code. If I want to do the same thing in C it often consumes the same resources, with more code to maintain."
The Dumbest Thing I Did
When interviewing I always ask candidates (those with experience) about their dumbest mistake and what they learned from it. Those with no mistakes are generally those with no experience - or are perhaps truthiness-challenged. Do you have any?
I worked as an electronics technician while a teenager, and one day was sharing the lab with an old engineer. "Old" meant maybe 35 to my young eyes. He was working on some ground support gear for a NASA project, and hooked up a power supply backwards. The entire system smoked. $4000 worth of gear was wrecked, which was a lot of money back in 1969. I'll never forget the look in his eyes as he slumped for a minute, then stood up and walked into his boss's office. He offered to resign, an offer the boss wisely refused. But that was when I learned just how serious a dumb mistake could be.
Joke for the Week
Note: These jokes are archived at www.ganssle.com/jokes.htm.
Pete Friedrichs contributed this:
A software weenie spends his life making a living writing COBOL programs. Eventually he retires in the early 1990s to a life of golf and relaxation.
Well, as the year 2000 approaches, anxiety skyrockets over so-called "Y2K" problems with date and time keeping routines. IT people start leaving phone messages to the effect that, "...we hear you know COBOL, and staring at the end of the millennium, we're concerned about our code..." The guy is pinged mercilessly until he agrees to return to work as a consultant. For the next few years, he travels around the country fixing COBOL Y2K problems.
In late 1999, he starts to feel some anxiety of his own. What if the "fixes" he implemented don't work? He decides the best thing to do is to have himself cryogenically preserved for a year, and then thawed out. By that time, he figures, all of the hooplah will be over, and even if he did screw something up, someone else will surely have taken care of it. So, off to cryogenic suspension he goes.
Needless to say, something goes wrong. (Perhaps the cryo-timer routines were written in pre-Y2K COBOL!) He wakes up in a strange room filled with strange people. "He's alive! We did it!" the strangers congratulate themselves. One guy finally approaches the man and says, "Welcome to the year 2999!"
The programmer is despondent. He is alone and everyone he has ever known or loved is dead. But the people from the future are quick to point out how great their life really is. "Nobody is ever sick, nobody ever goes hungry. We have robot companions, starships, teleporters, and holodecks."
"That's great," says the programmer. "But why did you bother to resurrect me?"
"Well, you see," answers one of the men from the future," the new millennium starts in a few months and your records show that you know COBOL..."
About The Embedded Muse
The Embedded Muse is a newsletter sent via email by Jack Ganssle. Send complaints, comments, and contributions to me at jack@ganssle.com.
The Embedded Muse is supported by The Ganssle Group, whose mission is to help embedded folks get better products to market faster. We offer seminars at your site offering hard-hitting ideas - and action - you can take now to improve firmware quality and decrease development time.