By Jack Ganssle
Assembly Languages
Published 11/14/2008
A very long time ago, when dinosaurs still roamed the Earth, I designed a computer (using the fire-blackened end of a spear on a cave's wall). Still in high school, I was geekily competent with digital circuits but knew nothing about computer architecture. But I had learned Fortran and naively assumed some parallel between that language and how computers worked. The result: a machine that would have been a complete failure, and which used a greatly subsetted and compressed form of Fortran as its native instruction set.
College gave me grade-destroying access to a Univac 1108 and I quickly learned its assembly language. Suddenly computer architecture became crystal clear. The one-to-one mapping of machine instructions to simple logic circuits was beautiful; the stored program that substituted instructions in memory for massive amounts of hardware breathtaking.
Since then I've read many books about computer design, but I feel none can convey a fundamental insight into CPU architecture without relying heavily on the essentials of assembly language. The ALU, program counter and stack pointer are dead, lifeless things, capable of nothing until animated, like Frankenstein's monster, by instructions stored in memory.
Assembly is both the basis of all computers and the name of a class of languages. Often used to specify a particular variant (e.g., "8051 assembly"), it oddly doesn't even get a capitalized first letter as all other languages do. Or did, until grammar died a horrible death at the hands of clowns sporting marketing degrees. Proper nouns like Fortran, Ada, C, and Pascal gave way to iPhone, dBASE, and eEverything. The nuns at St. Camillus would have beaten us senseless for peppering our writing with stuDLycAps, yet today that affectation is not only common, one is relieved when at the very least the spelling is correct.
In the early days of microprocessors all programs were written in assembly. No C compilers existed for the minimal CPUs of the day, and memory was so expensive and processors so slow that no one dreamed of sacrificing any form of efficiency for reduced development costs. All firmware folk were experts in at least one assembly language, usually several.
Perhaps the two greatest gifts to the embedded world were C and IDEs. Though I still think assembly is more fun than using a high-level language, C reduces development costs so much I'd never dream of cranking much assembly code anymore.
But one effect of this great gift is a new distance between engineers and the underlying hardware. That has both positive and negative consequences. On the down side, I worry that a lot of us no longer have that deep insight into computer architecture. A brief review of the nature of assembly in Programming 201 hardly forces one through PC-relative versus absolute jumps, or how the ARM uses a link register to preserve return addresses. In real-time systems interrupts reign, but their very real costs are disguised by simple C constructs that hide the stacking and unstacking of the system's context.
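To make the link-register point concrete, here's a minimal sketch, assuming 32-bit ARM and GNU as syntax. BL drops the return address into the link register, LR, rather than pushing it onto a stack, so any non-leaf routine must spill LR itself before making calls of its own:

        .text
        .global nonleaf

nonleaf:
        push    {lr}        @ save our return address; the BL below clobbers LR
        bl      leaf        @ BL: LR <- address of next instruction, then a PC-relative branch
        pop     {lr}        @ restore our caller's return address
        bx      lr          @ return through the link register

leaf:                       @ a leaf routine calls nothing,
        bx      lr          @ so LR still holds the return address: no stack traffic at all

A compiler emits that push and pop automatically, which is exactly the kind of cost C quietly hides.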
Do you use much assembly anymore? Are you fond of programming at the bare metal level?