You may redistribute this newsletter for noncommercial purposes. For commercial use contact jack@ganssle.com. To subscribe or unsubscribe go to https://www.ganssle.com/tem-subunsub.html or drop Jack an email.
Editor's Notes
Did you know it IS possible to create accurate schedules? Or that most projects consume 50% of the development time in debug and test, and that it's not hard to slash that number drastically? Or that we know how to manage the quantitative relationship between complexity and bugs? Learn this and far more at my Better Firmware Faster class, presented at your facility. See https://www.ganssle.com/onsite.htm.
Quotes and Thoughts
Not really a quote, but I was struck by Toyota's recent $1.2 billion payout. With something like 1 million lines of code involved, that's around $1200/line, probably the most expensive software ever written. I wrote about it here, and one commenter sagely said "If you think that the cost of doing aircraft maintenance and certification is high, just look at the costs of dealing with the accident that happens if you don't."
Tools and Tips
Please submit clever ideas or thoughts about tools, techniques and resources you love or hate. Here are the tool reviews submitted in the past. My article on debouncing contacts gets downloaded several thousand times per month. Trent Cleghorn did a nice implementation of the algorithms in C and C++, which is available here. Richard Tomkins sent:
Thor Johnson has been playing with a Red Pitaya. This is a scope, logic analyzer and function generator on a single board. Their website is terrible and it's hard to find out much, but a press release last week from Elektor claims it will cost "under $500." Thor's comments:
Responses to "A Plea To Compiler Vendors"
A number of people responded to my call for compilers that give timing information. Bill Gatliff and I had a dialog about this. He wrote:
I replied: I agree that the issues are hard, and in some cases impossible. With cache there's no predicting anything, and many safety-critical systems prohibit its use because there's no guarantee things will run in a timely manner. And, yes, the tools would have to know everything about the execution environment: wait states, clocks, and so on. Load MPLAB or a similar tool and you have to tell the debugger exactly which variant of a chip you're using. Why not do the same at compile time, and add in the other required parameters?

But most embedded systems are small, using PICs, Atmels, Cortex-M parts and the like, which don't have these fancy features. In most of those cases the toolchains are integrated; IAR and others provide the whole shooting match, from compiler to linker. They already preserve a lot of line-by-line info in the debugging file, so some sort of mapping from source to object is possible. Sure, optimizations mess things up, but there's still a ton of useful info because of C sequence points.

The data would be imperfect. But I'd kill for 80% accuracy instead of the 0% we have today; for now about the best we can do is measure by hand (see the sketch at the end of this exchange). If I wrote an application in assembly this wouldn't be too hard. Why can't the tools be as smart as we are?

He came back:
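In the meantime, the measure-by-hand approach mentioned above: on a Cortex-M3/M4 the DWT cycle counter gives per-block cycle counts at run time. What follows is a minimal sketch, not a finished tool; the addresses are the standard ARMv7-M debug registers, some newer parts also need the DWT unlocked before writing, and of course this measures only the path that actually executes, never the worst case a smart compiler could report.

#include <stdint.h>

/* Crude per-block timing using the Cortex-M DWT cycle counter. */
#define DEMCR       (*(volatile uint32_t *)0xE000EDFCu)  /* Debug Exception and Monitor Control */
#define DWT_CTRL    (*(volatile uint32_t *)0xE0001000u)
#define DWT_CYCCNT  (*(volatile uint32_t *)0xE0001004u)

static void cycle_counter_init(void)
{
    DEMCR      |= (1u << 24);   /* TRCENA: power up the DWT unit         */
    DWT_CYCCNT  = 0;
    DWT_CTRL   |= 1u;           /* CYCCNTENA: start counting CPU cycles  */
}

/* Returns the cycles consumed by fn(). The 32-bit counter wraps, so keep
   measured intervals under 2^32 cycles. */
static uint32_t cycles_for(void (*fn)(void))
{
    uint32_t start = DWT_CYCCNT;
    fn();
    return DWT_CYCCNT - start;
}

Call cycle_counter_init() once at startup and divide the returned count by the CPU clock frequency to get seconds. It's no substitute for compiler-generated timing data, but it beats guessing.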
The History of the Microprocessor, Part 4
A minor error in Part 3: Clyde Shappee noted that the AGC had 2800 ICs, not 2200. Phil Martel was part of this history, and writes about the TX-0, which I mentioned as one of the first transistorized computers:
Microprocessors Change the World

I have always wished that my computer would be as easy to use as my telephone. My wish has come true. I no longer know how to use my telephone. - Bjarne Stroustrup

Everyone knows how Intel invented the computer on a chip in 1971, introducing the 4004 in an ad in a November issue of Electronic News. But everyone might be wrong. TI filed for a patent for a "computing systems CPU" on August 31 of that same year. It was awarded in 1973 and eventually Intel had to pay licensing fees. It's not clear when TI had a functioning version of the TMS1000, but at the time TI engineers thought little of the 4004, dismissing it as "just a calculator chip" since it had been targeted at Busicom's calculators. Ironically, the HP-35 calculator later used a version of the TMS1000.

But the history is even murkier. In an earlier installment of this series I explained that the existence of the Colossus machine was secret for almost three decades after the war, so ENIAC was incorrectly credited with being the first useful electronic digital computer. A similar story haunts the first microprocessor. Grumman had contracted with Garrett AiResearch to build a chipset for the F-14A's Central Air Data Computer. Parts were delivered in 1970, and not a few historians credit the six chips comprising the MP944 as being the first microprocessor. But the chips remained secret until they were declassified in 1998. Others argue that the multi-chip MP944 shouldn't get priority over the 4004, as the latter's entire CPU did fit into a single bit of silicon.

In 1969 Four-Phase Systems built the 24-bit AL1, which used multiple chips segmented into 8-bit hunks, not unlike a bit-slice processor. In a patent dispute a quarter century later, proof was presented that one could implement a complete 8-bit microprocessor using just one of these chips. The battle was settled out of court, which did not settle the issue of the first micro. Then there's Pico Electronics in Glenrothes, Scotland, which partnered with General Instrument (whose processor products were later spun off into Microchip) to build a calculator chip called the PICO1. That part reputedly debuted in 1970, and had the CPU as well as ROM and RAM on a single chip.

Clearly the microprocessor was an idea whose time had come.

Japanese company Busicom wanted Intel to produce a dozen chips that would power a new printing calculator, but Intel was a memory company. Ted Hoff realized that a design built around a general-purpose processor would consume gobs of RAM and ROM, which suited a memory company just fine. Thus the 4004 was born.

The face that launched a thousand chips.

It was a four-bit machine packing 2300 transistors into a 16-pin package. Why 16 pins? Because that was the only package Intel could produce at the time. Today fabrication folk are wrestling with the 20 nanometer process node; the 4004 used 10,000 nm geometry. The chip itself cost about $1100 in today's dollars, or about half a buck per transistor. Best Buy currently lists some desktops for about $240, or around 10 microcents per transistor. And that's ignoring the keyboard, display, 500 GB hard disk and all the rest of the components and software that go with the desktop.

Though Busicom did sell some 100,000 4004-powered calculators, the part's real legacy was the birth of the age of embedded systems and the dawn of a new era of electronic design. Before the microprocessor it was absurd to consider adding a computer to a product; now, in general, only the quirky build anything electronic without embedded intelligence.
At first even Intel didn't understand the new age they had created. In 1952 Howard Aiken figured a half-dozen mainframes would be all the country needed, and in 1971 Intel's marketing people estimated total demand for embedded micros at 2000 chips per year. Federico Faggin used one in the 4004's production tester, which was perhaps the first embedded system. About the same time the company built the first EPROM, and it wasn't long before they slapped a microprocessor into EPROM burners. It quickly became clear that these chips might have some use after all. Indeed, Ted Hoff had one of his engineers build a video game, Space War, using the four-bitter, though management felt games were goofy applications with no market.

In parallel with the 4004's development Intel had been working with Datapoint on a computer, and in early 1970 Ted Hoff and Stanley Mazor started work on what would become the 8008 processor. 1970 was not a good year for technology; as the Apollo program wound down many engineers lost their jobs, some pumping gas to keep their families fed. (Before microprocessors automated the pumps, gas stations had legions of attendants who filled the tank and checked the oil. They even washed windows.) Datapoint was struggling, and eventually dropped Intel's design.

In April 1972, just months after releasing the 4004, Intel announced the 8008. It had 3500 transistors and cost $650 in 2014 dollars. This 18-pin part was also constrained by the packages the company knew how to build, so it multiplexed data and addresses over the same connections.

A typical development platform was an Intellec 8 (a general-purpose 8008-based computer) connected to a TTY. One would laboriously put a tiny bootloader into memory by toggling front-panel switches. That would suck in a better loader from the TTY's 10 character-per-second paper tape reader. Then, read in the editor and start typing code. Punch a source tape, read in the assembler. That read the source code in three passes before it spit out an object tape. Load the linker, again through the tape reader. Load the object tapes, and finally the linker punched a binary. It took us three days to assemble and link a program that netted 4 KB of object code. Needless to say, debugging meant patching in binary instructions with only a very occasional re-build.

The Intellec 8

The world had changed. Where I worked we had been building a huge instrument that had an embedded minicomputer. The 8008 version was a tenth the price, a tenth the size, and had a market hundreds of times bigger.

It wasn't long before the personal computer came out. In 1973 at least four 8008-based computers targeted at hobbyists appeared: the MCM-70, the R2E Micral, the Scelbi-8H, and the Mark-8. The latter was designed by Jon Titus, who tells me the prototype worked the first time he turned it on. The next year Radio-Electronics published an article about the Mark-8, and several hundred circuit boards were sold. People were hungry for computers.

"Hundreds of boards" means most of the planet's billions were still computer-free. I was struck by how much things have changed when the PC in my woodworking shop died. I bought a used Pentium box for $60. The seller had a garage with pallets stacked high with Dells, maybe more than all of the personal computers in the world in 1973. And why have a PC in a woodworking shop? Because we live in the country and radio stations are very weak. Instead I get their web broadcasts.
So this story, which started with the invention of the radio several issues ago, circles back on itself. Today I use many billions of transistors to emulate a four-tube radio.

By the mid-70s the history of the microprocessor becomes a mad jumble of product introductions by dozens of companies. A couple are especially notable.

Intel's 8080 was a greatly improved version of the 8008. The part was immediately popular, but so were many similar processors from other vendors. The 8080, though, spawned the first really successful personal computer, the Altair 8800. This 1975 machine used a motherboard into which various cards were inserted. One held the processor and associated circuits; others could hold memory boards, communications boards, and the like. It was offered in kit form for $1800 (in today's dollars); memory was optional, and 1 KB of RAM cost another $700. MITS expected to sell 800 a year but was flooded with orders for 1000 in the first month.

Computers are useless without software, and not much existed for that machine. A couple of kids from New England got a copy of the 8080's datasheet and wrote a simulator that ran on a PDP-10. Using that, they wrote and tested a Basic interpreter. One flew to MITS to demonstrate the code, which worked the very first time it was tried on real hardware. Bill Gates and Paul Allen later managed to sell a bit of software for other PCs.

The 8080 required three different power supplies (+5, -5 and +12) as well as a two-phase clock. A startup named Zilog improved the 8080's instruction set considerably and went to a single-supply, single-clock design. Their Z80 hugely simplified the circuits needed to support a microprocessor, and was used in a stunning number of embedded systems as well as personal machines, like Radio Shack's TRS-80. CP/M ran on most of the Z80 machines, and was the inspiration for the IBM PC's DOS.

But processors were expensive. The 8080 debuted at $400 ($1700 today) just for the chip. Then MOS Technology introduced the 6501 at a strategic price of $20 (some sources say $25, but I remember buying one for twenty bucks). The response? Motorola sued, since the pinout was identical to their 6800's. A new version with scrambled pins quickly followed, and the 6502 helped launch a little startup named Apple. Other vendors were forced to lower their prices. The result was that cheap computers meant lots of computers. Cut costs more and volumes explode.

Active Elements

In this series of articles I've portrayed the history of the electronics industry as a story of the growth in the use of active elements. For decades no product had more than a few tubes. Because of radar, between 1935 and 1944 some electronic devices employed hundreds. Computers drove the numbers to the thousands. In the 50s SAGE had 55,000 per machine. Just six years later the Stretch squeezed in 170,000 of the new active element, the transistor. In the 60s ICs using a dozen to a few hundred transistors shrank electronic products. We embedded folk, whose families are fed by Moore's Law, know what has happened: some micros today contain 3 billion transistors on a single die; memory parts are even denser. And on the low end a simple 8-bitter costs tens of cents, not bad compared to the millions of dollars needed for a machine just a few decades ago.

But how often do we stand back and think about the implications of this change? Active elements have shrunk in length by about a factor of a million, but an IC is a two-dimensional structure, so the effective shrink is more like a trillion.
The cost per GFLOP has fallen by a factor of about 10 billion since 1945. It's claimed the iPad 2 has about the compute capability of the Cray 2, 1985's leading supercomputer. The Cray cost $35 million more than the iPad. Apple's product runs 10 hours on a charge; the Cray needed 150 kW and liquid Fluorinert cooling.

My best guess pegs an iPhone at 100 billion transistors. If we built one using the ENIAC's active element technology, the phone would be about the size of 170 Vertical Assembly Buildings (the largest single-story building in the world). That would certainly discourage texting while driving. Weight? 2500 Nimitz-class aircraft carriers. And what a power hog! Figure over a terawatt, requiring all of the output of 500 Olkiluoto power plants (the largest nuclear plant in the world). An ENIAC-technology iPhone would run a cool $50 trillion, roughly the GDP of the entire world. And that's before AT&T's monthly data plan charges.

Without the microprocessor there would be no Google. No Amazon. No Wikipedia, no web, even. To fly somewhere you'd call a travel agent, on a dumb phone. The TSA would hand-search you... uh, they still do. Cars would get 20 MPG. No smart thermostats, no remote controls, no HDTV. Vinyl disks, not MP3s. Instead of an iPad you'd have a pad of paper. CAT scans, MRIs, PET scanners and most of modern medicine wouldn't exist. Software engineering would be a minor profession practiced by a relative few.

Accelerating Tech

Genus Homo appeared around 2 million years ago. Perhaps our first invention was the control of fire; barbecuing started around 400,000 BCE. For almost all of those millennia Homo was a hunter-gatherer, until the appearance of agriculture 10,000 years ago. After another 4k laps around the sun some genius created the wheel, and early writing came around 3,000 BCE. Though Gutenberg invented the printing press in the 15th century, most people were illiterate until the industrial revolution. As I described in part 1 of this series, that was the time when natural philosophers started investigating electricity.

In 1866, within the lifetime of my great-grandparents, it cost $100 to send a ten-word telegram through Cyrus Field's transatlantic cable. A nice middle-class house ran $1000. That was before the invention of the phonograph; the only entertainment in the average home was the music the family made themselves. One of my great-grandfathers died in the 1930s, just a year after he first got electricity to his farm.

My grandparents were born towards the close of the 19th century. They lived much of their lives in the pre-electronic era. When probing for some family history my late grandmother told me that, yes, growing up in Manhattan she actually knew someone, across town, who had a telephone. That phone was surely a crude device, connected through a manual patch panel at the exchange, using no amplifiers or other active components. It probably used the same carbon transmitter Edison invented in 1877.

My parents grew up with tube radios but no other electronics. I was born before a transistorized computer had been built. In college all of the engineering students used slide rules exclusively, just as my dad had a generation earlier at MIT. Some calculators existed but were very expensive, and I never saw one during my time in school. We had limited access to a single mainframe.
But my kids were required to own laptops as they entered college, and they have grown up never knowing a life without cell phones or any of the other marvels enabled by microprocessors that we so casually take for granted.

A scientific calculator, circa 1971

The history of electronics spans just a flicker of the human experience. In a century we've gone from products with a single tube to those with hundreds of billions of transistors. The future is inconceivable to us, but surely the astounding will be commonplace. As it is today.

Thanks to Stan Mazor and Jon Titus for their correspondence and background information.
Jobs!
Let me know if you're hiring embedded engineers. No recruiters please, and I reserve the right to edit ads to fit the format and intent of this newsletter. Please keep it to 100 words.
Joke For The Week
Note: These jokes are archived at www.ganssle.com/jokes.htm.

OS/2 Chicken: It tried to cross the road several times and finally gave up.
Windows 95 Chicken: You see different colored feathers while it crosses, but cook it and it still tastes like chicken.
Microsoft Chicken: It's already on both sides of the road. And it just bought the road.
Mac Chicken: No reasonable chicken owner would want a chicken to cross the road, so there's no way to tell it to.
C++ Chicken: The chicken wouldn't have to cross the road; you'd simply refer to it on the other side.
Visual Basic Chicken: USHighways!TheRoad cross (aChicken).
Delphi Chicken: The chicken is dragged across the road and dropped on the other side.
Java Chicken: If your road needs to be crossed by a chicken, the server will download one to the other side.
Web Chicken: Jumps out onto the road, turns right, and keeps on running.
Advertise With Us
Advertise in The Embedded Muse! Over 23,000 embedded developers get this twice-monthly publication.
About The Embedded Muse
The Embedded Muse is Jack Ganssle's newsletter. Send complaints, comments, and contributions to me at jack@ganssle.com. The Embedded Muse is supported by The Ganssle Group, whose mission is to help embedded folks get better products to market faster. |