ARM TechCon 2014
Summary: ARM TechCon had a number of surprises.
The tenth annual ARM TechCon was held in early October in Santa Clara. It was in some ways a strange event, an amalgam of embedded, enterprise and IT people. ARM CPUs, after all, find their way into every corner of the computer industry. As a result some of the talks sounded, to an embedded person, as if they'd be dreadfully boring, and others mixed discourse across these very diverse disciplines. Other presentations were fascinating hard-core embedded talks.
The event kicked off with a keynote by ARM's Pete Hutton (EVP) and Mike Muller (CTO). The main topic was the IoT. You'd think that would be exclusively embedded, but they forecast our devices being connected via the cloud to data centers. They think 30 billion connected "things" will be deployed by 2020, a more believable number than the 1 trillion anticipated over the same timeframe by GE, Cisco and others. All of those will generate vast amounts of data. Some will be processed locally, but most, the speakers said, will become "big data" that those enterprise folks will vacuum up. It wasn't quite clear to me why the information will stream into monster servers; perhaps the NSA has a critical need to know the temperature settings of our Nest thermostats.
As part of the effort to unify the sensors and the cloud, ARM announced a new direction for mbed. If you haven't checked out mbed.org I recommend taking a look. It hosts a browser-based compiler. Using one of the several dozen eval boards they support, one can go from knowing nothing about mbed or the board to writing and testing firmware in about three minutes. I don't think mbed is production-ready, in that it does not support separate compilation or any debugging beyond a printf. However, for the last year or two I've used the browser-based tool on a number of small, experimental projects, which typically are under 1000 lines of code, and am super pleased with the results. It removes all of the hassle of configuring IDEs, project files, etc.
Thirty billion devices is a lot of technology. Each will have millions of transistors, so the total transistor count can only be expressed in exponential notation. Ironically, during the hour-long keynote, while the speakers expounded on this incredibly complex world they are enabling, they experienced a succession of microphone failures. I lost count, but think they went through six mikes.
More A/V problems were to come. Micrium's Christian Legare gave a talk about the IoT and the A/V folks worked during his entire presentation to get the laptop projecting slides. They never managed to do so. Christian was a rock star; he gave the talk extemporaneously, with neither notes nor slides, and never faltered. I was interested and surprised to hear that only about 16% of those IoT devices will be traditional consumer products. The rest will be in infrastructure, buried in walls, buildings and the like.
ARM also introduced the newest member of the Cortex-M family: the M7. Now, if you're not a regular consumer of semiconductor vendors' press releases it might be a little confusing where each member of this club goes. Here's the official cheat sheet:
M0 - Low silicon area
M0+ - Highest energy efficiency
M3 - Best energy/performance balance
M4 - Blended MCU and DSP
M7 - Highest performance
The M7 offers around 5 CoreMark/MHz, nearly twice the M4's figure, and can run at a faster clock rate, too. On a 40LP process it can hit 400 MHz. Be wary of scaling the figures by MHz, though, since at 400 MHz I doubt memories will keep up. Both I and D caches are available, depending on what the individual licensee orders, with sizes ranging from 4KB to 64KB.
It has some big-CPU features like a six-stage, dual-issue pipeline. The M4's optional floating point is single precision; the M7 adds double precision. DSP instructions include a single-cycle MAC for both fixed and floating point. The off-chip bus is 64 bits, though the CPU remains a 32-bit machine.
ARM is responding to an increasingly tough regulatory landscape with optional ECC and a memory BIST controller. If a licensee elects to create a dual-core M7, the two can run in lockstep, with checking logic ensuring both execute exactly the same way. There's a documentation package available to ease the certification process.
Old-timers subconsciously equate "microcontroller" with "small processor." Think 8051. But the M series defies this stereotype, and now the M7 is bringing enormous amounts of horsepower to the MCU world. A couple of vendors announced M7 silicon at the show, but to date none are listed on Digi-Key. I'm dying to find out the prices. There are a ton of M4 parts listed in the $3 to $4 range in few-thousand-piece quantities. That's a heck of a price. In the 70s we were paying almost $400 per chip for 8080 CPUs, which is about $2000 in today's dollars. Can you imagine paying $2k for a part that even a small PIC can outrun?
TechCon's show floor had a number of interesting products. The cloud and IoT were the themes. Micrium now has Spectrum, a suite of products covering the IoT stack from RTOS to cloud support, with all of the usual middleware in between.
Express Logic bundled their RTOS and related components into the X-Ware Platform. (The "X" comes from the names of the products: ThreadX, GUIX, etc.) It's an interesting concept: for a series of popular eval boards the X-Ware suite will package everything needed to start developing your application. Drivers, BSP, RTOS and the rest of the products are all integrated and working. The company claims the code is industrial-quality; it's not some of that hacked-together toy code the silicon vendors are famous for providing. It's available for Renesas' RZ/A1 RSK board today with more coming soon.
I was only able to spend two days at the show but attended a lot of interesting classes. As always, time constraints meant missing a lot that sounded great from the brochure.
Did you make it to the show? What was your take on it?
Published October 6, 2014