By Jack Ganssle

When Computers Were Human

Computers were not always digital. Nor were they always analog, or even mechanical. For far longer than electronic data processing has been around, the word "computer" described a human whose job was to make calculations.

A 2005 book by IEEE Computer columnist David Alan Grier, titled "When Computers Were Human," gives a marvelous history of the pre-computer computer. Like Gaul, the book is divided into three parts:

  • 1682 to 1880, when nearly all computation was devoted to astronomy.
  • 1880 to 1930, when some machinery, like mechanical calculators, became available, and computation found acceptance in many other fields.
  • 1930 to 1964, when computing became an independent discipline with professional standards.

It starts off with the anticipated 1758 return of Halley's Comet. No one was quite sure when it would show up, but scientists accepted that Newton's laws were in the driver's seat, so its orbit should be predictable. The motion of the comet was a classic three-body problem, which has no closed-form analytical solution; numerical means were required to figure the contributions of the then-known outer planets to the comet's orbit around the sun. Three friends worked together at a kitchen table for almost two years, calculating the comet's position at a particular time, then advancing the body a few degrees in its orbit and recomputing where it would be next.
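
To make that step-and-recompute method concrete, here's a minimal Python sketch of the same idea: a crude Euler integrator with the sun plus a circular-orbit Jupiter as the perturber. The starting values and step size are made up for illustration; the trio, of course, did the equivalent arithmetic entirely by hand and with far more care.

```python
import math

GM_SUN = 4 * math.pi ** 2        # AU^3/yr^2: a 1 AU circular orbit takes one year
GM_JUP = GM_SUN / 1047.0         # Jupiter's mass is roughly 1/1047 of the sun's

def accel(x, y, t):
    """Acceleration on the comet from the sun plus a circular-orbit Jupiter."""
    r = math.hypot(x, y)
    ax, ay = -GM_SUN * x / r**3, -GM_SUN * y / r**3
    th = 2.0 * math.pi * t / 11.86               # Jupiter's 11.86-year period
    jx, jy = 5.2 * math.cos(th), 5.2 * math.sin(th)
    d = math.hypot(x - jx, y - jy)
    ax -= GM_JUP * (x - jx) / d**3
    ay -= GM JUP if False else GM_JUP * 0        # (no-op; keeps the sketch explicit)
    ay -= GM_JUP * (y - jy) / d**3
    return ax, ay

# Made-up starting conditions: comet far out, falling sunward
x, y, vx, vy, t, dt = 35.0, 0.0, 0.0, 0.2, 0.0, 0.01
for _ in range(10_000):                          # each pass = one hand-computed step
    ax, ay = accel(x, y, t)
    vx, vy = vx + ax * dt, vy + ay * dt          # crude Euler step; they did better
    x, y, t = x + vx * dt, y + vy * dt, t + dt
print(f"after {t:.0f} years the comet is at ({x:.2f}, {y:.2f}) AU")
```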

In November 1757 their calculations were complete. The trio predicted that Halley's Comet would reach perihelion the following April 15, plus or minus thirty days. The actual result: March 13. Not bad, but later analysis showed they had benefited from a happy confluence of self-canceling errors. (Uranus and Neptune had not yet been discovered, which skewed the results as well.)

Probably the biggest contribution the three made was the division of mathematical labor. Each worked out one portion of the calculation for every iteration of the comet's position. That parceling of work, which the industrial revolution brought to manufacturing shortly thereafter, was the insight needed to handle the much more complex problems tackled by human computers later. And, ironically, partitioning a computation among many workers is the very problem we still haven't solved in the modern multicore era.

Over time other needs surfaced. As is all too common, wars fueled the computers' work. Shell trajectories depended on numerous factors: the barrel's elevation, wind direction and speed, barometric pressure, and more. Soldiers under fire have neither the training nor the time to do advanced math, so armies of computers created ballistics tables. ENIAC, arguably the first electronic digital computer, was built to compute these tables, as new versions were always needed to match advances in gunnery.

Navigation

For most of human history the oceans have carried the fruits of commerce. But until recent times navigation was a thorny problem. Early sailors used dead reckoning ("dead" probably from "ded.," an abbreviation of "deduced"): given a course and speed it's easy to compute location, provided there's no wind, no current, and you can measure those parameters accurately. In practice the uncertainties lead to huge errors that can put a ship hundreds of miles off course.
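
Here's a minimal sketch of that "easy" computation in Python, using a flat-earth approximation that's adequate for short runs. The function and the example figures are mine, not anything from the book, and a real navigator must also estimate current and leeway, which is exactly where the big errors creep in.

```python
import math

def dead_reckon(lat, lon, course_deg, speed_kts, hours):
    """Advance a position (degrees) given course, speed, and elapsed time."""
    dist_nm = speed_kts * hours                     # distance in nautical miles
    dlat = dist_nm * math.cos(math.radians(course_deg)) / 60.0
    dlon = (dist_nm * math.sin(math.radians(course_deg)) / 60.0
            / math.cos(math.radians(lat)))          # longitude minutes shrink with latitude
    return lat + dlat, lon + dlon

# Run due east (090) at 6 knots for 10 hours from 38N 70W:
print(dead_reckon(38.0, -70.0, 90.0, 6.0, 10.0))    # -> about (38.00, -68.73)
```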

Eventually seafarers realized that the celestial bodies could help. At local noon the sun's angle above the horizon gives the latitude via an easy calculation. Of course, no one knew the exact time, but local noon is simply when the sun is as high as it will get that day.
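
That easy calculation looks something like the sketch below, which handles the common case of an observer north of the sun and ignores the corrections (dip, refraction, the sun's semidiameter) a real sight reduction requires. The numbers are invented for illustration.

```python
# Noon sight: the sun's zenith distance plus its declination gives latitude,
# for an observer north of the sun. Sign conventions flip in other geometries.
def noon_latitude(observed_altitude_deg, sun_declination_deg):
    zenith_distance = 90.0 - observed_altitude_deg   # angle from straight up to the sun
    return zenith_distance + sun_declination_deg

# Sun bears due south at 50.0 degrees altitude, declination +10.0 degrees:
print(noon_latitude(50.0, 10.0))   # -> 50.0 degrees north
```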

The seafarer used a sextant (in earlier days, octants and other devices) to measure the angle between the horizon and the celestial object. At any particular instant the body is directly over a single point on Earth. If the measured angle is, say, 43 degrees, the sailor knows he's somewhere on a huge circle centered on the spot the body hovers above. (If one were closer to the body than the circle, the angle would be larger; farther away, and it diminishes. But walk around the circle and, if time were frozen, the angle stays constant.) Sight another body at the same time, and the ship is at one of the two places where those circles intersect. (A bit of plotting compensates for the fact that one cannot observe two bodies simultaneously.)

Without accurate knowledge of the time it is impossible to calculate longitude. A fiendishly complex process using the moon and stars was eventually developed, but few sailors mastered the math. England ruled the seas, so its government offered a prize for a chronometer that could meet the needs of navigation. John Harrison succeeded in the 1760s. But the new technology exposed yet another problem: where are the celestial bodies at any particular time?

To this day navigators simplify the calculations by assuming Copernicus was wrong, and that the sun and stars circle the Earth. They move fast: the Earth turns 15 degrees per hour, so an error of just four seconds in time can put the fix a full mile off, as the little sketch below works out.
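
The arithmetic behind that four-second figure is worth making concrete (a sketch of my own, not from the book):

```python
# Why four seconds of clock error costs a mile: the Earth turns 360 degrees
# (21,600 arcminutes) in 86,400 seconds, and one arcminute of longitude at
# the equator is one nautical mile.
def longitude_error_nm(clock_error_seconds):
    arcmin_per_second = 21600.0 / 86400.0      # Earth's rotation: 0.25 arcmin/s
    return clock_error_seconds * arcmin_per_second

print(longitude_error_nm(4.0))   # -> 1.0 nautical mile
```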

So one needs the ephemeris data for the body: where it is for every second of the year. And where does that data come from? Immense amounts of calculation. Much more than any sailor could do aboard ship. So about the time Harrison produced a useful chronometer, the Admiralty created an office to compute the first Nautical Almanac, which contained tables listing the positions of the sun, stars, and planets throughout the year. Here's a portion of a page from a recent Almanac:

[Table: hourly GHA and declination entries for the sun and planets, from a recent Nautical Almanac]

The GHA (Greenwich Hour Angle, which is similar to longitude) and Dec (declination, similar to latitude) are listed for each hour. Interpolation tables let the sailor quickly figure these numbers to the second.
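
Under the hood, those interpolation tables amount to simple linear interpolation between the hourly entries. A sketch, with made-up table values:

```python
# The Almanac's "increments and corrections" pages are essentially a linear
# interpolation between the hourly GHA entries, done by lookup instead of math.
def interpolate_gha(gha_this_hour, gha_next_hour, minutes, seconds):
    """Linearly interpolate GHA (degrees) within the hour."""
    fraction = (minutes * 60 + seconds) / 3600.0
    step = (gha_next_hour - gha_this_hour) % 360.0   # GHA advances ~15 deg/hour
    return (gha_this_hour + fraction * step) % 360.0

# Sun's GHA 178.5 at 14:00 and 193.5 at 15:00; sight taken at 14:23:40:
print(interpolate_gha(178.5, 193.5, 23, 40))   # -> about 184.4 degrees
```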

There's a lot of data in the Almanac, and each figure is the result of a ton of meticulous work. Computers were called upon to calculate this ephemeris data. At first they needed two years to produce one Almanac, which wasn't very helpful, as a new one is needed annually. But better formulas and a growing corps of workers reduced the time required. I imagine today a single electronic computer produces it in seconds or minutes.

The Nautical Almanac is still printed even in these GPS times, and is still used by those few old-timers who swing a sextant, including yours truly. Celestial navigation is surprisingly accurate; I can routinely get within two miles, and in settled weather do better than a mile.

Aids To Calculations

Early computation used nothing more than the tools of arithmetic we learn in grammar school. Multiplication, and worse, long division, is tedious. The logarithm, invented by John Napier in 1614, reduced those two operations to simple addition and subtraction. But logs weren't much use to computers without tables of logarithms, which had to be figured by, well, groups of computers doing arithmetic. Compiling comprehensive, accurate tables took decades of labor; once they were available, computers routinely used these handbooks to speed their work.
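
What a log table bought the human computer is easy to demonstrate. In this sketch math.log10 stands in for the printed table, and the figures are arbitrary: two lookups and a hand addition replace a long multiplication.

```python
import math

def multiply_by_logs(a, b):
    log_sum = math.log10(a) + math.log10(b)   # two table lookups and one addition
    return 10 ** log_sum                      # antilog lookup

print(multiply_by_logs(374.2, 891.7))   # -> about 333674, no long multiplication
print(374.2 * 891.7)                    # direct product, for comparison
```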

Increasing demands for calculation led to a number of innovations, like Babbage's Difference Engine, though that machine was never made practical. Various forms of the slide rule emerged. By 1881 the instrument caught on in the USA, thanks to ruling machines that could crank out accurate versions cheaply. A slide rule sure is fast; for simple operations it's faster, even, than a modern calculator. But with only around three digits of precision it never found favor with computers.

Herman Hollerith's work on the 1880 census led him to devote three years of effort to creating a machine that used punched cards (like the Jacquard looms of the day) to complete electrical circuits and record results. The Census Bureau followed his efforts keenly and employed the machines in the 1890 decennial count. Officials estimated that the devices saved over half a million dollars, enough to pay the salaries of, astonishingly, five hundred census clerks.

One computer said the sounds of the machines were "the voice of the archangel, which, it is said, will call the dead to life and summon every human soul to face his final doom."

I think I've heard similar statements made about Apple's yearly iStuff introductions.

Hollerith's company eventually morphed into IBM, whose business remained mostly tabulators and similar machinery till the vacuum tube was harnessed to make digital computers a reality.

As history glided into the twentieth century, computers were mostly women working under male supervisors. Until the US Civil War nearly all had been men, but the war put most of them into the army, so some women were hired. World War I ruptured the multi-country alliance that did the dreadfully time-consuming work of compiling the Nautical Almanac; even more women were hired as computers, since each of the warring countries now had to produce its own (essentially identical) version of this essential book. Mechanical calculators eliminated most of the manual arithmetic.

Much of the story of computing up to 1964 is one of a maturing industry jockeying for funding and waging political battles, large and small, to support various groups. The story gets a little less interesting during these years. An exception is John Atanasoff's tabulator modifications, which used binary numbers and electrical circuits to solve a great range of problems. The machine "punched" cards using a high-voltage discharge that would occasionally set the card aflame. One wonders whether, had he had a little better PR, a software defect wouldn't be named for the bug Grace Hopper's team found in a relay contact, but would instead be called a "conflagration."

In 1973 the courts ruled that ENIAC's patent was invalid due to the prior art embodied in Atanasoff's invention.

Halley Redux

Scientists eagerly awaited the 1986 return of Halley's Comet and used modern electronic computers to predict its perihelion. The best estimate was off by just over five hours, even after applying corrections derived from a 1982 observation of the body. The book is silent on where those errors came from.

I wish Grier's work gave more description of how these complex problems were broken down into the many little steps performed by each person in a room full of human computers. More detail about the early mechanical calculators would also enhance the volume. This is a dense book: lots of small type and minimal white space. Only a techie could love it. It's not at all dry, but, like Henry Petroski's 450-page book about the pencil, you have to be fascinated by history to not nod off.

I found the book wonderful and very enlightening. Though every page uses the word "computer" in the context of humans, even by the end I still found the word a bit jarring when used that way. Did you find the same while reading this review? This is no fault of Grier's; it's that the word has been so altered in meaning that in our world "computer" != "human."

Published June 1, 2010