Roots: How Computers Lost Their Jobs
For centuries, assistants to mathematicians performed the grunt work of calculation. These human computers became obsolete when digital machines took over their work in the mid-twentieth century. Thanks to their math skills, many of them became the first programmers. And since women held most of these computing jobs, the dawn of the programming age saw many female computing legends leading the way.
If you wish to understand where technology is going, it is important to take a look back at where we’ve been. In the series Roots, we trace the history of computing: from the very birth of the digital age, through networking, all the way to the current era of smart devices, and beyond. This month, a prologue: the end of human computers.
The job of computer gained popularity in the world of science in the 17th century, mainly among astronomers calculating the movements of planets. This work required some heavy-duty math, and astronomers hired assistants to offload some of the calculations. Because these assistants mainly performed computations, these men (women were not yet allowed in the sciences) were called 'computers'.
This era birthed some of the fundamental concepts essential to computing, such as logarithms. Logarithms were introduced as 'artificial numbers' to replace the laborious multiplications scientists previously had to perform by hand. By expressing a number as a base raised to an exponent, a multiplication could be reduced to a much simpler addition. Assistants (computers) performed many calculations faster with this method than with the earlier technique known as prosthaphaeresis, which used trigonometric identities to quickly determine the product of numbers. Digital computers nowadays fundamentally behave the same way: they do the basic grunt work and execute calculations, just at speeds that are quite impossible for a human to achieve.
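The trick behind the log tables those computers used can be sketched in a few lines of Python (the function name is my own illustration):

```python
import math

def multiply_via_logs(a: float, b: float) -> float:
    """Multiply two positive numbers the way a human computer with a
    log table would: look up the logarithm of each factor, add the two
    logarithms, then look up the antilogarithm of the sum."""
    return math.exp(math.log(a) + math.log(b))

# A tedious multiplication becomes two table look-ups and an addition.
print(multiply_via_logs(123.0, 456.0))  # very close to 123 * 456 = 56088
```

The result carries a tiny rounding error, just as a real table look-up did, but for astronomical work that trade of precision for speed was well worth it.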
Computers shared tables of their calculations with one another to lighten their workloads and re-use results that had already been worked out. Math performed by hand became easier with the arrival of the log-log slide rule in the 19th century. This device let computers work with logarithms of logarithms, so they could perform a wide range of calculations with roots and exponents. Slide rules became increasingly complex, with unwieldy examples at observatories that were dozens of feet in length, or elaborate devices such as Thacher's Calculating Instrument: a cylinder of slide rules that was far more accurate than any conventional straight rule.
Dawn of distributed computing
Around this time – the mid to late 19th century – it also became acceptable, and indeed popular, to hire women for the job. Human computers were sometimes organized into large working groups to tackle complicated tasks by dividing the calculations among the group. In a sense, this was an early form of distributed computing: each unit performed a task in an organized way, and the results were brought together higher up in the organizational hierarchy to achieve a final result.
This distributed process consisted of breaking a large mathematical formula down into smaller pieces and handing those pieces off to computers, who calculated the results for a given input. The original mathematician would then reassemble these partial results into the final result of the larger formula, wrote Edith Law and Luis von Ahn in their book of lectures on Artificial Intelligence and Machine Learning.
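That split-compute-reassemble workflow is essentially what programmers now call map-reduce. A toy sketch in Python (the function names are my own; each call to `worker` stands in for one human computer):

```python
def worker(terms):
    """One human computer: work out the sum of an assigned slice of terms."""
    return sum(terms)

def distribute(terms, n_workers):
    """The supervising mathematician: split the work into pieces,
    hand each piece to a computer, then combine the partial results."""
    chunk = max(1, len(terms) // n_workers)
    pieces = [terms[i:i + chunk] for i in range(0, len(terms), chunk)]
    partials = [worker(piece) for piece in pieces]  # each computer does a piece
    return sum(partials)                            # results reassembled at the top

total = distribute(list(range(1, 101)), n_workers=4)
print(total)  # 5050, the same answer as summing the whole list at once
```

Whether the workers are clerks with slide rules or processes on a cluster, the organizational pattern is identical.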
Examples of this distributed work include the creation of decimal trigonometry tables in the 19th century, estimating the trajectories of comets moving through our solar system, plotting bomb trajectories in the early 20th century, modeling the stock market after the crash of '29 and the subsequent depression of the 1930s, and decrypting intercepts during the Second World War.
Human computers were increasingly assisted by mechanical devices, complex machines built to help crunch vast quantities of numbers. The first tabulation machines and mechanical computers often had the initials AC in their name, which stood for Automatic Computer, to differentiate them from human computers. We will look at mechanical devices like these, Difference Engines, the Bombe, and more in a future story on the age of mechanical computing leading up to electronic computing.
WWII changed the game
During the interbellum and the Second World War, computer jobs were increasingly held by women. Before the war, women had fewer chances to gain a higher education and break into the worlds of science and math. While the war was raging in Europe, the U.S. Navy had a small department of male code breakers to decrypt intercepted messages that might be of interest to the Americans. When the United States was suddenly drawn into this global war, this small department had to quickly scale up to examine intercepted messages from all over the globe, explains Liza Mundy in her book Code Girls.
But the Navy was faced with a personnel problem. In the weeks after the attack on Pearl Harbor, many able-bodied American males volunteered for service and were shipped off to fight in the war. To fill the vacuum the men left behind, the Navy recruited almost exclusively women as the computers decrypting intercepted messages. Navy intelligence bases had rooms filled with computers armed with slide rules, notepads, and pencils, each working out her piece of a decrypt. They would then hand their results off to another station, which would assemble them and do further computations. These would, in turn, be handed off to yet another station that analyzed the intercepts.
The women who did the code-cracking were trained in arts like substitution ciphers, frequency analysis, reverse-engineering super-encipherment, and more, in order to complement their math skills with the specific cryptanalysis skills that would help them decrypt enemy communications. These women were college-educated, which was a rarity until the mid-20th century, and the U.S. Navy (and Army, too) turned to women's colleges to recruit the desperately needed workforce for their growing cryptanalysis departments. Female teachers, another college-educated workforce, were also recruited into these intelligence services.
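Frequency analysis, one of those arts, rests on a simple observation: a substitution cipher disguises letters but not their statistics. In English plaintext, 'E' dominates, so the most frequent letter in the ciphertext is a strong candidate for 'E'. A minimal sketch in Python (the sample ciphertext and function name are illustrative):

```python
from collections import Counter

def letter_frequencies(text):
    """Frequency analysis: the relative frequency of each letter in a text.
    A simple substitution cipher renames letters but preserves these
    frequencies, which is exactly what a cryptanalyst exploits."""
    letters = [c for c in text.upper() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return {c: n / total for c, n in counts.most_common()}

# 'DEFEND THE EAST WALL' enciphered with a Caesar shift of 3.
freqs = letter_frequencies("GHIHQG WKH HDVW ZDOO")
print(max(freqs, key=freqs.get))  # 'H', the shift-3 image of plaintext 'E'
```

Wartime cryptanalysis involved far more layers than this, but tallying letter counts by hand was precisely the kind of grunt work these computing pools performed at scale.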
From human to electronic
During the war, the first electronic computers were under development. The armed conflict gave a boost to these technological advances, as the military tends to invest heavily in technology during wartime. For instance, in 1943 scientists started to design what the press later called the Giant Brain, officially known as the Electronic Numerical Integrator and Computer (ENIAC). They needed computing skills for the tedious job of working out essential algorithms and programming parts of the machine. As there was a war on, and the required skillset resided in the female staff of the Army and Navy, some of these women were recruited to help create their own successors.
The first programmers were often women. Katherine Johnson, whose calculations were crucial to the first U.S. manned spaceflight, started her career as a human computer at NACA, the predecessor to NASA. Trailblazers like Kathleen Booth, Annie Easley, Barbara Paulson, Mavis Batey, and many more have been largely ignored by history books, but these human computers were instrumental not only in winning the Second World War but also in the later space race, ushering in the dawn of silicon-based computing along the way.
End of an era
Even though the electronic computer, which would eventually supplant the need for human computers, arrived in the 1940s, human computers were around until the 1960s and even the 1970s. Slowly, large electronic computers took over many of the tasks of human computing pools, and there was less need for human computing. After the war, codebreakers were sometimes folded back into quiet suburban life – they were prohibited from telling anyone about the nature of their wartime work, and were often listed as having done secretarial work – but some pursued a career in programming, and others went on into different fields of science.
One such area, of great importance to the history of computing, was modeling trajectories, calculating burn times, and the like at NASA's Jet Propulsion Laboratory. This department required many computations in the 1950s, as the US was expanding its efforts to achieve space flight. These human computers were crucial to the success of Friendship 7, the first American orbital spaceflight, piloted by John Glenn. Computers set the stage for the digital revolution that made their own jobs obsolete. Also at NASA, the glass vacuum tubes of electronic computers (unsuitable for space flight) were being replaced by a newfangled technology that might have stayed in the laboratory were it not for the space race: the silicon transistor.