

Alan Turing: Thinking Up Computers

The Cambridge University Mathematician Laid the Foundation for the Invention of Software

BusinessWeek, May 10, 2004

By Andy Reinhardt in Paris

The Great Innovators

As part of its anniversary celebration, BusinessWeek is presenting a series of weekly profiles of the greatest innovators of the past 75 years. Some made their mark in science or technology; others in management, finance, marketing, or government. In late September, 2004, BusinessWeek will publish a special commemorative issue on Innovation.

The rarefied world of early 20th-century mathematics seems light years away from today’s PCs and virtual-reality video games. Yet it was a 1936 paper by Cambridge University mathematician Alan M. Turing that laid the foundation for the electronic wonders now crowding into every corner of modern life. In a short and eventful life, Turing also played a vital role in World War II by helping crack Germany’s secret codes—only to be persecuted later for his homosexuality.

A shy, awkward man born into the British upper middle class in 1912, Turing played a seminal role in the creation of computers. To be sure, many other people contributed, from mathematicians Charles Babbage and Ada Lovelace in the 1830s to Herman Hollerith—whose tabulating company became IBM (IBM)—at the turn of the century. But it was Turing who made the critical conceptual breakthrough, almost as an aside in a paper he wrote while in his 20s. Attempting to resolve a long-standing debate over whether any one method could prove or disprove all mathematical statements, Turing invoked the notion of a “universal machine” that could be given instructions to perform a variety of tasks. Turing spoke of a “machine” only abstractly, as a sequence of steps to be executed. But his realization that the data fed into a system also could function as its directions opened the door to the invention of software. “He is the one who found the underlying reason why an automatic calculating device can do so many things,” says Martin Davis, professor emeritus of computer science at New York University and a visiting scholar at the University of California at Berkeley.
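To make that idea concrete, here is a brief, purely illustrative sketch (in Python, not anything from Turing's paper) of such a universal machine: one generic loop executes whatever transition table it is handed, so the table, which is ordinary data, serves as the machine's program. The function name and the example table, which simply adds one to a binary number, are assumptions chosen only to show the mechanism.

    # A minimal Turing-machine sketch: the "program" is just data (a transition table)
    # handed to one generic execution loop -- the germ of the stored-program idea.
    def run_turing_machine(table, tape, state="start", blank="_", max_steps=10_000):
        """Run any transition table on a tape.

        table maps (state, symbol) -> (new_state, symbol_to_write, move),
        where move is -1 (left) or +1 (right). Execution stops in state "halt".
        """
        cells = dict(enumerate(tape))          # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            state, write, move = table[(state, symbol)]
            cells[head] = write
            head += move
        out = "".join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1))
        return out.strip(blank)

    # Illustrative table only: add 1 to a binary number, working from the right.
    increment = {
        ("start", "0"): ("start", "0", +1),    # scan right to find the end of the number
        ("start", "1"): ("start", "1", +1),
        ("start", "_"): ("carry", "_", -1),    # step back onto the last digit
        ("carry", "1"): ("carry", "0", -1),    # 1 plus carry -> 0, carry moves left
        ("carry", "0"): ("halt",  "1", -1),    # 0 plus carry -> 1, done
        ("carry", "_"): ("halt",  "1", -1),    # carry past the leading digit
    }

    print(run_turing_machine(increment, "1011"))   # prints 1100 (binary 11 + 1 = 12)

Fed a different table, the same loop performs an entirely different task; that interchangeability of data and instructions is the insight Turing made rigorous.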

As basic as Turing’s notion seems today, it was radical in the mid-1930s. But before the first programmable computers were built, Turing got diverted into the war effort. He worked for five years at Bletchley Park, north of London, with dozens of Britain’s brightest minds. Through endless hours and logical deduction, they unraveled the Enigma code used by the Germans to send messages to field commanders and U-boats.

Turing was himself an enigma. He adored maps and chess as a child and survived the brutal boarding school system by withdrawing into eccentricity. Later he found solace in distance running. Turing realized at a young age that he was attracted to other men, but homosexuality was outlawed. So he lived a secret life, torn by inner battles between the mind and body. As long as he was useful to the government, officials overlooked his sexuality, says his biographer, Oxford mathematics research fellow Andrew Hodges. After the war, Turing became more overt in his relationships and was convicted in 1952 of “gross indecency.” He was subjected to injections of female hormones, ostensibly to quell sexual desires, and shunned as a security risk. In 1954, at 41, he died suddenly, almost certainly by suicide from eating a cyanide-laced apple.

Turing didn’t live to see the revolution he unleashed. But he left an enormous legacy. In 1950 he proposed a bold measure for machine intelligence: If a person could hold a typed conversation with “somebody” else, not realizing that a computer was on the other end of the wire, then the machine could be deemed intelligent. Since 1990 an annual contest has sought a computer that can pass this “Turing Test.” Nobody has yet taken the $100,000 purse. Turing would no doubt be delighted that engineers the world over are still trying.

