At the age of 13 I got my first computer: a custom-built gaming rig with a brand new dual core processor. For the first time in my life I didn’t have to share a computer with anybody else in my family — I could use it exactly how I wanted and, more importantly, I could give it whatever name I chose.
Naming a household appliance may seem like a cute affectation — only marginally better than giving a name to your favourite rock or comfiest underwear — but it is a necessary part of networking. Without a hostname, computers cannot identify themselves to each other to share information. The principle is somewhat abstracted on the Internet, but essentially, the www in “www.tychosnose.com” is the hostname of our web server.
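You can see this in action from any scripting language. Here is a small sketch in Python (standard library only) that asks the operating system for the machine’s own hostname and resolves a public hostname to the numeric address machines actually use:

```python
import socket

# Ask the operating system for this machine's hostname: the name it
# uses to identify itself to other computers on the network.
name = socket.gethostname()
print(name)

# Resolving a hostname maps the human-friendly name to a numeric
# address. Wrapped in try/except in case the name doesn't resolve
# (e.g. no network connection).
try:
    print(socket.gethostbyname("www.tychosnose.com"))
except socket.gaierror:
    print("name could not be resolved")
```

On my lovelace, the first line would print `lovelace`; yours will print whatever name you (or your operating system’s installer) chose.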
In honour of the grandfather of computing I named my first computer babbage, so it was only natural that my first laptop – one of those laptops for teenagers, obtained several years later – should be called lovelace (read on if you don’t know why). After babbage’s retirement, however, all my devices have been named after pioneering women in computer science and related fields.
The ‘middle’ Tuesday in October is now known as Ada Lovelace Day, an opportunity to celebrate the achievements of women in science and technology. As part of the celebrations, today I would like to share the stories of four amazing women my computers are named after.
lovelace

The computer: Sony VAIO E-series, dual-booting Windows 7 and Manjaro Linux. My main workhorse and gaming machine.
The person: Ada Lovelace was the only legitimate child of Lord Byron and his wife Anne Isabella. Her gifted mother had been tutored by a Cambridge professor as a child, and especially took to mathematics, an interest which she passed down to Ada — some say as an attempt to prevent her becoming a poet like her notorious father.
Despite this, Ada considered herself a ‘poetical scientist’, intersecting ideas from literature and philosophy with science and maths. At the age of seventeen she met Charles Babbage, then Lucasian Professor of Mathematics at Cambridge (an esteemed title previously held by Isaac Newton and more recently by Paul Dirac and Stephen Hawking), who was working to build the Analytical Engine, a vast programmable calculating machine that was never completed.
Impressed by her mathematical ability, Babbage began collaborating with Ada soon after their first meeting. When the Italian engineer Luigi Menabrea, later to become Prime Minister of Italy, wrote a paper based on one of Babbage’s lectures in 1842, Ada was commissioned to translate it into English. Among her extensive annotations was the world’s first computer program — an algorithm for calculating the series of Bernoulli numbers on the Analytical Engine.
Though Babbage saw his engine purely as a calculator, Ada’s poetically scientific approach led her to understand that all data could be represented by numbers. Going so far as to imagine that an engine could even compose music scientifically if pitch and harmony were suitably encoded, she was the first person to realise the full potential of the computer age.
lamarr

The computer: MacBook Pro running OS X Mavericks. Smaller than lovelace, so easier to carry around. Has a Danish keyboard, invaluable for writing that elusive Ångström symbol.
The person: Hedy Lamarr is the only non-computer scientist on this list, instead holding the glamorous titles of inventor and Hollywood actor. Born in 1914 as Hedwig Eva Maria Kiesler and raised in Vienna, she began acting as a teenager. At nineteen, she married munitions dealer Friedrich Mandl, an abusive husband who tried to halt her acting career and kept her from leaving their castle home, where he had entertained such eminent guests as Benito Mussolini.
Accompanying her husband to technical meetings exposed her to discussions with scientists and engineers, and inspired her to teach herself about military technology. She eventually escaped from Mandl by disguising herself as a maid, going first to Paris and then to London where she met film producer Louis B. Mayer (of Metro-Goldwyn-Mayer fame), who made her one of Hollywood’s greatest stars.
In Hollywood she met avant-garde composer George Antheil, who scored many films in the 1930s and 40s. Antheil was an eccentric who considered himself an amateur expert on female endocrinology, and Lamarr initially sought his advice on how to improve her appearance, but the pair instead found a common interest in military technology.
At the time, torpedoes could be guided by radio control, but were easily jammed if a strong competing signal was broadcast on the same frequency. In 1941 Lamarr and Antheil submitted a joint patent for a jam-resistant method of controlling a torpedo: the signal’s frequency was changed continually according to a pre-defined pattern, so that disrupting it without prior knowledge of the pattern would require a huge amount of power.
Though not the first people to devise a frequency-hopping technique, they were the first to suggest its use for radio control, and even developed a practical implementation: Antheil suggested the use of player-piano rolls to encode a pattern across 88 frequencies — the number of keys on a typical piano.
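The idea lends itself to a toy simulation (my own sketch, not the patent’s mechanism): transmitter and receiver derive the same hop sequence from a shared secret, like two copies of the same piano roll, so the receiver always knows which of the 88 channels to listen on, while a jammer parked on a single channel hits only a tiny fraction of the transmissions.

```python
import random

CHANNELS = 88  # one per piano key, as in Lamarr and Antheil's scheme

def hop_sequence(seed, length):
    """The shared 'piano roll': a hop sequence both parties derive
    from a common seed. A jammer without the seed cannot predict
    where the signal will go next."""
    rng = random.Random(seed)
    return [rng.randrange(CHANNELS) for _ in range(length)]

sender = hop_sequence(seed=42, length=1000)
receiver = hop_sequence(seed=42, length=1000)
assert sender == receiver  # both ends agree on every hop

# A jammer stuck on one channel only disrupts the rare time slots
# when the hop sequence happens to land on it: on average about
# 1000/88, or roughly 1% of transmissions.
jammed_channel = 40
disrupted = sum(1 for ch in sender if ch == jammed_channel)
print(f"{disrupted} of 1000 transmissions jammed")
```

The names and seed here are illustrative; real spread-spectrum systems derive their hop sequences cryptographically rather than from Python’s `random` module.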
Though this was only meant to be one example of how such a control system would work, it became the invention’s downfall — Navy officers imagined trying to fit player piano parts inside torpedoes, and dismissed the idea.
Their invention was finally implemented in 1962 during the US Navy’s blockade of Cuba, though this was three years after the patent had expired. Today, frequency-hopping spread-spectrum techniques are used to prevent interference in wireless communications such as Bluetooth.
hopper

The computer: Asus Eee PC 701 running Ubuntu Server; acts as a file server for my music, films and TV shows.
The person: If Babbage and Lovelace were the grandparents of computing, Rear Admiral Grace Hopper was its mother. Born in New York City in 1906, she received a PhD in mathematics from Yale in 1934, and began lecturing at Vassar College, New York. During the Second World War she joined the Navy Reserve, where she worked on the Harvard Mark I, one of the first electromechanical computers and itself heavily inspired by Babbage’s Analytical Engine.
Hopper’s most groundbreaking work was on the UNIVAC, or UNIVersal Automatic Computer, a commercially available machine whose first customer was the US Census Bureau. There she proposed the idea that computers could be controlled with human-readable languages. Much to the disbelief of contemporaries, who insisted that computers were only capable of performing mathematical tasks, she wrote the first compiler — a program which translates a human-readable programming language into machine code. Hopper saw computers not just as glorified calculators, but as “machines that assisted the power of the brain rather than muscle.”
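What a compiler does can be illustrated with a deliberately tiny example (my own sketch, far simpler than Hopper’s A-0): translating a human-readable arithmetic expression into instructions for a simple stack machine.

```python
# A toy "compiler": translate human-readable infix arithmetic like
# "3 + 4 * 2" into instructions for a simple stack machine.

PRECEDENCE = {"+": 1, "-": 1, "*": 2, "/": 2}

def compile_expr(source):
    """Shunting-yard algorithm: infix tokens -> stack-machine code."""
    output, operators = [], []
    for token in source.split():
        if token in PRECEDENCE:
            # Emit any pending operators of equal or higher precedence
            while operators and PRECEDENCE[operators[-1]] >= PRECEDENCE[token]:
                output.append(("OP", operators.pop()))
            operators.append(token)
        else:
            output.append(("PUSH", float(token)))
    while operators:
        output.append(("OP", operators.pop()))
    return output

def run(program):
    """The 'machine': execute the compiled instructions."""
    stack = []
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    for kind, value in program:
        if kind == "PUSH":
            stack.append(value)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[value](a, b))
    return stack.pop()

print(run(compile_expr("3 + 4 * 2")))  # 11.0
```

The “source language” here is just arithmetic, but the principle is the same one Hopper fought for: people write something readable, and a program does the drudgery of turning it into machine operations.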
The language that Hopper’s A-0 compiler read was a major influence on COBOL, the COmmon Business-Oriented Language developed in 1959. COBOL became the most popular language for business applications over the following three decades, and although it seems quaintly dated today, the legacy systems still run by organisations such as banks and governments mean that, as of 2012, over 60% of surveyed companies used the language.
Grace Hopper also famously popularised the term “debugging” after finding an early computer bug — in this case, a moth which had become stuck in one of the Mark II’s relays.
wilson

The computer: Raspberry Pi B+, dual-booting Raspbian and OpenELEC. Used as a media centre; streams films and shows from hopper.
The person: Sophie Wilson worked for Acorn Computers, a now-defunct company that played a phenomenal role in British computing history. While still an undergraduate, Wilson designed the Acorn Microcomputer in 1979, based on a computer she had built for herself and an automated cow feeder she had previously designed. Though intended for use in small firms and labs, the Microcomputer’s relatively low cost and compact size meant that it took off with hobbyists for use at home.
Acorn’s biggest break came in 1981, when the BBC launched its Computer Literacy Project, which aimed to get the British public using computers via an educational TV series, a textbook about the BASIC language and, most ambitiously, a powerful but cheap and easy-to-use computer.
Acorn were commissioned to produce this computer, the iconic BBC Micro, which sold over 100 times the expected number (around 1.5 million machines), and raised a generation of programmers. Indeed, even I learnt to program on one of these machines, owned by my primary school – though this was in the late 90s, when it was already a dusty relic.
Sophie Wilson’s contribution reaches far beyond our small island, however. At Acorn she went on to design the instruction set of the ARM processor architecture which, licensed out for production by other companies, is now used in well over 95% of smartphones, as well as in many embedded systems such as digital TVs and other appliances.
One device using an ARM processor made by Broadcom, for whom Wilson now works, is the Raspberry Pi, a credit-card-sized computer that costs around £30. Although aimed at helping schools teach children how to program, it has been taken up as the go-to device for hackish projects, such as household automation and wearable computing.
The aim of the Raspberry Pi is very similar to that of the BBC Micro — to be a cheap and easy-to-use computer that can help everyone learn to program. In honour of this shared goal, and because they contain a processor using her design, it only made sense to name my Pi after Sophie Wilson.
Keir Little is a student and far poorer than his many computers would suggest. He knew about Lovelace way before it was cool, and tweets about next year’s biggest historical stars @diglyme