1. One wall was reserved for Project Cyberfolk, an ambitious effort to track the real-time happiness of the entire Chilean nation in response to decisions made in the op room. Beer built a device that would enable the country’s citizens, from their living rooms, to move a pointer on a voltmeter-like dial that indicated moods ranging from extreme unhappiness to complete bliss. The plan was to connect these devices to a network—it would ride on the existing TV networks—so that the total national happiness at any moment in time could be determined.
     
  2. To convince workers that cybernetics in the service of the command economy could offer the best of socialism, a certain amount of reassurance was in order. In addition to folk music, there were plans for cybernetic-themed murals in the factories, and for instructional cartoons and movies. Mistrust remained. “CHILE RUN BY COMPUTER,” a January, 1973, headline in the Observer announced, shaping the reception of Beer’s plan in Britain.

    At the center of Project Cybersyn (for “cybernetics synergy”) was the Operations Room, where cybernetically sound decisions about the economy were to be made. Those seated in the op room would review critical highlights—helpfully summarized with up and down arrows—from a real-time feed of factory data from around the country.

     
  3. A Guardian article, written in 1964, quoted in “Dinosaur and Co”, about Shirley and the early IT industry:
    “The main qualification is personality…Much of the work is tedious, requiring great attention to detail, and this is where women usually score…Mrs Steve Shirley…has found in computer programming an outlet for her artistic talents in the working out of logical patterns.
    Now retired with a young baby, she has found that computer programming, since it needs only a desk, a head and paper and pencil, is a job that can be done from home between feeding the baby and washing the nappies. She is hoping to interest other retired programmers in joining her work on a freelance basis.”
     
  4. [image]

    (via Not Even Silicon Valley Escapes History - Alexis C. Madrigal - The Atlantic)
     
  5. Rich’s Guide to Santa Clara County’s Silicon Valley in 1983….

    I discovered a copy of this rare book in Berkeley’s library system and realized that it was a fantastic dataset: If I stuck all of the locations onto a map, I could reconstruct the Valley as it was 30 years ago, right before the Japanese manufacturers and the forces of globalization pulled and pushed chip production to East Asia. …
    In our Internet-happy present, it’s easy to forget that up until the mid-1980s, Silicon Valley was an industrial landscape. …
    The Valley was as important a manufacturing center as Detroit or Pittsburgh were. This was the place making the foundational technology of the era… Rich’s Guide, I realized, would let me map this first peak of Silicon Valley

     
  6. In the film, Licklider explains the potential for technologies like the ARPANET to improve the lives of average people by stripping out unnecessary elements like the paper a message is printed on:

    "There isn’t any real need to change things just for the sake of changing, but I tend to believe that things are going to be considerably better for a lot of people when and if we ever get changed over to an essentially electronic base. It’s just fundamental that if one wants to deal with information he ought to deal with the information and not with the paper it’s written on."

     
  7. One million songs were downloaded in the store’s first week, 25 million by the end of 2003, and one billion by February of 2006. iPod sales responded in kind, jumping from under one million in 2003 to over four million in 2004 to a staggering 22.5 million in 2005. By the time iPod sales reached their peak at nearly 55 million in 2008, the iTunes Store had supplanted Best Buy as the number one music retailer in the US. Less than two years later, in February of 2010, iTunes became the number one music retailer on the planet
     
  8. According to “Tubes: Behind the Scenes at the Internet” … 
    "…on New Year’s Day 1983, all the host computers on ARPANET adopted the electronic rules that remain the basic building block of the Internet. In technical terms, they switched their communications protocol, or language, from NCP, or “Network Control Protocol,” to TCP/IP, or “Transmission Control Protocol/Internet Protocol.” …  (The changeover)…kept dozens of system administrators tied to their desks on New Year’s Eve, struggling to make the deadline—leading one to commemorate the ordeal by making I SURVIVED THE TCP/IP TRANSITION buttons. Any node that did not comply was cut off until it did. But once the dust had settled several months later, the result was the computing equivalent of a single international language"
     
  9. At one point in the story, Gulliver encounters a fascinating machine while visiting the Academy of Projectors in the land of Lagado. Gulliver describes the machine, called The Engine: It was twenty feet square, placed in the middle of the room. The superficies was composed of several bits of wood, about the bigness of a die, but some larger than others. They were all linked together by slender wires. These bits of wood were covered, on every square, with paper pasted on them; and on these papers were written all the words of their language, in their several moods, tenses, and declensions; but without any order. This is one of the earliest known mentions in literature of a machine that could be considered a computer, more than a hundred years before Charles Babbage's first calculating-engine designs. The Engine might be seen as a computer, but perhaps it's better thought of as a sort of random-number generator. The machine would create prose and poetry entirely mechanically: its operation involved turning the frame on which all the words of the language hung and having students read them aloud while capturing the results. (A toy sketch of the idea follows below.)
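
    In the spirit of that "random-number generator" reading, here is a minimal, purely illustrative Python sketch (not from the original post; the vocabulary and the crank_engine function are invented for this example) that "turns the frame" and reads off whatever words come up:

        import random

        # A tiny stand-in for the Engine's frame: blocks of wood, each bearing a word.
        # The vocabulary is invented purely for illustration.
        VOCABULARY = [
            "machine", "wisdom", "frame", "paper", "turning", "words",
            "philosophy", "students", "every", "book", "without", "order",
        ]

        def crank_engine(num_words=6, seed=None):
            """Turn the frame once and read off a random arrangement of words."""
            rng = random.Random(seed)
            return " ".join(rng.choice(VOCABULARY) for _ in range(num_words))

        # Three "turns" of the frame, each yielding a scrap of mechanical prose.
        for _ in range(3):
            print(crank_engine())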
     
  10. "(Alan Turing) wanted to invite us to dinner and, thinking us all asleep but having nothing on which to write, was posting through our letter box an invitation scratched on a rhododendron leaf" — there are some lovely anecdotes in this story, as well as lots of detail about Max Newman
     
  11. Real breakthroughs aren’t always immediately identifiable as breakthroughs. Sometimes, they just go on to change the world without anyone knowing it’s going to happen or even talking about it much. Their influence becomes so pervasive that people think of it as unremarkable, not remarkable. Take, for instance, Grid Systems’ Grid Compass 1101, a portable computer which was announced in April, 1982. It wasn’t the first computer designed to be toted. It was just the first one in a briefcase-shaped case with a screen on one half of the interior, a keyboard on the other and a hinge in the middle. It was, in other words, the first computer with a clamshell case–or, to use a more common term, the first laptop.
     
  12. (via Turing and pride in Manchester - Boing Boing)
     
  13. This is a lovely article, not just about his achievements but also about him as a person

     

  14. It would be an exaggeration to say that the British mathematician Alan Turing explained the nature of logical and mathematical reasoning, invented the digital computer, solved the mind-body problem, and saved Western civilization. But it would not be much of an exaggeration. - Steven Pinker
     
  15. I read that Charles Babbage, who invented the first computer, had seen the Turk and that was part of what inspired him.

    Yup. People then didn't know it was an illusion. They thought it was a thinking machine. And Babbage thought: "My god, if they can build a machine that plays chess, I should be able to make a machine that can execute various rational functions."

    So it was later that he built the analytical engine.

    Which was programmed with punch cards. And Jacquard, whose looms were programmed that way, may have also seen the Turk. And that was how computing began.