Computer history is filled with quirky individuals—and Edmund Berkeley, born March 21, 1909, is among the most fascinating.
It’s hard to pinpoint what Berkeley should be most known for: Founding the Association for Computing Machinery (ACM)? (Relations with ACM later became strained when Berkeley—a dedicated pacifist—denounced ACM at its 25th anniversary dinner for the use of computing in the Vietnam War.) Publishing the magazine Computers and People with the pseudonymous associate editor Neil D. Macdonald? (He didn’t want it to be obvious he was producing the publication singlehandedly.) Or developing Simon, a kit computer that could be considered the first PC? (Simon performed simple arithmetic and logic operations and cost $300 to $600 to build.)
Berkeley’s brainy book
Perhaps, though, Berkeley should be best known for his 1949 book Giant Brains, or Machines that Think. This popular book was among the first widely read works to compare computers to “brains,” and it helped fix that metaphor in the public imagination. In it, he predicted some future technological developments:
- “We can foresee the development of machinery that will make it possible to consult information in a library automatically. Suppose that you … wish to look up ways for making biscuits. You will be able to dial into the catalogue machine ‘making biscuits.’ There will be a flutter of movie film in the machine. Soon it will stop, and, in front of you on the screen, will be projected the part of the catalogue which shows the names of three or four books containing recipes for biscuits.”
- “We can even imagine what new machinery for handling information may some day become: a small pocket instrument that we carry around with us, talking to it whenever we need to, and either storing information in it or receiving information from it.”
Berkeley (and Macdonald) passed away March 7, 1988, after a lifetime of helping the public see just how brainy computers could be.