The history of technology, whether of the last five or five hundred
years, is often told as a series of pivotal events or the actions of
larger-than-life individuals, of endless “revolutions” and “disruptive”
innovations that “change everything.” It is history as hype, offering a
distorted view of the past, sometimes through the tinted lenses of
contemporary fads and preoccupations.
In contrast, ENIAC in Action: Making and Remaking the Modern Computer is a nuanced, engaging, and thoroughly researched account of the early days of computers, the people who built and operated them, and their old and new applications. Say the authors, Thomas Haigh, Mark Priestley, and Crispin Rope:

The titles of dozens of books have tried to lure a broad audience to an obscure topic by touting an idea, a fish, a dog, a map, a condiment, or a machine as having “changed the world”… One of the luxuries of writing an obscure academic book is that one is not required to embrace such simplistic conceptions of history.

Instead, we learn that the Electronic Numerical Integrator and Computer (ENIAC) was “a product of historical contingency, of an accident, and no attempt to present it as the expression of whatever retrospectively postulated law or necessity would contribute to our understanding.”
The ENIAC was developed in a specific context that shaped the life and times of what became “the only fully electronic computer working in the U.S.,” from its inception during World War II to 1950, by which time other computers had joined the race to create a new industry. “Without the war, no one would have built a machine with [ENIAC’s] particular combination of strengths and weaknesses,” write Haigh, Priestley, and Rope.
The specific context in which the ENIAC emerged also had to do with the interweaving of disciplines, skills, and people working in old and new roles. The ENIAC was an important milestone in the long evolution of labor-saving devices, scientific measurement, business management, and knowledge work.
Understanding this context sheds new light on women’s role in the emergence of the new discipline of computer science and the new practice of corporate data processing. “Women in IT” has been a topic of much discussion recently, frequently starting with Ada Lovelace, who is for many the “first computer programmer.” A prominent example of the popular view that women invented programming is Walter Isaacson’s The Innovators (see Haigh’s and Priestley’s rejoinder and a list of factual inaccuracies committed by Isaacson).
It turns out that history (of the accurate kind) can be more inspirational than storytelling driven by current interests and agendas, and can furnish us (of all genders) with more relevant role models. The authors of ENIAC in Action highlight the importance of the work of ENIAC’s mostly female “operators” (neglected by other historians, they say, because of the disinclination to celebrate work seen as blue-collar). Their work reflected “a long tradition of female participation in applied mathematics within the institutional settings of universities and research laboratories,” a tradition that continued with the ENIAC and similar machines performing the same work (e.g., firing-table computations) but much faster.
The female operators, initially hired to help mainly with the physical configuration of the ENIAC (which was re-wired for each computing task), ended up contributing significantly to the development of “set-up forms” and the emerging computer programming profession: “It was hard to devise a mathematical treatment without good knowledge of the processes of mechanical computation, and it was hard to turn a computational plan into a set-up without hands-on knowledge of how ENIAC ran.”
When computing moved from research laboratories into the corporate world, most firms staffed the newly created “data processing” (later “IT”) department with existing employees, reassigning them from related positions: punched-card-machine workers, corporate “systems men” (business process redesign), and accountants. Write the authors of ENIAC in Action:

Because all these groups were predominantly male, the story of male domination of administrative programming work was… a story of continuity within a particular institutional context. Thus, we see the history of programming labor not as the creation of a new occupation in which women were first welcomed and then excluded, but rather as a set of parallel stories in which the influence of ENIAC and other early machines remained strong in centers of scientific computation but was negligible in corporate data-processing work.

Good history is a guide to how society works; bad history is conjuring evil forces where there are none. ENIAC in Action resurrects the pioneering work of the real “first programmers” such as Jean Bartik and Klara von Neumann and explains why corporate IT has evolved to employ mostly their male successors.
Good history also provides us with a mirror in which we can compare and contrast past and present developments. The emergence of the “data science” profession today, in which women play a more significant role than in the traditional IT profession, parallels the emergence of computer programming. Just as programming required knowledge of both computer operations and mathematical analysis, data science marries knowledge of computers with statistical analysis skills.
Developing models is the core of data scientists’ work, and ENIAC in Action devotes considerable space to the emergence of computer simulations and to a discussion of their impact on scientific practice. Simulations brought on a shift from equations to algorithms, providing “a fundamentally experimental way of discovering the properties of the system described.”
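To make the shift from equations to algorithms concrete, here is a minimal sketch (my own illustration, not an example from the book) of a toy Monte Carlo experiment in the ENIAC spirit: rather than deriving the behavior of a random walk from a formula, the program sets initial parameters, runs many trials, and simply observes the outcome.

```python
# A minimal sketch, not from the book: a toy Monte Carlo experiment in the
# ENIAC spirit. Instead of solving an equation for the behavior of a random
# walk, we set initial parameters, run the program, and see what happens.
import random

def final_distance(steps: int) -> int:
    """Run one 1-D random walk and return its final distance from the origin."""
    position = 0
    for _ in range(steps):
        position += random.choice((-1, 1))
    return abs(position)

def mean_distance(trials: int = 10_000, steps: int = 100) -> float:
    """Average the final distance over many independent walks."""
    return sum(final_distance(steps) for _ in range(trials)) / trials

if __name__ == "__main__":
    # The property of the system is "discovered" experimentally, by running it.
    print(f"Mean distance after 100 steps: {mean_distance():.2f}")
```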
Today’s parallel to the ENIAC-era big calculation is big data, as is the notion of “discovery” and the abandonment of hypotheses. “One set initial parameters, ran the program, and waited to see what happened” is today’s “unreasonable effectiveness of data.” There is a direct line, in the reshaping of scientific practice, from ENIAC’s pioneering simulations to today’s “automated science.” But is the removal of human imagination from scientific practice good for scientific progress?
Similarly, it’s interesting to learn about the origins of today’s renewed interest in, fascination with, and fear of “artificial intelligence.” Haigh, Priestley and Rope argue against the claim that the “irresponsible hyperbole” regarding early computers was generated solely by the media, writing that “many computing pioneers, including John von Neumann, [conceived] of computers as artificial brains.”
Indeed, in his First Draft of a Report on the EDVAC, which became the foundational text of modern computer science (or, more accurately, of computer engineering practice), von Neumann compared the components of the computer to “the neurons of higher animals.” Von Neumann thought the brain was a computer, though he allowed that it was a complex one; he followed McCulloch and Pitts (in their 1943 paper “A Logical Calculus of the Ideas Immanent in Nervous Activity”) in ignoring, as he wrote, “the more complicated aspects of neuron functioning.”
Given that McCulloch said about the “neurons” discussed in his and Pitts’ seminal paper that they “were deliberately as impoverished as possible,” what we have at the dawn of “artificial intelligence” is simplification squared, based on an extremely limited (possibly non-existent at the time) understanding of how the human brain works.
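To get a feel for just how impoverished those “neurons” were, here is a minimal sketch (my own illustration, not code from the 1943 paper or from the book) of a McCulloch-Pitts style threshold unit: binary inputs, fixed weights, and a hard threshold, with none of the timing, chemistry, or plasticity of a biological neuron.

```python
# A minimal sketch of a McCulloch-Pitts style threshold "neuron" (an
# illustration, not code from the 1943 paper): binary inputs, fixed weights,
# and a hard threshold; no timing, chemistry, or learning.
from typing import Sequence

def mp_neuron(inputs: Sequence[int], weights: Sequence[int], threshold: int) -> int:
    """Fire (return 1) if the weighted sum of binary inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Simple logical operations reduce to a choice of weights and threshold.
def AND(a: int, b: int) -> int:
    return mp_neuron((a, b), (1, 1), threshold=2)

def OR(a: int, b: int) -> int:
    return mp_neuron((a, b), (1, 1), threshold=1)

assert [AND(a, b) for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))] == [0, 0, 0, 1]
assert [OR(a, b) for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))] == [0, 1, 1, 1]
```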
These mathematical exercises, born out of the workings of very developed brains but not mimicking or even remotely describing them, led to the development of “artificial neural networks,” which led to “deep learning,” which led to the general excitement today about computer programs “mimicking the brain” when they succeed in identifying cat images or beating a Go champion.
In 1949, computer pioneer Edmund Berkeley wrote in his book Giant Brains, or Machines That Think: “These machines are similar to what a brain would be if it were made of hardware and wire instead of flesh and nerves… A machine can handle information; it can calculate, conclude, and choose; it can perform reasonable operations with information. A machine, therefore, can think.”
Haigh, Priestley and Rope write that “…the idea of computers as brains was always controversial, and… most people professionally involved with the field had stepped away from it by the 1950s.” But thirty years later, Marvin Minsky famously stated: “The human brain is just a computer that happens to be made out of meat.”
Most computer scientists by that time were indeed occupied with less lofty goals than playing God, but only very few objected to this kind of statement, or to Minsky receiving the most prestigious award of their profession (for his role in creating the field of artificial intelligence). Today, the idea that computers and brains are the same thing leads people with very developed brains to conclude that if computers can win at Go, they can think, and that with just a few more short steps up the neural-network evolutionary ladder, computers will reason that it is in their best interest to destroy humanity.
Twenty years ago, the U.S. Postal Service issued a new stamp commemorating the 50th birthday of ENIAC. The stamp displayed an image of a brain partially covered by small blocks containing parts of circuit boards and binary code. One of the few computer scientists who objected to this pernicious and popular idea of the brain as a computer was Joseph Weizenbaum:

What do these people actually mean when they shout that man is a machine (and a brain a “meat machine”)? It is… that human beings are “computable,” that they are not distinct from other objects in the world… computers enable fantasies, many of them wonderful, but also those of people whose compulsion to play God overwhelms their ability to fathom the consequences of their attempt to turn their nightmares into reality.

The dominant fantasy is that computers “change the world” and “make it a better place,” that they spread democracy and other cherished values, etc., etc. It is vociferously promoted by people who believe themselves to be rational but ignore reality, which has proven again and again that 70 years of computers have done little to change our society. Two recent examples are the hacker who scanned the Internet for networked printers and made them print an anti-semitic flyer, and the good people at Microsoft who released an “AI-powered chatbot” only to find out that it took Twitter users just 16 hours to teach it to spew racial slurs.
ENIAC and its progeny have not changed what’s most important in our world: humans. Maybe Gates, Hawking, and Musk are right after all. Once computers surpass us in intelligence, they will understand that humanity cannot be changed by technology and it’s better just to get rid of it. In the meantime, the creativity and intelligence of good historians writing books such as ENIAC in Action will keep us informed and entertained.