Turing's Cathedral: The Origins of the Digital Universe (English) Hardcover – March 6, 2012
Product Description
POINT SOURCE SOLUTION
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
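The point-source approximation described above can be made concrete. As a hedged illustration (our numbers and function names, not the report's): the von Neumann / Sedov–Taylor point-source solution says a strong blast of energy E released at a point in a gas of density rho expands to a radius that scales as (E t² / rho)^(1/5), up to a dimensionless constant of order one, which we take as 1 here.

```python
def blast_radius(E, t, rho=1.2):
    """Approximate shock radius (m) at time t (s) for a point release
    of energy E (J) into air of density rho (kg/m^3).
    Point-source scaling: R ~ (E * t^2 / rho) ** (1/5)."""
    return (E * t**2 / rho) ** 0.2

# Roughly a 20-kiloton explosion (~8.4e13 J), 25 ms after detonation:
R = blast_radius(8.4e13, 0.025)   # on the order of 100 m
```

This one-line scaling law is how G. I. Taylor famously estimated the Trinity yield from declassified photographs, which is what made the approximation such a useful predictor of weapons effects.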
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
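Turing's answer to the decision problem can be sketched in a few lines of modern code. The names below (`halts`, `trouble`) are ours, purely for illustration: suppose a decider `halts(prog, arg)` existed that could always tell whether a program halts on a given input; feeding a deliberately contrary program its own source produces a contradiction.

```python
def halts(prog, arg):
    """Hypothetical universal halting decider. Turing's diagonal
    argument shows no such function can exist; we only stub it."""
    raise NotImplementedError("no such decider can exist")

def trouble(prog):
    """Do the opposite of whatever halts() predicts about `prog`
    when run on its own source text."""
    if halts(prog, prog):
        while True:      # predicted to halt, so loop forever
            pass
    return               # predicted to loop, so halt at once

# Running trouble on its own source would force halts() to be wrong
# either way: if it says "halts," trouble loops; if it says "loops,"
# trouble halts. Hence there is no systematic way to tell, by looking
# at a code, what that code will do.
```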
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.
Press Reviews
—William Poundstone, The New York Times Book Review
“Dyson combines his prodigious skills as a historian and writer with his privileged position within the [Institute for Advanced Study’s] history to present a vivid account of the digital computer project . . . A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
—Cory Doctorow, Boing Boing
“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ’50s . . . It demonstrates that the power of human thought often precedes determination and creativity in the birth of world-changing technology . . . An important work.”
—Richard DiDio, Philadelphia Inquirer
“Dyson’s book is not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
—Josh Rothman, Braniac blog, Boston Globe
“Beyond the importance of this book as a contribution to the history of science, as a generalist I was struck by Dyson’s eye and ear for the delightfully entertaining detail . . . Turing’s Cathedral is suffused . . . with moments of insight, quirk and hilarity rendering it more than just a great book about science. It’s a great book, period.”
—Douglas Bell, The Globe and Mail
“The greatest strength of Turing’s Cathedral lies in its luscious wealth of anecdotal details about von Neumann and his band of scientific geniuses at IAS. Dyson himself is the son of Freeman Dyson, one of America’s greatest twentieth-century physicists and an IAS member from 1948 onward, and so Turing’s Cathedral is, in part, Dyson’s attempt to make both moral and intellectual sense of his father’s glittering and yet severely compromised scientific generation.”
—Andrew Keen, B&N Review
“A mesmerizing tale brilliantly told . . . . The use of wonderful quotes and pithy sketches of the brilliant cast of characters further enriches the text . . . . Meticulously researched and packed with not just technological details, but sociopolitical and cultural details as well—the definitive history of the computer.”
—Kirkus (starred review)
“The most powerful technology of the last century was not the atomic bomb, but software—and both were invented by the same folks. Even as they were inventing it, the original geniuses imagined almost everything software has become since. At long last, George Dyson delivers the untold story of software’s creation. It is an amazing tale brilliantly deciphered.”
—Kevin Kelly, cofounder of WIRED magazine, author of What Technology Wants
“It is a joy to read George Dyson’s revelation of the very human story of the invention of the electronic computer, which he tells with wit, authority, and insight. Read Turing’s Cathedral as both the origin story of our digital universe and as a perceptive glimpse into its future.”
—W. Daniel Hillis, inventor of The Connection Machine, author of The Pattern on the Stone
Top Customer Reviews
Turing and von Neumann make their appearances here, of course, along with Mauchly, Eckert, Oppenheimer, Ulam, Freeman Dyson (the author's father), and other notables of the era. But Dyson also tells the story of a number of pioneers and contributors to the design, construction, and above all the theory of computation who have been overlooked by history. Most remarkable, perhaps, is Nils Barricelli, who could justifiably be called the founder of computational biology. Working in the early 1950s with a computer having less computational power and memory than a modern-day sewing machine, he created a one-dimensional artificial universe in order to explore the relative power of mutation and symbiosis in the evolution of organisms. His work led to a number of original discoveries and conclusions that would only be rediscovered or proposed decades later, such as the notion that genes originated as independent organisms, like viruses, that combined to create more complex organisms.
There's an entire chapter on a vacuum tube, the lowly 6J6, a dual triode created during the war that combined several elements necessary for the creation of a large-scale computer: simplicity, ruggedness, and economy. It fulfilled one of von Neumann's guiding principles for the project: don't invent anything. That is, don't waste time inventing where solutions already exist. By the nature of its relative unreliability and wide production tolerances relative to project goals, it also helped stimulate a critical line of research: how to create reliable systems from unreliable components, something more important now than ever in this era of microprocessors and memory chips with millions and even billions of components on a chip.
The chapter on Alan Turing is particularly good, covering as it does much of his work that has been neglected in biographies and presenting a much more accurate description of his contributions to computational science. The great importance of his conceptual computer, the "Turing Machine," is not, as is commonly stated in popular works, that it can perform the work of any other computer. It is that it demonstrated how any possible computing machine can be represented as a number, and vice versa. This allowed him to construct a proof that there exist uncomputable problems, i.e., programs for which it cannot be determined in advance whether they will eventually halt. This was closely related to Gödel's work on the incompleteness of formal systems, and, like Gödel's theorem, it dealt a blow to Hilbert's program of securing complete and decidable foundations for mathematics.
What makes this a particularly exceptional book is the manner in which Dyson connects the stories of the individuals involved in the birth of electronic computing with the science itself. He does an exceptional job of explaining difficult topics like Gödel incompleteness, the problem of separating noise from data, and the notion of computability in a way that an intelligent reader without advanced math skills will understand. More importantly, he understands the material well enough to know which concepts and accomplishments of these pioneers of computing are the critical ones, and doesn't fall into the trap of repeating the errors of far too many popular science writers. The result is a thoroughly original, accurate, and tremendously enjoyable history. Strongly recommended to anyone curious about the origins of computers and, more importantly, the science of computing itself.
The prose is engaging but written for fellow scientists or, at least, the scientifically literate. Because Dyson chose to write the book above the level of popular science, it didn't go viral in the way of, say, Blink by Malcolm Gladwell.
However, even though I read this book several years ago, I can say I've rarely had the pleasure of reading a finer work since. Highly recommended for a select kind of reader.