
Turing's Cathedral: The Origins of the Digital Universe (English) Hardcover – March 6, 2012

3.0 out of 5 stars · 1 customer review

Formats and editions (10 available):
- Kindle Edition
- Hardcover, March 6, 2012: EUR 30.42 new, EUR 6.14 used



Product description

Excerpt

Preface
 
POINT SOURCE SOLUTION
 
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
 
 
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
 
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
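As a quick check of the arithmetic in that last sentence (the machine's dimensions are from the text above; the icon dimensions below are a typical modern example, not a figure from the book):

```python
# The IAS machine's storage, as described above: a 32 x 32 x 40-bit matrix.
bits = 32 * 32 * 40            # 40,960 bits
kilobytes = bits / 8 / 1024    # = 5.0, i.e. the "five kilobytes of storage"

# For comparison, an assumed modern icon: 64 x 64 pixels at 32 bits per pixel.
icon_bits = 64 * 64 * 32       # 131,072 bits -- over 3x the entire machine

print(bits, kilobytes, icon_bits)   # 40960 5.0 131072
```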
 
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
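That collapsed distinction is easiest to see in miniature. Below is a toy stored-program machine with a hypothetical instruction set (not the IAS machine's): instructions and data occupy one shared memory, so the same cell can be fetched as an instruction or overwritten as data, and code can rewrite code.

```python
# A minimal sketch of the stored-program idea. Instructions and data
# live in the same memory; whether a value "means" or "does" something
# depends only on how it is fetched.
memory = [
    ("LOAD", 5),               # 0: load memory[5] into the accumulator
    ("ADD", 6),                # 1: add memory[6] to the accumulator
    ("STORE", 5),              # 2: write the accumulator back to memory[5]
    ("POKE", 1, ("ADD", 5)),   # 3: overwrite instruction 1 -- code editing code
    ("HALT",),                 # 4: stop
    2,                         # 5: data
    3,                         # 6: data
]

acc, pc = 0, 0
while True:
    op = memory[pc]            # the fetch decides that this cell "does" something
    pc += 1
    if op[0] == "LOAD":
        acc = memory[op[1]]
    elif op[0] == "ADD":
        acc += memory[op[1]]
    elif op[0] == "STORE":
        memory[op[1]] = acc
    elif op[0] == "POKE":
        memory[op[1]] = op[2]  # instructions are just values in memory
    elif op[0] == "HALT":
        break

print(memory[5])               # 5: the machine computed 2 + 3 and stored it
```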
 
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
 
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
 
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
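The point-source idealization has a famous closed-form consequence, derived independently by G. I. Taylor and L. I. Sedov as well as von Neumann: by dimensional analysis, the yield E, the ambient air density ρ, and the elapsed time t admit only one combination with units of length, which fixes how the shock front grows. This is the standard textbook scaling, stated here for orientation rather than quoted from the 1947 report:

```latex
% Point-source (Sedov--Taylor--von Neumann) blast-wave scaling:
% shock-front radius R at time t after energy E is released
% into a medium of ambient density \rho.
R(t) \;\approx\; \xi \left( \frac{E\,t^{2}}{\rho} \right)^{1/5},
\qquad \xi = O(1) \ \text{(a dimensionless constant near unity)}
```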
 
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
 
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
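For concreteness, here is a minimal sketch of the one-dimensional model: a hypothetical three-rule machine that increments a binary number written least-significant-bit first on the tape. Note that simulating the sequential tape with a random-access dict already illustrates the point: von Neumann's addressable matrix subsumes Turing's tape.

```python
# A minimal Turing-style machine: a 1-D tape of symbols, a head position,
# and a state table. This one increments a binary number written on the
# tape least significant bit first.
from collections import defaultdict

tape = defaultdict(lambda: "_")          # unbounded one-dimensional tape
for i, bit in enumerate("110"):          # the number 3 (binary 011, LSB first)
    tape[i] = bit

head, state = 0, "carry"
rules = {  # (state, symbol) -> (write, move, next_state)
    ("carry", "1"): ("0", +1, "carry"),  # 1 + carry -> 0, propagate the carry
    ("carry", "0"): ("1", 0, "done"),    # 0 + carry -> 1, finished
    ("carry", "_"): ("1", 0, "done"),    # extend the number past its old end
}

while state != "done":
    write, move, state = rules[(state, tape[head])]
    tape[head] = write
    head += move

print("".join(tape[i] for i in sorted(tape)))  # 001: 3 + 1 = 4, LSB first
```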
 
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
 
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
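The proof behind that claim fits in a few lines. Below is the standard diagonal argument phrased as a Python paradox; the function `halts` is hypothetical, which is exactly the point.

```python
# Turing's diagonal argument as a paradox. Suppose some total, correct
# function halts(f) could report whether calling f() eventually halts.

def halts(f):
    ...  # hypothetical oracle -- the argument shows it cannot be written

def trouble():
    if halts(trouble):   # ask the oracle about this very program...
        while True:      # ...then do the opposite of its answer
            pass
    # otherwise: halt immediately

# If halts(trouble) returns True, trouble() loops forever; if it returns
# False, trouble() halts. Either answer is wrong, so no such halts() exists,
# and no systematic inspection can tell, in general, what a code will do.
```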
 
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.

Press reviews

“An expansive narrative . . . The book brims with unexpected detail. Maybe the bomb (or the specter of the machines) affected everyone. Gödel believed his food was poisoned and starved himself to death. Turing, persecuted for his homosexuality, actually did die of poisoning, perhaps by biting a cyanide-laced apple. Less well known is the tragic end of Klári von Neumann, a depressive Jewish socialite who became one of the world’s first machine-language programmers and enacted the grandest suicide of the lot, downing cocktails before walking into the Pacific surf in a black dress with fur cuffs. Dyson’s well made sentences are worthy of these operatic contradictions . . . A groundbreaking history of the Princeton computer.”
—William Poundstone, The New York Times Book Review

“Dyson combines his prodigious skills as a historian and writer with his privileged position within the [Institute for Advanced Study’s] history to present a vivid account of the digital computer project . . .  A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
—Cory Doctorow, Boing Boing
 
“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ’50s . . . It demonstrates that the power of human thought often precedes determination and creativity in the birth of world-changing technology . . . An important work.”
—Richard DiDio, Philadelphia Inquirer
 
“Dyson’s book is not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
—Josh Rothman, Brainiac blog, Boston Globe
 
“Beyond the importance of this book as a contribution to the history of science, as a generalist I was struck by Dyson’s eye and ear for the delightfully entertaining detail . . . Turing’s Cathedral is suffused . . . with moments of insight, quirk and hilarity rendering it more than just a great book about science. It’s a great book, period.”
—Douglas Bell, The Globe and Mail
 
“The greatest strength of Turing’s Cathedral lies in its luscious wealth of anecdotal details about von Neumann and his band of scientific geniuses at IAS.  Dyson himself is the son of Freeman Dyson, one of America’s greatest twentieth-century physicists and an IAS member from 1948 onward, and so Turing’s Cathedral is, in part, Dyson’s attempt to make both moral and intellectual sense of his father’s glittering and yet severely compromised scientific generation.”
—Andrew Keen, B&N Review

“A mesmerizing tale brilliantly told . . . . The use of wonderful quotes and pithy sketches of the brilliant cast of characters further enriches the text . . . . Meticulously researched and packed with not just technological details, but sociopolitical and cultural details as well—the definitive history of the computer.”
—Kirkus (starred review)
 
“The most powerful technology of the last century was not the atomic bomb, but software—and both were invented by the same folks. Even as they were inventing it, the original geniuses imagined almost everything software has become since. At long last, George Dyson delivers the untold story of software’s creation. It is an amazing tale brilliantly deciphered.”
—Kevin Kelly, cofounder of WIRED magazine, author of What Technology Wants
 
“It is a joy to read George Dyson’s revelation of the very human story of the invention of the electronic computer, which he tells with wit, authority, and insight. Read Turing’s Cathedral as both the origin story of our digital universe and as a perceptive glimpse into its future.”
—W. Daniel Hillis, inventor of The Connection Machine, author of The Pattern on the Stone





Customer reviews

3.0 out of 5 stars (1 customer review)
5 stars: 0
4 stars: 0
3 stars: 1
2 stars: 0
1 star: 0

Top customer reviews

Format: Paperback | Verified Purchase
This is a detailed history, but one in which anecdote takes up a great deal of space relative to the history of the ideas and the problems at stake. It was hardly necessary, for example, to know the Princeton room numbers of the various protagonists.

Most helpful customer reviews on Amazon.com (may include reviews from the Early Reviewer Rewards program)

Amazon.com: 3.9 out of 5 stars · 153 reviews
3 of 3 people found this review helpful
5.0 out of 5 stars · Breathtaking in scope, depth, and originality · May 6, 2016
By Michael J. Edelman - Published on Amazon.com
Format: Paperback | Verified Purchase
The early history of computing is usually presented in a simple linear fashion: Atanasoff, Mauchly and Eckert, Turing and the Enigma project, von Neumann, and the postwar explosion. That's the way I learned it in college in the 70s, and the way just about every book presents it. It's correct, as far as it goes, but it leaves out a tremendous amount of richness and detail that George Dyson relates in this book. His narrative consists of over a dozen parallel, interrelated stories, each concentrating on one person or project and on how it relates to the overall narrative. The story begins with the history of Princeton, New Jersey, and the two men most responsible for the creation of the Institute for Advanced Study: Abraham Flexner and Oswald Veblen, nephew of economist Thorstein Veblen. Flexner and the younger Veblen shared a vision of creating a place in which the world's greatest thinkers, able to interact freely and freed from the mundane obligations of teaching and practical applications, would advance the world's knowledge on an unprecedented scale. In so doing they inadvertently created one of the era's greatest centers for applied research into computing.

Turing and von Neumann make their appearances here, of course, along with Mauchly, Eckert, Oppenheimer, Ulam, Freeman Dyson (the author's father), and other notables of the era. But Dyson also tells the story of a number of pioneers and contributors to the design, construction, and most of all the theory of computation who have been overlooked by history. Most remarkable, perhaps, is Nils Barricelli, who could justifiably be called the founder of computational biology. Working in the early 1950s with a computer having less computational power and memory than a modern-day sewing machine, he created a one-dimensional artificial universe in order to explore the relative power of mutation and symbiosis in the evolution of organisms. His work led to a number of original discoveries and conclusions that would only be rediscovered or proposed decades later, such as the notion that genes originated as independent organisms, like viruses, that combined to create more complex organisms.

There's an entire chapter on a vacuum tube, the lowly 6J6, a dual triode created during the war that combined several elements necessary for the creation of a large-scale computer: simplicity, ruggedness, and economy. It fulfilled one of von Neumann's guiding principles for the IAS machine: don't invent anything. That is, don't waste time inventing where solutions already exist. Because of its relative unreliability and wide production tolerances, it also helped stimulate a critical line of research: how to create reliable systems from unreliable components, something more important now than ever in this era of microprocessors and memory chips with millions and even billions of components on a chip.

The chapter on Alan Turing is particularly good, covering as it does much of his work that has been neglected in biographies and presenting a much more accurate description of his contributions to computational science. The great importance of his conceptual computer, the "Turing machine," is not, as is commonly stated in popular works, that it can perform the work of any other computer. It is that it demonstrated how any possible computing machine can be represented as a number, and vice versa. This allowed him to construct a proof that there exist uncomputable numbers, and programs for which it cannot be determined a priori whether they will eventually halt. This was strongly related to Gödel's work on the completeness of formal systems and to Hilbert's broader program in the foundations of mathematics.

What makes this a particularly exceptional book is the manner in which Dyson connects the stories of the individuals involved in the birth of electronic computing with the science itself. He does an exceptional job of explaining difficult topics like Gödel incompleteness, the problem of separating noise from data, and the notion of computability in a way that the intelligent reader who may not have advanced math skills will understand. More importantly, he understands the material well enough to know which are the critical concepts and accomplishments of these pioneers of computing, and he doesn't fall into the trap of repeating the errors of far too many popular science writers. The result is a thoroughly original, accurate, and tremendously enjoyable history. Strongly recommended to anyone curious about the origins of computers and, more importantly, the science of computing itself.
5 of 5 people found this review helpful
5.0 out of 5 stars · The origins of my work environment · March 10, 2016
By James A. Lewis - Published on Amazon.com
Format: Kindle Edition | Verified Purchase
I entered the digital computer world as an enlisted cryptographic technician in 1964. By that date the most meaningful events described in this book had transpired. I spent my working life in the lower orders of data processing: first as a hardware technician, then as an analyst and salesman, and finally as a designer. Reading this book was mesmerizing because it revealed where and how my workplace originated. As a student of history, though, I was thrilled by our country's cultural robbery of Europe's finest intellectuals at the time that both we and the dictators needed them most. Our luck and the barbarians' stupidity.
1 of 1 people found this review helpful
5.0 out of 5 stars · Good coverage of an important period in the history of computing · May 12, 2017
By Jack Murray - Published on Amazon.com
Format: Kindle Edition | Verified Purchase
This is an excellent introduction to the early years of computers as seen from the vantage point of the Institute for Advanced Study in Princeton and the fascinating people who worked there during the thirties and World War II. The participation of these figures, and others, in the development of the atom bomb is examined with more than a hint of the crucial conflict between secrecy and open sharing, between the commercial (e.g., patents) and computers as vehicles for pure research or free public use. George Dyson is a clear and gifted writer, commands the fields he treats, and has an inside track on the Institute, given that he grew up there with his celebrated father, Freeman Dyson. Note, though, that Turing is a background figure in the book, though of course a vital one.
5.0 out of 5 stars · Making a Minor Effort to Provide This Book the Recognition it Deserves · May 6, 2017
By Aran Joseph Canes - Published on Amazon.com
Format: Paperback | Verified Purchase
I read this book several years ago, but I feel it didn't get all of the attention it deserved. Dyson manages to weave together the stories of the birth of the computer, the beginning of the Institute for Advanced Study and the personalities of some of the leading scientists of the 20th century all in one compelling narrative.

The prose is engaging but written for fellow scientists or, at least, the scientifically literate. Because Dyson chose to write the book above the level of popular science, it didn't go viral in the way of, say, Blink by Malcolm Gladwell.

However, even though I read this book several years ago, I can say I've rarely had the pleasure of reading a finer work since. Highly recommended for a select kind of reader.
2 of 2 people found this review helpful
5.0 out of 5 stars · The Birth of the Computer · February 12, 2017
By Amazon Customer - Published on Amazon.com
Format: Paperback | Verified Purchase
Computers are now so omnipresent it is difficult to imagine how they first came into being. Read this book to discover the people, institutions, and historical forces that were present at the birth of the first computers. As one might expect, the story is full of remarkably odd people, and unexpected twists and turns.