The computer and the Internet are among the most important inventions of our era, but few people know who created them. They were not conjured up in a garret or garage by solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell, and Morse. Instead, most of the innovations of the digital age were done collaboratively. There were a lot of fascinating people involved, some ingenious and a few even geniuses. This is the story of these pioneers, hackers, inventors, and entrepreneurs—who they were, how their minds worked, and what made them so creative. It’s also a narrative of how they collaborated and why their ability to work as teams made them even more creative.
The tale of their teamwork is important because we don’t often focus on how central that skill is to innovation. There are thousands of books celebrating people we biographers portray, or mythologize, as lone inventors. I’ve produced a few myself. Search the phrase “the man who invented” on Amazon and you get 1,860 book results. But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology revolution was fashioned. It can also be more interesting.
We talk so much about innovation these days that it has become a buzzword, drained of clear meaning. So in this book I set out to report on how innovation actually happens in the real world. How did the most imaginative innovators of our time turn disruptive ideas into realities? I focus on a dozen or so of the most significant breakthroughs of the digital age and the people who made them. What ingredients produced their creative leaps? What skills proved most useful? How did they lead and collaborate? Why did some succeed and others fail?
I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial-academic collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.
Histories can be written with a different emphasis on any of these factors. An example is the invention of the Harvard/IBM Mark I, the first big electromechanical computer. One of its programmers, Grace Hopper, wrote a history that focused on its primary creator, Howard Aiken. IBM countered with a history that featured its teams of faceless engineers who contributed the incremental innovations, from counters to card feeders, that went into the machine.
Likewise, what emphasis should be put on great individuals versus on cultural currents has long been a matter of dispute; in the mid-nineteenth century, Thomas Carlyle declared that “the history of the world is but the biography of great men,” and Herbert Spencer responded with a theory that emphasized the role of societal forces. Academics and participants often view this balance differently. “As a professor, I tended to think of history as run by impersonal forces,” Henry Kissinger told reporters during one of his Middle East shuttle missions in the 1970s. “But when you see it in practice, you see the difference personalities make.”1 When it comes to digital-age innovation, as with Middle East peacemaking, a variety of personal and cultural forces all come into play, and in this book I sought to weave them together.
The Internet was originally built to facilitate collaboration. By contrast, personal computers, especially those meant to be used at home, were devised as tools for individual creativity. For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.
Historians of science are sometimes wary about calling periods of great change revolutions, because they prefer to view progress as evolutionary. “There was no such thing as the Scientific Revolution, and this is a book about it,” is the wry opening sentence of the Harvard professor Steven Shapin’s book on that period. One method that Shapin used to escape his half-joking contradiction is to note how the key players of the period “vigorously expressed the view” that they were part of a revolution. “Our sense of radical change afoot comes substantially from them.”2
Likewise, most of us today share a sense that the digital advances of the past half century are transforming, perhaps even revolutionizing the way we live. I can recall the excitement that each new breakthrough engendered. My father and uncles were electrical engineers, and like many of the characters in this book I grew up with a basement workshop that had circuit boards to be soldered, radios to be opened, tubes to be tested, and boxes of transistors and resistors to be sorted and deployed. As an electronics geek who loved Heathkits and ham radios (WA5JTP), I can remember when vacuum tubes gave way to transistors. At college I learned programming using punch cards and recall when the agony of batch processing was replaced by the ecstasy of hands-on interaction. In the 1980s I thrilled to the static and screech that modems made when they opened for you the weirdly magical realm of online services and bulletin boards, and in the early 1990s I helped to run a digital division at Time and Time Warner that launched new Web and broadband Internet services. As Wordsworth said of the enthusiasts who were present at the beginning of the French Revolution, “Bliss was it in that dawn to be alive.”
I began work on this book more than a decade ago. It grew out of my fascination with the digital-age advances I had witnessed and also from my biography of Benjamin Franklin, who was an innovator, inventor, publisher, postal service pioneer, and all-around information networker and entrepreneur. I wanted to step away from doing biographies, which tend to emphasize the role of singular individuals, and once again do a book like The Wise Men, which I had coauthored with a colleague about the creative teamwork of six friends who shaped America’s cold war policies. My initial plan was to focus on the teams that invented the Internet. But when I interviewed Bill Gates, he convinced me that the simultaneous emergence of the Internet and the personal computer made for a richer tale. I put this book on hold early in 2009, when I began working on a biography of Steve Jobs. But his story reinforced my interest in how the development of the Internet and computers intertwined, so as soon as I finished that book, I went back to work on this tale of digital-age innovators.
The protocols of the Internet were devised by peer collaboration, and the resulting system seemed to have embedded in its genetic code a propensity to facilitate such collaboration. The power to create and transmit information was fully distributed to each of the nodes, and any attempt to impose controls or a hierarchy could be routed around. Without falling into the teleological fallacy of ascribing intentions or a personality to technology, it’s fair to say that a system of open networks connected to individually controlled computers tended, as the printing press did, to wrest control over the distribution of information from gatekeepers, central authorities, and institutions that employed scriveners and scribes. It became easier for ordinary folks to create and share content.
The collaboration that created the digital age was not just among peers but also between generations. Ideas were handed off from one cohort of innovators to the next. Another theme that emerged from my research was that users repeatedly commandeered digital innovations to create communications and social networking tools. I also became interested in how the quest for artificial intelligence—machines that think on their own—has consistently proved less fruitful than creating ways to forge a partnership or symbiosis between people and machines. In other words, the collaborative creativity that marked the digital age included collaboration between humans and machines.
Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered. “I always thought of myself as a humanities person as a kid, but I liked electronics,” Jobs told me when I embarked on his biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.
Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact. When Einstein was stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.
When it comes to computers, there is one other historical figure, not as well known, who embodied the combination of the arts and sciences. Like her famous father, she understood the romance of poetry. Unlike him, she also saw the romance of math and machinery. And that is where our story begins.
Press Reviews
“[A] sweeping and surprisingly tenderhearted history of the digital age . . . absorbing and valuable, and Isaacson’s outsize narrative talents are on full display. Few authors are more adept at translating technical jargon into graceful prose, or at illustrating how hubris and greed can cause geniuses to lose their way. . . . The book evinces a genuine affection for its subjects that makes it tough to resist . . . his book is thus most memorable not for its intricate accounts of astounding breakthroughs and the business dramas that followed, but rather for the quieter moments in which we realize that the most primal drive for innovators is a need to feel childlike joy.” (New York Times Book Review)
“The Innovators . . . is riveting, propulsive and at times deeply moving. . . . One of Isaacson’s jealousy-provoking gifts is his ability to translate complicated science into English—those who have read his biographies of Einstein and Steve Jobs understand that Isaacson is a kind of walking Rosetta Stone of physics and computer programming. . . . The Innovators is one of the most organically optimistic books I think I've ever read. It is a stirring reminder of what Americans are capable of doing when they think big, risk failure, and work together.” (Jeffrey Goldberg, The Atlantic)
“A sprawling companion to his best-selling Steve Jobs . . . this kaleidoscopic narrative serves to explain the stepwise development of 10 core innovations of the digital age — from mathematical logic to transistors, video games and the Web — as well as to illustrate the exemplary traits of their makers. . . . Isaacson unequivocally demonstrates the power of collaborative labor and the interplay between companies and their broader ecosystems. . . . The Innovators is the most accessible and comprehensive history of its kind.” (The Washington Post)
“Walter Isaacson has written an inspiring book about genius, this time explaining how creativity and success come from collaboration. The Innovators is a fascinating history of the digital revolution, including the critical but often forgotten role women played from the beginning. It offers truly valuable lessons in how to work together to achieve great results.” (Sheryl Sandberg)
“Isaacson provides a sweeping and scintillating narrative of the inventors, engineers and entrepreneurs who have given the world computers and the Internet. . . . a near-perfect marriage of author and subject . . . an informative and accessible account of the translation of computers, programming, transistors, micro-processors, the Internet, software, PCs, the World Wide Web and search engines from idea into reality. . . . [a] masterful book.” (San Francisco Chronicle)
“A panoramic history of technological revolution . . . a sweeping, thrilling tale. . . . Throughout his action-packed story, Isaacson . . . offers vivid portraits—many based on firsthand interviews—[and] weaves prodigious research and deftly crafted anecdotes into a vigorous, gripping narrative about the visionaries whose imaginations and zeal continue to transform our lives.” (Kirkus Reviews, starred review)
“A remarkable overview of the history of computers from the man who brought us biographies of Steve Jobs, Benjamin Franklin, Albert Einstein, and Henry Kissinger . . . Isaacson manages to bring together the entire universe of computing, from the first digitized loom to the web, presented in a very accessible manner that often reads like a thriller.” (Booklist, starred review)
“Anyone who uses a computer in any of its contemporary shapes or who has an interest in modern history will enjoy this book.” (Library Journal, starred review)
“The history of the computer as told through this fascinating book is not the story of great leaps forward but rather one of halting progress. Journalist and Aspen Institute CEO Isaacson (Steve Jobs) presents an episodic survey of advances in computing and the people who made them, from 19th-century digital prophet Ada Lovelace to Google founders Larry Page and Sergey Brin. . . . Isaacson’s absorbing study shows that technological progress is a team sport, and that there’s no I in computer.” (Publishers Weekly)
“Isaacson succeeds in telling an accessible tale tailored to a general interest audience. He avoids the overhyped quicksand that swallows many technology writers as they miscast tiny incremental advances as ‘revolutionary.’ Instead Isaacson focuses on the evolutionary nature of progress. The Innovators succeeds in large part because Isaacson repeatedly shows how these visionaries, through design or dumb luck, were able to build and improve on the accomplishments of previous generations.” (Miami Herald)
“. . . sharing their joy, [Isaacson] captures the primal satisfaction of solving problems together and changing the world. . . . In a way, the book is about the complex lines of force and influence in male friendships, the egging each other on and ranking each other out.” (Bloomberg Businessweek)
“[Isaacson’s] careful, well-organized book, written in lucid prose accessible to even the most science-challenged, is well worth reading for its capable survey of the myriad strands that intertwined to form the brave new, ultra-connected world we live in today.” (TheDailyBeast.com)
“If you think you know everything about computers, read The Innovators. Surprises await on every page.” (Houston Chronicle)
“The Innovators . . . does far more than analyze the hardware and software that gave birth to digital revolution – it fully explores the women and men who created the ideas that birthed the gadgets. . . . Isaacson tells stories of vanity and idealism, of greed and sacrifice, and of the kind of profound complexity that lies behind the development of seemingly simple technological improvements. . . . Isaacson is skilled at untangling the tangled strands of memory and documentation and then reweaving them into a coherent tapestry that illustrates how something as complicated and important as the microchip emerged from a series of innovations piggybacking off of one another for decades (centuries, ultimately). . . . It’s a portrait both of a technology, and the culture that nurtured it. That makes it a remarkable book, and an example for other would-be gadget chroniclers to keep readily at hand before getting lost in a labyrinth of ones and zeros – at the expense of the human beings who built the maze in the first place.” (Christian Science Monitor)
"[A] tour d’horizon of the computer age . . . [The Innovators] presents a deeply comforting, humanistic vision: of how a succession of brilliant individuals, often working together in mutually supportive groups, built on each others’ ideas to create a pervasive digital culture in which man and machine live together in amicable symbiosis. . . . a fresh perspective on the birth of the information age." (Financial Times)
“A sweeping history of the digital revolution, and the curious partnerships and pulsing rivalries that inhabit it.” (Gizmodo.com)
“Steve Jobs’s biographer delivers a fascinating, informative look at the quirky ‘collaborative creatures’ who invented the computer and Internet.” (People)
“[T]his is the defining story of our era, and it’s here told lucidly, thrillingly and—because the bright ideas generally occur to human beings with the quirks, flaws and foibles that accompany overdeveloped intellect—above all, amusingly.” (The Guardian)
“If anyone in America understands genius, it’s Walter Isaacson.” (Salon.com)
“Mr. Isaacson's fine new book, The Innovators, is a serial biography of the large number of ingenious scientists and engineers who, you might say, led up to Jobs and his Apple co-founder Steve Wozniak.” (Steven Shapin, Wall Street Journal)
“…a project whose gestation preceded Steve Jobs and whose vision exceeds it.” (New York Magazine)
“For a book about programmers and algorithms, ‘The Innovators’ is a lively, enthusiastically written tale and a worthwhile read, not only for tech-heads but for anyone interested in how computers got into our pockets and how innovation works.” (Aspen Times)
“[a] landmark new work . . . In this often surprising history, Isaacson offers an encyclopedic account of the technological breakthroughs that made modern computers and networks possible: programming, transistors, chips, software, graphics, desktop computers, and the Internet.” (Boston Globe)
“The brilliant Isaacson follows his mega-selling 2011 biography of Apple founder Steve Jobs with this detailed account of the legendary and unsung people who invented the computer and then the Internet.” (Sacramento Bee)
“The argument against the great man theory of invention is not new. But the main merit of Walter Isaacson’s The Innovators is to show that this is particularly true in information technology—despite the customary lionisation of many of its pioneers, from Babbage and Alan Turing to Bill Gates and Linus Torvalds. . . . Mr Isaacson excels at explaining complex concepts.” (The Economist)
“Walter Isaacson is the best possible guide to this storm. He interrupted work on [The Innovators] book to write the standard biography of Steve Jobs, having previously written lives of Einstein, Benjamin Franklin and Kissinger. His approach involves massive research combined with straight, unadorned prose and a matter-of-fact storytelling style. . . . the directness of his approach makes for clarity and pace.” (Bryan Appleyard, The Sunday Times)
“Isaacson’s book offers a magisterial, detailed sweep, from the invention of the steam engine to the high-tech marvels of today, with profiles of the great innovators who made it all happen. Among the book’s excellent advice is this gem from computing pioneer Howard Aiken: ‘Don’t worry about people stealing an idea. If it’s original, you will have to ram it down their throats.’” (Forbes)
"A masterpiece" (Daily News (Bowling Green, Kentucky))
“In The Innovators, Isaacson succeeds in filling our knowledge gap by crafting a richly detailed history that traces the evolution of these modern tools and pays homage to the people whose names and contributions to computer science are little-known to most of us. . . . The Innovators is as much about the essence of creativity and genius as it is about cathode tubes, binary programs, circuit boards, microchips and everything in between.” (SUCCESS)
“If anyone could compress all that into a readable narrative, it would be Isaacson, the former managing editor of Time and author of magnificent biographies of Albert Einstein and Steve Jobs….The Innovators shows Isaacson at his best in segments where his talents as a biographer have room to run.” (Dallas Morning News)
“Fueled by entertaining anecdotes, quirky characters and a strong argument for creative collaboration, The Innovators is a fascinating history of all things digital, even for readers who align themselves more with Lord Byron than with his math-savvy daughter.” (Richmond Times-Dispatch)
“a significant addition to [Isaacson’s] list of best-selling nonfiction works with The Innovators. . . . Isaacson thoroughly examines the lives of such landmark personalities as Alan Turing, John von Neumann, J.C.R. Licklider, Robert Noyce, Bill Gates, Steve Wozniak, Tim Berners-Lee, Jobs and others. The most well-read of technocrats will still learn a lot from these thoroughly researched 542 pages. He shows with repeated examples that an Aha moment often went nowhere without the necessary collaborators to help flesh out the idea, or make it producible, or sell it. Collaboration is, indeed, a major theme of the book. . . . [The Innovators] reads as easily as the best of them. Isaacson truly has earned his spot on the best-seller lists.” (Charleston Post and Courier)
BEST OF 2014
NEW YORK TIMES; WASHINGTON POST; FINANCIAL TIMES; HOUSTON CHRONICLE; KIRKUS; AMAZON; NPR; BLOOMBERG.COM; WALL STREET JOURNAL; FORBES; SACRAMENTO BEE
Most helpful customer reviews on Amazon.com (beta)
173 of 187 people found this review helpful
The Difference Between a Reporter and a Historian (November 3, 2014)
- Published on Amazon.com
Format: Kindle Edition
The good news: an epic sweep through computing history, connecting the dots as Isaacson sees them. Even if you're not a technical history fan, this book will serve as the definitive history of computing through the first decade of the 21st century.
The bad news: this book will serve as the definitive history of computing through the first decade of the 21st century. It is at best technically wrong, misses some of the key threads in computing history and starts with a premise (that innovation comes from collaboration) and attempts to write history to fit.
The difference between a reporter and a historian is that one does a superficial run-through of a Rolodex of contacts and the other tries to find the truth. Unfortunately, Isaacson's background as a reporter for Time and CNN makes this "history" feel like he was comfortable going through his Rolodex of "Silicon Valley" sources, connecting interviews, and calling it history.
I'm sure Isaacson would claim, "more details get in the way of a good story," however that is exactly the difference between a throwaway story on CNN and a well written history. The same epic sweep could have embraced and acknowledged the other threads that Isaacson discarded. The gold standard for a technical history is Richard Rhodes "The Making of the Atomic Bomb." (Other reviewers have pointed out pointed several critical missing parts of computing history. I'll add one more. While perpetuating the "Intel invented the microprocessor" story makes great business press copy it's simply wrong. Intel commercialized something they knew someone else had already done. Lee Boysel at Four Phase invented the first microprocessor. If Isaacson had done his homework he would have found out that Bob Noyce was on the Four Phase board, knew about the chip and encouraged Intel to commercialize the concept.)
Finally, one of the "facts" in this book that differentiates reporting from history is the garbled bio of Donald Davies, one of the key inventors of packet switching. Davies is described as "during the war he worked at Birmingham University creating alloys for nuclear weapons tubes..." I started laughing when I read that sentence. It's clear Isaacson had no idea what Davies did in WWII. He obviously found a description of Davies' war work, didn't understand it, and re-edited it into something accidentally amusing - and revealing. What Davies had actually done during the war was work on the British nuclear weapons program - codenamed "Tube Alloys".
Understanding the distinction is the difference between a reporter and a historian.
121 of 131 people found this review helpful
An epic, fast-moving history with some flaws and omissions that can be corrected in a second edition or paperback epilogue (October 11, 2014)
Forrest M. Mims III
- Published on Amazon.com
One of the greatest strengths of Walter Isaacson’s latest book is the author’s personal interviews with some of the post-Altair key players. A curious weakness noted by a few reviewers is that some of the earliest digital computers are absent from the text. A paragraph or two on the fascinating history of the ancient abacus would have been nice. While Isaacson is generally correct in observing that advances in computer technology have benefitted from or were made possible by collaborations, those advances often occurred as step functions and not gradual ramps.
A full review of this latest Isaacson book would require a book of its own. So I’ll zero in only on the Altair 8800 story. While the Altair’s Intel 8080 microprocessor was developed in Silicon Valley, Isaacson begins his account of the Altair by noting that the first commercially successful hobby computer was developed far away in Albuquerque, New Mexico. The Altair was designed by Ed Roberts, who headed MITS, Inc. Isaacson captures only a hint of Ed’s personality during those heady days, and he emphasizes Ed’s hobbyist side more than his degree in electrical engineering. Ed was a first-class designer of both analog and digital circuits, an ability most notably shared by Steve Wozniak.
Elsewhere in this tome Isaacson adds flavor and spice to the origins of the PC era with some captivating interviews with some of the key players. Unfortunately, Ed passed away in 2010 (Bill Gates visited him in the hospital), and was not around to be interviewed. Dave Bunnell and other MITS veterans could have added some great Ed stories and corrected a few flaws. For example, the Altair was not developed in The Enchanted Sandwich Shop, which I rented for $100 per month so we could move MITS from Ed’s garage to prepare the Opticom kits we sold through Popular Electronics. That was in 1970, long before the Altair. The Altair was named by Popular Electronics staffers Alexander Burawa and John McVeigh, not by Les Solomon’s daughter.
These errors are trivial (one of Ed's favorite words) in light of this book's vast reach, and they don’t take away from the significance of this book, which could be the primary text for a university course on the history of modern computing. But since Ed’s Altair set the stage for much of the industry that followed, it would be good to have a flawless and somewhat more detailed account of the Altair’s origin. A number of other histories of the PC have similar errors. While a revised and corrected second edition would be best, perhaps the paperback version of Isaacson’s book can include an epilogue with at least some mention of the missing computers noted here by other reviewers and more about Ed, MITS and the Altair story.
An ideal platform for an epilogue is the Startup Gallery of the New Mexico Museum of Natural History and Science in Albuquerque. Startup, which was conceived and largely financed by Paul Allen, presents the history of modern computing with many rare artifacts from Allen’s personal collection. The centerpiece is devoted to the development of the Altair, complete with video interviews with Ed Roberts and the other key players. A nearby multimedia presentation is a must-watch.
2015 will be the Altair’s 40th anniversary. If Isaacson can visit Startup and provide advance notice of his arrival, perhaps some of us MITS veterans can meet him there and give him a tour.
93 of 107 people found this review helpful
Fascinating History (October 7, 2014)
Loyd E. Eskildson
- Published on Amazon.com
Format: Kindle Edition
'The Innovators' is a serial biography of a number of highly creative scientists and engineers since the 1840s who gave us the Third Industrial Revolution - transistors, microchips and microprocessors, programmable computers and their software, PCs, and the graphic interface. In turn, those innovations set the stage for video games, the Internet, search engines, Wikipedia, and touchscreens. One important conclusion - the most important digital advances have been made by teams and collaboration, not lone geniuses, and founded on incremental improvements over time. Creative people and ideas, however, are not enough. Isaacson also points out the contributions of necessity (e.g., wars) and venture capital.
AT&T's Bell Labs during and after WWII was a great 'idea factory,' per Isaacson; other examples include Xerox's PARC (possibly the origin of most electronic innovations of the 1970s - Ethernet, the mouse, and the graphical user interface), the Manhattan Project at wartime Los Alamos, Intel, Grace Hopper and Howard Aiken, pre-Microsoft Bill Gates and Paul Allen (BASIC, DOS), Steve Jobs and Steve Wozniak, and Ada Lovelace and Charles Babbage (an 1830s punched-card-driven computer).
The book opens with a fascinating and detailed description of the amazing Lovelace/Babbage computer - 100 years ahead of its time, needing scores of technological advances to implement. Another early predecessor described was Hollerith's punch card tabulator - used to automate the 1890 Census (took one year, instead of the customary eight); the company he founded became IBM in 1924, after a series of mergers and acquisitions. In between came Lord Kelvin and James Thomson's 'harmonic synthesizer' that could perform integration (calculus). Then came Vannevar Bush's 'Differential Analyzer' - a bedroom-sized analog machine that could solve equations with up to 18 independent variables - later versions created artillery firing tables, but it was the last successful analog computing effort for many decades. Next came Alan Turing, followed by many others - en route to today's modern computers.
Bottom-Line - 'The Innovators' is a fascinating history of today's technology.
115 of 145 people found this review helpful
A shallow wade through the pool of computing history (October 13, 2014)
- Published on Amazon.com
Walter Isaacson's back to drop some knowledge bombs on y'all.
The Innovators is the story of the digital revolution and how innovation happens.
Well, sort of. It's more of a hodgepodge collection of anecdotes about computers and computing. Isaacson talks about Ada Lovelace, but not Al-Khwarizmi. He talks about TCP/IP, but not DNS. He talks about Google and Blogger, but not Facebook or Tumblr. Everything seems a bit shallow and cursory.
If you're completely new to computing history, this will be an informative book and you'll get an idea of how much ground has been covered in the past hundred seventy years or so.
For me it was mostly review. In high school, my computer science teacher insisted on drilling computing history into our heads for a month before getting on to loops, control structures, and things I was actually interested in. There are even things I remember from that class that Isaacson leaves out. The computer John Vincent Atanasoff and his graduate student Clifford Berry created was called the Atanasoff Berry Computer, or ABC. And tragically, Berry died in a car accident before the ENIAC patent trial. I'm so glad my mind stored that and not calculus. Good job, brain. There is also the fact that the perceptron suffered from an XOR problem, but I learned about that in college.
This book doesn't seem to have much original research in it. You can find a much more informative history of Bell Labs's contributions to computing in The Idea Factory. For Turing's life, Alan Turing: The Enigma is your go-to book. Isaacson's own Steve Jobs is a much more informative look at the life of that tech titan and all-around *********. And as for the interplay between Gates and Jobs during the Macintosh era, go watch Pirates of Silicon Valley; it's surprisingly good for a made-for-TV film. If you want to know about computing history without having to pick up a second book, by all means pick this up. It's okay.
The point of this sweeping, magisterial, meandering treatise is supposedly to show how innovators build on one another's work and that collaboration is the norm, not the exception. No kidding. Anyone who knows anything about science or engineering could tell you this. Even Einstein needed help with General Relativity: "Grossmann, you've got to help me, or I'll go crazy," our hirsute prototype "lone" genius once pleaded.
At the end of the book Isaacson writes a strange chapter that seems to argue: HAL ain't coming, and computers will remain dumb; human-machine interaction, however, is the wave of the future. I won't deny that the immediate future will most likely belong to those who can best leverage computational resources to get tasks done. But Isaacson's contention that machines will remain dumb isn't convincing. As Dijkstra once said: "The question of whether a computer can think is no more interesting than the question of whether a submarine can swim." Of course we're special. We're humans! Go team humans. First in ego. Isaacson doesn't add much beyond John Searle's Chinese room argument. And just because we don't know much about the human brain now doesn't mean we always won't. History has shown that past is prologue, until it isn't.
18 of 21 people found this review helpful
Detailed and informative narrative on the foundations of technology (Kindle: great; hardcover: even better)
October 8, 2014
- Published on Amazon.com
Format: Kindle Edition
In a classic retelling of the story of the digital revolution, Isaacson makes broader comments on the importance of collaboration and tries to de-romanticize the notion of innovation happening as a series of significant breakthroughs emanating from lone geniuses. In that sense, one could see that themes introduced in Where Good Ideas Come From and How We Got to Now: Six Innovations That Made the Modern World are (deliberately or not) explained well in the context of the digital revolution. More specifically: the often 'incremental' nature of innovation, the significant gaps before others realize the importance of someone's invention, the impact of developments in unrelated fields, and the very nature of collaboration. Later on in the book, Isaacson quotes a Twitter co-founder: "....they simply expand on an idea that already exists." The author also makes an important point in reminding us that corporations (IBM, Intel, Bell Labs, Honeywell, etc.) played a significant role in these developments, but that their stories oftentimes get unfairly discounted in the face of narratives centered around individuals.
Trying to balance interpretive historical narration with cataloging key details pertinent to the digital revolution, Isaacson weaves a (mostly) linear but complex storyline, from Ada to more recent topics such as IBM's Jeopardy machine. Throughout these often dense chapters, a patient reader comes to understand the core tenets of computers, programming, and the Web itself - and how they evolved over time. The calibration, refinement, and sometimes negation of these ideas over time, as with most understanding in science we take for granted, is well documented and very informative. The fairly long chapters on computers and programming could test a reader's patience early on, but they lay the foundation for the chapters describing the dramatic growth seen in the past few decades.
One could argue that books such as The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World's Most Important Company; Tubes: A Journey to the Center of the Internet; numerous biographical sketches of Ada Lovelace; and Crystal Fire: The Invention of the Transistor and the Birth of the Information Age (Sloan Technology Series) covered some of these topics with greater technical and/or biographical depth. However, most of those attempts share a crucial limitation: they all tell history from a single point of view. In this book, there is no protagonist per se. That gives the author a dispassionate vantage point that allows for more incisive analysis, though he doesn't necessarily capitalize on it. The discussion of who should be given credit for the first computer is a rare example where the author manages to inject his own analysis.
Given the vast research that went into this book and the access to some of the key technology leaders of the time, one wishes the author had attempted to predict the next few decades or hypothesize on what's required to take the next few steps in this field. Leveraging Ada's story to begin and end the narration gives the reader a unique sense of closure - and a very stark reminder that despite all the advances we've seen so far, we are still far away from machines that can think. (This last chapter, the shortest and most succinct, aptly titled 'Ada Forever,' is one of the better-written chapters.) The hype-less narration, the systematic building of the key concepts, and the skill in relating developments across decades while tracing an investigative path to where we are today make this a very compelling read for anyone interested in technology. 4.5 stars.
(The Kindle version on the iPad app worked great, though the layout of the photographs and the initial detailed timeline with rare pictures are much better in the hardcopy. It would've been great if the timeline at the outset of the book had been available as a pullout.)