You Are Not A Gadget: A Manifesto (English) Paperback – February 3, 2011
Product Description
The ideas that I hope will not be locked in rest on a philosophical foundation that I sometimes call cybernetic totalism. It applies metaphors from certain strains of computer science to people and the rest of reality. Pragmatic objections to this philosophy are presented.
What Do You Do When the Techies Are Crazier Than the Luddites?
The Singularity is an apocalyptic idea originally proposed by John von Neumann, one of the inventors of digital computation, and elucidated by figures such as Vernor Vinge and Ray Kurzweil.
There are many versions of the fantasy of the Singularity. Here’s the one Marvin Minsky used to tell over the dinner table in the early 1980s: One day soon, maybe twenty or thirty years into the twenty-first century, computers and robots will be able to construct copies of themselves, and these copies will be a little better than the originals because of intelligent software. The second generation of robots will then make a third, but it will take less time, because of the improvements over the first generation.
The process will repeat. Successive generations will be ever smarter and will appear ever faster. People might think they’re in control, until one fine day the rate of robot improvement ramps up so quickly that superintelligent robots will suddenly rule the Earth.
In some versions of the story, the robots are imagined to be microscopic, forming a “gray goo” that eats the Earth; or else the internet itself comes alive and rallies all the net-connected machines into an army to control the affairs of the planet. Humans might then enjoy immortality within virtual reality, because the global brain would be so huge that it would be absolutely easy—a no-brainer, if you will—for it to host all our consciousnesses for eternity.
The coming Singularity is a popular belief in the society of technologists. Singularity books are as common in a computer science department as Rapture images are in an evangelical bookstore.
(Just in case you are not familiar with the Rapture, it is a colorful belief in American evangelical culture about the Christian apocalypse. When I was growing up in rural New Mexico, Rapture paintings would often be found in places like gas stations or hardware stores. They would usually include cars crashing into each other because the virtuous drivers had suddenly disappeared, having been called to heaven just before the onset of hell on Earth. The immensely popular Left Behind novels also describe this scenario.)
There might be some truth to the ideas associated with the Singularity at the very largest scale of reality. It might be true that on some vast cosmic basis, higher and higher forms of consciousness inevitably arise, until the whole universe becomes a brain, or something along those lines. Even at much smaller scales of millions or even thousands of years, it is more exciting to imagine humanity evolving into a more wonderful state than we can presently articulate. The only alternatives would be extinction or stodgy stasis, which would be a little disappointing and sad, so let us hope for transcendence of the human condition, as we now understand it.
The difference between sanity and fanaticism is found in how well the believer can avoid confusing consequential differences in timing. If you believe the Rapture is imminent, fixing the problems of this life might not be your greatest priority. You might even be eager to embrace wars and tolerate poverty and disease in others to bring about the conditions that could prod the Rapture into being. In the same way, if you believe the Singularity is coming soon, you might cease to design technology to serve humans, and prepare instead for the grand events it will bring.
But in either case, the rest of us would never know if you had been right. Technology working well to improve the human condition is detectable, and you can see that possibility portrayed in optimistic science fiction like Star Trek.
The Singularity, however, would involve people dying in the flesh and being uploaded into a computer and remaining conscious, or people simply being annihilated in an imperceptible instant before a new superconsciousness takes over the Earth. The Rapture and the Singularity share one thing in common: they can never be verified by the living.
You Need Culture to Even Perceive Information Technology
Ever more extreme claims are routinely promoted in the new digital climate. Bits are presented as if they were alive, while humans are transient fragments. Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead? The digital hive is growing at the expense of individuality.
Kevin Kelly says that we don’t need authors anymore, that all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book. Wired editor Chris Anderson proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand them better anyway.*
Antihuman rhetoric is fascinating in the same way that self-destruction is fascinating: it offends us, but we cannot look away.
The antihuman approach to computation is one of the most baseless ideas in human history. A computer isn’t even there unless a person experiences it. There will be a warm mass of patterned silicon with electricity coursing through it, but the bits don’t mean anything without a cultured person to interpret them.
This is not solipsism. You can believe that your mind makes up the world, but a bullet will still kill you. A virtual bullet, however, doesn’t even exist unless there is a person to recognize it as a representation of a bullet. Guns are real in a way that computers are not.
Making People Obsolete So That Computers Seem More Advanced
Many of today’s Silicon Valley intellectuals seem to have embraced what used to be speculations as certainties, without the spirit of unbounded curiosity that originally gave rise to them. Ideas that were once tucked away in the obscure world of artificial intelligence labs have gone mainstream in tech culture. The first tenet of this new culture is that all of reality, including humans, is one big information system. That doesn’t mean we are condemned to a meaningless existence. Instead there is a new kind of manifest destiny that provides us with a mission to accomplish. The meaning of life, in this view, is making the digital system we call reality function at ever-higher “levels of description.”
People pretend to know what “levels of description” means, but I doubt anyone really does. A web page is thought to represent a higher level of description than a single letter, while a brain is a higher level than a web page. An increasingly common extension of this notion is that the net as a whole is or soon will be a higher level than a brain. There’s nothing special about the place of humans in this scheme. Computers will soon get so big and fast and the net so rich with information that people will be obsolete, either left behind like the characters in Rapture novels or subsumed into some cyber-superhuman something.
Silicon Valley culture has taken to enshrining this vague idea and spreading it in the way that only technologists can. Since implementation speaks louder than words, ideas can be spread in the designs of software. If you believe the distinction between the roles of people and computers is starting to dissolve, you might express that—as some friends of mine at Microsoft once did—by designing features for a word processor that are supposed to know what you want, such as when you want to start an outline within your document. You might have had the experience of having Microsoft Word suddenly determine, at the wrong moment, that you are creating an indented outline. While I am all for the automation of petty tasks, this is different.
From my point of view, this type of design feature is nonsense, since you end up having to work more than you would otherwise in order to manipulate the software’s expectations of you. The real function of the feature isn’t to make life easier for people. Instead, it promotes a new philosophy: that the computer is evolving into a life-form that can understand people better than people can understand themselves.
Another example is what I call the “race to be most meta.” If a design like Facebook or Twitter depersonalizes people a little bit, then another service like Friendfeed—which may not even exist by the time this book is published—might soon come along to aggregate the previous layers of aggregation, making individual people even more abstract, and the illusion of high-level metaness more celebrated.
Information Doesn’t Deserve to Be Free
“Information wants to be free.” So goes the saying. Stewart Brand, the founder of the Whole Earth Catalog, seems to have said it first.
I say that information doesn’t deserve to be free.
Cybernetic totalists love to think of the stuff as if it were alive and had its own ideas and ambitions. But what if information is inanimate? What if it’s even less than inanimate, a mere artifact of human thought? What if only humans are real, and information is not?
Of course, there is a technical use of the term “information” that refers to something entirely real. This is the kind of information that’s related to entropy. But that fundamental kind of information, which exists independently of the culture of an observer, is not the same as the kind we can put in computers, the kind that supposedly wants to be free.
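For readers who want the technical referent: the observer-independent, entropy-related kind of information is Shannon's, whose textbook definition (standard material, not a formula from this book) is

    H(X) = -\sum_i p_i \log_2 p_i

where p_i is the probability of the i-th symbol. A fair coin toss, for instance, carries H = 1 bit, while a coin that always lands heads carries 0 bits, since nothing is learned by observing it.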
Information is alienated experience.
You can think of culturally decodable information as a potential form of experience, very much as you can think of a brick resting on a ledge as storing potential energy. When the brick is prodded to fall, the energy is revealed. That is only possible because it was lifted into place at some point in the past.
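For the record, the physics behind the brick analogy (my gloss, not Lanier's) is the elementary formula for gravitational potential energy,

    U = mgh

so a 2 kg brick resting on a ledge 1.5 m up stores roughly 2 × 9.8 × 1.5 ≈ 29 J, every joule of which was deposited by whoever lifted the brick into place.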
In the same way, stored information might cause experience to be revealed if it is prodded in the right way. A file on a hard disk does indeed contain information of the kind that objectively exists. The fact that the bits are discernible instead of being scrambled into mush—the way heat scrambles things—is what makes them bits.
But if the bits can potentially mean something to someone, they can only do so if they are experienced. When that happens, a commonality of culture is enacted between the storer and the retriever of the bits. Experience is the only process that can de-alienate information.
Information of the kind that purportedly wants to be free is nothing but a shadow of our own minds, and wants nothing on its own. It will not suffer if it doesn’t get what it wants.
But if you want to make the transition from the old religion, where you hope God will give you an afterlife, to the new religion, where you hope to become immortal by getting uploaded into a computer, then you have to believe information is real and alive. So for you, it will be important to redesign human institutions like art, the economy, and the law to reinforce the perception that information is alive. You demand that the rest of us live in your new conception of a state religion. You need us to deify information to reinforce your faith.
*Chris Anderson, “The End of Theory,” Wired, June 23, 2008 (www.wired.com/science/discoveries/magazine/16-07/pb_theory).
From the Hardcover edition. (This text refers to an out-of-print or unavailable edition of this title.)
Press Reviews
Lucid, powerful and persuasive . . . Necessary reading for anyone interested in how the Web and the software we use every day are reshaping culture and the marketplace (Michiko Kakutani, New York Times)
There is hardly a page that does not contain some fascinating provocation (Guardian)
Mind-bending, exuberant, brilliant (Washington Post)
A pioneer in the development of virtual reality and a Silicon Valley veteran, Mr. Lanier is a digital-world insider concerned with the effect that online collectivism and the current enshrinement of "the wisdom of the crowd" is having on artists, intellectual property rights and the larger social and cultural landscape. In taking on such issues, he's written an illuminating book that is as provocative as it is impassioned. (Michiko Kakutani's Top 10 Books of the Year 2010, New York Times)
In the world of technologists, Jaron Lanier is that rare combination: a pioneer and a skeptic. A legendary computer scientist, he did crucial early work in the field of virtual reality (the phrase is his). But he now recoils at the way Web 2.0 and social media sell us short as human beings, both in our relationships and in our sense of who we are. In purposeful, reasoned steps, always informed by a profound understanding of how software really works, he lays out his vision of where it all went wrong and champions the power of the human brain in an age of ever smarter machines. (Lev Grossman, Time Magazine Top 10 Non-Fiction Books of 2010)
Top Customer Reviews
But Jaron Lanier does not even refer to Marshall McLuhan. And he does not follow that track.
He targets two types of technologists, whom he identifies as “cybernetic totalists” and “digital Maoists.” This community is defined by what it advocates. First of all, they are the open culture community: people who consider that everything has to be on the Internet, that everything on the Internet has to be free of access, economically free so that everyone can get it for nothing, and, what’s more, that everyone can do what they want with what they find and appropriate freely. Jaron Lanier calls the results mashups. These people believe in Creative Commons, a license that is no license at all, one that authorizes anyone who wants to use something in a noncommercial production to do so without in any way contacting the original owner and without leaving any tracks behind. The appropriated “goods” are thus used in all possible ways without anyone really knowing who is responsible for the final products, the aforementioned mashups. Their mascot software is Linux, which is nothing but the old command-line system known as UNIX wrapped in a graphical user interface to make it user-friendly. They are the people of the artificial intelligence lobby, which pretends that it can, or soon will be able to, simulate human intelligence, and that the machine used to simulate that intelligence will itself be intelligent, as if a plane, since it can fly, were a bird. They are the full proponents of web 2.0, the version of the web that enables the circulation of all kinds of products, freely and easily, with social networks developed on top of it. And finally they are characterized by the fact that they want to share and mash up files that have no context, meaning files that cannot be attached to anyone or anything that could claim a property right over them. They are called anti-context file sharers and remashers.
Jaron Lanier takes a strong stance against these people, not in the name of the technology they propose or advocate, but in the name of the deep consequences of these technologies. The whole book is dedicated to that exploration. But he states his objective as early as page 19, where he explains the five reasons why all this is important, all of it amounting to “people defining themselves downward.”
1- “Emphasizing the crowd means deemphasizing the individual in the design of society, and when you ask people not to be people, they revert to bad moblike behaviors. This leads to not only empowered trolls but to a generally unfriendly and unconstructive online world.”
2- “Finance was transformed by computing clouds. Success in finance became increasingly about manipulating the cloud at the expense of sound financial principles.”
3- “There are proposals to transform the conduct of science along similar lines. Scientists would then understand less of what they do.”
4- “Pop culture has entered into a nostalgic malaise. Online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media. It is a culture of reaction without action.”
5- “Spirituality is committing suicide. Consciousness is attempting to will itself out of existence.”
The diagnosis is severe, and the book tries to suggest solutions.
His first question, then, is about how this cloud or web 2.0 technology is changing people. It develops in them a crowd mentality, what he calls a “hive mind” or “noosphere.” The reference to “noosphere” is never exploited, but the term “hive mind” is used extensively and developed into “hive mind thinking,” “hive thinking,” and other expressions of this type. It is a metaphor, and he may not be responsible for it, since it is an old one. But using it for the mentality of people blindly using web 2.0 and cloud technology warps the metaphor out of any proper meaning; what it describes is closer to a herd stampeding wildly across the virtual sky of the Cloud. A hive is a social organization with a very clear and rather rigid hierarchy, each member having one task to do every day, each category of members having one special task to perform, including the queen, who has to feed in order to lay eggs. The hive turns collected pollen into several highly sophisticated products: honey, wax, royal jelly, propolis, and many others. The bees take care of the hive and keep it in perfect shape: any mishap endangers the whole colony or swarm. There is nothing of the sort in the cloud, on the Internet, or in web 2.0. What’s more, bees have a language that enables one bee to tell the others where she has found a good field of flowers. This is a highly symbolic sign and dance language based on extremely objective elements: the sun, angular orientation to the sun, distances, and so on. No one has studied what happens to a bee who could not accurately give that kind of information, or who would endanger the hive and the swarm by reckless actions. That kind of social organization of a beehive’s survival project requires some kind of regulatory authority to deal with trespassers. Hackers are not welcome.
This metaphor is bad, and he would have been well advised to use another one, like herd psychology or crowd psychology. In fact he could even have been ironic, with an expression like Panurge’s sheep, borrowed from Rabelais’s Pantagruel, who himself borrowed it from antiquity, Panurge meaning in Greek “he who can do everything.”
Beyond that, Jaron Lanier insists on the reductionism of this cloud ideology. It forces anonymity and pseudonymity on people, both practices that diminish simple personal humanity. He points out how this ideology, this technology, produces a complete contradiction that its proponents embrace: “It’s the people who make the forum, not the software. Without the software the experience would not exist at all.” (p. 72) The forum is then illusory. He says the software is “flawed.” The point is that everyone knows it is flawed in its very principle of encouraging, even inciting, the use of personae and avatars instead of real identities and pictures, and yet everyone makes do with this software, with this technology. And Jaron Lanier is not entirely clear here, since he advises us not to concentrate on the software, because then we forget the person behind it, the person in the user of the software. If the software is bad, it has to be gotten rid of. But we have to wonder whether this anonymity and pseudonymity is not in some way a positive element. Not for security, of course, since the IP address of a computer can be traced within seconds by any, let’s say, “security authority,” not to speak of hackers and spyware. Some people complain that the Internet enables anyone to say anything without any control. Then what’s the problem? The Internet does not aim at telling only the truth, and what is the truth? Something decided by Parliament or Congress or the United Nations? Some people consider that we are not dealing with real people since they are hiding behind avatars. And then what! Deal with the ideas expressed by these avatars, if they express ideas; otherwise forget them. Jaron Lanier seems to believe that this crowd psychology was invented by the Internet and web 2.0. That is certainly not true. We all know of “bread and circuses” events in all societies in all historical periods, including war episodes staged to satisfy public opinion and popular demand. Some of these mass events could be very grim, like hanging, drawing, and quartering people in England, frying homosexuals in oil in France, impaling people in other countries, and beheading people with swords, as still happens in Saudi Arabia.
He is right when he says cybernetic totalism has failed spiritually by fetishizing objects and objectifying people; behaviorally by undervaluing individuals and overvaluing the crowd; and economically by endangering the economy of all types of expression (music, video, photography, text, etc.) and by permitting highly risky financial schemes that could not be devised before. This cloud reduces the creativity of individuals by erasing any circumstantial, existential, experiential real data from Internet products. Real creativity can only come from the circumstantial, existential, experiential real environment of one real individual who invests all of that environment in his creativity and in his creation. If the Internet and web 2.0 succeed along that line, how long can the world live without creativity? Yet I will express some reservations about this extreme vision. Real creative people are produced by their circumstantial, existential, experiential context. The Internet and the Cloud can be part of this context but cannot erase it. Mozart would always have been Mozart even if he hadn’t died in poverty: he would still have been composing on his deathbed, I guess. The new point is that all those whose creativity is very limited can today “create” and broadcast their “creations,” producing a tremendous inflation in the cultural or musical market. But even if that may harm many professional creators of value, they have to find ways to protect their work and guarantee their survival. That’s called union action. I believe that the proportion of creative artists is not going to go down because of this technology. Plays in theaters, concerts in concert halls, films in cinemas, but also the DVDs of these live shows, are multiplying their audiences, direct live audiences as well as indirect audiences at a distance in space and time. A full reform of the management of the Internet has to be thought through and brought about, but there is no reason to believe creativity is going to be drowned by the mediocre bleating of the herded crowd of the newly Internet-empowered.
Jaron Lanier is conscious of this dimension, and he proposes a humanistic approach to this Cloud technology. The main suggestion is to make all products freely reachable on the Internet, but with the user paying not a flat rate but a rate proportional to the quantity of bits that user has accessed, no matter what they are, including the pictures of his or her sweetheart. On the other hand, that user would receive a payment for all of his or her own bits that have been accessed by other people, including by that same sweetheart. This suggestion should be taken seriously, because the circulation of bits on the Internet would then become a market, and that would bring quality to the top. Though we must not forget, and this will always be true, that not everything that reaches the broadcasting public sphere is good, and not everything that is good reaches the broadcasting public sphere. Thousands of good books have never been published, and thousands of good Mozarts have never been able to perform or become publicly known. Jaron Lanier’s approach, though, requires some reflection on how a creative work is produced, by whom, and at whose cost, and on how that creative production can be encouraged. Subsidize it, encourage its profitable broadcasting, create events where creators can confront one another and an audience, including critics; many other solutions have to be found. Personally, I am far more afraid of the weight of norms, standards, and traditions in professional fields than of the competition from the herd’s mooing and dooking.
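To make the reviewer's summary of the proposal concrete, here is a minimal sketch of how such bit-metering might be modeled in Python. Everything in it (the Ledger class, the per-megabit rate, the user names) is a hypothetical illustration, not anything specified in the book:

    # Hypothetical sketch: users pay in proportion to the bits they access
    # and are paid in proportion to the bits of theirs that others access.
    RATE_PER_MEGABIT = 0.001  # made-up price per megabit, in some currency unit

    class Ledger:
        def __init__(self):
            self.balances = {}  # user -> running balance

        def record_access(self, reader, author, megabits):
            # The reader pays for the bits accessed; the author earns the same amount.
            cost = megabits * RATE_PER_MEGABIT
            self.balances[reader] = self.balances.get(reader, 0.0) - cost
            self.balances[author] = self.balances.get(author, 0.0) + cost

    ledger = Ledger()
    ledger.record_access("alice", "bob", megabits=40)  # alice reads 40 Mb of bob's work
    ledger.record_access("bob", "alice", megabits=5)   # bob reads 5 Mb of alice's
    print(ledger.balances)  # alice owes ~0.035, bob is owed ~0.035

The scheme is symmetric by construction: every payment is someone else's earning, which is exactly what would turn the circulation of bits into a market.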
He insists on another effect of computational technology on knowledge, or let’s say on semantic data. It grinds everything down into small items in order to digitize them. It standardizes the basic units: computationalized music notes do not contain any fuzzy variation; they are pure, but no instrument played by any musician will ever produce pure notes. Considering that the meaning of anything comes from the variations that thing contains, a dog being seen differently by every single person thinking of a dog, this systematic purification and simplification of every item processed digitally produces an enormous loss of meaning. Imagine the twenty-five or so ways Eskimos have of speaking of snow, or Egyptians and Arabs of the sand or the sun. This grinding of everything down into bit-powder destroys the architecture of the original object and its inner hierarchy: it aims at simulating a phenomenon or an object, but a beautiful picture of a rose does not smell like a rose, and it does not prick either. What’s more, all the particular environment attached to that item by the person who carries it is erased and lost.
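The loss the reviewer describes can be illustrated with a toy example of my own (not from the book): quantizing a continuous pitch to the nearest equal-tempered MIDI note discards whatever expressive deviation lay between the grid points.

    import math

    def to_midi_note(freq_hz):
        # Nearest equal-tempered MIDI note number (A4 = 440 Hz = note 69).
        return round(69 + 12 * math.log2(freq_hz / 440.0))

    def from_midi_note(note):
        # Exact equal-tempered frequency of a MIDI note number.
        return 440.0 * 2 ** ((note - 69) / 12)

    original = 443.7                 # a violinist's expressively sharp A
    note = to_midi_note(original)    # 69: the nearest grid point
    restored = from_midi_note(note)  # 440.0 Hz
    print(original - restored)       # ~3.7 Hz of variation, gone

The round trip is lossy in exactly the sense the review describes: the stored note is "pure," and the player's deviation from the grid is unrecoverable.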
That’s when Jaron Lanier tries to cope with language and bring it back into his conception of computationalism. He is no linguist, and he refers to people who are no linguists. To arrive at his own version he has to reject other approaches: first of all, Ray Kurzweil’s Singularity, seen as a newly invented secular religion:
“Those who enter into the theater of computationalism are given all the mental solace that is usually associated with traditional religions. These include consolations for metaphysical yearnings, in the form of the race to climb to ever more “meta” or higher-level states of digital representations, and even a colorful eschatology, in the form of the Singularity. And indeed, through the Singularity a hope of an afterlife is available to the most fervent believers.” (p. 178)
He likewise rejects the approach that takes the inner thing to be the same as the outer thing, which holds that a computer with specialized features is similar to a person and hence is a person. He rejects, of course, the Turing approach, since it is basically a very similar attempt: a machine that cannot be differentiated from a human person in its reactions is as intelligent as that human person, hence is a human person.
It’s when he suggests a realistic approach to computationalism that he gets lost in language.
He starts with Jim Bower and tries to compare olfaction with language. He asserts that both work “from entries in a catalog not from infinitely morphable patterns” (p. 165). He contradicts this assertion for language on page 167: “Only a handful of species, including humans and certain birds, can make a huge and ever-changing variety of sounds.” Of course he speaks here of sounds, where before he spoke of words. That’s just the point: words have been phylogenetically produced from sounds. He misses the articulations of language. He contradicts his first assertion again on page 190: “We can make a wide variety of weird noises through our mouths, spontaneously and as fast as we think. That’s why we are able to use language.” He does not wonder why we can do that: what physiological particularity enables us to do it?
He continues his parallel with olfaction: “the grammar of language is primarily a way of fitting those dictionary words into a larger context. Perhaps the grammar of language is rooted in the grammar of smells.” (p. 165) This is an incautious assertion about linguistic syntax. It negates the various articulations that build the hierarchy of language. Language cannot really be compared with smells. Once again, the grammar of language is an invention of man, produced from scratch by a long and complex phylogenetic process leading from simple isolated sounds to complex discourse.
To crown it all, he compares Tourette syndrome, in which a man or woman uncontrollably produces all kinds of swear words, to the “pheromonic system [that] detects very specific strong odors given off by other animals (usually of the same species) typically related to fear and mating.” (p. 165) First consider the fact that all mammals produce the same hormone for fear, which explains why, in the wild, a man’s fear can be detected by other mammals, which will then go on the offensive: an animal that is afraid attacks, and since the man here is detected as being afraid, hence as about to attack, the best defense is to attack, so the wild animal will attack. Anyone with some experience of the jungle knows that: never be afraid in such a situation if you want any chance to survive. Then I cannot see how he can compare these pheromonic smells, their detection, and a mammal’s reactions to them with swear words. A Tourette patient cannot use swear words he or she has not first heard, then learned, then memorized. Swear words are not instinctive.
At that point we have to say Jaron Lanier is completely off the point concerning language. He does not take into account the phylogeny of language as experienced by Homo Sapiens in concrete conditions; he does not consider the psychogenesis of language as experienced by a child learning it in concrete conditions. He does not know about the hierarchical articulations of language and the immense variations from one family of languages to the next, and within each family. Finally, he does not know about the distinction between “langue,” which represents the infinite expressive potential carried by language, and “discourse,” which is the concrete realization of one expression of one meaning in real conditions.
And yet he is brandishing the essential concept for approaching these problems: neoteny, the fact that human children are born extremely immature, premature, and dependent for a long period of several years. That would have given all his other arguments a power they do not have. Yet he concludes properly, not as the final conclusion of the whole book but as the deduction that follows from this final concept of neoteny, brought up at the end of the book.
Moore’s law (the exponential development rate of hardware) will have to be slowed down or even blocked by the very slow development rate of software, and by the fact that neoteny has a conservative effect, since the younger generation is forced into an ever longer period of training that reproduces and ossifies previous knowledge and know-how. Cultural neoteny is even more drastic, since it leads to Bachelard’s Poetics of Reverie, vastly overused here, since Bachelard wrote in a period when these modern techniques did not exist, when life expectancy was very limited, and when education was only for an elite; but the general idea is correct: “The good includes a numinous imagination, unbounded hope, innocence and sweetness.” (p. 183) On the other side, childhood can also produce what Jaron Lanier identifies with William Golding’s Lord of the Flies: “The bad is more obvious, and includes bullying, voracious irritability, and selfishness.” (p. 183) His conclusion is realistic for once: “The net provides copious examples of both aspects of neoteny.” (p. 183) This constant dichotomy on the Internet (and in fact we should see more than two sides) is what makes it possible for the Internet to be the place where various approaches are confronted with one another, hence a marketplace of some sort, the marketplace of global communication.
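For reference, the exponential rate the reviewer invokes is conventionally written (a standard formulation, not something from the book) as

    N(t) = N_0 \cdot 2^{t/T}

where N(t) is the transistor count at time t and T is the doubling period, historically around two years; the reviewer's point is that no comparable curve exists for software or for human learning.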
If he is right about childhood and youth, we had better start thinking about education and start integrating the Internet and the Cloud into our systematic education efforts, not to moralize, not to demonize, not to advocate the Internet, but to teach children how to use it to their own advantage, along their own motivations, not the teachers’. He sure is right when he says: “Our secret weapon is childhood.” (p. 188) Why the heck did he not start from there and consider the phylogeny of Homo Sapiens and the psychogenesis of all children?
I will overlook his “post-symbolic communication.” Homo Sapiens started on the track to humanity by developing symbolic power and, among other things, by using it to invent language from multiple sounds through a simple process of discriminating items, identifying them, including with names, and classifying them into concepts and conceptual classes. Homo Sapiens could only recognize an item he had already encountered, discriminated, identified, and classified; otherwise he had to start all over again with the item he did not know.
If by any chance Homo Sapiens moved beyond that symbolic power and lost it, he would lose everything, including all the knowledge that was constructed with language. If Jaron Lanier means that man is going to reach a higher level of symbolic power, I entirely agree. The machines developed today by the scientific and technical elite of the world are going to be used by everyone from birth, and even before birth, which will increase their intelligence tremendously. The increased intelligence of the global population will also mean an increased intelligence of the world’s elite. The elite only reflects the level of the surrounding masses.
But Jaron Lanier forgets that Homo Sapiens is still an animal species going through mutations. The point is that there is no longer any natural selection among humans. All those who are different are treated as handicapped or dangerous, and they are kept aside or away. It is high time we started changing our vision and considered the potential of those who are different. Autistic children with Asperger syndrome, for example, seem to have great possibilities, among other things in languages. Daniel Tammet is one example of a successful Asperger savant in foreign languages. It is urgent to consider that childhood is our secret weapon and to really make an effort to identify these new, different people and help them find out their real capabilities and develop them to the best level possible. Right now we might be rejecting the people who represent the future of our species, not its destroyers: those who will bring our intellect and intelligence to an ever higher point and will invent even better machines to serve humanity.
Dr Jacques COULARDEAU
Most Helpful Customer Reviews on Amazon.com
The remainder of the book covers Lanier's other interests, which are many and varied, and include science, virtual reality, ancient musical instruments, music, cephalopods, a particular polygon, art and humanism. The intelligence and grasp of varied subjects in varied disciplines that come across in this book are amazing.
What informs the entire range of his subjects is an awareness and cherishing of that elusive and often ineffable part of life which does not compute -- tenderness, empathy, delight, a sense of magic, a hint of something larger than ourselves. An appreciation for this aspect of our humanity seems so very absent in much of today's culture and, to my mind and to Lanier's, explains in part why so much of it is vacant and derivative.
This unsettling book explores some of the strange conundrums created by our fascination with all things 'web 2.0'. From the way one programmer's convenience becomes the next generation's strait-jacket, to the loss of identity in wiki-based knowledge, the lowering of self-esteem among Facebook-addicted youth, and the 'ideal' of perpetual existence as a stream of electrons in a computer's 'consciousness', this book takes science fiction and roots it deep into the rich manure of common current 'culture'.
The concept that structure and process can speed up adoption and dissemination of new ideas by lowering volatility and improving message targeting is anathema to the proponents of wiki-style freedom. But is the freedom of information necessarily worth the sacrifice of individual expression, attribution and control? Proponents of the hive mind or noosphere would argue that case but Lanier takes an independent stance that values contribution of individuals as individuals, with their personal intelligence, experience and emotion, above the anonymous, often re-edited and variable outputs of agglomerated information mash-ups. It is a brave, but valid, stance and coherently reasoned.
The doctrine of crowd-based wisdom is infiltrating strategy and policy development processes. Whilst involvement is inherently useful, it appears obvious, upon reading this treatise, that there should be clear limits to the way in which crowds are used and scope for individual attributable contributions to retain relevance.
The use of pseudonyms and anonymous postings is definitely supporting the rise of 'Trolls'. Trolls, in cyberspace, are people who are abusive towards other people or ideas. They have been implicated in cyber-bullying, which leaves boards exposed to claims of failure to prevent harassment and/or discrimination. The move towards transparency is greatly hampered when organisations interact online with anonymous respondents.
As Lanier points out, "If you win anonymously no one knows, and if you lose, you just change your pseudonym and start over, without having modified your point of view one bit. If the Troll is anonymous and the target is known then the dynamic is even worse." Any company is at risk of a cyber-storm if their operations, brand or philosophy should offend a tribe of trolls. The case of Nestle and the palm oil debate is a dramatic illustration of this principle in action.
Another of Lanier's bugbears is the principle of 'lock-in', where decisions made in the early stages of development establish constraints on decision-making in the later stages until they become ingrained as 'facts'. Reducing the richness of individual experience to suit the templates of networking sites is a harrowing process to any innovative thinker. Cutting the glissando of music into computer-recognisable notes is anathema to many musicians. Both of these processes have enabled sharing and progress on a scale unparalleled in human history. Both are reducing the expression of future potential by fitting it into a template based on past expedience.
Lanier is one of the leading thinkers of the internet age, and this book has set him apart from, and at odds with, his fellows. It has also provided a necessary space for reflection in our headlong rush to the brave new lands of the internet-fuelled universe. Like the maps of olden times, it would be well to mark the edges of our current knowledge of the internet with signs stating 'Here be Dragons'. They may only be dragons of our own invention, but it is as well to proceed towards them with due caution.
Highly recommended for both fans and sceptics of web 2.0 plus anyone who is still undecided.
Available at amazon.com
* Julie Garland McLellan is a professional non-executive director, board and governance consultant and mentor. She is the author of "Presenting to Boards", "Dilemmas, Dilemmas: Practical Case Studies for Company Directors", "The Director's Dilemma", "All Above Board: Great Governance for the Government Sector" and numerous articles on corporate strategy and governance.
Dilemmas, Dilemmas: Practical Case Studies for Company Directors
Presenting to Boards: Practical Skills for Corporate Presentations (Volume 1)
says, but I think he brings up a number of very important points. The point that resonated strongest is the one about the devaluation of individual creativity. It is certainly true that the work of individuals frequently crystallizes the thinking of society. But there does seem to be a fairly broad movement to devalue the role of individuals in the process of discovery. Lanier argues that ultimately replacing individual creativity with the computationally guided "inventions" of the hive mind is a step in the wrong direction.

Lanier's writing is clear, and should be required reading for regular followers of Wired and similar publications (like myself). It does become clear that those publications, and (if Lanier is right) a large part of Silicon Valley, subscribe to a fairly coherent new philosophy. Lanier calls this set of beliefs a religion, simply because many of the adherents will follow it without question. Whether he is ultimately right on his main points or not, Lanier succeeds in making the reader question assumptions about technology which many consider to be unquestionable truths.