84 of 85 people found this review helpful
- Published on Amazon.com
This was the second-hardest book I ever read. Honestly, it took me years and years to get through it. I even had to buy a 2nd copy, because I kept getting frustrated and throwing the first copy across the room until it was destroyed. So yes, this book requires a substantial effort to read.
But the payback!! I've gotten more return on investment from this book than from any other book I've ever read. If you diligently read and master this book, you will be able to analyze and solve problems your colleagues just can't.
The basic idea behind Kolmogorov complexity is straightforward: a good measure of the complexity of an object is the length of the shortest computer program which will construct that object. From this basic idea an amazing variety of insights and powerful techniques have been developed, and this book is quite comprehensive in cataloging and explaining them.
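Kolmogorov complexity itself is uncomputable, but any general-purpose compressor gives a computable upper bound on it, which is enough to illustrate the idea. Here is a minimal sketch (my own toy example, not from the book), with zlib standing in for "shortest program":

```python
import random
import zlib

def compressed_length(data: bytes) -> int:
    # The length of the zlib-compressed form is a computable upper
    # bound on the (uncomputable) Kolmogorov complexity of `data`.
    return len(zlib.compress(data, 9))

structured = b"ab" * 500  # very regular: a short program generates it

random.seed(0)
rand = bytes(random.getrandbits(8) for _ in range(1000))  # incompressible with high probability

# The regular string compresses to far fewer bytes than the random one,
# mirroring the fact that it has a much shorter describing program.
print(compressed_length(structured), compressed_length(rand))
```

Both inputs are 1000 bytes long, yet the regular one shrinks to a few dozen bytes while the random one barely shrinks at all.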
For computer scientists and working programmers, probably the most useful result of Kolmogorov complexity is the "Incompressibility Method", a powerful technique for analyzing the runtime of algorithms. Typically, it is relatively easy to figure out the best-case or worst-case runtime of an algorithm. Until now, it was hard to calculate the average runtime, because doing so usually involved a tricky counting problem: enumerating all possible runs of the algorithm and summing over them. The incompressibility method eliminates the need for these complicated enumerations by letting you perform the analysis on a single run of the algorithm which is guaranteed to be representative of the average runtime. If you program for a living like I do, this will give you an edge, because if you can accurately predict that the worst-case runtimes almost never happen, you can usually simplify and streamline your programs by optimizing them for the average case. If your competitors are wasting time optimizing for a worst case which almost never happens--at the expense of _not_ optimizing for the average case--you win bigtime.
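The book develops this rigorously; purely to illustrate the worst-case/average-case gap the method lets you exploit, here is a toy experiment (my own, not from the book) counting comparisons made by a naive first-element-pivot quicksort on its worst-case input versus typical random inputs:

```python
import random

def quicksort_comparisons(a):
    """Count the comparisons a simple first-element-pivot quicksort makes."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    less = [x for x in a[1:] if x < pivot]
    rest = [x for x in a[1:] if x >= pivot]
    # len(a) - 1 comparisons to partition, plus the two recursive calls.
    return len(a) - 1 + quicksort_comparisons(less) + quicksort_comparisons(rest)

random.seed(1)
n = 200
worst = quicksort_comparisons(list(range(n)))  # already-sorted input: ~n^2/2
avg = sum(quicksort_comparisons(random.sample(range(n), n))
          for _ in range(50)) / 50             # typical input: ~n log n
print(worst, round(avg))
```

On n = 200 the worst case costs 19900 comparisons, while typical random inputs cost only a small fraction of that; an incompressibility argument proves that almost all inputs behave like the "typical" ones.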
For philosophers of science and AI/knowledge-representation folks, the most useful results of Kolmogorov complexity are probably its contributions to Bayesianism. To be a Bayesian is to follow a two-step process: (STEP 1) for every possible sentence, assign to it a number between 0 and 1 which represents how certain you are that the sentence is true. This initial assignment should be a probability distribution over all possible sentences. It should be a "good" probability distribution, but of course it won't be perfect, since you don't know everything. (STEP 2) when confronted with new evidence, e.g. an observation, update your current "good" degrees of belief using Bayes' law, to yield a new "better" set of degrees of belief.
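Step 2 is mechanical enough to show in a few lines. Here is a minimal sketch (my own toy example, not from the book) of a single Bayes'-law update for one hypothesis:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """One Bayes'-law update: P(H|E) = P(E|H) * P(H) / P(E)."""
    # Total probability of the evidence under both hypotheses.
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

# Hypothesis H: "this coin lands heads 90% of the time."
# Prior belief 0.5; the new evidence E is observing a single head.
posterior = bayes_update(prior=0.5, p_e_given_h=0.9, p_e_given_not_h=0.5)
print(round(posterior, 3))  # 0.45 / (0.45 + 0.25) ≈ 0.643
```

Each new observation feeds the previous posterior back in as the prior; the hard part, as the next paragraph explains, is choosing the initial prior.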
The Bayesians always had a good story for Step 2--just use Bayes' law. But until now, they were mostly hand-waving on Step 1: what would constitute a "good" initial probability distribution? There were many proposals (e.g. maximum entropy), but every proposal had benefits and drawbacks. What Kolmogorov complexity provides is the so-called "universal" distribution, which is guaranteed to be a "good" initial distribution. This book devotes much time to explaining and exploring this, and shows how previous techniques, like maximum entropy, minimum description length, etc., can all be seen as computable approximations to the (unfortunately uncomputable) universal distribution. This really gives a nice framework for evaluating and formulating good prior distributions.
After remarking on how hard this book was to read, I should emphasize that this is not due to bad writing on the part of the authors! Indeed, after throwing the book across the room, I was always drawn back by Li & Vitanyi's most engaging writing style to pick the book back up, dust it off, and have another go at it. If it were not for their wonderful ability to explain a very complicated subject matter, I never would have gotten through it.
An unsung hero of this book is Peter Gacs, who wrote a set of lecture notes which really could be considered an Urtext for this book. If you tackle this book, I highly recommend that you also get ahold of these notes, because when trying to puzzle out a difficult argument, it is sometimes very useful to get another description/explanation of it from a different point of view. These notes are available on the web; just google for "Lecture notes on descriptional complexity and randomness" by Peter Gacs.
If you're up to the challenge, then buy this book, diligently read it, swear at it--then swear by it.