The best way-- really, the only way-- to sum up David Foster Wallace's Everything and More: A Brief History of ∞ is by quoting a bit from it. This comes from the middle part of the book, after a discussion of Fourier series, in one of the "If You're Interested" digressions from the main discussion:
(IYI There was a similar problem involving Fourier Integrals about which all we have to know is that they're special kinds of 'closed-form' solutions to partial differential equations which, again, Fourier claims work for any arbitrary functions and which do indeed seem to-- work, that is-- being especially good for physics problems. But neither Fourier nor anyone else in the early 1820s can prove that Fourier Integrals work for all f(x)'s, in part because there's still deep confusion in math about how to define the integral... but anyway, the reason we're even mentioning the F. I. problem is that A.-L. Cauchy's work on it leads him to most of the quote-unquote rigorizing of analysis he gets credit for, some of which rigor involves defining the integral as 'the limit of a sum' but most (= most of the rigor) concerns the convergence problems mentioned in (b) and its little Q.E.I. in the --Differential Equations part of E.G.II, specifically as those problems pertain to Fourier Series.)
There's a little footnote just before the closing parenthesis, which reads:
There's really nothing to be done about the preceding sentence except apologize.
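(A gloss of my own, not DFW's, for anyone who wants to see the objects the quote is gesturing at: a Fourier series represents a function on [-π, π] as

\[ f(x) \sim \frac{a_0}{2} + \sum_{n=1}^{\infty}\bigl(a_n \cos nx + b_n \sin nx\bigr), \qquad a_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\cos nx \,dx, \]

with b_n defined analogously using sin, and Cauchy's "limit of a sum" definition of the integral is

\[ \int_a^b f(x)\,dx = \lim_{\max(x_k - x_{k-1}) \to 0} \; \sum_{k=1}^{n} f(x_{k-1})(x_k - x_{k-1}) \]

over partitions a = x_0 < x_1 < ... < x_n = b. The early-1800s fight was over when the series actually converges to f.)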
That's the book in a nutshell. It's a breathless survey of several thousand years of mathematical history, replete with footnotes, asides, and quirky little abbreviations ("Q.E.I." is a "Quick Embedded Interpolation," and "E.G.II" is "Emergency Glossary II"). The quoted paragraph is admittedly an extreme example, but if that style makes you want to run screaming, don't pick this book up.
On the other hand, if it makes you say, "Hmmmm.... That's a unique approach to a math text...," then get this and read it, because the whole thing is like that, only better.
The book (or "booklet," as he refers to it throughout, which I suppose he's entitled to do, as he's best known as a writer of thousand-page novels) is a really interesting stylistic exercise. It's a densely argued survey of mathematics, full of forward and backward references ("as we will see in §7" and "recall from §3(f)," respectively), but the entire thing is written in a headlong sort of rush to suggest that it's being improvised in one lengthy typing session. There are even little asides containing phrases like "if we haven't already mentioned it, this would be a good place to note that..." It's a remarkable piece of work, and does a good job of conveying a sense of excitement regarding some pretty abstruse mathematical issues.
The other fascinating thing about it, for a popular science work, is just how much it focuses on the math. There's a three-page (or so) biographical interlude about Georg Cantor, and there's a smattering of references to the more melodramatic aspects of Cantor's career, but those remain firmly in the background. This is in stark contrast to Richard Reeves's book on Rutherford, part of the same Great Discoveries series of books, in which the scientific aspects are subordinate to the biography.
This is a very math-y book, and quite daunting in some places. If you can handle Wallace's writing style, though (personally, I love it), the math shouldn't be too much of a challenge. And the discussion of the math of the infinite is really outstanding.
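To give one concrete taste (my sketch, not a passage from the book) of the centerpiece, Cantor's diagonal argument shows that the reals strictly outnumber the counting numbers:

\[ |\mathbb{N}| = \aleph_0 < 2^{\aleph_0} = |\mathbb{R}|. \]

Given any purported list x_1, x_2, x_3, ... of all the reals in (0, 1), build y by making its nth decimal digit differ from the nth digit of x_n (avoiding the digits 0 and 9, to dodge the 0.4999... = 0.5000... ambiguity). Then y is in (0, 1) but differs from every x_n, so no list exhausts the reals.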
This isn't a book that will suit all tastes-- far from it-- but if you've read and liked other things by Wallace, it's worth a read. You'll never look at pure math the same way again.
http://ap.google.com/article/ALeqM5gcMD6YE5F4f-YQgiszTunCUrWw6gD9368TQO0 RIP
I'll have to read that book. I bailed out on "Infinite Jest" about halfway through; just too randomly discursive for me, but I love Wallace's shorter works, the articles and essays, etc. He was a brilliant writer.
It's the sort of thing that makes you want to say, "Why can't more science writers write like that instead of being all sensationally biographical or dryly technical?" and then you realize what a dumb question that is, because science writers are doing well if they can write as well as Asimov, never mind Wallace.
I'm currently two-thirds of the way through Consider the Lobster, and loving it. I figured I'd check out some more of his essays and short stories before attempting to scale Infinite Jest, but I think I'll have to read Everything and More before Infinite Jest; it sounds completely up my street (plus, Infinite Jest scares me, so I'm trying to put it off).
Everything and More is easily the best pop-math book I've ever read, and one of the best pop-science books in general. It's certainly one of my favorite books, period. I really can't recommend it highly enough.
I enjoy Wallace's writing, and I applaud the ambition it took to attempt Everything and More. Sadly, there's a lot of wrong math (and probably "not even wrong" math) in it. There's good stuff, too. One can be stimulated by this book and probably learn some things... but don't trust it!
This review mentions a handful of errors in Everything and More, and it's a fun review to read: http://www.ams.org/notices/200406/rev-harris.pdf
But it's written for mathematicians.
I love David Foster Wallace and I love math, but I never picked this one up. I heard that a lot of it was wrong, and I recall that it came out at the same time as another book about infinity that wasn't full of math errors and got much better reviews.
I guess that just means that there's still one more DFW book I haven't read! I'll just try to treat it as a work of quasi-fiction.
My opinion of it is quite the opposite: it's by far the worst piece of science writing I've ever seen. It's full of serious mathematical errors (not just minor technical goofs, but places where the entire argument is simply nonsense) and serious historical errors. The review by Harris that EJ mentioned discusses some of these errors. DFW also just didn't know much about the subject: even when what he's saying is mathematically correct, it's often a really roundabout or confusing way of explaining it.
Basically, it all comes across like a late-night college bullsh*t session in which DFW is trying to impress people with how clever and erudite he is, while hoping nobody notices that he's telling us everything he knows plus making up a little more. It can be fun to read (for those who like his style), but as popular math writing it's basically a failure.
I don't think that's fair at all. The only really serious error is his confusion over just what the continuum hypothesis actually says. Other than that, the book drives its points home in style and without errors that are likely to seriously mislead.
(1) Rudy Rucker has written entertaining, erudite stuff about infinities.
(2) Plenty of books misinterpret Cantor and Gödel. But Doug Hofstadter's "Gödel, Escher, Bach" may still be the most amusing place to start.
The only really serious error is his confusion over just what the continuum hypothesis actually says.
I agree that the business about the continuum hypothesis is the most serious error (he repeatedly states it in a way that is totally incorrect and furthermore directly contradicts results explained elsewhere in the book). However, there are lots of others. I don't have my copy of the book handy, but here's a quote from Harris's review:
The Extreme Value Theorem is used to prove, Zeno be damned, that on any time interval [t_1, t_2] the "time function" [sic] has an absolute minimum t_m which is "mathematically speaking, the very next instant after t_1" (p. 190).
This is total gibberish, and it makes me wonder what sort of editing the book went through. (Did any logician see the text before it was published?) I understand how hard it is to get all the subtle mistakes out of a manuscript, but there's no excuse for writing nonsense.
"the very next instant" makes no sense, I agree, unless you reject the continuum outright and believe the universe to have discrete quantized time (i.e. chronons). Of course, Stephen Wolfram is a leader of the folks who believe just that.