Newton? Einstein? Morons!

Isaac Newton was a total nutjob. Did you know that he tried to pop his own eyeball out with a knitting needle as a part of an experiment? That he nearly blinded himself staring into the sun? That he was an avid alchemist?

Why do we pay so much respect to a person who was clearly mentally IMBALANCED? Why
would anyone take such a total lunatic seriously? It can't be because of science - his
science was a sloppy mess that he had a hard time explaining to anyone else.

The only reason we look on him as such a figure of respect is because we're told to.
Scientists and mathematicians are fascinated by this figure of lunacy, and have placed him on a
pedestal. The rest of us accept what they tell us because they're scientists, right? They
know who was really smart. But is that good science? Or is it just insane hero worship?

The way to tell is to look at the science. Newton's science was a mess - a hodge-podge of
never-before-seen mathematics, mixed with sloppy experiments performed between his alchemical
studies.

Look at Newton's so-called "law of universal gravitation". It ASSUMES that the LAWS OF MATHEMATICS
can accurately describe the LAWS OF NATURE, and that the LAWS OF NATURE are the same everywhere. I
won't go into detail about it, but it should be clear to anyone who actually takes the time to think about it that the whole "law of gravity" is full of basic flaws in both the assumptions and the methodology used to devise this so-called law.

Ok, so I don't really think that, and I'm sure that none of you got past the first six words without being certain that I was up to something writing that. That's the style of attack in a piece recently published in The American Chronicle, which is trying to dispute the validity of
relativity.

Compare what I wrote above with the introduction to the Chronicle piece:

Many notable scientists such as the French mathematician, Henri Poincare rejected Einstein's Theory of
Relativity due to it's lack of sound mathematical procedures, absence of clearness of vision or rigorous
arguments.

It has been noted that often when Einstein gave a public speech, that less than ten percent of the
audience spoke German and out of these only a few were physicists. Even though 99% of the audience didn't
have the slightest idea what he had said in his mysterious presentation, he still got a standing ovation.
Was this good science or just a popular fad? Should populism be the basis for accepting or rejecting a
scientific theory?

We start off with a claim that some great scientist rejected Einstein's theory; and then move on to
a rather doubtful apocryphal story.

In fact, the story of Einstein and Poincaré is a fascinating one, which shouldn't be dismissed
with a handwave. In fact, Poincaré and Einstein were both working on relativity at the
same time. It's Poincaré who worked out the Lorentz transformation. They were publishing their work
on it at the same time. There was a very strong sense of competition between them about the relativity
work; Poincaré refused to cite Einstein; Einstein never cited Poincaré until after Poincaré died,
and even then was very grudging in what he would admit Poincaré had done. Poincaré criticized
Einstein's work on relativity - but he did it not as a critique of relativity, but as a personal critique
of Einstein, whose work on relativity Poincaré believed to be inferior to his own. Not quite the
impression that you'd get from reading the intro to the Chronicle piece, eh? (If you're interested in
the history of this, there's a book which received very good reviews and has more details; I've ordered a copy, but it hasn't arrived yet. It's called Einstein's Clocks, Poincaré's Maps: Empires of Time.)

The story about audiences not understanding Einstein but applauding anyway is highly doubtful.
Since the author provides no citations for it, it's hard to refute. But an important thing to remember
is that before World War II, German was the main language of science. Most scientific work was
published in German. My father spoke German, because when he was in college studying physics,
you were required to take several years of scientific German in order to be able to read
scientific papers. The idea that Einstein was giving lectures in German to scientific audiences who
couldn't understand it is quite strange.

But this introduction is just the least of the silliness in the Chronicle piece. Mr. Williams is
clearly very uncomfortable with some of the ideas of relativity, and he feels that it must be
wrong. The basic idea that there is no such thing as an absolute velocity or a universal time is unacceptable to Mr. Williams. But he can't address the math or physics of relativity - he doesn't understand them well enough to address them in any way. Yet they must be wrong. So how can he make that argument? The same way as crackpots everywhere. What do I always say is the very worst kind of math? No math!

So how does Mr. Williams go about refuting relativity? By building an argument that bans the use of math for the purposes of science! No, really! I'm not joking! According to Mr. Williams, you can't use
mathematical conclusions in science, only physical models, and the physical models must be math-free. Any attempt to combine mathematical and physical models is automatically totally invalid:

MATHEMATICAL MODELS are abstract, idealized, imaginary models which contain
characteristics and assumptions which cannot exist in reality (such as points, lines, triangles, spheres,
etc.) These models can be purely logical, purely mathematical, geometric, kinematic, dynamic or
electromagnetic. All of these models are based on the LAWS OF MATHEMATICS (symbols, equations, formulas
etc.) which only approximate the physical LAWS OF NATURE. These mathematical models produce deductive
conclusions which only apply to the idealized mathematical models.

PHYSICAL MODELS are real models composed of physical objects which are used in empirical experiments. By repeating identical (as far as possible) testing with these models, inductive conclusions can be made. These inductive conclusions, because they are the results of physical experiments, are the fundamental method used by science in an attempt to understand the LAWS OF NATURE. Knowledge about the real Universe must come from physics and not from mathematics. Experience has always been the primary guide in human reasoning concerning physical facts.

METAPHYSICAL MODELS are sophistic models which contain both mathematical characteristics and physical characteristics. These are mingled or mixed models. Neither mathematical deductions nor physical inductive conclusions can be produced from these models because they cannot exist in reality. These are pseudoscience models which result from "thought experiments". These are purely imaginary mental creations. Any conclusions whatsoever that these models produce are completely irrelevant to anything. They cannot produce any valid understanding of reality. A sophist is someone who deceives people based on clever-sounding, but flawed arguments or explanations. However, these fallacious theories do produce popular imaginary science fiction tales about time travel (Back to the Future, motion picture) and spaceships that travel at the speed of light (Star Trek, motion picture).

See, you aren't allowed to mix math and science. They've got to stay absolutely separate. If you use
math to reason out something about the world, even if it makes good predictions, it just doesn't matter,
because by combining the "PHYSICAL MODEL" and the "MATHEMATICAL MODEL", you've created a "METAPHYSICAL MODEL", which cannot produce any kind of valid conclusions.

This is what made me open up this post with the little Newton pastiche. You see, the critique
of Einstein is derived from this nonsense about not mixing math and physics. The same exact
technique can be used to tear down Newton.

Newton looked at the way the world seemed to work, and derived the laws of motion: a
set of mathematical equations that described how forces act on physical bodies. Along the
way, he designed a new set of mathematical tools - which he called fluxions - for the
purposes of performing mathematical analysis of physical phenomena. He then used those same
mathematical methods to try to explain observations of how celestial bodies moved. He used what
Williams would call a metaphysical model: a model that combines physical observations, physical explanations, and mathematical models to produce both an explanation and predictions.

That's how real science works. Math is a formal tool for describing relationships. Underlying
nearly all scientific theories are mathematical constructions that formalize the theory
enough to make it precise and testable. Toss out the math, and you can't even describe
how a wagon gains speed when you push it - any attempt to explain it will be a linguistic explanation of a mathematical relationship. You can say "The wagon will move when I push it; it will gain speed faster if I push harder on it" - but that's just a sloppy and imprecise way of saying "F=ma".
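
Just to show how little is lost going the other way, here's a trivial sketch of the F=ma version of the wagon, with made-up numbers (a 20 kg wagon and two pushes) standing in for real measurements:

```python
# A toy illustration of F = ma: the same wagon, two different pushes.
# The mass and the forces are made-up numbers, purely for illustration.

def acceleration(force_newtons: float, mass_kg: float) -> float:
    """Newton's second law, rearranged: a = F / m."""
    return force_newtons / mass_kg

wagon_mass = 20.0  # kg

for push in (10.0, 40.0):  # a gentle push and a harder push, in newtons
    a = acceleration(push, wagon_mass)
    print(f"push = {push:5.1f} N  ->  the wagon gains speed at {a:.2f} m/s^2")

# Quadruple the force and you quadruple the rate at which the wagon gains
# speed - which is what "it will gain speed faster if I push harder" was
# trying to say, only now it's precise.
```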

As I've written before, mathematical modeling isn't easy. You can't just sit around imagining how things might work, and pick something that seems to make sense. You look at the data gained from observations, and try to find a mathematical model that describes it. You use the model to make predictions, and then validate it by doing experiments that show that the predictions of the model
match reality.
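
To make that loop concrete, here's a schematic sketch of the fit-then-validate cycle, using purely synthetic "observations" (the numbers are invented; only the shape of the process matters):

```python
import numpy as np

# Schematic fit-then-validate loop with synthetic data:
# 1. collect observations, 2. fit a mathematical model,
# 3. use the model to predict something it hasn't seen, 4. check the prediction.

rng = np.random.default_rng(0)

t = np.linspace(0.0, 10.0, 20)                           # conditions we "observed"
observed = 3.0 * t + 1.0 + rng.normal(0.0, 0.5, t.size)  # noisy underlying process

slope, intercept = np.polyfit(t, observed, deg=1)        # the mathematical model

t_new = 15.0                                             # a condition outside the fit
prediction = slope * t_new + intercept
new_measurement = 3.0 * t_new + 1.0 + rng.normal(0.0, 0.5)

print(f"model predicts {prediction:.2f}; new measurement gives {new_measurement:.2f}")
# If predictions keep matching new measurements, the model survives;
# when they stop matching, it's the model that has to change, not reality.
```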

It's also true that even when a model's earlier predictions hold up, new results
derived from the mathematical model need their own validation. At every step of using a model, each time you come up with a new prediction, you need to validate it. That's what we call "science".

That's what Einstein did.

He looked at the information we had about how things worked. And he took some observations of things
that just did not seem to make sense - places where the existing models did not adequately explain
reality. And he tried to see what was wrong with those, and if he could develop a model that described
reality more precisely. That led to relativity.

Mr. Williams goes on to criticize some of the experiments that have been used to validate relativity.
He spends a lot of time on the 1919 eclipse, in which measurements of light deflection appeared to confirm
the predictions of relativity. It's come to light in the years since the experiment that not all of the
data from that set of observations really did fit predictions. In fact, some analysts have suggested that
the degree of deflection observed was within the margin of error of the experimental procedure. Be that as
it may, people didn't stop testing relativity in 1919. It's been nearly 90 years since that experiment,
and relativity has held up in quite literally thousands of experiments, ranging from other
gravitational lensing experiments, to particle delay experiments, to satellite timing experiments. In
fact, if you have a GPS device in your car, every time you use it, you're doing a sort of
relativity-validating experiment. GPS works by a kind of triangulation from multiple satellites; without a
relativistic correction, you wouldn't be able to get anywhere near the kind of precision that the GPS
needs.
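
To give a rough sense of how big the effect is, here's a back-of-the-envelope sketch using standard rounded constants. It's the usual first-order estimate, not the actual correction model the GPS system implements:

```python
import math

# Back-of-the-envelope estimate of the relativistic clock drift on a GPS
# satellite. Standard rounded constants; this is the usual first-order
# estimate, not the correction model the GPS system actually implements.

c = 2.998e8          # speed of light, m/s
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_earth = 6.371e6    # Earth's radius, m
r_sat = R_earth + 2.02e7   # GPS orbital radius (~20,200 km altitude), m

v_sat = math.sqrt(GM / r_sat)   # orbital speed, roughly 3.9 km/s
seconds_per_day = 86400.0

# Special relativity: the moving satellite clock runs slow by about v^2 / (2 c^2).
sr_per_day = -(v_sat**2 / (2 * c**2)) * seconds_per_day

# General relativity: the clock higher in Earth's gravity well runs fast.
gr_per_day = (GM / c**2) * (1 / R_earth - 1 / r_sat) * seconds_per_day

net = sr_per_day + gr_per_day        # net drift, about +38 microseconds per day
ranging_error = abs(net) * c         # error in light-travel distance if uncorrected

print(f"special relativity: {sr_per_day * 1e6:+.1f} microseconds/day")
print(f"general relativity: {gr_per_day * 1e6:+.1f} microseconds/day")
print(f"net clock drift:    {net * 1e6:+.1f} microseconds/day "
      f"(~{ranging_error / 1000:.0f} km of ranging error per day)")
```

Run it and you get a clock drift of a few tens of microseconds per day, which corresponds to kilometers of ranging error per day if left uncorrected.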

His second criticism of experimental validations is even worse:

A second 'proof' of General Relativity regarding the orbit of the planet Mercury, contains too many unknowns to substantiate any mathematical theory. The composition and mass of every body in the solar system is only an approximation, even of our own planet Earth.

Bzzt. Sorry, but you can't make a claim like that without doing some math! Take the information that we have, with error-bars. Do the computation of what Newtonian gravity predicts, keeping careful track of the error bound. Do the same thing with a relativistic computation. Then show how the error bars of the
predictions are too large to allow us to distinguish which explanation is closer. Of course, the math
of that is beyond Williams. After all, we can't expect him to actually do any work, even though he's trying to refute an incredibly well-supported theory. Hand-waving should be enough, shouldn't it?
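
To show how little hand-waving it takes just to get started, here's a back-of-the-envelope sketch of the relativistic prediction for Mercury, using standard rounded orbital constants (the full error-bar bookkeeping is more work, but this is the zeroth step Williams never takes):

```python
import math

# The calculation Williams never attempts: general relativity's prediction for
# the anomalous perihelion advance of Mercury, using the standard first-order
# formula  delta_phi = 6*pi*G*M_sun / (a * (1 - e^2) * c^2)  per orbit.
# Constants are standard rounded values.

c = 2.998e8            # speed of light, m/s
GM_sun = 1.327e20      # Sun's gravitational parameter, m^3/s^2
a = 5.79e10            # Mercury's semi-major axis, m
e = 0.2056             # Mercury's orbital eccentricity
period_days = 87.97    # Mercury's orbital period, days

advance_per_orbit = 6 * math.pi * GM_sun / (a * (1 - e**2) * c**2)  # radians

orbits_per_century = 36525.0 / period_days
arcsec_per_radian = 180.0 * 3600.0 / math.pi

advance_per_century = advance_per_orbit * orbits_per_century * arcsec_per_radian
print(f"GR prediction: ~{advance_per_century:.0f} arcseconds per century")
# Prints roughly 43 arcseconds/century - which is, to within the measurement
# errors, exactly the residual precession that Newtonian gravity plus the
# known planets could not account for.
```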

Williams then goes into a hopelessly clueless babble about quantum physics, which he quite clearly does not understand. Among other things, he says that quantum theory proved that light is a particle, not a wave. Talk about clueless - ever heard of the double-slit experiment? Apparently not in his case.

Then, finally, we get to the heart of his complaint - the part that really bothers him. Like most anti-relativity crackpots, it's the idea that there is no central, fundamental, canonical frame of reference, no absolute notion of velocity or time:

Almost all scientists today agree with Einstein's assumption that there is no fixed frame of reference for the Universe. This also induces them to accept Einstein's false assumption about the speed of light. However, the Big Bang theory implies that there is a fixed (Central) frame of reference in the Universe. This fixed frame of reference for all (Absolute) motion is the physical location of the Big Bang. It is the location from which all universal expansion began. It is from this location alone that the constant speed of light can be defined (or any motion of any object). This location cannot be known because of the immense size of the Universe. It can only be hypothetically conceived or roughly estimated. The conception that every location in the Universe is expanding or moving away from every other location is an abstract idea which is essentially true but it doesn't negate the real characteristic that the total mass of the Universe still has a common center. The expansion of any real mass, must have a mass to begin with and a real location. An expanding Universe doesn't imply a moving Universe. If there is nothing outside of the Universe, then the idea of a moving Universe has no meaning, and the entire Universe should be considered stationary. A stationary mass must have a stationary common center. This is the Central frame of reference, (0,0,0,0).

A quick question for Mr. Williams before I let him go on. Where's the center of the surface of a balloon? When I'm blowing up a balloon, every point on its surface is getting farther away from every other point. Which one is the fixed, unmoving center?

Astronomers have measured the movements of the stars in our Milky Way galaxy in order to approximate the center of the galaxy. Even for our local galaxy, this method only produces an approximate location. This is difficult to do with the billions of galaxies for the entire Universe, but this is the only way to hypothetically predict the center of the Universe. This center is the hypothetical fixed frame of reference. Call it the Big Bang frame of reference at location (x,y,z,t) or (0,0,0,0) at time zero. It has also been called the Central frame of reference.

Why is the existence of a fixed frame of reference important to the Special Theory of Relativity and the speed of light? It is because the definition of velocity has no meaning unless it is with reference to a specific frame of reference. All light (quantum) has constant motion (or Absolute motion) only relative to the only fixed (Central) frame of reference (Big Bang). All other motion can only be relative motion.

Einstein's observation that the speed of light is constant was probably correct, but without a fixed frame of reference, there is nothing for it to be constant in relation to. A velocity must be motion relative to something else. Einstein's assumption that the speed of light is constant relative to all moving frames of reference was partially correct. But it is correct only if the relative movement of the different frames of reference are taken into account relative to the Big Bang frame of reference. The false assumption about the relative speed of light was a consequence of Einstein's false rejection of the Big Bang frame of reference for the entire Universe.

As my question above suggests, Mr. Williams' dispute with the reference-frame conclusions
of relativity comes from not being capable of understanding what the theories he's talking about actually say. It's part of the risk when you insist on taking a mathematical argument, and trying to express it
without any math.

The big bang theory doesn't say that the universe exploded into space from a singularity; it says that an explosion from a singularity created space. It doesn't say that the matter in the universe was condensed into a point floating around in space; it says that before the big bang, there wasn't any space at all. The big bang didn't occur at a location in space - because before it, there was no space for it to have a location in.

There is no such thing as a big bang reference frame. It's a meaningless concept.

Further, his continual griping about the speed of light demonstrates that he doesn't have a clue about
what relativity actually says. The fascinating thing - one of the problems that relativity was created to
explain - is that no matter how precisely we measure it, no matter where we measure it, no matter how fast
we're going when we measure it, light always appears to be moving at the same speed. Measure the speed of light from the equator at sunrise, and at sunset - and despite the fact that you're moving at significantly different velocities, there's no variation in the speed of light. Measure it on the equator, and measure it in Paris, and you'll get the same value. Measure it in a supersonic plane, or in a stationary laboratory, and you'll get the same value. It doesn't matter. It's always the same. If light had an absolute velocity, and there was no time dilation, you'd be able to see differences in the speed
of light - if you were moving towards a light source, you'd see photons coming towards you at what appeared to be faster than the speed of light. But you don't. You see red-shifts in light from objects moving away from you - but you don't see any difference in the speed of that light.
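
For the curious, here's a tiny sketch of why that is, comparing the naive classical expectation with the relativistic velocity-addition formula, for a few illustrative observer speeds:

```python
# Classically, an observer rushing toward a light source at speed v would
# expect to clock the incoming light at c + v. The relativistic
# velocity-addition formula, w = (u + v) / (1 + u*v/c^2), gives exactly c
# for light no matter what v is - which is what every measurement shows.
# The observer speeds below are just illustrative choices.

c = 2.998e8  # m/s

def relativistic_add(u: float, v: float) -> float:
    """Combine two velocities the way special relativity says they combine."""
    return (u + v) / (1 + u * v / c**2)

for v in (465.0, 3.0e4, 0.5 * c):  # Earth's spin, Earth's orbital speed, half of c
    galilean_excess = (c + v) - c
    relativistic_excess = relativistic_add(c, v) - c
    print(f"observer speed {v:11.4g} m/s: "
          f"Galilean excess over c = {galilean_excess:10.4g} m/s, "
          f"relativistic excess = {relativistic_excess:.1g} m/s")

# The relativistic excess prints as zero (up to floating-point rounding):
# the measured speed of the light is c for every observer.
```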

Williams concludes with a summary of his critique:

Mathematical models are composed of abstract, idealistic, imaginary characteristics and impossible objects which cannot exist in reality.

Physical models are composed of real physical objects in which all of the characteristics cannot be known or accurately measured.

CONCLUSION: Because of the totally different systems (math and physics) in which these two models are conceived and utilized:

1. The deductive conclusions from the abstract math model cannot be applied to the physical model. To do this, produces the fallacy of analogy abuse. Analogies are used in science to help convey ideas, not to form judgments or inferences.

2. Also because of this difference, the two types of models cannot be mixed or mingled. This only produces metaphysical, sophistical, pseudoscience models which have nothing to do with reality and produce no physical conclusions whatsoever. Metaphors and allegories are literary devices not appropriate in scientific theories.

The failure of the Special Theory of Relativity to produce conclusions which were logically and physically sound, essentially resulted from a repeated failure to recognize this relationship between mathematics and physics. From imaginary sophistical metaphysical models, the Special Theory produces conclusions which are contradictory to reality and contrary to common sense.

I find this conclusion fascinating. It starts by repeating his nonsense about how you can't
mix math and science, because anything which mixes them is just metaphysical nonsense. But then he
ends with the kicker that explains the real reason behind the whole article: relativity
is contrary to common sense. It's not comfortable and easy to understand, so it must be wrong. And
that's coming from a guy who's claiming that quantum theory is an example of science
done right! What's less common-sensical than quantum theory?

WORDS

By douchebag (not verified) on 06 Dec 2007 #permalink

With the title of your post, I half expected you to close with a simple, "Inconceivable!"

Very nice, very detailed and easily understandable takedown.

By G Barnett (not verified) on 06 Dec 2007 #permalink

From the title of your post, it would be fair to say that it is 'inconceivable' that someone could be as ignorant as Mr. Williams...

Someone has been watching too much x-files.

As has been pointed out in the Pharyngula thread on this guy- Einstein did in fact speak English.

By Stephen Wells (not verified) on 06 Dec 2007 #permalink

Saddest thing of all: next to the article, under his picture, is the following: "Mathematician graduate of Arizona State University."

Re Quantum Mechanics

A couple of quotes on the subject of quantum mechanics should be noted.

1. "If you think you understand quantu mechanics, then you don't understand quantum mechanics," Nobel Prize winning physicist Richard Feynman.

2. "Quantum mechanics is a totally preposterous theory which unfortunately appears to be correct," Nobel Prize winning physicist Steven Weinberg.

Waitaminnit! I learned in high school history (and physics) class that guys like Copernicus, Kepler & Galileo laid the foundations for modern scientific methods because they were able to support their observations and theories with mathematics. Didn't Newton (and Leibniz) invent new mathematics to help describe real physical phenomena and shapes? My high school learning was nothing but a sham? Math isn't supposed to be used for science anymore? We just bumble along and say "Magic Man done it!" to support a theory?

When you're driving along the highway and you say, "I can get another hundred miles on this tank of gas," you're using a mathematical model. From the angular position of the gas-gauge needle, you judge how much fuel is remaining in the tank, and based on the assumption that you consume fuel at a fairly constant rate (so-and-so many miles per gallon), you judge how far you can go without stopping to fill 'er up.

According to Mr. Williams, Interstate travel is impossible, or at the very least, "metaphysical" and illegitimate.

I'm collecting links to posts on this subject here.

The real irony here, is that even if he is philosophically correct (and he does have some points, although not very good ones), the fact that mathematical models are fundamental in understanding and creating scientific models is an empirical fact, not a deductive one. And as he implies, the empirical results are the only ones that matter.

I'm constantly amazed by Mark's ability to come to grips with the large, stinking pile of nutters. Williams has clearly no idea of science, relativity, quantum theory, cosmology or nuclear physics. And why American Chronicle would publish what is clear crackpottery is beyond reasonable understanding.

Mathematics is founded on logic and reason. Sophism is the exact opposite.

What irony.

It would really be too much to tackle all of Williams' errors. But to somewhat complement Mark's description, I note that Williams misses that testing is deductive and, by way of measurements, always results in probabilistic descriptions. He also can't accept modern atomic theories, or that Einstein was the first to fully understand what electromagnetic theory tells us about light and its velocity.

This fixed frame of reference for all (Absolute) motion is the physical location of the Big Bang. It is the location from which all universal expansion began.

Ironically, while the big bang is an expansion process taking place at every point, it does indeed provide a reference frame, by what I believe can be described as symmetry breaking. It is the frame in which the cosmic background radiation is at rest, and against which we can measure our movement. But before Williams gets to this he has yet another century of physics to (mis)understand completely...

By Torbjörn Lars… (not verified) on 06 Dec 2007 #permalink

"to come to grips with the large, stinking pile of nutters".

WTF? It was supposed to be "to come to grips with large, stinking piles of nutters". But the above fits too, I guess.

And I note a lot of other grammatical errors. Sorry, it's beyond sleep time for me I guess.

By Anonymous (not verified) on 06 Dec 2007 #permalink

So, this guy never took high school physics, I guess? I recall at least a moderate amount of "mathematical modelling" to describe weights falling, trollies rolling down inclines, projectile motion, waves interfering in ripple tanks (and extensions thereof to light passing through slits), and a bunch of stuff like that.

While the basic separation of math and physics seems nonsense - the problem of expanding space IS a tricky one. To go with the blowing up of the balloon model, you can blow up the balloon into nothing, and claim that our "space" is only on the inside. But to an outside observer, the balloon clearly has a center, and is clearly expanding into an existing space. But we have assumed there is no outside to make the model work.
Now, recently astronomers have found an area of the cosmic background that is too cold to fit the standard model, and one explanation was that it's an overlap with another universe. Or, to go back to balloons, our balloon is bumping into a neighboring balloon. Which would then mean there is an outside, a frame of reference, and a fixed (x,y,z,t=0).
Too bad we won't be around long enough to see if the "outside" fills up with so many universe bubbles, and if the bubbles then collapse like bath foam.

Mu:

The point of the balloon example isn't that the balloon has a center in its interior. It's that the entire surface of the balloon is expanding - so there is no center *of the surface*. As the balloon expands, its surface expands uniformly; it's not radiating out from any fixed point on the surface.
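
A toy numerical version of the same point, with a handful of arbitrary points on the balloon's surface:

```python
from itertools import combinations
import numpy as np

# Toy version of the balloon analogy: a few arbitrary points on a sphere's
# surface. Inflate the sphere and every pairwise surface distance grows by
# the same factor - no point on the surface is the "center" of the expansion.

points = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [-1.0, 0.0, 0.0],
])

def surface_distance(p, q, radius):
    """Great-circle distance between two points (unit vectors) on a sphere."""
    cos_angle = np.clip(np.dot(p, q), -1.0, 1.0)
    return radius * np.arccos(cos_angle)

for radius in (1.0, 2.0):  # "blow up the balloon" to twice the radius
    print(f"balloon radius = {radius}")
    for i, j in combinations(range(len(points)), 2):
        d = surface_distance(points[i], points[j], radius)
        print(f"  surface distance between points {i} and {j}: {d:.3f}")

# From any point's perspective, all the other points are receding,
# and none of them is special.
```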

I can't believe the title. Newton was a genius, and chances are this guy will never compare to him in his life. If this guy seriously believes that the world cannot be modeled with mathematics, then he must have never attended school, or measured his height.

Great read.

By Anonymous (not verified) on 06 Dec 2007 #permalink

[Math is a formal tool for describing relationships. Underlying nearly all scientific theories are mathematical constructions that formalize the theory enough to make it precise and testable.]

I like how you said 'enough', even though the term 'enough' doesn't really qualify as precise. It shows that scientific theories do NOT use mathematics in a purely formal way, and thus invalidates Mr. Williams' position that one can't mix "math and physics".

[Toss out the math, and you can't even describe how how a wagon gains speed when you push it - any attempt to explain it will be a linguistic explanation of a mathematical relationship. You can say "The wagon will move when I push it; it will gain speed faster if I push harder on it" - but that's just a sloppy and imprecise way of saying "F=ma".]

One can view the second part of the wagon statement as a special instance of F=ma, but I don't think of it as a sloppy or imprecise way of saying F=ma, unless there exists evidence that the speaker wants to talk about general physical ideas, as opposed to more specific ones. The first part of the wagon statement doesn't even hold. A wagon won't simply move from any push on it. One needs a sufficiently large push to overcome the maximum force of static friction. So, one would first have to say "If I push it hard enough, the wagon will move." In which case one needs mathematics to approximate the maximum force of static friction (which could vary slightly depending on atomic changes, I suppose, but... not much variation, so the approximation will work even if necessarily "approximate"). Then one preferably would talk about the direction of the push also, and we get something like "given that I push in the same direction for harder pushes, if push A has more force to it than push B, then push A will accelerate the wagon faster in that direction than push B does." Ummm... one needs math, or at least always implies math, by proclaiming one push more forceful (or "harder") than another. So, yeah... Mr. Williams comes out the REAL sophist here.

Still, I wouldn't under-estimate a linguistic approach for some purposes. It often works out easier, and says what you mean more accurately, to say something like "if the temperature is very high, then increase the air conditioning very much." Of course, one can translate this into mathematical terms, and maybe you want to call that engineering instead of science. Fine. But the point I mean to make is this: don't just pick either natural language or math, and think one necessarily sloppier than the other. People can write nonsense in math. Use BOTH math and natural language, and use both so that they work together.

By Doug Spoonwood (not verified) on 06 Dec 2007 #permalink

Regarding the story about Einstein lecturing in German: in 1923 Einstein gave his Nobel lecture at a funfair, Liseberg, in Göteborg in Sweden. The lecture was an introduction to the theory of relativity, despite the fact that the prize was awarded mainly for the photoelectric effect "and with no regard to whether the theory of relativity is true or not", and was of course delivered in German (an English translation is available from the Nobel homepages). In the audience were about 600 scientists from all of Scandinavia, but also some 400 luminaries from the city of Göteborg, as well as the Swedish king. I would say that it is safe to assume that the majority of those 400, and some of the scientists, which included botanists and the like from the university in Göteborg, had very little grasp of what was going on. One Swedish newspaper described it under the headline "Einstein speaks in pressing heat. The king attends the distinguished lecture" with the words

"the silence was compact, and when Professor Einstein, to some small extent tried to concretise his lecture by making elegant gestures towards the spacious arches of the hall when talking about different coodinate systems, or pointing towards the corners and centre of the lectern to clarify his definition of the concept simultaneousness, many spectators, with probably already overwrought comprehension, craned their necks in expectation as if it was a spititualistic experiment and the materialisation of ghosts." (my somewhat sloppy translation)

There was applause, of course.

I can't figure out where less than 10% of the audience would have understood German, and Einstein would still have used it without an interpreter.

In those days (and even now) educated Europeans spoke several languages. The Germans weren't that interested in other living languages, which explains why their neighbours typically had to learn German. But even German classical schools had thorough courses in Latin and Greek.

Poincaré was certainly fluent in German. Einstein, like any good Swiss patent clerk, also spoke tolerable French, and even gave public lectures in it. He also knew some Italian from his younger days, and once asked Tullio Levi-Civita to write in Italian.

His English was poor, though. When he lectured at the Royal Astronomical Society in 1921, he asked Erwin Freundlich to act as an interpreter. Freundlich was fluent in English, because his mother was British.

By Lassi Hippeläinen (not verified) on 07 Dec 2007 #permalink

Even worse for Mr. Williams - GPS relies on both special and general relativistic corrections.

By Bob Henderson (not verified) on 07 Dec 2007 #permalink

The comment about Einstein lecturing in German is at best a strawman and at worst totally stupid.

Firstly, as MarkCC has already pointed out, most of the world's leading mathematicians and physicists spoke and read German in the first thirty years of the twentieth century. This is because half of them were German-speaking - at this time Germany totally dominated the world in these disciplines - and the other half had all attended German universities, because that is what the best mathematical scientists did then.

Secondly, as has been pointed out many times on Science Blogs, a scientific theory is judged not on lectures but on papers published in peer reviewed journals. It wasn't any different for the Theories of Relativity. The original papers were of course in German, but they were already collected along with original papers from Lorentz, Weyl and Minkowski and published in English translation in 1923. Einstein's own popular explanation of his theories was published in English in 1920, and Hermann Weyl's excellent Space, Time and Matter was published in English in 1922. Two years later saw the publication of Arthur Eddington's The Mathematical Theory of Relativity. So English-speaking scientists who could not speak German had every chance of judging the theories for themselves.

Having mentioned Eddington, who of course led the famous solar eclipse expedition of 1919, gives me a chance to repeat my favourite Eddington anecdote for any who may not have already heard it. A reporter once said to Eddington that people said that only three men in the world understood the Theory of Relativity. Eddington supposedly replied "Oh really, who is the third?"

I think Williams got one and only one point right, when he said that light is particles and not waves. If my understanding of quantum physics is OK, wave-particle duality is just a pop-sci simplification. Light is particles in each physical interaction. When you are not looking at them, they behave as a probabilistic wave.

Is it correct?

*rolls eyes* May this moron be smitten mightily by His Noodly Appendage!

#28:

No, you are not correct. Light is neither a particle nor a wave, light is light. We can, however, model light as either a wave or a particle. For some situations the wave model will be "easier" while in other situations the particle model will be easier.

And this may be the only thing in which Williams might be correct. It is important to not confuse our mathematical models of reality with reality itself. But to go from there to saying that you can't use math to describe reality is so completely wrong as to be inconceivable.

SteveM wrote: "No, you are not correct. Light is neither a particle nor a wave, light is light. "

Precisely. When I lecture on wave-particle duality, I often say that light behaves sometimes like a wave, sometimes like a particle, which really means that it's neither -- it's something new that we don't have a simple word to describe.

Evidently, he is unaware of the Heisenberg Uncertainty Principle as well.

"Evidently, he is unaware of the Heisenberg Uncertainty Principle as well."

Well, Heisenberg obviously didn't know what he is talking about, because he's so uncertain! :)

(Okay, I deserve to be punched for that quip.)

I am interested in the implication that, because Poincaré had issues with relativity, there must be something wrong with it.

The underlying assumption appears to be that advances in science are instantaneous, and instantaneously understood, and it's not like there's any process of discussion between scientists (or scientists and mathematicians) as they work these things out. Presumably, if there was nothing wrong with relativity, one day Einstein would announce a complete theory of it, all other scientists would look it over, see that it's right, and not have any questions to raise... 'Cause it's not like legitimate questions can be raised in the development of a theory if that theory's any good, or that the theory even undergoes a period of development, except maybe in the mind of its developer, which will of course be carried out alone in isolation from any other scientist.

Or so the underlying assumption seems to be.

By El Christador (not verified) on 07 Dec 2007 #permalink

Or more succinctly, his assumption seems to be "people only raise objections if the theory is wrong".

Correct theories never get questioned by smart people of the calibre of Poincare, because it's not like a theory could be complex enough that smart people like that find it worthwhile to discuss it. It's either all in agreement right away, with universal crystal clear understanding of the theory in all its depth, or the theory is bogus.

By El Christador (not verified) on 07 Dec 2007 #permalink

Thus we read in Ecclesiastes: "A fool's voice is known by a multitude of words." Centuries later, Ezra Pound writes: "The less we know, the longer our explanations."

There are a couple of problems in the discussion of the "Big Bang" model in the post.

The actual Big Bang models, as used by cosmologists, are agnostic about the first moments of the universe. That is, they assume that there is a universe in existence (in a really hot, dense state) and go from there. (See, for example, Peebles', "Principles of Physical Cosmology" [1993], page 6, I believe.)

If one sticks to a very abstract model, one can extrapolate to a singularity before the very early universe, and thus say that there was a "creation" of space and time at the singularity, but that really goes beyond our knowledge of the behaviour of physics in the early universe.

Still, the Big Bang models do not have an actual center of mass for the universe. So, this is not a great problem.

A larger problem for the post is that Big Bang models do have "a big bang reference frame." There is a frame of reference in which the expansion of the universe is described and the clocks of (ideal) galaxies are synchronized. The important thing for physics is that this reference frame appears to play no other physical role and it cannot be counted on to distinguish between speeds. Thus it is not a preferred reference frame in the sense in which the phrase is normally used in discussions of relativity and its foundations.

By CosmologyGuy (not verified) on 08 Dec 2007 #permalink

One of the most egregious errors in Mr. Williams' tome is the claim that there is sufficient uncertainty in the masses of the planets to mask relativistic effects on the orbit of Mercury. This is complete garbage. Aside from Venus, which has no moon, the masses of the other planets have been known with great accuracy since the latter half of the 19th century. The uncertainties in those masses were recognized 125 years ago to be totally insufficient to account for the 43 seconds of arc per century contribution to the precession rate of Mercury's orbit which is explained by GTR.

A brief judicious account in English is Scott Walter, "Henri Poincaré and the Theory of Relativity"; see also his "Minkowski, Mathematicians and the Mathematical Theory of Relativity".
It's disappointing that Einstein, Poincaré, and Minkowski, each of whom had major works to his credit, sufficient to ensure a distinguished place in the history of physics, did not see fit to acknowledge each other's contributions to Special Relativity. But all too human, no doubt.
Two books (in French, not translated yet into English) on the relations between Einstein and Poincaré have taken up the question of priority:
Jules Leveugle. La Relativité, Poincaré et Einstein, Planck, Hilbert : Histoire véridique de la théorie de la relativité [Relativity, Poincaré and Einstein, Planck, Hilbert: a true history of relativity] Available at amazon.fr.
Jean Hladik. Comment le jeune et ambitieux Einstein s'est approprié la Relativité restreinte de Poincaré [How the young and ambitious Einstein appropriated the special relativity of Poincaré] . Available at amazon.fr.
Both argue that professional jealousy led the Germans to give Poincaré less than his due.

A further note: Had the uncertainties of Venus's mass been resolved in such a way as to help correct Mercury's orbit, the Earth's orbit would have been totally off.

By Michael Ralston (not verified) on 08 Dec 2007 #permalink

Haha, I read that article, but it was painful to do so because of the high density of stupid that permeated it.

Excellent parody with the Newton thing.

Nice bit. Clarifying. All of my high school and collegiate lectures center around the concept of mathematical modeling, and "using them to predict the future and make money." As such, I am a thorough-going member of the Math Model Society and grateful that *you* read the referred article, so that I don't have to! Thanks!

Nice post! Ironically, one of your most enjoyable responses to this goofball was the physical analogy concerning the 'center' of a balloon's surface. Apparently, Williams isn't even good at thinking in terms of 'physical models'.

When I saw that Williams was a "Mathematician Graduate of Arizona State University" I was immediately imagining the chagrined expressions of math professors at ASU. Of course, ASU physics professors are probably rolling their eyes and saying, "I guess it could be worse - he could have been a physics graduate."

By Anonymous (not verified) on 10 Dec 2007 #permalink

Many people during that time were alchemists a-hole. Also, if you want to be a completely lame dumb a** conformist you can study only what your government and your colleagues study or you can have some fun, otherwise what makes life worth living. Stop being so serious lame-o.

Re #45:

It never ceases to amaze me how many people will send me flames without bothering to actually read what I wrote. I've gotten about a half-dozen flames in email, plus posted to comments on reddit, etc., written by people who didn't get past the first paragraph of the post.

It's a friggin' parody, you moron!

Not only is it a parody, but the post explicitly states that it's a parody!

In fact, approximately 7/8th of the post comes after the admission that it's a parody, and goes into great detail about why I wrote that parody.

If you're too goddamned lazy to actually read the post, then why would you bother posting a clueless lame-assed comment that proves what an idiot you are?

Did the author fail calculus and decide to refute the need for the existence of calculus?

Wow, just wow.

By John Ramey (not verified) on 10 Dec 2007 #permalink

Maybe he got a speeding ticket, and he's arguing that velocity is meaningless. Therefore, he should not be fined because there is no velocity to exceed.

By John Ramey (not verified) on 10 Dec 2007 #permalink

RE: #13 (better late than never?)

There is no reference frame in which the CMB photons are at rest. Light (photons) travels at the same speed relative to any reference frame, so the speed of the CMB radiation definitely does NOT distinguish any frame as being special.

Re: #50
I think that #13 refers to the frame in which there's no CMB dipole. In the frame at rest relative to the Sun, for instance, there's a clear blueshift in the CMB in one direction and redshift in the other -- that's what they meant by "not at rest relative to the CMB".

It's not the speed of the CMB radiation, it's the energy (or peak wavelength, or however you want to describe that).

By Joshua Zucker (not verified) on 11 Dec 2007 #permalink

Is that guy a creationist? Somehow it seems likely.

What do cdesign porponentists think of quantum mechanics?

By Valhar2000 (not verified) on 12 Dec 2007 #permalink

From the article you quoted: The false assumption about the relative speed of light was a consequence of Einstein's false rejection of the Big Bang frame of reference for the entire Universe.

I suppose that it's technically true that Einstein rejected the Big Bang frame of reference, since he rejected all absolute frames of reference. However, Einstein couldn't have rejected the Big Bang frame of reference because Big Bang theory itself did not arise until after he published his GTR. Or am I forgetting something?

I think that #13 refers to the frame in which there's no CMB dipole.

Yes, at rest relative to the redshift (or energy density, as you note). As I noted, my language (and/or head) wasn't particularly clear at the time of writing.

By Torbjörn Lars… (not verified) on 29 Dec 2007 #permalink

Einstein's paper of 1905 proves nothing its a collection of mathematical errors.

@56:

And we should clearly take your word for it over the word of the thousands of mathematicians and scientists who've studied it. After all, you presented such a thorough (and grammatical) argument demonstrating all of the problems that everyone else missed.