New Measurements, New Theories

This is a history of science question for all of you folks. What do you think are the most prominent historical cases in which new ways of taking measurements of nature, or new scientific calculations, have themselves brought about major new theories or hypotheses? In other words, what are the case studies of how new means of measurement have themselves effectively changed reality? Feedback would be appreciated.

Some obvious examples:

Tycho Brahe's invention of precision astronomy, even without the telescope, was enough to trigger the discovery of Kepler's Laws.

Galileo's telescopic observation of many new stars too dim to be seen with the naked eye was probably the single most important thing that destroyed the image of the stars as fixed in a "heavenly sphere," all at the same distance from the Earth, replacing it with a potentially infinite universe of more or less uniformly spread stars. Giordano Bruno and others had already suggested this, but it was more a philosophical speculation. After the mid-seventeenth century the infinite, homogeneous universe seems to be taken more or less for granted, and I think the telescope is a main reason for this.

I guess everyone will mention the Michelson-Morley experiment, although it is a matter of historical controversy exactly how much Einstein was influenced by its results.

The realization that the Earth was not flat.
The realization that the Earth was not the center of the universe, or even of the galaxy (a big blow for the Catholics way back when).

Off the top of my head:

1) The invention of Microscopy (van Leeuwenhoek) led to the Germ Theory of Disease.

2) The invention of interferometry (in particular, the Michelson-Morley experiments) was critical to Einstein's Special Relativity.

3) And of course Hubble's measurement of the redshift sounded the death knell of the Steady State Universe.

I'm not sure if that's what you're looking for, but those were the first things that came to mind.

The obvious things are the telescope, the microscope, the electron microscope, the radio telescope, the camera, the scintillation detector, etc. But let me also put in a word for the spectroscope. Trying to explain those funny little lines in the spectra of rarefied gases ended up totally undermining the entire basis of how we thought the world worked.

This was discussed in the 60s and 70s a bit: see

Gillispie, Charles Coulston (1960), The edge of objectivity: an essay in the history of scientific ideas. Princeton: Princeton University Press.

I know I'm stepping in a different direction, but I'd like to mention something which was a conceptual development (not a technological advance): probability. The idea that you could turn our intuitive feeling of different likelihoods for events into a consistent mathematical theory that allowed quantification and comparison of uncertainties. It was a far less obvious step than it seems now with hindsight. And thanks to it, it was possible to control noise in physical theories in predictive ways. It was possible to do robust statistics. No probabilities, no quantum theory. No modern synthesis of evolution. Etc. It was a major addition to the "toolbox", and certainly the biggest win we ever got from gambling!

By Irrelephant (not verified) on 01 Sep 2006 #permalink

I'd say we are in the midst of such a revolution for cosmology and subatomic physics, which are increasingly intertwined.

Some results from the Wilkinson Microwave Anisotropy Probe and the Sloan Digital Sky Survey have physicists scratching their heads about oddities like dark energy, even while still trying to make sense of dark matter (known for 30+ years). Just when the "standard model" of particle physics was getting tied up neatly by neutrino oscillation, we now have to wonder whether there are whole classes of matter (mirror matter, supersymmetric matter, etc.) we know little or nothing about. And some people like Lee Smolin are now writing that string theory is unraveling without a predictive success to its credit (book reviews of two titles to come in about a month).

(Can you tell I just finished a manuscript of a high-school/college reference in an upcoming Twentieth-Century Science set? My book will be called Physics: Decade by Decade.)

As usual, I have a favorite among my book reviews: Strange Matters: Undiscovered Ideas at the Frontiers of Space and Time by Tom Siegfried, written before any results from WMAP or the SDSS were in, or neutrino oscillation had been established. The review has an opening limerick for those who enjoy the form. Its URL is

http://www.scienceshelf.com/StrangeMatters.htm

I'm fond of the Big Bang. The primary evidence for the Big Bang rests on three things: (1) the expanding Universe, (2) the Cosmic Microwave Background (CMB), and (3) the successful prediction of the primordial abundances of hydrogen, helium, and deuterium (and a couple of other things).

Penzias and Wilson discovered the CMB when they were trying to reduce noise in radio receivers, and found there was a minimum background they couldn't get rid of, coming from everywhere on the sky. That matched a prediction of the Big Bang: that the Universe was once a dense plasma, and that there should be cooled-off-by-redshift residual radiation left over from it.
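The "cooled off by redshift" part has a simple quantitative form; a standard result (my gloss, not the commenter's) is that the blackbody temperature of the background scales with redshift $z$ as

```latex
T(z) = T_0 \, (1 + z), \qquad T_0 \approx 2.725\ \mathrm{K},
```

so radiation emitted from the hot plasma at $z \sim 1100$ (a few thousand kelvin) reaches us today as microwaves at a few kelvin.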

Alpher and Gamow were trying to show that the ratios of the elements could have been formed in stars... but their calculations (and those of many others -- I'm not sure if Alpher and Gamow are exactly the right people to credit for the turning point) ended up supporting the Big Bang that they were dubious about.

Before these things, lots of people believed in a "steady state" Universe that always was and ever shall be as it is now: expanding, with new matter being created all the time to fill the gaps. The Big Bang was derided by many. Not too long after the evidence pointed the other way, the Big Bang became (and remains) the standard paradigm.

-Rob

PET scans - monitoring brain activity

As mentioned above, "radio-carbon dating" has been important, but perhaps more important to the survival of the human species is "internet dating".

Not sure what it measures, though I am fairly certain it's not intelligence.

By Dark Tent (not verified) on 01 Sep 2006 #permalink

case studies of how new means of measurement have themselves effectively changed reality

Yuck. I didn't like that formulation when Kuhn used it and I don't like it now. If it's an objective reality, how does it make sense to speak of changing it? Rather, we are changing our view of it; or better, our understanding of it. Notwithstanding its limitations, I like the "gestalt switch" analogy better. The idea that, post-paradigm-shift, the scientist "lives in a different world" is a metaphor that I think is easily overburdened.

I was going to mention the microscope, in which context you might make separate cases for the electron and atomic force microscope.

This also reminds me of real-time studies of cellular activities, premised on the ability to image individual proteins in real time. A good analogy is to a football game: imagine trying to reconstruct the rules from a series of snapshots taken at various points in different games, and how much easier the task would be if you had real-time videos of several games.

Not sure if it falls under your umbrella as a new way of taking measurements, but Descartes' first steps toward analytic geometry and the eventual adoption of what we now call the Cartesian coordinate system certainly changed how science is done.

In a similar vein, non-Euclidean geometry ...

By bob koepp (not verified) on 01 Sep 2006 #permalink

The photographic plate, which led to the discovery of natural radioactivity and of man-made radiation (X-rays). This led to many discoveries, like sub-atomic particles, and modern physics. Also great advances in medicine, astronomy, and many other areas.

I might offer a bit of caution about the way that you phrased the question. Just having a new form of measurement and a new related theory doesn't mean that the relationship was unidirectional (although it certainly might have been). It could be that new theories, experimental investigations, or technical developments lead to the need or desire to create new forms of measurement in order to test or confirm those theories and investigations -- the two feeding off each other.

It may be useful to take the cases as starting points to more deeply examine the relationship between measurement and theory.

Well, you did ask for the history of science approach...

By Peter Parley (not verified) on 01 Sep 2006 #permalink

Re Alejandro

Even more important was Galileo's use of the telescope to discover the moons of Jupiter. This discovery falsified the Ptolemaic idea that the entire universe revolved around the Earth.

The invention of the compass!

The X-ray diffraction photo of regular DNA that Rosalind Franklin took, and the resulting model that Watson and Crick built, which allowed them to formulate the central dogma and discern a biological mechanism for both information transmission and information transformation.

Or, for that matter, the tracking of the pea flowers and later of the flies which helped Mendel and successors figure out that genetic information must come in linear pairs. Counting rather than measuring.

Lavoisier's ability to measure precise volumes and weights and map out basic chemistry... whatever advances in scales and glassblowing precision allowed him to do that.

I assume you've read plenty of Galison... A People's History of Science is on my reading list. My understanding is that it's a bit dogmatic and shrill, but still chock full of good examples.

The measurements done on blackbody radiation led to the birth of quantum mechanics (Planck's formula for blackbody radiation).

The measurement of the electron wavelength by Davisson and Germer in 1927 confirmed that electrons obey the (quantum mechanical) de Broglie relationship between particle momentum and wavelength. This experiment was the first to show that material particles have wave properties.

The measurement of the Lamb Shift was an exceedingly precise confirmation of one of the pillars of modern physics, Quantum Electrodynamics.

The measurement during a solar eclipse of the bending of star light grazing the sun was a confirmation of Einstein's general relativity theory, as was the measurement of the gravitational red shift of a photon emitted by a massive body.
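For reference, the relations behind the first two measurements above can be written out explicitly (my gloss, not the commenter's): Planck's law for the spectral radiance of a blackbody at temperature $T$, and the de Broglie relation tying a particle's wavelength to its momentum:

```latex
B_\nu(T) = \frac{2 h \nu^3}{c^2} \, \frac{1}{e^{h\nu / k_B T} - 1},
\qquad
\lambda = \frac{h}{p}.
```

The $-1$ in Planck's denominator is the quantum ingredient: for $h\nu \ll k_B T$ the formula reduces to the classical Rayleigh-Jeans law, while at high frequencies it cuts off the ultraviolet catastrophe.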

By Dark Tent (not verified) on 01 Sep 2006 #permalink

Charles Darwin's first two books were geological monographs. When his interests turned to biology, he brought with him the geological concept of a very old earth, which allowed him to see the (already recognized but considered homeostatic) mechanism of natural selection as an agent of species change.

On the fringe (appropriately enough) of the idea you're entertaining stands Paracelsus, both brilliant and nutty. His notorious defiance of accepted authority in favor of his own (erratic) observations arguably led to an era of independent research and thinking, which overthrew both medieval "natural philosophy" and (most of) his own work.

Going even further back, consider Portugal's Prince Henry the Navigator, whose new measurement (the idea that sub-Saharan African lands were reachable, hence exploitable) led to his shipyard becoming the world's first applied research & development lab.

A comprehensive review of this question would have to deal with the mathematics developed by the ancient Greeks, even enabling a decent calculation of the size of the Earth. That this was influential seems inarguable, though finding specific cause-&-effect links might be a challenge. Tracing the influence of major Arab mathematicians from the Islamic golden age of the Caliphate would be a comparable exercise.
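Eratosthenes' size-of-the-Earth computation is the classic worked example here. Using the reported values (a noon shadow angle of about $7.2^\circ$, i.e. $1/50$ of a full circle, at Alexandria when the sun stood overhead at Syene, and roughly $5000$ stadia between the two cities):

```latex
C \approx 5000 \times \frac{360^\circ}{7.2^\circ} = 250{,}000 \ \text{stadia},
```

remarkably close to the modern figure of about 40,000 km, though exactly how close depends on which length one assumes for the stadion.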

Closer to home, the "effective changing of reality" is not exclusively scientific. Numerous cultural historians seem willing to link the counter-intuitive but well-publicized theories of Einstein, Heisenberg, Schrödinger, et al. to the collapse of tradition and triumph of the avant-garde in early 20th-century Europe. Some might claim that the influence ran the other way, and it was the decline of the old order that made those revolutionary scientific ideas thinkable (an argument also applicable to Renaissance figures including Paracelsus).

By Pierce R. Butler (not verified) on 01 Sep 2006 #permalink

A quick scan suggests nobody has placed a shout out for nuclear magnetic resonance spectroscopy and its twin sister, magnetic resonance imaging. Admittedly, it's not as foundational as the telescope or microscope, but modern chemistry, molecular biology, materials science and medicine wouldn't be nearly the same without it. Few techniques reveal so much detail with so little sample (at least in the case of NMR) in so little time.

How about a consequential miscalculation? The sailor now known as Christopher Columbus was not unique (as I hope the textbooks no longer allege) in thinking that the earth was spherical: CC's contribution was to (ahem) misunderestimate the circumference of the world by several thousand miles, thus leading him to insist that he could find Asia only about 3,000 miles west of Iberia.

By Pierce R. Butler (not verified) on 01 Sep 2006 #permalink

Another Renaissance example: the discovery of geometric rules of perspective led European painters to create and disseminate a new, well, perspective on viewing the world, resulting eventually in the Cartesian coordinate system among other concepts.

Iirc, Lewis Mumford dealt with this brilliantly in The Pentagon of Power. I was greatly struck with an illustration, possibly from that period, of an appealing nude woman on a couch, being intently but unemotionally studied by a man holding a rectilinear grid between them.

By Pierce R. Butler (not verified) on 01 Sep 2006 #permalink

1) PCR!!

2) Microarrays

3) Those new methods/techniques that resulted in, or were an important part of, Nobels.

this may be a tad off topic, but it's an interesting question. i'm studying use of the Bayes factor instead of hypothesis testing or the "significance threshold" in science. it's part of preparation for doing an article on that to contribute to Charles Daney's proposed physical sciences blog carnival.

anyway, this may be a bit of an extreme way of putting it, but according to some good sources, the statistics i and many engineers i know were taught in college is thoroughly wrong and has foundational as well as methodological problems. a key summary is S. N. Goodman, "Toward evidence-based medical statistics," parts 1 and 2, pp. 995-1013, Annals of Internal Medicine, 1999; 130. basically, it rebuts the integrity of Neyman-Pearson (and possibly Fisher) hypothesis testing. Goodman's main point is that the medical literature is filled with results based upon questionable statistical procedures and, so, he asks Where Does That Leave Us?

i mean, i know that hypothesis testing has always been sleazy, with some steps pulled out of thin air. statisticians have tried to keep practitioners on the straight and narrow, invoking notions of power analysis and relative risk to guide people into choosing alphas and betas based upon something substantial. (see Helena Chmura Kraemer, Sue Thiemann, How Many Subjects?) unfortunately, hypothesis testing becomes a lot harder once that is done, often because relative risks are either just not available or are subjective.

fortunately, a lot of science is guided more by scientific sense and experience and less by raw statistical inference. but as statistical inference gets more important (think pharmaceuticals! think terrorist detection!), it seems proper to worry that we have this device right.

so i guess i'm posing the question of what scientific applecarts might be upset if N-P hypothesis testing turns out bogus.

incidentally, there's a balanced and very readable treatment of much of this in Gelman, Carlin, Stern, Rubin, Bayesian Data Analysis. i contrast that with Berger, Statistical Decision Theory and Bayesian Analysis which, although impressive and a good thing to have on your shelf because of its treatment of conjugate families, is not helpful.

i should have linked Goodman's paper in the above. it's available online, part 1 and part 2.

admittedly, the above is a bit of a troll, as the controversy between Neyman-Pearson/Fisher and Bayesian approaches has a long technical history, recounted in part by Goodman in his part 1. Goodman has more.

no doubt, properly done, hypothesis testing in the form of statistical quality control works, given its assumptions are granted. they include appreciable statistical power, a sizable population of units which can be considered identically distributed, reasonable compliance with normality or unimodality, and relative costs of false acceptance or rejection being available. in industrial or commercial contexts, these are often available. sometimes the populations are small. Levine and Berliner write:

Note that in both the Fisherian and Neyman-Pearson formulations, the two hypotheses are treated asymmetrically. The null hypothesis plays a special role in each case. (This issue is particularly important in developing techniques for attribution versus detection.) A fully decision theoretic approach, formulated early in the work of A. Wald, generally permits asymmetric treatment of actions. The key is that probabilities of errors are not necessarily the appropriate quantities for control and optimization. Rather, the (expected) losses associated with particular actions are minimized. Such procedures are particularly important in the study of remediation.

what's less clear is what to do in a purely scientific context, and that's why i raise the question.

Bayesian approaches as with the Bayes factor have limitations. but i think it's better to suffer those than violate Feynman's principle about fooling yourself.
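To make the Bayes-factor point concrete, here is a minimal toy sketch (my own example, not Goodman's). For a binomial experiment it compares the exact two-sided p-value under a fair-coin null with the Bayes factor for that point null against a uniform prior on the success probability; the data (115 successes in 200 trials) are chosen to illustrate how a result can be "significant" at the 0.05 level while the Bayes factor sees little evidence either way.

```python
from math import comb

def two_sided_p(k, n):
    """Exact two-sided binomial p-value under H0: p = 0.5,
    summing every outcome whose probability is at most that of k."""
    pmf = [comb(n, i) * 0.5 ** n for i in range(n + 1)]
    return sum(p for p in pmf if p <= pmf[k])

def bf01_binomial(k, n):
    """Bayes factor for H0: p = 0.5 versus H1: p ~ Uniform(0, 1).

    Under H1 the marginal likelihood integrates the binomial
    likelihood against the uniform prior:
        integral of C(n,k) p^k (1-p)^(n-k) dp = 1 / (n + 1).
    Values above 1 favour the null."""
    m0 = comb(n, k) * 0.5 ** n  # likelihood of the data under the point null
    m1 = 1.0 / (n + 1)          # marginal likelihood under the uniform prior
    return m0 / m1

k, n = 115, 200
print(f"two-sided p-value: {two_sided_p(k, n):.3f}")   # below the 0.05 threshold
print(f"Bayes factor BF01: {bf01_binomial(k, n):.2f}") # yet close to 1: weak evidence
```

The uniform prior is of course only one choice; a prior concentrated on plausible alternatives shifts the Bayes factor, which is itself part of the argument over these methods.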

1) PCR!!

2) Microarrays

And...

3) Sanger sequencing and the high-throughput techniques that have come about since.

All of these molecular genetics tools have revolutionized every single discipline of the biological sciences from ecology to evolution to cell biology to physiology, etc etc...

How about the tape measure? - which disproved the perception that the distance between places varied with one's degree of fatigue.

By Matt Lowe (not verified) on 02 Sep 2006 #permalink

B.F. Skinner's 1932 use of the cumulative recorder, which logged each instance of a bar-press that a rat in an experiment made over time, opened a whole new realm of inquiry in psychology and biology. I think most mouse- and rat-based research in medicine today uses modifications and extensions of Skinner's methods of counting responses.

Among the first insights the new way of counting responses revealed was that when you stop "rewarding" an animal for making some particular trained response, that response generally increases in frequency at first, rather than decreasing.

I didn't read all the comments, but here are the first things that came to my mind:

1. Particle accelerators, for quantum physics.

3. Non-Euclidean geometry.

4. Carbon-13 spectroscopy (for geology, history, etc.).

I'm sure there must be an important 2nd place i forgot to mention.