Free Thought

OK, after making fun of systems biologists, out comes Peter Sorger's latest paper in Cell. In this paper, Sorger's team collected almost 8000 intracellular measurements (they collected some of the data directly and got the rest from the literature - I'll have to check on that) and plugged it all into an algorithm, or rather a ... compact representation of the entire compendium by using discriminant partial least squares regression (DPLSR; Janes et al., 2004). A DPLSR map was created such that the signaling proteins and cytokines were projected onto a set of "principal components" that maximized covariation…
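As a rough illustration of what a PLSR-style "compact representation" does, here is a minimal sketch on synthetic data, using scikit-learn's PLSRegression as a stand-in for DPLSR (this is not the paper's actual pipeline, and all names and numbers below are mine): a large matrix of signaling measurements is projected onto a couple of latent components chosen to covary maximally with the cytokine outputs.

```python
# Minimal sketch (not the paper's pipeline): project signaling measurements
# onto a few latent components that maximize covariance with cytokine outputs,
# in the spirit of partial least squares regression. Synthetic data throughout.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 40))   # 60 conditions x 40 signaling measurements (synthetic)
Y = rng.normal(size=(60, 4))    # 60 conditions x 4 cytokine outputs (synthetic)

pls = PLSRegression(n_components=2)
pls.fit(X, Y)

scores = pls.transform(X)       # each condition projected onto the 2 latent components, shape (60, 2)
loadings = pls.x_loadings_      # how each signaling measurement contributes to them, shape (40, 2)
print(scores.shape, loadings.shape)
```

A map like the one described in the paper is essentially a scatter plot of those scores and loadings on the first two components.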
You have two 50g containers of cream. One is 10% fat, and the other 20% fat. You combine them. What is the percentage of fat in the mixture?

A. 10% of 50 is 5, 20% of 50 is 10. (10+5)/(50+50) is 15%. The answer is 15%, the arithmetic mean of 10% and 20%.
B. 14.1%, the geometric mean of 10% and 20%.
C. 15.8%, the root mean square of 10% and 20%.
D. It could be either A, B, or C. There is nothing to stop you using any of these means as the answer. And anyway, the Navier-Stokes equations are hard to solve, so how can we figure out what happens when we combine two fluids? If…
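For what it's worth, here is a minimal sketch of the arithmetic behind the options, assuming the puzzle is purely about mass balance (the script and its variable names are mine, not from the original post): with equal 50 g portions, the mixture's fat content is the mass-weighted average, which reduces to the plain arithmetic mean.

```python
# Worked check of the cream-mixing puzzle: equal masses, so the mixture's fat
# fraction is a mass-weighted average, i.e. the arithmetic mean here.
# The other "means" are computed only to reproduce the numbers in the options.
from math import sqrt

m1, m2 = 50.0, 50.0          # grams of cream
f1, f2 = 0.10, 0.20          # fat fractions

mixture = (m1 * f1 + m2 * f2) / (m1 + m2)
arithmetic = (f1 + f2) / 2
geometric = sqrt(f1 * f2)
rms = sqrt((f1**2 + f2**2) / 2)

print(f"mixture:    {mixture:.1%}")     # 15.0%
print(f"arithmetic: {arithmetic:.1%}")  # 15.0%
print(f"geometric:  {geometric:.1%}")   # 14.1%
print(f"rms:        {rms:.1%}")         # 15.8%
```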
This is too cool. One of the world's most powerful supercomputers has conjured a fleeting moment in the life of a virus. The researchers say the simulation is the first to capture a whole biological organism in such intricate molecular detail. The simulation pushes today's computing power to the limit. But it is only a first step. In the future, researchers hope that bigger, longer simulations will reveal details about how viruses invade cells and cause disease. The fleeting simulation, published in this month's Structure, reveals that although the virus looks symmetrical it…
{NOTE: Here is the post that was delayed last week due to my announcement of the arson at the Holocaust History Project.} It occurs to me that I haven't done much straight science blogging lately. Yes, debunking pseudoscience and quackery is fun, useful, and has the potential to educate people about how science is misused, but this is ScienceBlogs. Since arriving here four weeks ago, I haven't fulfilled my quota of science blogging, and it's time to remedy that. Fortunately, while perusing a recent issue of Cancer Research, I found just the ticket, something that would let me discuss science and…
Buried beneath some unseemly but justified squee-ing, Scalzi links to an article about "counterfactual computation", an experiment in which Paul Kwiat's group at Illinois managed to find the results of a quantum computation without running the computer at all. Really, there's not much to say to that other than "Whoa." The article describing the experiment is slated to be published in Nature, so I don't have access to it yet, but I'll try to put together an explanation when I get a copy. The experiment involves a phenomenon known as the "Quantum Zeno Effect," though, which deserves a…
Last week's Casual Friday study was all about illusion. For example, you may have thought our goal was to see how well you could recognize an illusion. However, we really just wanted to know what kind of computers our readers use: Amazingly, Cognitive Daily readers use Macs at a rate (22.8 percent) about seven times higher than the U.S. market share of Apple Computers (roughly 3 percent). We did also want to know something about how you see illusions, so we designed a simple experiment based on a brilliant illusion by Akiyoshi Kitaoka. If you haven't visited his web site full of astonishing…
The final and most recent of the Top Eleven is an experiment that goes right to the heart of the weirdness inherent in quantum mechanics. Who: Alain Aspect (1947-present), a French physicist. (Again, Wikipedia is a let-down, but CNRS has useful information.) When: Around 1982 (there are several experiments involved, but the 1982 one is cited by most people). What: His group performed the first experimental tests of Bell's Inequality, which shows that the predictions of quantum mechanics cannot be explained by a "local hidden variable" theory. Explaining that will take some space, so I'll…
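For reference while waiting for that explanation, a commonly tested (CHSH) form of Bell's inequality can be written as below. This is a standard textbook statement, not taken from the post itself; E(a,b) denotes the measured correlation of outcomes for detector settings a and b.

```latex
% CHSH form of Bell's inequality.
% Any local hidden variable theory obeys |S| <= 2,
% while quantum mechanics allows values up to 2*sqrt(2).
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
\qquad |S|_{\mathrm{LHV}} \le 2,
\qquad |S|_{\mathrm{QM}} \le 2\sqrt{2}
```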
Sure enough, Bush did the "science" thing last night. I've already preemptively explained why he's not a credible messenger on this topic; so has DarkSyde (and I'm sure many others on the blogs). Still, let's parse the president's message a bit more: First, I propose to double the federal commitment to the most critical basic research programs in the physical sciences over the next 10 years. This funding will support the work of America's most creative minds as they explore promising areas such as nanotechnology and supercomputing and alternative energy sources. Second, I propose to make…
Over at Gene Expression, Razib spins an interesting question off my call for blog posts: why are there so many biology bloggers? As I said in comments over there, I think there are two main reasons why you find more bio-bloggers than physics bloggers. The first is that there are simply more biologists than physicists-- we're expecting an unprecedented 13 senior physics majors next year, which is forcing some frantic re-organization to handle the load, but a class that small would be a major crisis for the Biology department. The second reason is that biology is really the main front of the "…
Chris Mooney has a well-written review of Michael Crichton's State of Fear. I picked up a copy at the bookstore and read a couple of pages from the middle. It was like a Tech Central Station column, except that it was a speech by one of the characters, with occasional lame objections by another character. Oh, and it had footnotes. I don't know if you were supposed to imagine Crichton's character speaking the footnotes or what. I didn't buy the speech or the book. John Quiggin also has a book review. His is of Lomborg's new book. Over at RealClimate Michael…
In this column, Richard Muller claims that McKitrick and McIntyre have shown that the hockey stick graph is an "artifact of poor mathematics". If you have been following the global warming debate, this claim should look familiar, because McKitrick and McIntyre made the same claim last year as well. So what's new? Well, last year they claimed that the hockey stick was the product of "collation errors, unjustifiable truncation or extrapolation of source data, obsolete data, geographical location errors, incorrect calculations of principal…
The graph above, which Iain Murray claimed showed that "The fact that the ten hottest years happened since 1991 may well be an artifact of the collapse in the number of weather monitoring stations contributing to the global temperature calculations following the fall of communism (see graph)", comes from this paper by Ross McKitrick. McKitrick was recently in the news for publishing a controversial paper claiming that an "audit" showed the commonly accepted reconstruction of temperatures over the past 1000 years to be incorrect, so I thought it would be…
R2 values using county-level data, 1977-2000 (corresponds to Lott's corrected Table 3a)

Model                                Violent Crime  Murder  Rape  Aggravated Assault  Robbery  Property Crimes  Auto Theft  Burglary  Larceny
R2 without any shall-issue variable  0.86           0.81    0.76  0.80                0.91     0.81             0.84        0.81      0.80
R2 (single dummy variable model)     0.86           0.81    0.76  0.80                0.91     0.82*            0.85*       0.81      0.80
R2 (spline model)                    0.86           0.81    0.76  0.81*               0.91     0.82*            0.85*       0.81      0.80
R2 (hybrid model)                    0.86           0.81    0.76  0.81*               0.91     0.82*            0.85*       0.81      0.80

Values marked with an asterisk (*) increased when the shall-issue variable was added. My thanks to David Powell for computing these values.
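For readers who want to reproduce this kind of comparison, here is a minimal sketch of how one might compute R2 with and without a shall-issue dummy, using statsmodels OLS on synthetic data (this is not Lott's county-level dataset or David Powell's actual code, and the variables are placeholders):

```python
# Sketch of the comparison in the table above, on synthetic data:
# fit a regression with and without a shall-issue dummy and compare R-squared.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
controls = rng.normal(size=(n, 3))             # stand-in control variables
shall_issue = rng.integers(0, 2, size=n)       # 0/1 shall-issue dummy
y = controls @ np.array([0.5, -0.3, 0.2]) + rng.normal(scale=1.0, size=n)  # crime-rate proxy

X_base = sm.add_constant(controls)
X_full = sm.add_constant(np.column_stack([controls, shall_issue]))

r2_base = sm.OLS(y, X_base).fit().rsquared
r2_full = sm.OLS(y, X_full).fit().rsquared
print(f"R2 without shall-issue dummy: {r2_base:.2f}")
print(f"R2 with shall-issue dummy:    {r2_full:.2f}")
```

Adding a regressor can never lower R2, so the interesting question is how little it rises, which is the point of the table.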
Crime rates go up and crime rates go down. Before seizing on some possibly coincidental factor such as gun training or gun control as the cause of the change, we need to establish whether the change was unusual, i.e. statistically significant. The only attempt I have seen to establish this is in Kleck and Bordua's paper, which claims that the change was significant since it exceeded two standard deviations. This is wrong. A rate two standard deviations from the mean would be significant, but a change is the difference between two rates and so has a standard deviation about 1.4 times larger; changes exceeding two standard deviations occur 15% of the time for normal variates, nowhere near the 5%…
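A quick numerical check of that 15% figure, assuming the two rates are independent normal variates with the same standard deviation (the script is mine, not from the original post):

```python
# If two independent rates are each N(mu, sigma^2), their difference is
# N(0, 2*sigma^2), so a change larger than two of the *original* standard
# deviations is not a 5%-level event.
from math import sqrt
from scipy.stats import norm

# Threshold of 2*sigma is 2/sqrt(2) = sqrt(2) in units of the difference's SD.
p_change = 2 * norm.sf(2 / sqrt(2))
p_single = 2 * norm.sf(2)            # a single value two SDs from its own mean
print(f"P(change exceeds two SDs):       {p_change:.1%}")  # about 15.7%
print(f"P(single value exceeds two SDs): {p_single:.1%}")  # about 4.6%
```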
If you want to consider population density, Alaska has a density 7 times that of Yukon. This is a rather enormous difference. Andy Freeman said: But, is it a significant one? The relative size of the empty spaces probably doesn't matter much, except when it comes to computing average population density, because we really can ignore places where there's no one around to kill or be killed. I think it is up to those who claim that the two places are comparable to show that, ignoring uninhabited areas, the densities are the same. Here is another way they differ: % of population living in…
The Terminator said: Excluding the United States and Switzerland would make this worse. Further, do you have any justification for excluding them? Eliminating data points simply because they don't "fit" isn't very good methodology. Because with least squares estimation, outlying values bias the results. In this case they make the correlation higher, giving a value that would be quoted by a politician, not a statistician. I made the same mistake in an oral presentation a semester ago. I was computing a linear regression for data taken in the Millikan Oil Drop Experiment. I "threw out"…
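To illustrate the point about outliers and least squares, here is a generic synthetic example (not the gun-ownership or Millikan data): a single extreme point can pull the fitted slope and inflate the apparent correlation.

```python
# One outlier can dominate an ordinary least squares fit.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=30)
y = 1.0 + 0.0 * x + rng.normal(scale=1.0, size=30)   # no real trend in the data

def slope_and_r(x, y):
    slope = np.polyfit(x, y, 1)[0]       # OLS slope
    r = np.corrcoef(x, y)[0, 1]          # Pearson correlation
    return slope, r

print("without outlier:", slope_and_r(x, y))

# Add one extreme point far from the rest and refit:
x_out = np.append(x, 30.0)
y_out = np.append(y, 20.0)
print("with outlier:   ", slope_and_r(x_out, y_out))
```

The slope and correlation jump once the outlier is included, which is why dropping or keeping such points has to be justified on the merits rather than by whether they "fit".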