Starts With A Bang

Comments of the Week #179: From mirrorless telescopes to the physics ideas that must die

Outside the event horizon of a black hole, General Relativity and quantum field theory are completely sufficient for understanding the physics of what occurs. But near the singularity, a quantum theory of gravity is needed. Image credit: NASA.

“It’s easier to hold onto a bad idea if you never share it, and it’s harder to defend one if you let it out.” -Victor LaValle

After catching up with a big double-dose of our comments last week, Starts With A Bang! is here again with the latest! For those of you looking forward to my newest book, Treknology, it drops just one week from today! I will have special instructions in next week’s comments of the week for anyone who wants me to personally ship them an autographed copy, so look for it if you want one! With that said, let’s take a look back at our past week, and all the stories we’ve hit:

There’s a whole lot coming up; I’ll be on Coast to Coast AM next weekend (very early Monday morning), I’ve been involved in a bunch of podcasts, and I’ll be doing all sorts of speaking engagements over the coming months in Oregon and Washington, and then (hopefully) even further! And for those of you into classic RPGs and world-building, you may really enjoy this piece I wrote on adding some science to your fictional world. With that said, let’s dive into what you had to say on this edition of our comments of the week!

From macroscopic scales down to subatomic ones, the sizes of the fundamental particles play only a small role in determining the sizes of composite structures. Image credit: Magdalena Kowalska / CERN / ISOLDE team.

From Michael Kelsey on how we measure the size of a particle: “So instead, in the real world, we measure the size of quantum entities by scattering: throw something at what you want to measure and see how it bounces. This works extremely well for charged entities. It’s how Rutherford figured out that the gigantic (10^-10 m) atom has a really tiny (10^-14 m) hard little core in the middle, with pretty much empty space around it.”

Michael’s entire comment is a really great explanation of how deep inelastic scattering works, and he even goes into the math a bit. I’m going to try to put that math into words, so you understand what he’s talking about. (And know that he is completely correct.) Basically, you assume the particle is a single point, with all of its properties (electric charge, color charge, weak hypercharge, etc.) contained in that single point-like entity, and you fire other particles at it that will interact with it. If it were truly a point, there would be a very specific angular distribution to how the particles would recoil from interacting with it; if it were anything other than a point, such as an entity taking up a finite volume, the recoils would depart from that angular distribution.

When we say “the size of a particle is equal to this amount,” that is the size/scale where we observe that departure from the expected distribution. When we say, “the size of a particle is less than this amount,” we mean we observe no departure from the expected distribution down to the limits of our sensitivity. And to the best we can tell, all the Standard Model particles are point-like.
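To make that concrete, here’s a minimal numerical sketch (my own illustration, not Michael’s math) of how a finite size shows up. For a point charge, Rutherford scattering has an angular distribution proportional to 1/sin^4(θ/2); for an extended charge distribution, that rate gets multiplied by a form factor |F(q)|^2 that depends on the momentum transfer q, so the ratio of the two isolates the finite-size effect. The Gaussian form factor, the 500 MeV/c beam momentum, and the 1 fm target radius are all assumptions chosen for illustration:

```python
import numpy as np

HBARC = 197.327  # MeV*fm; converts a momentum transfer to an inverse length

def momentum_transfer(p_mev, theta_rad):
    """Elastic momentum transfer q = 2 p sin(theta/2), in MeV/c."""
    return 2.0 * p_mev * np.sin(theta_rad / 2.0)

def gaussian_form_factor(q_mev, radius_fm):
    """|F(q)| for a Gaussian charge distribution with rms radius radius_fm."""
    return np.exp(-((q_mev * radius_fm / HBARC) ** 2) / 6.0)

p_beam = 500.0   # assumed beam momentum, MeV/c
radius = 1.0     # hypothetical target size, femtometers

for deg in (10, 30, 60, 90, 120):
    theta = np.radians(deg)
    q = momentum_transfer(p_beam, theta)
    # Ratio of the finite-size scattering rate to the point-like
    # (Rutherford) rate; the 1/sin^4(theta/2) factor cancels in the ratio.
    ratio = gaussian_form_factor(q, radius) ** 2
    print(f"theta = {deg:3d} deg   q = {q:6.1f} MeV/c   finite/point = {ratio:.4f}")
```

At small momentum transfer the ratio sits near 1 and the target looks point-like; as q grows, the ratio falls away from 1, and the scale where that departure sets in is what we quote as the “size.” If no departure is ever seen, all you get is an upper limit: the “less than this amount” statement above.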

This photo, taken at the Astrium France facility in Toulouse, shows the complete set of 106 CCDs that make up Gaia’s focal plane. The CCDs are bolted to the CCD support structure (CSS). The CSS (the grey plate underneath the CCDs in this photo) weighs about 20 kg and is made of silicon carbide (SiC), a material that provides remarkable thermal and mechanical stability. The focal plane measures 1 × 0.5 metres. Image credit: ESA’s Gaia / Astrium.

From Frank on Optical Phased Arrays: “There are new tech I think you don’t know about”

The tech that Frank is referring to can be used to make a “lens-free camera,” but the big problem is it relies on a series of light waves impacting a surface to determine direction, and it needs that light to be coherent, like laser light. This is fine for very specific applications, like the optical equivalent of radar, but it won’t give you the very high-resolution image you need. Gathering light and maximizing the utility of every photon is what makes observational astronomy progressively more powerful as time goes on. If we could simply measure the wavelength and the direction of every incoming photon to arbitrary accuracy, we’d have the perfect telescope.

Mirrors lose some photons, but they can measure direction to an accuracy set by the ratio of the photon’s wavelength to the diameter of the primary mirror, and wavelength to the accuracy of the filter utilized. Optical phased arrays smear out that direction by a factor of many thousands. Resolution matters a lot in astronomy; this is not the way to go.
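For a sense of the numbers, here’s a quick back-of-the-envelope sketch of that wavelength-to-diameter limit, using the standard Rayleigh criterion, θ ≈ 1.22 λ/D; the mirror diameters are illustrative choices:

```python
import math

def diffraction_limit_arcsec(wavelength_m, diameter_m):
    """Rayleigh-criterion angular resolution, theta ~ 1.22 lambda/D, in arcsec."""
    theta_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(theta_rad) * 3600.0

wavelength = 550e-9  # visible (green) light, ~550 nm
for d_m in (0.1, 2.4, 8.0):  # backyard scope, Hubble-class, large ground mirror
    print(f"D = {d_m:4.1f} m  ->  {diffraction_limit_arcsec(wavelength, d_m):6.3f} arcsec")
```

A Hubble-class 2.4-meter mirror resolves about 0.06 arcseconds in visible light; smear that direction out by a factor of many thousands and you’re left with arcminute-scale blobs, useless for imaging.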

Atoms can link up to form molecules, including organic molecules and biological processes, in interstellar space as well as on planets. Is it possible that life began not only prior to Earth, but not on a planet at all? Image credit: Jenny Mottar.

From Steve Blackband on Star Trek: Discovery: “The bio/physics stuff is fluffy, magic space mushrooms (dude, cool) – but so is the rest.
Its going to go wrong of course – thats the problem with prequels – its not in all the following series, though in terms of quantum meets biology maybe this is the forerunner of the Genesis device.”

I will say that there is a difference between science fiction and science fantasy. To me, the difference is that science fiction looks at our Universe as we know it, seeks to extend it in an unproven direction, and applies those extensions to how things might play out for humanity. Science fantasy looks at our Universe as we know it and ignores some of what we know, rather than circumventing it with new ideas/theories, to suit the story’s plot needs. The bio/physics stuff about the magic space fungi falls into that latter category; they could have done it only slightly differently to make it viable. So it comes off as laziness on their part, as though they consulted a half-assed scientist instead of a bona fide expert, and that’s what they got.

Michael Burnham, initially brought over as a prisoner, finds herself aboard the USS Discovery, captained by the intriguing but sketchy Gabriel Lorca. Image credit: Jan Thijs/CBS © 2017 CBS Interactive.

From Denier (and Adam) on a neat theory about Star Trek: Discovery: “I think you hit the nail on the head with that one. I think the entire ship is Section 31. Even the ship number is NCC-1031.

Suru’s warcrimes don’t matter to Sec 31 because it was done in the interest of killing the enemy and they have a war to win. I also put zero stock in how superior officers were acting, weak security, and even the genital fungus beast. It has already been established that Sec 31 tests their recruits before admitting them. That the captain knew she had broken in to the lab supports the idea that it was all just evaluation testing.”

I love this Section 31 theory. I think it’s perfect. I think it’s so good I’m going to keep it in mind when I watch the latest episode of Trek tonight, and depending on how it goes, I might put it into my write-up for Forbes on Monday. I was thinking that Discovery is the Federation’s “black ops” department, and the pieces are fitting together. The ship’s call number, NCC-1031, is the biggest reveal. Thank you for putting that together.

A computer simulation, utilizing the advanced techniques developed by Kip Thorne and many others, allows us to tease out the predicted signals arising in gravitational waves generated by merging black holes. Image credit: Werner Benger, cc by-sa 4.0.

From dean on the Nobel Prize in Physics this year: “Amusing to see two of the primary deniers of any science they don’t like (which is pretty much all of it) acting all butt-hurt over an award they try to dismiss.

Congratulations to the researchers on well-deserved honors.”

When you see something you don’t like, or you’re skeptical of, it’s only human nature to seek out the other voices that don’t like it or express skepticism. When it comes to evaluating those other claims, however, you must scrutinize them as carefully as possible. What the Danish team that LIGO’s detractors point to actually claimed was that there were unexplained correlations in the noise between the two LIGO detectors.

But that does not mean there’s no signal. Take a look at the most recent GW event:

The noise (top), the strain (middle), and the reconstructed signal (bottom) in all three detectors. Image credit: The LIGO and VIRGO scientific collaborations.

Do you see that signal in the Livingston detector? That’s a signal-to-noise ratio of 14. If it were just Hanford and VIRGO, it would be far less compelling. But we have lots of events (4) now, lots of data, and lots of evidence. If you were seeking independent confirmation, you have it now from multiple detectors and multiple events. Everything is consistent.
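For some intuition about what a signal-to-noise ratio of 14 means, here’s a toy matched-filter sketch on synthetic data (my own illustration: a crude chirp-like template and white noise, not LIGO’s actual templates or pipeline):

```python
import numpy as np

rng = np.random.default_rng(42)
n, dt = 4096, 1.0 / 4096.0
t = np.arange(n) * dt

# A crude "chirp"-like template: rising frequency, rising amplitude.
template = (t / t[-1]) * np.sin(2 * np.pi * (30 + 200 * t) * t)
template /= np.linalg.norm(template)          # unit norm

noise = rng.normal(0.0, 1.0, n)
data = noise + 14.0 * template                # inject a signal at SNR ~ 14

def matched_filter_snr(d, h):
    """SNR of data d against unit-norm template h (white-noise assumption)."""
    return np.dot(d, h)  # the noise contribution has unit variance

print(f"SNR with signal: {matched_filter_snr(data, template):6.2f}")
print(f"SNR, noise only: {matched_filter_snr(noise, template):6.2f}")
```

Against pure noise, the filter output is a random number of order 1; the injected signal stands out at the 14-sigma level. And that’s before demanding, as LIGO does, that the same signal appear coincidentally in multiple detectors.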

Haters gonna hate, but Nobel Prizes are still going to be awarded. And in this case, rightfully so!

Rainer Weiss, Barry Barish and Kip Thorne are your 2017 Nobel Laureates in physics. Image credit: © Nobel Media AB 2017.

From Pino on who deserves the Nobel Prize: “What about Einstein? …
He still doesn’t deserve the Prize for Relativity?
Nonsense.
p.”

There is a big push in the scientific community to get the Nobel Prizes to honor people who have done their work more recently, rather than the “typical” awards that go to research that was done decades ago and whose value is only properly recognized today. The awards for the Higgs boson and the discovery of gravitational waves are the exceptions; the recent prizes for topology in materials or blue LEDs are far more typical.

There is also a push to begin awarding collaborations, which I’m split about. Yes, science these days is mostly done by large collaborations, not by lone individuals, and so the prize should recognize everyone who contributed substantially, not just a few of the most famous individuals or those in the highest positions of leadership. But it would dilute the impact and the prestige of the prize if some ~1000+ people had to share it; would a Nobel even feel like it meant anything if so many people won it?

But no one is talking about posthumous awards here. The Nobel is for the living only. I like that rule.

Two possible entanglement patterns in de Sitter space, representing entangled bits of quantum information that may enable space, time and gravity to emerge. Image credit: Erik Verlinde.

From Sean T on new ideas vs. good ideas: “has it ever occurred to you that the so-called “groupthink” is simply a result of looking at the evidence at hand and coming to an agreement that the current scientific consensus is what best fits that evidence? Creative is not necessarily correct. The main factor that suppresses new ideas is that pesky thing called observational evidence. If the new idea doesn’t explain the observations better than the old idea, then it will be rejected. The burden is on the creator of the new idea to show that it is better than the consensus one, not on the scientific community to demonstrate that the consensus idea is superior.”

This is the problem with most new ideas: they add one new free parameter to explain one new observation. That’s generally not a good physics idea, since nothing is really gained by it. (It meets criterion #2, below, only.) Remember, to modify gravity or to replace our current theory, you need to:

  1. Reproduce all the prior theory’s successes.
  2. Explain the observations the pre-existing theory does not.
  3. Make new predictions, distinct from the prior theory’s, that can be tested.

Verlinde’s theory has failed at #1, succeeded in one aspect of #2, and made new, unsuccessful predictions for #3. In other words, current data already invalidates it. The burden is on him to show that his theory is not garbage to start with. If you read the whole live blog, you’ll see my conclusion:

Too bad; no addressing of the quantitative aspects of his idea. It was interesting to listen and I’m glad I’ve heard it, but I’m more convinced that this is a cherry-picked solution he claims to have come up with, and that the details, particularly as the Universe evolves, won’t turn out to be consistent. I’m also convinced that the physicist who shows this won’t be Erik Verlinde.

If gravitation isn’t fundamental, but is rather an emergent force that comes about from the properties of fundamental qubits of information, perhaps this new way of looking at the Universe will answer some of our greatest fundamental puzzles. Image credit: flickr gallery of J. Gabas Esteban.

There are still some mysteries whose answers we’ll have to wait a little longer for the Universe to reveal.

From Michael Mooney on housekeeping over here: “I appreciate that Ethan tried to clean up the nastiest personal abuse here (WOW for example) but why is Elle H.C. (see #9) still here spewing venomous insults?”

Yes, Chelle is not always kind to others on this blog. In fact, I’d probably rank them as the fourth most routinely offensive of our present suite of commenters. If you want me to ban Chelle, then I’d have to ban Ragtag Media, you, and CFT as well. Let’s try keeping all of you around for a little longer, shall we?

In other words: don’t get worse.

The earliest stages of the Universe, before the Big Bang, are what set up the initial conditions that everything we see today has evolved from. Image credit: E. Siegel, with images derived from ESA/Planck and the DoE/NASA/ NSF interagency task force on CMB research.

From Fred on whether inflation is or isn’t science: “I’m generally a fan of your sensible take on things but I’m surprised you would dimiss the objections to inflation raised by Steinhardt and others (also expressed in Sabine Hossenfelder’s recent guest appearance here), without properly addressing those criticisms.”

So I wrote what I wrote because showing how inflation is science, independent of subjective or irrelevant criticisms (legitimate or not), was the strongest refutation I could think of to the claims that inflation is no longer science. Yes, I don’t think you learn very much about the Universe by building a bunch of models, unless you take the tactic that some have (including Kamionkowski and collaborators in their 2016 ARAA paper) of showing which models are valid and which are disfavored based on observables like the tensor-to-scalar ratio. But I don’t think saying, “I can concoct an arbitrarily complicated model, with as many free parameters as it takes, to produce anything I want” is a good argument.

There is a huge set of classes of simple, well-motivated inflationary models that make the generic predictions I provided you with. They all give you flatness, a causally connected horizon, no monopoles, etc., within your observable Universe, regardless of the initial conditions that your Universe possessed before inflation took place. In fact, regardless of the initial conditions your Universe possessed before the final 10^-33 seconds (or so) of inflation!
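Here’s a minimal numerical sketch of that insensitivity to initial conditions, using the standard result that N e-folds of inflation suppress any initial departure from flatness, |Ω − 1|, by roughly e^(-2N) (up to order-unity factors; the starting values below are arbitrary assumptions for illustration):

```python
import math

def flatness_after(n_efolds, initial_departure):
    """|Omega - 1| after n_efolds of inflation, up to O(1) factors."""
    return initial_departure * math.exp(-2.0 * n_efolds)

for departure0 in (0.5, 10.0, 1e6):   # wildly different starting curvatures
    print(f"|Omega-1| starts at {departure0:9.1e} -> "
          f"after 60 e-folds: {flatness_after(60, departure0):.1e}")
```

Start the Universe off 50% curved, curved by a factor of 10, or curved by a factor of a million: after the ~60 e-folds needed to encompass our observable Universe, every case ends up indistinguishably close to perfectly flat.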

Sabine thinks it isn’t science until you have a well-defined probability measure for the end results of inflationary phase space, and inflation doesn’t have one. I think that showing this generic behavior in a wide range of cases, where the phase space overwhelmingly results in a Universe like our observable one, is an adequate substitute, as do many others (we’ve been having an intense conversation on Twitter; start here and follow the sub-threads) like Will Kinney and Richard Easther. But arguing over “what it takes to convince us that X is true” when we are convinced by different pieces of information highlights the fact that we haven’t “proven” everything about inflation. Some think that means it’s not science; some think the counterarguments are ridiculous. But that’s the reason why things broke down as they did.

The stars and galaxies we see today didn’t always exist, and the farther back we go, the closer to an apparent singularity the Universe gets, but there is a limit to that extrapolation. To go all the way back, we need a modification to the Big Bang: cosmological inflation. Image credit: NASA, ESA, and A. Feild (STScI).

Also from Fred on an alternative to inflation: “Also: what do you make of this, which claims to do away with the need for an inflationary phase: http://nautil.us/issue/53/monsters/the-universe-began-with-a-big-melt-not-a-big-bang? Can you tell whether the paper referenced is serious work?”

So, do you remember what I said about the problem with most new ideas? How they add X free parameters (or X pieces of new physics) to solve X problems, whereas you should be solving Y problems, with Y > X (or ideally Y >> X)? In the case of this new paper, which the Nautilus article is based on, X = 2. I predict that no one whose last name isn’t Padmanabhan will continue to work on this (IMO) bad idea.

Image credit: Axion Dark Matter eXperiment (ADMX), via U. of Washington.

From Axil on axionic dark matter: “Whatever happened to the Axion? If a few Axions are generated, then the Proton would split.”

Well, that’s supremely not true. The standard Peccei-Quinn axion was ruled out back in the 1990s by the ADMX collaboration, but variations are still being searched for, and theoretical work is still being done. In no variant that I’ve ever seen does an axion mediate proton decay; I think you made that part up because it sounded good to you. It is not only wrong; it isn’t even feasible.

Image credit: John Cooke, of “Piltdown Man”, one of history’s most elaborate scientific hoaxes.

And finally, from Julian Frost (seconding but elaborating on) MobiusKlein‘s comments: “I have to second MobiusKlein. I thought it was a part of science to devise hypotheses, test them and, from the experimental results, develop refinements to them. Or are you saying that in these cases the refinements don’t cover the difference between the original hypotheses and the experimental results?”

For all five of these cases that I gave, there was an initial motivation for investigating these areas. Proton decay was supposed to be an indicator of SU(5) grand unification, but protons outlived the predicted lifetime. Now people have gone to more elaborate unification schemes, some of which have already been ruled out, but they keep pushing to larger-and-larger groups to extend the proton’s lifetime. Modified gravity was supposed to be an alternative to dark matter. 35+ years on, it still cannot meet the first criterion for a new theory: reproducing the successes of the one it’s seeking to replace. Supersymmetry was designed to solve the hierarchy problem. That ship has sailed based on experimental data. Technicolor predicted that there would be no Higgs boson. There is one. And WIMP dark matter predicted a specific production amount related to mass and cross section. Those values are 100% ruled out.

These things could still happen: the proton could still decay, gravity could still need a modification, SUSY could exist at some higher scale, Technicolor variants could be discovered at some higher energy scale, and dark matter could have some WIMP-like properties that are very different from the initial predictions. But the original motivations for them are gone, and people are plowing ahead having learned no lessons and doing nothing different, except doubling down on their initial assumptions and looking for the same ill-motivated things to higher and higher precision. And that’s why they need to die: the ideas that motivated them are already dead.

And with that said, see you back here next week for more Starts With A Bang!