Wednesday, May 23, 2018

This is how I pray

I know, you have all missed my awesome chord progressions, but do not despair, I have relief for your bored ears.


I am finally reasonably happy with the vocal recordings, if not with the processing, or the singing, or the lyrics. But I’m working on that. The video was painful, both figuratively and literally; I am clearly too old to crouch on the floor for extended periods and had to hobble up and down the stairs for days after the filming.

The number one comment I get on my music videos is, sadly enough, not "wow, you are so amazingly gifted" but "where do you find the time?". To which the answer is "I don't". I did the filming for this video on Christmas, the audio on Easter, and finished on Pentecost. If it wasn't for those Christian holidays, I'd never get anything done. So, thank God, now you know how I pray.

Friday, May 18, 2018

The Overproduction Crisis in Physics and Why You Should Care About It

[Image: Everett Collection]
During World War II, the National Socialists murdered about six million Jews. The events did not have a single cause, but most historians agree that a major reason was social reinforcement, more widely known as “group think.”

The Germans who went along with the Nazis’ organized mass murder were not psychopaths. By all accounts they were “normal people.” They actively or passively supported genocide because at the time it was a socially acceptable position; everyone else did it too. And they did not personally feel responsible for the evils of the system. It eased their mind that some scientists claimed it was only rational to prevent supposedly genetically inferior humans from reproducing. And they hesitated to voice disagreement because those who opposed the Nazis risked retaliation.

It’s comforting to think that was Then and There, not Now and Here. But group-think isn’t a story of the past; it still happens and it still has devastating consequences. Take for example the 2008 mortgage crisis.

Again, many factors played together in the crisis’ buildup. But oiling the machinery were bankers who approved loans to applicants who likely couldn’t pay the money back. It wasn’t that the bankers didn’t know the risk; they thought it was acceptable because everyone else was doing it too. And anyone who didn’t play along would have put themselves at a disadvantage, by missing out on profits or by getting fired.

A vivid account comes from an anonymous Wall Street banker quoted in a 2008 NPR broadcast:
“We are telling you to lie to us. We’re hoping you don't lie. Tell us what you make, tell us what you have in the bank, but we won’t verify. We’re setting you up to lie. Something about that feels very wrong. It felt wrong way back then and I wish we had never done it. Unfortunately, what happened ... we did it because everyone else was doing it.”
When the mortgage bubble burst, banks defaulted by the hundreds. In the following years, millions of people would lose their jobs in what many economists consider the worst financial crisis since the Great Depression of the 1930s.



It’s not just “them,” it’s “us” too. Science isn’t immune to group-think. On the contrary: scientific communities are an ideal breeding ground for social reinforcement.

Research is currently organized in a way that amplifies, rather than alleviates, peer pressure: Measuring scientific success by the number of citations encourages scientists to work on what their colleagues approve of. Since the same colleagues are the ones who judge what is and isn’t sound science, there is safety in numbers. And everyone who does not play along risks losing funding.

As a result, scientific communities have become echo-chambers of likeminded people who, maybe not deliberately but effectively, punish dissidents. And scientists don’t feel responsible for the evils of the system. Why would they? They just do what everyone else is also doing.

The reproducibility crisis in psychology and in biomedicine is one of the consequences. In both fields, an overreliance on data with low statistical power and improper methods of data analysis (“p-value hacking”) have become common. That these statistical methods are unreliable has been known for a long time. As Jessica Utts, President of the American Statistical Association, pointed out in 2016, “statisticians and other scientists have been writing on the topic for decades.”

So why then did researchers in psychology continue using flawed methods? Because everyone else did it. It was what they were taught; it was the generally accepted procedure. And psychologists who’d have insisted on stricter methods of analysis would have put themselves at a disadvantage: They’d have gotten fewer results with more effort. Of course they didn’t go the extra mile.

The same problem underlies an entirely different popular-but-flawed scientific procedure: “mouse models,” i.e., using mice to test the efficacy of drugs and medical procedures.

How often have you read that Alzheimer’s or cancer has been cured in mice? Right – it’s been done hundreds of times. But humans aren’t mice, and it’s no secret that mouse results – while not uninteresting – often don’t transfer to humans. Scientists only partly understand why, but that mouse models are of limited use for developing treatments for humans isn’t controversial.

So why do researchers continue to use them anyway? Because it’s easy and cheap and everyone else does it too. As Richard Harris put it in his book Rigor Mortis: “One reason everybody uses mice: everybody else uses mice.”

It happens here in the foundations of physics too.

In my community, it has become common to justify the publication of new theories by claiming the theories are falsifiable. But falsifiability is a weak criterion for a scientific hypothesis. It’s necessary, but certainly not sufficient, for many hypotheses are falsifiable yet almost certainly wrong. Example: It will rain peas tomorrow. Totally falsifiable. Also totally nonsense.

Of course this isn’t news. Philosophers have gone on about this for at least half a century. So why do physicists do it? Because it’s easy and because all their colleagues do it. And since they all do it, theories produced by such methods will usually get published, which officially marks them as “good science”.

In the foundations of physics, the appeal to falsifiability isn’t the only flawed method that everyone uses because everyone else does. There are also those theories which are plainly unfalsifiable. Another example is arguments from beauty.

In hindsight it seems perplexing, to say the least, but physicists published tens of thousands of papers with predictions for new particles at the Large Hadron Collider because they believed that the underlying theory must be “natural.” None of those particles were found.

Similar arguments underlie the belief that the fundamental forces should be unified because that’s prettier (no evidence for unification has been found) or that we should be able to measure particles that make up dark matter (we didn’t). Maybe most tellingly, physicists in these communities refuse to consider the possibility that their opinions are affected by the opinions of their peers.

One way to address the current crises in scientific communities is to impose tighter controls on scientific standards. That’s what is happening in psychology right now, and I hope it’ll also happen in the foundations of physics soon. But this is curing the symptoms, not the disease. The disease is a lack of awareness of how we are affected by the opinions of those around us.

The problem will reappear until everyone understands the circumstances that benefit group-think and learns to recognize the warning signs: People excusing what they do by saying everyone else does it too. People refusing to take responsibility for what they think are “evils of the system.” People unwilling to even consider that they are influenced by the opinions of others. We have all the warning signs in science – have had them for decades.

Accusing scientists of group-think is standard practice of science deniers. The tragedy is, there’s truth in what they say. And it’s no secret: The problem is easy to see for everyone who has the guts to look. Sweeping the problem under the rug will only further erode trust in science.



Read all about the overproduction crisis in the foundations of physics and what you – yes you! – can do to help in my book “Lost in Math,” out June 12, 2018.

Tuesday, May 15, 2018

Measuring Scientific Broadness

I have a new paper out today and it wouldn’t have happened without this blog.

A year ago, I wrote a blogpost declaring that “academia is fucked up,” to quote myself because my words are the best words. In that blogpost, I had some suggestions how to improve the situation, for example by offering ways to quantify scientific activity other than counting papers and citations.

But ranting on a blog is like screaming in the desert: when the dust settles you’re still in the desert. At least that’s been my experience.

Not so this time! In the week following the blogpost, three guys wrote to me expressing their interest in working on what I suggested. One of them I never heard of again. The other two didn’t get along and one of them eventually dropped out. My hopes sank.

But then I got a small grant from the Foundational Questions Institute and was able to replace the drop-out with someone else. So now we were three again. And I could actually pay the other two, which probably helped keep them interested.

One of the guys is Tom Price. I’ve never met him, but – believe it or not – we now have a paper on the arXiv.
    Measuring Scientific Broadness
    Tom Price, Sabine Hossenfelder

    Who has not read letters of recommendations that comment on a student's `broadness' and wondered what to make of it? We here propose a way to quantify scientific broadness by a semantic analysis of researchers' publications. We apply our methods to papers on the open-access server arXiv.org and report our findings.

    arXiv:1805.04647 [physics.soc-ph]

In the paper we propose a way to measure how broad or specialized a scientist’s research interests are. We quantify this by analyzing the words they use in the titles and abstracts of their arXiv papers.
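As a toy illustration of what such a measure could look like, here is a minimal sketch assuming a Shannon-entropy-style definition over an author's keyword counts. To be clear, this is my illustration; it is not necessarily any of the four measures used in the actual paper, and the function name is invented:

```python
from collections import Counter
from math import log

def broadness(keyword_lists):
    """Shannon entropy of an author's keyword distribution: the more
    evenly spread over distinct keywords, the higher the score.
    A hypothetical stand-in for one of the paper's measures."""
    counts = Counter(kw for paper in keyword_lists for kw in paper)
    total = sum(counts.values())
    return -sum((n / total) * log(n / total) for n in counts.values())

# An author spread over many topics scores higher than a specialist.
specialist = [["stellar mass", "star-forming"]] * 5
generalist = [["chaos"], ["network"], ["fractal"], ["agents"], ["stellar mass"]]
assert broadness(generalist) > broadness(specialist)
```

Any entropy-like quantity behaves this way; the interesting work in the paper is in choosing and cross-checking several such measures against each other.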

Tom tried several ways to quantify the distribution of keywords, and so our paper contains four different measures for broadness. We eventually picked one for the main text, but checked that the other ones give largely comparable results. In the paper, we report the results of various analyses of the arXiv data. For example, here is the distribution of broadness over authors:

It’s a near-perfect normal distribution!

I should add that you get this distribution only after removing collaborations from the sample, which we have done by excluding all authors with the word “collaboration” in the name and all papers with more than 30 authors. If you don’t do this, the distribution has a peak at small broadness.
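In code, that cleaning step is essentially a one-line filter. A sketch, assuming each paper record carries its author list (the field name is hypothetical):

```python
def remove_collaborations(papers):
    """Drop papers with more than 30 authors or with a 'collaboration'
    among the author names, as described in the text."""
    return [
        p for p in papers
        if len(p["authors"]) <= 30
        and not any("collaboration" in a.lower() for a in p["authors"])
    ]

sample = [
    {"authors": ["A. Smith", "B. Jones"]},
    {"authors": ["The ATLAS Collaboration"]},
    {"authors": ["author%d" % i for i in range(31)]},
]
assert len(remove_collaborations(sample)) == 1
```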

We also looked at the average broadness of authors in different arXiv categories, where we associate an author with a category if it’s the primary category for at least 60% of their papers. By that criterion, we find physics.plasm-ph has the highest broadness and astro-ph.GA the lowest one. However, we have only ranked categories with more than 100 associated authors to get sensible statistics. In this ranking, therefore, some categories don’t even appear.

That’s why we then also looked at the average broadness associated with papers (rather than authors) that have a certain category as the primary one. This brings physics.pop-ph to the top, while astro-ph.GA stays on the bottom.
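The 60%-rule used above for associating authors with categories can be sketched like this (again, the record layout is hypothetical):

```python
from collections import Counter

def author_category(author_papers, threshold=0.6):
    """Return the arXiv category that is the primary category for at
    least 60% of an author's papers, or None if no category clears
    the threshold."""
    counts = Counter(p["primary_category"] for p in author_papers)
    category, n = counts.most_common(1)[0]
    return category if n / len(author_papers) >= threshold else None

papers = [{"primary_category": "astro-ph.GA"}] * 3 + [{"primary_category": "hep-th"}] * 2
assert author_category(papers) == "astro-ph.GA"   # 3 of 5 = 60%
```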

That astrophysics is highly specialized also shows up in our list of keywords, where phrases like “star-forming” or “stellar mass” are among those with the lowest broadness. On the other hand, the keywords “agents,” “chaos,” “network,” and “fractal” have high values of broadness. You can find the broadest and the most specialized keywords in the table below; see the paper for the full list.


We also compared the average broadness associated with authors who have listed affiliations in certain countries. The top scores of broadness go to Israel, Austria, and China. The correlation between the h-index and broadness is weak. Neither did we find a correlation between broadness and gender (where we associated genders by common first names). Broadness also isn’t correlated with the Nature Index, which is a measure of a country’s research productivity.

A correlation we did find, though, is that researchers whose careers suddenly end, in the sense that their publishing activity ceases, are more likely to have a low value of broadness. Note that this doesn’t necessarily say much about the benefits of some research styles over others. It might be, for example, that research areas with high competition and few positions are more likely to also be specialized.

Let me be clear that it isn’t our intention to declare that the higher the broadness the better. Indeed, there might well be cases when broadness is distinctly not what you want. Depending on which position you want to fill or which research program you have announced, you may want candidates who are specialized in a particular topic. Offering a measure for broadness, so we hope, is a small step toward honoring the large variety of ways to excel in science.

I want to add that Tom did the bulk of the work on this paper, while my contributions were rather minor. We have another paper coming up in the next weeks (or so I hope), and we are also working on a website where everyone will be able to determine their own broadness value. So stay tuned!

Friday, May 11, 2018

Dear Dr B: Should I study string theory?

Strings. [image: freeimages.com]
“Greetings Dr. Hossenfelder!

I am a Princeton physics major who regularly reads your wonderful blog.

I recently came across a curious passage in Brian Greene’s introduction to a reprint edition of Einstein's Meaning of Relativity which claims that:
“Superstring theory successfully merges general relativity and quantum mechanics [...] Moreover, not only does superstring theory merge general relativity with quantum mechanics, but it also has the capacity to embrace — on an equal footing — the electromagnetic force, the weak force, and the strong force. Within superstring theory, each of these forces is simply associated with a different vibrational pattern of a string. And so, like a guitar chord composed of four different notes, the four forces of nature are united within the music of superstring theory. What’s more, the same goes for all of matter as well. The electron, the quarks, the neutrinos, and all other particles are also described in superstring theory as strings undergoing different vibrational patterns. Thus, all matter and all forces are brought together under the same rubric of vibrating strings — and that’s about as unified as a unified theory could be.”
Is all this true? Part of the reason I am asking is that I am thinking about pursuing String Theory, but it has been somewhat difficult wrapping my head around its current status. Does string theory accomplish all of the above?

Thank you!

An Anonymous Princeton Physics Major”

Dear Anonymous,

Yes, it is true that superstring theory merges general relativity and quantum mechanics. Is it successful? Depends on what you mean by success.

Greene states very carefully that superstring theory “has the capacity to embrace” gravity as well as the other known fundamental forces (electromagnetic, weak, and strong). What he means is that most string theorists currently believe there exists a specific model for superstring theory which gives rise to these four forces. The vague phrase “has the capacity” is an expression of this shared belief; it glosses over the fact that no one has been able to find a model that actually does what Greene says.

Superstring theory also comes with many side-effects which all too often go unnoticed. To begin with, the “super” isn’t there to emphasize the theory is awesome, but to indicate it’s supersymmetric. Supersymmetry, to remind you, is a symmetry that postulates all particles of the standard model have a partner particle. These partner particles were not found. This doesn’t rule out supersymmetry because the particles might only be produced at energies higher than what we have tested. But it does mean we have no evidence that supersymmetry is realized in nature.

Worse, if you make the standard model supersymmetric, the resulting theory conflicts with experiment. The reason is that doing so enables flavor changing neutral currents which have not been seen. This became clear in the mid 1990s, sufficiently long ago so that it’s now one of the “well known problems” that nobody ever mentions. To save both supersymmetry and superstrings, theorists postulated an additional symmetry, called “R-parity” that simply forbids the worrisome processes.

Another side-effect of superstrings is that they require additional dimensions of space, nine in total. Since we haven’t seen more than the usual three, the other six have to be rolled up or “compactified” as the terminology has it. There are many ways to do this compactification and that’s what eventually gives rise to the “landscape” of string theory: The vast number of different theories that supposedly all exist somewhere in the multiverse.

The problems don’t stop there. Superstring theory does contain gravity, yes, but not the normal type of gravity. It is gravity plus a large number of additional fields, the so-called moduli fields. These fields are potentially observable, but we haven’t seen them. Hence, if you want to continue believing in superstrings you have to prevent these fields from making trouble. There are ways to do that, and that adds a further layer of complexity.

Then there’s the issue with the cosmological constant. Superstring theory works best in a space-time with a cosmological constant that is negative, the so-called “Anti de Sitter spaces.” Unfortunately, we don’t live in such a space. For all we presently know the cosmological constant in our universe is positive. When astrophysicists measured the cosmological constant and found it to be positive, string theorists cooked up another fix for their theory to get the right sign. Even among string-theorists this fix isn’t popular, and in any case it’s yet another ad-hoc construction that must be added to make the theory work.

Finally, there is the question of how much the requirement of mathematical consistency can possibly tell you about the real world to begin with. Even if superstring theory is a way to unify general relativity and quantum mechanics, it’s not the only way, and without experimental tests we won’t know which one is the right way. Currently the best-developed competing approach is asymptotically safe gravity, which requires neither supersymmetry nor extra dimensions.

Leaving aside the question whether superstring theory is the right way to combine the known fundamental forces, the approach may have other uses. The theory of strings has many mathematical ties with the quantum field theories of the standard model, and some think that the gauge-gravity correspondence may have applications in condensed matter physics. However, the dosage of string theory in these applications is homeopathic at best.

This is a quick overview. If you want more details, a good starting point is Joseph Conlon’s book “Why String Theory?” On a more general level, I hope you’ll excuse me if I mention that the question of what makes a theory promising is the running theme of my upcoming book “Lost in Math.” In the book I go through the pros and cons of string theory and supersymmetry and the multiverse, and also discuss the relevance of arguments from mathematical consistency.

Thanks for an interesting question!

With best wishes for your future research,

B.

Thursday, May 03, 2018

Book Review: “The Only Woman In the Room” by Eileen Pollack

The Only Woman in the Room: Why Science Is Still a Boys Club
By Eileen Pollack
Beacon Press (15 Sep 2015)

Eileen Pollack set out to become an astrophysicist but switched to a career in writing after completing her undergraduate degree. In “The Only Woman In The Room” she explores the difficulties she faced that eventually led her to abandon science as a profession.

Pollack’s book is mostly a memoir, and an oddly one-sided one at that. At least for the purpose of the book, she looks at everything from the perspective of gender stereotypes. It’s about the toys she didn’t get, and the teachers who didn’t let her skip a year, and the boys who didn’t like nerdy girls, and the professors who didn’t encourage her, and so on.

I had some difficulties making sense of the book. For one, Pollack is 20 years older than I am and grew up in a different country. In the book she assumes the reader understands the context, but frankly I have no idea what American school education looked like in the 1960s. I also missed most of the geographic, religious, and cultural references but wasn’t interested enough to look up every instance.

Leaving aside that Pollack clearly writes for people like her to begin with, the rest of the story didn’t make much sense to me either. The reader learns in a few sentences that Pollack in her youth develops an eating disorder. She also seems to have an anxiety disorder, is told (probably erroneously) that she has too high testosterone levels, and later regularly sees a therapist. But these asides never reappear in her narrative. Since it’s exceedingly unlikely her problems simply vanished, there must have been a lot going on which the reader is not told about.

The story of the book is that Pollack sets out to track down her former teachers, professors, and classmates, and hear what they’ve been up to and what, if anything, changed about the situation for women in physics. Things did change, it turns out: The fraction of female students and faculty has markedly increased and many men have come to see the good in that. Pollack concludes with a somewhat scattered list of suggestions for further improvement.

Pollack does mention some studies on gender disparities, but her sample seems skewed to confirm her beliefs and she does not discuss the literature in any depth. She entirely avoids the more controversial questions, like whether some gender differences in performance are innate, whether it’s reasonable to assume women and men should be equally represented in all professions, or whether affirmative action is compatible with constitutional rights.

Despite this, the book has its uses. It sheds light on the existing problems, and (as Google will tell you) in reaction many women have spoken about and compared their experiences. For me, the value of the book has been to let me see my discipline through somebody else’s eyes.

I found it surprising just how different Pollack’s story is from my own, though my interests seem to be very close to hers. I’ve been told from as early as I can recall that I’m not social enough, that I don’t play with the other kids enough, that I’m too quiet, don’t integrate well, am bad at group work, and “will never make it at the university” unless I “learn to work with others.” I am also the kind of person who doesn’t give a shit what other people think I should do, so I went and got a PhD in physics.

The problem that Pollack blames most for her dropping out – that professors didn’t encourage her to pull through courses she had a hard time with – is a problem I never encountered because I didn’t get bad marks to begin with. I didn’t have friends among the students either, but I was just glad they left me alone. And where I am from, university is tuition-free, so while my money was short, financing my education was never a headache for my family.

Like Pollack, I have a long string of DSM classifiers attached to me and spent several years in therapy, but it never occurred to me to blame my profs for that. When doctors checked my testosterone levels (which has happened several times over the decades) I didn’t conclude I must be a man, but that it’s probably a standard check for certain health problems. And since now you wonder, my hormone levels are perfectly normal. Or at least that’s what I thought until I read that Pollack had a crush on pretty much every one of her profs. Maybe I’m abnormal in that I never fancied my profs. Or that I never worried I might not find a guy if I study physics.

Nevertheless, Pollack is right of course that we have a long way to go. Gender disparities which reinforce stereotypes are still omnipresent, and now that I am mother of two daughters I don’t have to look far to see the problems. The kids’ teachers are all women except for the math teacher. The parents who watch their toddlers at the playground are almost exclusively mothers. And I get constantly told I am supposedly aggressive, sometimes for doing nothing more than looking the way I normally look, that is, mildly dismayed at the bullshit men throw at me. But I’m not quite old enough to write a memoir, so let me leave it at this.

I’d recommend this book for anyone who wants to understand why some women perceive science and engineering as extremely unwelcoming workplaces.

Monday, April 30, 2018

Me, Elsewhere

  • I spoke with Iulia Georgescu, who writes for the Nature Physics blog, about my upcoming book “Lost in Math.”
  • The German version of the book now also has an Amazon page. It sells me as “Ketzer,” meaning “heretic.” Well, I guess I indeed make some blasphemous remarks about other people’s beliefs.
  • Chris Lee has reviewed my book for Ars Technica. He bemoans that it lacks dramatic plot turns. Let me just say it’s really hard to be surprising if your editor puts the storyline in the subtitle.
  • It seems there will be an audio version after all. Will let you know if details emerge.
  • When I was in New York last year, the Brockmans placed me in front of a camera with the task to speak about what has been on my mind recently, just that I shouldn’t mention my book, which of course has been the only thing on my mind recently. I did my best.


Wednesday, April 25, 2018

A black hole merger... merger... merger


For my 40th birthday I got a special gift: 2.5 σ evidence for quantum gravity. It came courtesy of Niayesh Afshordi, professor of astrophysics at Perimeter Institute, and in contrast to what you might think he didn’t get the 2.5 σ on Ebay. No, he got it from a LIGO data analysis, results of which he presented at the 2016 conference on “Experimental Search for Quantum Gravity.”

Frankly I expected the 2.5 σ gift to quickly join the list of forgotten anomalies. But so far it has persisted, and it seems about time I unwrap this for you.

Evidence for quantum gravity is hard to come by because quantum fluctuations of space-time are so damn tiny you can’t measure them. To overcome this impasse, Afshordi and his collaborators looked at a case where the effects of quantum gravity can become large: Gravitational waves produced by black hole mergers.

Their idea is that General Relativity may not correctly describe black hole horizons. In General Relativity, the horizon bounds a region that, once entered, cannot be left again. The horizon itself has no substance and indeed you would not notice crossing it. But quantum effects may change the situation.

Afshordi and his group therefore studied the possibility that quantum effects turn the horizon into a physical obstacle that partly reflects gravitational waves. If that were so, the gravitational waves produced in a black hole merger would bounce back and forth between the horizon and the black hole’s photon sphere (a potential barrier at about 1.5 times the Schwarzschild radius). The waves would then leak out a little with each bounce rather than escape in one bang. If that’s what’s really going on, gravitational wave interferometers like LIGO should detect echoes of the original merger signal.

2.5 σ means that roughly one-in-a-hundred times random fluctuations conspire to look like the observed echo. It’s not a great level of significance, at least not by physics standards. But it’s still 2.5σ better than nothing.
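If you want to check such numbers yourself: for a Gaussian, a significance in σ converts to a tail probability via the complementary error function. A quick sketch using only the Python standard library (I use the two-sided convention here; which convention applies depends on the analysis):

```python
from math import erfc, sqrt

def p_two_sided(sigma):
    # Probability that a Gaussian fluctuation exceeds `sigma`
    # standard deviations in either direction.
    return erfc(sigma / sqrt(2))

# 2.5 sigma corresponds to roughly 1 in 80 two-sided (1 in 160 one-sided);
# the 5 sigma discovery threshold is below one in a million.
p = p_two_sided(2.5)
```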

Afshordi’s group extracted the echo signal from the LIGO data with their own analysis methods. Some members of the LIGO collaboration criticized this method and did their own analysis, concluding that the significance is somewhat lower. Afshordi’s group promptly complained the LIGO people make misleading statements and the results are actually consistent. You see they’re having fun.

The bottom line is that there’s some quarrel about exactly what the level of significance is, and exactly what’s the right way to analyze the data, but the results of both groups are by and large comparable. Something is there, but at this point we cannot be sure it’s a real signal.

I will admit that as a theorist, I am not enthusiastic about black hole echoes because there are no compelling mathematical reasons to expect them. We know that quantum gravitational effects become important towards the center of the black hole. But that’s hidden deep inside the horizon, and the gravitational waves we detect are not sensitive to it. That quantum gravitational effects are also relevant at the horizon is pure conjecture, and yet that’s what it takes to have black hole echoes.

But theoretical misgivings aside, we have never tested the properties of black hole horizons before, and on unexplored territory all stones should be turned. Indeed the LIGO collaboration has now added the search for echoes to its agenda.

There is even another group, this one in Toronto, which has done its own scan of the LIGO data. They found echoes at 3 σ. The Toronto group’s analysis has the benefit of being largely model-independent. But they advocate the use of periodic window-functions which induce spurious periodic signals. The authors show that in certain frequency regimes the side-effect of the windowing can be neglected and that in simulations they were able to extract the actual signal. But I suspect it will take more than this to convince anyone that imposing a periodic signal on data is a good way to look for a periodic signal.

Afshordi and his collaborators have meanwhile put out another paper, claiming that the evidence is as high as 4.2 σ. That’s a pretty high significance, inching close to an actual discovery. I am, however, not convinced by their latest study. The reason is that the more they doctor their model, the better they will get at finding specific types of echo in the noise. To correctly evaluate the significance they’d then have to take into account the number of different models they tried. Without doing that, the significance is bound to increase simply because they’ve tested more hypotheses.
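The effect of trying many models can be illustrated with a simple trials ("look-elsewhere") correction: if a reported p-value is the best out of N independent tries, the chance that pure noise produces something this extreme somewhere among them is correspondingly larger. A sketch (the N=100 is an invented number for illustration, not the actual count of models anyone tried):

```python
from math import erfc, sqrt

def p_one_sided(sigma):
    # One-sided Gaussian tail probability for a significance in sigma.
    return 0.5 * erfc(sigma / sqrt(2))

def p_global(p_local, n_trials):
    # Chance that at least one of n independent tries fluctuates this far.
    return 1.0 - (1.0 - p_local) ** n_trials

local = p_one_sided(4.2)              # about 1.3e-5
corrected = p_global(local, 100)      # about 1.3e-3, i.e. roughly 3 sigma
```

The numbers show why reporting a local significance without a trials correction can be seriously misleading: the same data drop from "inching close to a discovery" to merely "interesting."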

So I’d advise you to not read too much into the 4.2 σ. On the other hand, the LIGO people probably tried very hard to make the signal go away but didn’t manage to. Therefore I think at this point we can be confident there is something in the data. But to find out whether it’s more than just funny-looking random fluctuations, we will have to wait for more black hole mergers.


[I wrote about black hole echoes previously for Aeon and more recently for Quanta Magazine, and the story will probably come back a few times more.]

Monday, April 23, 2018

Interna

I am giving (another) seminar in Heidelberg on Wednesday (April 25th), this time about my upcoming book.

May 1st is a national holiday in Germany (labor day) and I’ll be off-grid due to family affairs for some days.

May 7th to 9th I am in Stockholm to get yelled at (it’s complicated).

On May 26th I am in Hay-on-Wye, a village someplace in the UK that hosts an event called How The Light Gets In, at which I am supposed to debate how “the pursuit of beauty drive[s] the evils that hold back a better society.” I wouldn’t go as far as calling grand unification an evil, so please don’t judge me by the tagline.

I have also been asked to share this image below. Hope it makes more sense to you than to me.


On May 28th I am giving a public lecture in Dublin at the Irish Quantum Foundations conference.

In summary this means May will be a very slow month on this blog.

Wednesday, April 18, 2018

Guest Post: Brian Keating about his book “Losing the Nobel Prize”

My editor always said “Don’t read reviews”... But given that I’ve received some pretty amazing reviews lately, how bad could it be? Nature even made a delightfully whimsical custom illustration of my conjecture that some of my fellow astronomers look to the skies for the Nobel Prize:

Illustration by Stephan Schmitz for Nature.


When I saw Sabine had finally gotten round to reading my book, I was thrilled! This is sure to be an awesome review from a fellow-traveller: a first-time author herself.

Gulp. After reading Sabine’s blog, I immediately regretted not taking my editor’s advice. But, Sabine was kind enough to offer me a chance to reply to her review (a review of a review?) so here goes.

First off, speaking of not reading things, the cover of the version I sent to Sabine explicitly says “don’t quote without checking against the final version” (See the white cover version in the upper left hand corner of this photograph on Medium).

Unfortunately, Sabine never read the finished version. In fact, the few times I asked her about her progress reading the “ADVANCE UNCORRECTED” review copy sent to her in August, she only replied “I’ve not started it”. Fair enough, she was writing her own book about things being Lost. And she did pass it on to a German publisher on my behalf, which was terribly kind of her.

But the version Sabine read was not even proof-read, nor copyedited, nor fact-checked. Right on the cover it implores the reader to “not quote for publication without checking against the finished book”. This is something she, as an author, probably should have realized before writing her review. I’ve reviewed multiple books for fellow physicists long before writing my own -- as has she -- and always make sure to cross-check against the final version(s) [plural because often I’ve read and listened to the audio book before writing my review].

[[SH: I quoted a single sentence, and I assume that sentence is still in the book because otherwise he’d have rubbed it in by now.]]

But, what about the substance of her review? Well, much of what’s inaccurate about it stems from unwarranted or incorrect assumptions. For example, she complains that I did not inquire as to what “Swedish Royal Academy has to say about the reformation plans”.

First of all, I’m not sure how she could possibly know with whom I’ve been in communication...she’s not Zuckerberg!

[[SH: I guessed that much from my exchange with him, and reading the book confirmed it because if he had been in touch with them he’d have mentioned it.]]

Secondly, not only did I seek (and receive) permission from the Swedish Academy, I corresponded with a member of the Swedish Academy...and they agreed with my proposal:

“Thanks for sending me your interesting piece in the Scientific American. Although I, for obvious reasons, cannot comment on any details concerning the Nobel prize, I can assure you that all the points that were rised [sic] in your article are actively discussed in the Committee and the Academy, and have been so for a long time. We are acutely aware of both challenges and difficulties related to revising the self-imposed rules for how the prize is awarded. We also very much welcome debate about these issues, so I thank you for caring about the future of the Nobel prize, and I will forward you article to the other members of the committee.”

How’s that for seeing what “they have to say”?

[[SH: I was the one who got Brian in contact with the above quoted member of the Swedish Academy after it became clear to me that Brian had not bothered communicating with them.]]

Some of this appears in the final book version.

[[SH: I am so happy I could be of help.]]

And all of this I would have been happy to share with Sabine...had she asked. I was gratified to see that my concerns were shared by them and that they had not, as Sabine asserted, ignored my ideas: “I’d be surprised if the Royal Academy even bothers responding to Keating’s book.”

I agree with her: I’m not convinced anything will come of it...until the day the Nobel Prize in physics is boycotted or sued. And I think that day is coming.

Why? It relates to two objections Sabine raised early in her review:

“I have found Keating’s book outright perplexing. To begin with let us note that the Nobel Prize is not a global community award. It’s given out by a Swedish committee tasked with executing the will of a very dead man.”

It’s really not that perplexing, Sabine. The Nobel Prize is a global event, not just a simple Swedish smorgasbord. The prize for peace, for example, is for world peace, not merely to implore Norsemen to stop making war with their many conniving enemies, right?

Fact: According to the Nobel Foundation, 100 million people tune into the festivities each year...ten times the population of Sweden and about 10% of the audience the Oscars receive. Winners become celebrities and the Nobel Committees revel in the fame and adoration the events receive.

Why, they’re even moving into a brand, spanking new $150M building in Stockholm next year, designed by a fancy architecture firm, for all their many festivities (the old venue is too small apparently).

The winners are disproportionately non-Scandinavian and the prize aims to reward those who have benefitted “all mankind”, not just Swedes. In fact, the prize for literature is currently undergoing a bona fide sex scandal:

“STOCKHOLM — A sexual abuse and harassment scandal roiling the committee that awards the Nobel Prize in Literature deepened on Wednesday, as the king of Sweden and the foundation that finances the prize warned that the scandal risked tarnishing one of the world’s most important cultural accolades.” [emphasis mine]

The Nobel Prize leaders and the King recognize the power of the prize. It is not only science’s greatest accolade, it’s the greatest one humanity has to offer as well. As such it should be held to a higher standard.

With respect to the many comments others have made about me having sour grapes, no one who reads the book could come away thinking I actually still want to win it. Of course, after reading Sabine’s review, many have cancelled their orders so they may never learn!

But even that notwithstanding, I’m often criticized for writing about it without having won the Nobel. I find that a bit silly. Can one not criticize Harvey Weinstein without being a member of the Academy? Can one not criticize President Trump if one has not been president?

As to this snarky bit: “Keating apparently thinks he knows better what Alfred Nobel wanted than Alfred Nobel himself. Maybe he does. I don’t know, my contacts in the afterworld have not responded to requests for comments.”

I resonate with Sabine’s admission that she has no direct lines to the afterworld; neither do I. And that’s the exact purpose of a will, isn’t it? “A last will and testament is a legal document that communicates a person's final wishes pertaining to possessions and dependents.”

Alfred died without any children or spouse...his will specified what was to be done with the money he made from the (world wide) patent on dynamite. After providing some kroner for his friends, and his nephews and nieces (the nieces got only ½ of what his nephews received...perhaps an early source of the prize’s legendary sexism?), he left money for the titular prizes. Reading the will, we learn:

“The whole of my remaining realizable estate shall be dealt with in the following way: the capital, ...shall be annually distributed in the form of prizes to those who, during the preceding year, shall have conferred the greatest benefit to mankind. ... one part to the person who shall have made the most important discovery or invention within the field of physics…”

One sees three conditions for awarding the prize:

1. The person [in the singular]
2. Who, in the preceding year,
3. Conferred “the greatest benefit to mankind”

I don’t need to have clairvoyance into the netherworld because Alfred did it for me. It is abundantly clear what he wanted and all three of these rules are routinely ignored and have been for over a century.

As for the power of the prize to affect the judgement and career choices of scientists, let me just say it affects non-astronomers too: "With physicist Peter Mansfield, Lauterbur in 2003 bested Damadian to Nobel recognition of magnetic resonance imaging (MRI). This outcome prompted the appearance of full-page ads financed by the Friends of Raymond Damadian in a number of newspapers, including The New York Times.” [emphasis mine].

Wherever there is an idol, people will bow down to it, the Nobel is no baal [Warning: another Old Testament reference], but it is no exception either.

As for the pro-experimental bias of my book “Keating for example suggests that the Nobel Prize only be given to “serendipitous discoveries,” by which he means if a theorist predicted it then it’s not worthy.” Sabine, how could you miss the lovely pie chart I made for you and your fellow brainiacs as well as the accompanying text: “A serendipity criterion would mean Nobel Prizes would go to the theorist(s) who predict new phenomena, though they should win only after experimental verification.”



[[SH: He also explains that if a theorist predicted it, then the experimental verification wasn’t serendipitous, so what gives? And yeah, I was about to make a joke about that pie chart but then felt sorry for the graphics designer who probably just did what Brian asked for.]]

In the book I am advocating that more theorists should win it, and experimentalists should not win it if they/we merely confirm a theory...that leaves them/us susceptible to confirmation bias. For reference, this was in the copy Sabine read.

Alas, time is fleeting and the launch of my book is a few short days away, so I must take leave of back reacting.

But I am thankful that Sabine has permitted me this chance to address some of her concerns. And I am grateful for the many kind words she did employ in her review. I enjoy your work and I wish you the best of luck with your book...may you never read a review you have to react to!

Monday, April 16, 2018

Book Review: “Losing the Nobel Prize” by Brian Keating

Losing the Nobel Prize: A Story of Cosmology, Ambition, and the Perils of Science’s Highest Honor
Brian Keating
W. W. Norton & Company (April 24, 2018)

Brian Keating hasn’t won a Nobel Prize. Who doesn’t know the feeling? But Keating, professor of physics at UC San Diego, isn’t like you and me. He had a good shot at winning. Or at least he thought he had. And that’s what his book is about.

Keating designed the BICEP telescope, whose upgrade – BICEP2 – made headlines in 2014 by claiming the first indirect detection of primordial gravitational waves through B-modes in the cosmic microwave background. Their supposed detection turned out to be contaminated by a foreground signal from dust in the Milky Way and, after a few months, was declared inconclusive. And there went Keating’s Nobel Prize.

In his book, Keating tells the story of the detection and its problems. But really the book is about his obsession with winning the Nobel Prize and his ideas for reforming the award’s criteria. That’s because Keating has come to the conclusion that pursuing science to the end of winning a Nobel Prize is no good, and he doesn’t want his colleagues to go down the same road. He also doesn’t think it’s fair that the prize goes to at most three people (who moreover must be alive), because by his own accounting he’d have been fourth on the list. At best.

“Losing the Nobel Prize” is well written and engaging and has a lot of figures and, ah, here I run out of nice things to say. But we all know you didn’t come for the nice things anyway, so let’s get to the beef.

I have found Keating’s book outright perplexing. To begin with let us note that the Nobel Prize is not a global community award. It’s given out by a Swedish committee tasked with executing the will of a very dead man. Keating apparently thinks he knows better what Alfred Nobel wanted than Alfred Nobel himself. Maybe he does. I don’t know, my contacts in the afterworld have not responded to requests for comments.

In any case, you’d think if someone writes a book about the Nobel Prize they’d hear what the Swedish Royal Academy has to say about the reformation plans. But for all I can tell Keating never even contacted them. His inside knowledge about the Nobel Prize is having been invited to nominate someone.

Keating doesn’t mention it, but the club of those eligible to submit nominations for the Nobel Prize is not very exclusive. Every tenured professor in the Nordic countries (in the respective discipline) can nominate candidates. Right, that’s not very equal opportunity. Fact is the Nobel Prize is unashamedly North European. And North Europeans in general, with Swedes in particular, don’t care whether US Americans like what they do. I’d be surprised if the Royal Academy even bothers responding to Keating’s book.

Even stranger is that Keating indeed seems to believe most scientists pursue their research because they want to win a Nobel Prize. But I don’t know anyone who has ever chosen a research project because they were banking on a Nobel. That’s because at least in my discipline it’s widely recognized that winning this award doesn’t merely require scientific excellence but also a big chunk of luck.

Maybe Keating is right and Nobel obsession is widespread in his own discipline, experimental astrophysics. But no such qualifier appears anywhere in the book. He is speaking for all of science. “Battle is an apt metaphor for what we scientists do.” And according to Keating, it’s not Nature’s reclusiveness that we battle but each other. And it’s all because everyone wants to win the Nobel Prize.

Keating at some point compares the Nobel Prize to Olympic Gold, but that’s a poor comparison. If you wanted to compare the Nobel Prize to say, discus throw, you’d have to let 99.9% of discuses randomly explode right after being thrown. And when that happens you disqualify the athlete.

In my experience, while almost everyone agrees that Nobel Laureates in Physics deserve their prizes, they also acknowledge it matters to be at the right place at the right time. And since everyone knows talent isn’t the only relevant factor to win the Prize, few hold grudges about not winning it. It’s really more a lottery than a competition.

Having said that, even after reading the book I am not sure just how Keating proposes we think about the Nobel Prize. While he uses analogies to athletic competition, he also compares the Nobel Prize with the Oscars and with a religious accolade, depending on what suits his argument. Possibly the reason for my difficulty understanding Keating is that he assumes his readers know what the fourth and fifth commandments are and what was up with that Golden Calf in the Old Testament. I may know what B-modes are, but that one beat me.

He also lost me at various other places in the book where I just couldn’t figure out what’s going on or why. For example, I guess pretty much everyone who reads the book will know that BICEP failed to measure the signal they were after. Yet the reader has to wait until Chapter 14 to hear what happened there. Chapter 15, then, is titled “Poetry for Physicists,” contains some reflections on how we’re all made of stardust and so on, praises Jim Simons for funding the Simons Observatory, mentions as an aside that Keating will be the Observatory’s director, and ends with a poem about dust. The next chapter snaps back to Nobel’s will and then goes on about the lack of female laureates. Then follows an image of Keating’s academic genealogy, and then we are transported into a hospital room to witness his father’s death.

If that sounds confusing it’s because it is. There are so many things in the book that didn’t make sense to me I don’t even know where to begin.

Keating for example suggests that the Nobel Prize only be given to “serendipitous discoveries,” by which he means if a theorist predicted it then it’s not worthy. You read that right. No Nobel for the Higgs, no Nobel for B-modes, and no Nobel for a direct discovery of dark matter (should it ever happen), because someone predicted that. Bad news for theorists I suppose. The culprit here seems to be that Keating (an experimentalist) doesn’t believe theory development has any role to play in experimental design. He just wants rich guys to crank out money so experimentalists can do whatever they want.

The most befuddling aspect of this book, however, is that Keating indeed seems to believe he had a chance of winning the Nobel Prize. But the Nobel Prize committee would have done well not handing out a prize even if BICEP had been successful measuring the signal they were after.

B-mode polarization from primordial gravitational waves would have been an indirect detection of gravitational waves, but there was a Nobel Prize for that already and being second doesn’t count. And in contrast to what Keating states in the book, this detection would not have been evidence for quantum gravity because the measurement wouldn’t have revealed whether the waves were or weren’t quantized.

Neither, for that matter, would it have been evidence for inflation. As Keating himself notes in passing by quoting Steinhardt, models for inflation can predict any possible amount of B-modes. He doesn’t mention it, but I’ll do it for him, that some models without inflation also predict B-modes.

So the claim that B-modes would be evidence for inflation is already wrong. Even worse, Keating repeats the myth that such a detection would moreover be evidence for the multiverse because that just sounds so spectacular. But it doesn’t matter how often you repeat this claim it’s still wrong, and it isn’t even difficult to see why. If you want to make any calculation to predict the B-mode spectrum you don’t need to begin with a multiverse. All you need is some effective theory in our universe. And that theory may or may not have an inflaton.

My point is simply that if you don’t need X to make prediction Y, then measuring Y isn’t evidence for X. So, no, B-modes aren’t evidence for the multiverse.

I don’t personally know Brian Keating, but he’s on twitter and he seems to be a nice guy. Also I got this book for free, so I want to warmly recommend you buy it because what else can I say. If nothing else, Keating has a gift for writing. And who knows, his next book might be about not winning a Nobel Prize for Literature.

Tuesday, April 10, 2018

No, that galaxy without dark matter has not ruled out modified gravity

A smear with dots, 
also known as NGC 5264-HST.
Did you really have to ask?

And if you had to ask, why did you have to ask me?

You sent me like two million messages and comments and emails asking what I think about NGC 1052-DF2, that galaxy which supposedly doesn’t contain dark matter. Thanks. I am very flattered by your faith.

But I’m not an astrophysicist, I’m a theorist. I invent equations and then despair over my inability to solve them. That’s what I do. I know about as much about telescopes as penguins know about cacti. And until last week I thought a globular cluster is a kind of glaucoma. (Turns out it’s not.)

But since you ask.

If nothing else, I have the benefit of a university account with subscriptions to all major journals, so at least I could look at the Nature paper in question. We can read there:
“the existence of NGC1052–DF2 may falsify alternatives to dark matter. In theories such as modified Newtonian dynamics (MOND) and the recently proposed emergent gravity paradigm, a ‘dark matter’ signature should always be detected, as it is an unavoidable consequence of the presence of ordinary matter.”
This paragraph is decorated with two references, one of which is to Milgrom’s MOND and one is to Erik Verlinde’s emergent gravity. And, well, I’m a theorist. Therefore I can tell you right away those people don’t know a thing about the theories they try to falsify.

It is beyond me why so many astrophysicists believe that modified gravity is somehow magically different from particle dark matter, or indeed all other theories we have ever heard of. It’s not.

For both modified gravity and particle dark matter you have additional degrees of freedom (call them fields or call them particles) which need additional initial conditions. In a universe in which you have a large variety of initial conditions (seeded by quantum fluctuations), you will get a large variety of structures. Same thing for modified gravity as for particle dark matter.

Another way to put this is that you can always cook up exceptions to the rule. The challenge isn’t to explain the exception. The challenge is to explain the rule. Modified gravity does that. Particle dark matter doesn’t.

Of course you don’t see the exceptions in Milgrom’s or in Verlinde’s paper. The reason is that both merely contain equations which describe time-independent situations. The equations derived in Milgrom’s and Verlinde’s papers are not theories; they are certain limits of theories, approximations that work in some idealized circumstances, such as equilibrium. The full theories are various types of “modified gravity” and if you want to rule those out, you better find out what they predict first.

But we don’t even have to stoop so low. Because, interestingly enough, the authors of the Nature study rule out MOND without even making a calculation for what that limit would predict. Stacy had to do it for them. And he found that MOND is largely compatible with the upper value of their supposed measurement results.

Having said that, let us have a look at their data.

So the galaxy in question is an “ultra-diffuse galaxy” with “globular clusters.” For all I can tell that means it’s a smear with dots. The idea is that you measure how fast the dots move. Then you estimate whether the visible mass suffices to explain the speed by which the dots move. If it does, you call that a galaxy without dark matter. Have I recently mentioned that I am not an astrophysicist?

There are ten of these globular clusters, and here is the data, with the best-fit Gaussian.

Figure 3b of Dokkum et al, Nature 555, p. 629–632


In case that looks a little underwhelming, some nice words must be added here about how remarkable an achievement it is to make such a difficult measurement and how brilliant these scientists are and so on. Still, ehm, that’s some way to fit data.

And it’s not the only way to analyze the data. Indeed, they tried three different ways, which gave the results 4.7 km/s, 8.4 km/s and 14.3 km/s. All I learn from this is that it’s not enough data to make reliable statistical estimates from. But then I’m a theorist.

Michelle Collins, however, is an actual astrophysicist. She is also skeptical. She even went and applied two other methods to analyze the data and arrived at mean values of 12 +/-3 km/s or 11.5+/-4 km/s, which is well compatible with Stacy’s MOND estimate.

Michelle also points out that globular clusters are often not good representatives to measure what is going on in a whole galaxy, because these clusters might have joined the galaxy at a late stage of formation. In other words, even those estimates might be totally wrong because the sample is skewed.

When I factor all this information together, I arrive at a 95 percent probability that this supposedly dark-matter-less galaxy will turn out to contain dark matter after all and it will be well compatible with modified gravity.

I give it an equally high probability that five years after the claim has been refuted, astrophysicists will still say modified gravity has been ruled out by it. The Nature paper, for example, once again refers to the Bullet cluster but fails to mention that the Bullet cluster can well be explained with modified gravity, yet is difficult to explain with particle dark matter.

***

A recent Nature editorial praised a “Code of Ethics for Researchers” that was proposed by the World Economic Forum Young Scientists Community. In this Code, you can read that scientists are supposed to pursue the truth and
“Pursuing the truth means following the research where it leads, rather than confirming an already formed opinion. This is particularly challenging but necessary when questioning current beliefs [...] Results must be represented accurately without over- or understatement, hiding facts and/or drawbacks, or misleading the reader in any way.”
Maybe the editors at Nature should read what they recommend.

Update April 11: The paper’s first author has some comments on the various points of criticism that have been raised here.
Update April 14: A group of astrophysicists has a response on the arXiv: “Current velocity data on dwarf galaxy NGC1052-DF2 do not constrain it to lack dark matter.”
Update April 16: Another paper on the arXiv today, this one showing that the observations aren’t in conflict with modified gravity: MOND and the dynamics of NGC1052-DF2.

Wednesday, April 04, 2018

Particle Physicists begin to invent reasons to build next larger Particle Collider

Collider quilt. By Kate Findlay.
[Image: Symmetry Magazine]
Nigel Lockyer, the director of Fermilab, recently spoke to BBC about the benefits of building a next larger particle collider, one that reaches energies higher than the Large Hadron Collider (LHC).

Such a new collider could measure more precisely the properties of the Higgs-boson. But that’s not all, at least according to Lockyer. He claims he knows there is something new to discover too:
“Everybody believes there’s something there, but what we’re now starting to question is the scale of the new physics. At what energy does this new physics show up,” said Dr Lockyer. “From a simple calculation of the Higgs’ mass, there has to be new science. We just can’t give up on everything we know as an excuse for where we are now.”
First, let me note that “everybody believes” is an argument ad populum. It isn’t only non-scientific, it is also wrong because I don’t believe it, qed. But more importantly, the argument for why there has to be new science is wrong.

To begin with, we can’t calculate the Higgs mass; it’s a free parameter that is determined by measurement. The same goes for the masses of all other elementary particles. But that’s a matter of imprecise phrasing, and I only bring it up because I’m an ass.

The argument Lockyer is referring to concerns calculations of quantum corrections to the Higgs mass. That is, he is making the good old argument from naturalness.

If that argument were right, we should have seen supersymmetric particles already. We didn’t. That’s why Giudice, head of the CERN theory division, has recently rung in the post-naturalness era. Even New Scientist took note of that. But maybe the news hasn’t yet arrived in the USA.

Naturalness arguments never had a solid mathematical basis. But so far you could have gotten away saying they are handy guides for theory development. Now, however, seeing that these guides were bad guides in that their predictions turned out incorrect, using arguments from naturalness is no longer scientifically justified. If it ever was. This means we have no reason to expect new science, not in the not-yet analyzed LHC data and not at a next larger collider.

Of course there could be something new. I am all in favor of building a larger collider and just see what happens. But please let’s stick to the facts: There is no reason to think a new discovery is around the corner.

I don’t think Lockyer deliberately lied to BBC. He’s an experimentalist and probably actually believes what the theorists tell him. He has all reasons for wanting to believe it. But really he should know better.

Much more worrisome than Lockyer’s false claim is that literally no one from the community tried to correct it. Heck, it’s like the head of NASA just told BBC we know there’s life on Mars! If that happened, astrophysicists would collectively vomit on social media. But particle physicists? They all keep their mouth shut if one of theirs spreads falsehoods. And you wonder why I say you can’t trust them?

Meanwhile Gordon Kane, a US particle physicist known for his unswerving support of supersymmetry, has made an interesting move: he has discarded naturalness arguments altogether.

You find this in a paper which appeared on the arXiv today. It seems to be a promotional piece that Kane wrote together with Stephen Hawking some months ago to advocate the Chinese Super Proton Proton Collider (SPPC).

Kane has claimed for 15 years or so that the LHC would have to see supersymmetric particles because of naturalness. Now that this didn’t work out, he has come up with a new reason for why a next larger collider should see something:
“Some people have said that the absence of superpartners or other phenomena at LHC so far makes discovery of superpartners unlikely. But history suggests otherwise. Once the [bottom] quark was found, in 1979, people argued that “naturally” the top quark would only be a few times heavier. In fact the top quark did exist, but was forty-one times heavier than the [bottom] quark, and was only found nearly twenty years later. If superpartners were forty-one times heavier than Z-bosons they would be too heavy to detect at LHC and its upgrades, but could be detected at SPPC.”
Indeed, nothing forbids superpartners to be forty-one times heavier than Z-bosons. Neither is there anything that forbids them to be four-thousand times heavier, or four billion times heavier. Indeed, they don’t even have to be there at all. Isn’t it beautiful?

Leaving aside that just because we can’t calculate the masses doesn’t mean they have to be near the discovery-threshold, the historical analogy doesn’t work for several reasons.

Most importantly, quarks come in pairs that are SU(2) doublets. This means once you have the bottom quark, you know it needs to have a partner. If there were none, you’d have to abandon the symmetry of the standard model that was established with the lighter quarks. Supersymmetry, in contrast, has no evidence among the already known particles speaking in its favor.

Physicists also knew since the early 1970s that the weak nuclear force violates CP-invariance, which requires (at least) three generations of quarks. Because of this, the existence of both the bottom and top quark were already predicted in 1973.

Finally, for anomaly cancellation to work you need equally many leptons as quarks, and the tau and tau-neutrino (third generation of leptons) had been measured already in 1975 and 1977, respectively. (We also know the top quark mass can’t be too far away from the bottom quark mass, and the Higgs mass has to be close by the top quark mass, but this calculation wasn’t available in the 1970s.)

In brief this means if the top quark had not been found, the whole standard model wouldn’t have worked. The standard model, however, works just fine without supersymmetric particles. 

Of course Gordon Kane knows all this. But desperate times call for desperate measures I guess.

In the Kane-Hawking pamphlet we also read:
“In addition, a supersymmetric theory has the remarkable property that it can relate physics at our scale, where colliders take data, with the Planck scale, the natural scale for a fundamental physics theory, which may help in the efforts to find a deeper underlying theory.”
I don’t disagree with this. But it’s a funny statement because for 30 years or so we have been told that supersymmetry has the virtue of removing the sensitivity to Planck scale effects. So, actually the absence of naturalness holds much more promise to make that connection to higher energy. In other words, I say, the way out is through.

I wish I could say I’m surprised to see such wrong claims boldly being made in public. But then I only just wrote two weeks ago that the lobbying campaign is likely to start soon. And, lo and behold, here we go.


In my book “Lost in Math” I analyze how particle physicists got into this mess and also offer some suggestions for how to move on.

Tuesday, April 03, 2018

Book Review: “Farewell to Reality” by Jim Baggott

Farewell to Reality: How Modern Physics Has Betrayed the Search for Scientific Truth
Jim Baggott
Pegasus Books (1 Aug 2013)

Not sure how I missed “Farewell to Reality” when it came out. Indeed, I didn’t take note of Jim Baggott’s writing until I was asked to review one of his more recent books for Physics World. And having enjoyed that, I had a look at his previous books.

“Farewell to Reality,” which appeared in 2013, is a critical take on supersymmetry, string theory, multiverses, many worlds, and related ideas. Baggott collectively refers to them as “fairy tale science.”

His book is a well-researched and methodical treatment of these speculative theories. Baggott first establishes a basis, spelling out what he will take the purpose of scientific explanation to be. Then he goes through the background knowledge (general relativity, quantum mechanics, standard model). Having done that, he evaluates whether the “fairy tales” are worth being taken seriously, concluding (unsurprisingly given the book’s title) that, no, they’re not.

The book suffers somewhat from its rather heavy, philosophical opening chapter and the unavoidable introduction of terminology that follows, but the later chapters pick up in pace. It takes some enthusiasm to get through the book’s first part. But this slow start has the benefit of making the book fairly self-contained; I believe you don’t need more than high-school physics to make sense of Baggott’s explanations.

I largely agree with Baggott’s assessment. I am less critical of research on the foundations of quantum mechanics, and I could quibble with his take on black hole evaporation, but that seems somewhat beside the point. I share Baggott’s worry that presenting unfounded speculations, like the idea that we live in a multiverse, as newsworthy research undermines public trust in science. Baggott writes:
“[T]he continued publication of popular science books and the production of television documentaries that are perceived to portray superstring theory or M-theory as ‘accepted’ explanations of empirical reality (legitimate parts of the authorized version of reality) is misleading at best and at worst ethically questionable.”

and
“I believe that damage is being done to the integrity of the scientific enterprise. The damage isn’t always clearly visible and is certainly not always obvious. Fairy-tale physics is like a slowly creeping yet inexorable dry rot. If we don’t look for it, we won’t notice that the foundations are being undermined until the whole structure comes down on our heads.”

which I think is entirely correct.

Baggott is a gifted science writer whose explanations seem as effortless as I’m sure they’re not. He knows his stuff and isn’t afraid of clear words. And having noted this, it is not irrelevant to mention that Baggott is no longer working in academia; he has no reason to sell fairy tales as science. And he doesn’t. He’s a writer you can trust.

While I am sorry I missed Baggott’s book when it appeared, I am glad I didn’t read it before writing my own book. It’s somewhat depressing to look at “Farewell to Reality” years after it was published and see that nothing has changed.

I would recommend “Farewell to Reality” to anyone who is looking for a sober assessment of what to make of all the interesting but speculative ideas that theoretical physicists have cooked up in the past decades.


I also recommend of course that you buy my book. It covers some more topics that Baggott doesn’t discuss, such as models for inflation, dark matter, various approaches to quantum gravity, and what the absence of supersymmetric particles at the LHC means.

Wednesday, March 28, 2018

Why Black Stars Aren’t A Thing

Not a black star,
but about equally real.
I came to physics by accident. I had studied mathematics, but the math department was broke. When I asked the mathematicians for a job, they sent me to the other side of the building. “Ask the physicists,” they said.

The physicists didn’t only give me a job. They also gave me a desk, a computer, and before I knew it, a topic for a diploma thesis. I was supposed to show that black holes don’t exist.

I didn’t know it at the time, but the black-holes-don’t-exist speech was my supervisor’s shtick. Prof Dr Dr hc mult Walter Greiner, who passed away two years ago, was the department head when I arrived. His argument against black holes was that “God wouldn’t separate himself from part of the universe.” Yo. He mostly worked on heavy ion physics.

I had made it pretty clear to him that I wasn’t interested in heavy ion physics. Really, I wasn’t sure I wanted to graduate in physics at all; it wasn’t even my major. But I was the math person, so certainly I could prove that black holes weren’t, no?

It was either that or computer simulations of big nuclei or back to the broke mathematicians. I picked the black holes.

That was 1997. Back then, measurements of the motion of stars around Sag A* were underway, but they would not be published until 1998. And even after Sag A* proved to be dark, small, and heavy enough that it should be a black hole, it took ten more years to demonstrate that it indeed doesn’t have a hard surface, thus providing evidence for a black hole horizon.

We now know that Sag A* is a supermassive black hole, and that such black holes are commonly found in galactic centers. But when I was a student the case was not settled.

Greiner had explained to me why he thought black holes cannot form in stellar collapse: we know that black holes emit radiation, the famous “Hawking radiation.” So when a star collapses, it begins emitting all this radiation, it loses mass, and the horizon never forms. That was his great idea. Ingenious! Why had no one thought of this before?

After some months of digging through the literature, it became clear to me that it had been tried before. Not once, but several times, and equally many times it had been shown not to work. This was laid out in various publications, notably in Birrell and Davies’ textbook, but Greiner’s interest in the topic didn’t go far enough to look at them. Indeed, I soon found out that I wasn’t the first he had put on the topic; I was the third. The first had delivered a wrong proof (and passed), the second had left. Neither option seemed charming.

Black hole with accretion disk
and jet. Artist's impression.
[Image Source]
The argument for why Greiner’s idea doesn’t work is a shitload of math, but it comes down to a very physical reason: You can’t use Hawking radiation to prevent black holes from forming because that’s in conflict with the equivalence principle.

The equivalence principle is the main tenet of general relativity. It says that a freely falling observer should not be able to tell the presence of a gravitational field using only data from their vicinity, or “locally” as the terminology has it.

Hawking radiation obeys the equivalence principle – as it should. This means most importantly that an observer falling through the black hole horizon does not notice any radiation (or anything else that would indicate the presence of the horizon). The radiation is there, but its wavelengths are so long – of the size of the horizon itself – that the observer cannot measure the radiation locally.

If you want to know how Hawking radiation affects the black hole, you must calculate the total energy and pressure which the quantum effects create. These are collected in what’s called the (renormalized) stress-energy tensor. It turns out to be tiny at the black hole horizon, and the larger the black hole, the smaller it is.
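To put rough numbers on “tiny,” here is a back-of-the-envelope estimate (dimensional analysis on my part, not a calculation from this post): the Hawking temperature falls with the mass, and the energy density of the quantum effects at the horizon is set by the curvature scale there,

```latex
T_H = \frac{\hbar c^3}{8\pi G M k_B}\,,\qquad
\langle T_{\mu\nu}\rangle_{\text{ren}}\Big|_{r \approx 2GM/c^2} \;\sim\; \frac{\hbar c}{\left(GM/c^2\right)^4}\,.
```

For a solar-mass black hole, GM/c² is about 1.5 kilometers, so this energy density is absurdly small compared to that of the collapsing star, and it shrinks further as the mass grows.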

All of this is perfectly compatible with the equivalence principle. And that’s really all you need to know to conclude you can’t prevent the formation of black holes by Hawking radiation: The contribution to the energy density from the quantum effects is far too small, and it must be small because otherwise an infalling observer would notice it, screwing over the equivalence principle.

What normally goes wrong when people argue that Hawking radiation can prevent the formation of black hole horizons is that they use the result for the Hawking radiation which a distant observer would measure, and then trace this radiation’s energy back to the black hole horizon. The result is infinitely large. That’s because if you want to emit anything at the horizon that can escape at all, you must give it an infinite amount of energy to start with. This is nonsense, because Hawking radiation is not created at the black hole horizon. But it’s this infinity that has led many people to conclude that a collapsing star may be able to shed all of its energy in Hawking radiation.

But whenever you do physics and the math gives you an infinity, you should look for a mistake. Nothing physically real can be infinite. And indeed, the infinity which you get here cannot be observed. It is cancelled by another contribution to the stress-energy which comes from the vacuum polarization. Collect all terms and you conclude, again, that the effects at the horizon are tiny. Done correctly, they do, of course, obey the equivalence principle.

In summary: Yes, black holes evaporate. But no, the energy-loss cannot prevent the formation of black hole horizons.

That was the status already by the late 1970s. Walter Greiner wasn’t the first, but also not the last, to try using quantum effects to get rid of the black hole horizon. I come across one or another variation of the idea several times a year. Most recently it was via a piece on Science Daily, which also appeared on PhysOrg, Science Alert, Gizmodo, BigThink, and eventually also Scientific American, where we read:
Black Hole Pretenders Could Really Be Bizarre Quantum Stars

New research reveals a possible mechanism allowing “black stars” and “gravastars” to exist

These articles go back to a press release from SISSA about a paper by Raúl Carballo-Rubio which was recently published in PRL (arXiv version here).

Carballo-Rubio doesn’t actually claim that black holes don’t form; instead he claims – more modestly – that “there exist new stellar configurations, and that these can be described in a surprisingly simple manner.”

These new stellar configurations, so the idea goes, are stabilized by strong quantum effects in a regime where general relativity alone predicts there should be nothing to prevent the collapse of matter. These “black stars” do not actually have a horizon, so the quantum effects never actually become infinitely large. But since the pressure from the quantum effects would become infinitely large if the mass were compressed inside the horizon, the radius at which the star stabilizes must lie outside the horizon.

In other words, what stabilizes these black stars is the same effect that Greiner thought prevents black holes from forming. You can tell immediately it’s in conflict with the equivalence principle, for there is nothing locally there, at the horizon or close by it, from which the matter could know when to stop collapsing. At horizon formation, the density of matter can be arbitrarily low, and the matter doesn’t know – cannot know! – anything about the redshift from there to infinity. The only way this matter can know that something is supposed to happen is by using global information, i.e. by violating the equivalence principle.

Indeed, that’s what Carballo-Rubio does, though the paper doesn’t really spell out where this assumption comes in, so let me tell you: Carballo-Rubio assumes from the outset that the system is static. This means the “quantum star” has no time-dependence whatsoever.

This absence of time-dependence is an absolutely crucial point that you are likely to miss if you don’t know what to look for, so let me emphasize: No stellar object can be truly static, because that would mean it must have existed forever and will continue to exist for all eternity. A realistic stellar object must have formed at some point. Static solutions do not exist other than as math.

The assumption that the system be static is hence a global assumption. It is not something that you can reach approximately, say, at the end of a collapse. Concretely the way this enters the calculation is by choice of the vacuum state.

Yes, that’s right. There isn’t only one vacuum state. There are infinitely many. And you can pick one. So which one do you pick?

Before we get there, allow me a digression. I promise it will make sense in a minute. Do you recall when Walter Wagner sued CERN because turning on the LHC might create tiny black holes that eat up earth?



It is rare for black hole physics to become a matter of lawsuits. Scientists whose research rarely attracts any attention were suddenly in the position of having to explain why these black holes, once created, would be harmless.

On the face of it, it’s not a difficult argument. These things would have interaction probabilities far smaller than even neutrinos. They would readily pass through matter, leaving no trace. And being created in highly energetic collisions, they’d be speedy, fly off to outer space, and be gone.

But then, these tiny black holes would have a small but nonzero probability to become trapped in Earth’s gravitational field. They would then keep oscillating around the center of the planet. And if they stuck around for sufficiently long, and there were sufficiently many of them, they could grow and eventually eat up Earth inside-out. Not good.

That, however, the scientists argued, could not happen because these tiny black holes evaporate in a fraction of a second. If you believe they evaporate. And suddenly theoretical physicists had to very publicly explain why they are so sure black holes evaporate because otherwise the LHC might not be turned on and their experimentalist friends would never forgive them.

Rather unsurprisingly, there had been one or two people who had written papers about why black holes don’t evaporate. Luckily, these claims were easy to debunk. The court dismissed the lawsuit. The LHC was turned on, no black holes were created, and everyone lived happily ever after.

For me the most remarkable part of this story isn’t that someone would try to sue CERN over maybe destroying the world. Actually, I have some understanding for that. Much more remarkable is that I am pretty sure everyone in the field knows it’s easy enough to find a theoretical reason why black holes wouldn’t evaporate. All you have to do is postulate that they don’t. This postulate is physical nonsense, as I will explain in a moment, so it would merely have complicated the case without altering the conclusion. Still, I think it’s interesting no one even brought it up. Humm-humm.

So what’s the nonsense postulate that can keep black holes from evaporating? You choose a vacuum state in which they don’t evaporate. Yes, you can do that. Perfectly possible. It’s called the “Boulware state.” The price you pay for this, however, is that the energy created by quantum effects at the black hole horizon goes to infinity. So it’s an unphysical choice and no one ever makes it.
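Schematically, the infinite price can be seen from the energy density a static observer measures near the horizon in the Boulware state. It behaves like that of a thermal gas at the locally blueshifted Hawking temperature, but with a negative sign (this is a rough near-horizon sketch; the exact coefficient depends on the field content):

```latex
\rho_{\text{Boulware}}(r) \;\sim\; -\,\frac{\pi^2 \left(k_B T_{\text{loc}}\right)^4}{30\,\hbar^3 c^3}\,,
\qquad
T_{\text{loc}} = \frac{T_H}{\sqrt{1 - 2GM/(r c^2)}}\;\longrightarrow\;\infty
\quad\text{as}\quad r \to \frac{2GM}{c^2}\,.
```

The divergence of the redshift factor at the horizon is exactly the infinity that makes this vacuum choice unphysical.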

Ah! I hear you say. But not very loudly, so let me summarize this in plain terms.

You can assume a black hole doesn’t evaporate, at the expense of getting an infinite amount of stress-energy in the horizon region. That’s an unphysical assumption. And it’s the same assumption as postulating that the system does not change in time: nothing in, nothing out.

And that – to tie together the loose ends – is exactly what Carballo-Rubio did. He doesn’t actually have a horizon, but he uses the same unphysical vacuum-state, the Boulware state. That’s the reason he gets such a large quantum pressure, hence violating the equivalence principle. It comes from the assumption that the system is static, has always been static, and will always remain static.

Let me be clear that Carballo-Rubio’s paper is (for all I can tell) mathematically sound. And the press-release is very carefully phrased and accurate. But I think he should have been clearer in pointing out that the assumption about time-independence is global and therefore he is describing a physically impossible situation that is not even approximately realistic.

If you followed my above elaborations, it should be clear that the details don’t matter all that much. The only way you can prevent a horizon from forming is to violate the equivalence principle. And worse, this violation must be possible when space-time curvature is arbitrarily small, as small or even smaller than what we have here on Earth.

Of course you can postulate whatever you want and calculate something. But please let us be clear that all these black stars and gravastars and quantum stars and what have you require throwing out general relativity in regions where there is no local measure whatsoever that would call for such a breakdown. It doesn’t matter how much math you pour over it; it’s still in conflict with what we know about gravity.

The realistic situation is one in which matter collapses under its gravitational pull. In this case you have a different vacuum state (the Unruh state), which allows for evaporation. And that brings you full circle to the above argument for why the stress-energy is too small to prevent horizon formation. There’s no way to avoid the formation of a black hole. Nope, there isn’t. Black holes really exist.

As to my diploma: I simply wrote my thesis about something else, but didn’t mention that until after the fact. I think Greiner never forgave me. A few years later he fired me, alas, unsuccessfully. But that’s a different story and shall be told another time.

That was a long post, I know. But I hope it explains why I think black stars and gravastars and quantum stars and so on are nonsense. And why I happen to know more about the topic than I ever wanted to know.