I admit to having been skeptical. The Higgs – so what? It’s not like this was unexpected. But the Lindau meeting is a good place to get many different perspectives on the new discovery: watching the CERN press conference, attending Lindau’s own press conference with Carlo Rubbia, David Gross and Martinus Veltman, listening to the discussion of those three with George Smoot (moderated by CERN’s Felicitas Pauss), or just talking to the young researchers (a number of whom work on the CMS and ATLAS experiments; additional thanks to Albrecht Wagner and Juan García-Bellido for answering some of my remaining questions). So here is a brief guide to the basics and subtleties of the Higgs search update. And yes, by now I’ve become infected with my particle colleagues’ enthusiasm for what has happened.
Let’s start with the basics.
The accelerator: Energy and beam quality
This is, of course, all about the Large Hadron Collider or LHC, which, in the best tradition of particle accelerators worldwide, brings particles to impressively high speeds before smashing them into each other. For the new measurements, each proton had an energy of 4 TeV – more than 4000 times the energy a proton has when it is simply at rest, corresponding to a speed of about 99.999997 % the speed of light.
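For those who like to check such numbers: the speed follows from the Lorentz factor γ = E/mc². Here is a minimal sketch of that calculation (the proton mass value is the standard tabulated one; everything else follows from special relativity):

```python
import math

M_PROTON_GEV = 0.938272  # proton rest mass (energy equivalent) in GeV
E_GEV = 4000.0           # beam energy per proton: 4 TeV = 4000 GeV

# Lorentz factor: total energy divided by rest-mass energy, E = gamma * m * c^2
gamma = E_GEV / M_PROTON_GEV

# Speed as a fraction of the speed of light: beta = v/c = sqrt(1 - 1/gamma^2)
beta = math.sqrt(1.0 - 1.0 / gamma**2)

print(f"gamma = {gamma:.0f}")  # roughly 4263 - "more than 4000 times" the rest energy
print(f"v/c   = {beta:.9f}")   # roughly 0.999999972
```

Note how unforgiving the arithmetic is near the speed of light: quadrupling the energy only shaves another factor of ~16 off the tiny gap between v and c.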
Energy is the key, but you also need beam quality. In the LHC, bunches of protons circulate in opposite directions in a ring made of vacuum tubes, adorned with magnets to get the particles around the curves, and acceleration cavities to get them up to speed and keep them there. On the order of a thousand bunches race clockwise, and the same number anti-clockwise. As the accelerator is ring-shaped, you can try again and again to bring the protons into collision – if they miss each other at one time, they will circle back for another try.
It’s not an easy task to aim bunches of protons at each other so as to produce collisions: You need to focus the beams and make sure that, at the collision points, they head for each other very precisely. A huge amount of work goes into producing a quality beam or, in particle physics parlance, "achieving a high luminosity". Good luminosity was another key to the results we are now seeing.
What’s a particle, anyway?
Everyday objects are very complicated. Elementary particles, in contrast, are very simple. They have only very few distinguishing characteristics. Their rest mass is one of those characteristics (why "rest mass"? Because mass, as defined via the fact that it is more difficult to deflect a particle with a high mass than one with little mass, gets larger as a particle gains speed relative to the observer. It’s an Einstein thing). Their spin, in overly simplified terms "how fast the particle is rotating like a spinning top", is another. Electric charge is yet another – electrons, for instance, have a charge that, in the usual units of particle physics, is -1; quarks have charges of either +2/3 or -1/3 in those same units. There are a number of additional charges, and characteristic properties called "quantum numbers", all characterized by simple numbers, typically integers.
Completely specify a particle’s spin, mass, charges and quantum numbers and you have defined the particle – almost, that is. In the case of the Higgs particle, there is an important additional property: The Higgs interacts with all other elementary particles that have a mass (more precisely: a rest mass) in a very specific way. We’ll come to that later on.
The theory: Can’t do everything, but can do this
The good news is that physicists have a theory that describes elementary particle behavior very, very accurately: the standard model of particle physics. The other good news is that, while it’s fiendishly difficult to apply that theory fully to a number of important situations (states in which particles are bound together: the proton! the hydrogen atom!), it provides for a straightforward way to calculate what happens in situations where particles pass each other, barely interacting. Happily, this is exactly what you need to calculate what happens in particle collisions in accelerators.
In particular, the theory allows you to calculate the probabilities for different types of particle reactions to happen. If you provide information on how (at what speed? oriented how? at what distance?) two particles are passing, it will give you the probability that a specific interaction between the particles happens.
When particles interact in accelerators, you can describe the basic reactions in a simple way: The only thing that happens is that some particles emit or absorb other particles, either changing in the process or not.
There are rules for which particles can emit or absorb which. The charges play a key double role there: For one, they impose book-keeping rules, as do the quantum numbers. For every particle reaction, there is before and after. If you have a certain amount of electric charge before the particles interact, you must have exactly the same amount afterwards. If you start with a particle with electric charge plus 1 and one with charge -1, then it’s OK if you end up with an uncharged particle [since +1 and -1 give 0], or with two particles with charge +1 and two with charge -1 [since +1 plus -1 is the same as +2 plus -2], but you cannot, for instance, end up with nothing more than two particles with electric charge +1 [since +2 is not the same as +1 plus -1].
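The book-keeping rule described above can be sketched as a toy charge check. This is purely illustrative (a handful of particles and their electric charges in units of the elementary charge), not a real event generator:

```python
from fractions import Fraction

# Electric charges in the usual particle-physics units (illustrative subset)
CHARGES = {
    "electron": Fraction(-1),
    "positron": Fraction(1),
    "photon":   Fraction(0),
    "up":       Fraction(2, 3),
    "down":     Fraction(-1, 3),
}

def charge_conserved(before, after):
    """Book-keeping rule: total charge before must equal total charge after."""
    return sum(CHARGES[p] for p in before) == sum(CHARGES[p] for p in after)

# Charge +1 and charge -1 may end up as an uncharged particle:
print(charge_conserved(["electron", "positron"], ["photon"]))  # True
# ...but never as nothing more than two particles of charge +1:
print(charge_conserved(["electron", "positron"], ["positron", "positron"]))  # False
```

Exact fractions (rather than floating-point numbers) are the natural choice here, since quark charges are thirds.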
The charges also tell you how strongly particles will interact via a specific force. The electric charge, for instance, will tell you how strongly the particle will attract or repel other particles via the electromagnetic force.
Many of the particles produced in collisions at particle accelerators are very short-lived. They decay into other particles within a small fraction of a second. A typical particle physicist’s puzzle thus consists of lots of data about particles that have been found in the detectors; the task is, using the known rules for particle reactions, to reconstruct what happened.
Reacting particles: Probabilities
Particle reactions are all about probability. You cannot predict what is going to happen (that’s quantum theory for you), but you can predict probabilities. These probabilities are energy-dependent, and a typical prediction of particle theory is that, at a given energy, there is an X percent chance of reaction A occurring, a Y percent chance of reaction B, and so on.
Then, you analyze your accelerator experiments and add up the numbers. Your predictions tell you how many cases of reaction A ("A-events") to expect, how many B-events, and so on.
If a reaction is very rare, you need to arrange for as many collisions as you can. One way of doing this is to leave your accelerator running for a long time. The other way is to increase beam quality (luminosity): Get those particle bunches concentrated, so collisions become more likely than otherwise!
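The accounting behind "how many A-events to expect" is the standard product of a reaction probability (the "cross section", measured in barns) and the accumulated beam quality (the "integrated luminosity"). A minimal sketch, with purely illustrative numbers rather than actual LHC figures:

```python
def expected_events(sigma_pb: float, lumi_fb_inv: float) -> float:
    """Expected event count N = cross section * integrated luminosity.

    Cross section in picobarns (pb), integrated luminosity in inverse
    femtobarns (fb^-1); since 1 pb = 1000 fb, one fb^-1 of data yields
    1000 events per picobarn of cross section.
    """
    return sigma_pb * 1000.0 * lumi_fb_inv

# Illustrative only: a 20 pb process observed with 5 fb^-1 of data
print(expected_events(20.0, 5.0))  # 100000.0
```

This is why both knobs matter: running longer and squeezing the beams harder both increase the integrated luminosity, and with it the number of rare events you can hope to catch.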
Reacting particles: Energies
As already mentioned, reaction probabilities depend on energy. At some energies, particles react comparatively rarely – they will mostly pass each other by without anything happening. At other energies, lots of interactions will occur.
One key factor in this is the mass of the particle being produced. Einstein has taught us that rest mass can be thought of as a form of energy. That’s why, in your book-keeping, particle physicists do not look at mass and energy separately. What you need to look at is the total energy: rest masses, kinetic energy and possible other forms of energy combined.
So suppose you are putting a certain amount of energy into a collision. You can never get out a particle whose rest mass is greater than that energy! (If you worry about units, read: "whose rest mass times the square of the speed of light…".)
In fact, at energies where you’re just about able to afford producing a specific particle, the reaction rate will go up. That is typically how you realize there is a new particle in the first place. Here’s a slide Felicitas Pauss from CERN showed at the Lindau discussion session on Wednesday. It shows data from the CMS experiment, one of the two experiments looking at LHC particle collisions that contributed to Wednesday’s announcement. On the x axis, you see the energy measured for one specific reaction (which has two photons coming out) – a measure for the energy that went into the reaction, since the energy that went in is the same as the energy that came out with those two photons. On the y axis is the number of events at that particular energy. You can see there’s a bump at about 125 GeV (GeV is the usual unit of energy in particle physics): around that particular energy, there are more events than a smooth background would suggest. That bump is the visual signature of the newly discovered particle, and also the clue as to its mass: evidently, a particle with this particular mass is being produced.
There are similar results – bumps – for other reactions measured by the same CMS experiments, and for these reactions measured by ATLAS, another LHC experiment. That’s the key evidence we’re discussing here.
Particles can also contribute, albeit very slightly, to what happens at energies significantly lower than their mass ("radiative corrections"). In this way, high-precision measurements at the LHC’s predecessor, the electron-positron collider LEP, had already shown that the Higgs mass must be less than about 152 GeV.
Reactions and energies: Some caveats
If you try to add up the numbers, you might wonder what the connection with energy really is.
Doesn’t the LHC have much more energy than 125 GeV? The previous and current measurements that gave indications of the new particle used LHC energies of 7 to 8 TeV (that is, 3.5 TeV and 4 TeV per proton), corresponding to 7000 to 8000 GeV. Why so much? Because that energy is the sum of the energies of the protons, and protons are composite particles: they contain quarks (three apiece) and gluons, figuratively the glue binding those quarks together. The "elementary collisions" within the accelerators are between those constituents: two quarks interacting with each other, or a quark and a gluon. But those constituents will only have a fraction of the energy of the whole proton. That fraction of energy is what you’re working with to create a new particle, such as the Higgs.
What about that sigma?
If you go into the details, there are numerous occurrences of the Greek letter sigma. What about those?
The easiest option is not to look into the background (i.e. not to look up the Wikipedia entry on standard deviation) but simply to accept those sigmas as short-hand notation for probability.
More concretely, short-hand for the probability that the effect you think you see in your data (such as the bump in the image above) is not a real physical effect at all, but merely due to statistical fluctuations. Nearly all of what you measure in physics has a measurement error. You can never control all the error sources – detectors seeing something where there is nothing, the transistors of your instruments giving off small spurious signals, and a myriad more little things. As soon as you repeat a measurement, you can estimate the combined effect of all those tiny little errors. You can’t miss them: They’re the ones that make your result come out slightly different each time, even if you’re really measuring the same thing.
Once you know the size of those fluctuations, you can estimate the probability that these fluctuations will conspire so as to impersonate a physical effect – for instance, the probability of the fluctuations adding up to form the bump you see in the image above. This probability is expressed in terms of sigma: the more sigmas, the less likely it is that mere fluctuations are responsible.
So when physicists say that there is "a detection at the 5 sigma level", what they’re really claiming is that "the probability that what we’re seeing is due to statistical fluctuations and not a real effect at all is 1 in 1.7 million, or 0.00006%".
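The translation from sigmas to probabilities is simply the tail of a Gaussian (normal) distribution. A minimal sketch, using the two-sided convention that matches the "1 in 1.7 million" figure quoted above:

```python
import math

def sigma_to_probability(n_sigma: float) -> float:
    """Two-sided probability of a Gaussian fluctuation of at least n_sigma.

    erfc is the complementary error function; dividing by sqrt(2) converts
    from standard deviations of a normal distribution to erfc's argument.
    """
    return math.erfc(n_sigma / math.sqrt(2.0))

for n in (2, 3, 5):
    p = sigma_to_probability(n)
    # 2 sigma: about 1 in 22; 3 sigma: about 1 in 370;
    # 5 sigma: about 5.7e-7, i.e. roughly 1 in 1.7 million
    print(f"{n} sigma: p = {p:.2e} (about 1 in {1/p:,.0f})")
```

Note how steeply the probability falls: each extra sigma buys far more than a constant factor, which is why the jump from "3 sigma evidence" to "5 sigma discovery" is such a big deal.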
It is conventional in science to accept an effect as real once a certain sigma level (or "level of significance") has been reached. Particle physicists usually demand 5 sigma. Astronomers are often happy with 3 sigma. In other sciences, you could see results at the 2 sigma level (a.k.a. "statistically significant") or 3 sigma level (a.k.a. "highly significant").
Note that this does not mean that e.g. at the 5 sigma level, the data show that the probability of the bump being the consequence of a new particle is 99.99994% (the number you obtain by subtracting 0.00006% from 100%). The sigma-probability quoted here tells you nothing about the effect itself, only about the statistical fluctuations of your measurements. If you want to make direct deductions about your model from your data, go learn about Bayesian statistics.
The CMS experiment had indications of a particle of mass around 125.3 GeV at a level of 4.1 sigma from events resulting in two photons, and 3.2 sigma for events resulting in four leptons (that is, particles such as electrons and their kin). Including some smaller contributions from other reactions, they obtain a combined value of 4.9 sigma.
ATLAS has 4.5 sigma for the photons and 3.4 sigma for the four-lepton events for a particle of mass 126.5 GeV. Again, together with other types of reaction, they get a combined value of 5 sigma.
The mass difference between the two experiments (125.3 GeV and 126.5 GeV) on the other hand is likely to be due to those pesky statistical fluctuations, and currently no cause for concern.
Before we go on to the question of whether this thingy is the Higgs or not, let’s take one step back:
The Higgs: What is it good for?
Absolutely noth… oh, no, wait. On the contrary: The Higgs is needed to make the standard model of particle physics work. Very briefly: When it comes to forces like electromagnetism and the like, the standard model has a highly successful formalism. The problem: Theories like this, as far as we know, only work if some specific particles associated with the force ("gauge particles" or force-carriers) are massless. But the properties of the weak force indicate that those very particles do have quite a large mass. What to do?
As it turns out, you can have both: Those particles are massless to begin with. But they have one particular form of interaction with an all-pervading field – the Higgs field – that, under conditions like those we see around us (that is, not the ultra-hot high-energy universe shortly after the big bang) can endow particles with a rest mass.
Like every other field in particle physics, the Higgs field should have an associated particle. That’s the Higgs boson.
This clever trick (mass without conventional mass) was thought up by several people independently; in alphabetical order: Robert Brout, Francois Englert, Gerald Guralnik, Carl Hagen, Peter Higgs and Tom Kibble. Higgs’ name stuck – we’re calling it the Higgs boson (and the Higgs field, and the Higgs mechanism). Some scientists say we should properly talk about the Englert-Brout-Higgs-Guralnik-Hagen-Kibble mechanism (or variations thereof). It’s probably not going to catch on – unfair, I know. But it’s not a bad idea to keep in mind these days that on the theory side, there are more people involved than just Peter Higgs. It’s no coincidence that CERN invited not only Higgs, but all of the inventors that are still living (Brout died in 2011) to their press conference on Wednesday (and all but Tom Kibble came, and were present at the announcement).
David Miller invented a nice (and, by now, famous) analogy for what the Higgs does. Heuer used the same analogy (without saying whence it came, though – tsk, tsk, tsk).
Why was the discovery made right now? That follows from the, let’s say: somewhat turbulent history of the LHC. Construction of the highly complex machine was finished in late 2008. A bit more than a week after the first beam was successfully circulated, disaster struck: a good 100 superconducting magnets suddenly lost their superconductivity ("quench"), blowing tons of liquid helium into the tunnel and damaging a substantial part of the accelerator.
By late 2009, the plucky people at CERN had gotten their accelerator back on track. But they were very careful to avoid another accident, and went to higher and higher energies slowly and carefully, choosing all the beam parameters very conservatively. By 2011, they were running at 3.5 TeV per proton (7 TeV for two colliding protons), 50% of the eventual design goal.
As far as the other beam parameters were concerned, the return to normal also went much faster than anyone had expected. The number of particles in each circulating bunch, for instance, is now much higher than expected. This and other parameters have led to a beam of much higher quality than the physicists had hoped for. And with the higher quality came higher luminosity, that is: a better chance for particles to actually collide, and for rarer events to be seen.
That’s when things began to get interesting.
In April 2012, the LHC increased its beam energy to 4 TeV per proton (8 TeV for the two beams together). They let it run for three months. Once the data was in, and it became clear that there was something special going on, the physicists embarked on two weeks of highly intensive data analysis (temporarily abandoning distracting habits such as sleeping, one presumes).
Once your data has been condensed into quantities such as the number of events for a given reaction at a given energy, you need to compare it with the theoretical predictions – that is, with what you expect to see if your theory and your models of what is happening inside your detectors are correct. In fact, the transition from collision energy 7 TeV to 8 TeV happened unexpectedly fast, so that it was quite a challenge for the people doing computer simulations of particle reactions ("Monte Carlo simulations", where you send lots of simulated particles into simulated collisions) to keep up!
Did they find the Higgs, then?
Too early to tell (as all physicists, Nobel laureates and young researchers included, pointed out when asked this question!). Remember that a particle is defined by its rest mass, spin, charges and quantum numbers, plus, in the case of the Higgs, characteristic properties of its interaction with other (non-zero mass) particles.
So far, the experiments show that we’re dealing with a particle with a mass around 125 GeV. That is, in fact, about what physicists had been expecting for a standard-model Higgs: The LEP measurements had given an upper bound of 152 GeV. In fact, the earlier LHC measurements had excluded the whole mass region between 127 GeV and 600 GeV – at those masses, certain reactions would have been so abundant that the LHC experiments would have noticed. On the other hand, the Higgs mass could not be lower than 114 GeV. Otherwise, the particle would already have been found at LEP.
Beyond that, the observed decays show that the particle is no fermion (that is, no matter particle such as a quark or electron) and that it is no vector particle (such as the force carriers: photon, W, Z, gluons). That is also consistent with the Higgs, which is a so-called scalar instead.
But that’s about all we know.
What the ATLAS and CMS physicists need to do before they can be sure this is the Higgs boson is to look at different reactions involving the Higgs. So far, they’ve looked into reactions that left as their traces either two photons, or four leptons. Looking at the other possible reactions, they will be able to answer the crucial question of whether or not the strength of the Higgs’ interaction with other particles is proportional to those particles’ masses. That is the key defining feature. If the new particle does that, it is definitely the Higgs.
It’ll be some time until we know for sure, though. CERN director Rolf-Dieter Heuer spoke of 3, 4 years minimum.
So what are the chances this is the Higgs?
Chances are good – and all the physicists I heard talk on the subject agreed that this is a Higgs. A number of them expressed the hope that it is not the Higgs – that is, not the Higgs predicted by the standard model. Because if it isn’t, it could be the first sign of new physics, and that would certainly make matters much more interesting indeed.
For instance, enlarged versions of the standard model known as "supersymmetric standard models", which have some desirable (to theorists) properties of their own, predict several different Higgs bosons – five in total, for the simplest model. If you were to find those, it would open up a whole new world of particles. On the other hand, as CERN theoretician John Ellis said in the podium discussion in Lindau (in which he participated by video-link): If supersymmetry is not found once the LHC has reached its design energy of 13 or 14 TeV, then supersymmetry is in trouble. He called it a "make or break" point.
So what do we learn from the find?
First of all, it’s a strong indication that the Higgs mechanism works, and that particles indeed get their rest masses in this way. For some people working on alternatives to that mechanism, this is going to be disappointing; others are going to feel a quiet satisfaction, and still others will have known all along. As Martinus Veltman said, this is like "closing the door" on the standard model – it’s not all, but a large part of the last missing piece of the standard model, so that is that.
But we also know the mass of the new particle, and that tells us where to look next.
First of all, Heuer said that they are going to extend the current phase of LHC operations by 2.5 to 3 months. The LHC was heading for a 2-year shut-down and overhaul phase, which they have now postponed. The additional data should enable them to tell whether the Higgs is a scalar or a pseudoscalar, for instance, and also give more indication whether or not this really is a Higgs in the first place.
Longer-term, the mass value of 125 GeV tells physicists what kind of accelerator they need to build in order to explore the properties of this particle – let’s just call it "the Higgs", despite the caveats – further. Carlo Rubbia likened this situation to the one he got his Nobel prize for, the discovery of the W and Z particles. The particles were found at the proton-antiproton collider SPS at CERN. But while protons are good for reaching high energies – which in turn make it possible to produce new particles not previously found – proton collisions are very messy. That is due to the fact that the proton itself is messy, consisting of quarks and gluons. In each collision, you will get lots of debris. For the W and Z, CERN followed up with LEP, which collided electrons and positrons. That made for much cleaner reactions, allowing, for instance, the precise determination of the masses of W and Z, and the determination of the number of standard model neutrinos (3). [Incidentally, right at the end of LEP operations, it was claimed that they had seen some traces of the Higgs. But that was at a mass of 115 GeV; the new LHC find casts serious doubt on that LEP claim.]
Rubbia proposed doing something similar for the Higgs (or whatever it turns out to be): build a "Higgs-factory" such as an electron-positron collider slightly larger than LEP, or a linear collider, or a collider using muons, to do precision measurements. David Gross then chimed in to say that this was the chance for the US to come back into the race, and regain its former particle physics glory!
All in all, there are some interesting times ahead. But we are likely to need some patience – it could well be that we will only know for sure in 3-4 years.