Moving a very large magnet to measure very small particles.
July 12, 2013 4:32 PM Subscribe
By making a very precise measurement of the muon g-2 value and comparing the result to its predicted value, researchers at Fermilab hope to uncover evidence of new, undiscovered particles and forces. This continues work done at Brookhaven National Laboratory a decade ago. To do so, Fermilab needs Brookhaven's 50-ft ring magnet and its fragile, precisely assembled superconducting coils. After six months of planning, the magnet was slowly hauled on an eight-axle trailer through the streets of Long Island, loaded onto a barge, and tugged down the Atlantic coast and into the Gulf of Mexico, from which it will go up the Tennessee-Tombigbee Waterway to the Mississippi and then through Illinois waterways before it's trucked again through suburban Chicago. Fermilab has a photo and video gallery and is posting updates.
Recently, Chris Pauley of Fermilab was on the How to Do Everything podcast to talk about the move.
Popular Mechanics has a comprehensive article.
A few more videos: 1 2 3 (last one has nsfw language)
This would be much improved with a theremin playing in the background.
posted by hal9k at 5:17 PM on July 12, 2013
Ironically, if these scientist types had just focused on teleportation and/or antigravity first, they'd have a lot less trouble getting stuff like this done.
posted by Joakim Ziegler at 5:44 PM on July 12, 2013
After six months of planning, the 50-ft ring magnet was slowly hauled on an eight-axle trailer through the streets of Long Island, ...
... thoroughly degaussing everyone's mp3 collections in the process.
posted by ceribus peribus at 6:08 PM on July 12, 2013 [4 favorites]
We burned many folk at the stake for reciting the devil's rituals... just so you know
posted by Mack Twain at 6:28 PM on July 12, 2013 [1 favorite]
Me too. I find it strangely attractive.
posted by hal9k at 6:33 PM on July 12, 2013 [6 favorites]
... thoroughly degaussing everyone's mp3 collections in the process.
I work about a mile from Fermilab. I can see the main building out my office window. I guess I better wrap my laptop in a faraday cage before I leave for vacation.
One of these days there's gonna be some weird siren I've never heard before and I'm keeping a crowbar under my desk just in case.
posted by JoeZydeco at 7:33 PM on July 12, 2013 [14 favorites]
Is it too big to go through the St. Lawrence seaway, and through to Lake Michigan?
posted by Pruitt-Igoe at 8:31 PM on July 12, 2013
This reminds me a lot of the moving of the great mirror for Mt. Palomar Observatory, in 1934.
Is it too late to line the streets and cheer ?
posted by Kakkerlak at 9:32 PM on July 12, 2013
There's a good explanation here of what the ring is and why they want a muon storage ring to measure the g-factor of the muon. Mind you, the actual papers are slow going.
posted by sebastienbailard at 9:47 PM on July 12, 2013
Well, magnetic fields drop off quickly, so I wouldn't be too worried. Besides, the Tevatron had 774 Nb-Ti superconducting magnets to hold the beam on the ring, each generating 4.2 T fields, and we somehow managed not to wipe everyone's mp3 collections then.
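If you want to see just how fast "quickly" is, here's a back-of-the-envelope Python sketch. The dipole moment is completely made up; the point is the 1/r^3 scaling, not the absolute numbers.

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A


def dipole_field(moment, r):
    """On-axis field of a magnetic dipole (SI units): B ~ mu0 * m / (2*pi*r^3)."""
    return MU0 * moment / (2 * math.pi * r**3)


m = 1e6  # hypothetical dipole moment, A*m^2 (illustrative only)
near = dipole_field(m, 5.0)    # field 5 meters away
far = dipole_field(m, 500.0)   # field 500 meters away

# A factor of 100 in distance costs a factor of 100^3 = 1,000,000 in field.
print(near / far)
```

So by the time the barge is a few blocks away, your mp3s never had anything to worry about.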
I'm stuck in a hotel room after a conference, so physics fun times will be had on metafilter. You have been warned.
What the muon g-2 experiment is looking for is a very, very small deviation in the muon magnetic moment. What do any of those words mean, and why do physicists care? Other than us having too much free time or something.
Muons are the electron's heavier siblings. An electron masses 0.511 MeV (mega-electronvolts, using E=mc^2 to convert between mass and energy; this is 9.1 × 10^-28 grams). A muon masses 207 times more than that: 105.7 MeV. But other than this, electrons and muons are identical: they are both spin-1/2 particles (fermions) that have the same electric and weak-force interactions. Neither interacts with the strong nuclear force, thus they are dubbed "leptons," as opposed to hadrons, which feel this other force. Each is paired with an uncharged neutrino, the electron-type and muon-type neutrino. Their similarity caused Isidor Rabi to say "who ordered that?" when the muon was discovered. After all, muons seem very extraneous; isn't an electron enough?
In fact, it turns out that there is a third charged lepton, the tau, which is heavier still, 1777 MeV, along with its partner, the tau neutrino (which is nearly massless). These three generations (or flavors) of leptons are mirrored by three generations of quarks, in pairs of two: the up and down quark (which are the primary constituents of protons and neutrons), the heavier charm and strange, and the heaviest top and bottom (the down is heavier than the up, the charm heavier than the strange, and the top heavier than the bottom; each generation is heavier than the previous). The masses of these particles conform to no known pattern, and this randomness, along with the question of why there are three generations at all, is a huge mystery in particle physics. Regardless, it is known that interactions that could violate flavor are highly suppressed, especially in the lepton sector. Why this is so is also a huge mystery.
Incidentally, the discovery of the Higgs has more or less ruled out a 4th generation of quarks heavier than the top, unless that 4th generation is very different in character from the preceding 3. We already knew that there could also not be a 4th neutrino with identical properties to the known three, though there is plenty of room for things similar to neutrinos. As a result, whatever the reason for 3 generations, there are only 3 in the pattern we recognize as "the Standard Model," not 4 or more.
Particles would generically love to decay to the lightest things around, as long as that decay conserves energy, momentum, and certain conserved charges. So a neutron can decay into a lighter proton, along with an electron and an anti-electron-neutrino, conserving baryon number, lepton number, and charge, but a proton cannot decay (at least, not in the Standard Model, and experimentally we know the lifetime is longer than 10^29 years), because there is no combination of particles lighter than a proton that conserves all the charges necessary. Muons, on the other hand, can decay into a muon-neutrino (conserving the flavor "mu-ness"), an electron (conserving charge), and an electron anti-neutrino (this balances the "e"-ness of the electron, conserving flavor again). This decay occurs fairly slowly, as particles go, taking about 2 microseconds. This is "slow" because the decay is mediated by the weak nuclear force, and so takes longer to occur than hadronic decays, which are mediated by the strong nuclear force. Standing around on Earth, you are bombarded by downgoing muons, created in the upper atmosphere from cosmic rays (mostly protons and such) hitting the atoms in the air and shattering into pions, which in turn decay into muons. Despite traveling near the speed of light, these cosmic muons would not live long enough to hit the ground, were it not for the time dilation they experience due to special relativity (alternatively, from the muon's point of view, the Earth's atmosphere looks thinner because of length contraction; relativity is relative). The fact that they decay makes the properties of muons difficult to measure, compared to stable electrons, but their long-ish lifetime means they are easier to work with than pretty much all other unstable particles, as you can keep them around long enough (if you store them traveling near lightspeed in a giant magnetic ring you shipped from Long Island) to do fun things.
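You can check the time-dilation claim yourself in a few lines of Python. The muon speed below is a typical illustrative value for cosmic-ray muons, not a measured one:

```python
import math

C = 2.998e8    # speed of light, m/s
TAU = 2.2e-6   # muon lifetime at rest, seconds
BETA = 0.998   # muon speed as a fraction of c (illustrative)

# Lorentz factor: how much the muon's internal clock slows down in our frame.
gamma = 1.0 / math.sqrt(1.0 - BETA**2)

naive_range = BETA * C * TAU         # without relativity: only ~660 m
dilated_range = gamma * naive_range  # with relativity: ~10 km

print(f"gamma = {gamma:.1f}")
print(f"without time dilation: {naive_range:.0f} m")
print(f"with time dilation:    {dilated_range / 1000:.1f} km")
```

Muons are born tens of kilometers up, so without that factor of gamma essentially none would reach the ground; with it, they rain down on us constantly.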
So, g-2? I'm getting there.
As a charged particle with spin, we expect the muon (and the electron) to create a tiny little magnetic field. Magnets: let me tell you how they work. They work by spinning charges in circles. For ferromagnets, it's the electrons in the iron atoms spinning in their orbits, making tiny magnetic fields that all align inside the bar of metal to make one giant field. For electromagnets, you coil a big spool of wire and run a current (charged particles moving) through it. For a superconducting magnet like the one taking a little sea voyage, you do the same thing, but with wires that feel no resistance to current flow. Electrons and muons, and all other charged fermions, can be thought of as tiny little spinning charges (except they are, as far as we know, point particles, so don't get too carried away with this analogy).
So, they create a magnetic field. So what?
It turns out the strength of this magnetic field (the "magnetic moment") of a point particle can be calculated from quantum mechanics. When you do this in undergrad physics classes, you get an answer that says the magnetic moment is proportional to the spin of the particle times 2. Why two? Quantum mechanics, man. I'm trying to figure out a non-mathy answer for that, but I can't. Basically, the factor of two is saying that an electron is twice as good at creating a magnetic field as I would have expected if I thought of it as a little classical charge spinning around.
Then, I go and measure the magnetic moment of an electron. And it turns out to be proportional to the electron spin times 2.
Oops, sorry, it's times 2.002319302436146. We measured this bad boy to 1 part in a trillion. It is the most precise measurement I am aware of in science. We made this ridiculously good measurement because we can predict from pure theory the deviation of the factor in the magnetic moment away from 2. Call this factor g, so if you want to know how far the true answer is from the first "easy" calculation, you'd be interested in g-2 (thus the name).
g-2 for an electron is not zero because of virtual particles. Every "real" particle in the Universe has a little cloud of particles that are popping in and out of existence around them at all times. You've heard of Heisenberg, right? If you know the velocity, you can't know the position, and vice versa? Well, there's another uncertainty principle: the accuracy to which you can measure the energy of anything is inversely proportional to the time you have to measure it. So, in a certain sense, an electron can spin off a photon with some tiny energy, and that photon can create a new electron/positron pair, and that's fine, as long as these particles poof out of existence fast enough that the Universe doesn't "notice." What this means is that if I want to calculate the electron magnetic moment, I first calculate the magnetic moment of an electron by itself (and get g=2, as above), then I calculate the magnetic moment if the electron spits off a little loop of e-/e+ (or muons or taus or whatever), and get the "one-loop correction," then I personally stop because loop calculations suck. If you're insane and/or German though*, you keep going, and calculate the 1-loop, 2-loop, 3-loop, and 4-loop corrections. And you get a very very very precise prediction for the g-2 of the electron. Then you compare to the work of the experimentalists. And you both get the same answer, within both the error of the experiment and the theory.
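To see what those heroic loop calculations buy you, here's a Python sketch summing the QED perturbative series for the electron's anomalous moment a = (g-2)/2. The loop coefficients below are the published one- through four-loop QED values quoted approximately from memory, so treat them as illustrative rather than gospel:

```python
import math

ALPHA = 1 / 137.035999  # fine-structure constant (approximate)

# a_e = sum_n C_n * (alpha/pi)^n.  C_1 = 1/2 is Schwinger's famous
# one-loop result; each higher coefficient took years of multi-loop work.
coeffs = [0.5, -0.328478965, 1.181241456, -1.9144]

x = ALPHA / math.pi
a_theory = sum(c * x ** (n + 1) for n, c in enumerate(coeffs))

# Back out the measured a from the measured g-factor quoted above.
a_measured = (2.002319302436146 - 2) / 2

print(a_theory, a_measured)
# The pure-QED sum already agrees with the measurement to roughly a part
# in a million of a_e; the tiny remainder is higher loops plus small
# hadronic and weak contributions (and the uncertainty in alpha).
```

Notice how each successive loop term is about a thousand times smaller than the one before, because each brings another power of alpha/pi. That's why the series converges so beautifully.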
Again. This is the most accurate prediction and measurement I am aware of in all of physics.
So, that's the electron. Then you can do the same thing for the muon. The experiment is harder, because muons decay, so you don't have the luxury of effectively infinite numbers of them to work with, and you have to keep pumping them in, using a high energy, high intensity particle accelerator. Fortunately, we at Fermilab and Brookhaven have some of those. The theory is harder too, because the muon loops have big effects from hadronic physics (things that feel the strong force in them), while those effects were negligible for electrons. This is because the muons are heavier, and so the loop energies are close to the mass scale of the light hadrons. That wasn't a problem for electrons, but it is relevant enough for muons for us to care. This is bad because we have to extract hadronic parameters from experiment, which adds uncertainty (this is because there are no bare quarks: only combinations of quarks, so getting at the fundamental parameters means unwrapping the physics that glues quarks together. Leptons don't have this problem).
The critical thing is that, as a heavier particle, the muon is slightly more sensitive to heavy particles running in the loops. The heavier the virtual particle, the smaller its effect (it gets "borrowed" from the Universe very quickly), but the effect is slightly larger for muons than electrons (it'd be larger still for taus, but they decay really quickly and so are really hard to work with). So, if there were new charged particles out there, things we haven't found yet, you could look for them in slight deviations away from the predicted value of g-2 in muons.
Let's go to the data. From the Brookhaven E821 experiment, the one originally using this giant-ass magnet, they measured g-2 for the muon to be 2 × (116 592 080 ± 63) × 10^-11. The best theory prediction right now is that g-2 is 2 × (116 591 785 ± 51) × 10^-11.
Everyone see the difference? These two numbers are about 3 sigma apart. That is to say, IF the theory prediction is correct, there is less than a 0.3% chance that the experimental result could be consistent with the Standard Model of particle physics. That could be huge. There could be new particles lurking around, heavy enough and/or decaying in difficult-to-see ways so as to be so far invisible at the LHC, but leaving their imprint in a precision measurement at very low energies. Right now, the Standard Model explains every physics result we've found, with the exceptions of dark matter, neutrino masses, and dark energy (which may not be related at all to particles, but I'll throw it in there). We have no other clues as to where to look, so finding anything at all that deviates from prediction is a Big Deal.
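If you want to check the "about 3 sigma" claim yourself, it's one line of arithmetic: take the difference of central values and divide by the errors added in quadrature. In Python:

```python
import math

# Brookhaven measurement and theory prediction of a_mu = (g-2)/2,
# both in units of 1e-11 (the numbers quoted above).
exp_val, exp_err = 116592080.0, 63.0
th_val, th_err = 116591785.0, 51.0

diff = exp_val - th_val                     # 295 units of 1e-11
combined_err = math.hypot(exp_err, th_err)  # independent errors add in quadrature

sigma = diff / combined_err
print(f"{sigma:.1f} sigma")  # -> 3.6 sigma
```

So "about 3 sigma" is, more precisely, about 3.6 sigma with these numbers; tantalizing, but well short of the 5 sigma we'd want before popping champagne.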
The response from most particle physicists (including me) has been "ok...." This could be hugely interesting. But there are big theory issues that need to be worked out before we know whether this is just hadronic physics fucking us over once again. Or we just got unlucky. 3 sigma deviations show up all the time (well, 0.3% of the time, but we do a lot of experiments, and there's a bias to talk about those that get interesting unusual results). 5 sigma would be better. But to get 5 sigma, we'd need more muons than Brookhaven was able to throw into their big-ass magnet. Fermilab has the capability of producing lots of muons. Brookhaven has a big-ass magnet. It's two great tastes that go great together. Just ship that ring on over.
This also represents a bit of a pivot of Fermilab from the "energy frontier" to the "intensity frontier." Rather than try to build the biggest, highest energy collider we can, we're going for very very high intensity beams, and working on precision physics. Things that won't give us new particles directly, but will hopefully illuminate their effects through these sorts of deviations from the Standard Model predictions. Both frontiers are important (and I hate the distinction that the two names suggest. It's all particle physics), and I'd dearly love for a new high energy machine to be built in the States in my lifetime - I think it is important for the long-term health of particle physics in particular and the American physics program in general. However, interesting physics is interesting physics, and I'm glad Fermilab will be leading the way on new things now that the Tevatron is done.
And that, my friends, concludes physics-story time.
posted by physicsmatt at 10:33 PM on July 12, 2013 [54 favorites]
Huh, Emmert International is the same company that moved the boulder for LACMA's Levitated Mass sculpture. I remain childishly fascinated with the transportation of Very Large Things.
posted by carsonb at 10:57 PM on July 12, 2013 [1 favorite]
(Yes, I'm aware magnetic field density drops off proportional to distance^3, and besides it wasn't powered in transit anyway. Still fun to frame it as an RIAA secret weapon, though.)
posted by ceribus peribus at 11:22 PM on July 12, 2013
I feel like I should say something here, but physicsmatt covered it pretty well. As a physicist who used to work on CDF and on the Tevatron before that, it is indeed with mixed emotions that I see the shift from energy frontier to intensity frontier. But, as physicsmatt says, it's all particle physics. Anyhow, I'm just glad to see Fermilab doing cutting edge experiments. fingers crossed for a muon collider in our future
posted by cyclotronboy at 2:32 AM on July 13, 2013
One of these days there's gonna be some weird siren I've never heard before and I'm keeping a crowbar under my desk just in case.
Large chunks of ferromagnetic material can definitely be used to destroy a huge electromagnet, but possibly not *quite* in the way you are hoping. I'd be sure it's always closer to the lab than you are, is all I'm saying.
posted by solotoro at 2:38 AM on July 13, 2013 [3 favorites]
*stands and applauds physicsmatt* - yet again with the clever and the humour! Awesome stuff.
Also "the muon g-2 experiment" should be a film, stat.
posted by marienbad at 3:42 AM on July 13, 2013
Too bad this was too big for the Erie Canal, I could have waved from my house.
posted by tommasz at 4:08 AM on July 13, 2013
Awesome, physicsmatt, thanks!
One thing that caught my eye, as a non-scientist who reads about such things: I've never heard of dark light. I suppose that's supposed to be a radiative counterpart to dark matter?!
(I'll look it up later, but I enjoy the takes of the various physics mefi nerds)
posted by JKevinKing at 5:22 AM on July 13, 2013
Dark light is a theoretical idea that proposes that dark matter (the invisible stuff that outmasses normal baryonic matter like you and me by a factor of 5 or so) feels a long range force, similar to the long range force of electromagnetism. Since the photon (light) is the mediator of EM, we call the proposed force in the dark sector "dark photons" or "dark light."
I wrote a fairly early paper on the topic, though we were certainly not the first to do so. I'm pretty proud of that paper though, especially since Sean and I started working on it because we thought the phrase "dark light" was too funny not to use. Our conclusion was that dark photons cannot interact with dark matter as strongly as photons interact with normal matter. If they did, then the halos of dark matter that form galaxies and clusters of galaxies would have different shapes than what we have determined them to be, using gravitational lensing and the motion of stars and galaxies. However, reducing the coupling strength by some 2 orders of magnitude would be fine. What this more or less rules out is large-scale objects like stars and planets made up of dark matter. Though, as a recent paper by Lisa Randall and company shows, you can avoid these bounds if only 10% or so of the dark matter had long-range interactions.
Alternatively, what you can do is make the dark photon massive. A force mediated by a massive particle becomes short-range; the reason is exactly the Heisenberg uncertainty principle between energy and time that I mentioned in the previous comment. Forces are transmitted by "throwing" mediators back and forth between two particles; if the particle you have to throw is heavy, you can only throw it so far before the Universe requires you to pay the energy back. That's a total hand-wave, but it captures the essential facts of the process. The weak nuclear force is so weak because the mediating W and Z bosons are so heavy, and so the force is very very short-ranged (the actual weak coupling -the strength of the interaction between the W/Z bosons and particles- is larger than the electromagnetic coupling, actually).
So if the dark photon was heavy - not W/Z heavy - but a few tens of MeV to a GeV (so nuclear-scale energies) then the dark force could avoid all current bounds, and still have important effects on the history and evolution of dark matter in the Universe.
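The mass-to-range relationship physicsmatt is hand-waving at is just the Compton wavelength of the mediator, λ ≈ ħc / (mc²). A back-of-the-envelope sketch (the dark photon mass here is purely illustrative, since as noted we don't know its value):

```python
# Rough range of a force carried by a massive mediator: lambda ~ hbar*c / (m*c^2).
HBARC_MEV_FM = 197.327  # hbar*c in MeV * femtometers

def force_range_fm(mediator_mass_mev):
    """Approximate force range in femtometers for a mediator of the given mass (MeV)."""
    return HBARC_MEV_FM / mediator_mass_mev

# Weak force: the W boson at ~80.4 GeV gives a range of roughly 0.0025 fm,
# far smaller than a nucleus -- hence "very very short-ranged".
print(force_range_fm(80.4e3))

# A hypothetical 100 MeV dark photon gives a range of about 2 fm: nuclear scale,
# consistent with the "tens of MeV to a GeV" window described above.
print(force_range_fm(100.0))
```

This is why making the dark photon heavy hides the dark force from the galaxy-shape bounds: the interaction simply doesn't reach beyond nuclear distances.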
There are a couple of reasons you (for values of "you" that equal "particle physicist") might be interested in dark photon models. The primary instigator was the PAMELA anomaly, which was probed as well by AMS-2 (and which I wrote about on metafilter here). Basically, if some odd signals of antimatter production in space were due to dark matter, you'd need very large interaction cross sections (think of this as the "size" of dark matter particles in some sense), but such a large interaction would not give you the right amount of dark matter in the early Universe (see my linked comment for more details). Having a long-ranged force means that as the Universe expands and dark matter cools, the effective cross section goes up (basically, if you're moving slowly, you can get more easily sucked into a nearby potential well). The paper that kicked off that idea was this one. You notice that that paper has been cited over 670 times by now. People paid attention.
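The "moving slowly means a bigger effective cross section" behavior mentioned above is usually packaged as the Sommerfeld enhancement factor for an attractive Coulomb-like force, S(v) = x / (1 - e^(-x)) with x = πα/v. A toy sketch (the coupling and velocities below are illustrative, not fitted values from the PAMELA literature):

```python
import math

def sommerfeld_factor(alpha_dark, v):
    """Sommerfeld enhancement for an attractive, long-range dark force.
    alpha_dark: dark-sector coupling strength; v: relative velocity in units of c.
    As v -> 0 this grows like pi*alpha_dark/v, boosting the annihilation rate."""
    x = math.pi * alpha_dark / v
    return x / (1.0 - math.exp(-x))

# Early universe, around freeze-out, v ~ 0.3c: enhancement is barely above 1,
# so the relic abundance calculation is roughly unchanged.
print(sommerfeld_factor(0.01, 0.3))

# Galactic halo today, v ~ 1e-3 c: an order-of-magnitude boost,
# which is how a long-range force can reconcile a large signal now
# with the right amount of dark matter then.
print(sommerfeld_factor(0.01, 1e-3))
```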
Now, the PAMELA anomaly has sort of gone away as something of interest to dark matter phenomenologists. However, one of the fun things about crazy results is that they get us to think about ideas we hadn't considered before. It turns out that dark photons are pretty generic in many extensions of the Standard Model. They are NOT present in the minimal supersymmetric Standard Model, and back when we were waiting for the LHC, we thought that the MSSM was the best bet, and spent less time coming up with truly out-there ideas. Now that we have more data and the MSSM looks slightly less compelling (though it is not dead yet), we are a little more willing to get crazy.
So, it turns out that dark photons can crop up all over the place. We can look for them directly in several ways. Most of these methods rely on the fact that dark photons can "mix" with regular photons. The mixing has to be small, otherwise we would know about dark photons already; but that's not unlikely or difficult to arrange in a theory. This means that if you can create a lot of photons, occasionally you can create a dark photon. That dark photon will travel along some distance (maybe a few mm, maybe a few cm, maybe more, depending on parameters we don't know the values of yet), and then decay down into visible particles - like an e-/e+ pair, for example. The easiest way to look for this (at least for certain values of the average distance travelled) is "light shining through walls." Shoot huge numbers of electrons at a target; when they pass through, they'll radiate off photons and (maybe) dark photons. You will get huge numbers of e-/e+ pairs - which would obscure the rare pair created by the dark photon. However, regular e-/e+ and photons can't travel far through solid materials. Dark photons don't give a fuck about matter, and so fly straight through. Then they decay. So you see an e-/e+ pair produced on the other side of a shielding material, and you can use that to infer the presence of dark photons.
Since there is a wide range of parameters here, you can't look for all versions of this model at once, but there are many experiments we are working on trying to cover as much of the space of models as we can. All of them require high-intensity beams of stuff (electrons, most commonly), and so fall into the "intensity frontier" I mentioned.
posted by physicsmatt at 9:24 AM on July 13, 2013 [11 favorites]
ceribus peribus: "After six months of planning, the 50-ft ring magnet was slowly hauled on an eight-axle trailer through the streets of Long Island, ...
... thoroughly degaussing everyone's mp3 collections in the process."
I am waiting to hear how the class action suit from people in that area with genital piercings plays out.
posted by Samizdata at 10:03 AM on July 13, 2013
Besides, I was surprised they did not go with an electronuclearmagnet. It's the inevitable next step.
posted by Samizdata at 10:09 AM on July 13, 2013
Huh, Emmert International is the same company that moved the boulder for LACMA's Elevated Mass sculpture. I remain childishly fascinated with the transportation of Very Large Things.
posted by carsonb
So maybe Stonehenge is associated with a stardrive after all!
Only not quite as directly as those stories would have us believe.
posted by jamjam at 2:23 PM on July 13, 2013
Metafilter: it is known that interactions that could violate flavor are highly suppressed, especially in the lepton sector
posted by hattifattener at 1:44 AM on July 15, 2013
posted by sammyo at 5:00 PM on July 12, 2013 [3 favorites]