Of course, we still don't know what the true theory of quantum gravity looks like. But as many people have explained over and over again, there are already extremely tight constraints on what this theory can look like: in the small it had better reproduce the standard model for energies up to about 100 GeV, and it has to look like General Relativity from length scales of a few micrometers up to the size of the solar system, where it has been tested at very high precision (screwing up the laws of gravity at that scale by even a tiny amount would change the orbits of the moon, known to millimeter precision, and of the planets in the long run, and would very likely render the solar system unstable on time scales of its own age). At larger scales, up to the Hubble scale, we still have good evidence for GR, at least if you accept dark matter (and dark energy). These are the boundary conditions a reasonable candidate for quantum gravity has to live with.
Considerations like these suggest that you will most likely need observations close to the Planck scale to see anything new, since anything more radically new would have shown up already.
I know of only two possible exceptions: large extra dimensions (which lower the Planck scale significantly), allowing for black hole production at colliders (which I think of as possible but quite unnatural and thus unlikely), and cosmological gravitational waves (aka tensor modes). String theory, at least, has not spoken the final verdict on those, but there are indications that in most string inspired models they are way beyond detection limits (if you accept those models as being well enough understood to yield reliable predictions).
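For orientation: the mechanism by which large extra dimensions lower the Planck scale is the standard relation (up to numerical factors) between the observed four dimensional Planck mass, the true higher dimensional gravity scale M_* and the size R of the n extra dimensions,

    M_{Pl}^2 \sim M_*^{\,2+n}\, R^n

so a fundamental scale M_* around a TeV is compatible with M_Pl ~ 10^19 GeV if, for n = 2, R is roughly in the sub-millimeter range, just below where Newton's law has been tested directly.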
Let me mention two ideas which I consider extremely hard to realise in such a way that they yield observable predictions and are not yet ruled out. The first example is a theory where you screw with some symmetry or principle by adding terms to your Lagrangian with a dimensionful coefficient corresponding to high energies. Naively you would think that this should influence your theory only at those high energies, and that the effects are hidden from low energy observers like us unless we look really hard.
One popular class of such theories messes with the relativistic dispersion relation, for example by introducing an energy dependent speed of light. Proponents of such theories suggest one should look at ultra high energy gamma rays which have traveled a significant fraction of the universe; those often come in bursts of very short duration. If one assumes those gammas were all emitted at the same time, but then observes that within one burst the ones of higher energy arrive systematically earlier or later, this would suggest that the speed at which they travel depends on the energy.
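To get a feeling for the numbers, here is a back-of-the-envelope sketch (my own illustration, not anybody's published analysis), assuming the simplest ansatz v(E) = c(1 - E/E_QG) with the scale E_QG taken to be the Planck energy:

    # Rough time-of-flight delay for an energy-dependent speed of light,
    # assuming the illustrative linear ansatz v(E) = c * (1 - E/E_QG).
    # All numbers and names are for orientation only.

    C = 3.0e8          # speed of light in m/s
    E_QG = 1.22e19     # quantum gravity scale in GeV, here the Planck energy
    GLY = 9.46e24      # one billion light years in metres

    def time_delay(energy_gev, distance_m, e_qg=E_QG):
        """Arrival delay of a photon of this energy relative to a soft one."""
        return (energy_gev / e_qg) * distance_m / C

    # A 10 GeV photon from a burst a billion light years away arrives
    # roughly 0.03 seconds late. Tiny, but some bursts are short enough
    # that a delay of this size would in principle be visible.
    print(time_delay(10.0, GLY))

The point is that the cosmological distance acts as a lever arm that can amplify a Planck-suppressed effect into something just barely measurable.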
The problem with screwing with the dispersion relation is that you are very likely breaking Lorentz invariance. The proponents of such theories then say that this breaking is only by a tiny amount, visible only at large energies. Such arguments might actually work in classical theories. But in a quantum theory, particles running in loops transmit such breaking effects to all scales.
Another way to put this: in the renormalisation procedure, you should allow for all counter terms allowed by your symmetries. For example, in phi^4 theory there are exactly three renormalisable, Lorentz invariant counter terms: phi^2, phi^4 and phi Box phi, corresponding to mass, coupling constant and wave function renormalisation. But if your theory is not Lorentz invariant at some scale, you have no right to exclude terms like phi d_x phi (where d_x is the derivative in the x-direction). Once those terms are there, they have no reason to have tiny coefficients (after renormalisation group flow). But we know with extremely high precision that in the real world (in the standard model, say) those coefficients are extremely tiny, if non-zero at all.
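In formulas, and writing the counter terms only schematically (the precise normalisations don't matter here), the Lorentz invariant part is

    \delta\mathcal{L} = \frac{\delta_Z}{2}\,\phi\,\Box\,\phi - \frac{\delta_{m^2}}{2}\,\phi^2 - \frac{\delta_\lambda}{4!}\,\phi^4

while a theory without Lorentz invariance has no symmetry argument against terms like

    \delta\mathcal{L}_{LV} = c_1\,\phi\,\partial_x\phi + c_2\,(\partial_x\phi)^2 + \ldots

and generic loop corrections will generate the c_i at whatever size the renormalisation group flow dictates, not at the tiny values observation requires.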
So if your pet theory breaks Lorentz invariance, you had better explain why (after taking into account loop effects) we don't see this breaking already today. So far, I have not seen any proposed theory that achieves this.
There is an argument of a similar spirit in the case of non-commutative geometry. If you start out with [x,y] = i theta, then theta has dimension length^2 (and in more than 2D it breaks Lorentz invariance, but that's another story). If you make it small enough (the Planck length squared, say), it's suggestive to think of it as a small perturbation which will go unnoticed at much larger scales (just as the quantum nature of the world is not visible if your typical action is much larger than h-bar). Again, this might be true in the classical theory. But once you consider loop effects you have UV/IR mixing (the translation of large energy scales into low energy scales) and your tiny effect is seen at all scales. For example, in our paper we worked this out in a simple example and demonstrated that the 1/r^2 Coulomb type force law is modified: the force dies out exponentially over distance scales of order sqrt(theta), the very length scale you were going to identify with the Planck scale in the first place, and, whoops, your macroscopic force is gone...
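Schematically (this is just the qualitative behaviour described above, not the precise formula from the paper), instead of the Coulomb potential one finds something like

    V(r) \sim \frac{1}{r}\, e^{-r/\sqrt{\theta}}

so for r much larger than sqrt(theta), that is at every distance we can actually probe, the force is exponentially suppressed.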
A different example is variable fundamental constants. At first, those look like an extremely attractive feature to a string theorist: we know that constants like the fine structure constant are determined by the details of your compactification. Those details in turn are parametrised by moduli, and thus it's very natural to think of fundamental constants as scalar field dependent. Concretely, for the fine structure constant, you replace F^2 in your Lagrangian by sF^2 for some scalar s which you add, and which is stabilised by some potential at the value that corresponds to alpha = 1/137.
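Schematically (with the standard normalisation, and suppressing all couplings that don't matter here), the setup is

    \mathcal{L} \supset -\frac{s}{4}\,F_{\mu\nu}F^{\mu\nu} - \frac{1}{2}\,(\partial s)^2 - V(s)

where the potential V(s) is assumed to have its minimum at the value of s for which the effective coupling comes out as alpha = 1/137.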
The problem is once again that either the fixing is so tight that you will never see s changing, or s is effectively sourced by electric fields and shows up in violations of the equivalence principle, since it transmits a fifth force. You might once more be able to avoid this problem by extreme fine tuning of the parameters of the model just to avoid detection (you could make the coupling much weaker than the coupling to gravity). But this fine tuning is once more ruined by renormalisation: after including the first few loop orders, s will no longer couple only to F^2 but to all standard model fields, and will have an even harder time playing hide and seek and remaining unobserved to this day (and yes, tests of the equivalence principle have very high precision).
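For reference, such searches are conventionally phrased as bounds on a Yukawa-type modification of the Newtonian potential,

    V(r) = -\frac{G m_1 m_2}{r}\left(1 + \beta\, e^{-r/\lambda}\right)

where beta measures the strength of the fifth force relative to gravity and lambda its range (I write beta rather than the customary alpha to avoid a clash with the fine structure constant); equivalence principle tests constrain beta to be tiny over a huge range of lambda.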
You might say that non-commutative geometry and varying constants are not really quantum gravity theories. But the idea should be clear: we already know a lot about physics, and it's very hard to introduce radically new things without screwing up things we already understand. I'm not saying it's impossible. It's just that you have to be really clever, and a naive approach is unlikely to get anywhere. So the New Einstein really has to be a new Einstein.
But in case you're feeling strong today, consider this job ad.
Thursday, March 06, 2008