lpetrich

2001-Nov-27, 11:22 PM

One interesting conclusion from the state of the art in cosmology is that, when looking back in time, we reach a stage that we cannot really "see" beyond: the stage where quantum-gravity effects become important. This is why I consider Stephen Hawking's quantum-cosmology theories rather speculative.

Why this is so can be seen from Stephen Hawking's gravitational-collapse singularity theorems; the Big Bang can be handled with these theorems as gravitational collapse run backwards. And for physically reasonable equations of state of the Universe's matter-energy content, one finds a singularity at some finite time, as measured by some "inhabitant" of the Universe.

However, this analysis assumes that the classical limit is a reasonable approximation, and that assumption becomes invalid when length scales, densities, etc. approach their Planck values, which are values formed from Planck's constant, the speed of light in a vacuum, and the gravitational constant -- the natural unit scales for quantum gravity.
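To make those Planck values concrete, here is a quick sketch of how they follow from combining Planck's constant, the speed of light, and the gravitational constant (the numerical constants below are standard CODATA values, not from the original post):

```python
import math

# Fundamental constants in SI units (CODATA values, approximate)
hbar = 1.054571817e-34  # reduced Planck constant, J s
c    = 2.99792458e8     # speed of light in a vacuum, m/s
G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2

# The unique combinations with dimensions of length, time, mass, and density
planck_length  = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
planck_time    = math.sqrt(hbar * G / c**5)   # ~5.4e-44 s
planck_mass    = math.sqrt(hbar * c / G)      # ~2.2e-8 kg
planck_density = c**5 / (hbar * G**2)         # ~5.2e96 kg/m^3
```

When length scales shrink toward planck_length, or densities climb toward planck_density, the classical approximation above can no longer be trusted.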

Also, this stage is the domain of Grand Unified Theories, which typically predict a big zoo of extra particles that are too massive to be observed at particle-accelerator energies.

And although the Standard Model and many GUTs can be constructed fairly self-consistently, it is difficult to do the same for quantum gravity. The SM/GUT problem is that divergences appear when one tries to do perturbative calculations, divergences caused by behavior at short distances and high energy scales. These divergences can be systematically subtracted out, a process called renormalization.

Attempts to create a renormalizable quantum-gravity theory have been far from successful; construction of quantum-gravity theories is also hampered by time being poorly defined in the strong-quantum-gravity limit.
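A rough way to see why gravity resists renormalization is a standard power-counting argument (my gloss, not from the original post): in natural units the Standard Model couplings are dimensionless, but Newton's constant has dimensions of inverse mass squared, so each order of perturbation theory in it grows with energy and demands new counterterms:

```latex
% Natural units, \hbar = c = 1:
G_N \sim \frac{1}{M_{\rm Pl}^{2}},
\qquad
\mathcal{A}_n \sim \left(\frac{E^{2}}{M_{\rm Pl}^{2}}\right)^{\!n}
% An amplitude at order n in G_N grows with energy E,
% so infinitely many independent counterterms are needed.
```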

One possible way out is to add a lot of extra elementary-particle fields that act as counterterms, at least partially canceling the divergences; that is what happens in supersymmetric theories. One set of theories that takes this approach is superstring theories, which specify an infinite number of such extra fields.

And although superstring theories are generally well-behaved, they prefer to live in 10 space-time dimensions rather than 4, and getting the Standard Model out of them has been rather difficult. For starters, 6 of the 10 dimensions have to be curled up into a very small ball ("compactification"), and the topology of that ball determines the resulting low-energy physics.

So we either have:

GR + Standard Model (traditional approach) -> no self-consistency
superstrings (GR + self-consistency) -> no Standard Model

Meaning that it is difficult to draw conclusions about the Universe's quantum-gravity epoch, which it once passed through.
