Well, I normally don't like to get into too much Big Bang bashing or anything, but I may have come up with some important derivations. But first, please read through this thread and run through the calculations. I know I need to give everyone in that discussion time to look them over as well, but I'll go ahead and get this started in the meantime.
Now, the time for light to travel to us while space is expanding is t = (e^(Hd/c) - 1)/H, where H is the current value of the Hubble constant and d is the current distance to a galaxy from Earth (H and d_f). For a constant rate of expansion, T_universe = 1/H, so let's find the furthest distance a galaxy can be such that a pulse of light emitted at the time of the Big Bang would just now be reaching us. That would be for t = T, so
t = T = 1/H = (e^(Hd/c) - 1)/H [EDIT: This should be /He. This thread is incorrect. Sorry. -grav]
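Spelling out the algebra (taking the formula as originally posted, before the edit's correction):

```latex
t = T = \frac{1}{H} = \frac{e^{Hd/c} - 1}{H}
\quad\Longrightarrow\quad e^{Hd/c} = 2
\quad\Longrightarrow\quad d = \frac{c}{H}\ln 2
```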
That gives us d = (c/H) ln 2 = 1.03972*10^26 meters for c = 3*10^8 m/sec and H = 2*10^-18 sec^-1. This value of d is the final distance travelled over that time, d_f, so let's find the original distance, d_o.
d_o = d_f/e^(H*d_f/c), and e^(H*d_f/c) = 2, so d_o = d_f/2 = 5.1986*10^25 meters.
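For anyone who wants to check the arithmetic, here's a quick Python sketch that reproduces the numbers above, using the formula as originally posted (which the edit flags as incorrect) and the values c = 3*10^8 m/sec and H = 2*10^-18 sec^-1:

```python
import math

# Reproduce the thread's numbers, using the travel-time formula as
# originally posted (the EDIT above flags it as incorrect).
c = 3e8      # speed of light, m/sec
H = 2e-18    # Hubble constant, sec^-1 (value assumed in the thread)

# Setting t = T = 1/H in t = (e^(H*d/c) - 1)/H gives e^(H*d/c) = 2,
# so the final (current) distance is d_f = (c/H) * ln(2).
d_f = (c / H) * math.log(2)
print(f"d_f = {d_f:.5e} m")   # ~1.03972e26 m, as in the thread

# Original distance: d_o = d_f / e^(H*d_f/c), where e^(H*d_f/c) = 2 here.
d_o = d_f / math.exp(H * d_f / c)
print(f"d_o = {d_o:.5e} m")   # half of d_f, ~5.19860e25 m
```

Running it prints d_f = 1.03972e26 m and d_o = 5.19860e25 m, matching the figures above.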
The question, then, is: if all galaxies once existed at a singularity (or close to it), how can the galaxies at the furthest distances have started out at only half that distance, far from a singularity? Also, how can we observe galaxies much further away than this if their light would have taken more time to reach us than the age of the universe? The only answer I can think of is that the expansion model is either incorrect or in need of tremendous alterations. Is the Big Bang model blown?