Google's PageRank™ algorithm assigns a value to each page according to the famous formula:
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
PR(A) = the PageRank of page A (e.g., your site)
PR(T1) ... PR(Tn) = the PageRanks of the pages T1 through Tn that link to page A
C(T1) ... C(Tn) = the number of outgoing links on pages T1 through Tn, respectively
d = a damping factor, a constant usually set to 0.85
The formula is applied iteratively over the whole link graph until the values converge.
See a primer here: http://www.iprcom.com/papers/pagerank/
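To make the iteration concrete, here is a minimal Python sketch of the computation. The toy three-page link graph and the fixed iteration count are my own illustrative choices, not anything Google publishes:

# A minimal sketch of the iterative PageRank computation described above.
# The link graph, damping factor, and iteration count are made-up
# illustrations, not Google's actual parameters or data.

def pagerank(links, d=0.85, iterations=20):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    pr = {page: 1.0 for page in pages}  # start every page at PR = 1

    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum PR(T)/C(T) over every page T that links to this page
            inbound = sum(
                pr[src] / len(targets)
                for src, targets in links.items()
                if page in targets
            )
            new_pr[page] = (1 - d) + d * inbound
        pr = new_pr
    return pr

# Hypothetical three-page web: A and B link to each other, C links to A.
print(pagerank({"A": ["B"], "B": ["A"], "C": ["A"]}))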
In other words, the importance of your site is measured by the number of inbound links. Each link to your site is a 'vote', and the more important the pages that link to you, the more important Google will think you are (if the big bosses cite you, then you must be a big boss too). But where is the evidence that good, relevant content must attract inbound links? People may find NASA's site rich in content and still not link to it; my site does not link to NASA's.
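To put hypothetical numbers on that 'big boss' effect (my own figures, just for illustration): a single link from a page with PR 10 and 5 outgoing links contributes 10/5 = 2 to your sum, while ten links from small pages, each with PR 0.5 and 10 outgoing links, contribute only 10 × (0.5/10) = 0.5 combined. One vote from the top outweighs ten from the bottom.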
Now, do you think this method is capable of capturing the subjective aspect of web search? Shouldn't there be a correction factor to account for subjectivity?
Let's say I have a low-ranked astronomy site with good content about asteroids. Then an asteroid is discovered heading toward Earth. People start frantically searching the web for information about asteroids. They will get many results, but my rich and enlightening content will not be displayed because it has a low rank.
To insert a subjective correction factor into the algorithm, Google could keep track of the most-searched words at a given moment and dynamically put more weight on pages that feature the word "asteroid", regardless of that page's inbound links. My page would then have a better chance of appearing in the first results, and the user would have a better experience. Wouldn't that be reasonable?
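Here is a rough sketch of how such a correction factor might look. The boost formula, the trending counts, and the PageRank values are all hypothetical, invented only to illustrate the idea:

# A hypothetical sketch of the "subjective correction factor" proposed above.
# The boost formula and trending-count data are my own invention, purely
# illustrative; Google has never published anything like this.
import math

def adjusted_score(pagerank, page_terms, trending_counts):
    """Boost a page's score when it features currently hot search terms."""
    boost = 1.0
    for term in page_terms:
        searches = trending_counts.get(term, 0)
        # Log scale keeps a sudden spike from completely drowning out PageRank
        boost += math.log10(1 + searches)
    return pagerank * boost

# My low-ranked asteroid page vs. a high-ranked general astronomy portal,
# at a moment when "asteroid" is being searched 100,000 times.
trending = {"asteroid": 100_000}
print(adjusted_score(1.0, {"asteroid", "orbit"}, trending))       # ~6.0, boosted
print(adjusted_score(4.0, {"astronomy", "telescope"}, trending))  # 4.0, no boost

With the boost, my small asteroid page would momentarily outrank the big portal for that query, which is exactly the corrective effect I am arguing for.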
Don't you think the PageRank algorithm is biased in favor of the big ones?