1. ## Google PageRank

Google's page rank (PageRank™) assigns values to pages according to the famous formula:

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

Where:

PR(A) = the PageRank of page A (your site)
PR(T1) = the PageRank of the first page in the iteration
PR(Tn) = the PageRank of the last page considered in the iteration
C(T1) = the number of outgoing links on page T1
C(Tn) = the number of outgoing links on page Tn
d = a damping factor, a constant usually set to 0.85

See a primer here: http://www.iprcom.com/papers/pagerank/
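The formula above can be computed iteratively: start every page at the same rank and repeatedly apply the formula until the values settle. Here is a minimal sketch (the graph representation and function name are my own, purely illustrative choices):

```python
# A minimal sketch of the iterative PageRank computation described above.
# `links` maps each page to the list of pages it links to.
def pagerank(links, d=0.85, iterations=100):
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # start every page with rank 1
    for _ in range(iterations):
        new_pr = {}
        for p in pages:
            # sum of PR(T)/C(T) over every page T that links to p
            inbound = sum(pr[t] / len(links[t]) for t in pages if p in links[t])
            new_pr[p] = (1 - d) + d * inbound
        pr = new_pr
    return pr

# Example: A is linked by both B and C, so it ends up with the highest rank.
ranks = pagerank({"A": ["B"], "B": ["A", "C"], "C": ["A"]})
```

Note how rank only flows in through inbound links, which is exactly the point being debated below: a page with no inbound links can never rise above the (1-d) floor, no matter how good its content is.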

In other words, the importance of your site is measured by the number of inbound links. Each link to your site is a 'vote', and the more important the page that links to you, the more important Google will think you are (if the big bosses cite you, then you must also be a big boss). But where is the evidence that good, relevant content must attract inbound links? People may find NASA's site rich in content and still not link to it. My site does not link to NASA's.

Now, do you think this method is capable of capturing the subjective aspect of web search? Shouldn't there be a correction factor to account for subjectivity?

Let's say I have a low-ranked astronomy site with good content about asteroids. Then it is discovered that an asteroid is heading toward Earth. People will start frantically searching the web for information about asteroids. They will get many results, but my rich and enlightening content will not be displayed because it has a low rank.

To insert a subjective correction factor into the algorithm, Google could keep track of the most searched words at a given moment, and dynamically put more weight on pages that feature the word "asteroid", regardless of the inbound linking of those pages. So my page would have a better chance of being shown in the first results, and the user would have a better experience. Wouldn't that be reasonable?
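One very rough way to sketch that correction factor: multiply a page's static rank by a boost derived from how many currently trending terms it contains. Everything here (the function, the `trending` table, the boost weight) is a hypothetical illustration of the proposal, not anything Google actually does:

```python
# Hypothetical sketch of the proposed correction factor: boost a page's score
# when its text contains terms that are currently trending in search logs.
# `trending` maps hot query terms to a weight between 0 and 1 (illustrative).
def boosted_score(static_rank, page_text, trending, boost=0.5):
    words = set(page_text.lower().split())
    trend_weight = sum(w for term, w in trending.items() if term in words)
    return static_rank * (1 + boost * trend_weight)

# A low-ranked asteroid page gets lifted while the topic is hot;
# an unrelated page keeps its original score.
hot = {"asteroid": 1.0}
lifted = boosted_score(0.2, "asteroid impact predictions", hot)
unchanged = boosted_score(0.2, "my favorite cooking recipes", hot)
```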

Don't you think the PageRank algorithm is biased in favor of the big ones?
Last edited by Argos; 2006-Mar-02 at 03:25 PM. Reason: Grammar

2.
Perhaps Google should rank according to the number of times that the word or phrase searched for appears in the text. Ranking would now be different for every new search. This wouldn't ensure quality, but at least one wouldn't encounter high-ranked hits where the subject of interest is only mentioned in passing, if at all.
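The per-query ranking suggested above can be sketched in a few lines: score each page by how often the searched words occur in its text, and sort. The function and sample data are my own illustrative choices:

```python
# A sketch of the per-query ranking idea: order pages by how often the
# searched words appear in their text (simple term frequency).
def rank_by_frequency(query, pages):
    terms = query.lower().split()

    def score(text):
        words = text.lower().split()
        return sum(words.count(t) for t in terms)

    return sorted(pages, key=score, reverse=True)

# The page that mentions the query most comes first, and the ordering is
# recomputed for every new query rather than fixed in advance.
ranked = rank_by_frequency(
    "asteroid",
    ["the asteroid asteroid page", "a planet page", "one asteroid here"],
)
```

As noted above, this rewards mere repetition rather than quality, which is exactly the weakness spammers exploit.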

Personally, I tend to add more and more words which I expect to find until the number of hits has decreased to a manageable level.

In my experience, though, the hundreds of thousands of hits claimed are not usually available. If some huge number of hits is claimed, in actuality there might be around one thousand hits, with the rest "similar texts on other sites", or some such phrase. If a few hundred hits are claimed, there may actually be around thirty or forty, which can be visited one by one.

3. I definitely think Google's system needs to be revised a bit. Up until recently, the number of links pointing to a site was a good gauge of that site's popularity; now you have people who post on many, many boards just to leave a link to their site for the Google spiders to find. There are also a number of program hacks that automatically search for guestbooks and place posts in them for the same reason. I'm constantly having to keep an eye out for these 'spam' posts to keep my various clients' guestbooks clean.

I'm not sure there is a hard and fast answer to it. There are search engines that base ranking on the number of keywords in the content, your asteroid for example. But there is a problem there as well. A site unrelated to the subject matter will often hide keywords and load up on them to try to pull in a certain type of audience. I remember doing a search on Webcrawler for a software program; after clicking on an innocent-looking, highly ranked link I was taken straight to a porn site. At the bottom of their page, text of the same color as the background had been inserted with 20 or so software program names repeated over and over (I happened to notice all the blank space at the bottom, and highlighting the background revealed the nasty ruse).
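The same-color-as-background trick described above can, in simple cases, be spotted mechanically. Here is a rough, purely illustrative heuristic that flags old-style HTML where an inline font color matches the body's background color (real spam detection is far more involved):

```python
# A rough heuristic for the hidden-text trick described above: flag HTML
# where an inline <font> color matches the page's bgcolor attribute.
# Illustrative only; it handles just this one legacy-HTML pattern.
import re

def has_hidden_text(html):
    bg = re.search(r'bgcolor=["\']?(#?\w+)', html, re.I)
    if not bg:
        return False
    # any <font color=...> matching the background color is suspicious
    for m in re.finditer(r'<font[^>]*color=["\']?(#?\w+)', html, re.I):
        if m.group(1).lower() == bg.group(1).lower():
            return True
    return False
```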

4. Originally Posted by sidmel
There are search engines that base ranking content on number of key words, your asteroid for example.
Yes. In the case of Google, it does not support meta keywords, because it is easy to manipulate results via keywords. I'm referring to capturing the searched terms within the body of the document. In fact, Google already puts a lot of weight on anchor text and headings.

5. Sorry, I should have worded that differently. By keywords, I meant words important to a particular website that are contained in the body of the text, e.g. asteroids on an astronomy site.

I was attempting to point out that you can (and many sites do) skew the results in this manner.

6. OK, I got it and I agree. However, it should be easy to sort out text spamming.

Another problem with PageRank is that it also informs Google's crawlers. If your site has been left out of the index because of its low rank, it's likely that it will remain out of sight, perpetuating your low rank.

7. Some of you might like to be introduced to the paper that paved the way for Google, in case you haven't seen it.
