PageRank
The latest PageRank update may be delayed because of the Bigdaddy update.
Google has not updated its toolbar PageRank since October 2005. These days toolbar PageRank is usually updated about once every three months, so everyone has been waiting for an update.
Matt Cutts has commented on his blog that the Bigdaddy update is the likely reason for the delay in the visible toolbar PageRank update.
We need to wait to see more green for our sites :)
This is what Matt said about the PR update:
"Ben, I don't know and it wouldn't matter. No idea when the next toolbar PageRank update is. I'm guessing that the Bigdaddy changes might cause the PageRank update to come later."
Do redirects pass PageRank?
Many people ask whether redirects pass Google PageRank.
Answer:
It depends on the type of redirect used. A plain 302 (temporary) redirect rarely passes PageRank, whereas a 301 (permanent) redirect does pass PageRank.
In short, passing PageRank depends on the redirect type. If the redirect is hidden inside JavaScript, it is impossible for the link to pass PageRank; similarly, links from folders blocked by the robots.txt file will not pass PageRank.
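For illustration, here is a minimal sketch of serving a 301 (permanent) redirect with Python's standard library; the paths and target URL are placeholders, not a recommendation for any particular server setup.

```python
# Minimal sketch: serving a 301 (permanent) redirect with Python's standard
# library. The paths and domain below are placeholders, not real sites.
from http.server import BaseHTTPRequestHandler, HTTPServer

OLD_TO_NEW = {
    "/old-page.html": "http://www.example.com/new-page.html",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = OLD_TO_NEW.get(self.path)
        if target:
            # A 301 tells crawlers the move is permanent, which is the case
            # the post says passes PageRank (unlike a temporary 302).
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), RedirectHandler).serve_forever()
```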
Carcaserdotcom SEO contest test page – just checking to see how this site ranks in the carcaserdotcom SEO contest
Just checking to see how this site ranks for the carcaserdotcom SEO contest organized by carcasher.com. We don't want to exchange links with sites running SEO contests, so please don't contact us with link exchange offers for the carcaserdotcom SEO contest.
Thousands of sites will soon jump into this carcaserdotcom contest. Good luck to all.
List of TLD (top-level domain) extension codes
Here is a list of TLD codes. Each extension shows which country it belongs to. The list was compiled by ICANN.
ICANN – Internet Corporation for Assigned Names and Numbers
IANA – Internet Assigned Numbers Authority
.ac – Ascension Island
.ad – Andorra
.ae – United Arab Emirates
.af – Afghanistan
.ag – Antigua and Barbuda
.ai – Anguilla
.al – Albania
.am – Armenia
.an – Netherlands Antilles
.ao – Angola
.aq – Antarctica
.ar – Argentina
.as – American Samoa
.at – Austria
.au – Australia
.aw – Aruba
.az – Azerbaijan
.ax – Aland Islands
.ba – Bosnia and Herzegovina
.bb – Barbados
.bd – Bangladesh
.be – Belgium
.bf – Burkina Faso
.bg – Bulgaria
.bh – Bahrain
.bi – Burundi
.bj – Benin
.bm – Bermuda
.bn – Brunei Darussalam
.bo – Bolivia
.br – Brazil
.bs – Bahamas
.bt – Bhutan
.bv – Bouvet Island
.bw – Botswana
.by – Belarus
.bz – Belize
.ca – Canada
.cc – Cocos (Keeling) Islands
.cd – Congo, The Democratic Republic of the
.cf – Central African Republic
.cg – Congo, Republic of
.ch – Switzerland
.ci – Cote d'Ivoire
.ck – Cook Islands
.cl – Chile
.cm – Cameroon
.cn – China
.co – Colombia
.cr – Costa Rica
.cs – Serbia and Montenegro
.cu – Cuba
.cv – Cape Verde
.cx – Christmas Island
.cy – Cyprus
.cz – Czech Republic
.de – Germany
.dj – Djibouti
.dk – Denmark
.dm – Dominica
.do – Dominican Republic
.dz – Algeria
.ec – Ecuador
.ee – Estonia
.eg – Egypt
.eh – Western Sahara
.er – Eritrea
.es – Spain
.et – Ethiopia
.eu – European Union
.fi – Finland
.fj – Fiji
.fk – Falkland Islands (Malvinas)
.fm – Micronesia, Federal State of
.fo – Faroe Islands
.fr – France
.ga – Gabon
.gb – United Kingdom
.gd – Grenada
.ge – Georgia
.gf – French Guiana
.gg – Guernsey
.gh – Ghana
.gi – Gibraltar
.gl – Greenland
.gm – Gambia
.gn – Guinea
.gp – Guadeloupe
.gq – Equatorial Guinea
.gr – Greece
.gs – South Georgia and the South Sandwich Islands
.gt – Guatemala
.gu – Guam
.gw – Guinea-Bissau
.gy – Guyana
.hk – Hong Kong
.hm – Heard and McDonald Islands
.hn – Honduras
.hr – Croatia/Hrvatska
.ht – Haiti
.hu – Hungary
.id – Indonesia
.ie – Ireland
.il – Israel
.im – Isle of Man
.in – India
.io – British Indian Ocean Territory
.iq – Iraq
.ir – Iran, Islamic Republic of
.is – Iceland
.it – Italy
.je – Jersey
.jm – Jamaica
.jo – Jordan
.jp – Japan
.ke – Kenya
.kg – Kyrgyzstan
.kh – Cambodia
.ki – Kiribati
.km – Comoros
.kn – Saint Kitts and Nevis
.kp – Korea, Democratic People's Republic
.kr – Korea, Republic of
.kw – Kuwait
.ky – Cayman Islands
.kz – Kazakhstan
.la – Lao People's Democratic Republic
.lb – Lebanon
.lc – Saint Lucia
.li – Liechtenstein
.lk – Sri Lanka
.lr – Liberia
.ls – Lesotho
.lt – Lithuania
.lu – Luxembourg
.lv – Latvia
.ly – Libyan Arab Jamahiriya
.ma – Morocco
.mc – Monaco
.md – Moldova, Republic of
.mg – Madagascar
.mh – Marshall Islands
.mk – Macedonia, The Former Yugoslav Republic of
.ml – Mali
.mm – Myanmar
.mn – Mongolia
.mo – Macau
.mp – Northern Mariana Islands
.mq – Martinique
.mr – Mauritania
.ms – Montserrat
.mt – Malta
.mu – Mauritius
.mv – Maldives
.mw – Malawi
.mx – Mexico
.my – Malaysia
.mz – Mozambique
.na – Namibia
.nc – New Caledonia
.ne – Niger
.nf – Norfolk Island
.ng – Nigeria
.ni – Nicaragua
.nl – Netherlands
.no – Norway
.np – Nepal
.nr – Nauru
.nu – Niue
.nz – New Zealand
.om – Oman
.pa – Panama
.pe – Peru
.pf – French Polynesia
.pg – Papua New Guinea
.ph – Philippines
.pk – Pakistan
.pl – Poland
.pm – Saint Pierre and Miquelon
.pn – Pitcairn Island
.pr – Puerto Rico
.ps – Palestinian Territories
.pt – Portugal
.pw – Palau
.py – Paraguay
.qa – Qatar
.re – Reunion Island
.ro – Romania
.ru – Russian Federation
.rw – Rwanda
.sa – Saudi Arabia
.sb – Solomon Islands
.sc – Seychelles
.sd – Sudan
.se – Sweden
.sg – Singapore
.sh – Saint Helena
.si – Slovenia
.sj – Svalbard and Jan Mayen Islands
.sk – Slovak Republic
.sl – Sierra Leone
.sm – San Marino
.sn – Senegal
.so – Somalia
.sr – Suriname
.st – Sao Tome and Principe
.sv – El Salvador
.sy – Syrian Arab Republic
.sz – Swaziland
.tc – Turks and Caicos Islands
.td – Chad
.tf – French Southern Territories
.tg – Togo
.th – Thailand
.tj – Tajikistan
.tk – Tokelau
.tl – Timor-Leste
.tm – Turkmenistan
.tn – Tunisia
.to – Tonga
.tp – East Timor
.tr – Turkey
.tt – Trinidad and Tobago
.tv – Tuvalu
.tw – Taiwan
.tz – Tanzania
.ua – Ukraine
.ug – Uganda
.uk – United Kingdom
.um – United States Minor Outlying Islands
.us – United States
.uy – Uruguay
.uz – Uzbekistan
.va – Holy See (Vatican City State)
.vc – Saint Vincent and the Grenadines
.ve – Venezuela
.vg – Virgin Islands, British
.vi – Virgin Islands, U.S.
.vn – Vietnam
.vu – Vanuatu
.wf – Wallis and Futuna Islands
.ws – Western Samoa
.ye – Yemen
.yt – Mayotte
.yu – Yugoslavia
.za – South Africa
.zm – Zambia
.zw – Zimbabwe
What Causes Sandbox filter? New experiment reveals why sandbox filter exists. Is sandbox a side effect of trust rank?
There have been numerous discussions about Google's sandbox filter, which can hold sites back for up to 16 months before they start ranking well in Google's results. So what causes this filter? After losing patience trying to rank client sites, we at Search Engine Genie conducted a test across about 15 sites. The test revealed interesting results; a summary of the anti-Google-sandbox-filter experiment follows.
1. Does the sandbox filter really exist?
Based on our experiment, the sandbox filter does exist. But it does not affect all sites; it affects sites that are artificially linked in order to rank, through search engine optimization.
2. What causes the sandbox filter?
This is an important question asked many times in forums, message boards and blogs, and no one has been able to give a definite answer. Even we had to struggle a bit to figure out what this sandbox is all about. Finally, through our experiment using different strategies on about 15 sites, we were able to find the cause of the sandbox filter. Our experiments suggest that about 90% of the theories circulating in forums and articles are wrong. The sandbox filter is caused purely by links and nothing else. It is abnormal link growth, in the eyes of Google's algorithm, that results in a site being placed in the sandbox.
As we all know, the internet is based on natural linking, and search engines and their ranking algorithms are the main cause of artificial linking. Search engines like Google have now woken up and are combating link spam with sophisticated algorithms like the sandbox filter. The sandbox is a natural consequence for sites using any sort of SEO-style methods to gain links; the next topic explains this. The sandbox filter relies on trust rank, not PageRank as before, to judge the quality of links.
Trust rank is a new algorithm active at Google. The name "trust rank" is used by Google, Yahoo! and other search engines, so we can simply talk about trusted links. Trusted links are links which are hand-edited for approval or algorithmically given maximum credit to vote for other sites, such as links from reputable sites or from authorities in an industry, like .gov sites. Google's new algorithm looks at what type of trusted links a site has, especially if the site is new. If a site starts off with SEO-style links rather than naturally gained trusted links, the site will be placed in the sandbox for a period of up to a year or even more.
3. What factors / methods lead to the sandbox filter?
All types of SEO-style link building can lead to the sandbox filter.
a. Reciprocal link building:
Reciprocal link building is one method that will lead to a definite sandbox / aging filter. When a site starts off with reciprocal link building, the site will almost certainly be sandboxed. First of all, reciprocal link building is not the way to build trusted links. Sites which are trusted / hand-edited for approval do not have to trade links with other sites; they voluntarily link out to other sites, and no one can force them to link out or trade for a link. Most of the sites involved in reciprocal link building are very weak themselves, so a site that relies on reciprocal link building is not going to get trusted links, and a new site that grows with untrusted links will be placed in the aging filter.
We don't condemn reciprocal link building; in fact we do it all the time for our clients. But reciprocal link building by itself doesn't add any value to end users, and it is built purely to manipulate search engine result pages. So Google is not to be blamed for coming out with such an effective algorithm against aggressive link building. For a new site we don't recommend reciprocal link building straight away: first build trust for your site, then do reciprocal link building; there are no issues at that point. So how do you know you have built trust with Google's algorithm? Ranking proves it. If your site is ranking well for competitive and non-competitive terms, you can safely assume you are no longer in the sandbox. We are talking about at least a month of ranking for good terms, not a day or a week.
b. Buying links:
This is another method which will definitely lead to a severe sandbox filter. Our experiment showed that buying links for new sites hurts the site badly when it comes to Google. When you start a site, don't immediately go and buy a lot of links; buying links is not the way to gain trusted links, and most of the sites where you can buy links are actively monitored by Google. Even if you buy a link from a great site in your field, it still won't be counted as a great trusted link. Site-wide links (links placed throughout a site) are especially dangerous for a new site and will definitely delay its ranking to a great extent. If you do buy a link from a great site, make sure you get a link from only one page, and make sure that link is not placed in the footer; it should be placed somewhere inside the content so it looks more natural. It is not worth buying links for new sites. Give the site time to grow with naturally gained backlinks.
c. Directory submission:
Directory submission has proved worthless when it comes to avoiding the sandbox. Though directory submissions don't directly cause the sandbox, those links will definitely affect the reputation of a site, especially when the site is new. As discussed before, it is important to gain trusted links when the site is new, so we recommend against doing directory submissions as the first step in building links: when search engines see those links, they may place your site into the aging filter. Most directories are newbie and start-up directories; we can name only 4 or 5 directories that are trustworthy enough to be worth a listing. Ask yourself whether you would go to a directory to find relevant sites today. I am afraid not; directories are a thing of the past, and people use search engines to find information. That is why search engines don't like to list directories in their results. Two exceptions are dmoz.org and the Yahoo directory.
If you get a link from dmoz.org, consider it one of the best trusted links on the internet, but dmoz can take up to 15 months to list a site; even they hold a site until it grows to a certain quality level. The Yahoo directory is not as powerful as dmoz, most probably because it is paid and much of the paid information on the internet is of poor quality. Next to dmoz, the Yahoo directory is a safe and trusted place to get a listing. Other than those, we don't recommend any directory, whether paid or free.
We don't dismiss directory links altogether, but it is not recommended to get links from useless / spam / unworthy directories, especially if the site is new.
d. Links from comment spam / blog spam / guest book spam / message board spam:
If you are an aggressive SEO and have been using bad tactics, then Google is not for you, or at least not the new Google. You are going to wait forever to get top rankings in Google with a new site if your initial links come from spam sources. These link tactics still work with Yahoo and MSN, but not with Google anymore. If you launch a new site and take this path, I would expect a sandbox period of about 2 years, and by that time your site may be caught in some other, more severe penalty. Better not to try this with Google.
e. Links through article reproduction:
Links through article reproduction have always been a good way to build links, but not anymore for good sites. Google has become very good at finding duplicate copies across the web. If you rely on article reproduction for backlinks, you are dealing with links from duplicate copies of the article, which Google considers unworthy of its index. Take a good article snippet and search Google: apart from 2 or 3 main copies, all the other copies will be supplemental results. This is one way to find out whether an article is being treated as duplicate content. Getting links from these duplicate copies will in no way help the trustworthiness of your site. So it is better to avoid article reproduction; instead, get people to link from their sites to a good article on your site.
f. Links from a network of sites you own:
Always avoid this when you run a new site. Some people, especially SEO companies, tend to connect a new client site to an existing network of other active clients plus the sites they own or have a tie-up with. Avoid this for new sites: it just doesn't work, and it only prolongs the sandbox period. The network you have access to does not have the trust to vote for a new site, so avoid linking from your own network.
g. Don't participate in co-op link networks like the Digital Point ad network or Link Vault:
Almost all the sites involved in these ad networks are not trustworthy. Yahoo is very severe with these types of networks, and Google also has good algorithms to identify these links. These links never work for new sites and only prolong the aging filter, so it is better to avoid these types of link networks.
4. Is it some sort of hidden penalty?
Yes, it is a kind of penalty for a site. Penalty is a harsh word for this filter, but it is effectively a hidden penalty. Google treats the sandbox filter as a kind of holding penalty: it keeps the site held back until the site proves its trust, and during that time Google watches the site's link growth patterns.
5. What is the inner working of the sandbox filter?
In our research we were able to work out how this sandbox / aging filter behaves. When Google first finds a site, it assigns it an inception date (the date Google first discovered the site). From then on, it watches the growth pattern of the site's links. If the algorithm sees a lot of non-trustworthy links coming into the site (especially if the site is new), it will hold the site back from ranking for anywhere between 3 and 16 months. Once the site is in the sandbox, Google's algorithm keeps watching the link growth pattern. If it sees normal growth of both trusted and ordinary links, it will release the site from the sandbox sooner; if it sees growth of untrusted links, the site will be held back from ranking much longer.
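Purely as an illustration of the behaviour described above (this is our reading of it, not Google's actual code), here is a minimal Python sketch of such a hold based on an inception date and the share of trusted links; all thresholds and names are hypothetical.

```python
from datetime import date, timedelta

# Hypothetical thresholds purely for illustration; any real values are unknown.
MIN_HOLD = timedelta(days=90)      # roughly 3 months
MAX_HOLD = timedelta(days=480)     # roughly 16 months
TRUSTED_RATIO_NEEDED = 0.5

def sandbox_release_date(inception: date, links: list[dict]) -> date:
    """Estimate when a new site would be released from the hold.

    Each link is a dict like {"trusted": bool}. The more untrusted links
    dominate the growth, the longer the hold, as the article describes.
    """
    if not links:
        return inception + MIN_HOLD
    trusted_ratio = sum(l["trusted"] for l in links) / len(links)
    if trusted_ratio >= TRUSTED_RATIO_NEEDED:
        return inception + MIN_HOLD
    # Scale the hold toward the maximum as untrusted links dominate.
    extra = (MAX_HOLD - MIN_HOLD) * (1 - trusted_ratio / TRUSTED_RATIO_NEEDED)
    return inception + MIN_HOLD + extra

# Example: a site found on 2006-01-01 with mostly reciprocal/bought links.
print(sandbox_release_date(date(2006, 1, 1),
                           [{"trusted": False}] * 9 + [{"trusted": True}]))
```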
6. Is the sandbox based on the whois registration date?
No. Based on our experiment, the sandbox filter does not rely on whois data to determine the age of a site.
7. How do you detect whether a site is in the sandbox filter?
There are various ways to tell whether your site is in the sandbox.
a. One method that has long been in use is the allinanchor: keyword1 keyword2 search. If your site ranks for the allinanchor: search but not for the normal keyword search, then most probably your site is in the sandbox filter (a small query-building sketch follows this list).
b. Next, check your ranking in other major search engines. If your site ranks exceptionally well in Yahoo and MSN but not in Google, that could be another sign that the site is in the sandbox filter. But remember, Google's algorithm is more sophisticated than Yahoo's or MSN's, and judging that your site is in the sandbox on this evidence alone is wrong.
c. Check the quality of your backlinks and compare them with your competitors'. There are numerous tools which show link comparisons; compare what your competitors have and where the links to their sites are placed.
d. Check rankings for both competitive and non-competitive terms. If your site ranks for neither, then most probably it is sandboxed.
e. Check for duplicate content on your site. If your site has duplicate content, fix it first before investigating sandbox problems. As I said before, the sandbox filter is caused purely by links, so look into link problems only if your on-page factors are fine and of Google quality. Remember, everything discussed here is for sites with quality content, a unique business, etc. These instructions are not for sites that use scraped content, made-for-AdSense sites, aggressive affiliate sites or sites using other spam tactics; they are purely for sites with good quality domains that still have problems ranking in search engines.
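The sketch below simply builds the two Google query URLs for check (a) so you can compare them by hand; the keywords are placeholders.

```python
# Small helper for detection method (a): build the two Google queries to
# compare by hand. The keywords here are placeholders.
from urllib.parse import quote_plus

def sandbox_check_queries(keywords: str) -> tuple[str, str]:
    normal = "http://www.google.com/search?q=" + quote_plus(keywords)
    anchor = "http://www.google.com/search?q=" + quote_plus("allinanchor:" + keywords)
    return normal, anchor

normal_url, anchor_url = sandbox_check_queries("car insurance quotes")
print("Normal search:     ", normal_url)
print("allinanchor search:", anchor_url)
# If the site ranks in the allinanchor results but not in the normal
# results, the article suggests it may be in the sandbox filter.
```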
8. Does the sandbox filter affect only certain areas of search?
No, the sandbox filter affects all areas. It is not keyword- or topic-based but link-based: links are everywhere, and the filter is applied everywhere.
9. Can a competitor sabotage a ranking using the evil sandbox filter of Google?
This is a very good question; having discussed all of the above, people would call us on it if we did not address this issue. Our research focused on this too. We included some older domains in the test, and we found that a competitor CANNOT sabotage the existing ranking of a site through these link filter algorithms. But why not?
Here is what we concluded from our experiment:
Google wants a site to be trusted before it starts ranking very well, so in order to rank well a site must establish trust with Google's algorithm. But this trust is a one-time process: once the site establishes its trust with Google and starts ranking well, it does not have to prove it again as long as the trusted links remain.
For example, if a new site starts off with good trusted links, it does not get sandboxed at all. Say it ranks very well for a month, and an aggressive competitor who has been watching plans to destroy its ranking. He spams blogs, message boards and guest books, posts site-wide links on many sites, and so on. Will he succeed in sabotaging the ranking of the good site? No. Since the good site has already established trust with Google's algorithm, all these links from bad pages will only boost its ranking; they will not harm it. Google's algorithm knows this very well.
So why can't a competitor do this while the site is still new and not yet ranking?
Firstly, because he won't know that such a site is growing strong. Until a new site ranks well for targeted phrases, it is not on anyone's radar; only when it ranks does it catch the eye of unscrupulous SEO companies and competitors, and only then do they plan to kill its rankings. But by that time it is too late, since the site has already established trust with Google's algorithm through the quality backlinks it already has. So if a site owner thinks of sabotaging the existing ranking of a site by pointing bad spam backlinks at it, they should remember that they are actually helping that site rank better than ever. This is one reason why Google always preaches that no one can harm a site's ranking other than the people responsible for the site.
So what is the proof of the above statement?
Our experiment is the proof. Though we are an ethical SEO company, we know where the bad backlinks are. We pointed thousands of those backlinks at sites we monitor which currently rank well, kept those links alive for 2 or 3 months, and concluded that the bad links did nothing harmful to the sites; in fact they boosted those sites' rankings, and Google's algorithm never cared about those backlinks.
We are not discussing canonical issues, 302 hijacking, 301 problems, etc. here; we are only addressing whether a competitor can sabotage a site's ranking through link spam. There have been cases where 302 hijacking by external domains hurt a site: Matt Cutts's (a senior Google engineer) blog is proof of this, as the Dark SEO team hijacked it through a tricky 302 redirect. But that is a different issue, and this article deals with the sandbox filter only.
10. Do we risk losing credit for untrusted links gained for a new site before it receives trusted links?
Yes, this is a major risk. It often happens that most of the links to a new site are traded or bought: the site owner has read in forums that reciprocal link building or buying links is the best way to build links, and then realizes that his site is not ranking. He finally decides he needs quality links, goes for branding, writes great articles and brings interesting content to the site, which starts attracting high-quality natural links. His site begins to improve steadily, and at this point he risks losing credit for the reciprocal / bought links that were there before the site started acquiring trusted links. We have seen this happen with Google. So we recommend that site owners and SEO companies avoid link building methods for new sites that bring in non-trustworthy links. For a new site, think of natural ways to build links. Think about how great sites grew to their level: not all of them started with 10,000 links. Successful sites take time to build their brand, and that is how search engines want new sites to grow. Once your site is good enough, bring in great links and use every ethical link building strategy in the book.
11. How to escape the sandbox filter?
a. Don't use artificial link building methods for a new site. Methods to avoid initially include reciprocal link building, directory submissions (excluding dmoz and the Yahoo directory), buying links, article reproduction, links from co-op ad networks, links from a network of your own sites, links from spam sites, site-wide links, etc.
b. Let links grow naturally, at a slow rate, for new sites.
c. Show Google a steady growth of links and make it understand that your site is trustworthy.
d. Attract people to link to your site naturally, and think of the Million Dollar Homepage concept: if a college-going youngster can attract 60,000 links in a few months, why not you? Even 10% of that in quality natural links would do your site a lot of good.
12. How do you get a site out of the sandbox?
This is an important question which many people ask.
First, think of ways to attract natural links. Don't run a site with a purely commercial focus; today both users and search engines want information and commerce mixed. Take Search Engine Genie, for example: a mixed site with tools, a blog, articles and much more to give to users, alongside our commercial focus. There are other great SEO sites that do the same, such as SEOmoz and SEOcompany.ca; though these sites are new, they have become very strong simply because they are a valuable resource for search engines and users.
Submit to dmoz. First make sure your site is good enough to be accepted by reading their guidelines, and then submit it.
Write great articles and get people to link to them. Make people aware that a good article exists on your site: participate in related forums, post your article there and ask for opinions. If people like your article you will get a ton of traffic, and some readers, as a token of appreciation, will link from their sites to your article.
Start a blog and share with the world what you know about your business; show your expertise. Blogs attract lots of links today, and you can also offer your content to other blogs as feeds.
Think of a great strategy like the Million Dollar Homepage to build backlinks.
As Matt Cutts suggests, even basic interviews will attract good backlinks.
We have listed a few ideas, but you can think of more. Use every possible way to build natural links until your site gains trust; once it has the necessary trust you can use all the backlink strategies.
Wait for at least 3 months, because if you are building natural backlinks it takes time for them to register with Google's algorithm.
13. Does Search Engine Genie’s experimental finding work?
Our experiment has been tested across sites on various topics, and we have since expanded the research to many more sites we handle. For the new sites we handle, this way of escaping the sandbox has worked well.
You can only see this once it happens for your own site, so if you have a new site, test it for yourself and post the results in the comments on this post.
14. My site has never been in the Sandbox. Why not?
There are a couple of reasons for that:
1. Your site may be an older site. Sites from before May 2004 did not have this problem; in fact, at that time we were able to rank a site with any sort of backlinks within a month. That is no longer the case.
2. You may not have bothered with SEO-style backlinks.
3. You may have registered an expired domain which was ranking well and had trusted links before it came to you. Google's expired domain penalty doesn't work properly for some sites. One of our clients' domains expired three months ago; we tried contacting the client numerous times before the expiration, but he was very ill and in hospital for 6 months and we could not reach him. We tried backordering but were stuck in a queue. The domain is a popular one that was listed in a PageRank 7 category of dmoz. It entered the pending-deletion period, finally expired, and the person who had first preordered it got the domain; it is theautomover.com. It is a shame that the expired domain penalty doesn't work, because that site is ranking well now even though the content has completely changed and it belongs to a new owner. We appeal to Google to monitor these expired domains more actively.
This article covers the latest changes in Google relating to the sandbox filter; in fact, some of the points discussed here come from the latest Google update, Jagger.
Rustybrick's future PageRank tool was an April Fools' joke – Barry agrees
Rustybrick (Barry) released a PageRank prediction tool last year. I have always stressed that PageRank cannot be predicted without all the linkage data Google has, but this tool fooled a lot of people: many thought the PageRank prediction tool could actually predict PageRank. I argued in the SEO-Guy forums and elsewhere that the tool is just for fun, but many believed it was telling the truth.
Anyone can easily see that this tool is simply guessing, and that its predictions are wildly inaccurate.
Finally, Barry has admitted that he built the tool as an April Fools' joke.
This is what he says on his blog:
“I believe I was one of the first, if not the first, to come up with the Google PageRank Prediction Tool. I launched that tool on April 1st, 2005 – yes it was April fools day. To appease the SEM community, I added a line about the tool should be used for “entertainment purposes only.” How do I come up with the future PR? I pull some historical data from different places, I won’t say exactly what they are, and I either increase the current PageRank value of a page and or decrease it by a percentage factor.
So is it accurate? No way! It was an April fools joke. Sometimes it is right, and often it is wrong. But I still get emails, at least once per week, asking me questions about the tool or ways to help increase people’s pagerank.
There are other tools that look at your PageRank at all the Google datacenters. They are not really future pagerank tools, they check your real time pagerank at these datacenters. If a “Google Dance” or PageRank update is taking place, it will show the current pagerank at that datacenter.”
http://www.seroundtable.com/archives/003021.html
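Just for fun, here is what a formula like the one Barry describes might look like; this is purely a guess at the joke, with made-up numbers.

```python
# Purely a guess at the kind of "prediction" Barry describes above:
# nudge the current toolbar PageRank up or down by a percentage factor.
# The adjustment range below is made up.
import random

def predict_future_pagerank(current_pr: int) -> int:
    factor = random.uniform(-0.15, 0.25)          # made-up percentage swing
    predicted = current_pr * (1 + factor)
    return max(0, min(10, int(predicted)))        # round down, clamp to 0-10

print(predict_future_pagerank(5))
```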
So has he finally admitted he was completely wrong? Is he telling the truth? I feel not.
Check this thread where Barry participated:
http://www.v7n.com/forums/showthread.php?t=6206
Check the posting date: it says 25-03-2004, but the post on his blog says April 1st, 2005, which is about a year's difference. Even if he made a mistake and wrote 2005 instead of 2004, the thread clearly shows the tool was not released on April Fools' Day but before it.
Also see the posts where Barry defends his tool.
An extract of some of his posts:
rustybrick
03-25-2004, 04:36 PM
I made this nifty tool that checks your future PageRank. The recipe is a secret. Check out the new Google PageRank Prediction Tool (http://www.rustybrick.com/pagerank-prediction.php). Let me know if there are any bugs here. Thanks.
rustybrick
03-25-2004, 04:48 PM
hmm… for most its accurate. Well, your sites will probably have crazy PR values and go through the roof. You do a damn good job of getting links.;)but the value of your forum makes sense.
Replying to John's reply he says, "hmm… for most its accurate."
"For most it's accurate," huh? And he didn't say it was an April Fools' joke there. Why not?
Furthermore:
“rustybrick
03-25-2004, 05:54 PM
How does it determine what sort of increase to expect? Its like telling you the recipe to Coke Cola. ;)”
So it's a Coca-Cola secret?
rustybrick
03-25-2004, 06:00 PM
its a long detailed formula. I have never found an example of a 0% change. Thanks for pointing that out.
A long, detailed formula or an April Fools' joke?
03-25-2004, 07:46 PM
based on the feedback, we fixed some problems.
Fixing problems in an April Fools' tool?
More:
rustybrick
base, ur still 0 – that’s fine. Spear, i am rounding down, like google does. So its under a 5.49. Returning whole numbers. but its a good thing, u increased a bit behind the scenes.
rustybrick
some people don’t, i know you do – but i got people cursing me out because its not giving them to the decimal point.
LOL, some newbies are so naive, cursing an April Fools' tool for not showing decimal points.
Read that thread for more fun.
But in the same thread, experts like Bob Wakfer write:
“I don’t know why this thread is still alive and why anybody is wasting time on it. The tool is a joke. There is no way it is or can be anywhere close. Google couldn’t forecast your PR in a months time and there is no way this tool can. It a sham and a delusion. You are all either wasting your time, or deluding yourself, or both. There are lots of good tools out there that can help you. This is not one of them.”
Well said, Bob.
SEO Genie.
A good article to read on trust rank
We just posted information on trust rank. There is an interesting article which talks about combating web spam with trust:
http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=2004-17&format=pdf&compression=&name=2004-17.pdf
The above paper provides great insight into the new trust rank.
What is trust rank?
Wikipedia has a brief but well-explained article on trust rank, where they try to explain to users what trust rank is. TrustRank is a way of identifying high-quality sites starting from a small seed set of sites; the links going out from those seed sites play an important role in identifying other quality sites.
Here is what Wikipedia says about trust rank:
“TrustRank is a new technique proposed by researchers from Stanford University and Yahoo to semi-automatically separate reputable, good pages from spam.
Many Web spam pages are only created with the intention of misleading search engines. These pages, chiefly created for commercial reasons, use various techniques to achieve higher-than-deserved rankings on the search engines’ result pages. While human experts can easily identify spam, it is too expensive to manually evaluate a large number of pages. Therefore, Google first selects a small set of seed pages to be evaluated by an expert. Once the reputable seed pages are manually identified, Google uses the link structure of the web to discover other pages that are likely to be relevant and good. Google claims that they can now effectively filter out spam from a significant fraction of the web, based on a good seed set of fewer than 200 sites.”
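To make the idea concrete, here is a minimal, illustrative sketch of the TrustRank idea: trust is injected at a small hand-reviewed seed set and propagated along outgoing links, much like a personalized PageRank. The graph, seed set and parameters below are made up for the example and are not taken from the paper or from Google.

```python
# Minimal illustrative sketch of the TrustRank idea: propagate trust from a
# small hand-reviewed seed set along outgoing links (a biased PageRank).
# The graph, seed set and parameters are made up.
def trustrank(graph, seeds, damping=0.85, iterations=30):
    """graph: {page: [pages it links to]}, seeds: set of trusted pages."""
    pages = list(graph)
    # Trust is injected only at the trusted seed pages.
    base = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(base)
    for _ in range(iterations):
        new = {p: (1 - damping) * base[p] for p in pages}
        for p, outlinks in graph.items():
            if not outlinks:
                continue
            share = damping * trust[p] / len(outlinks)
            for q in outlinks:
                if q in new:
                    new[q] += share
        trust = new
    return trust

web = {
    "seed.gov": ["goodsite.com", "news.org"],
    "news.org": ["goodsite.com"],
    "goodsite.com": ["news.org"],
    "spamsite.biz": ["spamsite.biz"],   # no trusted page links here
}
print(trustrank(web, seeds={"seed.gov"}))
```

In this toy run the pages reachable from the trusted seed accumulate trust, while the isolated spam site gets essentially none, which is the behaviour the quoted description attributes to the technique.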
MSN offers instructions for site owners on getting their sites indexed and ranked by MSN
MSN is the third-largest search engine on the web and has a strong base of regular visitors, so it is important to get ranked there. MSN provides quality guidelines for site owners, and the guidelines give an insight into what might work for the MSN search engine.
Content guidelines for your website from MSN:
“The best way to attract people to your site, and keep them coming back, is to design your pages with valuable content that your target audience is interested in.
In the visible page text, include words users might choose as search query terms to find the information on your site.
Limit all pages to a reasonable size. We recommend one topic per page. An HTML page with no pictures should be under 150 KB.
Make sure that each page is accessible by at least one static text link.
Keep the text that you want indexed outside of images. For example, if you want your company name or address to be indexed, make sure it is displayed on your page outside of a company logo.
Add a site map. This enables MSNBot to find all of your pages easily. Links embedded in menus, list boxes, and similar elements are not accessible to web crawlers unless they appear in your site map."
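As a rough illustration of the site-map advice above (this is not an MSN tool), here is a small Python sketch that writes a plain HTML site map page of static text links; the page names and URLs are placeholders.

```python
# Rough illustration of the "add a site map" advice: generate a plain HTML
# page of static text links so crawlers like MSNBot can reach every page.
# The URLs, titles and output file name are placeholders.
from html import escape

pages = {
    "/index.html": "Home",
    "/products.html": "Products",
    "/articles/seo-basics.html": "SEO basics",
    "/contact.html": "Contact us",
}

def build_sitemap(pages: dict) -> str:
    items = "\n".join(
        f'  <li><a href="{escape(url)}">{escape(title)}</a></li>'
        for url, title in pages.items()
    )
    return (
        "<html><head><title>Site map</title></head><body>\n"
        "<h1>Site map</h1>\n<ul>\n" + items + "\n</ul>\n</body></html>\n"
    )

with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write(build_sitemap(pages))
```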
Google toolbar PageRank dies – the PageRank display in the Google toolbar stops working
We speculated a long time ago about why Google's toolbar PageRank was not updated frequently (http://www.searchenginegenie.com/seo-blog/2004/09/toolbar-pagerank-is-dead-googles-page.html). Now it seems Google may abandon toolbar PageRank altogether very soon.
For the past 2 days Google's toolbar PageRank has been greyed out. A grey PageRank display usually means the toolbar cannot fetch PageRank for that particular site from Google's servers.
This is the case around the world. So is this the death of toolbar PageRank? Let's wait a few more days to know for sure.
We at Search Engine Genie don't mind if Google abandons toolbar PageRank. We don't measure a site's quality by its PageRank, and we don't evaluate links based on PageRank either.
SEO Blog Team