Difference between a possible link farm and a directory - search engine optimization tip, March 22nd 2005
Link farms are links pages built solely for the purpose of building links for search engines. Usually a link farm will link out to lots of unrelated pages from a single links page.
Directories, by contrast, are organized into relevant categories and link out to sites only from those categories. Directories provide value to visitors as well as to search engines, while link farms exist solely for search engine purposes; if search engines didn't exist, link farms wouldn't exist.
There are also sites that do excessive cross-linking; these sites are considered link farms too, since they cross-link to cheat the search engines for link popularity.
Title tags are a very important part of search engine optimization. A good title tag gives strong relevancy to its page. A title tag should be relevant and effective, and it should contain the important phrases for the page being optimized.
So can keywords be repeated? Yes, keywords or keyword phrases can be repeated in the title. But the title is an important factor in search engine algorithms, and the first 6 words carry the most weight, so it is not good to spend any of those first 6 words on straight repetition. Words can be repeated after the first 6, but the first six words have to be utilized to the fullest. One approach is to use the singular and plural versions of a keyword within the first 6 to 8 words; that way we target two versions of the same keyword, as in the sketch below.
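For illustration only (the phrase and site name here are made up), a title tag that fits both the singular and plural of a target phrase into the first few words might look like this:

<title>Wedding Favor and Wedding Favors Ideas from Example Gifts</title>

Here both "Wedding Favor" and "Wedding Favors" fall inside the first six words, so neither version wastes the high-weight positions on pure repetition.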
In conclusion, don't waste precious title tags. Make sure your title tags are optimized as well as possible and contain all of the page's important keywords.
SEO Blog Team
Misspelling SEO has been around for a while. Many people have used this tactic to rank their sites for misspellings of their main commercial phrases. Search engine crawlers usually have a spell-check feature integrated into their algorithms, and a site without any spelling mistakes is always good, but that is not a strict rule: numerous sites rank for misspellings as well as for correct spellings. Misspelling optimization is not always wrong; for some areas it is essential.
For example, with keywords like microdermabrasion or mesothelioma, many people are bound to make mistakes. They cannot spell the word correctly, so they sometimes misspell it when they search. These searchers are still part of the target audience; they just don't know the correct spelling. So optimizers need to target this crowd too, which is why misspelling optimization is used.
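As a sketch of how this is typically done (the misspelled variants below are hypothetical examples, not researched misspellings), the variant spellings are simply worked into the page's title alongside the correct one:

<title>Mesothelioma Information - also searched as Mesotheloma or Mesothilioma</title>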
But abusing misspellings is not good; misspelled words should be used only when it is 100% essential, as in the examples above. Google, Yahoo and MSN all have a spelling-suggestion feature that shows the correct spelling when a person searches for the wrong one, but this feature appears in just one line and most people don't notice it. So we recommend doing misspelling optimization only when the keywords are complex and difficult to spell.
SEO Blog Team
Title attributes are used with the href (anchor) tag; they display a small tooltip of text when the mouse is hovered over the link. The syntax looks like this inside the anchor tag (the href and link text here are placeholders):

<a href="index.html" title="search engine optimization seo company search engine genie">SEO Company</a>

So do search engines use this attribute in their ranking algorithms? From Search Engine Genie's point of view we would say no; at least the top search engines like Google, Yahoo and MSN don't use it. We have done extensive tests with this attribute: we added a long junk phrase that appears nowhere else as a title attribute, and no search engine returned the page when we searched for that phrase.
So we can conclude that search engines don't use this attribute in their ranking algorithms. It is important to note, though, that the attribute is very useful for usability, so don't hesitate to use it; just make sure you use it only where it is really needed.
SEO Blog Team
All search engines have a cache limit for the pages they crawl; the HTML of a page is what counts as the page size to the search engine crawler.
For Yahoo's robot (Yahoo Slurp) the cache limit is 500 kilobytes: the maximum size of a page the Yahoo robot will crawl is 500 KB.
For Google's robot (Googlebot) the cache limit is unknown. Until a couple of months ago the limit was only 101 KB; pages above that size were only partially indexed, and anything beyond the 101 KB mark was ignored. That has now changed, and Googlebot is known to index files larger than 400 KB, so exactly what the new cache limit for Googlebot is remains unknown.
Some website owners are worried about using robots exclusion on their sites because they think blocking a crawler from certain pages or sections is bad. From Search Engine Genie's point of view, we recommend using robots exclusion freely. Just make sure you use proper syntax; wrong syntax might confuse the search engine crawler and could get the site dropped from the index.
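As a minimal sketch of correct syntax (the paths here are examples only), a robots.txt file that blocks all crawlers from one directory and blocks only Googlebot from another looks like this:

User-agent: *
Disallow: /cgi-bin/

User-agent: Googlebot
Disallow: /print/

Each record is a User-agent line followed by one or more Disallow lines, with records separated by a blank line.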
Excluding robots from certain sections of a site is good in various ways:
1. You can prevent search engine crawlers from eating up unnecessary bandwidth on your site. With some hosts bandwidth is costly, so it is important to save it by excluding crawlers from certain sections.
2. Excluding robots from the comments section, if you run a blog, is very good: it prevents robots from seeing, or giving credit to, unwanted links. Spammers may leave bad links in comments, and excluding those pages keeps crawlers away from them; linking to a bad neighbourhood is not good.
3. Excluding robots also helps prevent sensitive information on your site from being exposed in search results. Some sites don't block the crawlers properly, and pages containing sensitive data like credit card details or important databases end up in search results. As a precaution it is best to exclude robots from such areas.
4. Excluding robots also helps prevent duplicate content on your site from being crawled. If you have a dynamic site with two versions of its pages, one search engine friendly and one dynamic with a query string, it is better to block one version and keep the other; that way search engines don't have to worry about duplicate content on your site.
There are still more benefits to excluding pages and sections from crawlers, so do it, safely and with proper syntax. Use it well if you have a big dynamic ecommerce site; dynamic sites are known to serve lots of pages to crawlers by error. Hope this tip helps.
SEO Blog Team
Meta description tags play a very small role in search engine ranking these days. It doesn't matter much whether the meta description tags are duplicated across your pages; just make sure your important keywords and phrases are present in them.
It is common for sites to have similar meta description tags; in ecommerce sites especially, duplicated meta tags are common. There is nothing wrong with that: a duplicated meta description doesn't hurt, but remember it doesn't help much either. So it is best to simply optimize them and leave them as they are; a sample is sketched below.
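For illustration (the wording here is made up), an optimized meta description just works the page's important phrases into a short, readable sentence:

<meta name="description" content="Search engine optimization tips, tools and services for ecommerce sites.">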
SEO Blog Team
Many of us have noticed that sometimes search engines keep visiting a site regularly but don't index it or show it in the site: command.
There are various reasons for this to happen:
1. The domain is an expired domain. If a domain expires and is not re-registered for a certain period of time, Google imposes an expired-domain penalty on it. The domain is left to suffer for a certain number of months; during that period Googlebot keeps visiting the site, but Google doesn't index it and doesn't show it in the site: command either.
2. Another reason is that the domain is new. With a new domain, the crawler sometimes visits the site regularly yet the site doesn't show up in the index for a long time. There is nothing wrong with this; the Google index is probably just taking a long time to expand, and you simply have to wait until Google updates its index.
3. Another possible reason is that the site is banned from the search engines for some on-page factor. In that case the search engines periodically check whether the on-page spam tactics have been removed, and as soon as they see the spam is gone they may re-include the site in the index. So if your site was previously indexed and listed in Google but suddenly disappeared from the index while Googlebot keeps visiting, it is worth reviewing your on-page work for spam tactics like hidden text, cloaking, keyword stuffing and so on.
4. Another important reason could be that the site was permanently banned from the search engines. Even then, search engine crawlers visit the site by following existing links to it, but they don't index it because it is banned. This is common with Yahoo Slurp, Yahoo's robot, which is known to keep visiting a banned site without indexing it.
List of top search engine user-agents
We get a lot of mail from people who want to know the names of the leading user-agents, and we are pleased to give the information. Identifying the user-agents is an important part of search engine optimization; regular visits by search engine robots like Googlebot, Yahoo Slurp etc. are a good sign.
Here is the list of top search engine crawlers:
Google – Googlebot/2.* (googlebot@googlebot.com)
Yahoo – Yahoo Slurp
MSN – MSNbot
Lycos – Lycos_Spider_(T-Rex)/3.
Teoma – Mozilla/2.0 (compatible; Ask Jeeves/Teoma)
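One practical use of these names (a sketch; the disallowed path is just an example) is addressing individual crawlers in robots.txt. Note that robots.txt matches on the short robot name, such as Slurp or msnbot, not on the full user-agent string:

User-agent: Googlebot
Disallow: /testing/

User-agent: Slurp
Disallow: /testing/

User-agent: msnbot
Disallow: /testing/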
For a more detailed list of search engine crawlers, contact us and we will send it to you for free.