What is duplicate content for search engines? - search engine optimization tip, March 8th, 2005
Different search engines have different thresholds for duplicate content. Some, such as Yahoo and Exalead, appear unable to detect duplicate content across sites; they detect duplicates within a site but not between sites. As a rule of thumb, make each page at least 5 to 7% different from the other pages on your site.
Google is the best search engine at detecting duplicate content. It strips away the site's main template and feeds the remaining content into its algorithm. We recommend making each page at least 8 to 15% different from the other pages to avoid a duplicate-content penalty on that page. If you cannot make the pages very unique, at least give them descriptive file names: file names are indexed by search engines, and a good 5- or 6-word file name adds to a page's uniqueness.
Overall, 10% is the best bet for making pages different.
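As a rough illustration of the idea (not Google's actual algorithm), here is a minimal Python sketch that strips HTML markup from two pages, breaks the remaining text into word shingles, and compares the shingle sets with Jaccard similarity. The helper names, the 4-word shingle size, and the 0.90 threshold (i.e., pages less than about 10% different) are our own assumptions for illustration.

# Minimal sketch: estimate how "different" two pages are by stripping
# markup, shingling the words, and comparing shingle sets with Jaccard
# similarity. The 0.90 threshold mirrors the ~10% rule of thumb above.
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, crudely approximating 'template stripping'."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def page_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

def shingles(text, size=4):
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {tuple(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def similarity(html_a, html_b):
    a, b = shingles(page_text(html_a)), shingles(page_text(html_b))
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

if __name__ == "__main__":
    page1 = "<html><body><p>Blue widgets ship free to all US states.</p></body></html>"
    page2 = "<html><body><p>Blue widgets ship free to all Canadian provinces.</p></body></html>"
    sim = similarity(page1, page2)
    print(f"Similarity: {sim:.2f}")
    if sim > 0.90:  # less than ~10% different -> likely duplicate content
        print("Pages look like duplicates; differentiate them further.")

Running a check like this against the other pages on your site is a quick way to see whether a page clears the rough 10% difference target before you publish it.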
SEO Blog Team
1 Comment:
I think that anyone who is thought to be an expert in SEO/SEM should go take the current hot SEO quiz; it really is setting apart the novices from the experts. Then we could qualify the experts and weight their comments based on their knowledge.
Just my 2 cents.
Brian James
http://www.nodents.com/welcome.html
http://www.nodents.com/train.html
http://www.nodents.com/register.html
http://www.nodents.com/pdr.html
http://www.nodents.com/wheelrepair.html