The official Google blog has announced that Google Maps, their great product, is now available for the Safari and Opera browsers. When it first launched, it supported only Internet Explorer and Firefox.
From the Google blog:
The Maps team was excited to see all the ideas and feedback that hit blogs around the Web after Google Maps launched a couple weeks ago, and we've been listening: Maps now supports Safari and Opera. Keep the feedback coming...
Log file analysis is very important to the success of a website. Log file information provides a baseline of statistics that indicates use levels and supports comparisons among parts of a site or over time. Such analysis also provides some technical information regarding server load, unusual activity, or unsuccessful requests, and can assist in marketing, site development, and management activities. Server log file analysis is also an important part of search engine optimization: it helps determine which keywords convert best and which keywords are used most often.
What's in a Log File
Every communication between a client's browser and a Web server results in an entry in the server's log recording the transaction. In general, a log file entry contains:
- the address of the computer requesting the file
- the date and time of the request
- the URL of the file requested
- the protocol used for the request
- the size of the file requested
- the referring URL
- the browser and operating system used by the requesting computer
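To make these fields concrete, here is what a single entry might look like in the widely used Apache combined log format (the address, URLs, and browser string below are made up for illustration):

203.0.113.5 - - [18/Feb/2005:10:34:12 -0500] "GET /products/index.html HTTP/1.1" 200 5120 "http://www.google.com/search?q=ice+cream+recipes" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0"

Reading left to right: the requesting address, two unused identity fields, the date and time, the request line (URL and protocol), the HTTP status code, the size served in bytes, the referring URL, and the browser and operating system. Notice that the referring URL carries the visitor's search query (q=ice+cream+recipes); this is how log analysis recovers the keywords mentioned above.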
What Can You Learn From a Log File?
Data available from a log file can be compiled and combined in various ways, providing statistics or listings such as:
- number of requests made ("hits")
- total files and kilobytes successfully served
- number of requests by type of file, such as HTML page views
- number of requests for specific files or directories
- distinct IP addresses served and the number of requests each made
- number of requests by domain suffix (derived from IP addresses)
- number of requests by HTTP status code (successful, failed, redirected, informational)
- totals and averages by specific time periods (hours, days, weeks, months, years)
- URLs from which users came to the site
- browsers and versions making the requests

How good are log files?
We should be able to differentiate between hits, page views, and unique visitors; not all hits and page views are visitors. For example, one visitor loading a single page that contains ten images generates eleven hits but only one page view. There are good free tools for separating the data that way, such as Funnel Web Analyser, AWStats, and Webalizer. For more information on tracking and logging, visit the Tracking and Logging forum at http://www.webmasterworld.com/forum39/
JavaScript is a complex language that needs careful coding to be handled well by browsers. Search engine crawlers are not sophisticated browsers; they lack the JavaScript capabilities of Internet Explorer or Firefox.
So it is best to reduce the use of JavaScript on pages intended to be search engine friendly. Large blocks of JavaScript also cover a lot of the page, and search engines find it difficult to index content buried below the script. The best option is to move the JavaScript used on your site into an external .js file; that way the size of the page is reduced. Search engines like Google have been known to crawl only the first 101 KB of a page's HTML, so it is important that the upper part of the page is well accessible to crawlers. It is also best to avoid JavaScript dropdown menus, rollovers, popups, and the like, since search engines find such things difficult to crawl. If you have a JavaScript menu, the best fix is to add plain text links to your inner pages, as the sketch below shows, so search engines won't struggle to find your precious inner pages.
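As a minimal sketch (the file name menu.js and the link URLs here are hypothetical), moving a large inline script into an external file looks like this:

<!-- Before: hundreds of lines of menu code sit inline, pushing content down -->
<script type="text/javascript">
  /* ...large dropdown menu and rollover code... */
</script>

<!-- After: the same code referenced from an external .js file -->
<script type="text/javascript" src="menu.js"></script>

<!-- Plain text links duplicating the menu, so crawlers can reach inner pages -->
<a href="/services.html">Services</a>
<a href="/about-us.html">About Us</a>
<a href="/contact.html">Contact</a>

The HTML stays small, the real content sits higher in the source, and a crawler that ignores JavaScript still has ordinary links to follow.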
Googlebot, Google's crawler, has recently been reported to crawl some JavaScript. That is a great improvement, but it is still in its early stages, so it is better to avoid JavaScript menus and large inline scripts. Use external .js files to reduce the size of the page. SEO Blog Team.
When I was browsing through webmasterworld.com, the ultimate forum, I found a thread whose starter was worried that his competitor was using hidden text and the site had not been penalized.
Read more in this thread: http://www.webmasterworld.com/forum30/28211.htm. Read the answers of great experts like Brett Tabke, Ciml, and others; they see this from newbies all the time, and they take great pains to correct these newbie, so-called SEO experts. One thing people should know is that search engines are not run for SEOs, webmasters, or site owners; they are run solely for the end users who perform searches. Webmasters are just another group of Google users. Google doesn't have to worry about how happy it makes webmasters; it just has to worry about its search users. If something hurts their search users, they will definitely take action on it. SEOs and webmasters needn't whine all the time that a competitor's site has not been banned by Google or any other search engine for spam (spam in the SEO's mind, that is). According to the experts, the search engine spam report is, for these so-called search engine optimization experts, simply a waste of time.
A great expert gave a neat expansion of useless SEOs' definition of SPAM: Sites Positioned Above Mine. Don't whine if your site doesn't rank; keep working on content and backlinks, and stop worrying about what others use to rank their sites. Another expert put it bluntly: "If you think sites using hidden text or hidden links rank high and you can't outrank them, it simply means you SUCK at search engine optimization."
The Yahoo image search index has increased to over 1.5 billion images. Initially Yahoo had 1 billion images; then Google updated its index to about 1,187,630,000 images; now Yahoo's index has grown past 1.5 billion.
There is a new way to detect the rel=nofollow attribute using the Firefox browser. Detecting this attribute helps catch cheating webmasters who use it in their directories and link exchanges. rel=nofollow links can be highlighted in Firefox by adding a simple rule to the userContent.css file, which controls how web pages appear in the Firefox browser.
We have to find the path of the userContent.css file and add a simple rule so that rel=nofollow links appear in a different colour.

For Windows XP / Windows 2000 users:
C:\Documents and Settings\[User Name]\Application Data\Mozilla\Firefox\Profiles\hmgpxvac.default\chrome
(hmgpxvac can be anything on your system)

For PCs using Windows 98 / ME:
C:\WINDOWS\Application Data\Mozilla\Firefox\Profiles\hmgpxvac.default\chrome

Open up your new userContent.css file and add the following rule:
a[rel~="nofollow"] {
  border: thin dashed firebrick !important;
  background-color: rgb(255, 200, 200) !important;
}

Save the file; with this rule in place, Firefox will automatically highlight nofollow links in red. You can highlight links in Internet Explorer too; refer to https://www.searchenginegenie.com/seo-blog/ie-detection-nofollow.txt

SEO Blog Team
Hotbot.com, a search engine used by a very low number of users, was powered by Inktomi search, now Yahoo search. Until recently I used to check HotBot for our clients, since HotBot results used to show the dates on which the Yahoo Slurp crawler visited our sites and our clients' sites. Then, when I performed a search, the results were suddenly completely different.
I compared the results to Google's, and they were similar. It seems the HotBot search engine has stopped using Yahoo search results and is now using Google search results.
Not a big, significant change, but worth knowing.
Google has introduced a new feature, the movie: search. It helps users find movie-related information; for example, we can find reviews and other movie details using this search.
The movie: search also helps you find movie names you happen to have forgotten. Read what Google says about the movie: operator:

Just in time for the Oscars, we've created a new "movie:" operator that enables you to find movie-related information faster and more easily, whether you're looking for titles or actors, director or genre, famous lines or obscure plot details. Can't remember the name of that film where Tom Hanks made friends with a volleyball? Search for [movie: Tom Hanks talking to a volleyball] and Google will tell you: it was Cast Away.

Google is reaching great heights, and this new feature will be absolutely helpful.
Automated submissions are not good. Today's search engines are sophisticated crawler-based engines, and they find sites through the links pointing to them; the better and more plentiful the links, the better search engines will crawl the site.
Automated submission rarely hurts your site; search engines understand that even your competitor could submit your URLs repeatedly. So automated submission rarely hurts, but we should avoid it anyway. For self-satisfaction we can do one manual submission and then leave it to the search engines to find the site, crawl it, index it, and rank it.
Keep building links to your site and all the search engine crawlers will definitely find it. Avoid automated search engine submissions. SEO Blog Team.
Sub-domains or sub-folders for search engines? This has been an important question on forums. In our view, sub-folders are better for building a big quality site, because search engines usually treat sub-domains as different sites. Subdomains and subfolders are treated by completely different standards.
A subdomain acts as a different site. Places where subdomains are useful:

1. Suppose you want to start a site completely unrelated to your existing site, but you don't want to buy new hosting or use another domain; subdomains serve that purpose very well. If your site is real-estate.yourbrand.com, you can start a completely different site at cosmetics.yourbrand.com, where the real estate site talks about real estate and the cosmetics site talks about cosmetic items.

2. If you are a free hosting provider, it is costly to give away domain hosting, so lots of free hosting companies give subdomain hosting instead. Sites like Freeservers, Netfirms, and Tripod give free subdomains, and since subdomains are treated as completely different sites, people can use them freely without any problems.

3. If your organization is an affiliate, subsidiary, or chapter of a national or international organization, you might ask the headquarters if you can get a subdomain within the organization's domain; if you can, this will save you the InterNIC registration fee (subdomains don't cost anything to register) as well as give your organization an address that indicates its affiliation.
4. One more thing to note: if you use cookies on your site, they only work within one domain. For security and privacy purposes, browsers will not transmit cookies to any site in a different domain from the one that set the cookie. Cookies can be shared across subdomains and hosts within a domain, but not from one domain to another.

5. Subdomains are appropriate especially if you have a domain name with a high level of brand recognition; sites like Google, Excite, and Yahoo use subdomains to show diversity among their sites.

Subfolders, in contrast, are always considered part of a site. For example, google.com has a research subfolder and a services subfolder, and the two cover completely different topics: the research subfolder covers all the research done by Google and its research materials, while the services subfolder covers all the services provided by google.com. Though they deal with completely different areas, they stay on the same site, and this kind of division helps a site establish itself as an authority. For better search engine rankings and for becoming an authority site, we recommend using subfolders rather than subdomains. A major disadvantage of subdomains is that they are subject to a lot of spamming: spammers create hundreds of subdomains for lots of keywords just to manipulate search engine results, and because of this, search engines frown slightly on these types of sites. So go for subfolders if you want to run a quality site. SEO Blog Team.
Recently, lots of big directories started to disappear from Google's results, and many have speculated about the reasons. Some people blame Google tightening up the duplicate content penalty, which makes directories disappear since they have lots of duplicate pages.
Some speculate that Google is removing directories manually from its results, since empty directories don't add any value for its search users. It is a fact that empty or very thinly populated directories don't provide value to visitors. There are some ultimate resource directories, like dmoz.org and the Yahoo directory, which provide great value in helping users find good-quality, relevant sites. But new, empty directories keep emerging whose sole purpose is to earn money: they start some junk directory, take $40+ for a listing, and the people who list their sites there get nothing in return beyond a simple link. There are directories out there whose whole advertising tactic revolves around Google PageRank: they get high PageRank by buying links from industry-established sites and indirectly sell that PageRank to the people who list their sites. Do those directories provide value to users? No. Most of the categories in those junk directories are empty or carry only one or two listings.
So we at Search Engine Genie would like to warn everyone who submits their site to directories on the strength of their PageRank alone. Better to check the number of pages the directory has indexed in Google (for example, with a site: search), check whether the category you are submitting to has other good-quality sites listed, check whether your category page has Google PageRank, and check whether the category page your site will feature on is crawled by top search engines like Google, Yahoo, and MSN. Be warned that there are people who run directories with the sole purpose of getting rich quickly. We don't mention their names here, since that wouldn't add any value to our awareness campaign; we just want to warn people before they submit their sites to junk directories. SEO Blog Team.
Allan Hoffman reports on how addicted he is to the Google search engine. He describes himself as a major Google addict who performs hundreds of searches in an active business day.
Yahoo seems to be his next priority; he likes Yahoo's integration of email notification while doing a search. Read more on what he has to say about other search engines too.
Search engines have difficulty indexing framed sites, and we SEO people recommend not building them. Frames can help with navigation on large sites, but they are not search engine friendly. A frame can load a third-party site just as easily as any page inside your own site, and this is one reason search engines stay away from indexing content inside a frame.
That content may not necessarily come from your site; it can come from anywhere. Suppose your site is built with frames and you don't want to rebuild it: the best way to get some content crawled is to add it inside a noframes tag. This tag helps search engines identify the frames and index the content placed inside it, as the sketch below shows.
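As a minimal sketch (the page names here are hypothetical), a framed page with crawlable noframes content might look like this:

<html>
<head><title>Ice Cream Shop</title></head>
<frameset cols="20%,80%">
  <frame src="menu.html" name="menu">
  <frame src="main.html" name="main">
  <noframes>
    <body>
      <h1>Ice Cream Shop</h1>
      <p>We sell handmade ice cream. Browse our <a href="menu.html">menu</a>
      or visit our <a href="main.html">main page</a>.</p>
    </body>
  </noframes>
</frameset>
</html>

A crawler that cannot process the frameset reads the text and links inside the noframes tag instead.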
If you want the best search engine rankings, avoid frames entirely and design sites that render proper HTML to the browser. Whether it is a dynamic site or a static site, proper rendering of HTML matters; search engine crawlers are only simple browsers, far less capable than Netscape or Firefox. If you have a framed site, we at Search Engine Genie will help remove the frames and optimize the site for better search engine rankings. Contact us if you want to get your site out of frames. SEO Blog Team.
Google Zeitgeist has listed the top searches for certain countries, including the top searches in India. Top 10 searches in India:
Popular Queries, January 2005
1. tsunami
2. indian railways
3. sania mirza
4. trisha
5. aishwarya rai
6. anara gupta
7. ignou
8. bollywood
9. ndtv
10. australian open

Searches for information on the tsunami topped the table; the tsunami, of course, caused massive disaster across South Asian countries. A surprising entry is Sania Mirza, a rising tennis star in India.
Many of us worry about off-topic inbound links. Some people think off-topic links are bad for a site; in fact, that is not the case. The Internet is a huge network of websites, and any site can link to any site: that is a simple rule of thumb.
For example, if I run an ice cream site and my friend runs a real estate site, is it wrong to link to my friend's site to recommend his services to my visitors? Nope. In fact, SEOs, or search engine optimizers, are the ones most worried about off-topic inbound links, along with some misguided webmasters and site owners. From the Search Engine Genie team's point of view, there is nothing wrong with getting off-topic links for a site; just get them naturally, not artificially through link buying. Our site gets a lot of natural backlinks from various sites simply because of its quality content. Do that for your site too: attract people to link to your site.
Search engines today are not sophisticated enough to judge the quality or relatedness of inbound links, and if they ever tried, it would cost them a lot and use up a lot of their resources. The best rule: don't get too many sitewide off-topic links, and don't go for too many unrelated links. SEO Blog team.