"Its syntax is fairly verbose and partially redundant. This can hurt human readability and application efficiency, and yields higher storage costs. It can also make XML difficult to apply in cases where bandwidth is limited..."
Not especially encouraging for a 10 MB subdomain site. Still, Google Sitemaps (Beta) seems to have its heart in the right place, offering an opportunity to straighten out the mess googlebot seems to get itself into.
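To make the verbosity complaint concrete, here is a minimal sketch of one entry in the standardized sitemaps.org format (the Beta-era Google namespace differed slightly); the URL and values are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> block per page; <loc> is required, the rest are optional -->
      <url>
        <loc>http://subdomain.example.com/some/deep/dir/page.html</loc>
        <lastmod>2005-07-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.5</priority>
      </url>
    </urlset>

Several wrapper tags per URL adds up fast; across thousands of pages, that overhead is a fair share of the "higher storage costs" the quote warns about.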
After giving Sitemaps a thorough trial, I've decided that the program's main advantage is the additional statistics it makes available. Unfortunately, those statistics are not available for subdomains, so what little advantage they might have conferred wasn't worth the effort and drive space they consumed.
It seems that googlebot is perfectly capable of traversing a site on its own. The apparent problem stems from Google's secret ranking algorithm, which deems many pages unworthy of listing, especially pages in deeply nested directories. So much for the supposed advantages of using keywords as directory names! Foo.