Monday, March 27, 2006

Squidoo

As I was poking around in the realm of social bookmarks, I came upon an odd little site called Squidoo. It is basically a blog-space provider with an interesting business model.

Participants (Squidoo calls them "Lensmasters" and their modular meta-pages "lenses") post articles on any topic that interests them. The pages are monetized using Google AdSense ads, Amazon associate links, and a number of other revenue-producing programs. This money goes to Squidoo, who returns half of it to the lensmasters based on the performance of their links.

So why give half your commissions to Squidoo? After all, free blog-space isn't hard to come by. It's a fair question, and one I can't answer definitively.

Consider this before you reject the idea: you're not giving up anything but the time it takes to turn some of your favorite bookmarks into "lenses." If your page generates some revenue, fine. If not, you've lost nothing: there's no guarantee that people who bought something through Squidoo would otherwise have bought it from you. In fact, that's extremely unlikely.

On the other hand, Squidoo's search-engine clout, built-in community traffic, diversity of revenue streams, easy page creation, and so on might well make it twice as easy to make a sale, which would make the 50/50 split a fair trade.

The company is a start-up; it began recording sales for commission just today. As such, there are certain risks involved, but Squidoo is worth a look, if only for the education. Check out their home page, their FAQ, and their status blog. Then, if you want to give it a try, sign up (using this link could make you an extra $5.00).

Update: I'm getting a lot of traffic from my lens, My Lensmasters' Amazon "Top Ten" Lists. It's hard to tell whether it's Squidoo users or Amazon associates, but so far it's just *A LOT* of lookie-loo traffic.

Tuesday, March 21, 2006

Social Bookmarks

Are blogs replacing regular websites? Not really. There will always be a need for well-designed web sites that have a different organization and function than the typical (or extremely atypical) blog. However, it's pretty obvious that blogging is here to stay, and webmasters will either adapt to this trend or lose market share.

This revelation came to me when I stumbled upon a site that helps bloggers add a little javascript snippet inviting visitors to social-bookmark their pages. Social bookmarking, it seems, is the generic term for services like Del.icio.us that let users save their bookmarks / favorites online, then build a human-rated, searchable database from the links.
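To give you an idea of what such a snippet does, here is a minimal sketch that writes a "bookmark this page" link into the page. The del.icio.us posting URL and its url/title parameters are my assumption from memory, so check the service's own instructions before copying this:

    <script type="text/javascript">
    // Build a link that sends the current page's address and title
    // to del.icio.us's posting form. (Endpoint assumed, not verified.)
    document.write(
      '<a href="http://del.icio.us/post'
      + '?url=' + encodeURIComponent(location.href)
      + '&title=' + encodeURIComponent(document.title)
      + '">Bookmark this page on del.icio.us</a>'
    );
    </script>

Drop something like that into a blog template and every post carries its own bookmark-me invitation.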

Human-rated databases aren't new (the Open Directory Project being the most obvious example), but they have always had limitations. They attempt to impose top-down organization on what is inherently a grassroots process. They are slow, arbitrary, and laborious, and they rely on volunteers who receive little or no reward for their labors.

By contrast, social bookmarking immediately rewards users (they can access their bookmarks from anywhere), is easy to use, and generally relies on market forces to validate links rather than arbitrary rules administered by an authoritarian editing process. I'm not one of those free-market ideologues who think government regulation is the root of all evil, but I happen to think the Internet is an excellent example of free-market thinking gone right.

Before the introduction of the World Wide Web, the Internet was a very limited and expensive conduit that let a few ARPA/DARPA and NSF-net computers swap e-mail and files. Opening the Internet to commercial content monetized the network infrastructure, and its growth has been vigorous ever since.

Ironically, the ODP and most other human-mediated directories prohibit "commercial content" such as personal websites with a few affiliate links to defray their direct expenses. Yet they are glad to list university, government, and corporate sites that are hardly "non-commercial": sites that promote, advertise, dispense, or otherwise facilitate extremely lucrative activities, provided there is no direct link between the website and the flow of cash from its users.

Given that the pool of potential editors for these directories consists primarily of small-time webmasters who want a little recognition for their sites, and that those are exactly the applicants who get rejected out of hand, it's small wonder these projects are as pathetically limited as they are. Does anybody actually use them for web searches?

My impression is that people put up commercial-content-free sites, get them listed, and then monetize them, secure in the knowledge that no one at the ODP will ever check again.

Monday, March 13, 2006

Podcasts

Google popped up with a link to some high-bandwidth Mars photos today, ensuring a sluggish day for internet users everywhere. Of course, this is a mere drop in the bandwidth bucket compared to NASA's self-aggrandizing NASAcast. Those arrogant [expletives]!

For a seemingly positive viewpoint on this latest fiscal atrocity, see Mars Orbiter Makes Port, an article about podcasting.

Basically, podcasts are audio/video feeds built on essentially the same technology as regular feeds, but with huge files. Thus illiterates can hog bandwidth, too. I don't have time for this now, but a cursory examination reveals a whole community of DJ wannabes streaming crap nobody listens to out to gee-whiz technology buffs, who then delete it from their iPods. Just for good measure, we learned that MickeySoft has their own non-standard variant of the format.
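The "same technology" crack is literal, by the way: a podcast is just an RSS feed whose items carry an enclosure element pointing at the media file. A minimal sketch (the URLs, file size, and titles here are invented for illustration):

    <item>
      <title>Episode 1: Mars Orbiter Makes Port</title>
      <link>http://example.com/episode1.html</link>
      <!-- The enclosure is what makes this a "podcast" item:
           url is the media file, length is its size in bytes,
           type is its MIME type. Clients download the file and
           sync it to the listener's player. -->
      <enclosure url="http://example.com/audio/episode1.mp3"
                 length="12345678"
                 type="audio/mpeg" />
    </item>

Everything else (titles, dates, descriptions) works exactly as in an ordinary feed; the client just knows to go fetch the enclosure.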

On a related note, RSS Specifications has some interesting thoughts on the RSS/atom dichotomy.

If you really must delve into this evil technology, Podcasting News seems like a good place to start. Juice seems to be the most popular podcast client, although more clients are available. And of course you can check the article that "launched" this tirade: NASA Podcast Help.

Podcasting isn't going away just because I'm convinced it's a stupid waste of bandwidth. It will probably thrive. Maybe there is a way to use it for good instead of evil. I need some coffee.

Saturday, March 11, 2006

RSS & atom

As I mentioned in a previous post, spiders (e.g. googlebot) crawl blogs differently than "regular" webpages. I didn't go into much detail, because I didn't actually know much about the process. Later I began fumbling around with Google Sitemaps, which resulted in this blog temporarily being blacklisted as a splog. (Incidentally, Blogspot corrected the problem promptly, much to my surprise.)

After using some online sitemap generators, which weren't entirely satisfactory, I stumbled onto Johannes Mueller's excellent GSiteCrawler. Whenever that program uploads a new sitemap, it pings Google Sitemaps, alerting them that new content is available. Aha! This two-way communication is the essence of web feed technology.
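The "ping" is nothing exotic, just an HTTP GET against Google's notification URL with your sitemap's address (URL-encoded) as a parameter. Something like the following, though the endpoint path is from memory, so verify it against Google's documentation before relying on it:

    http://www.google.com/webmasters/sitemaps/ping?sitemap=http%3A%2F%2Fexample.com%2Fsitemap.xml

GSiteCrawler fires that request automatically after each upload, which is exactly the kind of drudgery you want a tool to handle.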

Suppose you have your own spider, dutifully traversing the web day in and day out. Obviously, you could cut down on the bandwidth it would require if it didn't have to crawl every page, but just the pages which had changed. Googlebot is perfectly capable of traversing a site, but employs its own algorithm to decide whether or not to re-index a particular page, thus driving webmasters insane.

Sitemaps don't really change this, but properly used, they give googlebot a "heads up" as to what content is new. New content is preferred over old content by some secret amount, so frequently updating sitemaps is the key to their successful use. If you don't have enough room for sitemaps, similar benefits can be had simply by including (properly maintained) "date" metatags on each page.
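Concretely, the "heads up" lives in each URL entry's lastmod element. Here is a minimal sketch of one entry; I believe 0.84 is Google's current sitemap schema, and the URL and dates are invented for illustration:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
      <url>
        <loc>http://example.com/2006/03/some-post.html</loc>
        <!-- lastmod is the part that matters: keep it accurate
             so the spider knows this page is worth re-crawling. -->
        <lastmod>2006-03-11</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>

The metatag alternative is just <meta name="date" content="2006-03-11"> in each page's head, updated whenever the page actually changes.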

RSS and atom (remember them?) are two competing file specifications for this type of two-way communication, commonly referred to as web syndication. Presently the "standards" are under nearly constant revision, similar to the "browser wars" of a few years ago that made it practically impossible to use javascript. This site's web-feed is: http://wholeed.blogspot.com/atom.xml
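For the curious, here is a minimal sketch of what an Atom feed looks like on the wire, written against the Atom 1.0 syntax. The titles, author, dates, and entry URL are invented for illustration, and Blogger's own feeds may still be on an older revision of the spec:

    <?xml version="1.0" encoding="utf-8"?>
    <feed xmlns="http://www.w3.org/2005/Atom">
      <title>Whole Ed</title>
      <link href="http://wholeed.blogspot.com/" />
      <updated>2006-03-11T12:00:00Z</updated>
      <author><name>Ed</name></author>
      <id>http://wholeed.blogspot.com/</id>
      <entry>
        <title>RSS &amp; atom</title>
        <link href="http://wholeed.blogspot.com/2006/03/rss-atom.html" />
        <id>http://wholeed.blogspot.com/2006/03/rss-atom.html</id>
        <!-- updated is what tells subscribers (and spiders)
             that this entry has changed. -->
        <updated>2006-03-11T12:00:00Z</updated>
        <summary>Spiders, sitemaps, and web feeds.</summary>
      </entry>
    </feed>

RSS 2.0 conveys much the same information with different element names (channel, item, pubDate), which is more or less the whole dichotomy.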