SmoothSpan Blog

For Executives, Entrepreneurs, and other Digerati who need to know about SaaS and Web 2.0.


Who Hides Data From Search Engines And Why?

Posted by Bob Warfield on April 11, 2008

Fred Wilson just did an interesting post about delicious and its traffic after being acquired by Yahoo.  On the face of it, looking at Compete or other stats, it appeared delicious had gone into decline after being acquired.  Reality is a bit different, as Joshua wrote back to Fred:

We continue to grow normally.

Unique users is not a good measure of our growth, though.

Much of our traffic is through the firefox and other browser extensions, which is not measured by these systems.

Additionally, we cut off search indexing several months ago, which also hurts the UU numbers.

It is fascinating to consider just how much of the web is not measurable even today due to such things as running traffic through browser extensions.  What I really found interesting was the last line, though.

Why would delicious cut off search indexing?

As one commenter on the thread pointed out:

I would think Joshua would be delighted if the “funny-video” tag was the first search result at Google for the search term “funny video”… it would mean both greater distribution and influence.

Evidently not.  There is no good cost reason to turn away web crawlers.  For a property like delicious, crawler traffic has to represent a tiny fraction of their overall traffic.  It seems to me the reasons would have to be strategic.

For the conspiracy theorists out there, consider this.  Perhaps Yahoo is doing this for a lot of their valuable properties and only letting Yahoo’s own search engine index the data.  That would mean a Yahoo search can find delicious posts but Google can’t.

It seemed like a great theory, but alas I could not verify it at all.  I tried half a dozen of the most popular posts on delicious and could not find them in searches for their titles on either Google or Yahoo search.
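One more check worth doing, if the cutoff is implemented through robots.txt (one common mechanism, though a site can also use noindex meta tags or block crawlers some other way), is to ask the file directly which crawlers it turns away.  Here is a rough sketch in Python using the standard robotparser module; the del.icio.us page path is purely illustrative:

# Rough sketch: ask del.icio.us's robots.txt whether Google's and Yahoo's
# crawlers are allowed.  Assumes the indexing cutoff lives in robots.txt
# (it may not), and the example page path is purely illustrative.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("http://del.icio.us/robots.txt")
robots.read()

page = "http://del.icio.us/popular/"  # hypothetical example page
for crawler in ("Googlebot", "Slurp"):  # Google's and Yahoo's user agents
    verdict = "allowed" if robots.can_fetch(crawler, page) else "blocked"
    print(crawler, verdict, page)

Even if both user agents come back the same, that only tells you about robots.txt, not about whatever special arrangement Yahoo might have behind the scenes.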

So I’m stumped.  It would be fascinating to see a list of sites that exclude search engines, sorted by popularity.  Even more fascinating would be understanding why they exclude the search engines.

Update:  It seems that Yahoo does do something special with delicious, but based on my limited testing, maybe it isn’t all in production yet.  See TechCrunch for more.

Don’t you think this is a tacky way to compete?  By limiting the availability of search results?  Search result integrity is essential, and here we have companies outright trying to sabotage the search results of their competition.  This will get worse if MSFT has anything to say about it: they don’t believe in a fair fight!


Testing the General Theory of Relativity and Locating the Chasm

Posted by Bob Warfield on April 11, 2008

The physics world loves to test its theories to the furthest edges of the envelope.  One reads about elaborate experiments to push things out one more decimal place.

I’m no Einstein, but I did just see an opportunity to add insight to my theory that the Chasm Has Moved.  Briefly, that theory says that the old Early Adopter Market has been made much bigger by the Internet, but that a lot of the behaviour we see on our beloved ‘Net is just that: Early Adopter Behaviour.  The Chasm still exists, but it has moved further to the right.

Here is one opportunity to test it, albeit one we won’t be able to apply until we know who our next President is.  Matt Pace of Compete writes of Obama v. Clinton that despite what the pundits say, the race isn’t even close, and TechCrunch picks up on that refrain.  By every measure the web offers, or at least by the Face Time measures Compete can generate, it appears that Obama will win.

What does that have to do with the Chasm? 

Compete is measuring what it calls “Face Time” to make the prediction.  Face Time is the amount of time spent with each candidate across several leading social networks and media sites (Facebook, MySpace, Flickr, MeetUp, YouTube).   Given which web properties are being looked at, that measurement is going to be focused squarely on the left side of the Chasm, my broader Early Adopter Market.  If we had a broader measure, perhaps all searching on the net, it would be different.

Assuming no candidate does anything to catastrophically impact the results, and assuming McCain isn’t elected, there are two outcomes:

–  Obama wins.  This would indicate that the web is a more accurate indicator than traditional measures.  I believe that means the Chasm has moved right, and the early adopters are now a more powerful force than they once were.  These are the people Compete measures using the latest and greatest web offerings.

–  Clinton wins.  This is the big upset.  The web says Obama, but the voters say Clinton.  That would say to me the web is primarily measuring to the left of the Chasm but the world is still run from the right.  The Chasm is right where it always was, and there aren’t enough people to the left of it to make the difference.

As in so many of these experiments, much is subject to interpretation, but it will be interesting to watch!

 


 