SmoothSpan Blog

For Executives, Entrepreneurs, and other Digerati who need to know about SaaS and Web 2.0.

Morgan Stanley has 70 to 80 Web 2.0 Projects Underway. How About You?

Posted by Bob Warfield on September 11, 2007

Dan Farber recently had a great post talking about the 70 to 80 Web 2.0 projects Morgan Stanley has underway.  According to ComputerWorld, Pfizer is also in the Web 2.0 hunt.  There are lots of interesting tidbits to be gained from these two articles:

–  50% of Morgan’s 55,000 employees are under 35, so a large share of the workforce will have grown up online with social networking.  No doubt customers will follow the same pattern.

–  Web 2.0 fluency will start to affect their recruiting patterns.  See my post on this for a related take.  Just as employers will expect fluency, so too will new customers expect the companies they do business with to support their Web 2.0 lifestyles.

–  Morgan’s External Web 2.0 effort is creating online communities and wikis for its customers.  The External Web will be linked with enterprise-wide CRM to facilitate a 360 degree view and ensure clients can provide feedback.  The Internal Web 2.0 includes social networking, online communication, expertise location, participatory culture, Q&A services, personalized learning, recruiting, and alumni relations.

–  There are some interesting Business Trust Fabric issues at Morgan:  everything must be archived.

–  Culturally, it sounds like it has been very hard to overcome the inertia of the Old Ways, as well as to show ROI against the substantial costs of a big new initiative like this.  Web 2.0 benefits are likely to be soft benefits for a long time.

If I were sitting in a big organization like Morgan, I could definitely see the benefits to customer satisfaction, employee satisfaction, and generally improving my overall business by improving collaboration with Web 2.0.  At the same time, it would be daunting to consider just how many bases have to be touched in a large organization like this to see all the benefits.  There are internal versus external projects to tackle for a number of constituencies.  There would be a desire to create a diversity of offerings that cover the range of Web 2.0 Personality styles.  Just rolling out a blog or some such is not enough.  There would be a desire to try a lot of small experiments to address the “Tragic Knowability” problem before investing too much in a centralized solution.  70 or 80 projects might start to sound like not nearly enough.

Meanwhile, the tools available don’t make it any easier.  The ability to control the Trust Fabric around business lines (for example, to handle the archiving requirement) is embryonic at best for Web 2.0 tools.  Most of these were built for the Social Web.  Across 70 or 80 projects, there has to be a lot of reinventing the wheel as well.

It will be interesting to watch the Business Web 2.0 unfold.  Right now it’s caught between the Old School Enterprise Technologies such as Content Management and Portals and the Social Web 2.0.  Eventually, a happy middle ground will emerge.  This is a Blue Ocean opportunity for some group of companies to pursue.  It’s also an opportunity for Asymmetric Marketing in the sense that the Business Web 2.0 companies can draft behind the market momentum created in the Social Web.

Posted in business, Marketing, user interface, Web 2.0

What’s the Killer App for Multicore?

Posted by Bob Warfield on September 11, 2007

I saw this question posed on MulticoreEra, so I thought I’d take a crack at it.

Let me start out with a contrarian view of the whole Multicore Thing.  Many have called it the Multicore Crisis, and I am definitely part of the “Crisis” crowd because I think software is not keeping up with hardware.  It’s too hard to write software that takes advantage of a lot of cores.  So here is my contrarian proposition: if it’s too hard, maybe the first killer app won’t be parallel at all.  Maybe it takes a lot of serial killer apps and just a few parallel killers to get us started.

Your desktop suddenly sprouts 4 cores.  Windows uses 1, your current app uses another, maybe something in the background uses a third, but what do you do with the 4th core?  And next year, what to do with cores 5, 6, 7, and 8?  I suggest we look for the Killer App to be something that runs in the background all the time, has an insatiable appetite for CPU cycles, and is completely indispensable: once we get it, we can’t live without it.  That app may or may not be parallel itself.  If it’s not, we’ll just run a bunch of different ones on the different cores to take advantage of them.

One last disclaimer:  there are lots of Multicore Killer App possibilities on the server end, but this post is focused on the desktop.

Without further ado, here is my list of possibilities:

Be Speculative

What if your computer was always one step ahead?  Speculative execution has been a mainstay of cpu design for a long time, but this would be speculative execution of software on your PC.  What the system needs is some idea of what you might want to do next.  This could be as simple as knowing which files you’ve accessed recently.  Based on that, the system would know which apps you’re most likely to run.  Imagine if on startup, each core took one of the most popular apps you like to run and started to bring it up behind the scenes.  If you actually ask for the app, it pops up immediately because it was already there. 

The same trick can be employed with data.  Let’s get the data I’m likely to want into RAM and off the disk (or off the Net).  It will be much faster to access it there if it’s called for.
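
Here’s a minimal sketch of both tricks in Python.  Everything in it is assumed for illustration: the launch history, the app commands, and the number of spare cores; a real system would mine OS usage data instead.

```python
import subprocess
from collections import Counter

# Hypothetical launch history; a real system would pull this from OS usage data.
RECENT_LAUNCHES = ["mail", "spreadsheet", "browser", "spreadsheet", "mail", "spreadsheet"]
SPARE_CORES = 3  # cores not needed by the OS or the foreground app

def prewarm(app):
    # Start the app in the background so its code is already in RAM
    # if the user asks for it.  (app is a placeholder command name.)
    try:
        return subprocess.Popen([app])
    except OSError:
        return None  # placeholder command may not exist on this machine

def prefetch(path):
    # Same trick for data: read likely-needed files once so they land
    # in the OS file cache, off the disk (or off the Net).
    with open(path, "rb") as f:
        f.read()

# Warm up the most frequently launched apps, one per spare core.
favorites = [app for app, _ in Counter(RECENT_LAUNCHES).most_common(SPARE_CORES)]
warm = {app: prewarm(app) for app in favorites}
```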

Programmers Need Compiler Farms/Testers Need Test Farms

This is one of those insatiable desire problems if ever there was one.  Making the compilation of a single file parallel, or running a single test script in parallel, might be hard.  However, most programs have many files and many test scripts.  With a little creative scheduling, we can do a lot of that work in parallel even though each individual compile or test uses only one core at a time.
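
A sketch of that scheduling idea, assuming a conventional one-file-at-a-time compiler.  The file list and the `cc` command line are placeholders:

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor

SOURCES = ["parser.c", "lexer.c", "codegen.c", "runtime.c"]  # placeholder list

def compile_one(src):
    # Each compile is serial and uses one core; the parallelism comes
    # from running independent files side by side.
    result = subprocess.run(["cc", "-c", src])
    return src, result.returncode

if __name__ == "__main__":
    # The default pool size is the machine's core count.
    with ProcessPoolExecutor() as pool:
        for src, rc in pool.map(compile_one, SOURCES):
            print(src, "ok" if rc == 0 else "failed")
```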

Indexing

If you’re like me, anything that makes it easier to find what you’re looking for on your computer would be indispensable.  But since we’re talking about exotic multicore stuff, how about getting a little more exotic and creative?  I’m thinking of a background task that trolls through your digital photos and uses face recognition to provide an index of who is in each picture.  How cool would that be?  You’d start this program out with a nucleus of starter definitions.  Perhaps your family members and closest friends would be identified manually in a few pictures.  The program would then go off looking at pictures, aiming to do a few things:

1)  Identify the face regions.  It creates a list of cropped rectangles for each picture, one rectangle per face.

2)  Identify faces.  It matches the regions to known faces.  There’d be a relevancy ranking:  a 72% chance this one is Aunt Jane and only a 17% chance it’s really Uncle Joe.

3)  It asks the user to confirm any faces where the relevancy is too low.  In so doing, it would learn how to identify a particular face better.  Likewise, you’d be able to easily correct its mistakes and it would learn there too.

4)  It builds a list of unidentified faces and ranks them by frequency.  Periodically, it would ask you to identify those that are the most common unknown persons.

Now, any time you want to look at your photos, you’d get captions telling you who the people are.  The technology to do this exists today, and it seems to me it would make for a killer app.
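
A minimal sketch of that loop, where detect_faces() and match_face() are hypothetical stubs standing in for a real recognition library:

```python
import random

CONFIDENCE_FLOOR = 0.60  # below this, ask the user (step 3)

def detect_faces(photo):
    # Step 1 stub: a real detector would return cropped face rectangles.
    return [f"{photo}:face{i}" for i in range(2)]

def match_face(region, known_faces):
    # Step 2 stub: a real recognizer would return (best_name, confidence).
    return random.choice(list(known_faces)), random.random()

def index_photos(photos, known_faces, ask_user):
    captions, unknown = {}, []
    for photo in photos:
        for region in detect_faces(photo):
            name, score = match_face(region, known_faces)
            if score < CONFIDENCE_FLOOR:
                name = ask_user(photo, region, name)             # step 3: confirm
            if name:
                known_faces.setdefault(name, []).append(region)  # learn from it
                captions.setdefault(photo, []).append(name)
            else:
                unknown.append(region)  # step 4: rank these by frequency later
    return captions, unknown

# Seed with a few manually identified people, then let it run.
captions, unknown = index_photos(["img1.jpg", "img2.jpg"],
                                 {"Aunt Jane": [], "Uncle Joe": []},
                                 ask_user=lambda photo, region, guess: guess)
```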

Tend the Garden

While we’re indexing, there’s a lot of other tasks one could undertake tending the garden of local data.  Clearly I can be on the lookout for malicious offenders: viruses, spyware, and all that sort of thing.  I can rearrange the data on the disk to defrag it and optimize it for better performance.  I can back it up over the web to keep it safe.  I can prune by looking for candidates for deletion.  I can encrypt to keep prying eyes from accessing the data too easily.
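
A sketch of how those chores might share the spare cores; each chore here is an empty placeholder for the real scan, defrag, backup, or prune routine:

```python
import multiprocessing as mp

def scan_for_malware():  pass  # placeholder
def defragment_disk():   pass  # placeholder
def backup_over_web():   pass  # placeholder
def prune_old_files():   pass  # placeholder

CHORES = [scan_for_malware, defragment_disk, backup_over_web, prune_old_files]

def run(chore):
    chore()

if __name__ == "__main__":
    # Leave one core free for the foreground app; spread the chores
    # across the rest.
    spare = max(1, mp.cpu_count() - 1)
    with mp.Pool(spare) as pool:
        pool.map(run, CHORES)
```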

Hang Out a Shingle

Even if you don’t have software to take advantage of extra cores, maybe someone else does.  You can hang out a shingle (digitally, of course) whenever you have idle cores for rent.  Anyone whom you allow to share your cores gets to use some of them.  Perhaps it works along the lines of Hadoop.  Heck, you could envision a search firm letting its users do both the web crawling and the search indexing during the normal course of their web browsing.  What a good opportunity to periodically ask them a question to help make results better.
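
A sketch of the shingle itself, with fetch_task() and submit_result() as hypothetical stubs for whatever coordinator you’ve agreed to share cores with:

```python
import multiprocessing as mp
import time

def fetch_task():
    # Stub: would poll the coordinator for a unit of work (a crawl
    # job, an index shard, a Hadoop-style map task, etc.).
    return None

def submit_result(result):
    pass  # stub: would post the finished unit back to the coordinator

def worker_loop():
    while True:
        task = fetch_task()
        if task is None:
            time.sleep(60)        # nothing on offer; check back later
            continue
        submit_result(task())     # run the borrowed work on this core

if __name__ == "__main__":
    idle = max(0, mp.cpu_count() - 2)  # keep a couple of cores for yourself
    for _ in range(idle):
        mp.Process(target=worker_loop, daemon=True).start()
    # ...the main program carries on with its normal work here.
```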

Endless Possibilities, OS Help Wanted

There are endless possibilities here.  Right away it seems that we’ll want some sort of OS help to enable proper sharing of system resources among all these different apps that have suddenly sprouted.  The thing I’d worry most about is disk access.  When my antivirus program fires up, it ruins the performance of most other apps by hogging the disk.  The OS knows which app has focus and needs to be smart about parceling out that scarce disk resource.  Greater sophistication will be needed there to get the most out of the Multicore Experience.
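
Until the OS does this for us, background apps can at least yield on the CPU side.  A Unix-only sketch (os.nice covers CPU priority only; real I/O prioritization needs exactly the OS help argued for above):

```python
import os

def run_chore_politely(chore):
    # Fork a child, drop it to the lowest CPU priority, and run the
    # heavy work there so the focused app stays responsive.
    pid = os.fork()          # Unix-only
    if pid == 0:
        os.nice(19)          # lowest scheduling priority for the child
        chore()
        os._exit(0)
    os.waitpid(pid, 0)
```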

Posted in grid, multicore, ria, user interface