Tuesday, November 28, 2006

How IBM leverages Open Source

Dana Blankenhorn recently blogged about IBM's reaction to Sun's Java plan and its approach to the open source ecosystem. I think Dana summarizes it very well. IBM believes open source is a great technology floor on which others, and even IBM, build. But as Dana points out, it is naive to lump IBM, a solutions and services company, in with the rest of the software industry players, who are primarily software vendors like Oracle, SAP, Red Hat and Microsoft.

IBM is a very interesting player in the open source ecosystem and, in my opinion, the best. They understand how it works and how to leverage it toward their business goals. To their customers they are the trusted business partner, and they certainly portray themselves as open and flexible. They are very smart about where to contribute to gain influence in open source, and about what and how to consume to meet their business objectives. And the wonderful thing is that they have been able to pull this off without ruffling many feathers in the community.

In the changed software landscape of open source, the core competency is not "S/W features" but "speed": the speed with which a firm can leverage external innovations, not by copying everything but by quickly assembling products from proprietary and open components.

In my opinion, IBM believes that in the long run all software is going to be free and open, and hence will not have much value in itself. The trick is to extract as much value as possible during the journey to that end state. To do so, IBM leverages a "Pluggable Integration Architecture": a "Lego blocks" type of approach that can accommodate both proprietary and open source components. Pluggable integration architectures are the new influence points, and hence allow IBM to "open" its existing S/W product portfolio in increments and on its own terms.

Eclipse showed how powerful a pluggable architecture can be, and owning (in other words, heavily influencing) the integration platforms that allow mixing of open and closed components is certainly core to IBM's strategy. In 1990 IBM's tools were dismal, but over time, by using Eclipse to build a common integration framework, IBM was able to transform its tools business. In the beginning Eclipse was a blob beneath proprietary WebSphere, but over time the integration framework has been meshed into the proprietary code, positioning IBM well for the future.
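To make the "Lego blocks" idea concrete, here is a minimal Java sketch of the extension-point pattern that makes this mixing possible. This is not Eclipse's actual plug-in API; all the names (`HttpEngine`, `PluggablePlatform`, etc.) are invented for illustration. The platform owner defines a stable socket (an interface), and open or proprietary components plug into it interchangeably:

```java
import java.util.ArrayList;
import java.util.List;

// The extension point: the platform owner controls this stable contract.
interface HttpEngine {
    String name();
    String serve(String request);
}

// An "open" component slotted in (stand-in for something like Apache httpd).
class OpenHttpEngine implements HttpEngine {
    public String name() { return "open-httpd"; }
    public String serve(String request) { return "open:" + request; }
}

// A "proprietary" component slotted into the very same socket.
class ProprietaryHttpEngine implements HttpEngine {
    public String name() { return "proprietary-ws"; }
    public String serve(String request) { return "prop:" + request; }
}

public class PluggablePlatform {
    private final List<HttpEngine> engines = new ArrayList<>();

    // Register any mix of open and closed components at assembly time.
    void register(HttpEngine e) { engines.add(e); }

    // Swap implementations by configuration, not by rewriting the platform.
    HttpEngine pick(String name) {
        for (HttpEngine e : engines)
            if (e.name().equals(name)) return e;
        throw new IllegalArgumentException("no engine: " + name);
    }

    public static void main(String[] args) {
        PluggablePlatform p = new PluggablePlatform();
        p.register(new OpenHttpEngine());
        p.register(new ProprietaryHttpEngine());
        // Commoditize or slot in a piece by changing one string.
        System.out.println(p.pick("open-httpd").serve("GET /"));
    }
}
```

The strategic point is in who owns what: the components are interchangeable commodities, but whoever defines the interface and the registry owns the influence point.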

This totally changes the competitive landscape: if the market environment shifts, whether from competitive pressure or from the availability of better open source components, IBM now has a mechanism to respond fast. IBM can very easily slot in components from open source (like Apache httpd) and can also commoditize components when it sees competitive threats (like modeling tools). By getting the industry to adopt an open integration framework, IBM has a ready channel for slotting proprietary pieces on top of open pieces, and is in a position to extract value all along the road toward "total commoditization".

Having standardized the integration framework for tooling and IDEs, IBM is now trying to do the same for runtimes. Geronimo is a great effort in that direction, and it will be interesting to see how it plays out. There are already signs that it is doing very well (Report: IBM Open Source-Based Application Server Growing Nearly Three Times Faster Than JBoss).

And regarding Java, I believe IBM will very soon get over the gloom and embrace it, making it yet another Lego block in the puzzle.

Tuesday, November 14, 2006

Discovering TimeBridge

I am sure you have had Aha! moments in your life; I had one of those last week when I saw TimeBridge at the Web 2.0 Summit. Scheduling with external customers has been a big pain point for a long time, one that both Stephen O'Grady and I have blogged about before. This was an opportunity waiting to be snapped up, and I think TimeBridge is getting there.

TimeBridge has built a Personal Scheduling Manager that works across companies, time zones and calendaring systems. It is a "full service" service, in the sense that it provides help throughout the life cycle of setting up a meeting, lunch or other activity, including collaboration, distributing meeting materials and handling changes. It works for 1:1s as well as larger group meetings.

I played with their beta. It currently works only with Outlook, but it integrates very well with the Outlook client. I could set up a meeting, and any external user (even a non-Outlook one) was able to participate through the website. What was missing was the ability to schedule a meeting from the web application itself. I am sure it is in the works.

Tuesday, November 07, 2006

Announcing SuiteTwo.com - An enterprise 2.0 solution powered by Intel

Today Intel is announcing the launch of SuiteTwo.com, a web 2.0 stack jointly developed by leading web 2.0 players: Movable Type, SimpleFeed, Socialtext, NewsGator and SpikeSource. We are announcing this at Tim O'Reilly's Web 2.0 conference, currently under way in San Francisco, CA. SuiteTwo is a rich set of interconnected services that combine to improve productivity and enable high-engagement marketing. SuiteTwo includes the most trusted platforms for blogs, wikis, RSS feed reading, and RSS feed management, all under a single management interface.

This is a great first step for Intel in addressing the ever increasing desire of enterprises to quickly deploy web 2.0 technologies inside the firewall. SuiteTwo has been a great effort in cross-industry collaboration to bring a solution to a sector quickly. Stay tuned; we are already busy with version 2.0…

Sunday, November 05, 2006

PCs and web2.0 : Part 2 PCs the perfect Interaction Engines

Last week I talked about how the PC's existing role is being threatened by the evolving push toward service-based applications. There is a widespread fear that web 2.0 and SaaS will finally kill the thick, Moore's-law-driven PC, and that we will all be working off dumb terminals or cell-phone-size devices that run all our applications. Certainly the skeptics and realists are arriving; last week's blog by John Milan talked about how Google's desktop application strategy is evolving. I believe that PC-type high-end compute devices will still be around, but with a newly defined role: providing cheap, stateless raw compute.

Going forward, the PC will take on more of the role of a local storage, caching and execution device. It is already happening: the PC at home is already the syncing and charging station, music mixer and browsing device (Sam Ruby from IBM has a great presentation on the topic), while in the enterprise PCs are quickly becoming thin-state compute players, with role-based applications and kiosks becoming commonplace.

Thick Compute, Thin State: The end-user client nodes, though losing a lot of application-level compute to the cloud, are certainly gaining a lot of interaction-level compute, whether in browser-based applications that use a lot of rich AJAX code, such as Zimbra, or in role-based deployments of enterprise applications. I remember somebody mentioning to me that the Zimbra demos looked awful until the Intel Core 2 Duo showed up, especially in the case of Apple Macs. Certainly, with loosely coupled applications and mashups happening at the last mile, interaction-level compute is bound to rise at the point of interaction. On top of that, the increasing desire for flexibility and agility prompts development in higher-level abstracted languages, pushing performance to the back burner and consuming MIPS very inefficiently.

The tipping of broadband adoption beyond 50%, along with the availability of cheap hardware and open source stacks, has finally brought the ability to break out of the limitations of the client-server model. Applications are not going to be written in the old ways, and that means applications can finally be experienced differently. Application streaming vendors and role-based deployment stacks are bridging the gap for existing client-server applications in the enterprise space, while in the consumer space everybody seems to be rallying behind web 2.0.

PC Going from Multi-Application to Multi-Player: It was while working on the PDS project at IBM's T.J. Watson lab that we first coined the term "Application Player": applications that can be experienced as a stream, much like watching an MPEG file or listening to an internet radio stream. The real implication was that one could now take a compiled application and play it like any media using a S/W player. This meant that the composition and packaging of an application became totally independent of the way the application is run. After all, we always knew that the runtime characteristics of an application are totally different from its design-time and compose-time characteristics; we just didn't know how to manage them differently. By separating these attributes, it becomes possible to pull some of the composition and assembly aspects of an application into the cloud, while also making the edge a better, stateless runtime player. I think Microsoft is thinking along the same lines, and the increased focus on declarative languages like XUL, XAML and Flex makes it easier to get there.

This is the biggest opportunity for the PC. PCs can now become the perfect form of interaction engine. Finally the glue that stuck the OS and applications to the PC is loosening, and the PC can be redefined as a platform for interaction. Things like device driver models, which have become a nightmare inside the operating system, could be pulled back into the hardware and used to put up a softer face. It is time for us to look beyond keyboard-and-mouse interfaces and provide an interaction-based programming interface and tools that can be applied to things such as multi-touch, voice, conversational systems and 360° cameras. PCs, or PC-type devices, will start becoming the enablers of the local infocloud. And with the arrival of virtualization technology in chips and open source VMMs, we now have the basic building blocks to build this.

I am excited by the opportunity and believe this is just the beginning. The whole service orientation of applications, and the ability to experience Applications Anywhere, Anytime, is finally going to bring information to our fingertips, and it is heralding a new era for computer science.

The PC is marching toward becoming the perfect "Interaction Engine", and who knows how many unintended uses will emerge. Here is one that uses existing PC hardware to predict tsunamis via vibrations on your hard disk.