Saturday, 6 August 2011

The thin(ish) client revolution

David Berlind: “[P]icture a world where, instead of carrying a notebook computer with you everywhere you go, and instead of having power-drinking desktops in every corner of your house, all you have is a USB key that you take from one dirt cheap thin client to another. On that key is not just all of your personal data (that is stored in the cloud but replicated to your USB key for offline usage), but perhaps a small Web server and some applications, both of which are thin-client friendly.”
I’m in complete agreement that the thin client/web application computing model is the wave of the future, and that the days of the fat client as the primary mechanism for monetizing software are numbered. However, I don’t buy into the notion that software-as-a-service displaces the traditional fat client entirely, for one very simple reason: It’s going to be a while yet before we have truly ubiquitous network connectivity.
Sure, if you live in the San Francisco Bay Area or in a similarly technology-centric place, it’s easy to buy into the notion that it won’t be long before we’ll all be connected all the time. However, we can’t lose sight of the fact that the vast majority of the population doesn’t live in a world where technology billboards dominate the morning commute. Heck, my cell phone doesn’t even work when I visit my grandfather in rural Indiana; there, the notion of an omnipresent Wi-Fi fabric built from community hotspots is a long way off, to put it mildly.
And even when we do reach the day of ubiquitous connectivity, we’ll still want caching for performance and replication to eliminate single points of failure, not to mention the ability to retain control over our own digital lives. So I view the thin vs. fat client debate as being not so much about user experience (AJAX has shown that remarkably good user interfaces can be built on the web) as about caching and data replication, two of the most important considerations in any distributed system. I think most people are missing that. David, as usual, appears to get it.
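To make the caching-and-replication point concrete, here’s a minimal sketch, not from Berlind’s piece and with hypothetical names throughout, of a read-through cache: reads are served from a local replica whenever possible, so the application keeps working offline, and the networked copy is consulted only to fill or refresh that replica.

```typescript
// Hypothetical sketch: a read-through cache backed by a local replica.
// "Store" could be anything durable that travels with you (disk, USB key).

interface Store {
  get(key: string): string | undefined;
  put(key: string, value: string): void;
}

class ReplicatedCache {
  constructor(
    private local: Store,                             // local replica (e.g. on the USB key)
    private remote: (key: string) => Promise<string>, // the copy that lives in the cloud
  ) {}

  // Serve from the local replica first; only the miss path depends on connectivity.
  async read(key: string): Promise<string> {
    const cached = this.local.get(key);
    if (cached !== undefined) {
      return cached;              // works offline, no single point of failure
    }
    const fresh = await this.remote(key);
    this.local.put(key, fresh);   // replicate so the next read is local
    return fresh;
  }
}
```

The interesting design question, of course, is the write path: when and how local changes sync back. That’s exactly the sort of plumbing a fat client framework of the future would have to get right.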
My prediction: Applications will increasingly move to the web, so you’ll be able to get to your data anytime, anywhere, and from any connected device. No surprise there. But the majority of the time, when you’re using “your” client device (whether it’s a PC in your home or office, a handheld device, or something that lives on a USB key and attaches securely to the network through a device outside your administrative control), you’ll still interact with the web primarily through a fat client of some sort.
To be sure, the fat clients of the future will look a lot different from the fat clients of today. Hint: “Fat client” doesn’t have to mean “self-administered”. Perhaps the fat clients of the future are delivered on demand, or built from independently developed components that integrate into some larger framework. Perhaps it’s as simple as running a web server locally, as the sketch below suggests. But the fat client, in whatever form, will still be the primary interface the vast majority of the time. That’s why I don’t buy for a minute that Web 2.0 is yet another death knell for Microsoft. They’ve obviously got a great fat client story, and in case you haven’t noticed, they’ve just woken up to what’s going on around them.
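As a hedged illustration of the “running a web server locally” idea (the port, route, and payload here are made up for the example): a tiny Node/TypeScript HTTP server bound to the loopback interface could serve the locally replicated data, so a browser-based UI talks to it exactly as it would talk to the service in the cloud.

```typescript
// Hypothetical sketch: a local web server acting as the "fat client" back end.
// It serves data from the local replica over plain HTTP on the loopback interface.

import * as http from "http";

const PORT = 8080; // arbitrary local port chosen for illustration

const server = http.createServer((req, res) => {
  if (req.url === "/data") {
    // A real client would read from the local replica (see the cache sketch above).
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ source: "local replica", offline: true }));
  } else {
    res.writeHead(404);
    res.end();
  }
});

// Bind only to 127.0.0.1: this is a personal client, not a public service.
server.listen(PORT, "127.0.0.1", () => {
  console.log(`Local client listening on http://127.0.0.1:${PORT}/`);
});
```

The point isn’t the dozen lines of code; it’s that the browser stays the interface while a local process supplies the offline smarts.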
By the way, this is the area where I think Linux on the desktop could have the biggest impact. It won’t be about who has the better desktop or who can replicate today’s popular fat client applications in open source; it’ll be about who can mold Linux into the fat client framework of the future, seamlessly integrated with applications and services delivered via the web, yet still open. If I were one of Microsoft’s competitors in this space, namely Google or Yahoo, I’d be spending a lot of time thinking about this. In today’s fat client world, Microsoft sits between them and their customers more often than not, and if the past is any indication, Microsoft will be able to use that position to tremendous advantage.