Wednesday, October 31, 2012

How application virtualization was reborn

Server virtualization has become a familiar fixture of the IT landscape and an important foundation for cloud computing.

But virtualization is also relevant to client devices, such as PCs. To a greater degree than on servers, client virtualization takes many forms, reflecting abstraction and management applied at different points between the data center and the user. Client virtualization includes well-established ways of separating the interaction with an application from the application itself, the leveraging of server virtualization to deliver complete desktops over the network (Virtual Desktop Infrastructure, or VDI), and the use of hypervisors on the clients themselves. In short, client virtualization covers a lot of ground, but it’s all about delivering applications to users and managing those applications on client devices.


It’s essentially a tool to deal with installing, updating, and securing software on distributed “stateful” clients—which is to say, devices that store a unique pattern of bits locally. If a stateless device like a terminal breaks, you can just unplug it and swap in a new one. Not so with a PC. At a minimum, you need to restore the local pattern of bits from a backup.

However, client virtualization (in any of its forms) has never truly gone mainstream, whether because it often cost more than advertised or just didn’t work all that well. It has mostly played in relative niches where some particular benefit—such as centralized security—is an overriding concern. These can be important markets; we see increased interest in VDI at government agencies, for instance. But we’re not talking about the typical corporate desktop or consumer.

Furthermore, today, we access more and more applications through browsers rather than applications installed on PCs. This effectively makes PCs more like stateless thin clients. And, therefore, it makes client virtualization something of a solution for yesterday’s problems rather than today’s.

Except for one thing.

Client virtualization, in its application virtualization guise, has in fact become prevalent. Just go to an Android or iOS app store.

Application virtualization has been around for a long time. Arguably, its roots go back to WinFrame, a multi-user version of Microsoft Windows NT that Citrix introduced in 1995. It was, in large part, a response to the rise of the PC, which replaced “dumb terminals” acting as displays and keyboards for applications running in a data center with more intelligent and independent devices. Historically, application virtualization (before it was called that) focused on what can be thought of as presentation-layer virtualization—separating the display of an application from where it ran. It was mostly used to provide standardized and centralized access to corporate applications.

As laptops became more common, application virtualization changed as well. It became a way to stream applications down to the client and enable them to run even when the client was no longer connected to the network. Application virtualization thus became something of a packaging and distribution technology. One such company working on this evolution of application virtualization was Softricity, subsequently purchased by Microsoft in 2006.

I was reminded of Softricity earlier this year when I spoke with David Greschler, one of its co-founders, at a cloud computing event. He’d moved on from Microsoft to PaperShare but we got to talking about how the market for application virtualization, as initially conceived, had (mostly not) developed. And that’s when he observed the functional relationship between an app store and application virtualization. And how application virtualization had, in a sense, gone mainstream as part of mobile device ecosystems.

If you think about it, the app store model is not the necessary and inevitable way to deliver applications to smartphones, tablets, and other client devices.

In fact, it runs rather counter to the prevailing pattern on PCs—regardless of operating system—of installing fewer unique applications and running more Web applications through the browser. Google even debuted Chrome OS, designed to work exclusively with Web applications, to great fanfare. As network connectivity becomes more pervasive and better performing, and as standards such as HTML5 evolve to better handle unconnected situations, it’s reasonable to expect this trend to continue.
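To make “handling unconnected situations” concrete, here is a minimal TypeScript sketch of the offline pattern HTML5 began to enable, using standard browser APIs (navigator.onLine, localStorage, XMLHttpRequest, and the "online" event). The saveNote function and the /api/notes endpoint are invented for illustration, not taken from any real application.

// A minimal sketch, assuming a browser environment and the HTML5-era
// APIs available around 2012. The endpoint and function names are
// hypothetical.

function postNote(text: string): void {
  const xhr = new XMLHttpRequest();
  xhr.open("POST", "/api/notes"); // hypothetical server endpoint
  xhr.send(text);
}

function saveNote(text: string): void {
  if (navigator.onLine) {
    postNote(text); // connected: send straight to the server
  } else {
    // Offline: queue the note locally until connectivity returns.
    const queue: string[] = JSON.parse(localStorage.getItem("pendingNotes") || "[]");
    queue.push(text);
    localStorage.setItem("pendingNotes", JSON.stringify(queue));
  }
}

// When the browser regains connectivity, flush anything queued offline.
window.addEventListener("online", () => {
  const queue: string[] = JSON.parse(localStorage.getItem("pendingNotes") || "[]");
  queue.forEach(postNote);
  localStorage.removeItem("pendingNotes");
});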

But the reality of Chrome OS has been that, after early geek excitement, it has so far pretty much hit the ground with a resounding thud. At least as of 2012, it’s one thing to say that we install fewer apps on our PCs. It’s another thing to use a PC that can’t install any apps. Full stop.

What’s more, it’s worth thinking about why we might prefer to run applications through a browser rather than natively. It’s not so much that it lets developers write one application and run it on pretty much anything that comes with a browser. As users, we don’t care about making life easier for developers except insofar as it means we have more applications to use and play with. And, especially given that client devices have coalesced around a modest number of ecosystems, developers have mostly accepted that they just have to deal with that (relatively limited) diversity.

Nor is it really that we’d like to be able to use smaller, lighter, and thinner clients. Oh, we do want those things—at least up to a point. But they’re usually not the limiting factor in being able to run applications locally and natively. We don’t want to make clients too limited anyway; compute cycles and storage tend to be cheaper on the client than on the server.

No, the main thing that we have against native applications on a client is their “care and feeding”: the need to install updates from all sorts of different sources and to deal with the problems when upgrades don’t go as planned; the way a PC’s software sometimes needs to be refreshed from the ground up to deal with accumulating “bit rot” as added applications and services slow things down over time.

And that’s where centralized stores for packaged applications come in. Such stores don’t eliminate software bugs, of course. Nor do they eliminate applications that get broken through a new upgrade—one need only peruse the reviews in the Apple App Store to find numerous examples. However, relative to PCs, keeping smartphones and tablets up-to-date and backed up is a much easier, more intuitive, and less error-prone process.

Of course, for a vendor like Apple that wants to control the end-to-end user experience, an app store has the additional advantage of maintaining full control of the customer relationship. But the dichotomy between an open Web and a centralized app store isn’t just an Apple story. App stores have widely become the default model for delivering software to new types of client devices and certainly the primary path for selling that software.

The Web apps versus native apps (and, by implication, app stores) debate will be an ongoing one. And it doesn’t lend itself to answers that are simple either in terms of technology or in terms of device and developer ecosystems. Witness the September 2012 dustup over comments made by Facebook CEO Mark Zuckerberg that appeared to diss his company’s HTML5 Web app, calling it "one of the biggest mistakes if not the biggest strategic mistake that we made."

However, as CNET’s Stephen Shankland wrote at the time: “Those are powerfully damning words, and many developers will likely take them to heart given Facebook's cred in the programming world. But there are subtleties here -- not an easy thing for those who see the world in black and white to grasp, to be sure, but real nonetheless. Zuckerberg himself offered a huge pro-HTML5 caveat in the middle of his statement.”

It’s often observed that new concepts in technology are rarely truly new. Instead, they’re updates or reimaginings of past ideas both successful and not. This observation can certainly be overstated, but there’s a lot of truth to it. And here we see it again, with application virtualization and the app store.

2 comments:

Unknown said...

HTML5 development is a growing area these days because of the many applications, including local ones, that can be designed with it. Developers can come up with individual programs that provide useful functions.

Gordon Haff said...

HTML5 is certainly a welcome development in Web standards. And, generally, I think we as users are better off in the long term to the degree we can access services using open standards rather than locked-down platforms. That said, the reality is that native apps are the dominant mobile model today.