Don’t fall into the pit of thinking there’s no pendulum, or that the pendulum can be nailed to one side.
Is the rediscovery of the desktop just the latest swing of some tech-trend pendulum, or is there something more going on here?
This year, some of the big boys gave every impression of having suddenly and simultaneously remembered that there is such a thing as a desktop. Google got Geared up, Adobe announced AIR, and Microsoft saw the light with Silverlight, all of which are tools to help web developers integrate operations on the Web and the desktop just a little better. That oft-repeated mantra that the web browser is the new operating system? In 2007, not so much.
Of course it’s a pendulum. More specifically, it’s the industry constantly rebalancing the mix of several key technology factors, notably:
- computation capacity available on the edge (from motes and phones through to laptops and desktops) and in the center (from small and large servers through to local and global datacenters)
- communication bandwidth, latency, cost, availability, and reliability
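This balancing act can be made concrete with a toy placement decision. The following sketch is purely illustrative (the cost model and every number in it are invented, not from any real system): given a job's compute and data requirements, pick the edge or the center based on which finishes sooner once you price in the link between them.

```python
# Toy model: decide whether to run a job at the edge or in the center.
# The cost formula and all numbers are invustrative inventions, not
# taken from any real scheduler.

def place_job(work_units: float,
              data_mb: float,
              edge_speed: float,      # work units per second at the edge
              center_speed: float,    # work units per second in the center
              bandwidth_mbps: float,  # link speed to the center
              rtt_s: float) -> str:   # round-trip latency to the center
    """Return 'edge' or 'center', whichever finishes the job sooner."""
    edge_time = work_units / edge_speed
    # Running in the center pays for shipping the data both ways,
    # plus a round trip, before the (faster) computation even starts.
    transfer_time = 2 * (data_mb * 8) / bandwidth_mbps + rtt_s
    center_time = transfer_time + work_units / center_speed
    return "edge" if edge_time <= center_time else "center"

# A small job over a lot of data stays local...
print(place_job(work_units=10, data_mb=500, edge_speed=1,
                center_speed=100, bandwidth_mbps=10, rtt_s=0.1))   # → edge
# ...while a compute-heavy job over little data goes to the center.
print(place_job(work_units=10_000, data_mb=1, edge_speed=1,
                center_speed=100, bandwidth_mbps=10, rtt_s=0.1))   # → center
```

As bandwidth, latency, and the two compute speeds shift over the years, the same formula flips its answer, which is the pendulum in miniature.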
This balancing actually isn’t news; we’ve been doing it since the dawn of computing. Conceptually, it’s not much different from how the designers of your PC balanced the kind and speed of memory to match the speed of the processor, the bus, the hard drive, and so on, to create a balanced system. We do and redo this exercise all the time. Here are just a few of the pendulum swings we’ve seen historically:
|Era/Epoch|The Center|The Edge|
|---|---|---|
|Devonian|Terminals and time-sharing||
|Triassic||Microcomputers, personal computers|
|Jurassic|File and print servers||
|Cretaceous|Client/Server, server tier|Client/Server, middle tier|
How many pendulum swings can you count on just that list? In my own career, I’ve missed only the Precambrian and Cambrian (I’m a child of terminals and micros, and never had to carry stacks of punched cards uphill both ways in snow up to my waist). Many of you have experienced most of these swings.
It’s also not news that neither the center nor the edge is going to go away. We’re in an expanding computing universe: The question is not whether one will replace the other, but what balance they will be in at a given point. This will continue to be true for the foreseeable future no matter how often people on either end of the pendulum swing try to nail the pendulum where they want it for their own business reasons. (Take it from someone who lived through trying to market early peer-to-peer database and application models in the midst of Larry Ellison’s screaming-loud "network computer" hype, and had to deal with VC after VC who believed desktops and notebooks were going to evaporate. Sigh.)
What is news, of course, is how those factors are changing and therefore how their balance is changing. Craig Mundie has spoken about this pendulum in several talks this year, including last week’s Financial Analysts Meeting (transcript and WMP webcast link; slides, including the one reproduced at right).
Quoting from one of those talks:
One of the things that I also find fascinating at this point in time is how people, how easily we forget about the cyclical nature of the evolution of the computing paradigm.
And from another:
Right now, as the Internet has evolved, broadband has become more highly penetrated, and to some extent the computers seem to be not fully utilized, we’re in the middle of one of these natural pendulum-like swings between centralized computing and computing at the edge. It started with the mainframe, and then we added terminals, and then we moved to departmental, and then we moved to personal; it just kind of moves back and forth. And there are a lot of people today who say, oh, you know, I think that in the future we’ll just have dumb presentation devices again, and we’ll do all the computing in the cloud.
But … I contend that since the cloud is made ultimately from the same microprocessors, as the utilization becomes higher, it becomes impractical for a whole variety of costs and latency reasons to think you can just push everything up the wire into some centralized computing utility.
And so, in fact, I think for the first time in a long time we’re going to see the pendulum come into a fairly balanced position where we, in fact, do have incredible power plants of the Internet in these huge datacenters that provide these integrating services across the network, but at the same time we’re going to see increasingly powerful local personal computing facilities in everything from embedded devices, cell phones, and on up the computing spectrum.
A nicely balanced view. The center (mainframes, datacenters) isn’t going away anytime soon. But neither is the edge (PDAs, laptops). It would obviously be foolish to imagine either away, at least yet, because they each have different capability, availability, performance, and reliability characteristics, so there’s plenty of reason to choose each one for a different part of an application or system.
Don’t fall into the pit of assuming the pendulum will get nailed to one side. That’s pretty unlikely. Bet instead on new technologies constantly being developed to bring the center and the edge into new balance, filling the holes where each is deficient as the two grow at different rates. Yesterday’s disconnected computers just couldn’t do everything you can on the Internet, so as internetworks became mainstream something like HTML and AJAX had to come along to let us exploit them. Early and current web apps just can’t do everything you can on a rich client: hence first AJAX, then Gears, AIR, and Silverlight, with more still to come tomorrow and next year and next decade.
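The gap-filling pattern that Gears, AIR, and Silverlight each enable in their own way can be sketched generically. The names here (`EdgeCache`, `fetch`, the lookup functions) are hypothetical stand-ins, not any framework’s real API; the point is only the shape: prefer the center when you can reach it, fall back to the edge copy when you can’t.

```python
# Sketch of center-first-with-edge-fallback. The service and cache are
# hypothetical stand-ins, not a real framework's API.

class EdgeCache:
    """Local (edge) store that keeps the last known answer."""
    def __init__(self):
        self._store = {}

    def put(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

def fetch(key, remote_lookup, cache):
    """Prefer the center; fall back to the edge copy when offline."""
    try:
        value = remote_lookup(key)      # talk to the datacenter
        cache.put(key, value)           # refresh the local copy
        return value
    except ConnectionError:
        return cache.get(key)           # disconnected: serve from the edge

cache = EdgeCache()

def online(key):                        # the center is reachable
    return f"fresh:{key}"

def offline(key):                       # the link is down
    raise ConnectionError("no route to the center")

print(fetch("inbox", online, cache))    # → fresh:inbox
print(fetch("inbox", offline, cache))   # → fresh:inbox (from the edge copy)
```

Each generation of tooling mostly changes where the cache lives and how transparently the fallback happens; the structure of the tradeoff stays the same.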
Fasten your seat belts.