Monday, December 13, 2010

Technology Predictions 2010

Every December the cast of Point 'n' Click reunites at the Central New York PC Users Group meeting. The audio of this year's meeting is available here if you've got some time to kill and absolutely cannot think of anything better to do. As part of our off-the-cuff presentation, we usually make some predictions about the future of technology. Most of the time they're spot-on. Granted, they're also fairly obvious to anyone paying attention to the tech world. Still, here's what I predicted we'll see in the relatively near future:

1. The iPad was last December's big thing (it was announced in January 2010 and shipped that spring, so it was very, very close at this time last year). This year, I see the rise of the knock-off. And by knock-off, I really mean a lot of products that are just as good as, if not better than, the original iPad. The iPad was revolutionary in no small part because Apple managed to take mostly existing technologies and package and market them in a way that made them appealing to people. That was really new, even if much of the underlying technology was not, and even if the device was deliberately hobbled in a few critical ways.

Let me digress here for a minute or two. I've enjoyed watching portable computing technology advance over the last ten years. I remember getting my Toshiba 7200 laptop back in 1999 or 2000 and it was awesome. It was a pre-tablet tablet, by which I mean that the "guts" of the laptop could be detached from a special lightweight chassis that held some cooling fans and the auxiliary drives (optical and floppy). If you wanted, you could walk off with just the keyboard/mouse, monitor, and a couple of ports. It was highly portable - slim and sleek. What it lacked was connectivity while detached.

A few years later, I got my hands on a Toshiba Tablet PC. This was one of the true swivel-form tablets - a full-featured, very lightweight laptop with a swiveling screen that could fold down flat over the keyboard to let you operate the device with a stylus. It had full Wi-Fi connectivity, and once I plugged in a Verizon cellular aircard it became a true go-anywhere device. I loved that tablet and I took it everywhere. As a corporate executive, I found it absolutely improved my productivity, letting me work during the "wasted time" I'd otherwise have spent waiting for meetings to start (our Purchasing Manager made a point of being late to certain key meetings, for instance) or cooling my heels in an airport.

Still, the tablet form factor, even at the height of its technical capability, never caught on with the masses - not within the business world (where I'm living proof that it was a truly useful device), much less with consumers. Those tablets were still burdened with all-purpose functionality in the form of generic operating systems (mostly Windows) and all of the usage issues, patches, crashes and other challenges that go along with an OS built to do it all on every type of computer. The iPad, though, isn't a computer - it's a device, and there's a subtle difference.

Technically, at its heart, an iPad is, of course, a computer. It has a microprocessor, RAM, some sort of non-volatile storage, and a mainboard that connects everything together. But so do lots of things we don't think of as computers, from MP3 players to TiVos. No, an iPad is a device, because it's purpose-built, locked down, and very limited in what its user can make it do. It may not feel that way to an iPad owner, because there are so many things it CAN do, but, for example, let's see you make a cell-phone call on your iPad. See? It's a device, because the manufacturer artificially limits what you can do with it in ways that true computers - PCs, Macs, and comparable machines - don't.

And to get back to the original topic, that's where I see the changes in 2011 and beyond. The iPad is selling like crazy, but it's expensive and it's deliberately crippled. Apple wants to control what goes onto the device for various reasons. They want to maintain a "wholesome" environment, free of things like pornography (that's right, a pornography-free Internet device. Seems like a contradiction to me, too). They want to maintain security, so everything that goes onto the device must go through Apple's approval process first. And they want you to please also buy an iPhone, so they limit the iPad's ability to serve as an iPhone replacement. Plenty of other manufacturers, however, have no such compunctions. In 2011, expect to see lots of companies making iPad-like devices that let you install whatever you want, hook up whatever peripherals you want, print wherever you want, and make phone calls. Which brings me to trend #2:

2. Convergence has been a trend for some time - I expect to see it continue and accelerate. The smartphone is a prime example of convergence. If you've got a full-featured phone like an Android or an iPhone, you also have a camera, an MP3 player, an email device, a gaming device, and something to watch movies on. Oh yeah, and it can make phone calls and send text messages, I guess.

We've seen our technology getting smaller, lighter and more powerful for decades. It's almost surprising that it's taken us this long to get where we are. Honestly, I believe that ancillary technologies have been the holdup. It's not the processor that's been lacking; it's the miniaturization of batteries, transmitters/receivers, and usable touchscreens that held things up. They're all where they need to be now, so hold on for a crazy ride. See, we've got a few more tricks still to come - like organic LED screens, or OLEDs. They're paper-thin and paper-flexible, meaning they can do things like scroll up inside the device when not in use. Add one of those, and it's an iPad that collapses down to a smartphone when that's all you need (or when you want to wear it on your belt). I don't know quite when the forearm-bracer computer will catch on, but I feel like wearable computing is finally on the cusp. Probably not in 2011, but by 2015? I think there's a good chance. The key is that all of these devices will converge, and people will no longer talk about their phones or their GPS devices or their MP3 players. You'll have one multi-function device that does it all. That's where we're headed.

3. Cloud computing is the final piece of this puzzle. Let's hop back into that time machine I like to call "Mike's memory" for a few minutes. I remember back in the mid-90s when Larry Ellison predicted the death of the PC and the inevitable rise of "Network Computing." Now, Larry had a couple of ulterior motives for this prediction. One was that his company, Oracle, made some of the key technologies needed to enable Network Computing, and being a pioneer there would have been extremely lucrative. Also, Larry absolutely hates Microsoft with a passion to this day, and since they were the leaders in computing technology then (and now), he would have liked nothing better than to see their downfall.

Sorry, Larry, but you were about 15 years off in your prediction. The Network Computing concept was a lot like the mainframe computing concept, really. You built a really powerful data-processing system and then hooked up a bunch of dumb terminals that people used to connect to that system. In the 70s and 80s, that system was the mainframe computer. In the 90s, Ellison thought it was going to be a back-end network of servers and non-mainframe Unix boxes. I remember when I worked at the Turning Stone, the crazy, evil network manager bought into this whole concept with his heart and black soul. He wanted to replace all the desktop computers with low-end "thin client" devices. Our CIO insisted that, as the desktop manager and advocate for the end users, I test out one of these thin clients. It was utter crap - borderline unusable. The network simply wasn't ready. The back-end servers couldn't compute fast enough, and the cabling couldn't deliver the results fast enough, to eliminate the delayed response known as lag. We're used to typing something in or clicking the mouse and getting immediate gratification. Network Computing couldn't provide that.

At least, it couldn't back then. Today's another matter. Processing power has doubled roughly six times in the last eight to ten years - six doublings works out to about a 64-fold increase - and wireless communications operate at speeds that rival what we used to expect from premises wiring. What used to be called Network Computing is now known as Cloud Computing, and it's finally ready (or nearly ready) for prime time.

Netbooks, smartphones and a wide array of apps for tablet devices like the iPad all rely on having a more-or-less persistent connection to a back-end network (usually the Internet itself). They pull down all sorts of data, in real time, just when you need it. The data might be a movie you're watching through Netflix or it might be your daily newspaper. Either way, it's coming from the Cloud. Likewise, your data may not be stored - and much of your processing may not be happening - right there on the device in your hand. Instead, it's handled behind the scenes in a datacenter somewhere and just sent down as you need it. This eliminates the need for high-end processors and big hard drives on your devices. And the more ubiquitous Cloud Computing becomes, the more those handheld devices can accomplish with less of their own power. Size, shape and weight matter a lot less when you don't need to cram hefty processing power, memory and data storage (and the cooling that goes along with them) into a tiny form factor.
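For the programmers in the audience, here's a minimal sketch of the idea - a few lines of Python showing a device asking a back-end service for just the piece of data it needs, right when it needs it. The URL and field names are made up purely for illustration; the point is that the heavy lifting happens in the datacenter and the handheld merely displays the answer.

    # A rough sketch of the cloud model: the device stores almost nothing locally.
    # The URL and field names below are hypothetical, for illustration only.
    import json
    import urllib.request

    def fetch_front_page(section):
        # Ask the back end for today's front page of a given section.
        # Storage, indexing and personalization all happen in the datacenter;
        # the handheld just pulls down the finished result and shows it.
        url = "https://news.example.com/api/frontpage?section=" + section
        with urllib.request.urlopen(url) as response:
            return json.loads(response.read().decode("utf-8"))

    # The device requests only what it needs, when it needs it:
    front_page = fetch_front_page("technology")
    print(front_page)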

The Cloud works for non-handheld devices as well. I've been saying for a while now that the days of the DVD library are numbered. I'm buying fewer and fewer DVDs - and I've avoided the Blu-ray format entirely - because I believe they're going to be obsolete pretty soon. The Cloud can also deliver movies right to my TV (or handheld, really) on demand from services like Netflix and even Amazon. I'll pay a monthly subscription and have the benefit of watching thousands upon thousands of movies whenever I want, as often as I want.

And that's the future as I see it. I suppose we'll have to check back in a year or two and see how close I am.
