Last week, Nvidia launched the first graphics processing unit (GPU) designed for the cloud, dubbed Kepler. Supporting vendors include a who’s who of server providers, such as HP, Dell, Cisco, and IBM — all of which will have products on the market shortly.
The whole concept behind these servers is to serve up a desktop experience from the cloud. This means delivering games, applications, utilities, and media to any device that will run the client: iPads, iPods, Android tablets, smartphones, and even cars and smart TVs. As this technology comes to market, it will increasingly not matter what device you are using; you'll be able to get your stuff on it as long as it is connected with decent bandwidth.
Let’s talk about some of the results.
Gaming from anything
On stage at its GPU Technology Conference in San Jose, Nvidia had one person on an iPad gaming head to head with another on a new LG TV using a service called Gaikai. The demo game was Hawken, a mech-oriented title that isn't even on the market yet. These two were gaming on hardware that couldn't hope to run a top-line, graphics-intensive game locally. Yet both were pounding away at each other, and the amazing thing was, the guy on the tablet was winning, showcasing that screen size didn't matter as much as gaming skill.
This is often the problem with games: If a title comes out on one platform and you or your friends don't have that platform, not only can you not play the game, but the developer also captures just a fraction of the available revenue. If games were delivered like streamed movies, though, they could go everywhere. You could play from the connected AV system in your car, your iPad, or your TV at home.
This is truly cloud computing, though Nvidia calls it GeForce Grid.
Windows on an iPad
I was out to breakfast the other day, and I have a nasty habit of listening in on the conversation at neighboring tables when it has to do with tech. The guy talking was a recent convert from Windows to the Mac, and he was talking about switching back because the Mac sucks. (His words, not mine; no desire to peg the hate-mail meter this week.) He was complaining because he was going to have to dump his nearly new MacBook Pro for an Ultrabook, and he was going to lose money on that investment.
Well, what if you could run Windows on a Mac, or an iPad, or anything that would host a tiny client? If you like Apple hardware but hate the Apple platform, you can still run Windows. If you want to run Windows on your big smartphone or tablet in an emergency, you can do that, too.
Citrix demonstrated new hardware that could scale to support 100 desktops off one tower that looked smaller than my (admittedly rather large) PC.
This is the freedom to run what you want wherever you want. To not be tied to Apple or anyone else. To have software delivered as if it were electricity. Someone else worries about malware, and backups, and making sure a catastrophic event doesn't destroy your digital life along with your real one.
One of the most fascinating demonstrations had to do with modeling galaxy-class events. No, I'm not referring to something out of Star Trek (though the Enterprise-D was a Galaxy-class starship). What Nvidia showed was the progression from its existing Fermi platform, which can model the birth of the universe, to the Kepler platform, which can model what's going to happen in a few short years when the Andromeda galaxy runs into our own. Granted, a few short years in galaxy-class events is 3.5 billion years, so no need to jump under a table (not that doing so would do you any good, mind you). As you can imagine, the scale is massive, and the capability is an order of magnitude (10 times) greater than it was with the older hardware.
We often get excited about 20 percent performance leaps, so 10 times the performance is amazing. If this level of advancement keeps up, heck, we’ll be obsolete in a few years.
You may think I'm joking on this last one, but one of the other companies presenting at the Nvidia show was Universal Robotics. This is the company bringing to market thinking robots that can respond to sensor-based events. In short, they can see and change their actions based on what they see. I'm hoping the eventual result is more like Robby the Robot than the Terminator, but I have my doubts. In any case, at the Nvidia conference we once again saw major progress with regard to what you can do in the cloud, and even what machines will be able to do in the near-term future. Granted, they may be the only thing left of us in 3.5 billion years to say "oh crap" when the galaxies do collide.
And on that festive note, I’ll leave you to ponder our near, and far, future.
Guest contributor Rob Enderle is the founder and principal analyst for the Enderle Group, and one of the most frequently quoted tech pundits in the world. Opinion pieces denote the opinions of the author, and do not necessarily represent the views of Digital Trends.