Opinion: How Intel is building a virtual Steve Jobs, better security, and more

[Image: Paul Otellini & Andy Rubin]

Intel’s annual Developer Forum (IDF) showcases the Intel that is focused solidly on the future. At this year’s show, and at a unique “Speed Geeking” event hosted by Brian “Crystal Balls” David Johnson and Genevieve “Kick Ass” Bell, we got a view of that future. One of the more interesting parts of the effort, which I learned more about in an earlier session, is a project to create an artificial Steve Jobs.

Let’s cover some of the stuff that will be coming over the next few years, and how Intel may build its own virtual Steve Jobs.

Security

This topic became much more interesting to me while I was at IDF, because my Twitter account got hacked. Fortunately, it was by someone for whom English isn’t a primary language, and he or she got caught fast. (Twitter itself responded faster than I could, and reset my password.)

The technology that was showcased worked two ways. First, when you logged into an application, it would require a second action on a cell phone. Without the ID, the password, and the user’s cell phone, you couldn’t access the account (which by itself would likely block 99.9 percent of successful password-based penetrations). The second used a randomized keypad to input a PIN code. This keypad would be rendered directly by the PC hardware, making buffer-based attacks moot, because the input never passes through the software buffer.
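To make the phone-as-second-factor idea concrete, here is a minimal sketch of a time-based one-time password (TOTP), the mechanism most such systems use. Intel didn’t detail its implementation, and the secret, digit count, and time step here are hypothetical choices:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, time_step: int = 30, digits: int = 6) -> str:
    """Derive the short-lived code that the phone and the server both compute."""
    counter = int(time.time()) // time_step              # same value on both ends
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Hypothetical login check: knowing the password is not enough, because the
# attacker would also need the phone that stores `secret`.
secret = b"not-a-real-secret"
print("Code currently showing on the phone:", totp(secret))
```

The code changes every 30 seconds and can only be produced with the secret held on the phone, which is why a stolen password alone gets an attacker nowhere.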

Of course, this got me thinking about whether you’d really need passwords anymore. The combination of a PIN code and the cell-phone second factor would seem secure enough, and given that passwords aren’t secure, why use them at all?

Applied Virtual Reality

Intel also showcased some really cool stuff with regard to blended video. Right now, if you shop for clothing or eyewear on the Web, you largely have to imagine what it would look like on you. Or, if you go into a store, you actually have to try things on to see how they will look. Given how many cameras have found their way into dressing rooms, and how many men (like me) don’t like trying things on in the first place, this is problematic.

But Intel showcased technology that blends a live camera image of you, streamed in real time, with the item to provide a realistic, mirror-like view of what you would look like wearing it. While this was shown as something you might use yourself, think of how Web buying services could use it, coupled with a history of what you like, to present you only with things you’d find attractive, or to give a spouse or parent advice on things you would actually like.
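Intel didn’t show code, but the heart of such a “virtual mirror” is alpha-blending a product image over each live camera frame. A minimal sketch with OpenCV follows; the file name, window title, and fixed placement are all hypothetical, and a real try-on system would track your face or body to anchor the item:

```python
import cv2

# Hypothetical product image with an alpha channel (e.g., a pair of glasses).
item = cv2.imread("glasses.png", cv2.IMREAD_UNCHANGED)  # BGRA

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Fixed placement keeps the sketch short; a real system would position
    # the item using face or body tracking.
    x, y = 100, 80
    h, w = item.shape[:2]
    roi = frame[y:y + h, x:x + w]
    alpha = item[:, :, 3:] / 255.0                       # per-pixel opacity
    blended = ((1 - alpha) * roi + alpha * item[:, :, :3]).astype("uint8")
    frame[y:y + h, x:x + w] = blended
    cv2.imshow("virtual mirror", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):                # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Everything beyond this loop (lighting correction, cloth deformation, fit estimation) is where the hard research lives, but the mirror illusion starts with this blend.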

This could end the whole dance you do when you get that ugly sweater from a loved one and have to pretend you might actually wear it, all while wondering whether you can exchange it, or whether Goodwill might laugh at you when you try to donate it.

Building a virtual Steve Jobs

While Steve Jobs was at Apple, he performed the role of a “super user”: a user who serves as a model of the ideal customer, and who has a great deal of say over the final product. Steve was unique in this role because he was also the CEO, and could fund marketing programs that would assure his success. However, he really was vastly different from most of us, who aren’t vegan, minimalist micromanagers (some of the words often used to describe him). Still, he seemed pretty damn close to ideal in this role.

[Image: Intel prototype chip]

Intel is well along the way to instrumenting what Steve Jobs did: It is creating a data model of the ideal user, and putting metrics in place to measure product performance against that model. Granted, the effort is still focused on the performance of every aspect of the device; it has not yet drifted into actual design, and there is no hard connection between the result and marketing funding. But this would be the first big step toward replicating what Steve Jobs did for Apple: setting design parameters, like screen size, weight, and the complexity of the user interface, that would result in the most attractive product.
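As a toy illustration of what “measuring product performance against the model” could mean, here is a sketch that scores a candidate design against an ideal-user profile. Every attribute, target, and weight is invented for illustration; none of this is Intel’s actual model:

```python
# Toy "ideal user" profile: target values and how much each attribute matters.
# All numbers here are hypothetical, not Intel's metrics.
IDEAL = {
    "screen_inches": (5.0, 0.3),            # (target, weight)
    "weight_grams": (140.0, 0.4),
    "ui_steps_to_core_task": (2.0, 0.3),
}

def design_score(design: dict) -> float:
    """Return a fit score in [0, 1]; 1.0 means a perfect match to the ideal user."""
    score = 0.0
    for attr, (target, weight) in IDEAL.items():
        miss = abs(design[attr] - target) / target   # relative distance from target
        score += weight * max(0.0, 1.0 - miss)
    return score

candidate = {"screen_inches": 4.7, "weight_grams": 160.0, "ui_steps_to_core_task": 3.0}
print(f"Fit to ideal user: {design_score(candidate):.2f}")
```

The real system would be far richer, but the shape is the same: encode the ideal customer as data, then score every design decision against it instead of against an executive’s taste.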

If Intel is successful, any company could have the power of Steve Jobs, at least with respect to product creation, and that would be very cool.

Wrapping up

We tend to think of Intel as a processor house, but Intel Labs is one of a handful of organizations trying to actually define the future. It has other efforts defining smart TVs, photography (for instance, allowing each viewer to see a unique image tailored to them), and how people interact with technology. It is also using techniques like Science Fiction Prototyping, which are just amazing, and one hell of a lot of fun. In short, Intel is building some amazing ideas into our future reality.

Guest contributor Rob Enderle is the founder and principal analyst for the Enderle Group, and one of the most frequently quoted tech pundits in the world. Opinion pieces denote the opinions of the author, and do not necessarily represent the views of Digital Trends.
