Google has been accused of becoming too big a machine, taking over the Internet and wiping out privacy along the way. But now it’s trying to make itself a little more human by funding a project to give computers a hint of human emotion.
A research team at Tel Aviv University is using Google money to create an algorithm that will teach computers how to experience “virtual regret.” According to the university, this type of program could give a computer heightened awareness and help it “more accurately predict the future.”
“If the servers and routing systems of the Internet could see and evaluate all the relevant variables in advance, they could more efficiently prioritize server resource requests, load documents and route visitors to an Internet site, for instance,” says Professor Yishay Mansour, who is leading the project at the university.
Let’s get one thing straight: We’re sure you already deduced this, but no, computers can’t feel yet. Technically, what the team wants to accomplish is to help machines more precisely measure the gap between “a desired outcome and the actual outcome.”
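The university’s announcement doesn’t spell out the algorithm, but “virtual regret” closely matches the notion of regret in online learning theory: the difference between the loss a system actually accumulates and the loss of the best single choice it could have committed to in hindsight. As a rough illustration only (this is a classic textbook method, the multiplicative-weights or “Hedge” update, not necessarily what the Tel Aviv team is building), here is a minimal sketch of a learner that keeps its regret small:

```python
def hedge(losses_per_round, eta=0.5):
    """Multiplicative-weights (Hedge) sketch.

    losses_per_round: list of rounds, each a list with one loss in [0, 1]
    per available action. The learner plays a weighted mix of actions and
    shrinks the weight of any action that incurs loss.

    Returns (learner's expected total loss, regret vs. best fixed action).
    """
    n_actions = len(losses_per_round[0])
    weights = [1.0] * n_actions
    total_loss = 0.0
    cumulative = [0.0] * n_actions  # per-action loss, for hindsight comparison

    for losses in losses_per_round:
        total_w = sum(weights)
        # Expected loss of playing each action with probability w_i / total_w.
        total_loss += sum(w / total_w * l for w, l in zip(weights, losses))
        for i, l in enumerate(losses):
            cumulative[i] += l
            weights[i] *= (1 - eta) ** l  # penalize actions that lost

    best_fixed = min(cumulative)  # best single action in hindsight
    regret = total_loss - best_fixed
    return total_loss, regret


# Hypothetical example: two routes, where route 1 is always the good one.
rounds = [[1.0, 0.0]] * 10
loss, regret = hedge(rounds)
```

In this toy run, a learner that split traffic evenly forever would lose 5.0, while the Hedge learner quickly shifts its weight onto the reliable route and its regret stays small and bounded regardless of how many rounds are played, which is the flavor of guarantee that could let a system “learn” from where it has been performing badly.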
The good news is that your computing needs could potentially be more easily and accurately met. Mansour says that systems will be more able to adapt to user needs and interaction, that this algorithm could “study and ‘learn’” from human use. Google specifically believes that AdWords and AdSense could benefit from this type of technology: The systems could learn when they were working ineffectively, determine what they were doing wrong, and remedy it.
The bad news? Any time a computer begins to operate on a human level, it is, frankly, a little terrifying. Still, despite any unease about machines someday dealing with us on human terms, the prospect is undeniably intriguing.