Photoshop AI thinks ‘happiness’ is a smile with rotten teeth

You can’t swing a dead cat these days without running into AI. And nowhere is that more true than in photography. I’ve certainly had fun with it on more than my share of photos. But the more I attempt to be a “serious” photographer, the less I want to rely on artificial intelligence to do my job for me.

That’s not to say it doesn’t have its place. Because it does. And at the end of the day, using AI filters isn’t really any different than hitting “auto” in Photoshop or Lightroom and using those results. And AI certainly has its place in the world of art. (Though I’d probably put that place somewhere way in the back, behind the humans who make it all possible in the first place.)

A recent encounter with the “Neural filters” in Adobe Photoshop has me rethinking things a little, though.

Phil Nickinson, as seen with a decent edit in Lightroom and Photoshop on the left, and with Photoshop’s “Neural Filter” happiness slider turned up on the right. Phil Nickinson/Digital Trends

Let’s set the stage: In addition to all the other stuff I shoot on a regular basis — portraits, sports, work stuff, events — I also like to put myself in front of the camera and lights once a week or so. It’s good practice for shooting and editing, if nothing else, and it’s good to have some decent pictures lying around just in case I get hit by a bus.

And I was pretty happy with my latest attempt. I’m no trained model, that’s for sure. And the women in my house are far easier on the eye than my aging face. But, still. A decent shot.

“But you look mad,” my wife said to me after I sent her the picture. And she’s right. I can be a serious guy, but not that serious. And that’s when I thought about Photoshop, which has the aforementioned “Neural filters.” That’s a great name, but it really just means it’s going to take what you’re working on, shoot it to its servers, and go to work. All you have to do is move some sliders.

There are all kinds of high-level options here, from skin smoothing and colorization for black-and-white photos to artificial landscapes and makeup. “Smart Portrait” is what I was after here, specifically the “Happiness” setting therein.

There’s no real skill required here. Grab the slider and pick a number. I went all the way up to 33, just because. The result was … not good.

It’s super easy to make geographical jokes here. I live 20 minutes away from one Southern state that is the butt of plenty of them, and not much farther from another. (And I wasn’t the only one to quickly think of that.) Never mind that low-hanging fruit, though. I’m just trying to figure out in what world Adobe’s AI engine was thinking here.

Sure, my lips parted ever so slightly into something you could reasonably call a smile. The mustache moved along with it. But so did my teeth. A lot. This really should go without saying, but they do not look like that. My chompers do not look as if they’ve been on the losing side of the war against drugs for just a little too long. They really do not appear to have only seen the inside of a dentist’s office by accident because it was next door to a pawn shop. And let’s not even start on the pirate eye.

It’s easy to poke fun at the responses large language models give when you feed them nonsense in the first place. But in this case, Adobe’s was fed a 45-year-old who’s in as good a shape as he’s been since high school.

I have no idea what I did to make it so angry. And to be fair, it’s always learning, and Adobe has a button that asks if you’re happy with the edits the Neural Filter applied.

Suffice it to say, I was not.

Editors' Recommendations

AI may beat humans at everything in 45 years, experts predict

Predicting the future of AI isn’t easy. Every decade since artificial intelligence was established as its own discipline in 1956, someone has predicted that artificial general intelligence (AGI) is just a few years away -- and so far we can safely say that most of those predictions have been shy of the mark.

A new survey, conducted by the University of Oxford and Yale University, draws on the expertise of 352 leading AI researchers. It suggests that there’s a 50-percent chance that machines will outperform us at every task by the year 2062. However, plenty more milestones will be hit before then. These include machines that are better than us at translating foreign languages by 2024, better at writing high school essays by 2026, better at driving trucks by 2027, better at working retail jobs by 2031, capable of penning a best-selling book by 2049, and better at carrying out surgery by 2053. Asian respondents predicted these events will happen much sooner than North American researchers did.

AI learns how to tackle new situations by studying how humans play games

If artificial intelligence is going to excel at driving cars or performing other complex tasks that we humans take for granted, then it needs to learn how to respond to unknown circumstances. That is the task of machine learning, which needs real-world examples to study.

So far, however, most data used to train machine-learning systems comes from virtual environments. A group of researchers, including a Microsoft Research scientist from the U.K., have set out to change that by using game replay data that can show an AI how humans tackle complex problems.

Scientists need you to play classic Atari games, teach their AI new tricks

Learning valuable skills by playing video games sounds suspiciously like the kind of feeble excuse we used as teenagers to explain why we were playing GoldenEye 007 instead of doing our homework. But in the case of a new AI project carried out by computer scientists at RWTH Aachen University in Germany and Microsoft Research, it turns out to be absolutely true.

“What we’ve developed is a way to collect data of humans playing five Atari games, a large dataset of humans playing them, and the insight that — with current algorithms — less data of better players seems to be more useful for learning than more data of worse players,” Lucas Beyer, a researcher on the project, told Digital Trends. “This might sound obvious, but really it’s not: The common theme being ‘the more data the better.’”
