
Technology makes our lives easier, but is it at the cost of our humanity?


Technology has completely taken over our lives and, for the most part, we’ve let it.

It’s hard to argue that the world today is worse off than it was. For years I lived an ocean away from my family and many of my friends, yet they rarely felt out of reach. Asking my dad for cooking advice from five thousand miles away was even easier than asking my neighbor to borrow salt. Around the world, more pressing problems, like totalitarian regimes, have been challenged and sometimes toppled by protestors who organized revolutions over social media. And I know at least two people who’ve said they “can’t live without Alexa.”


But just as technology makes things easier, it has the potential to handicap our connection with the world around us. How many telephone numbers do you know? What would happen if all GPS suddenly went offline? How many people would struggle to find their way home from a cafe just a few blocks away? Losing your smartphone is now akin to losing a part of your brain.

In a new book called Re-Engineering Humanity, Evan Selinger, professor of philosophy at the Rochester Institute of Technology, and Brett Frischmann, professor of law at Villanova University, argue that technology is causing humans to behave like mere machines. By taking over what were once fundamental functions, they say, algorithms, robots, and consumer devices have begun to dissociate us from our own humanity. The text isn’t a Luddite-like rejection of technological progress. Rather, it’s a careful consideration of, and caution about, the way we let tech into our lives.

We spoke to Selinger about the book, his views on our problematic relationship with technology, and how he suggests we fix it. The solution, he said, will take not just individual digital detoxes but a complete societal shift. The interview has been edited for clarity.


Digital Trends: The book revolves around the concept of humanity’s “techno-social dilemma.” Can you explain what that is?


Evan Selinger: Sure, not long ago, a lot of tech coverage was very enthusiastic about the latest products. There was a kind of “gee whiz” feeling about it. But…suddenly things have gotten really dark. Zuckerberg appears before Congress to talk about data privacy problems and political propaganda, and this is on the back of things like the backlash against companies making addictive smartphones. There’s this turning point that seems to be happening. There’s suddenly this widespread reflection on the dark side of technology.

A lot has been made about how little the politicians who were talking to Zuckerberg knew about how tech works. I totally understand why people are responding this way. They’re concerned about how we could even have decent regulation if regulators don’t even understand what’s going on. But the problems that cause things like humanity’s techno-social dilemma are so much more complicated than making politicians more tech-literate and social media-savvy.

“The problems that cause things like humanity’s techno-social dilemma are so much more complicated than making politicians more tech-literate and social media-savvy.”

In the book, Brett Frischmann and I had to do something like an interdisciplinary full-court press. We had to put together philosophy and law, economics, sociology, history, computer science, and cognitive science into hundreds of pages just to get a sense of what is really going on. What are the real deep problems?

One of the framings we came up with is “humanity’s techno-social dilemma,” which we think gets at the underlying stuff as a way to connect all the dots and begin to look at what technology is doing to us.

The fact is, there are tech companies with their own ambitions…but people have their own agendas too. We have this love-hate relationship with technology now where we’re clamoring for the latest iPhone and update, but then all of a sudden wonder where all our privacy went. We end up getting surprised because things ramp up to the extent that, once a certain amount of buy-in happens, we move to the next level and suddenly everyone is involved.


It sounds like you’re referring to the concept of “creep,” or the idea that by gradually broadening the scope of technology, something radical can suddenly feel normal. You worry about this in the book. Can you give a real-world example of creep?

I have an example from just the other day. I live in New York and got mail to renew my state driver’s license. The paperwork recommended I get a Real ID rather than just a driver’s license, because it said you’d need a Real ID to travel in a few years. I told my father-in-law about this and he said maybe we should just start putting microchips in citizens. Then we wouldn’t have to worry about the next level of IDs. Traveling would be seamless.

That is the logic of techno-social engineering creep right there! Not too long ago, people would have thought the idea of a chip implant was dystopian. Now we’re so used to being surveilled by devices like our phones that it’s become a new normal.


Techno-social engineering creep refers to how, through practices and getting accustomed to things, our expectations and sense of comfort with things shift. Sometimes our preferences even shift and get engineered.

You pose the question early on of whether techno-social engineering is turning people into simple machines. How do you see that happening?

Technology affects our humanity because it impacts our senses and our thoughts. It impacts our decisions, including our judgment, attention, and desires. It impacts our ability to be citizens, what we’re informed about and how we stay informed. It impacts our relationships, and advances in A.I. will even substitute for our engagements with people. It even impacts our fundamental understanding of what it means to be human, who we are and what we should strive to become.


Our point is that our very humanity is being reshaped and reconfigured by technology. As the desire to have everything be “smart” increases, one of our concerns is that…these environments will end up monitoring what we do and end up slicing and dicing us in all kinds of powerful ways. We wonder if this super smart world will result in us going with some kind of pre-programmed flow, and whether that flow is optimized in such a way that, to understand what it means to be human, we will feel pressured to see ourselves as optimizable technology.

Many people are focused on the rise of A.I., with the concern that our robotic overlords will enslave us once ‘the singularity’ occurs. Our concern is that we are going to be programmed to want to be placed in environments that are so diminishing of our agency…that we outsource our emotions and capacities for connection. How much could we give up, how much could we dumb ourselves down, to fit into these smart environments?

You state that one of the attractions of smart environments is that they offer “cheap bliss.” Do I sense a double meaning there?

I’m curious: What do you think the double meaning is?

The idea that bliss is made cheap, as in easy to attain, but also cheap, as in not very rewarding.

I think you’re putting your finger on it.

“One of the trends that’s occurring across consumer tech … is the idea of creating an ever more frictionless world, where effort is seen as a bug, not a feature.”

When we talk about cheap bliss, we want to figure out what world we’re building and what values are being prioritized by the very design of that world. And we want to find out what human beings are being nudged to value. One of the trends that’s occurring across consumer tech, and overlapping with governmental projects like smart cities, is the idea of creating an ever more frictionless world, where effort is seen as a bug, not a feature. The idea is that humans are inefficient but technology can be very efficient.

When you design technology to disburden us of effort, you’re changing the moral calculus in a way that will work very well for people who value a kind of basic hedonism, who think that the highest value in life is pleasure and the more pleasure we can have the better. This is what that world seems to be optimized for.

In the book we’re trying to offer these alternative values for human flourishing.

You also seem to take a swipe at how technology enables us to outsource responsibilities, and take issue with parental outsourcing in particular. You refer to it as “drone parenting.”

Just to be clear, we are absolutely not doing any finger pointing. If I were, I’d be indicting myself.


It’s very hard being a parent right now. We can have all the insight into tech addiction and too much screen time, and yet there’s nonetheless the reality that my middle school daughter’s friends are all on their phones, on Snapchat and Instagram, reporting on social events. There’s a whole lot of social pressure. I’m super sympathetic to the numerous complexities and tradeoffs involved with being a parent.

But tech offers the possibility to take over more and more parental functions. All these technologies make it easier to be a parent. Think about parents at restaurants, where the easiest way to keep their kids from being disruptive is to give them a tablet.

We talk about the quantified self and quantified-baby devices, which can help monitor your children. Those things can be appealing. New parents want to make sure they’re not making any mistakes. They want to make sure the baby is breathing, for example, or that if the baby wakes up they’re attentive to that. But the more a baby’s vital functions are being monitored by these technologies, and the more easily those reports get sent to us, the more there is a question to be raised about the trade-off.


A question raised by adopting these technologies is whether or not we still want to develop our own sense of attunement, which requires skill, effort, and a desire to be present.

So what’s next? How do you suggest we solve the dilemma?

Simple techno-fixes are not what we’re prescribing. You know, people say turn your notifications off so you’re pinged less, or start using a grayscale version of your phone because it’s less enticing than the color screen. That advice exists, but these micro-solutions are often a lot less consequential than the people proposing them make them out to be.

Two quick things that I’ll say:

People have pointed out that our online contract system is broken. These contracts are engineered in such a way that companies can pack in the maximum amount of boilerplate. There’s no point in reading them. Not only can you not understand them, but you realize no one else can, so you’re incentivized to put deliberation on hold and immediately click “I Agree” as fast as possible to get the service. This leaves consumers without full knowledge of what they’re doing and gives companies full power.


But think about how common contracts are. They seem to be increasing because they’re so easy and we’re conditioned to not think about them at all. This is part of what turns us into simple machines. We’re being optimized to consider not meetings of the minds, but just basically autopilot resignation. It’s take it or leave it. There’s no bargaining. We ask whether the practice is helping signal that deliberation doesn’t really matter when it comes to dealing with tech. Just get into the habit of accepting what they provide until some sort of disaster happens, and then hope that regulators or someone else takes care of it.

The other thing we want to point out is that, in being a human, it’s important to have some capacity for breathing room, to sort of step back and examine all of the social pressures and all of the social programming that’s going on. The ability to step away from being observed by others and by technology companies is disappearing. It’s becoming harder to find spaces to have breathing room.

We’re wondering how to find this breathing room in this world. You can’t get it simply by carving out your little niche because that will only go so far. This might mean clamoring for different regulations for when companies can reach you.

We’re seeing something like that in Europe, but it certainly isn’t a popular idea here in the U.S.
