Turns out Google+ isn’t for us, it’s for researching us

This week at Google I/O, the company announced a slew of new features for Google+ – 41 new features, if you’re counting. These additions inarguably make Google+ better: It’s smarter, more interactive, better looking, faster. Not to mention the massive upgrades to Photos and the Hangouts chat client.

On paper, Google+ is a beautiful, compelling, rich product. So why don’t I care?

When Google’s answer to Facebook first launched nearly two years ago, I was a cheerleader. Like everyone else, I was frustrated by the lack of control I had over Facebook and the early signs of “Facebook fatigue” (a phrase we’re now gratuitously throwing around) were starting to set in. Google+ had been under wraps for years; it was like this Internet-wide Easter Egg that we were so excited to finally see.

And I tried really, really hard to like it. And I still don’t.

[Image: Notice the 35 notifications I haven’t bothered to check in who knows how long.]

I will be the first to admit that there are a handful of features that make Google+ a more viable, attractive option than Facebook. (First, let’s get this out of the way: I’m not going to compare G+ to Twitter or Instagram or Tumblr, because it is an all-encompassing, multi-feature, no-niche-nonsense social network … comparable, really, only to Facebook. Yell at me all you want.) For starters, photos: At launch, G+ had a bevy of options Facebook didn’t – editing tools, a slicker layout, instant emailing capabilities, higher-res uploads. While Facebook’s retort has been to up its own photo game – and buy Instagram – it’s certainly not giving us anything close to the same toolbox. The new photo features just make G+ an even more comprehensive platform for photo sharing (although their effectiveness thus far is debatable – still, it’s something Facebook hasn’t given us).

The new, interactive, algorithm-learning news stream is also interesting, and certainly seems like something I should favor in comparison to the Facebook News Feed/bane of my Internet existence. It’s multi-paned, it learns from me, and discovery is hugely aided by the subject auto-tagging Google’s enabled.

Google is doing so, so much with G+. It’s incorporated photo editing, GIFs, hashtags, natural language processing technology, news feed refinement … generally, all of the things we’re constantly complaining about with Facebook. But for some reason, we can’t be convinced to buy into this beautiful madness.

And that’s because it feels like it’s all for show. Look, when G+ first launched, there was a genuine clamoring for invites – a clamoring I was not immune to (I … uh … I begged. There, I said it – are you happy?). But that moment was short-lived and followed by a breathtakingly fast drop in activity.

It all feels pushed, pulled, and prodded into Google+ instead of organically originating there.

Soon after this, Google started instituting ways to make you a G+ user whether you wanted to be one or not. If you signed up for a Gmail account, you became a G+ user. If you wanted a YouTube account, same deal. Same went for Drive, Calendar, Music … the list went on and on. This means that people have been unwittingly and unwillingly using Google+ and filling up its stream; if you upload a YouTube video or +1 something … or even use the client for Hangouts … it’s being fed into the social network.

Some people know they’re doing this and are meaningfully posting to G+ – but some aren’t, and it’s creating a false sense of social. When I look at my feed, which is notably devoid of my close friends and family, I don’t know what was purposefully uploaded to the site and what just happened to get pulled in thanks to Google’s G+ feeding agenda. It all feels pushed, pulled, and prodded into Google+ instead of organically originating there.

When I log in to Facebook, I peruse my News Feed, I respond to posts, I “like,” I might post a link. But when I log in to G+ (something I haven’t done in months) and actually look at the page, something just feels … off. My general feed is full of links to industry articles from people I don’t know; my friends tab is mainly links from one or two people. Nothing is personal or, for that matter, social.

And that’s because Google+ isn’t for us – it’s for Google. At a fireside chat during I/O, Google+ developers addressed some of the lurking questions about why we should be using the service. “There happens to be a product at plus.google.com and an app,” said G+ director of engineering David Glazer (via Forbes). “But really it’s a way for Google to get to know our users. Who they have relationships with. We give them the ability to share. That layer, that spine, that backbone, is intended to help us make search, Maps, YouTube, Gmail, etc. better. That’s the real point of Google+.”

Google is already getting all the data it could ever want out of a social network, so there’s less of an impetus to really, truly create a user experience that rivals Facebook. Of course, more user activity and data is good, but the wide reach of Google’s services means the G+ layer is doing plenty when it comes to amassing user data – which is why we continue to feel unsatisfied whenever we log in and actually look at Google+. Of course, we’re also learning that the answer to Facebook isn’t a new Facebook, hence the rise of niche networks like Snapchat, Tumblr, and Instagram.

It’s all sort of disappointing, because there are a lot of great things about Google+, but perhaps our expectations were too high all along. We, the pundits, were the ones talking about a Facebook exodus and a Facebook Killer – not Google. Because Google never intended for G+ to be any such thing. We wanted a platform and what Google churned out was a sheep in wolf’s clothing – a feature in a platform’s disguise.

And while it’s a very, very beautiful one, you can’t reverse engineer social. Even if you really, really want to. 

The views expressed here are solely those of the author and do not reflect the beliefs of Digital Trends.
