Facebook’s latest test feature notifies you if someone is impersonating your account

Social media sites like Facebook and Twitter are constantly testing, experimenting, and patenting new features. From the strange and mundane to the downright infuriating, the tools being tweaked don’t always seem like they’ll be of any practical use to the general public.

Facebook’s latest test, however, may actually result in a genuinely useful support function. The social network is reportedly testing a feature that will automatically alert you if it discovers another user is impersonating your account by using your name and photo, reports Mashable.

This is how it works: when Facebook detects that another profile matches yours, it will send you a notification about the suspected imposter. You’ll then be prompted to confirm whether the flagged profile is impersonating you by using your personal information, or whether it simply belongs to another user who happens to share your name.

Part automation, part human support system, the function will see a dedicated support team manually review any reported impersonations. According to Facebook, the feature has been in testing since November and has now been rolled out to 75 percent of the world, with further expansion on the cards.

The function makes complete sense as part of Facebook’s increased support efforts for its diverse global community. Most recently, the platform introduced a new “fake name” reporting system aimed at addressing the concerns of LGBTQ users and non-western individuals, as well as cases of stalking and abuse. This latest test feature likely falls into that last camp, as the social network’s real name policy requires people to use an authentic name.

Facebook is also testing two other features, including new ways of reporting non-consensual intimate images and a photo checkup function. The latter helps users through the privacy settings for their images, and is reportedly already live in India and parts of South America, Africa, and Southeast Asia.

The concept of “non-consensual intimate images,” as Facebook has formally termed them, basically refers to nude photos that you didn’t condone being posted online. These types of images have been banned on Facebook since 2012, but the new test feature is aimed at making the reporting experience more sympathetic for victims of abuse. Under the test, when a person reports a non-consensual intimate image, they will also have the option of notifying Facebook that the image is of them. If that is the case, the platform will provide the victim with external links to support groups for those suffering abuse, and even possible legal options.

We have reached out to Facebook for a statement, and will update the article accordingly.

Saqib Shah