
Is that really you? More companies are turning to voice biometrics for security purposes

No amount of security is too much when it comes to our bank accounts, and now your own voice may serve as that added layer of protection. Technology known as voice biometrics seems to be the next big thing in keeping accounts safe and sound, especially with the alarming rise in call-center fraud. In this latest form of trickery, criminals take advantage of human error and human emotion when they dial into a customer service line, describe some fictional situation that garners the representative’s sympathy, and subsequently gain access to sensitive data and, of course, money. $10 billion worth last year, in fact.

“[Banks] closed and locked the door online, but they left the window open with the call centers,” Vijay Balasubramaniyan, CEO of fraud detection company Pindrop Security, told CNN. And despite advances made with physical credit cards, like the Chip and PIN system, one step forward in the security realm sometimes means two steps back, as resourceful criminal masterminds find new vulnerabilities to attack.


According to CNN, Erica Thomson, a sales specialist at data security company Nice Systems, has described fraudulent “clients” who call a bank more than 20 times a day, each time pretending to be someone else and claiming to be in a foreign country with a lost credit card, or in some other compromising situation where a quick fix is desperately needed. And sometimes, in a rush to provide good customer service, representatives fall victim to these lies.

This makes technologies like voice biometrics all the more important — as CNN explains, companies like Nice Systems “can record call-in center conversations, verify the caller, and then convert the voice into a voiceprint to serve as a comparison the next time that person (or someone claiming to be that person) calls.” So when you’re told that your call is being recorded, sometimes it’s not just for quality assurance purposes — it’s for your own security.
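At its core, voiceprint verification boils down to comparing a stored numeric representation of a voice against a fresh sample and accepting the caller only if the two are close enough. The sketch below illustrates that comparison with cosine similarity; the embeddings, threshold, and function names are invented for the example and do not represent Nice Systems’ actual method.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two voiceprint embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_caller(enrolled: np.ndarray, incoming: np.ndarray,
                  threshold: float = 0.8) -> bool:
    """Accept the caller if the new sample is close enough to the enrolled print."""
    return cosine_similarity(enrolled, incoming) >= threshold

# Toy embeddings: in a real system these would come from a speech model
# applied to recorded audio, not hand-written numbers.
enrolled = np.array([0.9, 0.1, 0.4])
same_speaker = np.array([0.85, 0.15, 0.38])
impostor = np.array([0.1, 0.9, -0.2])

print(verify_caller(enrolled, same_speaker))  # True
print(verify_caller(enrolled, impostor))      # False
```

The threshold is the key tuning knob in such a system: set it too low and impostors slip through; too high and legitimate customers with a cold or a bad phone line get rejected.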

But beyond preventing fraud, banks are also turning to voice biometrics to allow larger transactions to take place via mobile devices. Because most people are still uncomfortable making large deposits anywhere other than a physical bank, corporations like Wells Fargo are looking into mobile apps that make use of some serious James Bond-esque security measures.

Tarun Wadhwa of Forbes had the chance to test out Wells Fargo’s “experimental, biometric-based commercial banking app,” and recalled announcing, “My voice gives me access to proceed, please verify me,” whereupon the phone “scanned [his] face to see if [his] lips were moving.” Then, Wadhwa had to read a series of numbers out loud, and once the app verified a voice match, it finally unlocked itself.
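Conceptually, the app Wadhwa describes stacks several independent checks (a spoken phrase, lip-movement detection, and a voiceprint match on the read-aloud digits) and unlocks only when every one of them passes. A minimal sketch of that all-or-nothing logic, with class and field names invented for illustration rather than taken from Wells Fargo’s actual code:

```python
from dataclasses import dataclass

@dataclass
class BiometricCheck:
    phrase_spoken: bool        # Did the user say the unlock phrase?
    lips_moving: bool          # Did the camera see matching lip movement?
    digits_voice_match: bool   # Did the spoken digits match the enrolled voiceprint?

def unlock_app(check: BiometricCheck) -> bool:
    # Every factor must pass; any single failure keeps the app locked.
    return check.phrase_spoken and check.lips_moving and check.digits_voice_match

print(unlock_app(BiometricCheck(True, True, True)))   # True
print(unlock_app(BiometricCheck(True, False, True)))  # False
```

Requiring all factors rather than any one of them is what makes the scheme resistant to a single spoofed input, such as a recording of the victim’s voice played without a matching face on camera.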

Thanks to the improved technological capabilities of our smartphones, facial recognition and voice biometric methods are actually becoming more and more feasible for widespread use, and Wadhwa notes that “Visa, Mastercard, and American Express all have their own biometric initiatives underway.”

But of course, given that Siri sometimes still has trouble understanding us, don’t expect to do a whole lot of voice verification anytime soon.
