
Phone-seeking search-and-rescue drone is like Find My iPhone with wings

For anyone who has ever misplaced their iPhone, Apple’s “Find My” app is a game-changer that borders on pure magic. Sign in to the app, tap a button to sound an alarm on your MIA device, and, within seconds, it’ll emit a loud noise — even if your phone is set to silent mode — that allows you to go find the missing handset. Yeah, it’s usually stuck behind your sofa cushions or left facedown on a shelf somewhere.

You can think of SARDO, a new drone project created by researchers at Germany’s NEC Laboratories Europe GmbH, as Apple’s “Find My” app on steroids. The difference is that, while finding your iPhone is usually just a matter of convenience, the technology developed by NEC investigators could be a literal lifesaver.


“SARDO is a single-UAV [unmanned aerial vehicle] solution designed to localize victims in disaster scenarios by leveraging only on their cellular connectivity,” Antonio Albanese, a research associate at NEC Laboratories Europe, told Digital Trends. “The intuition behind it is to adapt the classical cellular multilateration technique, which is based on simultaneous target distance estimates from several anchors, for example, base stations, to the case when only a single, moving anchor is available.”


Let’s unpack that a bit. For starters, SARDO ostensibly stands, in the awkward backronym way such projects frequently do, for “Search-And-Rescue DrOne-based solution.” While there is no shortage of projects that have investigated the use of drones for search-and-rescue missions in settings like disaster zones, what makes SARDO stand (or, at least, hover) apart is how it tracks down missing people: by using their phone signals.

SARDO to the rescue

To begin with, SARDO performs a time-of-flight measurement using information extracted from the signals of a user’s smartphone in order to estimate their distance from the drone. Machine learning tools are then applied to work out the precise location of the person, even compensating for scenarios in which the cellular signals are adversely affected by rubble. If the person being searched for is moving, another machine learning algorithm leaps into action to estimate their trajectory from their current movement. After it has carried out a scan of an area, the SARDO drone system will automatically change its position to be closer to the victim to retrieve more accurate distance measurements.
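To make that loop a little more concrete, here is a minimal Python sketch of the classical piece of the puzzle: a single moving anchor collects time-of-flight ranges from several of its own waypoints and solves a least-squares multilateration problem for the phone’s position. Everything here (the function names, the noise level, the toy coordinates, and the use of SciPy’s least_squares solver) is our own simplified illustration, not the SARDO code, which layers neural-network localization and trajectory prediction on top of this basic idea.

```python
# Minimal sketch: multilateration with a single moving anchor.
# The drone records time-of-flight range estimates from several of its own
# waypoints, then solves a least-squares problem for the phone's position.
import numpy as np
from scipy.optimize import least_squares

C = 3e8  # radio propagation speed (speed of light), m/s

def tof_to_distance(round_trip_seconds):
    """Convert a round-trip time-of-flight measurement into a distance estimate."""
    return C * round_trip_seconds / 2.0

def localize(waypoints, distances):
    """Estimate the phone's 2D position from (drone waypoint, range) pairs."""
    waypoints = np.asarray(waypoints, dtype=float)
    distances = np.asarray(distances, dtype=float)

    def residuals(p):
        # Difference between predicted and measured drone-to-phone distances.
        return np.linalg.norm(waypoints - p, axis=1) - distances

    # Start the solver from the centroid of the measurement waypoints.
    return least_squares(residuals, waypoints.mean(axis=0)).x

# Toy run: a phone at (40, -25) m, ranged from four drone waypoints with noise.
true_pos = np.array([40.0, -25.0])
wps = np.array([[0, 0], [60, 0], [60, -60], [0, -60]], dtype=float)
rng = np.random.default_rng(0)

true_dists = np.linalg.norm(wps - true_pos, axis=1)
round_trips = 2.0 * true_dists / C  # simulated round-trip times
ranges = np.array([tof_to_distance(t) for t in round_trips]) + rng.normal(0, 1.5, size=len(wps))

print("estimated position:", localize(wps, ranges))  # lands close to [40, -25]
```

On a typical run the estimate lands within a couple of meters of the true position; how well it does in practice depends on the geometry of the waypoints and the ranging noise, which is exactly why the real system repositions itself and keeps measuring.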


“To the best of our knowledge, this is the first single-drone search-and-rescue solution able to accurately localize missing victims only through mobile phones,” Albanese said. “There are competitor solutions, but they either rely on other sensors — [such as] IR or thermal cameras — or use ad hoc ultrawide bandwidth signals, which are … not employed by common cellular networks. SARDO makes the most of the higher and higher penetration rate of mobile phones in our society to provide a ubiquitous plug-and-play emergency localization system.”

The idea of tracking people down through their phone signals is smart, not least because it makes it possible to both look for specific people (something that other drone search-and-rescue approaches can’t easily do) and retrieve the identity of individuals when necessary. But there’s also a much smarter bit of tech at play.

Field tests in progress

The big potential problem with technology like this is that, in a natural disaster scenario, there is no guarantee that cellular networks will be working. For example, when Hurricane Harvey battered the coast of Texas in 2017, it knocked out 70% of cell towers, more than 360 in total, in affected areas. Hurricane Katrina, meanwhile, knocked out around 1,000 cell towers in 2005.

How, then, do you ensure that a drone that’s trying to track people by their phone signals is able to do so? Simple: You make the drone itself into a flying, lightweight cellular base station.

“We [have so far] tested the prototype in several field trials,” Albanese said. “First, we validated our error model, and empirically proved the dependency of the error variance on the actual distance between the UAV and the [user equipment]. Then, we tested the localization [convolutional neural network] for different UAV altitudes and user speeds. Finally, we assessed the closed-loop SARDO performances, showing that it needs [a] few complete revolutions to achieve low localization error for different user speeds.”
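That distance-dependence of the ranging error is also why the closed loop matters: flying toward the current estimate shrinks the drone-to-phone distance and, with it, the noise on later measurements. The short simulation below illustrates the effect under assumptions of our own making (a linear noise model, a circling flight pattern, and made-up constants); it is not the paper’s algorithm, which uses a convolutional neural network for the localization step.

```python
# Illustrative closed-loop simulation: the drone circles its current estimate
# of the phone's position, ranges it with noise that grows with distance
# (an assumed model), re-solves for the position, and repeats.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
phone = np.array([35.0, -20.0])   # ground-truth phone position (meters)
estimate = np.array([0.0, 0.0])   # initial guess at the corner of the search area

def noisy_range(drone_xy):
    """Range to the phone, with std-dev that grows with distance (assumption)."""
    d = np.linalg.norm(drone_xy - phone)
    return d + rng.normal(0, 0.5 + 0.03 * d)

for revolution in range(4):
    # Fly a circle of shrinking radius around the current estimate, ranging as we go.
    radius = max(10.0, 60.0 / (revolution + 1))
    angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
    waypoints = estimate + radius * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    ranges = np.array([noisy_range(w) for w in waypoints])

    def residuals(p):
        return np.linalg.norm(waypoints - p, axis=1) - ranges

    estimate = least_squares(residuals, estimate).x
    error = np.linalg.norm(estimate - phone)
    print(f"revolution {revolution + 1}: localization error = {error:.2f} m")
```

In this toy setup the error typically drops to around a meter within a few revolutions, loosely mirroring the behavior the researchers describe, though the real-world numbers depend on the actual error model and hardware.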

Right now, the technology can only work in outdoor environments. However, Albanese said the team hopes this will change in the future with the addition of indoor localization.

“Since we developed our prototype by means of off-the-shelf hardware, we may be offering SARDO as a software module product to be executed on available hardware solutions, or even as a complete solution including the UAV and the base station,” he noted.

He said that there has been interest from public safety departments, although no final decision about its adoption has yet been made.

A paper describing the work was recently published in the journal IEEE Transactions on Mobile Computing.

Luke Dormehl