
Forget keywords — this new system lets you search with rudimentary sketches

Could the future of online shopping be as simple as sketching out what you’re looking for and letting a computer figure out the rest? That’s the working theory of two researchers at Queen Mary University of London’s (QMUL) School of Electronic Engineering and Computer Science. They’ve taught a deep learning neural network — an incredibly powerful tool that mimics the way that the human brain works — to recognize hand-drawn sketches and use them to search for real-life products.

The network was “trained” to match sketches to photos using a data set of around 30,000 sketch-photo comparisons. From these it learned to interpret the way that people depict real objects in hand drawings. Most impressive of all, the sketches drawn by users don’t even have to be especially detailed — though the more detail users do add, the more accurate the search results become.
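The article doesn't describe the network's internals, but the retrieval step it implies can be pictured as nearest-neighbour search in a shared embedding space: the trained network maps both sketches and product photos to feature vectors, and the photos whose vectors lie closest to the sketch's vector are returned first. The following minimal Python sketch illustrates that ranking step with made-up toy vectors standing in for the network's output; the function names and three-dimensional embeddings are illustrative assumptions, not the researchers' implementation.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_photos(sketch_embedding, photo_embeddings):
    """Return photo indices ordered from best to worst match to the sketch."""
    scores = [cosine_similarity(sketch_embedding, p) for p in photo_embeddings]
    return sorted(range(len(photo_embeddings)), key=lambda i: scores[i],
                  reverse=True)

# Toy embeddings standing in for the network's learned features
# (illustrative only -- a real system would use vectors produced by
# the trained deep network, not hand-picked numbers).
photos = np.array([
    [0.9, 0.1, 0.0],   # photo 0: e.g. a high-heeled shoe
    [0.1, 0.8, 0.1],   # photo 1: e.g. a flat sandal
    [0.0, 0.2, 0.9],   # photo 2: e.g. a boot
])
sketch = np.array([0.85, 0.15, 0.05])  # a sketch resembling photo 0

print(rank_photos(sketch, photos))  # photo 0 ranks first
```

This also gives an intuition for the detail claim above: the more distinguishing features a sketch encodes, the farther its embedding moves from look-alike products and the sharper the ranking becomes.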

“Sketch is a very flexible tool,” co-developer Yi-Zhe Song, director of QMUL’s SketchX Research Lab, told Digital Trends. “For example, you might see a pair of shoes that you like, but not approve of every detail of them — such as the buckle or the heel. In that case, you can draw the shoe you would like to see — and then let the computer search for the closest possible match.”

But as impressive a computer science project as this is, why would anyone want to spend time drawing pictures instead of typing in a quick text search, or snapping a picture as companies like Amazon now allow you to do? Song has an answer.

“Both of those search modalities have problems,” he says. “Text-based search means that you have to try and describe the item you are looking for. This is especially difficult when you want to describe something at length, because retrieval becomes less accurate the more text you type. Photo-based search, on the other hand, lets you take a picture of an item and then find that particular product. It’s very direct, but it is also overly constrained, allowing you to find just one specific product instead of offering other similar items you may also be interested in.”

Song and his colleague Timothy Hospedales, director of QMUL’s Applied Machine Learning Lab, acknowledge that both of these search methods have their use cases, but they think the time is right for them to be joined by sketch-based search.

“Today, almost all of us have smartphones, tablets and even smartwatches with touchscreens you can draw on,” Song notes. “This is something that was relatively rare just 10 years ago.”

Song and Hospedales recently showed off their new creation to an audience at the International Conference on Computer Vision and Pattern Recognition in Las Vegas, and are now talking with retailers about incorporating the product into the online marketplace. “I would hope that within six months you’ll be able to log into a shopping site and use this technology,” Hospedales told Digital Trends.

For the sake of easy shopping, we hope so too.
