
Researchers can tell if you’re well dressed – with math

Fashion trends can be difficult to predict — one minute something’s hot, the next it’s not. However, a team of researchers from IRI in Barcelona and the University of Toronto has built a mathematical model that will help us understand how something becomes fashionable.

The algorithm was presented as part of a paper entitled “Neuroaesthetics in Fashion: Modeling the Perception of Fashionability” at a conference held in Boston last month, according to a report from Science Daily. Edgar Simo-Serra and Francesc Moreno-Noguer from IRI teamed with Sanja Fidler and Raquel Urtasun from the University of Toronto to make the project a reality.


The model was built using a dataset of fashion-themed posts found on the Internet. Images of fashionable clothing and the captions they were given were cross-referenced against the social media response to each entry. Individual garments were judged as fashionable or not fashionable based on the number of likes they received on social media.
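As a rough illustration of that labeling step — not the authors’ actual pipeline, whose details are in the paper — here is a minimal sketch in which the post data, field names, and like-count threshold are all hypothetical assumptions:

```python
# Hypothetical sketch: derive "fashionable" labels from social media likes.
# The dataset, field names, and threshold are illustrative assumptions,
# not the actual method described in the paper.

from dataclasses import dataclass

@dataclass
class Post:
    image_path: str   # photo of the outfit
    caption: str      # user-written description of the garments
    likes: int        # social media response to the post

def label_fashionability(posts: list[Post], like_threshold: int = 100) -> list[tuple[Post, bool]]:
    """Mark each post as fashionable (True) or not (False) based on its like count."""
    return [(post, post.likes >= like_threshold) for post in posts]

# Example usage with made-up data:
sample = [
    Post("outfit_01.jpg", "vintage denim jacket, white sneakers", likes=250),
    Post("outfit_02.jpg", "neon tracksuit", likes=12),
]
for post, is_fashionable in label_fashionability(sample):
    print(post.caption, "->", "fashionable" if is_fashionable else "not fashionable")
```

In practice, a model like the one described would then learn to predict these labels from the images and captions themselves, rather than relying on the like counts directly.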

Basically, the method boils down to posting a picture of an outfit on Facebook and seeing whether people love it or hate it — which is, in and of itself, a decent summation of how fashion works. However, the project can claim some authority on what to wear thanks to the scale of its sample, with data from more than 144,000 posts taken into consideration.

The intention is that the system will be able to advise users on fashion choices thanks to the data it has accumulated. What remains to be seen is whether anyone will subject themselves to having a computer give them tips on how to dress in the morning. If the idea of fashion can really be boiled down to a statistical algorithm, wouldn’t we all end up wearing the exact same outfit?

Brad Jones
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…