Isaac Asimov’s First Law of Robotics states that a robot may not injure a human being or, through inaction, allow a human being to come to harm. But that does not mean a computer can’t tell us whether a person is in pain, and then neatly rank that pain on some objective scale, like a computer science textbook written by the author of Fifty Shades of Grey.
The work in question was carried out by researchers at the Massachusetts Institute of Technology (MIT). They developed an artificial intelligence that can predict how much pain a person is in by looking at an image. The system, called “DeepFaceLIFT,” is a machine-learning algorithm trained on videos of people wincing or showing other signs of discomfort. From these videos, paired with the subjects’ self-reported pain scores, it learned the subtle facial micro-expressions that indicate how much pain a person is in. The algorithm can also be tuned to a person’s age, sex, and complexion, which makes it considerably more accurate than previous one-size-fits-all approaches.
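The core idea, stripped of the deep learning, is a supervised regression problem: map facial-expression features, plus demographic features for personalization, to self-reported pain scores. Below is a minimal sketch of that setup using simulated data and an ordinary least-squares fit; the feature values, weights, and the linear model itself are illustrative assumptions, not the actual DeepFaceLIFT architecture, which works on video with neural networks.

```python
import numpy as np

# Simulated stand-ins for the real inputs: per-frame facial
# micro-expression features plus demographic features (age, sex)
# used to personalize the model. All values here are made up.
rng = np.random.default_rng(42)
n, n_face = 200, 8

face = rng.random((n, n_face))                 # micro-expression features
age = rng.random((n, 1))                       # age, normalized to [0, 1]
sex = rng.integers(0, 2, (n, 1)).astype(float)
X = np.hstack([face, age, sex])

# Simulated self-reported VAS labels on a 0-10 scale: a noisy
# function of the features, playing the role of the training signal.
w_true = rng.random(X.shape[1])
y = 10.0 * (X @ w_true) / w_true.sum() + rng.normal(0.0, 0.2, n)

# Fit ordinary least squares with a bias term: features -> pain score.
Xb = np.hstack([X, np.ones((n, 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

pred = Xb @ w
mae = float(np.mean(np.abs(pred - y)))
print(f"mean absolute error on training data: {mae:.2f}")
```

Including age and sex as input features is what lets a single model behave differently for different people, which is the "personalized" part of the MIT approach.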
MIT’s machine-learning project might sound sadistic, but it has useful potential real-world applications. At present, the so-called “gold standard for pain measurement” is the visual-analog scale (VAS) pain metric. While useful, the VAS is entirely self-reported, which makes it both subjective and context-dependent, and its range can vary significantly between different people. An algorithm is unlikely to ever fully replace these kinds of self-reported systems for a variety of reasons (imagine telling a patient in hospital that you’re denying medication because the computer says they are not exhibiting the right pained expression!), but it could be a useful clinical tool in the quest to make pain reporting more objective. It may be especially valuable in determining whether a person is being honest about their pain levels, rather than faking it.
There is still work to be done on the project, but the hope is to eventually develop it into a mobile app that could be accessed by physicians.
A paper describing the project was recently published in the Journal of Machine Learning Research.