You know that old saying about how it’s not what you say, but how you say it? Keymochi takes that idea and runs with it. Created as a research project by three Cornell Tech students, Hsiao-Ching Lin, Huai-Che Lu, and Claire Opila, Keymochi uses information about the way we type to try to gauge our emotions.
That data includes typing speed, punctuation changes, smartphone motion-sensor readings, time of day, number of backspaces, distance between keys, and, yes, a smattering of sentiment analysis to work out how you’re feeling. All of it is then fed into a machine-learning model that infers the typist’s emotional state, with results reportedly 82 percent accurate.
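The team hasn’t published its actual model or feature set, but the general shape of the pipeline described here, turning per-session typing statistics into a feature vector and training a classifier on labeled sessions, is easy to sketch. Everything below, from the feature names to the random-forest choice and the toy numbers, is an assumption for illustration, not Keymochi’s implementation:

```python
# Sketch of an emotion-from-typing classifier. Feature names and values
# are hypothetical; the actual Keymochi pipeline is not public.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# One row per typing session: [chars/sec, backspaces, punctuation changes,
# mean key distance, accelerometer variance, hour of day, sentiment score].
X = np.array([
    [5.1, 2, 1, 3.2, 0.04,  9,  0.6],
    [2.3, 9, 4, 5.8, 0.21, 23, -0.7],
    [4.8, 3, 1, 3.5, 0.05, 10,  0.4],
    [2.0, 8, 5, 6.1, 0.25, 22, -0.5],
    [5.5, 1, 0, 3.0, 0.03, 11,  0.8],
    [1.9, 7, 6, 6.4, 0.30,  1, -0.9],
])
y = np.array([1, 0, 1, 0, 1, 0])  # toy labels: 1 = positive mood, 0 = negative

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=3)
print(f"mean accuracy: {scores.mean():.2f}")  # the team reports ~82% on real data
```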
“For us, emotion-sensing is fascinating because it’s currently a missing part in today’s human-AI communications,” researcher Huai-Che Lu told Digital Trends. “Imagine a chatbot can recommend you a healing song when you are sad or suggest that you take some deep breaths when you are nervous.”
As Lu pointed out, detecting emotions from physiological signals is nothing new. There is an entire field dedicated to it, called affective computing, and one key aspect of it, known as sentic modulation, studies the physiological changes that accompany emotion, such as facial expression, voice intonation, and heart rate.
One of Keymochi’s key differentiators, beyond its focus on the comparatively under-studied problem of emotion detection via mobile keyboards, is that it aims to protect user privacy by doing all the data preprocessing on the device itself. The server receives only pseudonymous metadata, so Keymochi’s creators cannot reverse-engineer what users actually typed.
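The project hasn’t published its data format, but the privacy pattern it describes, extracting features on the device and uploading only derived statistics under a pseudonymous ID, can be sketched in a few lines. The field names, hashing scheme, and session structure below are assumptions made for illustration:

```python
# Sketch of the on-device preprocessing pattern: raw keystrokes never leave
# the phone, and only derived statistics plus a pseudonymous ID are uploaded.
# These field names and this hashing scheme are assumptions, not Keymochi's.
import hashlib
import statistics

def pseudonymous_id(device_secret: str) -> str:
    # One-way hash: the server can link a user's sessions together,
    # but cannot recover the user's identity from the ID.
    return hashlib.sha256(device_secret.encode()).hexdigest()[:16]

def preprocess_session(keystrokes: list[dict], device_secret: str) -> dict:
    # keystrokes: [{"char": "h", "t": 0.00}, {"char": "i", "t": 0.18}, ...]
    gaps = [b["t"] - a["t"] for a, b in zip(keystrokes, keystrokes[1:])]
    return {
        "user": pseudonymous_id(device_secret),
        "mean_gap_s": round(statistics.mean(gaps), 3),
        "backspaces": sum(1 for k in keystrokes if k["char"] == "\b"),
        # No characters are included, so the typed text can't be reconstructed.
    }

payload = preprocess_session(
    [{"char": "h", "t": 0.0}, {"char": "i", "t": 0.18}, {"char": "\b", "t": 0.5}],
    device_secret="example-device-secret",
)
print(payload)
```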
Unfortunately, the Keymochi app isn’t available to the public just yet. The keyboard is unlikely to find its way into the App Store anytime soon, although Lu said the plan is to release its components under open-source licenses. That would let researchers use the work to, for example, collect data (with users’ knowledge) in future HealthKit or CareKit research projects.
As for what is next, Lu said the team would like to build on this work.
“In the next semester, we’re planning to pivot a little bit by developing a productivity app to help people be more focused in their work based on the activity patterns on laptops,” he said. “Based on different focus levels, the app would suggest tasks that best suit your current status — like doing some coding when you are ‘in the flow,’ and writing some emails when you are drained, as well as toggling the ‘do-not-disturb’ mode automatically.”