An awkward solution for an awkward problem
14 Jul 2013 by Evoluted New Media
MIT researchers have developed software to analyse and help improve people’s social skills, but could it merely aggravate the problem?
If, like me, the thought of having to make small talk with someone makes you want to hide in a fume cupboard, researchers at MIT may have developed a potential solution for you.
If your introductions rapidly become awkward silences, or if travelling in a lift with someone brings you out in a cold sweat, you may one day want to acquaint yourself with software called MACH (short for My Automated Conversation coacH). MACH can be used to help people practise that employability buzzword – “interpersonal skills” – until they feel more comfortable with situations such as job interviews, first dates, haircuts, or waiting for the department kettle to boil with a particularly insufferable colleague.
“Interpersonal skills are the key to being successful at work and at home,” says M. Ehsan Hoque, an MIT Media Lab doctoral student who led the research. “How we appear and how we convey our feelings to others defines us. But there isn’t much help out there to improve on that segment of interaction.”
Hoque is researching whether it’s possible to interact with computers and robots the way we interact with humans. This requires modelling complex, multidimensional data, so Hoque tackles these problems by combining techniques from machine learning, computer vision and psychology. He’s developing methods to understand and recognise human non-verbal behaviour, and inventing new applications to improve people’s quality of life. He previously developed an automated system that can distinguish between real and fake smiles better than humans can.
His latest venture, MACH, is a system that lets people practise social interactions in face-to-face scenarios. It consists of a 3D character that can “see”, “hear” and make its own “decisions” in real time. It features speech- and face-recognition tools that allow it to pick up on your behaviour during a simulated heart-to-heart. These include facial expression processing to scrutinise your head gestures, eye contact and smiling intensity; prosody analysis to detect the dynamics, intonation and pauses in your speech; and speech recognition to determine how often you use filler non-words like “umm”. When your conversation with the character is over, the program lets you watch a video of the exchange before sending you countless graphs and stats detailing the quality of your interaction, so you can evaluate why you’re so socially inept.
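MACH’s actual analysis pipeline isn’t published in detail, but to give a flavour of the simplest of its metrics, here is a hypothetical Python sketch of a filler-word counter of the sort that might sit downstream of the speech recogniser (the word list and function are my own illustrative assumptions, not MIT’s code):

```python
# Illustrative sketch only: a toy filler-word counter. The filler list
# and the metric are assumptions for demonstration, not MACH's design.
import re

FILLERS = {"umm", "um", "uh", "er", "erm"}

def filler_stats(transcript: str) -> dict:
    """Count filler non-words and report them as a share of all words."""
    words = re.findall(r"[a-z']+", transcript.lower())
    fillers = sum(1 for word in words if word in FILLERS)
    return {
        "total_words": len(words),
        "filler_count": fillers,
        "filler_ratio": fillers / len(words) if words else 0.0,
    }

print(filler_stats("Umm, I think, uh, I'm a strong candidate"))
```

A real system would of course work from recognised speech rather than typed text, and would combine this with the facial and prosodic signals described above.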
The first validation of the system was in the context of job interviews with 90 MIT undergraduate students and career counsellors at the university. Students who interacted with MACH demonstrated statistically significant performance improvement compared to students in the control group who had not had a go with the software.
And the researchers say they are now expanding the technology to explore applications in remarkably common behavioural health conditions such as Asperger’s syndrome, social phobia and post-traumatic stress disorder.
If this technology really can help adults with genuine social anxiety overcome some of the condition’s symptoms, then perhaps it should be embraced. But trials will have to be conducted first with people who actually do suffer from these disorders. And I can’t help but acknowledge the irony of social awkwardness being ‘cured’ by spending more time alone, indoors, interacting via the computer in a scenario that in itself sounds pretty damn awkward.
It also makes me a bit uneasy envisioning what could happen in the future if this software becomes popular and is used by people who might not necessarily need it. Or if some people are led to believe their slight introversion is a true problem, rather than an endearing quirk. And isn’t there a risk of someone who’s practised with MACH coming off as utterly contrived? What could be more false than making sure to consciously inject eye contact, a smile or a nod at appropriate points in a conversation?
I imagine if I had a go with MACH and it told me what I was doing wrong conversationally, I’d merely become acutely paranoid about my specific social failings, which would surely be horribly distracting in high-pressure social situations. I think what I’d really prefer is a CGI robot that would train me to stop caring so much about what other people think. Perhaps MIT should take note?