Artificial intelligence lie detector developed by Imperial alumnus
Technology that uses artificial intelligence to detect lies via tiny changes in facial expressions has been developed by an Imperial alumnus.
Wrongly accused and imprisoned for a crime you didn’t commit. It sounds like the plot of a generic crime thriller. However, this scenario does happen from time to time in the UK. From the Birmingham Six, falsely imprisoned for sixteen years, to the more recent case of Barri White, who was wrongly jailed for the murder of his girlfriend Rachel Manning, these cases can strike the public as tragic miscarriages of justice.
However, what if you could stop these miscarriages of justice from happening? Imperial alumnus Dr James O’Shea, who graduated with a Bachelor of Science in Chemistry in 1976, has built a lie-detection device called ‘Silent Talker’ that he believes could help to improve criminal investigations.
While lie detector tests of any sort are not currently admissible as evidence in British courts, Dr O’Shea believes Silent Talker could be an invaluable tool in helping law enforcement to focus their investigations.
Dr O’Shea says: “An original member of my team who helped to develop the Silent Talker was very close to the area where one of the attacks by the Yorkshire Ripper took place. She took an interest in the case and found that the Ripper had been interviewed and passed over several times by the police. If the police had Silent Talker back then, it may have helped them to determine that they needed to spend a little more time on this guy, and investigate his background more closely.”
Artificially intelligent
The Silent Talker consists of a digital video camera that is hooked up to a computer. It runs a series of programs called artificial neural networks. These are computational models that take their design from animals’ central nervous systems, acting like an autonomous ‘brain’ for the device.
The artificial brain runs a type of artificial intelligence called machine learning. This enables Silent Talker to learn and recognise patterns in data, adapting and updating itself continually over the course of an interview. In this way it builds up an overall profile of the subject and identifies whether they are lying or telling the truth.
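To make the idea concrete, here is a minimal sketch, in Python, of the general kind of pattern classifier described above: a small feed-forward neural network trained on labelled examples. Everything in it is an illustrative assumption; the micro-gesture ‘channels’, the synthetic data and the network shape are invented for the example and do not reflect Silent Talker’s actual, proprietary design.

```python
import numpy as np

# Illustrative sketch only: a tiny feed-forward neural network of the
# general kind the article describes. Feature channels and data are
# invented; this is not Silent Talker's implementation.

rng = np.random.default_rng(0)

# Hypothetical input: 8 micro-gesture "channels" measured over one
# answer window (e.g. blink rate, gaze aversion, brief smile onset).
n_features, n_hidden = 8, 16

# Synthetic labelled training data: 1 = deceptive, 0 = truthful.
X = rng.normal(size=(200, n_features))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(float)  # toy ground truth

# One hidden layer with tanh units, plus a sigmoid output unit.
W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
b1 = np.zeros(n_hidden)
w2 = rng.normal(scale=0.1, size=n_hidden)
b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    p = 1 / (1 + np.exp(-(h @ w2 + b2)))  # probability of "deceptive"
    return h, p

# Plain gradient descent on a cross-entropy loss: one standard way a
# network "learns to recognise patterns in data".
lr = 0.1
for _ in range(500):
    h, p = forward(X)
    grad_out = (p - y) / len(X)               # d(loss)/d(output logit)
    grad_h = np.outer(grad_out, w2) * (1 - h**2)
    w2 -= lr * (h.T @ grad_out)
    b2 -= lr * grad_out.sum()
    W1 -= lr * (X.T @ grad_h)
    b1 -= lr * grad_h.sum(axis=0)

# Score a new (synthetic) answer window.
_, p_new = forward(rng.normal(size=(1, n_features)))
print(f"estimated probability of deception: {p_new[0]:.2f}")
```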
But how does it know when someone is lying? The inventors of the device claim it’s written all over your face. The camera records the subject in an interview and the artificial brain identifies non-verbal ‘micro-gestures’ on people’s faces. These are unconscious responses that Silent Talker picks up on to determine if the interviewee is lying.
Examples of micro-gestures include signs of stress, mental strain and what psychologists call ‘duping delight’: the unconscious flash of a smile at the pleasure and thrill of getting away with telling a lie. Dr O’Shea says these ‘tells’ are extremely fine-grained and exceedingly difficult for the interviewee to control.
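In the same illustrative spirit, the ‘overall profile’ the system builds up could be pictured as a simple aggregation of per-answer scores across the interview. The threshold, field names and scores below are hypothetical, chosen only to show the shape of such an aggregation.

```python
from statistics import mean

def interview_profile(window_scores, threshold=0.5):
    """Aggregate per-answer deception scores (0..1) into a simple
    interview-level summary. The 0.5 cut-off and the summary fields
    are illustrative assumptions, not Silent Talker's actual method."""
    avg = mean(window_scores)
    flagged = [i for i, s in enumerate(window_scores) if s > threshold]
    return {
        "mean_score": round(avg, 2),
        "verdict": "deceptive" if avg > threshold else "truthful",
        "flagged_answers": flagged,
    }

# Example: scores a classifier like the one above might emit for
# six answers in a single interview.
print(interview_profile([0.2, 0.8, 0.3, 0.9, 0.7, 0.4]))
```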
Coming to an interview near you
Dr O’Shea says the uses for such a device are numerous.
“One can imagine a near-future scenario in which your prospective employers are wearing Google Glasses, where every micro-gesture that ‘leaks’ from your face is a response that flashes by their eyes as ‘true’ or ‘false’ in real-time.”
While it uses the latest computational techniques, Dr O’Shea says Silent Talker is not infallible. In tests classifying micro-gestures as deceptive or non-deceptive, Silent Talker has achieved an accuracy rate of 87 per cent.
However, this has not stopped prospective clients from clamouring for the device. Dr O’Shea and his colleagues have already been approached by security services asking whether Silent Talker could be used to determine if people approaching a military checkpoint might be suicide bombers, so that they could be eliminated before reaching their target. The team’s answer has been a loud and emphatic ‘no’.
“In an ethical sense, such decisions should not be taken by a machine,” says Dr O’Shea.
Imperial experience
Dr O’Shea credits his time at Imperial College London with giving him the educational background he needed to succeed in academia at an economically difficult time. Originally trained as a chemist, Dr O’Shea had his hand forced by the recession in that industry during the 1970s, prompting his transition to computer science.
“Chemistry is an excellent all-round preparation for a scientific career and I have had two computing heads of department who started with a chemistry degree.”