Deep learning in autism treatment blurs line between machine and human

As someone with autism, I have always found myself drawn to the logic-oriented characters of science fiction movies, such as Star Wars’ C-3PO, Sonny from I, Robot, or Baymax from Big Hero 6. They tend to be extremely knowledgeable about certain subjects, yet they struggle to understand aspects of human social behavior such as body language and sarcasm, a difficulty shared by many individuals like me.

Recent advances in artificial intelligence have been addressing such challenges, and they may lead to robots becoming more human than most science fiction predicts.

A form of artificial intelligence called deep learning is being used to read and interpret human body language and facial expressions, and software built on it could help individuals with autism become more sociable.
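
To give a concrete sense of what "reading facial expressions with deep learning" means in practice, here is a minimal sketch of the general technique: a small convolutional network that maps a face image to a predicted emotion label. This is purely illustrative; the framework (PyTorch), the emotion categories, the input size, and the network layout are all my own assumptions, not the actual system described in this article.

```python
# Illustrative sketch only: a tiny convolutional network that classifies
# 48x48 grayscale face crops into basic emotion categories. This is the
# general kind of model used for facial-expression recognition, not the
# actual software discussed in the article.
import torch
import torch.nn as nn

# Hypothetical emotion labels; real systems choose their own categories.
EMOTIONS = ["happy", "sad", "angry", "surprised", "fearful", "disgusted", "neutral"]

class EmotionNet(nn.Module):
    def __init__(self, num_classes=len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 48x48 -> 48x48
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 24x24
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 12x12
        )
        # Map the pooled feature map to one score per emotion.
        self.classifier = nn.Linear(32 * 12 * 12, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# A random tensor stands in for one camera frame of a cropped face;
# a trained model would receive real images and learned weights.
frame = torch.randn(1, 1, 48, 48)
scores = EmotionNet()(frame)
print(EMOTIONS[scores.argmax(dim=1).item()])
```

In a wearable setting, a trained model like this would run on each camera frame and surface the predicted emotion to the user in real time.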

Researchers at Stanford University are developing such software for Google Glass, a wearable computer worn like a pair of glasses. The software uses interactive applications that train wearers to identify emotions in their environment. According to Annett Hahn Windgassen of the San Francisco area, her son Erik became more engaged with his peers in 2016 thanks to this technology.

Read the full article at Hypergrid Business.