Primates have specialised parts of their brains dedicated to remembering faces. But while a monkey-brained computer can recognise static faces, reading the dynamic movements of a person's lips calls for something closer to HAL from Stanley Kubrick's 2001.
The universities of Geneva and Genova have the means. Computers can track facial motion, eyes and mouth included, decode it with algorithms grounded in physics and biology, and reproduce the lot as a realistic, textured 3D model of a human head. Mix that with culturally sensitive psychological analysis of facial expressions, add speech recognition and machine translation, and that videoconferencing session with your Chinese counterpart might get a whole lot easier.
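The article leaves the systems' internals unspecified, but the first stage, following facial feature points from frame to frame, can be sketched in Python with OpenCV. The webcam source, cascade file and parameter values below are assumptions for illustration, not the Geneva/Genova pipeline:

```python
# Minimal sketch of facial-motion tracking: detect a face once, then
# follow feature points inside it with optical flow. Assumes a webcam
# and a visible face; not the researchers' actual system.
import cv2
import numpy as np

# Haar cascade shipped with the opencv-python package.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam (assumption)
ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Find one face (assumes at least one is detected) and seed trackable
# corner points inside its bounding box.
(x, y, w, h) = face_cascade.detectMultiScale(gray, 1.3, 5)[0]
mask = np.zeros_like(gray)
mask[y:y + h, x:x + w] = 255
points = cv2.goodFeaturesToTrack(gray, maxCorners=60,
                                 qualityLevel=0.01, minDistance=7,
                                 mask=mask)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    next_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Lucas-Kanade optical flow: where did each point move this frame?
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(gray, next_gray,
                                                     points, None)
    points = new_points[status.flatten() == 1].reshape(-1, 1, 2)
    gray = next_gray
    for px, py in points.reshape(-1, 2):
        cv2.circle(frame, (int(px), int(py)), 2, (0, 255, 0), -1)
    cv2.imshow("facial motion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The tracked point motions are the raw material a fuller system would decode into expressions and drive a 3D head model with; that downstream decoding is where the physics- and biology-based algorithms come in.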
And the latest idea, from MIT's Andrew Wilson and the Georgia Institute of Technology's Aaron Bobick, is to have computers follow hand movements by modelling gestures in parameterised spaces. That way, it seems, computers will understand us better when we boast about the size of a fish.
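Wilson and Bobick's own models (parametric hidden Markov models, in their published work) are far richer; the toy sketch below only illustrates the underlying idea, that a single gesture family carries a continuous parameter such as spatial extent. The function name and trajectory format here are invented for illustration:

```python
# Toy illustration of a parameterised gesture: "the fish was THIS big"
# is one gesture family whose meaning carries a continuous parameter,
# namely how far apart the hands end up. All names are hypothetical.
import numpy as np

def gesture_extent(left_hand, right_hand):
    """Estimate the size parameter of a two-handed 'this big' gesture.

    left_hand, right_hand: (T, 2) arrays of tracked 2D hand positions
    over T video frames. Returns the peak hand separation, in the same
    units as the input.
    """
    separations = np.linalg.norm(right_hand - left_hand, axis=1)
    return separations.max()

# Synthetic example: hands start together and spread 50 units apart.
t = np.linspace(0.0, 1.0, 30)[:, None]
left = np.hstack([-25.0 * t, np.zeros_like(t)])
right = np.hstack([25.0 * t, np.zeros_like(t)])

print(f"Claimed fish size: {gesture_extent(left, right):.1f} units")
# -> Claimed fish size: 50.0 units
```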