The short answer is that they could understand emotions if equipped with the proper sensors and algorithms.
Physiological Components of Emotions
Emotions always have a physiological component. When we feel excited, our heart rate changes, our skin conductance increases, and our facial expressions become exaggerated. Many physical and physiological signatures of emotional states have been well studied and classified. For example, the Facial Action Coding System developed by Paul Ekman has been used by computer scientists to detect emotion since the 1970s.
Today, advances in photography and affective computing have made more accurate automated face analysis possible. Here is how it works. 1) The machine "attends" to facial signals through automated face detection and registration, "receiving" those signals with high-speed cameras. 2) It then extracts the key signals. In facial expression recognition, the signal components are called Action Units (AUs): the movements of individual facial muscles. The machine identifies them with algorithms such as principal component analysis, linear discriminant analysis, and support vector machine classifiers. 3) Based on the specific combination of AUs detected, the machine infers which emotion the recorded facial expression conveys.
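The final, AU-to-emotion step can be sketched in a few lines. The prototypes below are illustrative stand-ins rather than a validated FACS model, and the matching rule is a toy substitute for the trained classifiers (such as SVMs) mentioned above:

```python
# Minimal sketch of step 3: mapping detected Action Units to an emotion label.
# Hypothetical prototype AU patterns (AU number -> expected intensity 0..1).
EMOTION_PROTOTYPES = {
    "happiness": {6: 1.0, 12: 1.0},          # cheek raiser + lip corner puller
    "surprise":  {1: 1.0, 2: 1.0, 26: 1.0},  # brow raisers + jaw drop
    "sadness":   {1: 1.0, 4: 1.0, 15: 1.0},  # brow raiser/lowerer + lip depressor
}

def classify_emotion(detected_aus):
    """Return the emotion whose prototype best matches the detected AUs.

    detected_aus: dict mapping AU number -> measured intensity (0..1).
    Uses a simple overlap score; a real pipeline would use a trained classifier.
    """
    def score(prototype):
        # Sum of matched intensities, normalized by prototype size.
        return sum(min(detected_aus.get(au, 0.0), v)
                   for au, v in prototype.items()) / len(prototype)

    return max(EMOTION_PROTOTYPES, key=lambda e: score(EMOTION_PROTOTYPES[e]))

print(classify_emotion({6: 0.9, 12: 0.8}))  # -> happiness
```

A production system would score dozens of AUs per video frame and weight them with a learned model, but the structure is the same: a detected AU pattern is compared against known emotional signatures.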
For example, pain is characterized by brow lowering (AU4), orbital tightening (AU6 and AU7), eye closure (AU43), and nose wrinkling with upper lip raising (AU9 and AU10). After going through the preceding steps and detecting changes in these Action Units, the machine concludes that the person is experiencing pain.
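The pain rule above can be expressed as a simple co-occurrence check. The threshold and the number of AUs required are hypothetical choices for illustration; published pain metrics weight individual AUs differently:

```python
# Hedged sketch of the pain rule: flag pain when enough of the listed
# Action Units co-occur. Threshold values are illustrative, not clinical.
PAIN_AUS = {
    4: "brow lowering",
    6: "orbital tightening (cheek raiser)",
    7: "orbital tightening (lid tightener)",
    43: "eye closure",
    9: "nose wrinkling",
    10: "upper lip raise",
}

def looks_like_pain(au_intensities, threshold=0.5, min_active=4):
    """Flag pain when enough pain-related AUs exceed an intensity threshold.

    au_intensities: dict mapping AU number -> intensity 0..1.
    min_active is an invented cutoff for this demo.
    """
    active = [au for au in PAIN_AUS if au_intensities.get(au, 0.0) >= threshold]
    return len(active) >= min_active

print(looks_like_pain({4: 0.8, 6: 0.7, 7: 0.6, 43: 0.9}))  # True
```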
Other Key Factors
Physiological changes such as tone of voice, body movements, heart rate, and more can be measured with biometric sensors. Many devices already carry these sensors and collect the data, and the algorithms to analyze that data and draw conclusions about emotions have already been developed. It is just a matter of time before machines actively read our emotions and use this information in ways we hope will benefit us.
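As a rough illustration of how such sensor readings might be combined, here is a toy arousal estimate fusing heart rate and skin conductance. The baselines, weights, and the function itself are invented for this sketch; a deployed system would calibrate per person and per sensor:

```python
# Illustrative sketch only: fusing two hypothetical biometric readings into a
# coarse 0..1 arousal score. Baseline values are invented for the demo.
def arousal_score(heart_rate_bpm, skin_conductance_us,
                  rest_hr=65.0, rest_sc=2.0):
    """Average the relative elevation of each signal over its resting baseline."""
    hr_elev = max(0.0, (heart_rate_bpm - rest_hr) / rest_hr)
    sc_elev = max(0.0, (skin_conductance_us - rest_sc) / rest_sc)
    return min(1.0, (hr_elev + sc_elev) / 2)

calm = arousal_score(66, 2.1)      # near baseline -> low score
excited = arousal_score(95, 4.0)   # elevated signals -> higher score
print(calm < excited)  # True
```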