Wearable Monitors Assess Our Body Language

It’s often claimed that roughly 60% of communication happens through our body language, and whilst much is already understood about how we use our bodies to communicate, a team from the University of Cambridge and Dartmouth College believe the wearable device they’ve developed will shed even more light on the subject.

The system, known as Protractor, uses light to record how we use body language to communicate. It does this by measuring body angles and the distances between people.

Of course, body language has been widely studied in the past, but most of that work has relied on video and audio recordings. The researchers believe Protractor improves on those approaches by doing away with intrusive cameras and placing little burden on users.

“Our system is a key departure from existing approaches,” they say. “The ability to sense both body distance and relative angle with fine accuracy using only infrared light offers huge advantages and can deepen the understanding on how body language plays a role in social interactions.”

Hidden communication

The system consists of a lightweight, wearable tag, akin to an access badge, that uses near-infrared light to monitor non-verbal behavior with a high degree of granularity. Near-infrared was chosen because ordinary visible light is easily blocked by the human body, which makes accurate sensing of face-to-face interactions difficult. Infrared is also invisible to the human eye, so its use doesn’t distract the people being monitored.
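To make the sensing idea concrete, here is a minimal sketch of how distance and relative angle could in principle be recovered from received infrared intensity. The inverse-square falloff model, the two-photodiode layout at ±45°, and all function names are assumptions for illustration, not the authors' actual algorithm.

```python
import math

def distance_from_intensity(p_rx, p_tx, k=1.0):
    """Assumed inverse-square model: p_rx = k * p_tx / d**2, so
    d = sqrt(k * p_tx / p_rx). k lumps together emitter/receiver gains."""
    return math.sqrt(k * p_tx / p_rx)

def angle_from_diode_pair(p_left, p_right):
    """Hypothetical two-photodiode setup, diodes oriented at +/-45 degrees.
    Each diode's response is proportional to cos(theta -/+ 45 deg), so the
    ratio of the two readings cancels the unknown distance and yields
    tan(theta) = (r - 1) / (r + 1). Returns the bearing in degrees."""
    r = p_left / p_right
    return math.degrees(math.atan((r - 1.0) / (r + 1.0)))
```

For instance, a receiver reading one quarter of the emitted power would sit at distance 2 under this model, and equal readings on both diodes indicate a head-on (0°) orientation.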

The sensor was combined with AI that detects when the light channel is temporarily blocked and falls back on inertial measurements to work around the missing optical data. The team also worked hard to limit power consumption and to enable the system to accurately identify participants.
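The fallback idea can be sketched as a simple fusion rule: trust the optical channel when it is present, and dead-reckon from inertial displacement when it drops out. This is an illustrative stand-in for the researchers' model; the function and its arguments are invented for the example.

```python
def fuse_distance(ir_distance, last_distance, inertial_delta):
    """Return a distance estimate, preferring the optical channel.

    ir_distance    -- latest infrared-derived distance, or None if the
                      light path is currently blocked
    last_distance  -- most recent accepted estimate
    inertial_delta -- displacement since then, integrated from the
                      inertial sensor (assumed available on the tag)
    """
    if ir_distance is not None:
        return ir_distance                    # optical reading available
    return last_distance + inertial_delta     # dead-reckon through the gap
```

A real system would also need to re-synchronize the inertial estimate once the light channel returns, since integrated inertial data drifts over time.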
“By modulating the light from each Protractor tag to encode the tag ID, each tag can then figure out which individuals are participating. To increase energy efficiency, we also adapt the frequency of emitting light signals based on the specific context,” the researchers explain.
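A toy sketch of the two ideas in that quote follows: encoding a tag ID as on/off light pulses, and slowing the beacon rate when no interaction is happening. The on-off-keying scheme, preamble, and timing values are assumed details, not taken from the Protractor design.

```python
def ook_frame(tag_id, bits=8):
    """Encode a tag ID as a light-pulse frame using on-off keying (OOK):
    a short fixed preamble followed by the ID bits, most significant first.
    1 = emitter on, 0 = emitter off."""
    payload = [(tag_id >> i) & 1 for i in reversed(range(bits))]
    return [1, 0, 1, 0] + payload

def emit_interval(peers_nearby, active_s=0.1, idle_s=1.0):
    """Context-adaptive beaconing (assumed values): emit frequently while
    other tags are detected nearby, and back off to save energy when alone."""
    return active_s if peers_nearby else idle_s
```

Receiving tags that decode the ID from the pulse pattern can then attribute each distance and angle measurement to a specific participant.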

Problem solving

The system was put through its paces on a group problem-solving task: the Marshmallow Challenge, which requires teams to build a structure that can support a marshmallow using only tape, string and a few sticks of spaghetti.

“Beyond simply observing body language with the tags, we identified the task role each group member was performing and delineated each stage in the building process through the recorded body angle and distance measurements,” the researchers say.

The technology was able to accurately estimate both the interaction distances between participants and their body orientations. It was also able to identify each individual’s task role with an accuracy of around 85%, and even the particular stage of the building process with 93% accuracy.

Suffice it to say, the project is at a very early stage, but the team are confident that it can eventually not only support social research but also provide real-time feedback during key interactions, such as job interviews or team activities. It will be interesting to see whether this kind of technology starts to find its way into the workplace.