Teaching machines to predict human behaviors

I’ve written a few times recently about various projects that are helping develop robots capable of learning by watching how a task is performed.  The machines typically observe something like a YouTube video in their attempt to pick up the skill, but the tasks have generally been quite manual in nature, such as the correct handling of kitchen utensils.

A recent study from researchers at MIT set out to test whether machines could use a similar method to pick up something far more intuitively human: anticipating how people will interact with one another.

Learning intuition

The researchers hoped to train machines to instinctively predict how an encounter between humans might unfold.  It’s the kind of intuition we develop subconsciously over a lifetime of experience.

The team, from the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT, developed an algorithm that they believe can predict these interactions better than any previous system.

The algorithm was trained by watching clips of television shows such as Desperate Housewives and The Office, and the team reports that it can now accurately predict whether two people are about to shake hands, hug, kiss, or even high-five.

“Humans automatically learn to anticipate actions through experience, which is what made us interested in trying to imbue computers with the same sort of common sense,” the researchers say. “We wanted to show that just by watching large amounts of video, computers can gain enough knowledge to consistently make predictions about their surroundings.”
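To give a rough sense of what such a system involves, here is a minimal, hypothetical sketch in PyTorch. It is not the CSAIL team’s actual model, and every name and dimension below is an illustrative assumption: it simply aggregates per-frame visual features from the moments leading up to an encounter and outputs a probability for each of the four interactions mentioned in the study.

```python
import torch
import torch.nn as nn

# The four outcomes described in the study.
INTERACTIONS = ["handshake", "hug", "kiss", "high five"]

class InteractionPredictor(nn.Module):
    """Hypothetical sketch of an interaction-anticipation classifier.
    Not the CSAIL model; architecture and sizes are assumptions."""

    def __init__(self, feature_dim=512, hidden_dim=256, num_classes=len(INTERACTIONS)):
        super().__init__()
        # Aggregate per-frame features from the seconds *before* the interaction.
        self.rnn = nn.GRU(feature_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, frame_features):
        # frame_features: (batch, num_frames, feature_dim)
        _, last_hidden = self.rnn(frame_features)      # (1, batch, hidden_dim)
        return self.classifier(last_hidden.squeeze(0))  # (batch, num_classes)

# Usage with random stand-in features for a 2-second clip sampled at 15 fps.
model = InteractionPredictor()
clip = torch.randn(1, 30, 512)
probs = torch.softmax(model(clip), dim=-1)
print({name: round(p.item(), 3) for name, p in zip(INTERACTIONS, probs[0])})
```

In practice, a model like this would be trained on many labeled clips, with the frames cut off just before the interaction happens so that the network learns to anticipate rather than merely recognize.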

It’s a fascinating study, both for the insight it offers into intuition itself, and for the further evidence that machines can pick up behaviors simply by watching videos of those behaviors in action.
