Creating machines capable of learning has been a long-standing challenge for the AI industry. Various projects have aimed to develop technology that can pick up skills from watching videos, but a recent project from researchers at Aalto University, the University of Birmingham, and the University of Oslo proposes a system that learns by watching human behavior.
The system was able to model the behavior of people simply by watching them as they chose items from a menu. The breakthrough is a crucial step toward helping machines understand not just what humans do, but why they do it.
Cognitive models that describe our individual capabilities and goals potentially offer a better explanation of behavior, and therefore the predictive capability to anticipate what humans will do in particular circumstances. Creating robust models, however, has been easier said than done.
“The benefit of our approach is that a much smaller amount of data is needed than for ‘black box’ methods. Previous methods for performing this type of tuning have either required extensive manual labor, or a large amount of very accurate observation data, which has limited the applicability of these models until now,” the researchers say.
Understanding human behavior
The approach taken by the researchers is based on Approximate Bayesian Computation (ABC), a machine-learning approach used to infer complex models from simple observations. It’s used in notoriously complex areas of science, such as climate modeling, and the researchers hope its application here paves the way for automatic inference of complex human behaviors from simple observations, which is itself a crucial step toward more effective human-machine interaction.
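To make the idea concrete, here is a minimal, hypothetical sketch of ABC via rejection sampling. It is not the researchers’ actual model; it uses a toy problem (inferring a coin’s bias from an observed number of heads) to show the core mechanism ABC relies on: propose parameters from a prior, simulate data forward, and keep only parameters whose simulated data come close to the observation, with no likelihood function ever evaluated. All function names and tolerances below are illustrative assumptions.

```python
import random

def simulate(bias, n_flips, rng):
    """Forward model: flip a coin with the given bias; return the head count."""
    return sum(rng.random() < bias for _ in range(n_flips))

def abc_rejection(observed_heads, n_flips, n_samples=20000, tolerance=2, seed=0):
    """Rejection ABC: draw candidate biases from a uniform prior and keep
    those whose simulated data land within `tolerance` of the observation."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        candidate = rng.random()                       # prior: Uniform(0, 1)
        simulated = simulate(candidate, n_flips, rng)  # simulate, don't compute a likelihood
        if abs(simulated - observed_heads) <= tolerance:
            accepted.append(candidate)                 # sample from the approximate posterior
    return accepted

# Observe 70 heads in 100 flips; the accepted candidates approximate
# the posterior over the coin's bias, concentrated near 0.7.
posterior = abc_rejection(observed_heads=70, n_flips=100)
estimate = sum(posterior) / len(posterior)
print(f"posterior mean bias = {estimate:.2f}")
```

The same recipe scales up: replace the coin with a cognitive model of, say, menu search, and the head count with summary statistics of a user’s observed choices, and the accepted parameters describe that user.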
“We will be able to infer a model of a person that also simulates how that person learns to act in totally new circumstances,” the authors say. “We’re excited about the prospects of this work in the field of intelligent user interfaces.”
“In the future, the computer will be able to understand humans in a somewhat similar manner as humans understand each other. It can then much better predict not only the benefits of a potential change but also its costs to an individual, a capability that adaptive interfaces have lacked,” they continue.