Improving human-machine interactions

The interface between humans and machines is an increasingly active area of research, and a team of US Army scientists have made the latest contribution to the field.

The team developed the Privileged Sensing Framework (PSF), which aims to leverage the latest advances in human sensing technologies and to integrate human and autonomous agents on the basis of their individual strengths.  For instance, whilst humans are great at adapting to a changing environment, machines excel at processing huge quantities of data.

The project aimed to show how the framework preserves the human as the main authority, whilst also providing everything the machines need to thrive.

“The research was fundamentally enabled by a critical move towards a novel control systems framework that can account for dynamic interactions among information components that impact the value of that information and yet appropriately propagates into robust overall decisions. The PSF provides an evolved approach to HAI that treats the human as a special class of sensor rather than as the ultimate and absolute command arbiter,” the researchers say.

They reveal that the PSF significantly improved both human and machine performance in a range of simulations, without either sacrificing the things they excel at.  The framework is based on the concept of ‘privileging’ information, whereby specific rights are assigned to specific agents based upon their capabilities in a particular task.
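The privileging idea can be pictured as capability-weighted decision-making: each agent's report counts in proportion to how capable that agent is at the task in question. The sketch below is purely illustrative, with invented agent names, weights and threshold; it is not the researchers' actual implementation of the PSF.

```python
# Hypothetical sketch of "privileging": weighting each agent's report by its
# capability for the task at hand. All names and numbers are invented for
# illustration, not drawn from the PSF itself.

def privileged_decision(reports, capabilities):
    """Combine agent reports (0-1 confidence that, say, a target is present),
    weighting each by that agent's capability score for this task."""
    total = sum(capabilities.values())
    score = sum(reports[a] * capabilities[a] for a in reports) / total
    return score > 0.5  # decide when capability-weighted confidence is high

# Illustrative scores: the machine is strong at raw detection,
# the human is strong at contextual judgement.
capabilities = {"human": 0.7, "machine": 0.9}
reports = {"human": 0.8, "machine": 0.4}

print(privileged_decision(reports, capabilities))  # True
```

Here the human's confident report outweighs the machine's hesitant one, even though the machine carries the higher capability score, because the weighted average still clears the threshold.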

“Additional studies have extended this approach into a wide range of applications that include joint human-autonomy driving, human-autonomy target detection, and command and control. Overall, these efforts provide further evidence that the incorporation of the principles of the PSF can provide improved performance of joint human-autonomy systems across a wide range of applications,” the researchers say.

The team will continue to develop the method and further test its impact on human-machine system performance, in the hope of making the framework applicable to a wide range of different tasks and scenarios.

Different approaches

They certainly aren’t the only ones working on this issue, however.  For instance, a European team are working on prototypes that allow robots to anticipate human actions.

The An.Dy project aims to advance human-robot collaboration by empowering the machines to better understand what humans are about to do, and how the machine can help them.

The heart of the project is the AndySuit, a high-tech suit studded with an array of sensors that track movement, limb acceleration and muscle power as the wearer performs a range of tasks, either alone or in conjunction with a humanoid robot.

The data from the suit is then used to train the robot so that it better understands human behavior and can hopefully predict what the wearer is about to do next, allowing it to support them appropriately.
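The train-then-predict step can be sketched in miniature: given labelled examples of sensor readings, classify a new reading as the nearest known action. The feature names, tiny training set and nearest-neighbour approach below are all invented for illustration; the An.Dy project's actual learning pipeline is far richer.

```python
# Illustrative sketch only: classifying a wearer's current motion so a robot
# can anticipate the next action. Features, labels and the 1-nearest-neighbour
# method are assumptions for the sake of example.

def nearest_action(sample, training):
    """Return the action label of the closest training example
    (1-nearest-neighbour on the feature tuple)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda ex: dist(ex[0], sample))[1]

# (features, label): features are (limb acceleration, muscle power), scaled 0-1.
training = [
    ((0.9, 0.8), "lifting"),
    ((0.2, 0.1), "resting"),
    ((0.6, 0.3), "reaching"),
]

print(nearest_action((0.85, 0.7), training))  # lifting
```

A reading close to the "lifting" example is classified as lifting, which is the cue a robot would use to, for instance, brace an exoskeleton or hand over the right tool.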

“The robot would recognise a good posture and a bad posture and would work so that it gives you an object in the right way to avoid injury,” the researchers say.

The robot is to be tested in three different scenarios in order to bring it closer to market.  The first of these is a workspace where a human works alongside the robot; the second is one where the human wears an exoskeleton, such as those used when lifting heavy loads; whilst the third is one where a humanoid robot offers assistance, and potentially takes turns performing tasks.

These are just a couple of examples from the growing number of projects that aim to improve the ability of man and machine to work effectively alongside each other.  You can learn more about An.Dy via the video below.