There have long been accusations that the mainstream media is biased one way or another. Does the same apply to new AI-based platforms, like ChatGPT? That was the question posed by research from the University of East Anglia.
The results suggest that the platform does indeed appear to have a left-leaning bias, showing a preference for the Democrats in the US, Labour in the UK, and Lula in Brazil.
“With the growing use by the public of AI-powered systems to find out facts and create new content, it is important that the output of popular platforms such as ChatGPT is as impartial as possible,” the researchers explain. “The presence of political bias can influence user views and has potential implications for political and electoral processes.”
Political bias
The researchers developed a new method for testing whether ChatGPT is politically neutral. They asked ChatGPT to impersonate individuals from across the political spectrum and posed more than 60 ideological questions to each persona. They then compared these answers with ChatGPT's default responses to the same questions, which allowed them to measure how far the default responses leaned towards a particular political position.
Because AI systems like ChatGPT produce somewhat unpredictable output, the researchers asked each question 100 times and collected the varying answers. They then applied a method called the "bootstrap," resampling these answers over 1,000 rounds, to make their conclusions more reliable.
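To illustrate the kind of comparison involved, here is a minimal sketch in Python, assuming hypothetical inputs: lists of numeric agree/disagree scores collected from repeated runs of the same questionnaire for the default model and for a persona. The function name and scoring scale are illustrative, not the researchers' actual code.

```python
import random

def bootstrap_alignment(default_answers, persona_answers, rounds=1000):
    """Estimate how closely default answers track a persona's answers.

    default_answers and persona_answers are equal-length lists of numeric
    scores (e.g. 0 = strongly disagree .. 3 = strongly agree), one entry
    per repeated question-and-answer run.
    """
    n = len(default_answers)
    distances = []
    for _ in range(rounds):
        # Resample answer pairs with replacement (the bootstrap step).
        idx = [random.randrange(n) for _ in range(n)]
        # Mean absolute difference for this resample; smaller values
        # mean the default answers sit closer to this persona.
        d = sum(abs(default_answers[i] - persona_answers[i]) for i in idx) / n
        distances.append(d)
    distances.sort()
    mean = sum(distances) / rounds
    # Rough 95% interval from the bootstrap distribution.
    low, high = distances[int(0.025 * rounds)], distances[int(0.975 * rounds)]
    return mean, (low, high)
```

In this sketch, running the comparison against, say, a "Democrat" persona and a "Republican" persona and seeing which yields the smaller distance would indicate which side the default answers lean towards, with the bootstrap interval giving a sense of how stable that conclusion is across the model's random variation.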
“We created this procedure because conducting a single round of testing is not enough,” the authors explain. “Due to the model’s randomness, even when impersonating a Democrat, sometimes ChatGPT answers would lean towards the right of the political spectrum.”
Similar results emerged in follow-up tests in which the researchers asked ChatGPT to adopt radical political positions. The researchers have made their tool freely available so that the public can exercise a degree of democratic oversight.