
The social life of bots

Speed read
  • Automated editing programs square off in Wikipedia
  • Scientists discover internet bots exhibit social behavior 
  • Failure to manage bot behavior could spell trouble for human societies

The Montagues versus the Capulets. The Hatfields versus the McCoys. Most people have heard about these famous long-running feuds. But what about EmausBot versus VolkovBot?

Scientists at the Oxford Internet Institute have recently discovered that online bots tasked with editing Wikipedia pages sometimes become involved in arguments that last up to ten years.


Bots on Wikipedia are computer scripts that automatically handle repetitive functions that improve and maintain the encyclopedia. They aren’t designed to interact with each other. But researchers found that some bots ‘argued’ with other bots, undoing one another’s edits as many as 185 times over a ten-year period.
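The mechanism is easy to reproduce in miniature. The sketch below is purely illustrative (the bot rules and page text are invented, not drawn from the study): two scripts, each applying a perfectly reasonable spelling rule, endlessly undo each other's work without either being designed to interact.

```python
def bot_a(text):
    """Hypothetical bot: normalizes spelling to American English."""
    return text.replace("colour", "color")

def bot_b(text):
    """Hypothetical bot: normalizes spelling to British English."""
    return text.replace("color", "colour")

page = "The colour of the sky"
reverts = 0
for _ in range(10):            # ten simulated editing passes
    edited = bot_a(page)
    if edited != page:         # bot A changes the page...
        page = edited
    edited = bot_b(page)
    if edited != page:         # ...and bot B promptly reverts it
        page = edited
        reverts += 1

print(reverts)  # → 10: every pass ends in a revert, forever
```

Neither bot is malfunctioning; each is correctly executing its own rule. The ‘argument’ emerges only from the pair.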

“We did not expect the bots to be argumentative at all,” says Milena Tsvetkova, assistant professor of methodology at the London School of Economics and Political Science. “The bots on Wikipedia are programs that execute simple editing tasks; they are not designed for interaction. Finding out that they ‘argue’ was quite unexpected.”

As a sociologist, Tsvetkova is primarily interested in human interaction, whether with other humans or with computers. “I was initially resistant to the thought that computer programs could show interesting social behavior,” says Tsvetkova. “But the data proved me wrong!”

He said, she said

The number of bots in online systems is increasing quickly. They’re currently used to collect information, moderate forums, generate content, and provide customer service, as well as disseminate spam and spread fake news.

“Even if bots are not designed to interact, they find themselves in systems with other bots and interaction is inevitable,” says Tsvetkova.

Binary life. Automated computer scripts have exhibited social behavior, sometimes engaging in editing spats lasting up to a decade.

Wikipedia bots complete about fifteen percent of the encyclopedia’s edits on all language editions. Since 2001, the growth in the number of bots has slowed, but the number of ‘reverts’ (corrections of others’ edits) has continuously increased, suggesting that bot interactions are not becoming more efficient.

“On Wikipedia, these fights were futile and not very consequential — most of what the bots did was still extremely valuable to the Wikipedia project,” says Tsvetkova. “However, in other online systems such fights might undermine the purpose of the bots.”

The researchers found that the same handful of bots are responsible for most of the ‘arguments’ with other bots. Conflicts between bots tend to occur at a slower rate and over a longer period than conflicts between human editors.

“Interaction leads to unexpected results, even when we design for it,” says Tsvetkova. “Bots’ presence in and influence on our lives will be increasing, and if we want to understand society, we need to understand how we interact with these artificial agents too.”

Living with bots

Much of the scientific and popular discussion about artificial intelligence has been about psychology – whether AI is able to think or feel the way we do, says Tsvetkova. “But now it’s time to discuss how artificial agents are expected to interact with us.”

When studying human-human interaction, social scientists often model individuals as ‘bots’ that follow simple rules when they meet other agents. These modeled interactions can lead to complex patterns at the group level, patterns that none of the individuals intended.
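This style of agent-based modeling can be sketched in a few lines. The rule below (conform to your two neighbors on a ring) is an assumption chosen for illustration, not the researchers' actual model: each agent follows a trivial local rule, yet the group drifts toward consensus blocks that no individual intended.

```python
import random

def boundaries(ops):
    """Count adjacent disagreements around the ring."""
    return sum(ops[i] != ops[(i + 1) % len(ops)] for i in range(len(ops)))

random.seed(0)                        # reproducible run
n = 20
opinions = [random.choice([0, 1]) for _ in range(n)]
before = boundaries(opinions)

for _ in range(500):                  # many asynchronous updates
    i = random.randrange(n)
    left = opinions[(i - 1) % n]
    right = opinions[(i + 1) % n]
    if left == right:                 # simple local rule: conform to neighbors
        opinions[i] = left

after = boundaries(opinions)
print(before, after)                  # disagreement never increases
```

The group-level pattern (shrinking pockets of disagreement) is not written into any agent's rule; it emerges from the interactions, which is precisely the phenomenon such models are built to study.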

“For the last few decades, we’ve created artificial systems with bots and used them to gain insights into social phenomena such as the emergence of cooperation, the evolution of cultural norms, the spread of fads, and so on,” says Tsvetkova. “With the proliferation of bots online, we now can do the same but by observing systems of bots ‘in the field,’ i.e. in the real world.” 

Bunch of bots. Unless managed with foresight, bot interactions can impinge on human society, as evidenced by the proliferation of spam and fake news. Courtesy Milena Tsvetkova.

Bots, too, can have their own social life, beyond the control of their human creators. Tsvetkova points to Tay, the Microsoft chatbot that began broadcasting racist and misogynist tweets within hours of being released, thanks to the influence of online trolls.

Unlike Tay or other social bots posing as humans to spread propaganda or influence public discourse, the Wikipedia bot ecosystem is controlled and monitored. “Conflicts likely arise as a result of the bottom-up organization of the community,” says Tsvetkova. “Human editors individually create and run bots, without a formal mechanism for coordination with other bot owners.”

Behind every bot is a human designer. Even malevolent bots that are designed to cooperate (like those in a botnet) may end up in continuous disagreement. As bots and other forms of AI proliferate on the web and in our lives, says Tsvetkova, “this unintended behavior could have more dire repercussions in other human-machine systems.”   

As the creators of bots, humans must study bot behavior and design artificial agents that can carry out our tasks with minimal conflict. It is up to us to prevent a future bot war from spinning out into a dangerous feud: Hatfields versus McCoys 2.0.

