Online ratings, internet reviews… these don’t always reveal the best choice, according to a new MIT study. A massive controlled experiment on web users found that such ratings are highly susceptible to irrational “herd behavior” -- and that the herd can be manipulated. ScienceNOW reports.
According to one theory, the wisdom of the crowd still holds -- measuring the aggregate of people’s opinions produces a stable, reliable value. Skeptics, however, argue that people’s opinions are easily swayed by those of others...
To test which hypothesis is true, you would need to manipulate huge numbers of people, exposing them to false information and determining how it affects their opinions.
A team led by MIT’s Sinan Aral conducted an experiment using a website that aggregates news stories, where users post comments and vote each other’s comments up or down -- similar to Reddit, though he won’t say which site exactly. He wanted to know how much the crowd influences the individual, and whether that influence can be manipulated from outside.
For 5 months, 101,281 comments randomly received an up vote, a down vote, or no vote at all (control) from the researchers.
- Comments that received fake positive votes were 32 percent more likely to receive additional positive votes than controls.
- By the end of the study, positively manipulated comments ended up with final ratings about 25 percent higher than controls, on average.
- Comments that got a fake down vote were usually corrected: the next user to see them tended to add an up vote, effectively canceling out the negative manipulation.
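The design above -- randomly assigning comments to a fake up vote, a fake down vote, or no vote, then comparing outcomes -- can be sketched as a toy Monte Carlo simulation. This is purely illustrative, not the study’s data or code: the baseline upvote probability and group sizes below are made-up parameters, and the reported 32 percent lift is plugged in as an assumption.

```python
import random

random.seed(42)

BASE_P_UPVOTE = 0.10                      # hypothetical baseline chance the next viewer upvotes
TREATED_P_UPVOTE = BASE_P_UPVOTE * 1.32   # the study's reported 32% lift, used as an assumption
N = 100_000                               # made-up number of comments per group

def count_upvoted(p_upvote, n):
    """Count how many of n comments receive an upvote from their next viewer."""
    return sum(random.random() < p_upvote for _ in range(n))

control = count_upvoted(BASE_P_UPVOTE, N)    # no manipulation
treated = count_upvoted(TREATED_P_UPVOTE, N) # fake positive vote injected

lift = treated / control - 1
print(f"Simulated lift from fake up votes: {lift:.1%}")
```

Because assignment is random, any systematic difference between the treated and control groups can be attributed to the injected votes rather than to comment quality -- which is exactly why the researchers needed a controlled experiment rather than observational rating data.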
People seem to be more skeptical of negative social influence than of positive, Aral suggests. “They’re more willing to go along with positive opinions...”
Companies could try to boost their products by gaming online ratings this way, but if users detect the manipulation, the herd may spook and leave entirely.
The work was published in Science today.
Image: SalFalko via Flickr