
New research shows people tend to trust computers more than humans

Washington [US], April 14 (ANI): Despite increasing concern over the intrusion of algorithms in daily life, people may be more willing to trust a computer program than their fellow humans, especially if a task becomes too challenging, according to new research from data scientists at the University of Georgia.

ANI | Apr 14, 2021 19:14 IST


The findings of the study were published in Scientific Reports, a Nature journal.
From choosing the next song on a playlist to finding the right size of pants, people are relying more on the advice of algorithms to help make everyday decisions and streamline their lives.
"Algorithms are able to do a huge number of tasks, and the number of tasks that they are able to do is expanding practically every day," said Eric Bogert, a Ph.D. student in the Terry College of Business Department of Management Information Systems.
Bogert added, "It seems like there's a bias towards leaning more heavily on algorithms as a task gets harder and that effect is stronger than the bias towards relying on advice from other people."
Bogert worked with management information systems professor Rick Watson and assistant professor Aaron Schecter on the paper, "Humans rely more on algorithms than social influence as a task becomes more difficult."
Their study, which involved 1,500 individuals evaluating photographs, is part of a larger body of work analysing how and when people work with algorithms to process information and make decisions.
For this study, the team asked volunteers to count the number of people in a photograph of a crowd and supplied two suggestions: one generated by a group of other people and one generated by an algorithm.
As the number of people in the photograph expanded, counting became more difficult, and people were more likely to follow the suggestion generated by an algorithm than to count themselves or follow the "wisdom of the crowd," Schecter said.
Schecter explained that the choice of counting as the trial task was an important one because the number of people in the photo makes the task objectively harder as it increases. It also is the type of task that laypeople expect computers to be good at.
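The setup can be sketched as a small simulation. Everything below is illustrative: the error model (noisier estimates as the crowd grows, with the algorithm the least noisy source) and the numbers are assumptions for the sketch, not the study's actual data or design.

```python
import random

def simulate_trial(true_count, rng):
    """One trial: a participant counts a crowd photo, then sees a crowd
    suggestion and an algorithmic suggestion and defers to whichever
    source they believe is most accurate (here, belief tracks accuracy)."""
    # Hypothetical error model: noise scales with crowd size, so larger
    # photos make every source less reliable, the participant most of all.
    own = true_count + rng.gauss(0, 0.10 * true_count)
    crowd = true_count + rng.gauss(0, 0.08 * true_count)   # "wisdom of the crowd"
    algo = true_count + rng.gauss(0, 0.03 * true_count)    # algorithmic estimate
    estimates = {"self": own, "crowd": crowd, "algorithm": algo}
    return min(estimates, key=lambda s: abs(estimates[s] - true_count))

rng = random.Random(42)
counts = {"self": 0, "crowd": 0, "algorithm": 0}
for _ in range(2000):
    counts[simulate_trial(true_count=150, rng=rng)] += 1
print(counts)
```

Under these assumed noise levels, the algorithmic suggestion wins the majority of trials, mirroring the paper's finding that reliance on the algorithm grows as the task gets harder.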
"This is a task that people perceive that a computer will be good at, even though it might be more subject to bias than counting objects," Schecter said. "One of the common problems with AI is when it is used for awarding credit or approving someone for loans. While that is a subjective decision, there are a lot of numbers in there -- like income and credit score -- so people feel like this is a good job for an algorithm. But we know that dependence leads to discriminatory practices in many cases because of social factors that aren't considered."
Facial recognition and hiring algorithms have come under scrutiny in recent years as well because their use has revealed cultural biases in the way they were built, which can cause inaccuracies when matching faces to identities or screening for qualified job candidates, Schecter said.
Those biases may not be present in a simple task like counting, but their presence in other trusted algorithms is a reason why it's important to understand how people rely on algorithms when making decisions, he added.
This study was part of Schecter's larger research program into human-machine collaboration, which is funded by a USD 300,000 grant from the U.S. Army Research Office.
"The eventual goal is to look at groups of humans and machines making decisions and find how we can get them to trust each other and how that changes their behavior," Schecter said. "Because there's very little research in that setting, we're starting with the fundamentals."
Schecter, Watson and Bogert are currently studying how people rely on algorithms when making creative and moral judgments, like writing descriptive passages and setting bail for prisoners. (ANI)



Copyright © aninews.in | All Rights Reserved.