Study: AI systems can judge people and form "trust"
Jerusalem (ANTARA) - A study from the Hebrew University of Jerusalem, released on Tuesday (14/4), found that modern artificial intelligence (AI) systems can judge people and form a kind of "trust", albeit differently from humans.
In the new study, published in Proceedings of the Royal Society A, the team investigated how AI evaluates individuals in contexts such as lending, hiring, and making recommendations.
The researchers analysed more than 43,000 simulated decisions and compared them with responses from around 1,000 human participants.
They found that both humans and AI tend to prefer people who appear competent, honest, and well-intentioned, indicating that AI captures some basic elements of "human trust".
However, the researchers also identified important differences. Humans typically form overall impressions based on a combination of traits, while AI systems break judgements down into separate factors, such as competence or integrity, and score each factor more rigidly, they said.
The study also found that AI can exhibit consistent biases, sometimes stronger than human ones, linked to factors such as age, gender, or religion, even when all other details are identical.
The researchers said the findings highlight the need for a better understanding of how AI makes decisions, given that such systems increasingly influence fields such as recruitment, finance, and healthcare.