A new study from the Hebrew University of Jerusalem found that people value empathy more when they believe it comes from a human, even when the actual response was generated by AI. The research shows that responses attributed to a human are perceived as more supportive, more emotionally resonant, and more caring than identical responses attributed to AI.

Led by Professor Anat Perry and her PhD student Matan Rubin, in collaboration with Professor Amit Goldenberg from Harvard University and Professor Desmond C. Ong from the University of Texas, the study involved over 6,000 participants across nine experiments. The researchers tested whether people perceived empathy differently depending on whether it was labeled as coming from a human or from an AI chatbot. In all cases, the responses were crafted by large language models (LLMs).

Participants consistently rated the "human" responses as more empathic, more supportive, and more emotionally satisfying than the identical "AI" responses. The preference for human responses was especially strong for replies that emphasized emotional sharing and genuine care, rather than just cognitive understanding.

Professor Anat Perry commented, "We're entering an age where AI can produce responses that look and sound empathic. But this research shows that even if AI can simulate empathy, people still prefer to feel that another human truly understands, feels with them, and cares."

The study found that participants were willing to wait days or weeks for a response from a human rather than receive an immediate reply from a chatbot. When participants believed an AI might have helped generate or edit a response they thought came from a human, their positive feelings diminished significantly. This suggests that perceived authenticity, the belief that someone genuinely invested time and emotional effort, plays a critical role in how we experience empathy.

Professor Perry noted, "In today's world, it's becoming second nature to run our emails or messages through AI. But our findings suggest a hidden cost: the more we rely on AI, the more our words risk feeling hollow. As people begin to assume that every message is AI-generated, the perceived sincerity, and with it, the emotional connection, may begin to disappear."

While AI shows promise for use in education, healthcare, and mental health settings, the study highlights its limitations. Professor Perry explained, "AI may help scale support systems, but in moments that require deep emotional connection, people still want the human touch."

The study offers key insights into the psychology of empathy and raises timely questions about how society will integrate emotionally intelligent AI into our daily lives.