Baynard Media

Some people love AI, others hate it. Here’s why.

By Editor | November 8, 2025

From ChatGPT crafting emails, to AI systems recommending TV shows and even helping diagnose disease, the presence of machine intelligence in everyday life is no longer science fiction.

And yet, for all the promises of speed, accuracy and optimisation, there’s a lingering discomfort. Some people love using AI tools. Others feel anxious, suspicious, even betrayed by them. Why?

The answer isn’t just about how AI works. It’s about how we work.

We don’t understand it, so we don’t trust it

Human beings are more likely to trust systems they understand. Traditional tools feel familiar: you turn a key, and a car starts. You press a button, and a lift arrives.


But many AI systems operate as black boxes: you type something in, and a decision appears. The logic in between is hidden. Psychologically, this is unnerving. We like to see cause and effect, and we like being able to interrogate decisions. When we can’t, we feel disempowered.

This is one reason for what’s called algorithm aversion, a term popularised by the marketing researcher Berkeley Dietvorst and colleagues, whose research showed that people often prefer flawed human judgement over algorithmic decision-making, particularly after witnessing even a single algorithmic error.

We know, rationally, that AI systems don’t have emotions or agendas. But that doesn’t stop us from projecting them onto these systems. When ChatGPT responds “too politely”, some users find it eerie. When a recommendation engine gets a little too accurate, it feels intrusive. We begin to suspect manipulation, even though the system has no self.

This is a form of anthropomorphism – that is, attributing humanlike intentions to nonhuman systems. Professors of communication Clifford Nass and Byron Reeves, along with others, have demonstrated that we respond socially to machines, even when we know they’re not human.

One curious finding from behavioural science is that we are often more forgiving of human error than machine error. When a human makes a mistake, we understand it. We might even empathise. But when an algorithm makes a mistake, especially if it was pitched as objective or data-driven, we feel betrayed.

This links to research on expectation violation, when our assumptions about how something “should” behave are disrupted. It causes discomfort and loss of trust. We trust machines to be logical and impartial. So when they fail, such as misclassifying an image, delivering biased outputs or recommending something wildly inappropriate, our reaction is sharper. We expected more.

The irony? Humans make flawed decisions all the time. But at least we can ask them “why?”


(Image: a student holding a phone with the ChatGPT app open, beside a laptop. Image credit: BongkarnGraphic/Shutterstock)

We hate when AI gets it wrong

For some, AI isn’t just unfamiliar; it’s existentially unsettling. Teachers, writers, lawyers and designers are suddenly confronting tools that replicate parts of their work. This isn’t just about automation; it’s about what makes our skills valuable, and what it means to be human.

This can activate a form of identity threat, a concept explored by social psychologist Claude Steele and others. It describes the fear that one’s expertise or uniqueness is being diminished. The result? Resistance, defensiveness or outright dismissal of the technology. Distrust, in this case, is not a bug – it’s a psychological defence mechanism.

Craving emotional cues

Human trust is built on more than logic. We read tone, facial expressions, hesitation and eye contact. AI has none of these. It might be fluent, even charming. But it doesn’t reassure us the way another person can.

This is similar to the discomfort of the uncanny valley, a term coined by Japanese roboticist Masahiro Mori to describe the eerie feeling when something is almost human, but not quite. It looks or sounds right, but something feels off. That emotional absence can be interpreted as coldness, or even deceit.

In a world full of deepfakes and algorithmic decisions, that missing emotional resonance becomes a problem. Not because the AI is doing anything wrong, but because we don’t know how to feel about it.

It’s important to say: not all suspicion of AI is irrational. Algorithms have been shown to reflect and reinforce bias, especially in areas like recruitment, policing and credit scoring. If you’ve been harmed or disadvantaged by data systems before, you’re not being paranoid, you’re being cautious.

This links to a broader psychological idea: learned distrust. When institutions or systems repeatedly fail certain groups, scepticism becomes not only reasonable, but protective.

Telling people to “trust the system” rarely works. Trust must be earned. That means designing AI tools that are transparent, interrogable and accountable. It means giving users agency, not just convenience. Psychologically, we trust what we understand, what we can question and what treats us with respect.

If we want AI to be accepted, it needs to feel less like a black box, and more like a conversation we’re invited to join.

This edited article is republished from The Conversation under a Creative Commons license. Read the original article.
