
Being honest about using AI at work makes people trust you less, research finds

  • Written by Oliver Schilke, Director of the Center for Trust Studies, Professor of Management and Organizations, University of Arizona

Whether you’re using AI to write cover letters, grade papers or draft ad campaigns, you might want to think twice about telling others. That simple act of disclosure can make people trust you less, our new peer-reviewed article found[1].

As researchers who[2] study trust[3], we see this as a paradox. After all, being honest and transparent usually makes people trust you more. But across 13 experiments involving more than 5,000 participants, we found a consistent pattern: Revealing that you relied on AI undermines how trustworthy you seem.

Participants in our study included students, legal analysts, hiring managers and investors, among others. Interestingly, we found that even evaluators who were tech-savvy were less trusting of people who said they used AI. While having a positive view of technology reduced the effect slightly, it didn’t erase it.

Why would being open and transparent about using AI make people trust you less? One reason is that people still expect human effort in writing, thinking and innovating. When AI steps into that role and you highlight it, your work looks less legitimate.

But there’s a caveat: If you’re using AI on the job, the cover-up may be worse than the crime. We found that quietly using AI can trigger the steepest decline in trust if others uncover it later. So being upfront may ultimately be a better policy.

Being caught using AI by a third party has consequences, as one New York attorney can attest.

Why it matters

A global survey of 13,000 people found that about half had used AI at work[4], often for tasks such as writing emails or analyzing data. People typically assume that being open about using these tools is the right choice[5].

Yet our research suggests doing so may backfire. This creates a dilemma for those who value honesty but also need to rely on trust to maintain strong relationships with clients and colleagues. In fields where credibility is essential – such as finance, health care and higher education – even a small loss of trust can damage a career or brand.

The consequences go beyond individual reputations. Trust is often called the social “glue” that holds society together[6]. It drives collaboration, boosts morale and keeps customers loyal. When that trust is shaken, entire organizations can feel the effects through lower productivity, reduced motivation and weakened team cohesion.

If disclosing AI use sparks suspicion, users face a difficult choice: embrace transparency and risk a backlash, or stay silent and risk being exposed later – an outcome our findings suggest erodes trust even more.

That’s why understanding the AI transparency dilemma is so important. Whether you’re a manager rolling out new technology or an artist deciding whether to credit AI in your portfolio, the stakes are rising.

What still isn’t known

It’s unclear whether this transparency penalty will fade over time. As AI becomes more widespread – and potentially more reliable – disclosing its use may eventually seem less suspect.

There’s also no consensus on how organizations should handle AI disclosure. One option is to make transparency completely voluntary, which leaves the decision to disclose to the individual. Another is a mandatory disclosure policy across the board. Our research suggests that the threat of being exposed by a third party can motivate compliance if the policy is stringently enforced through tools such as AI detectors.

A third approach is cultural: building a workplace where AI use is seen as normal, accepted and legitimate. We think this kind of environment could soften the trust penalty and support both transparency and credibility.

The Research Brief[7] is a short take on interesting academic work.

References

  1. ^ peer-reviewed article found (doi.org)
  2. ^ researchers who (eller.arizona.edu)
  3. ^ study trust (eller.arizona.edu)
  4. ^ about half had used AI at work (www.bcg.com)
  5. ^ the right choice (www.zetwerk.com)
  6. ^ holds society together (doi.org)
  7. ^ Research Brief (theconversation.com)

Read more https://theconversation.com/being-honest-about-using-ai-at-work-makes-people-trust-you-less-research-finds-253590
