Why Smart People Fall for Impersonation Scams: The Psychology of Fraud

The 2025 Verizon Data Breach Investigations Report found that the human element was a factor in roughly 60% of data breaches worldwide. Modern scams are designed to exploit predictable human behaviour, using psychological principles to create situations that feel legitimate and trustworthy. And with the rise of AI, these tactics are becoming even more convincing as traditional 'red flags' disappear.

How Scammers Exploit Human Psychology 

Psychologist Robert Cialdini spent decades studying why people say 'yes' to requests and identified six principles of persuasion: authority, scarcity, reciprocity, consistency, social proof, and liking. He was writing about sales and marketing, but those same principles now sit at the heart of modern fraud.

1. Reciprocity or “I helped you - now can you help me?”

People feel a natural obligation to return a favour, gift, kindness, or service they receive. If someone gives us something first, we often feel we “owe” them something in return - even if we never asked for it.

How scammers use it: Scammers often give something small first to create a sense of debt.

Fraud examples:

  • A scammer offers “free advice” or a free consultation, then pressures the victim into paying for a fake service.
  • Romance fraudsters may send small gifts, compliments, or emotional support early on, then later ask for money.
  • Phishing emails may offer a free voucher, gift card, or reward, then ask for personal details.

2. Scarcity or “Act now before it’s too late”

People place a higher value on things that are limited or becoming unavailable. When something feels rare or time-sensitive, it becomes more desirable and harder to resist.

How scammers use it: Scammers create urgency to pressure victims into making quick decisions without thinking or checking.

Fraud examples:

  • Messages claiming “your account will be locked” unless you act immediately.
  • Fake online deals stating “only 1 item left” or “offer ends today.”
  • Investment scams promoting “limited-time opportunities” to rush decisions.

3. Authority or “You can trust me, I’m official”

People are more likely to follow the lead of someone they see as credible, knowledgeable, or in a position of authority. Titles, uniforms, and official branding increase trust.

How scammers use it: Scammers impersonate trusted organisations or figures to appear legitimate and reduce suspicion.

Fraud examples:

  • Calls, emails or messages pretending to be from your bank, the police, tax authorities or other trusted institutions. 
  • CEO and executive impersonation scams.
  • Fake tech support scams claiming to be from Microsoft, Apple, Amazon or other big companies. 
  • Emails using official logos and language to request sensitive information.

4. Consistency or “You’ve already said yes”

People usually prefer to be consistent with what they have previously said or done. Once someone commits to something small, they are more likely to agree to something bigger later.

How fraudsters use it: Scammers start with small, harmless requests and gradually build up to more significant ones.

Fraud example:

  • A fraudster first asks you to confirm basic details like your name or email. They then request additional personal information such as your date of birth. Eventually, they ask for sensitive data, like bank details or security codes.

5. Liking or “You like me, so you trust me”

People are more likely to say ‘yes’ to individuals they like. Similarity, friendliness, compliments, and shared interests all increase trust and rapport.

How fraudsters use it: Scammers build relationships by appearing friendly, relatable, and trustworthy before making a request.

Fraud examples:

  • Romance scams where the fraudster builds an emotional connection over time.
  • Social engineering attacks where the scammer mirrors their victim’s interests or background.
  • Friendly messages that include compliments or shared experiences to build trust.

6. Social Proof or “Everyone else is doing it”

People tend to follow the actions of others, especially when they are unsure what to do. If something appears popular, it feels more legitimate.

How scammers use it: Scammers create the illusion that others are already trusting or benefiting from the offer.

Fraud examples:

  • Fake reviews or testimonials promoting a fraudulent product or service.
  • Investment scams claiming “thousands of people have already signed up.”
  • Social media scams that use fake accounts to create likes, comments, and credibility.

The Most Common Impersonation Fraud Tactics 

These principles are rarely used in isolation. They are often combined to make scams more convincing and harder to resist. Common combinations include:

  1. Authority + Scarcity

“I’m calling from your bank. There is suspicious activity on your account and you need to act now.” Authority creates trust, while urgency pressures the victim to act quickly without verifying the situation.

  2. Liking + Reciprocity

A fraudster builds rapport by being friendly, supportive, and relatable. Once trust is established, they ask for help or money, relying on the victim’s sense of obligation to respond.

  3. Consistency + Scarcity

The scam starts with small, harmless requests. Once the victim is engaged, urgency is introduced, such as a limited-time offer or sudden issue. This makes it more likely they will continue rather than stop and question what is happening.

When combined, these techniques reinforce each other. Trust reduces suspicion, urgency limits thinking time, and prior commitment makes it harder to say no. This is what makes many scams so effective.

How AI Voice Cloning and Deepfakes Have Supercharged These Tactics 

Understanding the psychology is only half the picture. The other half is what has happened to the technology that delivers it. Every one of the principles above depends on the scammer being convincing enough in the moment. For most of history, a convincing impersonation required skill, access, and time, and even then it left signs. A fake email read slightly wrong. A cloned voice sounded slightly off. Those signs gave people something to catch. AI is removing them completely.

Voice cloning tools can now produce a convincing replica of someone's voice from a short audio sample, the kind freely available in conference recordings or company videos. According to Deloitte's Center for Financial Services, losses from AI-powered fraud in the US are projected to grow from $12.3 billion in 2023 to $40 billion by 2027.

A 2025 European Parliament briefing reports that 70% of people are not confident they could tell a cloned voice from a real one. Detection rates for high-quality video deepfakes have been measured as low as 24.5%. That is not carelessness or a lack of skill. That is technology performing beyond what human senses can reliably catch.

The Moment of Action Is Where Fraud Lives

UK Finance data shows 66% of authorised push payment (APP) fraud cases in 2024 began online, reaching people through the platforms they use most. 

Impersonation fraud attacks your next decision. It fills the gap between receiving a request and completing an action with enough context to make trust feel reasonable before any check can happen.

Awareness training helps, but it has limits. You can learn what a red flag looks like in theory, but how do you learn to reliably detect a high-quality deepfake in real time, under pressure, from someone who appears to be a senior colleague? The problem is not what you know or how intelligent you are; it is the situation you are in when you have to decide.

How to Protect Yourself or Your Business from Impersonation Fraud

Impersonation fraud is a systems problem. It finds the gap between trust and verification and uses it before you can close it. The answer is not to become suspicious of everyone, but to build a simple habit: verify first, at the moments that carry real risk, before any action is taken.

UnDoubt was built for exactly those moments. It is an authentication solution that helps stop impersonation fraud by proving that every call, meeting, or message genuinely originated from the claimed person on an approved device, at the moment a high-risk action is requested. 

For Individuals: Download the UnDoubt app and protect your money and data from impersonators.

For Enterprises: Contact us to discuss a pilot programme tailored to your organisation's highest-risk workflows.

Found this useful? Share it with your team. The more people understand how these attacks are designed, the harder they become to run.

References

  1. Verizon Business. Data Breach Investigations Report (DBIR). Verizon Business, n.d. https://www.verizon.com/business/resources/reports/dbir/
  2. Influence at Work. 7 Principles of Persuasion. Influence at Work, n.d. https://www.influenceatwork.com/7-principles-of-persuasion/
  3. Microsoft Security Blog. The psychology of social engineering: the soft side of cybercrime. Microsoft, 2020. https://www.microsoft.com/en-us/security/blog/2020/06/30/psychology-social-engineering-soft-side-cybercrime/
  4. Deloitte Insights. Deepfake banking fraud risk on the rise. Deloitte, n.d. https://www.deloitte.com/us/en/insights/industry/financial-services/deepfake-banking-fraud-risk-on-the-rise.html
  5. European Parliamentary Research Service (EPRS). Artificial intelligence and deepfakes (EPRS briefing document). European Parliament, 2025. https://www.europarl.europa.eu/RegData/etudes/ATAG/2025/777940/EPRS_ATA%282025%29777940_EN.pdf
  6. UK Finance. Half-Year Fraud Report 2025. UK Finance, 2025. https://www.ukfinance.org.uk/system/files/2025-10/Half%20Year%20Fraud%20Report%202025_0.pdf