
Hello? How Scammers Use AI to Impersonate People and Steal Your Money

Angela Orebaugh is Assistant Professor in the Engineering School's Computer Science Department. Prior to joining academia, Angela worked in industry for over 20 years providing cybersecurity expertise to clients such as the National Institute of Standards and Technology (NIST), the Department of Defense (DoD), intelligence agencies, small businesses, and start-ups. Her mission is to leverage her industry expertise to add value and create exceptional learning experiences for students.

 

Introduction

Artificial Intelligence (AI) is a technology that enables computers to simulate human thinking or actions, including learning, problem-solving, and decision-making. AI provides many benefits to society, including improved medical diagnostics, autonomous vehicles, smart manufacturing, and cybersecurity. It has been implemented for years in industries such as banking, marketing, and entertainment. It is the technology that enables smart assistants like Alexa to learn our preferences and patterns or travel apps to select the most efficient route. While AI creates efficiencies, reduces human error, and enables new discoveries, it is also helping cybercriminals create more effective scams.

How Are Cybercriminals Using AI for Scams?

Two of the most common malicious uses of AI in scams are voice cloning and deepfakes. Voice cloning, also called audio deepfakes, artificially replicates someone’s voice so that it sounds very authentic, including the person’s tone, intonation, and pronunciation. Deepfakes are videos or photos that artificially replicate a person, often showing them doing or saying something they never did. There are already many voice clones and deepfakes of CEOs, politicians, and celebrities; even Taylor Swift has been faked.

[Image: Deepfake concept illustration of a 3D face with a long nose attached to a mask]

Cybercriminals are using AI to clone people’s voices and use them in scams that ask for personal information or money. From as little as three seconds of audio found on social media or recorded by the scammer, AI techniques can analyze, learn, and authentically replicate a person’s voice. These scams are often vishing (voice phishing) scams, in which the cybercriminal calls the target on the phone and pretends to be a family member or friend. Cybercriminals may target parents or grandparents while pretending to be a child or young adult in trouble, such as someone who has lost their wallet while traveling, is in jail, or otherwise needs money quickly. Scammers will also call targets and keep them talking long enough to record their voices, then use the AI voice clone to access financial accounts and other services protected by voice recognition.

Telltale signs of a voice cloning scam include:

  • An emergency situation.
  • A sense of urgency for you to act immediately. The scammer doesn’t want to give you time to think.
  • A short conversation; the caller may say they can’t talk for long.
  • A request for money via wire transfer, bitcoin, gift cards, or a payment app, or a request for bank account information.
  • A request for personal information.
  • A request to keep the conversation a secret.
  • An offer that sounds too good to be true.

Cybercriminals are also using AI to create deepfake videos and photos of people for use in their scams. AI techniques can alter a person’s face or body to make them look like someone else, learning how to replace a person’s likeness and appearance from existing photos and videos. Cybercriminals use deepfakes to spread false information, run money-making scams, and blackmail victims.

Telltale signs of a deepfake scam include:

  • The video may show jerky or unrealistic movements.
  • There may be changes in lighting and shadows.
  • There may be changes in skin tone, skin texture, or shadows around the eyes.
  • Unusual blinking or hair movement.
  • Word choices and speech patterns may be uncharacteristic of the individual.
  • The video or photo may contain content that is uncharacteristic of the individual.
  • A request for money or personal information.

[Image: An origami sheep casts the shadow of a wolf]

How Can You Protect Yourself and Your Family?

The following are actions you can take to avoid being scammed:

  • Let unknown calls go to voicemail.
  • Don’t believe the caller ID; it can also be faked.
  • Don’t give out personal information, such as bank information or passwords.
  • Create a secret password to use with family and friends. Asking the caller for the secret password will help you determine whether the call is real or a scam.
  • Ask questions that only the real person would know the answers to and are not public information.
  • Hang up and call the person back directly at their real number, using another phone if one is available.
  • Keep social media accounts private and encourage others to do the same. Scammers study public posts to learn about you and identify the friends and family members they can impersonate.
  • Use two-factor authentication on services and accounts.
  • Report the scam to the Federal Trade Commission (FTC) online.