FBI: AI Deepfake Extortion Scams on the Rise, How to Protect Yourself

The FBI has warned that deepfake extortion scams are on the rise. In these scams, criminals use AI to create fake videos or audio recordings of the victim that make it look or sound like they are doing something illegal or embarrassing.

The criminals then threaten to release the deepfake to the public if the victim does not pay them a ransom.

The FBI says it has received over 7,000 reports of online extortion targeting minors in the past year, and as of April it had seen an uptick in victims of so-called “sextortion scams” involving deepfakes.

Here are some tips to protect yourself from deepfake extortion scams:

  • Be careful about what information you share online. Criminals can use any information you share online to create a deepfake of you. This includes your name, address, phone number, social media profiles, and photos.
  • Be suspicious of any emails or messages that ask for personal information. Criminals often send emails or messages that appear to come from a legitimate source, such as a bank or credit card company, asking for sensitive details like your Social Security number or credit card number.
  • Do not click on links in emails or messages from people you do not know. Criminals can use such links to install malware on your computer, which can then be used to steal your personal information or take control of your device.
  • Report any suspicious activity to the FBI. If you receive an email or message that you think is a scam, or if you think you have been the victim of a deepfake extortion scam, report it to the FBI. You can do this by visiting the FBI’s website or by calling 1-800-CALL-FBI.
Roland is a Public Relations & Communications guru with an immense passion for the blockchain and crypto industry. A fusion of his expertise and passion led to the dawn of Optimisus in 2020.