Back in 2023, the FBI posted a warning about scammers stealing photos from innocent victims’ social media profiles and using them to blackmail the victims and extort money and other things of value from them.
Once a scammer has downloaded a victim’s photos, they use artificial intelligence (AI) to create fake explicit photos and videos based on the stolen images.
These “deepfake” images and videos are so realistic that the average person is likely to believe they are real.
The scammer then uses the fake adult photos and videos to blackmail the victim into forking over a ransom in exchange for a promise not to share the illicit content with the victim’s family and friends.
I’ve been warning folks for years about the dangers of posting innocent photos (especially of children) on social media. As you probably know, a scammer with a decent set of Photoshop skills can take a completely innocent photo and turn it into a very realistic-looking, but fake, adult-style image.
Well, guess what? The rise of AI has made it extremely easy for almost anyone to use stolen photos to create adult-oriented photos and videos that appear completely authentic.
The good news is there are things you can do to protect yourself and your family from this horrible scam, beginning with promptly reporting any incidents that come to your attention to the FBI.
Bonus tip: It seems that fraudsters and scammers never even pause to sleep these days. This excellent resource (#ad) can help keep you, your family and your property safe.