On June 5, 2023, the FBI issued a public service announcement warning of “Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes.” It states that profile images and other photographs posted by social media users have been taken and altered by AI image generators to create explicit deepfakes. Moreover, such images can be incredibly difficult to detect.
“The FBI continues to receive reports from victims … whose photos or videos were altered into explicit content,” the report goes on, stating that, in some cases, those responsible have also contacted those featured in the original images and made threats or demands for further explicit images or money.
The public service announcement suggests visiting the Internet Crime Complaint Center to report any such attack and shares some ways for social media users to protect themselves. These include being vigilant with passwords and carrying out reverse image searches to determine whether content has been subjected to such treatment.
It’s a real tragedy that the new possibilities offered by AI are being exploited in this way, but it certainly isn’t news to authorities. Almost two years earlier, a September 2021 PSA from the FBI addressed an increase in sextortion complaints, and such cases only seem to be on the rise as perpetrators discover new tools for carrying out these kinds of crimes.
As the FBI noted in that instance, it’s crucial not to feel embarrassed to report these experiences. Support is both available and necessary.