
The Federal Bureau of Investigation has warned Americans that criminals are increasingly using artificial intelligence to create sexually explicit images in order to intimidate victims and extort money, Reuters reports.
In an alert issued this week, the FBI said it has seen a recent increase in extortion victims targeted by criminals using doctored versions of images taken from their online posts or private messages.
“The photographs are then sent directly by the criminals to the victims for blackmail and harassment. Once the images are distributed, victims may face serious challenges preventing further distribution or removal of the content from the Internet,” the FBI said in the statement, as reported by Reuters.
The FBI also said the images looked real, and in some cases the victims were children.
The FBI did not provide details about the program or programs used to create the sexually suggestive images, but warned that technological advances are “continually improving the quality, personalization and accessibility of artificial intelligence (AI) content creation.”
The agency did not respond to Reuters’ request for further details on the phenomenon.
Manipulating photos to create sexually explicit images is not a new tactic, but the release of open-source AI tools has made the process easier than ever. The results are often indistinguishable from real photographs, and in recent years several sites and social media channels have emerged that specialize in creating and sharing such AI-generated imagery.
Source: Hot News
