Maryland Attorney General Anthony G. Brown has joined a bipartisan group of 47 state attorneys general in urging major technology companies, including search engines and payment platforms, to implement stronger measures against the proliferation of deepfake nonconsensual intimate imagery (NCII). The push comes in response to growing concern that these AI-generated images and videos are increasingly being used to exploit and harm individuals, with reportedly 98% of deepfake videos online falling into this category.
The coalition has sent separate letters to both search engines and payment platforms, outlining specific actions they believe these companies should take. For search engines, the attorneys general are calling for the implementation of safeguards similar to those already in place for other harmful content. This includes developing methods to warn users about malicious content and to redirect them away from sites that host or facilitate the creation of deepfake NCII. The attorneys general specifically pointed to searches for terms like “how to make deepfake pornography,” “undress apps,” or “nudify apps,” suggesting these should be treated with the same caution as searches for dangerous activities.
In their communication with payment platforms, the attorneys general are urging these companies to more proactively identify and deny payment processing services to entities involved with the creation or distribution of deepfake NCII. This would involve scrutinizing sellers and removing those found to be connected to such content or tools from their networks. The aim is to cut off financial support for the creation and dissemination of these harmful digital creations.
The letters highlight the significant damage that deepfake NCII can inflict, particularly on women and girls, who are disproportionately targeted. The technology is being used to embarrass, intimidate, and exploit individuals, with public figures and private citizens alike falling victim. High-profile cases, such as those involving celebrities and teenagers in various states and countries, underscore the widespread nature of this problem. While the majority of victims are women and girls, the coalition acknowledges that men and boys have also been subjected to this form of abuse.
The attorneys general are leveraging existing industry practices as precedent for their requests. They note that search engines already restrict access to content related to self-harm or illegal activities, demonstrating their capacity to filter and manage potentially damaging online material. By drawing this parallel, they are advocating for a similar level of responsibility and action regarding the creation and distribution of deepfake NCII. If adopted, these measures could improve online safety and reduce the exploitation of individuals through malicious AI-generated content, particularly the misuse of AI technology for personal harm.
Article by Mel Anara, based upon information from the Maryland Attorney General’s Office.