Texas Strengthens Protections Against AI-Generated Sexual Content: What Survivors Need to Know 

From the DARCC Education Department, Maggie Bego, Director of Education

Earlier this year, The Texas Tribune shared the story of Elliston Berry, a Texas teenager whose photos were altered without her knowledge: “She woke up … to frantic texts … innocent photos … had been edited using artificial intelligence to make them appear nude” (Texas Tribune, 2025). Within hours, fake sexual images of nine girls from her school were circulating online.

Elliston’s story is heartbreaking and deeply alarming. It shows how quickly technology can be used to cause harm and how vulnerable people, especially young people, can be to this kind of exploitation. Artificial intelligence can now create “deepfakes,” making it look like someone is in a sexual situation without consent. These manipulated images can spread fast, leaving survivors feeling violated, isolated, and unsure of where to turn. 

Strengthening the Law 

Until recently, Texas law covered only deepfake videos, not still images. In 2025, lawmakers closed that gap by passing Senate Bill 441 (SB 441) and House Bill 449 (HB 449). As Reason explains, “The new legislation expands the state’s prohibition on deepfake pornography to include still images as well as videos” (Reason, 2025). Under these laws, creating or sharing deepfake sexual content without consent is now a crime. Offenders can face misdemeanor or felony charges depending on the circumstances, with stronger penalties when the survivor is under 18 (Texas Legislature, 2025). “Those who use technology to exploit others will now face consequences” (Texas Tribune, 2025).

AI Deepfakes as a Form of Sexual Violence 

Deepfake sexual videos and images are not just privacy violations; they are a form of sexual violence. These acts strip individuals of autonomy over their bodies and identities, causing emotional and psychological harm similar to other forms of sexual assault (Rousay, 2023). Survivors often report shame, fear, anger, and helplessness, and may experience ongoing anxiety knowing their likeness could resurface (Thorn, 2025). Even without physical contact, public exposure of one’s image can produce lasting trauma, leading some survivors to withdraw from school, work, or relationships (Umbach et al., 2024; MCASA, 2024).

Deepfake abuse is overwhelmingly pornographic and disproportionately targets women, with some content monetized or shared to humiliate and control victims (The Regulatory Review, 2024; Han et al., 2024). A survey across 10 countries found that 2.2% of respondents reported deepfake pornography victimization, highlighting its prevalence (Umbach et al., 2024). As AI technology evolves, advocates and service providers must continue learning and adapting to support survivors, influence policy, and create safer digital communities (MCASA, 2024; AUDRi, 2024).

Working Together to Protect Survivors 

Texas’ updates align with federal efforts like the Take It Down Act, which criminalizes sharing any intimate image, real or AI-generated, without consent. Public Citizen explains that these laws “ensure victims have legal avenues to remove content and seek justice” (Citizen, 2025). Texas also passed the Responsible Artificial Intelligence Governance Act (TRAIGA) to hold AI developers accountable. Mayer Brown, an international law firm with a Dallas office, notes that companies must now have systems in place to monitor and reduce misuse of AI technology (Mayer Brown, 2025).

These steps show progress, but they also remind us that the intersection of technology and sexual violence is not a distant issue. It is happening right now in our schools, communities, and online spaces. 

Why This Conversation Matters 

At the Dallas Area Rape Crisis Center (DARCC), we believe conversations about technology-facilitated sexual violence are essential. Awareness and education are forms of prevention. As technology evolves, it is critical that our understanding and compassion evolve too. 

Deepfakes and AI-generated exploitation are not just digital problems. They have real emotional and psychological impacts on survivors. When a person’s image or likeness is used without consent, it can lead to deep trauma, fear, and a loss of safety. Recognizing these experiences as forms of sexual violence helps ensure survivors are seen, believed, and supported. 

DARCC’s Commitment 

DARCC remains dedicated to supporting survivors of all forms of sexual violence, including those harmed through technology. 

Our services include: 

  • Crisis counseling and advocacy 

  • Legal and systems support 

  • Guidance for online harassment or deepfake exploitation 

  • Education on consent, online safety, and digital ethics 

Laws can create accountability, but lasting change happens when communities commit to respect, consent, and empathy both in person and online. 

References: 

  • Reason. (2025). Texas amends non-consensual sexual deepfake law to include images. Reason.org. 

  • Texas Tribune. (2025). Take It Down Act: Deepfakes and digital nudes in Texas schools. TexasTribune.org. 

  • Mayer Brown. (2025). Texas passes unique artificial intelligence law focused on prohibited practices. MayerBrown.com. 

  • Skadden. (2025). Texas charts new path on AI with landmark regulation. Skadden.com. 

  • Citizen. (2025). Bills strengthening protections against AI-created intimate deepfakes now law. Citizen.org. 

  • Texas Legislature. (2025). Senate Bill 441 analysis. Capitol.Texas.gov. 

  • Rousay, V. (2023). Sexual deepfakes and image-based sexual abuse: Victim-survivor experiences and embodied harms. Harvard University DASH.

  • Thorn. (2025). Deepfake nudes and young people. Thorn. 

  • The Regulatory Review. (2024). Protecting against sexual violence linked to deepfake technology. The Regulatory Review. 

  • Umbach, R., Henry, N., Beard, G., & Berryessa, C. (2024). Non-consensual synthetic intimate imagery: Prevalence, attitudes, and knowledge in 10 countries. arXiv. 

  • Han, C., Li, A., Kumar, D., & Durumeric, Z. (2024). Characterizing the MrDeepFakes sexual deepfake marketplace. arXiv. 

  • MCASA. (2024). Survivor safety: Deepfakes and the negative impacts of AI technology. MCASA. 

  • AUDRi. (2024). Briefing paper: Deepfake image-based sexual abuse, tech-facilitated sexual exploitation and the law. AUDRi. 
