Deepfake Porn: Texas Law & Proposed Federal Legislation
In an era where technology evolves faster than legislation, deepfake porn is a pressing issue that has garnered lawmakers’ attention. In 2023, Texas enacted a law specifically targeting deepfake pornography, and now U.S. Senator Ted Cruz is proposing federal legislation that expands protections for victims and increases penalties for perpetrators, including juveniles.
In this article, we explain the current Texas deepfake porn law as well as the proposed federal legislation – dubbed the “Take It Down Act” – introduced by Senator Cruz (R-Texas), which stemmed from a case in North Texas.
Our firm has represented a number of people, including juveniles, accused of creating or distributing deepfake pornography, and we constantly navigate the complex legal landscape surrounding this offense.
What is Deepfake Pornography?
Deepfake pornography refers to digital media, usually videos, that are manipulated using artificial intelligence (AI) technology to create fake images or videos that appear real. This can range from simple face swapping to more sophisticated techniques where AI algorithms can generate lifelike videos of people engaging in sexual acts without their consent or knowledge.
Texas Deepfake Pornography Law
In 2023, Texas passed a law specifically addressing deepfake pornography. Texas Penal Code Section 21.165, titled “Unlawful Production or Distribution of Certain Sexually Explicit Videos,” went into effect on September 1, 2023. Under this statute, a person commits an offense if he or she:
- without the effective consent of the person appearing to be depicted,
- knowingly produces or distributes by electronic means
- a deepfake video that appears to depict the person with their intimate parts exposed or engaged in sexual conduct.

The statute defines a “deep fake video” as “a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality.”
This offense is a Class A misdemeanor, punishable by up to a year in the county jail and a maximum fine of $4,000.
This law is part of Texas’s efforts to address the growing concern over the use of artificial intelligence to create sexually explicit content without consent. It’s important to note that this is one of two deepfake video laws in Texas as of 2024. The other one addresses the use of deepfakes in political communications, which is also a Class A misdemeanor.
Take It Down Act: Proposed Federal Legislation for Deepfake Pornography
In June 2024, Senator Ted Cruz, along with bipartisan co-sponsors, proposed federal legislation to address the issue of deepfake pornography. The Take It Down Act – an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks – aims to expand protections for victims and increase penalties for perpetrators.
The proposed legislation stems, in part, from an incident involving a high school student from Aledo, Texas. Elliston Berry became a victim of AI-generated deepfake porn created by a classmate. The perpetrator used an app to superimpose nude images onto photos taken from Berry’s social media accounts.
Berry shared her story during a U.S. Senate committee hearing in Dallas, describing how she woke up on October 2, 2023, to messages informing her that explicit photos of her were circulating on social media. The case highlighted several issues that the Take It Down Act seeks to address. The bill aims to combat the growing problem of deepfake pornography and non-consensual intimate imagery (NCII) across the United States. Here are its key provisions:
- Criminalization: The bill would make sharing AI-generated nonconsensual pornographic images on social media a federal crime.
- Penalties: Perpetrators could face up to two years in prison for distributing nonconsensual sexual images of an adult, and up to three years if the victim is a minor.
- Platform accountability: Social media platforms and similar websites would be required to remove offending content and its duplicates within 48 hours of being notified by the victim.
- Compliance requirements: Platforms would need to develop internal processes to deal with AI-generated nonconsensual images within a year of the bill’s passage.
- Enforcement: The Federal Trade Commission would be responsible for enforcing compliance with the 48-hour takedown deadline.
Interestingly, many people accused of creating AI-generated pornography are juveniles, and it is rare for juveniles to be prosecuted federally. This raises questions about how the federal law would handle cases involving minors. The legislation may need specific provisions or guidelines for juvenile offenders that focus on rehabilitation rather than severe punitive measures. Balancing the need to deter these crimes with an understanding of juvenile behavior and development is a challenge that will have to be addressed.
Deepfake Porn Statistics
The phenomenon of deepfake pornography has escalated in recent years. Here’s a look at the numbers.
- Increased prevalence: The total number of deepfake videos online increased by 550% from 2019 to 2023.
- Platform presence: Seven out of the top 10 most visited pornographic websites host deepfake content.
- Gender bias: 99% of the victims targeted by deepfake pornography are women.
- Creation speed: It can now take less than 25 minutes and cost nothing to create a minute-long deepfake pornographic video using a single clear face image of the victim.
- Accessibility: Due to the widespread availability of deepfake video technology and low-cost apps, anyone with a smartphone can potentially create deepfake pornography.
These statistics highlight the pervasive nature of deepfake pornography and how easily it can be created and distributed. We can expect continued enforcement efforts by lawmakers, police, and prosecutors – many of whom will try to make an example of offenders to deter others from engaging in this activity.
Accused of Deepfake Porn? Contact Us.
If you or a loved one has been accused of creating or distributing deepfake pornography, the consequences can be severe and life-altering. At Varghese Summersett, our experienced criminal defense attorneys understand the complexities of these cases and are committed to protecting your rights and reputation. We also have Board Certified Juvenile Defense Attorney Lisa Herrick on our team for cases involving minors or young adults.
Don’t face this legal battle alone. Contact us today at 817-203-2220 for a confidential consultation, and let us provide the expert defense you need. Your future is worth fighting for. We serve Fort Worth, Dallas, Southlake, and the surrounding areas.