When the Law Looks Away: Deepfake Intimate Images, Legislative Priorities, and the Limits of the Criminal System
- PATH Legal
In Canada, it is not a criminal offence to create or share deepfake intimate images — even when it is non-consensual and deeply harmful. The legal system offers few, if any, meaningful remedies to survivors. This stands in stark contrast to countries like South Korea or the UK, where explicit laws prohibit the non-consensual creation and distribution of sexually explicit deepfakes.
While many describe this gap as a case of the law “failing to keep up with technology,” the reality may be more about legislative priorities than technical limitations. Canadian legislators have demonstrated that they can act swiftly when it serves dominant interests. In this context, the lack of action on deepfake intimate images may reflect longstanding patterns of whose safety and dignity are prioritized by law — and whose are not.
In early 2025, Prime Minister Mark Carney indicated that his government intends to make the creation and distribution of non-consensual sexual deepfakes a criminal offence. While this promise has yet to be implemented, it marks a shift in political attention to the issue. Whether and how that commitment translates into legislation remains to be seen.
Deepfake Intimate Images Are a Real and Present Harm
Creating AI-generated intimate images from someone’s face or likeness without consent is a form of digital sexual violence. It violates personal autonomy and bodily integrity. It can lead to job loss, online harassment, mental health crises, and forced withdrawal from public life. It erases consent, and too often, the law looks away.
Survivors of the non-consensual production or distribution of deepfake intimate images often have no access to meaningful justice. There is no standalone criminal offence. Civil options are costly, slow, and limited in scope. Platforms are largely unregulated. And the burden of fighting for takedowns, safety, and dignity falls almost entirely on those who were harmed.
Criminalization: A Step Forward, But Not the Destination
Prime Minister Carney’s recent pledge to criminalize the production and distribution of non-consensual sexual deepfakes offers a potential tool for accountability, but it must be approached with caution. The current criminal legal system has a long history of failing survivors and disproportionately targeting marginalized communities. Without comprehensive reform, new laws risk perpetuating these injustices.
From a legal and policy standpoint, criminalization raises complex questions. While many agree that there should be clear legal consequences for the non-consensual production or distribution of deepfake intimate images, the current criminal legal system has significant limitations. It is often slow, retraumatizing, and unequally applied. Historically, criminal responses in Canada have disproportionately impacted Black, Indigenous, and racialized people, while failing to deliver meaningful justice to survivors of sexual violence.
More laws and harsher penalties may signal condemnation, but they do not necessarily lead to safer environments — online or offline. Survivors frequently report that legal processes do not result in the removal of content, the restoration of safety, or long-term accountability. These limitations have prompted growing interest in alternatives to incarceration, including regulatory approaches, civil enforcement, and survivor-centred supports.
What Real Accountability Could Look Like
A non-carceral approach does not mean doing nothing. It means doing more — and doing it better:
- Criminal law reform, not expansion: Where criminal law is involved, it should centre survivor dignity, not punishment for punishment’s sake. Meaningful remedies could include prohibitions, mandatory takedowns, no-contact orders, and restorative justice options, rather than jail time that does nothing to repair harm.
- Civil legal remedies: Survivors should be able to obtain injunctions, damages, and content removal orders swiftly and affordably. These tools should be accessible without requiring years of litigation or personal expense.
- Platform accountability: Tech companies must be held responsible for the harms enabled by their platforms. That means regulation with teeth: enforceable takedown timelines, reporting infrastructure, transparency requirements, and penalties for non-compliance.
- Community-based responses: Survivors deserve trauma-informed support, not retraumatizing court processes. Invest in frontline organizations offering legal aid, counselling, digital safety services, and peer support.
- Cultural transformation: Prevention begins with education. We need collective conversations about consent, privacy, and harm in the digital age, especially in schools, workplaces, and media.
Broader Legal and Social Context
At present, Canada's legal response to deepfake intimate images remains fragmented. Survivors may pursue civil injunctions or damages, but the cost of legal representation and procedural hurdles can be major barriers. Tech platforms often lack meaningful accountability frameworks, and content removal mechanisms are inconsistent.
The use of artificial intelligence in the creation of sexualized imagery raises urgent privacy, consent, and equality issues. As these technologies continue to evolve, questions remain about how legal systems — criminal, civil, regulatory, and otherwise — will respond.
If You’ve Been Affected
If you have been impacted by the production or distribution of non-consensual deepfake intimate images or other forms of tech-based sexual violence, you are not alone. PATH provides legal support to individuals who have experienced tech-based abuse, including advice on possible civil, regulatory, and non-carceral avenues. Please feel free to reach out to us to learn more about your options.