When the Law Fails to Keep Up

A 2025 Ontario ruling acquits a husband charged with distributing intimate images of his wife — and exposes alarming gaps in Canada’s criminal law on non-consensual image sharing.

Published October 2025   |   Ontario Court of Justice   |   2025 ONCJ 542

 

A husband allegedly photographs his wife without her knowledge in their bathroom, sends the image to a stranger on Snapchat, and also shares a digitally altered image placing her face on a nude body. He is charged. And then — under the strict letter of Canadian law — he is acquitted. This is the uncomfortable reality laid bare by R. v. R.K.1, decided October 10, 2025.

What Happened

The accused, identified only as R.K.1, was charged under section 162.1(1) of the Criminal Code with knowingly publishing, distributing, or transmitting an intimate image of his wife, R.K., without her consent. The case arose after R.K. and her daughter P.K. discovered that R.K.1 had opened a Snapchat account and sent photos of R.K. to an unknown man.

Two photos were at the centre of the prosecution. The first showed R.K. wearing only a bra and underwear in the couple’s bathroom — a photo she testified was taken without her knowledge. Her face had been scribbled out. The second showed what appeared to be R.K. completely nude, but she testified the image was fake: her head had been digitally placed on another person’s naked body.

Case at a Glance

Court: Ontario Court of Justice, Burlington

Citation: R. v. R.K.1, 2025 ONCJ 542

Judge: Justice Brian G. Puddington

Charge: Section 162.1(1) Criminal Code — publishing/distributing an intimate image without consent

Motion: Directed verdict (no-evidence motion) brought at close of Crown’s case

Outcome: Application granted — accused found not guilty

 

The Legal Framework

Section 162.1(1) of the Criminal Code — Canada’s “revenge porn” provision, introduced in 2015 — criminalises the knowing distribution of an “intimate image” without consent. But the offence is only as powerful as the definition of “intimate image” that underpins it.

 

162.1(2) “Intimate image” means a visual recording of a person made by any means,

(a) in which the person is nude, is exposing his or her genital organs or anal region or her breasts, or is engaged in explicit sexual activity;

(b) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy; and

(c) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed.

The Crown had to prove that the images shared were “intimate images” under this definition. Justice Puddington found, with some evident reluctance, that neither photo met the statutory test.

 

The Bra Photo: Not an Intimate Image?

On the first photo — R.K. in her bra — the court found that she was neither nude nor engaged in explicit sexual activity. The remaining question was whether her breasts were “exposed.” Justice Puddington held that they were not.

 

“While taking a photo of someone, without their knowledge in their underwear and sending it to another person is disgusting and disgraceful, it does not meet the clear definition of ‘intimate image’ as outlined in section 162.1.”

— R. v. R.K.1, 2025 ONCJ 542, at para. 20

The court drew support from an Alberta decision, R. v. Winsor (2024 ABCJ 5), which held that “a photograph of someone in their underwear would not fall under the provisions of section 162.1, regardless of how much or little the underwear concealed or revealed.” Justice Puddington agreed, reading “exposed” to mean bare — not covered by any clothing, however minimal.

He also pointed to a practical concern: if partial coverage counted, where would the line fall? A low-cut shirt? A bikini top? A strict reading was necessary, the court concluded, to avoid unworkable ambiguity in a provision carrying potential criminal liability and imprisonment.

Notably, the court observed that the more fitting charge might have been voyeurism under section 162 — but R.K.1 was not charged with that offence, and the court found no basis to treat it as a lesser included offence.

The Deepfake Photo: Not Her Body, Not Her Image

The second photo presented an equally stark limitation. R.K. herself confirmed the nude image was fake — her face, placed on someone else’s body. Justice Puddington found that the current statutory definition of “intimate image” simply cannot apply to fabricated images.

The definition speaks of the person being nude, or her breasts being exposed. If the nude body is not hers, the image is not a recording of her nudity. Parliament, the court found, plainly did not intend to capture synthetic or digitally manipulated images when it enacted section 162.1.

“An image of a person’s actual exposed genitals or breasts or naked body carries with it a specific level of integrity and privacy. This is not to say that a fake image does not cause harm and embarrassment, but that harm is not captured by the current provisions.”

— Justice B.G. Puddington, at para. 26

The court noted that the proposed (but not yet enacted) Online Harms Act would have explicitly extended protection to images that “falsely present in a reasonably convincing manner a person as being nude” — including deepfakes. The very fact that Parliament saw fit to add this language in the proposed legislation, the court observed, is an acknowledgment that the current law does not cover it.

The Directed Verdict

At the close of the Crown’s case, the defence brought a motion for a directed verdict — arguing that the prosecution had failed to establish a prima facie case. The test, drawn from the Supreme Court of Canada’s decision in United States of America v. Shephard, [1977] 2 S.C.R. 1067, asks whether there is any evidence upon which a reasonable jury, properly instructed, could return a guilty verdict.

Finding no evidence that the images satisfied the legal definition of “intimate image,” Justice Puddington granted the motion and acquitted R.K.1.

 

The Court’s Conclusion

The result: the Crown had not established that either photo was an “intimate image” under section 162.1(2) of the Criminal Code, and R.K.1 was acquitted.

The judge was explicit that his ruling was not an endorsement of the conduct: “Nothing in these reasons should be read as saying that R.K. did not experience an embarrassing and humiliating event.” He acknowledged R.K.’s visible distress in court and the possibility that the photos had entered the public domain.

He was equally direct about the limits of his role: “I must apply the law dispassionately, and not try to shoehorn images into a definition simply because I find the photographs deplorable.”

 

Why This Case Matters

This decision is a sobering illustration of the gap between what the law says and what victims of image-based abuse actually experience. The conduct alleged here — covert photography of a spouse in a state of undress, and the creation and distribution of a digitally fabricated nude image — is precisely the kind of violation that section 162.1 was designed to address when it was introduced in 2015.

And yet the law, as written, did not reach it.

The underwear exclusion is not entirely surprising in light of prior case law. But the deepfake gap is starker. In an era when artificial intelligence can generate photorealistic fabricated intimate images of real people within seconds, a law that applies only to genuine nude photographs leaves a vast category of harm entirely outside the criminal law.

The Online Harms Act, which would have explicitly captured such images, received first reading in Parliament but did not become law. Until legislation catches up — whether through the Online Harms Act or otherwise — complainants whose images have been digitally fabricated and distributed will find little recourse in section 162.1.

Justice Puddington ended with a direct signal to Parliament: the law may need to expand to cover both non-consensual underwear photography and digitally fabricated images. Until it does, cases like R. v. R.K.1 will remain a painful reminder that the criminal law is only as protective as the words Parliament chooses — and the words here were not broad enough.

In a more recent, similar case, R. v. MSK, 2026 NSPC 12 (CanLII), the Court made the same point:

[105]   The law does not reflect the technology that exists today, and to combat advancing technologies in the commission of offences requires the dictates of Parliament. It is not within the proper jurisdiction of this Court to extend the reach of the legislation to include an image that creates a nude body entirely generated by AI using a real, identifiable face of a person as a ‘visual recording of a person’ within the definition of intimate image. It would be a patchwork assembly to take the enormity of artificial intelligence and force-feed it into incompatible legislation. The statute must be revised to respond to such concerns.

Key Takeaways

For legal practitioners: Section 162.1 requires strict proof that the image depicts the actual complainant’s nudity or exposed body parts. Images showing a person in underwear, or digitally fabricated nudes, will not satisfy the current statutory definition.

For advocates: The proposed federal Online Harms Act (Bill C-63) explicitly targeted deepfakes and fabricated intimate images, and its passage (or equivalent legislation) remains urgently needed. However, the Bill died on the Order Paper when Parliament was prorogued on January 6, 2025, and the Government is now deciding how to move forward.

For the public: Non-consensual sharing of intimate images is a serious harm regardless of whether it crosses a criminal threshold. Civil remedies, including tort claims and privacy legislation, may offer additional avenues — but they are no substitute for a comprehensive criminal framework.

For Parliament: This decision joins a growing body of cases signalling that section 162.1, as currently drafted, leaves too many victims behind. The gap is known. The fix is overdue.

 

This blog is for informational purposes only and does not constitute legal advice.

Case citation: R. v. R.K.1, 2025 ONCJ 542  |  Ontario Court of Justice  |  Released October 10, 2025

Source: CanLII (www.canlii.org)
