
Investigating Nonconsensual Intimate Image Sharing

by Christa Miller, Forensic Focus

Nonconsensual intimate image sharing – also known as image-based sexual abuse, nonconsensual pornography, or its original slang, “revenge porn” – has been around since at least the 1980s, but didn’t become a widespread social problem until the internet – and mobile phones – became ubiquitous.

So called because its perpetrators often share intimate images in reaction to a breakup or a fight, “revenge porn” is actually the result of a complex mix of motives. It’s often just as devastating to its victims as physical sexual abuse, and is unevenly policed and enforced. How can investigators and forensic examiners respond?

Defining Image-Based Sexual Abuse

A report published in the United Kingdom, “Shattering Lives and Myths: A Report on Image-Based Sexual Abuse,” describes how a mix of “control, attention seeking, jealousy, obsession, misogyny and lad culture, sexual gratification, a ‘prank’, distress, humiliation, entitlement, and to build up social capital” contribute to nonconsensual intimate image sharing.

Based on findings from more than 50 interviews and part of an international study across the UK, Australia, and New Zealand, the report found that nonconsensual intimate image sharing is often closely correlated with domestic abuse and other forms of coercive control and gender-based violence.

The “Shattering Lives” researchers named and corrected seven common myths about nonconsensual intimate image sharing:

  1. It’s a form of sexual assault, not a communications offense.
  2. It’s derived from a complex mix of motivations, often including power and control, not simply a form of revenge.
  3. The repeated proliferation of images across the internet can shatter lives, and victim-survivors can’t “just move on.”
  4. Only some, not all, forms of image-based sexual abuse are currently criminalized.
  5. The police and the criminal justice system aren’t always prepared to respond to non-consensual taking or sharing of sexual images.
  6. Likewise, websites and social media are not always good at taking down the material.
  7. A preventive, educational and restorative approach to criminal justice may be more effective than a punitive response.

The Impact Of Nonconsensual Intimate Image Sharing

The impact of nonconsensual intimate image sharing isn’t unlike that of child sexual abuse material (CSAM). People whose pictures have been nonconsensually shared describe similar fears of being recognized, of their pictures showing up again and again, of being revictimized. 

It’s also not unlike the impact of sexual assault, because of the idea that adult victim-survivors played a role in their own harm – especially when they consent to taking the pictures or having them taken. This can confound a victim’s abuse claim.

In fact, the “Shattering Lives” report found that many “victim-survivors spoke of being pressured into taking and sending images, or having such images taken.” Even this coercion had multiple factors. Grooming, the threat of physical violence, or threats to end the relationship all played roles.

Unsympathetic attitudes towards their experiences can serve to retraumatize victim-survivors, part of what the report described as “social rupture” whose repercussions are felt in daily personal, professional, and digital social spaces.

For example, US Representative Katie Hill resigned her Congressional seat as a result of abuse: “In Hill’s case, it cost her a career, while her ex-husband who allegedly leaked the photos has escaped the same amount of scrutiny,” wrote Gabby Landsverk for Insider.

A professional cost isn’t the only potential consequence for a victim-survivor. “[S]ocial, cultural or ethnic identities, life experiences and life stage impacted the harms, oppressions and consequences [victim-survivors] encountered in the context of image-based sexual abuse,” the “Shattering Lives” report stated.

Legal Remedies

In the United States, no federal law covering nonconsensual intimate image sharing currently exists. (A federal bill, the ENOUGH Act, proposed by US Senator Kamala Harris in 2017, failed to pass.)

Although 46 of the 50 US states, plus the District of Columbia (Washington, DC) and one territory, Guam, have laws specifically prohibiting revenge porn, the laws suffer from inconsistency and confusion. For instance:

  • Some states make nonconsensual pornography a misdemeanor; others make it a felony.
  • In some states the crime is a misdemeanor no matter how many times it’s committed; in others, a second or third offense makes it a felony.
  • Some states categorize it as an invasion of privacy; others as a public decency offense.
  • Some classify the crime differently depending on whether the victim is 18+ years of age.

Finally, some states move more slowly than others to enact and update laws. For example, in the same three-year period it took Oregon state legislators to pass and then update that state’s law, New York’s first law was hotly debated before finally passing in February 2019.

That’s due in part, however, to the efforts of large tech companies like Google, which lobbied in 2018 to change language in New York’s proposed statute. Until the bill finally passed, Google’s efforts to block it were successful.

In part, that’s because nonconsensual intimate image sharing isn’t as clearly illegal as CSAM. Because consensually posted pictures of consenting adults fall under free speech, a broadly written statute could run afoul of the First Amendment of the US Constitution – and the “Internet’s First Amendment,” Section 230 of the 1996 Communications Decency Act (CDA).

Of course, nonconsensually shared images themselves can “affect a person’s ability to live their life unfettered by harassment,” according to Kateri Gasper, a prosecutor and member of the National White Collar Crime Center (NW3C)’s Prosecutorial & Judicial Advisory Board, “so it’s a very narrow line.”

In the UK, according to the “Shattering Lives” report, motives depend on offenses, and in turn, offenses carry different thresholds: causing distress, intent to cause distress, or knowledge that the action could cause distress. Likewise in the US, where many laws require the government to prove that a perpetrator’s intent was to cause harm.

Another similarity between the two countries’ laws: not all image-based offenses are categorized as sex offenses. In the US, they’re frequently classified as privacy or decency crimes. That’s a problem, says Clare McGlynn, one of the “Shattering Lives” report’s authors, because “[C]omplainants do not have the protections afforded to other sexual offence victims, such as protections against the use of sexual history evidence [at trial].” Further, says Jonithan Funkhouser, a detective with the Lake Oswego (Oregon) Police Department, the perpetrators of nonconsensual intimate image sharing don’t have to register as sex offenders.

Reclassifying image-based crimes to bring the law more in step with other sex offenses, on the other hand, would put consent front and center and also, critically, cover sharing without intent to cause harm: “for purposes of sexual gratification, financial gain, group bonding or a ‘laugh’; threats; or fake images,” in the words of the “Shattering Lives” report.

The UK Law Commission is currently reviewing existing legislation, but the Guardian reported that its conclusions won’t be released until 2021. “It is deeply troubling that we will have to wait until 2021/22 for there to be reform to the laws on image-based sexual abuse,” says McGlynn. “The current laws are out of date, piecemeal and clearly failing victims. Many of the gaps in the law are straightforward to remedy, such as criminalising threats to distribute sexual images without consent and granting anonymity to complainants. That the Government has refused to act means that many more people are going to be victimised and without any justice or redress. The delays mean justice is being delayed.”

In New York, before the passage of the 2019 law, Gasper says prosecutors had to “get creative” to prosecute offenders. For example, one defendant was charged and later convicted of felony contempt of court for violating a protection order by contacting the victim through a third party – the internet service provider he used to post to the fake online profiles he’d created in his victim’s name.

Alternatively, when the criminal burden of proof can’t be met or when the law doesn’t provide coverage, civil remedies – suing perpetrators under tort claims – may be possible, though David Bateman, a partner in the law firm K&L Gates and a co-founder of the Cyber Civil Rights Legal Project, says this approach is frequently more expensive and public than victim-survivors can handle. Plus, he says, “In the end, the only relief is monetary; there’s no jail time for an offender.” Victim-survivors who pursue the civil route are generally motivated by a need to prove who did it.

Another limiting factor, according to the “Shattering Lives” report: support in the form of advocacy is lacking, so victim-survivors themselves can struggle to understand their options between criminal and civil law, and to communicate with investigators and legal teams.

Nonprofit organizations like the Cyber Civil Rights Legal Project can sometimes offer some relief by providing pro bono services. Bateman says most times when victim-survivors contact a nonprofit, they aren’t trying to unmask their abusers. Instead, they seek to get the images taken down so they can move on with their lives.

Law Enforcement And Digital Forensics Response

In Bateman’s experience with law enforcement across the country, how authorities respond to revenge-porn reports is “a broad spectrum.” “There are federal versus state jurisdictions, and every locale has its own budget, issues, laws, priorities, and ethos,” Bateman says.

That can make for an inconsistent approach across jurisdictions. Investigators rely on prosecutors to help them understand how to gather evidence, arrest, and charge for offenses, but the many gray areas of nonconsensual intimate image sharing make it more difficult to assess the facts of the case – or to take action.

In the best case, says Gasper, digital forensics examiners may be called on to find email, social media or website screen names, browser histories, and/or social media apps associated with the posted images. Finding out where and how posts were made (e.g. to Facebook via mobile device) is important, and a list of keywords – often included in search warrants – can help. Generally, sites and apps are enumerated in the warrant.
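
For illustration only, a keyword sweep like the one Gasper describes can be sketched in a few lines. The field names, CSV layout, and keywords below are all hypothetical – real examinations use forensic tooling against tool-specific exports, and the keyword list would come from the warrant itself:

```python
# Hypothetical sketch: scanning an extracted browser-history export for
# warrant-listed keywords (screen names, site names). The CSV columns and
# keyword list are illustrative, not drawn from any specific forensic tool.
import csv
import io

WARRANT_KEYWORDS = ["exampleuser123", "fakeprofile", "examplesite.com"]  # hypothetical

def find_keyword_hits(history_csv, keywords):
    """Return history rows whose URL or page title contains any warrant keyword."""
    hits = []
    for row in csv.DictReader(io.StringIO(history_csv)):
        haystack = (row.get("url", "") + " " + row.get("title", "")).lower()
        if any(k.lower() in haystack for k in keywords):
            hits.append(row)
    return hits

# Illustrative export: one row matches a warrant keyword, one does not.
sample = """url,title,visit_time
https://examplesite.com/profile/exampleuser123,Profile page,2019-06-01T10:00
https://news.example.org/article,Unrelated article,2019-06-01T11:00
"""

hits = find_keyword_hits(sample, WARRANT_KEYWORDS)
print(len(hits))  # 1
```

The point of scoping the scan to warrant-enumerated keywords, sites, and apps is legal rather than technical: the examination must stay within what the warrant authorizes.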

That’s if a warrant is possible to obtain. In Oregon, where the law in 2015 inadvertently created a “loophole” by restricting its language to images posted on “an internet website,” police couldn’t investigate threats to disseminate images via text, email, or printed pictures.

Nor was it a crime to simply show pictures to others. “It’s very difficult to tell victims that what happened to them isn’t illegal,” says Funkhouser, who performs digital forensic examinations for his agency. In one case, told that the images had been deleted, detectives couldn’t get a warrant to search the phone because no crime had taken place that could establish probable cause. “The best we could do was to try to get the image off the phone so it wouldn’t be disseminated,” says Funkhouser, “and hope it was the only copy out there.”

Other investigative challenges:

  • Proving the accused is truly the person who posted the pictures, rather than a third party with whom the pictures were shared.
  • Victims frequently don’t realize the pictures might have been shared outside of their relationship. 
  • Cell phones make it even more difficult to trace who posted what. 
  • Difficulty persuading friends and family to save the evidence rather than deleting it in a misguided effort to protect the victim. “It’s actually easier when it’s posted on porn sites,” Gasper says, “[which] are relatively cooperative when you [serve legal process showing] the post was made without permission.”

Subpoenas can help to prove a particular IP address was attached to the account(s) used to send the material, says Gasper, adding that it’s wise to bring in a prosecutor on every nonconsensual intimate image sharing case. That way, they can help with the timely creation and service of subpoenas, cutting down on confusion and keeping the case moving forward.

Even so, the subpoena process is arduous and tedious, which in turn raises resourcing issues. “Most detectives don’t have the luxury of doing a [year-long] investigation for a misdemeanor,” Gasper explains.

In larger cities like New York, for instance, detectives in specialty units tend to focus on felony cases. That leaves the misdemeanors to precinct detectives, who are themselves occupied with homicide and assault cases.

In smaller communities, misdemeanor offenses may not be assigned to a detective at all, says Bateman. “The officer who is assigned therefore has no subpoena power, and that means no forensics.” Most effective, he says, is “when digital forensics is married with subpoena power.” But even then, the “light slap on the wrist” that’s characteristic of most misdemeanor sentences can feel like small comfort to both victim-survivors and investigators.

Still, Bateman has seen progress over the past five years in that law enforcement has learned more about the nonconsensual intimate image sharing problem, statutes have been passed to support enforcement, and in some jurisdictions, it’s what he calls “a real priority,” with dedicated investigators working the cases.

The Need For Training And Education

In Gasper’s experience, the education detectives receive frequently consists of on-the-job experience. Without it, she says, detectives can quickly become overwhelmed and not know where to start. Her office receives numerous calls from victims seeking help after precinct officers didn’t know how to assist them.

In a small agency like Lake Oswego’s, Funkhouser says it’s easy to ensure all officers receive legal updates as often as they need. That happens via email, or, if there are many updates, via “roll call” training prior to each shift. Additional training may be available from the multidisciplinary task forces responsible for investigating domestic violence or sexual assault cases.

Indeed, McGlynn says that because image-based sexual abuse is experienced by many victim-survivors as a form of sexual assault, procedures and training relating to sexual assault could be instructive in assisting the police to better tackle nonconsensual intimate image sharing.

But this is only a start. “There is also additional training required to understand the specific modes of perpetration of these abuses and to understand that for some victims, these crimes can have devastating consequences,” she adds. “Society and the police need to better recognise the serious harms of these abuses and take action.”

The “Shattering Lives” report called for specialized, comprehensive training and guidance for police and others in the criminal justice system, together with improved resources and technical support. Comprehensive statutory guidance provides the foundation for this, but programmatically, it needs to become part of the curriculum at police academies and annual training.

Proactive Investigative Responses

With many investigators already delivering presentations to schoolchildren and parent groups on internet safety, adding a unit on consent could help both would-be victims and perpetrators to think twice before taking, sharing, or posting intimate photos.

In the “Shattering Lives” report, in fact, one victim-survivor was quoted regarding the value of education around image-based sexual abuse’s harms and effects, versus its position as a “cool thing to do” among peers.

That’s the kind of education Gasper herself delivers in local schools at all levels – even to very young children – and elsewhere, to improve awareness of the issues around consent, which can be given and then revoked, or given for some acts but not for others, and of the impact that actions can have.

Even consent itself can be a thorny issue. “Adults talk about wanting privacy as a defense against government intrusion,” Gasper says, “but if you post a picture of yourself, what are you saying about your own privacy? That’s what we want to get them thinking about.”

In Lake Oswego, Funkhouser says the agency’s two school resource officers talk to students regularly about similar issues. The media are also an important part of the police department’s communications strategy, covering new laws as well as awareness and prevention tactics.

Individual investigators can also proactively share their experiences with investigations and the impact on victims. Funkhouser was part of a working group that reconvened in 2018 to update Oregon’s law after police encountered investigative challenges.

Now that the revision – passed in 2019 and taking effect January 1, 2020 – covers any image distribution, “including through technologies that have not yet been invented,” Funkhouser anticipates different challenges.

“Where the images are stored – the phone, the cloud – will become the issue,” he says. In particular, images distributed through a service like Snapchat, which deletes them after a brief period of time, could make cases harder to investigate. That’s because investigators need to be able to see the image for themselves. “We have to be able to show a jury so they can see that it meets the requirements in the law – that the victim is identifiable, and that the image is intimate,” Funkhouser explains.

Nonconsensual intimate image sharing can be legally complicated to navigate and time-consuming to investigate. Ultimately, however, these cases are as important to helping victims as any other. When called upon to apply your analytical skills to these cases, plan to work closely with prosecutors, advocates, and others to strategize how best to approach the evidence – and to stay on top of important legislative and procedural changes that could affect your approach.

About Scar de Courcier

Scar de Courcier is Senior Editor at Forensic Focus.
