Update 6/1/15: Today the Supreme Court reversed Elonis’ conviction in a 7-2 decision. I’ll have an analysis of the opinion in next week’s post.
If a man posts violent threats on his own Facebook wall, could he be convicted of a crime even if he didn’t mean it? That’s the question the Supreme Court took up this week in the “Facebook threats” case, Elonis v. United States. It’s a fascinating look at what happens when old legal doctrines bump up against the modern on-line world.
The Facebook Posts at Issue
The defendant Anthony Elonis was convicted for a series of posts he made on his own Facebook wall in October and November 2010, when he was 27 years old. Elonis had been having an emotionally turbulent year: in May his wife left him, taking their two children with her, and in October he lost his job.
Elonis was active on Facebook; he had hundreds of “friends” and posted about a wide variety of topics. After his wife left him, he began posting some compositions of his own. These were often in the form of rap lyrics, and were frequently crude, graphic and violent.
Along with the violent posts, Elonis frequently posted disclaimers, saying his posts were merely “fictitious lyrics,” were for “entertainment purposes only,” or that he was simply exercising his First Amendment rights. He also regularly linked to things such as the Wikipedia entry on freedom of speech and other articles about the First Amendment.
In the fall of 2010, Elonis’s Facebook posts about his wife became increasingly graphic and violent. One post read in part:
There’s one way to love ya but a thousand ways to kill ya
And I’m not gonna rest until your body is a mess,
Soaked in blood and dying from all the little cuts . . .
In November 2010, based on the threatening posts, Elonis’s wife obtained a protection from abuse (“PFA”) order against him. A few days later, Elonis posted an almost word-for-word adaptation of a comedy sketch that he and his wife had watched together, in which comedian Trevor Moore explains that it’s illegal to say you want to kill the President, but not illegal to explain that it’s illegal to say that. The post read in part:
Did you know that it’s illegal for me to say I want to kill my wife?
It’s indirect criminal contempt.
It’s one of the only sentences that I’m not allowed to say.
Now it was okay for me to say it right then because I was just telling you that it’s illegal for me to say I want to kill my wife.
I’m not actually saying it. . . .
Elonis followed up this post with a statement that “Art is about pushing limits. I’m willing to go to jail for my constitutional rights. Are you?” He also provided a hyperlink to the original sketch upon which the post was based.
In another post in November 2010, Elonis referred to the PFA order his wife had obtained:
Fold up your PFA and put it in your pocket.
Is it thick enough to stop a bullet?
. . .
I’ve got enough explosives
To take care of the state police and the sheriff’s department . . .
Elonis’s wife testified at his trial that she took the Facebook threats seriously and that they made her very afraid for herself and her children. She also testified that she had never known Elonis to listen to rap music.
On November 16, Elonis posted the following:
That’s it, I’ve had enough.
I’m checking out and making a name for myself.
Enough elementary schools in a ten mile radius
To initiate the most heinous school shooting ever imagined.
And hell hath no fury like a crazy man in a kindergarten class.
The only question is . . . which one?
Elonis testified that this post was based on a rap song by Eminem, I’m Back, in which the rapper fantasizes about participating in the Columbine school shooting.
The post about the elementary school led to a visit by the FBI, during which Elonis declined to be interviewed. After the agent left, Elonis posted another item on Facebook, titled "Little Agent Lady," in which he falsely claimed he had been wearing a bomb when the agent came to his door and fantasized about killing her:
Took all the strength I had not to turn the bitch ghost
Pull my knife, flick my wrist, and slit her throat . . .
These and other posts ultimately led to Elonis being convicted on four counts of threats, one each for threatening his wife, the police, a school, and the FBI agent. He was sentenced to 44 months in prison.
The Threats Statute
Elonis was convicted under Title 18, U.S. Code, section 875(c), which provides:
Whoever transmits in interstate or foreign commerce any communication containing any threat to kidnap any person or any threat to injure the person of another, shall be fined under this title or imprisoned not more than five years, or both.
Elonis’s defense was that he never intended to threaten anyone. He claimed his posts were therapeutic and a form of artistic expression, similar to the rap artists he professed to admire. He argued that the proper interpretation of 875(c) requires the government to prove that he subjectively intended to put the targets of his alleged threats in fear, and that to prosecute him for anything less violates the First Amendment.
The lower courts rejected his argument and upheld his convictions, applying the majority rule that the intent of the person making the threats does not matter. The government needs to prove only that the defendant made a statement under circumstances or in a context where a reasonable person would foresee that the statement would be interpreted as a serious expression of an intent to do harm – even if the defendant didn't really mean it or know it would be interpreted that way. The rationale is that the fear and disruption caused by an apparent threat take place regardless of the speaker's personal intent, and Congress is free to punish those who cause that fear.
The Supreme Court Argument – Chief Justice Roberts Channels Eminem
The issue in this case is not whether threats on Facebook can ever be prosecuted. It's settled that "true threats" fall into the narrow category of speech that is not protected by the First Amendment, along with obscenity, defamation, and "fighting words" that incite violence. The issue is what the government has to prove about the defendant's state of mind in order to establish that the statements at issue were indeed "true threats." During the Supreme Court argument four different possible standards emerged (in decreasing order of the level of proof required):
1) The defendant subjectively and specifically intended that the statements would place the target of his threats in fear of being harmed. (This is the standard argued for by Elonis, at least initially.)
2) The defendant knew that a reasonable person, looking at the statements, would be placed in fear of being harmed. (The “knowledge” standard.)
3) The defendant knowingly made the statements with a reckless disregard for whether the recipient would be placed in fear of being harmed. (The “recklessness” standard.)
4) The defendant knowingly made the statements, and regardless of what the defendant personally knew or intended about their effect, a reasonable person looking at those statements would think they were a serious expression of an intent to harm another. (The standard adopted by the lower courts and most other courts, and argued for by the government.)
The Supreme Court case essentially boils down to this: which of these standards should be the law?
At the oral argument on Monday, the Court immediately homed in on this issue. The lawyer for Elonis seemed to get in trouble early, giving varied and conflicting responses on what the correct standard should be. Throughout the litigation Elonis had consistently argued that the government should have to prove he subjectively intended to cause fear (#1 above), but during the argument his lawyer seemed to back away from that standard and suggest it would be enough if the government had to prove only that the defendant knew a reasonable person would consider the statements to be true threats (#2 above). His inconsistent responses finally led Justice Scalia to comment, "You really have me confused at this point."
At the same time, most of the Justices did not appear to be buying the government’s argument that it doesn’t matter at all what the defendant intended or knew, only that the statements looked threatening to a reasonable person. Justice Kagan observed that the government essentially was arguing for criminal liability based on negligent speech, and “that’s not the kind of standard that we typically use in the First Amendment.”
In what may be a Supreme Court first, Chief Justice Roberts quoted rap lyrics by Eminem about drowning his ex-wife, and asked the government attorney whether that could be prosecuted. The response was no, because in the context of a musical rap performance, no reasonable person would perceive Eminem reciting those lyrics as an actual threat – it’s just a performance.
The Context is the Key
As the exchange with the Chief Justice about Eminem highlighted, when it comes to threats, context is everything. Violent rap lyrics that no one would perceive as a threat in the context of a stage performance could most definitely be a threat if those same words were whispered menacingly into the ear of another person. Similarly, if Elonis had used the same language from some of his Facebook posts in a phone call or written letter to his wife, there’s little doubt they would be considered threats.
But what exactly is the proper context when talking about Facebook? Even within Facebook itself, you can imagine a number of different scenarios. For example: Elonis posted only on his own wall; he was not Facebook friends with his wife, the FBI agent, or the others who were the subjects of his posts; and he did not tag them. Writing on your own wall is somewhat akin to a public event. The posts can be seen by all of your friends, commented on, forwarded, and "liked" by a potentially unlimited number of people. Is this more like a rap concert performance or more like whispering in someone's ear?
Suppose Elonis had tagged his wife, written on her wall instead of his own, or sent her a private message through Facebook? Would each of these lead to a different result? And what does a "reasonable person" mean when it comes to interpreting what one sees on Facebook – is it the reasonable, Internet-savvy teenager, or the reasonable septuagenarian Supreme Court Justice?
On-line communication suffers from an inability to convey nuance, tone, inflection, facial expression, body language – all things that can be critical to determining the speaker’s true meaning in face-to-face communication. Probably everyone has had the experience of sending an e-mail or posting something that was intended to be sarcastic or funny but was perceived as serious, or vice-versa. Indeed, a whole world of emoticons has sprung up to help us try to convey emotions or attitudes along with the digital written word.
This makes it particularly important to have legal standards that ensure protected speech does not end up being prosecuted. When it comes to Elonis’s posts, as Justice Scalia repeatedly pointed out, it’s hard to argue that there’s a lot of redeeming social value there. Nevertheless, the heart of the First Amendment is the protection of even speech that many find vile or offensive, whether it be violent rap lyrics, flag burning, or Ku Klux Klan rallies. And the standard the Court adopts will, of course, affect not merely Elonis but all future speakers (and potential defendants).
The Consequences of a Higher Standard of Proof
I expect the Supreme Court will adopt some kind of middle ground (#2 or #3 above), not requiring subjective intent as Elonis argued but also not accepting the broad rule requested by the government. Justice Breyer seemed to suggest during the argument that #2 was effectively already the law: the defendant has to know that he is making statements that are true threats, which means by definition he has to know that a reasonable person would be put in fear by the statements. The Court may conclude that requiring at least this kind of knowledge is a reasonable middle ground.
This doesn’t mean someone like Elonis could never be prosecuted. As with so many criminal appeals, this case is all about the jury instructions: what should the jury have been told it had to find concerning the defendant’s state of mind? If the Court rules that the Elonis jury should have been told it had to find knowledge or intent by Elonis, his own convictions will be reversed, but for future cases the jury instructions will simply be modified. Elonis himself could even be re-tried, if the government chose to do so, and I wouldn’t be surprised if he were convicted again.
Proving intent or knowledge is not some kind of insurmountable hurdle. Prosecutors do it all the time. As a prosecutor, I wouldn’t hesitate to take a case like Elonis to a jury and argue that the evidence established he knew or intended that his posts would place his wife in fear. And if a defendant tried to fabricate a “rap lyric” defense, the prosecution could present evidence to establish it was merely a ruse – again, disproving an alleged defense is nothing unusual. If the Court rules for Elonis, it’s not going to be some kind of “get out of jail free” card for future stalkers and harassers.
Imposing a higher proof requirement isn't about condoning Elonis's reprehensible conduct; it's simply about strictly interpreting statutes that criminalize speech. A higher standard of proof will make threats prosecutions somewhat more difficult, but that's not necessarily a bad thing. The government has to tread very lightly when it seeks to turn written words into a federal felony. Given our First Amendment heritage and devotion to free expression, it's not too much to ask that the government prove some level of intent or knowledge when seeking to send someone to jail solely for what they wrote. I hope the Court agrees.
What standard should the Court adopt? Leave a comment below.
Sign up for free e-mail updates of future posts by clicking on the “Follow” button on the upper right.