

In a review of international, national and local criminal complaints, news articles and law enforcement communications published since Discord was founded, NBC News identified 35 cases over the past six years in which adults were prosecuted on charges of kidnapping, grooming or sexual assault that allegedly involved communications on Discord. Twenty-two of those cases occurred during or after the Covid pandemic. At least 15 of the prosecutions have resulted in guilty pleas or verdicts, and many of the other cases are still pending.

In March, a teen was taken across state lines, raped and found locked in a backyard shed, according to police, after she was groomed on Discord for months. In another case, a 22-year-old man kidnapped a 12-year-old after meeting her in a video game and grooming her on Discord, according to prosecutors.

NBC News identified an additional 165 cases, including four crime rings, in which adults were prosecuted for transmitting or receiving CSAM via Discord or for allegedly using the platform to extort children into sending sexually graphic images of themselves, also known as sextortion. It is illegal to consume or create CSAM in nearly all jurisdictions across the world, and it violates Discord's rules. At least 91 of those prosecutions have resulted in guilty pleas or verdicts, while many other cases are ongoing.

Those numbers represent only cases that were reported, investigated and prosecuted, each of which presents major hurdles for victims and their advocates. "What we see is only the tip of the iceberg," said Stephen Sauer, the director of the tipline at the Canadian Centre for Child Protection (C3P).

Discord isn't the only tech platform dealing with the persistent problem of online child exploitation, according to numerous reports over the last year.
