Discord terrorist known as β€œRabid” gets 30 years for preying on kids

8 November 2024 at 20:32

A Michigan man who ran chat rooms and Discord servers targeting children playing online games and coercing them into self-harm, sexually explicit acts, suicide, and other violence was sentenced to 30 years in prison Thursday.

According to the US Department of Justice, Richard Densmore was a member of an online terrorist network called 764, which the FBI considers a "tier one" terrorist threat. He pleaded guilty to sexual exploitation of a child as "part of a broader indictment that charged him with other child exploitation offenses." In the DOJ's press release, FBI Director Christopher Wray committed to bringing to justice any abusive groups known to be preying on vulnerable kids online.

"This defendant orchestrated a community to target children through online gaming sites and used extortion and blackmail to force his minor victims to record themselves committing acts of self-harm and violence," Wray said. "If you prey on children online, you can't hide behind a keyboard. The FBI will use all our resources and authorities to arrest you and hold you accountable."

© NurPhoto / Contributor | NurPhoto


Lawsuit: Chatbot that allegedly caused teen's suicide is now more dangerous for kids

23 October 2024 at 23:52

Fourteen-year-old Sewell Setzer III loved interacting with Character.AI's hyper-realistic chatbots (with a limited version available for free or a "supercharged" version for a $9.99 monthly fee), most frequently chatting with bots named after his favorite Game of Thrones characters.

Within a month (his mother, Megan Garcia, later realized), these chat sessions had turned dark, with chatbots insisting they were real humans and posing as therapists and adult lovers, seemingly spurring Sewell to develop suicidal thoughts. Within a year, Setzer "died by a self-inflicted gunshot wound to the head," a lawsuit Garcia filed Wednesday said.

As Setzer became obsessed with his chatbot fantasy life, he disconnected from reality, her complaint said. Detecting a shift in her son, Garcia repeatedly took Setzer to a therapist, who diagnosed him with anxiety and disruptive mood disorder. But nothing helped to steer Setzer away from the dangerous chatbots. Taking away his phone only intensified his apparent addiction.

© via Center for Humane Technology
