Roblox, Discord sued after 15-year-old boy was allegedly groomed online before he died by suicide
Ethan Dallas was targeted by an adult sexual predator on Roblox when he was 12, and later on Discord, according to a lawsuit. He took his own life last year.

The mother of a 15-year-old California boy who took his own life is now suing Roblox and Discord over his death, alleging her son was groomed and coerced to send explicit images on the apps.
Rebecca Dallas filed the lawsuit Friday in San Francisco County Superior Court, accusing the companies of "recklessly and deceptively operating their business in a way that led to the sexual exploitation and suicide" of Ethan Dallas.
Ethan was a "bright, imaginative boy who loved gaming, streaming and interacting with friends online," the lawsuit states. He started playing on the online gaming platform Roblox around the age of 9, with his parents' approval and with parental controls in place. When he was 12, he was targeted by "an adult sex predator" who posed as a child on Roblox and befriended Ethan, attorneys for Rebecca Dallas said in a statement.
What started out as innocent conversation "gradually escalated to sexual topics and explicit exchanges," the complaint says.
After a while, the man encouraged Ethan to turn off parental controls and move their conversations to Discord, the lawyers said.
On Discord, the man "increasingly demanded explicit photographs and videos" and threatened to post or share the images if Ethan refused. Ethan complied out of fear, the complaint says.
After Ethan's tragic death, his family learned from law enforcement that the man who groomed him had been arrested in Florida "for sexually exploiting other children through Defendants' apps," the complaint said.
Today, Roblox's default settings do not allow adults to directly message children under the age of 13, but children can still create accounts with fake birth dates, giving them full access to direct-messaging options, the complaint said.