Social apps have struggled to curb drug sales for years. They’re still not doing enough, say parent groups and the U.S. Drug Enforcement Administration.
Federal officials agree. This week, the agency issued a warning about the rise of fake pills bought online that contain fentanyl, a synthetic opioid that can be fatal in small doses. In an interview, DEA Administrator Anne Milgram specifically called out Snapchat and TikTok, two apps popular with teenagers and young adults, for not doing more to combat sales, and said the agency plans to go to social media companies with specific demands.
The public health crisis is renewing calls not just from law enforcement groups but from concerned parents and researchers for the social media companies to do more. They want the companies to be more transparent about what’s happening on their platforms, to share more data with each other to help catch drug dealers who jump between apps, and to increase parental controls so parents can keep tabs on what their kids are doing online.
Marc Berkman, chief executive of the Organization for Social Media Safety, said the nonprofit ran an informal test and found it was able to connect with drug dealers on multiple social media sites in under three minutes. A March report from the Digital Citizens Alliance, a consumer watchdog group, and the Coalition for a Safer Web showed how Facebook pages, Instagram accounts and YouTube videos were used to promote drugs, in some instances to thousands of followers or viewers.
Facebook spokeswoman Avra Siegel said in a statement that the company doesn’t allow people to buy or sell drugs on its sites. It plans to join the nonprofit Partnership to End Addiction on Wednesday to work on a series of public service announcements about opioid addiction, she said.
Snap spokeswoman Rachel Racusen said in an email that the company strictly prohibits drug-related activity and fights it, as well as supporting law enforcement in investigations. The company also promotes videos about the dangers of drugs to teens who use its app.
TikTok spokeswoman Hilary McQuaide said in a statement it also removes accounts that promote illegal drug sales, using both technology and human reviewers to find and evaluate the violative material. TikTok blocks searches of some drug-related terms, instead directing users to its policies. It redirected a search term Monday after a Washington Post inquiry regarding drug-related content it surfaced.
Some researchers acknowledge the difficulty of the problem. The sheer scale of information on the platforms is hard to police, and algorithms can do only so much. Drug dealers can use strange fonts or edit photos advertising drugs in ways that can trick image-recognition algorithms, creating an arms race between drug sellers and the social media companies.
“You see a lot more sophistication, a lot more aggressive moderation evasion,” said Tim Mackey, an associate professor at the University of California at San Diego who runs a research start-up that has helped companies including Snap detect illegal online activity.
There’s more that the companies can do though, Mackey said, including investing more in training the artificial-intelligence algorithms that search for and remove drug advertisements. The companies should also share information on accounts that try to sell drugs, he said. Social media companies have done that in the past to identify and take down terrorist accounts, but they’re often hesitant to do it in other areas because of privacy concerns.
But many of the drug deals are taking place across platforms, Berkman said. For example, people may connect with drug dealers online on one site, then message with them on a second site and purchase pills on a third.
Purchasing drugs through social media is “as easy as ordering a pizza,” Berkman said.
Some parents are demanding companies provide better ways to oversee their children’s social media accounts.
Sam Chapman said he and his wife, television host and relationship therapist Laura Berman, call themselves “accidental activists.” Their 16-year-old son Sammy died in February after taking drugs he bought online that contained fentanyl.
Sammy met the drug dealer on Snapchat, Chapman said, where he was “presented a colorful menu designed for kids.”
“We thought sexting was all you had to worry about, but it’s much worse,” Chapman said. “It’s child sex slavery, it’s drugs, it’s bullying.”
Chapman and Berman are now working with the Organization for Social Media Safety on a bill for Congress that would require social media sites to integrate with parental monitoring software. That type of software allows parents to get alerts when their kids interact with concerning or illegal content online.
Some privacy advocates say these monitoring tools can actually hurt kids, for example by outing children who are gay to their parents and potentially exposing them to homophobic abuse. A Vice News investigation this year found that some child-monitoring software doesn’t work as well as it’s advertised.
But parents say it could help save children from harm. Chapman said he and Berman have asked Snapchat to allow one type of monitoring software to work on its site, but the company told him it had concerns about user privacy.
“We’re trying to protect from criminals, and they’re talking about privacy,” he said.
Snapchat is building its own parental control tools, said the company’s Racusen.
“Our goal is to deliver solutions that work effectively and reliably without compromising the security and data privacy of our community,” she said.