Social media platforms engaged in ‘exhaustive surveillance’ and failed to protect young people, FTC says


Major social media platforms and video streaming services that collect vast amounts of user data have failed to protect young people and online privacy, the Federal Trade Commission said Thursday.

“These surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking,” FTC Chairwoman Lina Khan said in a statement. “It’s especially troubling that several companies fail to adequately protect children and teens online.”

The agency, which focuses on consumer protection and antitrust enforcement, published a 129-page report examining how some of the world’s largest social media platforms, including Instagram, TikTok and YouTube, collect and use vast amounts of user data. The findings reflect growing scrutiny of online platforms by regulators and lawmakers seeking to combat the potential harms of technology as it becomes more deeply intertwined with people’s everyday lives.

Politicians and consumer advocates have long criticized how companies like Facebook collect user data and use it to target ads to people based on their interests, location, gender and other information. There has also been concern about how teenagers handle the potential downsides of social media, including exposure to illegal drug sales and harmful comparisons with their peers.

The report draws on data the FTC ordered from the largest social media and video streaming platforms in 2020. Those companies include Snap; Facebook, now Meta; Google-owned YouTube; Twitter, now X; TikTok owner ByteDance; Discord; Reddit; and Meta-owned WhatsApp.

The responses revealed how companies collect information about consumers, including their household income, their activity elsewhere on the internet, their location and more. The platforms gather this information through ad-tracking technology, from data brokers and from users themselves, who reveal their interests by interacting with online content. Some companies did not delete data about people who requested its removal, according to the report.

Although most social media platforms require teens to be at least 13 to create accounts, people can easily lie about their age, and platforms collect information from teens just as they do from adults.

In an effort to address ongoing criticism, social media companies have rolled out features that give parents more control over their children’s online experiences. This week, Meta said it would make accounts private for those under 18 by default, stop sending notifications to minors at certain times and provide more parental controls. Snap, which held its annual conference on Tuesday, said it is partnering with Common Sense Media to develop an app to help families learn more about potential online harms.

Lawmakers, including in California, are trying to address young people’s data privacy and safety concerns by passing new laws. But those efforts have faced legal hurdles, in part because of a section of federal law that shields online platforms from legal liability for user-generated content.

Meta and Snap declined to comment on the report. Meta is a member of the Interactive Advertising Bureau, a trade group that said in a blog post it was “disappointed” by the agency’s characterization of the digital advertising industry as one engaged in mass surveillance.

Discord, which allows users to communicate via text messages, video and voice calls, has tried to differentiate itself from other social media platforms, noting that it is built around direct communication rather than encouraging users to scroll endlessly through content.

“The FTC report’s purpose and focus on consumers is an important step. However, the report lumps very different models into one group and paints a broad brush that can confuse consumers and misrepresent some platforms, like Discord,” said Kate Sheerin, Discord’s head of public policy for the U.S. and Canada, in an email.

The platform, which is popular with gamers, had also long avoided advertising but began running ads this year.

Google, which owns YouTube, said in a statement that it has the “strongest privacy policies” and has implemented a number of measures to protect children, including not allowing personalized ads for users under 18.

Other companies mentioned in the report did not immediately respond to a request for comment.

The FTC noted that its findings are limited because technology and business practices change quickly. The responses the agency received from companies covered the period from 2019 to 2020, according to the report.

The agency included recommendations for companies and urged Congress to pass legislation that would protect users’ privacy and give consumers rights over their data. According to the FTC, companies should take steps to reduce potential risks, such as collecting only the information they need, being more transparent about their practices and providing default protections for teens and young users.

“As policymakers consider different approaches to protecting society, it is important to focus on the root causes of many harms, not just the symptoms,” the report says.
