The slow-motion implosion of Elon Musk’s X has created an opening for a host of competitors, many of whose own decade-long privacy breaches have largely gone unpunished. When Musk took over Twitter in October 2022, experts warned that his proposed changes, including reduced content moderation and a subscription-based verification system, would alienate users and advertisers.
A year later, these predictions have largely come true. Advertising revenue on the platform fell 55 percent after Musk’s takeover, and the number of daily active users fell from 140 million to 121 million over the same time period, according to a third-party analysis.
As users move to other online spaces, the past year could have been an opportunity for other social platforms to change the way they collect and protect user data. “Unfortunately, it seems that, regardless of their stated interests or the cultural tone set since their founding, these companies have not abandoned a gluttonous approach to collecting all of our data in favor of a more minimalist one,” said Jenna, policy director at Free Press, a nonprofit media watchdog organization, and lead author of a new report examining Bluesky, Mastodon and Meta’s Threads, all of which have sought to fill the void left by Twitter, now known as X.
Companies like Google, X, and Meta collect vast amounts of user data, partly to better understand and improve their platforms, but mostly to be able to sell targeted advertising. But collecting sensitive information about race, ethnicity, sexuality, or other identifiers can put people at risk.
For example, Meta and the US Department of Justice reached a settlement earlier this year after the company’s ad-delivery algorithm was found to allow advertisers to exclude certain racial groups from seeing ads for things like housing, jobs and financial services.
In 2019, the company was fined $5 billion — one of the largest privacy penalties in history — after a Federal Trade Commission investigation, prompted by revelations that user data had been improperly provided to British consulting firm Cambridge Analytica, found multiple instances of the company failing to protect user data. (Meta has since made changes to some of these ad targeting options.)
“There is a very strong correlation between the data that is collected about us and the automated tools that platforms and other services use, which often produce discriminatory results,” said Nora Benavidez, director of digital justice and civil rights at Free Press.
“And when that happens, there’s really no choice but to sue.”

Even for users who want to opt out of extensive data collection, privacy policies are complex and vague, and many users don’t have the time or legal knowledge to parse them. At best, users can determine what data won’t be collected, Benavidez said, “but in all cases, users should review the policy to try to understand what’s actually happening with their data. My concern is that these companies’ practices and policies are so bad and misleading that people don’t really understand what’s at stake.”
“It seems like Threads is collecting a lot more information than it actually needs to run the service. And some of the information it collects is quite sensitive,” said Callie Schroeder, global privacy counsel at the Electronic Privacy Information Center, a nonprofit organization dedicated to online privacy and free speech. “I think it’s just inextricably linked to the fact that Meta, Threads’ parent company, already holds an absolutely obscene amount of information about people.”
Before Musk’s takeover, Twitter had its own checkered history of protecting user data. Hackers compromised the platform twice in 2009, gaining access to users’ private information and, in some cases, hijacking accounts. In 2011, the FTC issued a consent decree — a legally binding settlement — against Twitter for failing to protect user data following the 2009 hacks.
Under the agreement, “Twitter will be prohibited for 20 years from misleading consumers about the extent to which it protects the security, privacy and confidentiality of non-public consumer information,” according to the FTC, with each violation subject to a $16,000 fine.
So far, efforts to limit the collection of user data have been piecemeal, mostly driven by state-level laws and isolated enforcement actions. The American Data Privacy and Protection Act, proposed in 2022, remains in limbo in Congress.
“Regulation continues to lag far behind. Companies will not change on their own.”