
Big Tech transparency reports are a big mess

A new study suggests that none of the 13 largest tech companies meets key principles for transparency.


At regular intervals, Big Tech’s biggest names release data showing how well they’re responding to the challenges of running world-dominant platforms. Transparency reports share how often companies have taken action against inappropriate content—nudity, terrorism, or hate speech—and when they’ve been asked by national authorities and regulators to step in to help law enforcement.

There’s just one problem: Transparency reports aren’t all that transparent, according to a new analysis by academics Aleksandra Urman at the University of Zurich and Mykola Makhortykh at the University of Bern.

Transparency reports are meant to keep tech platforms honest, with companies self-reporting key metrics that show how well they’re operating. Companies release the reports on a regular basis, often every six months. Google released its first transparency report in September 2010, and other companies have since followed in the search giant’s footsteps.

Yet there are questions over just how transparent these tools are. To gauge the transparency of tech companies’ reporting on takedowns, Urman and Makhortykh measured the contents of each company’s reports against the Santa Clara Principles, a set of best-practice guidelines for transparency in internet platforms’ content moderation. The principles were first developed in 2018 and have since been endorsed by a dozen big tech companies, including many of those examined in the paper.

POOR MARKS ALL AROUND

The researchers found that not a single tech company fully follows the principles it has previously endorsed. Some platforms, like Snapchat, disclose the total number of pieces of content they’ve taken action against and accounts they’ve suspended, in full adherence to the principles, but almost all other companies follow the guidelines only partially.

“The Santa Clara Principles explicitly state that for all the numbers that they suggest being released, there should be a breakdown by country,” says Urman. “And that’s what most of these reports are missing.”

Some companies fare worse than others: Apple and Amazon, two tech titans, fail to meet many of the Santa Clara Principles as measured by Urman and Makhortykh. That’s a concern, says Urman, because of the outsize role both play in our digital lives.

“I think it’s not entirely obvious what Apple would be taking down,” says Urman. “But actually, Apple does moderate its App Store, for example. We just don’t know how.” (Apple did not respond to a request for comment.) Likewise, Amazon is believed to moderate some of the books it sells on its site, removing those it deems inappropriate from sale. “Does Amazon do something about it? We don’t know,” says Urman. Microsoft, LinkedIn, and Twitch adhere to some of the Santa Clara Principles, but only a handful. Spokespeople for Microsoft and LinkedIn declined to comment for this story.

Urman also highlights some confusion around Google and YouTube’s transparency reporting. “Google was actually the pioneer of transparency reporting. [But] they don’t have a breakdown by all of their products,” she says. “There are some reports that are YouTube only. There are other reports that are Google—everything—without any breakdown. Like what did they take down from YouTube? What from web search? What from [the] myriad of their other products like Google Docs or something? We also don’t know.”

A Google spokesperson said: “Google has long been a leader in transparency reporting across our products, and we regularly release information about government requests to remove content, as well as the actions we take to protect our users and platforms from content that violates our policies or local laws. We believe in being open and transparent about our work, and we’ll continue to look for ways to expand these efforts in the future.”

For Liam McLoughlin, a lecturer at the University of Liverpool with a specialism in social media and content moderation, the findings aren’t all that surprising. “Some might call me cynical on this, but when we think of transparency reports as primarily a public relations mechanism, it all makes sense,” he says. “Transparency reports themselves might be doing as intended: reassuring users. If [they weren’t], platforms might feel compelled to hand over more information.”

Fast Company approached all 13 companies named in the academic paper, including those quoted above. A Reddit spokesperson said: “We provide robust data about our platform, content moderation efforts, and government requests in our Transparency Reports. We have consistently expanded these reports each year, including recently moving to a biannual reporting cadence and adding breakdowns for government requests by U.S. state as well as data about user notifications for legal requests. We’ll continue to provide new types of insights in our future Transparency Reports.”

GitHub, Meta, TikTok, Twitter, Snapchat, Pinterest, Amazon, and Twitch did not respond to requests for comment by a set deadline.

GREAT POWER, GREAT RESPONSIBILITY

Transparency is important, says Urman, because of tech companies’ central role in our lives. It also gives politicians better information with which to decide how best to regulate such platforms. “It’s good to be well-informed about what is happening,” she says. “And right now, it’s impossible in many aspects, which I guess might lead to over- and under-regulation of the platforms.”

Openness in transparency reports also helps the public feel more confident in what Big Tech companies do, and lets ordinary people exert pressure should they feel tech firms are making missteps along the way. Knowing, for instance, that a tech company acceded to a request from an authoritarian regime to take down content can prompt the public to ask why that’s happening, and what impact it has.

The fact that companies often release their transparency reports only in English stymies that watchdog role, says Urman. Likewise, organizations that refuse to release detailed country-by-country breakdowns of the actions they take could be masking major issues outside established Western markets, where most of the money, and most of the resources, are concentrated. (In December, Meta was hit with a class action lawsuit filed in Kenya for allegedly allowing hate speech to flourish on Facebook in Ethiopia. Last year, Amnesty International also found that Meta contributed to atrocities committed in Myanmar.)

“Sometimes it’s called ‘transparency washing,’ where they release only information that is good for their image,” says Urman. “If you don’t release information on content moderation by country, there’s no way to know if it only works in, let’s say, the U.S. but nowhere else.”
