The Digital Services Act
05-01-2022 — Roger Berkowitz
Frances Haugen, the woman who blew the whistle on Facebook, has put her influence behind the European Union’s attempt to regulate social media not by regulating content, but by ensuring transparency around systemic practices and standards. The European Digital Services Act passed this month does just that, seeking to “make social media far better without impinging on free speech.” The Act is an important model because it does not regulate content or take aim at offensive speech. Instead, it requires that social media companies reveal how their algorithms privilege some material over other material. This new transparency will show how lies and hate proliferate. And it will empower governments, corporate boards, and other public actors to hold media companies accountable for their actions. Haugen, who will be a keynote speaker at the Hannah Arendt Center Conference “Rage and Reason: Democracy Under the Tyranny of Social Media,” writes:
How the new European law is carried out will be just as important as passing it. It is a broad and comprehensive set of rules and standards, not unlike food safety standards for cleanliness and allergen labeling. But what is also remarkable about it is that it focuses on oversight of the design and implementation of systems (like how algorithms behave) rather than determining what is good or bad speech.
The law requires that Facebook and other large social platforms be transparent about what content is being amplified and shared virally across the platform. And they must apply consumer protections to features that, among other things, spy on users, addict kids or weaken public safety. With transparency finally required, it will be easier for European regulators and civil society to verify that companies are following the rules.
These rules are like systems in the United States that compel pharmaceutical companies to keep drugs safe and to allow the Food and Drug Administration to independently verify the results. Most people aren’t aware of them, but we’re all glad they are there.
The new requirement for access to data will allow independent research into the impact of social media products on public health and welfare. For example, Facebook, Instagram and others will have to open up the black box of which pages, posts and videos get the most likes and shares — shining light on the outcomes of the algorithms.
Read the full article here.