Social media companies must be accountable to the democracies that make their businesses possible.
Major technology companies, such as Google, Facebook, Twitter, Snap and others, define the information ecosystem in much of the world. Hardly regulated and hardly accountable, these companies are completely transforming the public sphere.
While these platforms present new opportunities to connect people around the world, they also create attack surfaces for bad actors that wish to spread misinformation, encourage terrorism, engage in online harassment, steal personal data, restrict free speech and suppress dissent.
The age of unregulated social media must end. But bad regulation could cause its own problems. Now is the time to have these discussions, before we end up with misguided rules, and before it is too late. Across the country, on university campuses, at industry conferences, and in other public forums, we urgently need to frame and debate these issues.
Here are three areas to explore:
1. Greater transparency to governments and independent researchers
Right now, the technology companies operate with little scrutiny. It is crucial that there be more transparency, both to governments and to independent researchers who can help society understand the consequences of these vast new communications platforms. This means access to data and to systems. That access is complicated to arrange, but it is necessary if we are to ensure these technologies are, at the very least, a net benefit to society going forward. Conversations should center on how to create frameworks and mechanisms that provide for such scrutiny. University researchers, for instance, can serve as powerful partners to governments and technology companies in understanding these platforms and how society uses them.
2. Accountability and transparency to citizens
Citizens need to know how these platforms operate, how they shape user experiences, and what the companies are doing with user data.
Further, citizens should have the right to know when the technology companies make a mistake, or when breaches occur, such as the Russian election interference campaign in 2016 or the recently announced harvesting of Facebook user data by Cambridge Analytica. How to mandate such disclosures, and what robust consumer protections should be required, are areas ripe for discussion.
3. Responsibility for addressing social costs
Facebook’s market cap, for instance, isn’t far from that of ExxonMobil. Any company that has reached such scale produces some form of pollution or other negative social cost. Often, regulators seek to make industry pay for it. While Google and Facebook have each invested in initiatives designed to address these kinds of externalities, particularly their impact on the news media, clearly much more is needed. Governments should explore the appropriate levies or penalties to place on these technology companies, and what types of activities to finance in order to remediate negative social costs.