Immediate Bipartisan Fixes for Social Media
The Wall Street Journal’s Facebook Files, along with other recent reporting from ProPublica and MIT Technology Review’s Karen Hao, demonstrate the failure of the platforms’ whack-a-mole strategy for addressing the harms they cause. Instead of narrowly focusing on individual pieces of content, it is time for regulators and platforms to fix the design features that encourage deception and threats to public safety. Our Digital New Deal project produced a Roadmap for Safeguarding Digital Democracy recommending reforms. Below are immediate steps – most not requiring legislation – to update the rules and norms and fix social media’s broken system.
End Differential Enforcement of Rules
Fix 1: Black Box Data Recorder
Make platform enforcement of terms of service visible and appealable. The platforms reveal what they choose in transparency reports, but there is rarely detail about which policies are enforced, on whom, and at what rates. A “black box data recorder” and access to its data are necessary to allow auditing of enforcement practices. There should also be an efficient and transparent appeals process. Facebook’s recently revealed Content Distribution Guidelines, which outline the types of content that receive reduced distribution, are an adequate first step, but such transparency efforts should be guaranteed and standardized across the industry.
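As one illustration, a minimal sketch of what a single record in such a “black box” enforcement log might contain follows; the field names and values are assumptions for illustration, not any platform’s actual schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class EnforcementRecord:
    """One auditable entry in a hypothetical 'black box' enforcement log."""
    content_id: str   # identifier of the post, ad, or account acted on
    policy: str       # the specific rule invoked (e.g., "misinformation-2.1")
    action: str       # e.g., "removed", "reduced-distribution", "labeled"
    automated: bool   # whether the decision was made without human review
    appealable: bool  # whether the user was offered an appeal
    timestamp: str    # when the action was taken (UTC, ISO 8601)

# Example record that an auditor or an appeals process could inspect.
record = EnforcementRecord(
    content_id="post-12345",
    policy="misinformation-2.1",
    action="reduced-distribution",
    automated=True,
    appealable=True,
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(asdict(record))
```

Aggregating records like this across a platform is what would let outside auditors measure which policies are enforced, on whom, and at what rates.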
Fix 2: Third-party Audits
Require routine third-party audits of terms-of-service enforcement, with the results made publicly available.
Prevent Harm to Young Women and Children
Fix 3: AI and Algorithms Transparency Standard
Insist that AI and algorithms targeting children meet higher transparency standards and provide greater explainability and justification, in keeping with FTC enforcement of the Children’s Online Privacy Protection Act.
Fix 4: Criminalize Unauthorized Intimate Images
Criminalize the unauthorized disclosure of intimate images, and require that users be notified when they are viewing a deepfake video or other piece of altered media.
Fix 5: Funding for Law Enforcement Training
Allocate funding for law enforcement training on tackling online abuse and for investigating and prosecuting online sexual harassment, stalking, and threats.
Hinder Conspiracy Theories
Fix 6: Circuit Breaker
Deploy a circuit breaker:
- Implement a tool to limit the exponential amplification of content until human reviewers can determine whether the content violates platform policies or poses a risk to public safety (e.g., triggered when content reaches a set number of interactions on a platform, or across platforms, within 12 hours).
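A minimal sketch of such a trigger, assuming a hypothetical interaction counter and a 12-hour window, follows; the threshold and function names are illustrative, not drawn from any platform’s actual system.

```python
from datetime import datetime, timedelta, timezone

# Illustrative threshold and window (assumptions, not platform policy).
INTERACTION_THRESHOLD = 100_000   # shares, likes, and replies across platforms
WINDOW = timedelta(hours=12)

def circuit_breaker_tripped(interaction_times: list[datetime],
                            now: datetime | None = None) -> bool:
    """Return True if a piece of content should be held for human review.

    interaction_times: timestamps of each interaction with the content.
    The breaker trips when interactions within the last 12 hours reach the
    threshold, pausing further algorithmic amplification until reviewers
    decide whether the content violates policy or threatens public safety.
    """
    now = now or datetime.now(timezone.utc)
    recent = [t for t in interaction_times if now - t <= WINDOW]
    return len(recent) >= INTERACTION_THRESHOLD
```

If the breaker trips, the platform would stop recommending the content and queue it for review rather than removing it outright.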
Fix 7: Replace Manipulative "Dark Patterns"
Replace “dark patterns” that manipulate users with designs that support transparency and choice. By default, users should be able to avoid:
- data sharing, with opt-ins to grant third-parties data collection permission;
- being tested or micro-targeted for ads; and
- having their newsfeed or timeline sorted for them without meaningful alternative options.
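One way to picture these defaults is as a settings object in which every form of data sharing, experimentation, and micro-targeting starts disabled and changes only through an explicit user choice; the setting names below are illustrative assumptions, not an existing product configuration.

```python
from dataclasses import dataclass

@dataclass
class UserDefaults:
    """Hypothetical privacy-by-default settings; every opt-in starts off."""
    share_data_with_third_parties: bool = False  # third-party collection requires opt-in
    include_in_experiments: bool = False         # no testing on users without consent
    allow_ad_microtargeting: bool = False        # no micro-targeted ads by default
    feed_sort: str = "chronological"             # meaningful alternative to ranked feeds

settings = UserDefaults()                # a new account starts with everything off
settings.allow_ad_microtargeting = True  # changes only via an explicit user choice
```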
Fix 8: “PBS of the Internet”
Build the “PBS of the Internet” as civic infrastructure:
- Establish practices and norms that incentivize engagement with accurate content and disincentivize those who disrupt accuracy.
- Support a wide range of public digital media applications and content producers—including independent journalists, local governments, nonprofits and educational institutions—to increase the volume and diversity of local civic information.
- Use open protocol standards and APIs to let consumers mix and match the content they want amplified from a wide variety of sources.
- Insist on data governance grounded in human rights and ethics.
Fix 9: Interoperability
Establish “interoperability” so users are not locked into individual gatekeepers:
- Require interoperable, standard APIs.
- Require portability of data to help users switch to new platforms and foster greater competition.
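As an illustration of what portable data could look like, the sketch below bundles a user’s profile, posts, and social graph into a plain JSON document that a competing service could import. The structure and field names are assumptions for illustration, not an existing standard (open protocols such as ActivityPub already address parts of this problem).

```python
import json

def export_user_data(profile: dict, posts: list[dict], followers: list[str]) -> str:
    """Bundle a user's data into a portable JSON document (illustrative schema)."""
    bundle = {
        "schema_version": "1.0",
        "profile": profile,
        "posts": posts,
        "followers": followers,
    }
    return json.dumps(bundle, indent=2)

print(export_user_data(
    profile={"handle": "@example", "display_name": "Example User"},
    posts=[{"id": "1", "text": "Hello, world", "created": "2021-10-01T12:00:00Z"}],
    followers=["@friend1", "@friend2"],
))
```

A standardized export like this is what would let users carry their history and connections to a new platform instead of being locked in.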
Combat Human Trafficking, Drug Cartel, and Violent Extremist Recruitment and Advertising
Fix 10: Auditing and Reporting
Catalyze a new international effort to bring platforms to the table to share information and implement best practices to prevent such content – with auditing and reporting.
Expose Troll Farms and Political Manipulation
Fix 11: Nutrition Labels
Introduce nutrition labels to provide users with information about the content in their newsfeeds. Fake or automated accounts should be labeled as such. News stories should contain information about who funds the outlet, where to find the outlet’s standards, and whether the article claims to be news or opinion.
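A “nutrition label” could surface this metadata alongside each item in the feed; the fields below are an illustrative sketch, not an existing labeling standard.

```python
def nutrition_label(item: dict) -> str:
    """Render an illustrative content label for a newsfeed item."""
    lines = [
        f"Source: {item['outlet']}",
        f"Funded by: {item['funder']}",
        f"Editorial standards: {item['standards_url']}",
        f"Type: {'Opinion' if item['is_opinion'] else 'News'}",
    ]
    if item.get("automated_account"):
        lines.append("Posted by an automated account")
    return "\n".join(lines)

print(nutrition_label({
    "outlet": "Example Daily",
    "funder": "Example Media Group",
    "standards_url": "https://example.org/standards",
    "is_opinion": False,
    "automated_account": False,
}))
```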
Fix 12: Honest Ads Act
Pass the Honest Ads Act and mandate that online platforms implement Know-Your-Customer rules for political advertisers.
Fix 13: Transparent Archive for Political Advertisements
Ensure platforms have a robust, transparent system for archiving political advertisements that is searchable and sortable through an API.
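To make “searchable and sortable through an API” concrete, the sketch below queries a hypothetical ad-archive endpoint; the URL and parameters are assumptions for illustration, not any platform’s real API (Facebook’s Ad Library API is an existing, partial example of this kind of archive).

```python
import requests

# Hypothetical political-ad archive endpoint; URL and parameters are illustrative.
ARCHIVE_URL = "https://example-platform.org/api/v1/political-ads"

params = {
    "query": "election",        # free-text search over ad content
    "sponsor": "Example PAC",   # filter by the disclosed funder
    "start_date": "2021-01-01",
    "sort": "-spend",           # sort by spend, descending
    "page_size": 50,
}

response = requests.get(ARCHIVE_URL, params=params, timeout=30)
response.raise_for_status()
for ad in response.json().get("ads", []):
    print(ad.get("id"), ad.get("sponsor"), ad.get("spend_usd"))
```

Researchers, journalists, and regulators could use queries like this to track who is paying for political messaging and whom it targets.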
Fix 14: Update Civil Rights Laws
Update civil rights laws: just as discrimination is prohibited in public accommodations in the offline world, it should be prohibited in the digital realm. Both the ADA and the Equality Act contemplate platforms as public accommodations.
Fix 15: Prohibit Online Voter Suppression
Prohibit voter suppression online just as it is prohibited offline.
Fight Scams on Online Marketplaces
Fix 16: Know-Your-Customer Rules
Direct platforms to implement Know-Your-Customer rules for those who use their systems to sell goods or services – with information easily accessible to the consumer.