Jonny Shipp / Apr 2022
The EU’s Digital Services Act and the UK’s Online Safety Bill are significant developments in digital policy and regulation. Both take a systems approach to regulation: focusing on the design of digital operations, rather than attempting to rule on individual items of content. That is why the Internet Commission has developed a reporting process that helps policymakers understand these operations: the systems and processes that companies put in place, the challenges they face, the tools they use, and how they can do better.
DELIVERING TECH FOR GOOD
Based on two years of evidence and analysis, the Internet Commission’s Accountability Report 2.0 identifies 46 trust and safety practices across six organisations, including Sony PlayStation, Twitch and Tinder. The independent yet collaborative reporting process is the first of its kind. It provides companies with an independent review and “health check” of their trust and safety practices, highlighting key areas for development such as prioritising user voice, establishing coordinated oversight and innovating to balance safety and freedom online.
Confidential data are gathered from companies based on an Evaluation Framework for Digital Responsibility. This looks at how organisational cultures, systems and processes align to support corporate digital responsibility, with a particular focus on internet safety, freedom of speech and the ways in which decisions are made in relation to content, contact and conduct online.
Following interviews with business leaders and front-line workers, individual case studies are built. Companies then engage in knowledge-sharing workshops to look in detail at their challenges, approaches and success stories. These insights form the basis of the public report, which seeks to inform stakeholders about the challenges faced by companies and the ways in which they are responding.
ADVANCING DIGITAL RESPONSIBILITY: KEY THEMES EXPLORED
Value the voice of users. To better protect users from harm, organisations should consult a range of stakeholders: industry specialists, policymakers, and experts from civil society. As well as engaging external stakeholders, leading organisations recognise the value of the user voice. Through regular consultations, and by building forums for feedback, organisations can draw valuable insights from the experiences and expertise of their users.
Coordinate oversight to anticipate negative impact. Organisations should take responsibility for the expected impact of their services. This starts with ensuring that product design and operations are set up to anticipate potential impacts. Maintaining oversight is especially important when deploying automated tools, which can misfire, unfairly punish users and amplify biases online.
Balance safety and freedom through innovation. Innovations in technology and processes can ease trade-offs between safety and freedom. The nature of the platform will dictate what is most appropriate: age assurance is well suited to services for younger users, whilst moderating speech may be less appropriate in closed, one-to-one communications. Options include prompts that encourage people to rethink potentially harmful messages before sending them, and prompts that encourage potential victims to report harm.
The Internet Commission helps companies to deliver tech for good. Its Accountability Report 2.0 is available at inetco.org/report.