What is transparency?
At its simplest, transparency in the context of the Christchurch Call is about being as open as possible as individual supporters and as a Call Community about what we do to meet our commitments, why we do those things, and the impacts of our actions.
Why is this important?
Transparency is mentioned in several Call commitments. For example, online service providers commit to being transparent around how they set and enforce community standards and terms of service, and how much terrorist and violent extremist content they detect and remove.
Transparency is implicit in other commitments, such as the commitment by governments to enforce laws prohibiting the production or dissemination of terrorist and violent extremist content online.
Ultimately, transparency helps build understanding and trust among stakeholders as a basis for collective action. It is necessary for designing and evaluating the effectiveness of interventions to eliminate terrorist and violent extremist content online, while protecting and respecting human rights and fundamental freedoms. Transparency contributes to accountability and continuous improvement. An example of this is the practice of holding Call Community debriefs following a crisis response.
What work is under way?
The Call Community works in many ways to improve transparency.
The Global Internet Forum to Counter Terrorism (GIFCT) and Tech Against Terrorism are key partners. Both play an important role in encouraging and supporting transparency by online service providers. GIFCT’s membership criteria require online service providers to have publicly available policies that explicitly prohibit terrorist and violent extremist activity, and to publish regular transparency reports. Tech Against Terrorism’s mentoring program helps online service providers that aspire to GIFCT membership to develop policies and systems that meet GIFCT criteria.
In a global benchmarking report for 2021, the OECD noted tangible progress in the number of online services issuing reports with specific information on terrorist and violent extremist content, and an improvement in the quality of the information provided.
Much of the progress on transparency has been made on a voluntary basis. Since the launch of the Call, different governments have developed legislation to tackle online harms. This often includes transparency and reporting requirements. As regulators move to implement these requirements, we expect to see further guidance and support for the tech sector and improvements in transparency.
We are also making progress on government transparency. Alongside its guidance for online service providers, Tech Against Terrorism has issued guidance for government transparency reports.
Many different groups have an interest in transparency, including online service users, community advocates, human rights experts, researchers, regulators and legislators. Part of the Call Community’s work is to make sure the transparency and reporting that companies provide is meaningful for these groups, and meets their needs for research and policy development. That involves identifying what they need to know and why, and making the relevant information and insights accessible to them.
Civil society has an important role in working with governments and online service providers to improve transparency. Call Community members are actively involved in multistakeholder work on transparency in the GIFCT Transparency Working Group, the OECD, the Action Coalition for Meaningful Transparency and the Center for International Governance Innovation’s Global Platform Governance Network.
What are the main challenges?
Delivering meaningful transparency can be challenging.
Online service providers are diverse in terms of the services they provide, the users they serve, and the risks they manage on terrorist and violent extremist content.
Different stakeholders need different kinds of information in different formats. Users of an online service need to be able to access terms of service and information on appeal and complaints procedures. Free speech advocates may be more interested in false positive and false negative detection rates, the outcomes of appeals, or how many takedown requests are received from governments. Researchers and auditors may seek access to datasets to enable them to understand algorithmic outcomes.
Small online service providers may lack the capacity to set community standards, moderate content and – critically – track decisions and outcomes. Mandatory transparency reporting requirements therefore need to be designed carefully to avoid creating cost and capacity barriers to entry for small firms.