What are algorithms and positive interventions?
Algorithms are automated processes that make decisions about the types of content users engage with online. Algorithms play an important role in users’ online experience, including curating and suggesting content users might be interested in, and identifying and addressing content that violates terms of service.
Online algorithmic systems can be very complex. Their outcomes depend on interactions with users, different types of content, and other algorithms. These systems can learn and adapt over time through those interactions to become more effective at their given task.
Positive interventions are measures that redirect or influence a person away from extremist or terrorist content. They can play an important role in empowering community voices and disrupting recruitment and propaganda strategies.
Why are they important?
A better understanding of how algorithmic systems affect different types of users could help inform changes that make online platforms safer for users around the world. Such changes could enable more effective removal of terrorist and violent extremist content online, and enable communities to intervene positively in user journeys that are driving people towards that content.
In the Christchurch Call, supporters have made several commitments related to algorithms. These include reviewing the operation of algorithms to better understand intervention points and implement changes, developing effective interventions, and accelerating research into technical solutions.
What work is underway?
In 2021 the Call Community endorsed a workplan on algorithms and positive interventions. This plan focuses our efforts on improving understanding of:
- online user journeys and the role they may play in radicalisation
- how online and offline factors interact
- how content recommendation processes might be exploited and ways to mitigate this.
The workplan also recommends working across the tech sector and government to understand the data and information needed for effective research.
The Call Community also undertook to support work on next-generation, community-led online interventions. This includes exploring new assessment frameworks to help understand their impact, and sharing what has been learned.
This work will need to take place across several forums, including the EU Internet Forum, the Global Internet Forum to Counter Terrorism and the Global Partnership on AI.
At the 2022 Leaders’ Summit, a group of Call Supporters announced a new Initiative on Algorithmic Outcomes. In the first project under this initiative, New Zealand, the USA, Twitter, Microsoft and the charitable organisation OpenMined are developing and testing privacy technologies to help enable independent research on these important questions. See more information about the initiative.