Hate Online: Platform Compliance Challenges

Under the EU's new Digital Services Act, platforms must allow external teams to audit their algorithms. This unprecedented task, however, raises many questions.

The Digital Services Act (DSA) will introduce a specific regime for very large online platforms from August. These platforms will have to analyze the societal risks they create, such as the dissemination of disinformation or online hate, and propose risk mitigation measures.

These risk analyses and mitigation measures will be subject to independent audits assessing the algorithms' compliance with the DSA. The audit process will therefore be key to how platforms interpret and adapt to the new EU rules.

The European Commission has published a draft delegated act aimed at defining the methodology for these independent audits. But the reactions of audit firms, technology companies and civil society highlight several critical points in this still uncharted territory.

Auditors

Audit firms have consistently highlighted the lack of industry standards for verifying algorithms and have asked the Commission for further guidance on the “reasonable level of assurance” required for the various DSA obligations.

“The scientific framework is not in place. We do not have the answers to what this law aims to achieve. How do you define a systemic risk? How do you measure it?” Adriana Iamnitchi, professor of computational social sciences at Maastricht University, told EURACTIV.

PwC, a world-renowned consulting firm, wrote in its response that “making the auditor the judge of what constitutes compliance is likely to have mixed outcomes and potentially create disagreements among stakeholders about who has set the bar at the right level and whether different entities are being treated fairly.”

PwC’s comments overlap with those of Deloitte and EY, major audit firms poised to enter this new market. Since algorithm audits are relatively new and technically complex, only a handful of companies have the expertise needed for this task or the means to acquire the required talent.

Platforms

The risks of inconsistency and incomparability are foreseeable, given the different types of platforms that will be audited, ranging from social networks to search engines.

Tech companies say the European Commission’s delegated act is too prescriptive, does not take into account the diversity of use cases, and does not require auditors to be proportionate in their assessments.

“The downside of imposing a [too] prescriptive standard is that it limits auditors’ choice and incentivizes them to strictly follow the letter of the law, rather than its spirit,” the Wikimedia submission reads.

Auditors and platforms seem to agree that some flexibility should be considered for the first year, given the complexity of the DSA and the newness of this framework.

However, a more fundamental argument from digital players is that the audit industry might not have the expertise to assess the inner workings of platforms.

“This shows the paradox in which we find ourselves. A lot of opaque, data-driven companies have led to complexities beyond the reach even of the people whose job it is to study them, namely academics,” Catalina Goanta, associate professor of law and technology at Utrecht University, told EURACTIV.

Civil society

At the heart of the reactions of civil society representatives is the following question: who will control the auditors?

In a joint response, the NGOs AlgorithmWatch and AI Forensics pointed out that auditing firms may have an incentive to be lenient in their assessments in order to attract and retain contracts.

This risk of “audit-washing” is aggravated in a context devoid of objective standards. Meanwhile, audited companies will be free to exclude certain information from the reports by labeling it confidential, preventing its publication.

For members of civil society, the best way to control auditors and platforms is to allow accredited researchers access to the full version of audit reports. Under the DSA, accredited researchers are allowed to request data from the platforms, but their role in the audit mechanism is still procedurally unclear.

More generally, civil society seems skeptical that large audit firms are well placed to assess systemic risks, such as the effects of social media on democratic processes. Other actors are already trying to fill this void.

One of them is Algorithm Audit, an NGO whose mission is to evaluate the ethical criteria of algorithmic audits, indicating the advantages and disadvantages in each specific circumstance. Its methodology is called “algoprudence”, a neologism combining “algorithm” and “jurisprudence”.

“There will be a collective learning process that will take three to five years,” explained Jurriaan Parie, co-founder of Algorithm Audit, adding that much will depend on how the Commission and its new centre on algorithmic transparency cooperate with auditors to establish good industry practices.

“It’s a process. It won’t be perfect at first, but you have to start somewhere. The question is who will pay attention to it,” concluded Professor Adriana Iamnitchi.

This article was originally published on euractiv.fr
