Given the increasing focus on trust and safety and the responsibilities of actors across the Internet ecosystem, regulation has gradually shifted its focus toward transparency requirements. What processes must be in place to deal with illegal content while protecting fundamental rights and due process? The Digital Services Act (“DSA”) is quite clear: if a company’s services are to be considered safe and trusted, transparency is non-negotiable.
If there is one place on the Internet where transparency can provide some much-needed insight regarding content moderation, it is the Internet’s infrastructure. The infrastructure of the Internet is a space consisting of various actors who provide the everyday services that allow users to have a seamless, reliable, and secure Internet experience; yet it generally attracts little attention because it is obscure and predominantly technical. Actors at this level include conduit, caching, and hosting services, offered by companies such as Cloudflare, Amazon Web Services, and Google Play Store, to name a few. Their operations are crucial, yet they often seem distanced from public discourse; they are often considered inaccessible and, occasionally, unaccountable to everyday users.
The question, therefore, is whether the DSA could help shed some light on the practices of these otherwise invisible actors. Does the DSA manage to create a consistent and predictable environment for infrastructure providers, one that could help alleviate some of the opacity of their content moderation practices?
Read the full version of this article at Tech Policy Press.
By: Konstantinos Komaitis, Louis-Victor de Franssu, Agne Kaarlep