
Regulating Online Matchmaking: Trends & Challenges

Online dating platforms have exploded in popularity over the past decade, with their combined global user base topping 323 million and the industry earning $5.61 billion in 2021. However, this rapid growth has brought with it several enduring obstacles to creating an accessible virtual dating space where everyone feels safe and included. With the industry’s user base projected to grow a further 15% by 2027, investing in content moderation and in user security, trust, and well-being is becoming a critical business priority for these platforms.

Challenges faced by matchmaking sites

Online harassment remains a persistent problem on social media platforms, and dating sites are no exception. Women in particular face frequent virtual stalking, aggression, and threats of violence, as well as unsolicited explicit images – a problem especially acute on dating apps. Around 52% of women aged 18-35 reported having been sent unsolicited explicit images by new matches, and a further 20% reported having been subjected to threats of physical violence.

Even more concerning is research published in 2019 finding that no free-to-use dating platform screens its users for prior sexual offences, allowing predators to sign up anonymously. In the absence of effective moderation, users are left to decide whether exposure to harassment is a price worth paying to join or remain on these platforms.

Racial prejudice also remains an issue for many individuals online, despite the rise of more inclusive and accessible dating sites. A 2018 study by OkCupid found that Black women and Asian men were the least likely groups to receive messages or responses, while white men and women both tended to be reluctant to date outside their own ethnicity. This problem is exacerbated within the gay community, where dating apps have identified pervasive issues with racial discrimination.

Another hurdle for online platforms is the question of privacy and personal data. To keep their services free, many websites and social media companies sell their users’ data to third parties for targeted advertising. The extent of this practice was not well understood until 2019, when the Norwegian Consumer Council discovered that many popular dating apps collect and sell information such as a user’s exact location, sexual orientation, religious and political beliefs, and even drug use and medical conditions. This set off alarm bells for consumers and regulators alike, who began investigating ways to curtail what information companies could freely transmit to outsiders.

Companies have been working to solve these issues internally. Tinder, for example, rolled out new features in 2020 aimed at keeping users safe when meeting matches for the first time, including a “Panic Button” that alerts emergency responders, in-app safety check-ins during a date, and real-time photo verification to prevent catfishing (impersonating someone else online). Bumble made headlines this year when it open-sourced its Private Detector, an A.I. tool that detects and automatically blurs explicit images sent within the app. Other apps opted to remove the ability for users to filter profiles by race, though the efficacy of this measure is still debated.

Future trends in e-dating

As consumers demand more accountability from companies to make online dating a more inclusive and secure space, national governments are taking note and passing legislation to rein in these actors.

The UK has published a draft Online Safety Bill which introduces a wave of regulations for social media platforms, including an obligation for companies to respond to reports of abuse or harassment. The law will also make “cyberflashing” – sending unsolicited explicit images – a criminal offence. Indeed, lobbying for cyberflashing laws by companies such as Bumble has already pushed similar bills through in Texas, Virginia, and, most recently, California.

Similarly, in Europe, the Digital Services Act (DSA), which enters into force in mid-November, aims to better protect users, establish clear frameworks of accountability for platforms, and foster competition. As long as a dating site has users in an EU Member State, it will face the bulk of the obligations the regulation mandates. See what exactly the DSA means for your business here.

Judging by recent regulatory trends, governments around the world will continue to focus on user-oriented regulation of online companies, so it is imperative that dating apps move quickly to keep up. Failure to comply with the DSA may result in fines of up to 6% of a platform’s global annual turnover, or even the termination of the platform’s services in the EU.

Implications for your business

The EU alone represents a large portion of these platforms’ user base, meaning providers will need to make several immediate operational changes to meet the new rules and avoid hefty penalties.

Firstly, dating platforms will need to declare a single point of contact in the EU that can be held legally accountable for infractions of the DSA. Dating service providers will then need to implement a well-designed, transparent content moderation system that gives both users and the platform the tools to respond adequately to requests from law enforcement, trusted flaggers, and out-of-court dispute bodies.

Another major hurdle for companies will be a range of stipulations on the design of the platform itself. Indeed, the new due diligence obligations for very large online platforms (VLOPs) will affect the way dating sites allow user interaction, share content, show advertisements, and more. The DSA also prioritises the protection of minors, emphasising preventative risk assessments that, in the case of dating sites, would include clearly setting out the company’s procedures for age verification to prevent minors from using the service.

In short, all online platforms and service providers will be required to adopt a robust, streamlined approach to content moderation and user safety, demonstrated through continuous compliance and transparency reporting.

How can Tremau help you?

Time is short for companies to get their houses in order in the face of the recently adopted DSA. To help your platform, Tremau offers a single, comprehensive trust & safety content moderation platform that delivers compliance as a service by integrating workflow automation alongside other AI tools. Tremau’s platform ensures that e-dating providers and other VLOPs meet the DSA’s requirements while also improving their key trust & safety performance metrics. This way, brands gain the peace of mind of protecting their users and themselves, increase their handling capacity, and reduce the growing administrative and reporting burden of content moderation.

For further information on these regulations and how they can affect your business, please contact info@tremau.com.

Tremau Policy Research Team
