Online dating platforms have exploded in popularity over the past decade, with their combined global user bases topping 323 million and the industry earning $5.61 billion in 2021. However, this rapid growth has brought several enduring problems with creating an accessible virtual dating space where everyone feels safe and included. With a projected 15% increase in the industry’s usership by 2027, investing in content moderation and in user security, trust, and well-being is becoming a critical business priority for these platforms.
Challenges faced by matchmaking sites
Online harassment remains a persistent problem on social media platforms, and dating sites are no exception. Women in particular face frequent instances of virtual stalking, aggression, and threats of violence, as well as unsolicited explicit images – a phenomenon particularly prevalent on dating apps. Around 52% of women aged 18-35 reported having been sent unsolicited explicit images by new matches, and another 20% reported having been subjected to threats of physical violence.
Even more concerning, research published in 2019 found that no free-to-use dating platform screens its users for prior sexual offences, allowing predators to use these platforms anonymously. In the absence of effective moderation, users must decide whether exposure to harassment is a price worth paying to join or remain on these platforms.
Racial prejudice also remains an issue for many individuals online, despite the rise of more inclusive and accessible dating sites. A 2018 study by OkCupid found that Black women and Asian men were the least likely groups to receive messages or responses, while both white men and white women tended to be reluctant to date other ethnicities. This problem is exacerbated within the gay community, where dating apps have identified pervasive issues with racial discrimination.
Another hurdle for online platforms is the question of privacy and personal data. To keep their services free, many websites and social media companies sell their users’ data to third parties for targeted advertising. The extent of this practice was not well understood until 2019, when the Norwegian Consumer Council discovered that many popular dating apps collect and sell information such as a user’s exact location, sexual orientation, religious and political beliefs, and even drug use and medical conditions. This set off alarm bells for consumers and regulators alike, who began investigating ways to curtail what information companies could freely transmit to outsiders.
Companies have been working to solve these issues internally. Tinder, for example, rolled out new features in 2020 aimed at ensuring user safety when meeting matches for the first time, including an emergency responder-activated “Panic Button”, in-app safety check-ins during a date, and real-time photo verification to prevent catfishing (impersonating someone else online). Bumble made headlines this year when it released Private Detector, an open-source A.I. tool that detects and automatically blurs explicit images sent within the app. Other apps opted to remove the ability for users to sort profiles by race; however, the efficacy of this measure is still debated.
Future trends in e-dating
As consumers demand more accountability from companies to make online dating a more inclusive and secure space, national governments are taking note and passing legislation to rein in these actors.
The UK has published a draft Online Safety Bill which includes a wave of regulations for social media platforms, including making companies liable to respond to reports of abuse or harassment. The law would also make “cyberflashing” – sending unsolicited explicit images – a criminal offence. Indeed, lobbying for cyberflashing laws by companies like Bumble has successfully pushed similar bills through in Texas, Virginia, and most recently California.
Similarly, in Europe, the Digital Services Act (DSA), which will be live from mid-November, aims to better protect users, establish clear frameworks of accountability for platforms, and foster competition. As long as a dating site has users in an EU Member State, it will face the bulk of the obligations the regulation mandates. See what exactly the DSA means for your business here.
Judging by the trend of recent regulations, governments around the world will continue to focus on user-oriented regulation of online companies, so it is imperative that dating apps move quickly to keep up. Failure to comply with the DSA may result in fines of up to 6% of a platform’s global annual turnover, or even the termination of the platform’s services in the EU.
Implications for your business
The EU alone represents a large portion of these platforms’ user bases, meaning providers will need to make several immediate operational changes to meet the new rules and avoid hefty penalties.
Firstly, dating platforms will need to declare a single point of contact in the EU that can be held legally accountable for infractions of the DSA. Dating service providers will then need to ensure they have implemented a well-designed, transparent content moderation system that gives users and the platform alike the tools to adequately respond to notices from law enforcement, trusted flaggers, and out-of-court dispute requests.
Another major hurdle for companies will be a range of stipulations on the design of the platform itself. Indeed, the new due diligence obligations for very large online platforms (VLOPs) will affect the way dating sites allow user interaction, share content, show advertisements, and more. The DSA also places a priority on the protection of minors, emphasising preventative risk assessments that, in the case of dating sites, would include clearly laying out the company’s age verification procedures for preventing minors from using the service.
In short, all online platforms and service providers will be required to adopt a robust, streamlined approach to content moderation and user safety, guaranteed through continuous compliance and transparency reporting.
How can Tremau help you?
Time is short for companies to get their houses in order in the face of the recently adopted DSA. To help your platforms, Tremau offers a comprehensive, single trust & safety content moderation platform that prioritises compliance as a service by integrating workflow automation among other AI tools. Tremau’s platform ensures that e-dating providers and other very large online platforms (VLOPs) are up to standard with DSA requirements while also improving their key trust & safety performance metrics. This way, brands can have the peace of mind of protecting their users and of being protected themselves, while increasing their handling capacity and reducing the growing administrative and reporting burden of content moderation.
For further information on these regulations and how they can affect your business, please contact email@example.com.
Tremau Policy Research Team