Match, the UK’s number one destination for online dating, has released an update to Lara, its artificial intelligence (AI) dating coach, to protect its members against online harassment.
An industry first when Match launched it in 2015, Lara acts as a virtual coach, giving daters advice on their personal profile and suggesting potential matches. She has now been programmed to detect behaviour that may constitute harassment, such as multiple messages sent to the same user without a reply.
For potential victims of harassment, Lara will check whether the user needs support and, if necessary, explain how to put an end to the situation: by reporting the profile to Match's customer care team or by blocking it.
For users engaging in harassing behaviour, Lara will advise them that their approach appears inappropriate and signpost them to Match's best practices for good dating behaviour:
- Rule 1: Wait for the other person to respond, and accept that they may choose not to
- Rule 2: Replying once is fine, twice is fine, three times is too much
- Rule 3: Respect the other person: don’t use insulting or denigrating words or send unsolicited inappropriate photos
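The behaviour Lara flags, repeated messages sent without a reply, can be illustrated with a simple heuristic. Match has not published Lara's actual detection logic; the sketch below is hypothetical, with a threshold that mirrors Rule 2 ("three times is too much"):

```python
# Hypothetical sketch of a "repeated unanswered messages" check.
# The real Lara system is not public; the threshold here simply
# encodes Rule 2 above: a third consecutive message is too much.

MAX_UNANSWERED = 2  # replying once or twice is fine; three is too much


def unanswered_streak(authors, sender):
    """Count the sender's consecutive messages since the other user's last reply.

    `authors` is the conversation as a list of author ids, oldest first.
    """
    streak = 0
    for author in authors:
        if author == sender:
            streak += 1
        else:
            streak = 0  # the other person replied; reset the run
    return streak


def should_prompt(authors, sender):
    """True if sending one more message would exceed the allowed run,
    i.e. the point at which a coach like Lara might check in."""
    return unanswered_streak(authors, sender) >= MAX_UNANSWERED


# Example: "a" has sent two messages with no reply from "b",
# so a third message from "a" would trigger the prompt.
history = ["a", "b", "a", "a"]
print(should_prompt(history, "a"))
```

In practice a production system would also weigh message content and timing, not just counts, but a streak counter of this kind captures the rule the press release describes.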
A combination of the latest artificial intelligence, voice recognition and machine learning technology enables Lara to understand and respond to conversations through chat. She continues to improve her vocabulary, understanding and recommendations by learning from every interaction between members using Match services.
Joanna Pons, UK Marketing Director, comments: “For over two decades, Match has been committed to providing a safe environment for singles to find love. Unfortunately, cyberbullying is a prevalent issue in today’s online world. That’s why Match is stepping up its fight, through the introduction of innovative tools to detect and prevent online harassment, as well as equipping singles with the relevant support. At Match, the online security of our members is our highest priority and this will form a vital part of our approach.”
This campaign is the latest in a series of initiatives that Match has in place to protect its members. All Match Group platforms have in-app safety resources that equip members with details about the latest safety features and tools. Additionally, Match Group platforms have dedicated teams that leverage a network of industry-leading technology to scan for signs of fraud and review suspicious profiles, activity, and user-generated reports. Match Group safety processes, procedures, and resources are reviewed and advised upon by the Match Group Advisory Council, a group of outside experts in online safety.