Tinder enhances safety tools, blocks spam accounts. Will it promote safe dating?

Match Group said on Tuesday that its companies, including Tinder, blocked nearly 5 million spam and bot accounts in the first quarter, either at sign-up or before a user saw them.


‘Every minute, an average of 44 spam accounts removed’

According to internet crime experts, cyber criminals use various tactics and forms of communication across online platforms, from email and text phishing scams to social media platforms and online dating services. Match Group says its companies constantly invest in advanced detection and removal tools to help maintain the integrity of their services.

“Every minute, an average of 44 spam accounts are removed across our portfolio in an effort to curtail suspected fraudulent accounts, which are either blocked at sign-up or before a user sees them. Additionally, nearly 5 million bot and spam accounts were removed between January and March of this year — before the account gained access to the platform or shortly after sign-up — to prevent potential harm,” Match Group said in a statement.

‘Spammers have evolved their tactics’

Over the past few years, Tinder says, spammers have evolved their tactics to exploit common member behaviours, such as posting a social handle in a bio to direct traffic to another platform, where they often monetise directly or share yet another link that redirects to a third site for monetisation.

Tinder announced changes to its existing Community Guidelines last month, outlining the behaviours that help lead to the best possible experience for everyone on the app. As part of these changes, Tinder says it will remove social handles from public bios when they are used to advertise or promote social profiles to gain followers, sell things, fundraise, or campaign.

‘We are able to remove a majority of spam before a user ever sees it’

“We are continuously enhancing our spam prevention tools to help make them more effective, while also making investments in machine learning, both of which we view as essential for Match Group to help maintain a safer service for our users around the world,” said Jess Johnson, director, safety product, Match Group.

“By implementing a combination of technology, human moderation, and user education to encourage reporting of suspicious activity, we are able to help remove the vast majority of spam at sign up or before a user ever sees it.”

Tinder has incorporated several safety features, including Photo Verification with selfie video, Block Profile and in-app video chat, the statement added.
