Watch the CBSN Originals documentary, “Speaking Frankly: Dating Apps,” in the video player above.
Steve Dean, an online dating consultant, says the person you just matched with on the site may not be an actual person. “You go on Tinder, you swipe on someone you thought was cute, and they say, ‘Hey sexy, it’s great to see you.’ You say to yourself, ‘OK, that’s a little bold, but hey.’ Then they say, ‘Would you like to chat? Here’s my phone number. You can call me here.’ … Then in many cases those phone numbers that they’ll send could be a link to a scam site; they could be a link to a live cam site.”
Malicious bots on social media platforms are not a new problem. According to the security company Imperva, in 2016, 28.9% of all web traffic could be attributed to “bad bots” – automated programs with capabilities ranging from spam and data scraping to cybersecurity attacks.
As dating apps become more and more popular with humans, bots are heading to these platforms as well. That is all the more insidious because users come to dating apps seeking to establish personal and intimate bonds.
Dean says it can make an already uncomfortable situation more stressful. “If you go into an app you think is a dating app and you don’t see any live people or profiles, then you might wonder, ‘Why am I here? What are you doing with my attention while I’m in your app? Are you wasting it? Are you pushing me toward advertisements that don’t matter to me? Are you pushing me toward fake profiles?’”
Not all bots have malicious intent, and in fact many are created by the companies themselves to provide useful services. (Imperva calls them “good bots.”) Lauren Kunze, CEO of Pandorabots, a chatbot development and hosting platform, says she has seen dating app companies use her service. “We’ve seen a number of dating app companies build bots on our platform for a variety of different use cases, including user onboarding and engaging users when there aren’t potential matches there. And we’re also aware of what’s going on in the industry at large with bots not built on our platform.”
Malicious bots, however, are usually created by third parties, and most dating apps have made a point of condemning them and actively attempting to weed them out. Nonetheless, Dean says some bots have been deployed by dating app companies themselves in ways that seem deceptive.
“A lot of different players create a situation where users get ripped off or lied to,” he says. “They are manipulated into buying a paid membership just to send a message to someone who was never real in the first place.”
This is what Match.com, one of the 10 most-used online dating platforms, currently stands accused of. The Federal Trade Commission (FTC) has filed a lawsuit alleging the company “unfairly exposed consumers to the risk of fraud and engaged in other allegedly deceptive and unfair practices.” The suit claims that Match.com took advantage of fraudulent accounts to trick non-paying users into purchasing a subscription through email notifications. Match.com denies that this happened, and in a press release said the accusations were “completely baseless” and “supported by consciously misleading figures.”
As technology becomes more sophisticated, some argue that new regulations are needed. “It’s getting harder and harder for the average consumer to identify whether something is real or not,” Kunze says. “So I think we need to see an increasing amount of regulation, especially on dating platforms, where direct messaging is the medium.”
Currently, only California has passed a law attempting to regulate bot activity on social media. The B.O.T. (“Bolstering Online Transparency”) Act requires bots that pretend to be human to disclose their identity. But Kunze believes that, while this is a necessary step, it is hardly enforceable.
“It’s very early days in terms of the regulatory landscape, and what we think is a good trend, because our position as a company is that bots must always disclose that they are bots; they must not pretend to be human,” Kunze says. “But there’s absolutely no way to regulate that in the industry today. So while lawmakers are waking up to this issue and just beginning to scratch the surface of how serious and severe it is, there’s no way to control it right now other than promoting best practices, which is that bots should disclose that they are bots.”