
Tinder: How does one of the largest recommendation engines decide who you'll see next?

We've heard a lot about the Netflix recommendation algorithm for movies, how Amazon matches you with stuff, and Google's infamous PageRank for search. How about Tinder? It turns out Tinder has a surprisingly thoughtful recommendation system for matching people.

This is from an extensive profile, Mr. (Swipe) Right?, on Tinder founder Sean Rad:

The goal for users, according to Badeen, is that they forget about the person they swiped on within three seconds. But Tinder doesn’t. They study who members swipe on, who they match with. Then they look at “reactivation.” Younger users will disappear for a few weeks and then “reactivate,” or start swiping again. Older users spend more time looking at individual profiles and are more likely to disappear for a few months before reactivating. The average active user spends an hour a day on Tinder, Gould says. (Rad says he’s addicted and spends countless hours swiping.)
Neighborhood patterns tend to be unique. Even people on different blocks in a city will behave differently or be less likely to match. “People naturally sort themselves geographically,” Gould says. And if people travel, their behavior changes dramatically. “We learn all about a person,” Gould says, “and then they go to a different place and act totally differently.” Gould, whose hair is a little more askew and whose clothes are a little looser than Rad’s and Badeen’s, is in charge of tweaking the algorithm. Which is also to say that matches don’t happen by chance. Tinder is arranging who you’ll see next. And with billions of matches, it has an enormous trove of data. “We’re probably one of the largest recommendation engines in the world,” Rad says.
At first, Gould tells me, the app had a ruling class of “the matching 1 percent,” people who got tons of matches and who made everyone else look bad in comparison. Tinder decided to change the trend by showing these profiles less frequently, especially to users who weren’t in the 1 percent. Now those who get a lot of right swipes (yes) get shown to progressively fewer people, and those who get a lot of left swipes (no) get shown to progressively more people. “I call it progressive taxation — redistributing matches. They’re not truly ours to redistribute, but we try,” Gould says. “It feels right to do that.” The company calls this “smart matching”: bringing justice to the dating world by balancing the playing field and making sure that members less likely to get matches still get some. “Part of the human condition is the struggle. If you’re seeing nothing but Victoria’s Secret models, one won’t necessarily stick out,” Badeen says. “When we introduce people who aren’t suited for you, it accentuates those who are.”
They also change the system for bad actors, limiting the number of swipes per day. “We used to have a bunch of guys who would swipe right on everyone and then not respond, so we added a limit to detect people who weren’t playing the game,” Gould says. “I was surprised, but people actually are smart. They play what they’re given. For the first few days, the guys kept hitting their limit. Then, after that, they began to adapt. Once they did, conversations got longer.”
Gould and Badeen see these interventions as a moral obligation. “It’s scary to know how much it’ll affect people,” Badeen says. “I try to ignore some of it, or I’ll go insane. We’re getting to the point where we have a social responsibility to the world because we have this power to influence it.”
Gould echoes this sentiment: “Listen, architects design buildings that set up how people are going to live. City planners set up towns and roads. As the designers of a system that helps people with dating, we have a responsibility to build those contexts — we’re responsible for a decent percent of the marriages on this planet every year. That’s an honor and a responsibility.”
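The “progressive taxation” idea described above could be sketched as a simple exposure weight: profiles that attract many right swipes are shown less often, and profiles that attract many left swipes are shown more often. This is purely a hypothetical illustration — the function name and the linear mapping are my assumptions, not Tinder's actual formula:

```python
def exposure_weight(right_swipes: int, left_swipes: int) -> float:
    """Return a relative frequency with which to show a profile.

    Hypothetical sketch of "progressive taxation": a profile with
    mostly right swipes gets a weight below 1.0 (shown less often),
    and a profile with mostly left swipes gets a weight above 1.0
    (shown more often).
    """
    total = right_swipes + left_swipes
    if total == 0:
        return 1.0  # no data yet: show at the baseline rate
    right_rate = right_swipes / total
    # Map right_rate in [0, 1] linearly to a weight in [1.5, 0.5]:
    # popular profiles are throttled, unpopular ones are boosted.
    return 1.5 - right_rate

print(exposure_weight(90, 10))  # well below 1.0: popular, throttled
print(exposure_weight(10, 90))  # well above 1.0: unpopular, boosted
```

Any monotonically decreasing mapping from right-swipe rate to exposure would capture the same "redistributing matches" effect; the linear form is just the simplest choice.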
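The daily swipe cap Gould mentions amounts to a per-user rate limit. A minimal sketch, assuming an in-memory counter keyed by user and day — the class name, limit value, and storage are all assumptions for illustration, not Tinder's implementation:

```python
from datetime import date


class SwipeLimiter:
    """Hypothetical per-user daily swipe cap."""

    def __init__(self, daily_limit: int = 100):
        self.daily_limit = daily_limit
        # (user_id, day) -> number of swipes used that day
        self.counts: dict[tuple, int] = {}

    def try_swipe(self, user_id: str) -> bool:
        """Record a swipe; return False once today's limit is hit."""
        key = (user_id, date.today())
        used = self.counts.get(key, 0)
        if used >= self.daily_limit:
            return False
        self.counts[key] = used + 1
        return True
```

Keying the counter by day means the allowance resets naturally at midnight, which matches the "limit per day" behavior the article describes.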

On HackerNews

Reader Comments (3)

How is this related to high scalability?

January 27, 2016 | Unregistered CommenterTom

While I would of course have loved technical details, I find these policy-related insights relevant as well. It's more thoughtful and sophisticated than I would have predicted, and an interesting use of big data to drive the use of a product in a goal-oriented direction.

January 27, 2016 | Registered CommenterTodd Hoff

So they have a social responsibility to limit the number of matches per day, but if you pay for the service you can still do that? There seems to be something wrong with their moral barometer.

February 8, 2016 | Unregistered CommenterV
