
I tried a site where AI matched me with someone based on my 'hotness' rating. It's really messed up.

Lakshmi Varanasi   

  • Hot Chat 3000 is a dating website where AI determines your "hotness" using huge datasets.
  • The tongue-in-cheek website then lets you chat with other people within your "hotness" bracket.

We're living in a brave new world where even our "hotness" level can be determined by AI — and a new website wants to expose just what we're getting ourselves into.

I tried it to determine what AI supposedly thinks of me — but more on that in a minute.

The art collective MSCHF launched the website, called Hot Chat 3000. It bills itself as a "1-to-1 online chat website where who you can talk to is contingent on how attractive you are, and how attractive you are is determined by AI."

But the AI is only as good as the data that's training it — and that's where MSCHF is stepping in, essentially asking the question: Just who — or what — gets to determine who's hot?

When a user enters Hot Chat 3000, they're asked to upload a picture of themselves as electronic music hums in the background. The picture is "analyzed" and given a "hotness" rating on a scale of 1-to-10, which in turn dictates who the user can chat with. So a user who earns a score of 5.4 will be matched with another person who scores between 5.0 and 5.9.
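
MSCHF hasn't published its matching code, but the bracket logic the site describes is simple enough to sketch. The snippet below is a hypothetical illustration, not Hot Chat 3000's actual implementation; it only checks whether two scores fall in the same whole-number band.

```python
# Hypothetical sketch of the bracket matching described above -- not MSCHF's
# actual code. Users can only chat if their scores share a whole-number band,
# e.g. a 5.4 can chat with a 5.0-5.9 but not with a 6.1.
import math

def band(score: float) -> int:
    """Return the whole-number band a score falls into (5.4 -> 5)."""
    return math.floor(score)

def can_chat(score_a: float, score_b: float) -> bool:
    """Two users may be matched only if their bands are equal."""
    return band(score_a) == band(score_b)

print(can_chat(5.4, 5.9))  # True: both in the 5.0-5.9 band
print(can_chat(5.4, 6.1))  # False: different bands
```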

The rating system depends on certain machine-learning models

Hot Chat 3000 says that its rating system relies predominantly on CLIP, a large machine-learning model from OpenAI that has been trained to "choose the correct captions for a particular image among a number of incorrect text captions." The model itself was trained on a dataset of 400 million image-text pairs.
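To get a feel for how a CLIP-style model could be turned into a number on that scale, here is a rough sketch using OpenAI's publicly released CLIP weights via the Hugging Face transformers library. The candidate captions and the idea of reading their match probabilities as an "attractiveness" signal are assumptions for illustration; the site hasn't published how it maps CLIP's outputs to a 1-to-10 score.

```python
# Rough illustration of CLIP image-text scoring -- not Hot Chat 3000's code.
# The candidate captions and the scoring idea below are hypothetical.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# CLIP scores how well each caption matches the image.
captions = [
    "a photo of a very attractive person",
    "a photo of an average-looking person",
    "a photo of an unattractive person",
]

image = Image.open("selfie.jpg")  # placeholder path
inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# Softmax over the image-text similarities gives one probability per caption.
probs = outputs.logits_per_image.softmax(dim=1).squeeze()
for caption, p in zip(captions, probs.tolist()):
    print(f"{caption}: {p:.2f}")
```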

In addition to CLIP, the site's rating system utilizes the "Hot or Not" dataset from the website hotornot.com as well as the SCUT-FBP5500 dataset for facial beauty prediction. (Hot or Not was a website in the early 2000s that let people rate whether users who had submitted their pictures were "hot" or "not.")

"Both datasets are heavily biased towards certain ethnicities and not at all representative of the broader population," the site notes. To that end, Hot Chat 3000 also said it used a small album of pictures from Google that featured a broader selection of ages, races, genders to manually adjust its model.

"The outputs of LLMs are a reflection of the data they were trained on," the group says, referring to large language models. "Hot Chat 3000 very deliberately sets out to expose, visualize, exacerbate these biases," the site says.

Which is to say that no one should take their Hot Chat 3000 "hotness" score seriously. The group told Artnet that "MSCHF's approach is always to participate natively in the space we are critiquing or satirizing. A.I. will be folded into a million arbitrary applications."

Hot Chat 3000 is one of several tongue-in-cheek projects that MSCHF has produced since it launched in 2016.

The collective — which is headed by former BuzzFeed employee Gabriel Whaley — once described itself on LinkedIn as "a dairy company." Its roster of stunts over the past few years includes AI-generated photos of feet, an app for making stock investments based on astrological signs, and a "Satan Shoe" made with human blood that led to a lawsuit from Nike, Insider previously reported.

It's unclear how much revenue MSCHF generates from its endeavors. According to PitchBook, though, the company was valued at close to $120 million at its Series B round in April 2021, in which it raised funding from venture firms including Peter Thiel's Founders Fund. Whaley didn't respond to Insider's request for comment.

The moment of truth: Testing Hot Chat 3000

I braced myself and tested Hot Chat 3000 over two different days — uploading 15 different pictures of myself to understand where I fell on the AI-determined "hotness" scale.

I got an error message for some of them, but the six pictures the site could read received a comically wide range of scores, from 3.7 to 6.8. The same picture was also given different scores on different days.

The New York Post reported that when it ran pictures through the site's AI, actress Sydney Sweeney received a "5.2," while her co-star Glen Powell scored an "8." MSCHF's Whaley himself tweeted last week that he scored a 5.5.

When I finally entered the chat, I used my "hottest" pic (as one does) and was matched with a fellow 6-er.

There are no names in the world of Hot Chat 3000; instead, everyone is identified by their score. So, I became "6.8" and my chat partner was called "6."

I said "Hi." And "6" immediately disconnected the chat — thankfully.
