A little experiment by an artificial intelligence researcher is raising questions about how TikTok’s recommendation algorithm suggests new creators to users.
Specifically, the question is whether that algorithm sorts suggestions by the race of the creator, something TikTok denies doing intentionally. Either way, it's another example of the need for more scrutiny of how the app and other social media platforms promote particular creators and content.
Marc Faddoul is a researcher at the University of California, Berkeley, School of Information who studies AI and disinformation. He was browsing TikTok to look for disinformation when he noticed something curious about how the app recommends new creators to follow.
In the app, when a person follows a new account, they can click an arrow that then recommends other accounts to follow. Faddoul noticed that when he did this, the recommended accounts tended to look just like whoever he’d just followed — right down to ethnicity and hair color.
“I saw this very clear pattern that was happening,” he told BuzzFeed News. “When following an account the suggestions are very similar looking.”
He made a fresh account to try it out again, and these were his results:
Faddoul cautioned that this was a casual experiment, not actual research, but he said the results are still interesting.
BuzzFeed News tried a similar experiment with a new account and got similar results.
Following hijabi creator @jiggybush caused the app to recommend other women who wear a hijab.
And following @uwayeme, a black woman, prompted recommendations for other black women.
At first, Faddoul suspected TikTok was employing AI technology that analyzed people's profile photos when making recommendations. Other tech companies, like Netflix, use similar image analysis to determine which thumbnails a user is most likely to click on. That's why one person's Netflix homepage may show different artwork for the same show than another person's.
But, according to TikTok, there’s a simple answer to the questions raised by Faddoul.
TikTok told BuzzFeed News that it uses what's known as collaborative filtering. Basically, the app recommends new accounts based on whom the people who follow a given account are also following.
“We haven’t been able to replicate results similar to these claims,” a TikTok spokesperson told BuzzFeed News.
“Our recommendation of accounts to follow is based on user behavior: users who follow account A also follow account B, so if you follow A you are likely to also want to follow B.”
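The mechanism TikTok describes can be sketched in a few lines. This is a toy illustration of item-based collaborative filtering, not TikTok's actual code; the accounts and follow data are made up for the example.

```python
from collections import Counter

# Hypothetical follow graph: which accounts each user follows.
follows = {
    "user1": {"A", "B", "C"},
    "user2": {"A", "B"},
    "user3": {"A", "D"},
    "user4": {"B", "C"},
}

def recommend(account, follows, top_n=3):
    """Suggest accounts most often co-followed by people who follow `account`."""
    counts = Counter()
    for followed in follows.values():
        if account in followed:
            # Count every other account this user also follows.
            counts.update(followed - {account})
    return [acct for acct, _ in counts.most_common(top_n)]

# Two of A's three followers also follow B, so B ranks first.
print(recommend("A", follows))
```

Note that nothing in this logic looks at a profile photo: the recommendations are driven entirely by co-following behavior, which is exactly why the system can still mirror patterns in how users choose whom to follow.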
That also explains why, when following a big creator like Addison Rae, the app recommends other creators from the Hype House, rather than creators who look like Rae.
Or, in another example, following an artist returns recommendations for other artists.
It’s a classic system used by tech companies from YouTube to Netflix, but that doesn’t mean it’s free of problems, Faddoul said.
“Collaborative filtering may also reproduce whatever bias there is in people’s behavior,” he said.
“People who tend to like blonde teens tend to like a whole lot of other blonde teens. In that sense, it’s kind of expected.”
What that means is TikTok recommended BuzzFeed News’ test account women who wear hijabs because people who follow one hijabi tend to follow other hijabis.
But, Faddoul said, this can create a feedback loop where people are only ever recommended a particular type of creator, leading to a lack of diversity in their feed.
For example, he said, if the most popular creators on a platform are white, and the app keeps recommending other white creators, it makes it hard for creators of color to gain followers and popularity — even if that’s not the intention of the algorithm.
“Then it means it’s easier for a white person to get recommended than someone from an underrepresented minority,” he said. “So that’s something that can be happening, regardless of whether it’s facial features or collaborative filtering.”
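The feedback loop Faddoul describes can be illustrated with a toy "rich get richer" simulation. This is an assumption-laden sketch, not a model of TikTok's real system: it just shows that if recommendations are weighted by current popularity, a small initial head start tends to compound into a widening absolute gap in followers.

```python
import random

random.seed(42)

# Hypothetical starting follower counts: creator 0 has a small head start.
followers = [110, 100, 100, 100]

# Each round, a new user follows one creator, chosen with probability
# proportional to current follower counts -- i.e., a popularity-biased
# recommendation. Followers only ever accumulate.
for _ in range(10_000):
    pick = random.choices(range(len(followers)), weights=followers)[0]
    followers[pick] += 1

print(followers)
```

In runs like this, the absolute follower gap between the leading creator and the rest typically grows over time, even though no step of the algorithm ever considers who the creators are.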
Of course, this is hardly unique to TikTok. All social media platforms that use algorithms can create bubbles where people only see content that confirms their biases. Think of, for example, how a Facebook feed may be biased toward a particular political viewpoint.
“This is not a scientific research methodology, just anecdotal evidence that kind of highlights a phenomenon that seems pretty clear and distinct and encourages further investigation,” Faddoul said.
Lauren Strapagiel is a reporter for BuzzFeed News and is based in Toronto, Canada.