We developed a program that identifies which videos YouTube’s recommendation algorithm recommends most often for a given search.


Contact: hello@algotransparency.org

Aim of the project

We aim to inform citizens about the mechanisms behind the algorithms that determine and shape our access to information. YouTube is the first platform on which we’ve conducted the experiment. We are currently developing tools for other platforms.

What are “recommended videos” on YouTube?

Recommended videos appear in the “Up next” list on the right of the screen. (“When autoplay is enabled, a suggested video will automatically play next.”)

How did you identify YouTube’s most often recommended videos?

We used a multi-step program to analyze the videos YouTube’s recommendation algorithm suggests in response to searches for each candidate’s name.

The program counts how many times each video is recommended; these counts determine which videos YouTube recommends most often for any given candidate.
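
To make the counting step concrete, here is a minimal sketch of the idea in Python. It is not the project’s actual crawler: the get_recommendations() helper, the toy graph inside it, and the crawl(), depth and branching names are placeholders invented for illustration; the real program fetches the “Up next” list from YouTube itself.

```python
from collections import Counter

def get_recommendations(video_id):
    """Placeholder: return the list of 'Up next' video IDs for a video.
    The real crawler fetches this from YouTube; here a toy graph lets
    the counting logic run on its own."""
    toy_graph = {
        "search_hit_1": ["vid_A", "vid_B"],
        "search_hit_2": ["vid_A", "vid_C"],
        "vid_A": ["vid_D", "vid_B"],
        "vid_B": ["vid_D"],
        "vid_C": ["vid_A"],
        "vid_D": ["vid_A"],
    }
    return toy_graph.get(video_id, [])

def crawl(seed_videos, depth=4, branching=2):
    """Follow the top `branching` recommendations from each video, `depth`
    hops deep, and count how often each video appears as a recommendation."""
    counts = Counter()
    frontier = list(seed_videos)
    for _ in range(depth):
        next_frontier = []
        for video_id in frontier:
            recommended = get_recommendations(video_id)[:branching]
            counts.update(recommended)
            next_frontier.extend(recommended)
        frontier = next_frontier
    return counts

if __name__ == "__main__":
    # Seed videos would normally come from a YouTube search for a candidate's name.
    top = crawl(["search_hit_1", "search_hit_2"], depth=4, branching=2)
    for video_id, n in top.most_common(5):
        print(video_id, n)
```

The depth and branching settings here stand in for the kind of parameters (how far to follow recommendations, how many to follow from each video) that the open-source crawler lets you vary.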

Can I see the program you used to gather your data?

Yes! The code we developed is open source and available here. Anyone can use it, varying parameters such as the search depth and the subject of the initial query.

Can I easily see where YouTube takes me?

Yes! Do a search for one of the candidates, then click on the first recommended video six times in a row. Our program does this many times and computes the average of the results.
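
As a rough illustration of what “computing the average” can mean here, the sketch below repeats a six-click walk many times and reports, for each video, the fraction of walks in which it appeared. The first_recommendation() helper, its toy graph, and the choice of “fraction of runs” as the aggregate are our own simplifications for illustration, not the project’s actual code or metric.

```python
import random
from collections import Counter

def first_recommendation(video_id):
    """Placeholder for 'click the first recommended video'. The real
    experiment reads the top of the 'Up next' list; here we pick from a
    toy graph, with a little randomness to imitate changing suggestions."""
    toy_graph = {
        "start": ["vid_A", "vid_B"],
        "vid_A": ["vid_C", "vid_B"],
        "vid_B": ["vid_C"],
        "vid_C": ["vid_A"],
    }
    return random.choice(toy_graph.get(video_id, ["start"]))

def walk(start, clicks=6):
    """Follow the first recommendation `clicks` times, like a user
    clicking 'Up next' six times in a row."""
    seen = []
    current = start
    for _ in range(clicks):
        current = first_recommendation(current)
        seen.append(current)
    return seen

def average_over_runs(start, runs=1000, clicks=6):
    """Fraction of runs in which each video appeared at least once."""
    appearances = Counter()
    for _ in range(runs):
        appearances.update(set(walk(start, clicks)))
    return {video: count / runs for video, count in appearances.items()}

if __name__ == "__main__":
    results = sorted(average_over_runs("start").items(),
                     key=lambda item: item[1], reverse=True)
    for video, freq in results:
        print(f"{video}: seen in {freq:.0%} of runs")
```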

Why are certain videos no longer accessible?

Some videos are removed, either by YouTube under its Community Guidelines or by the YouTubers themselves.

Do recommended videos depend on my profile?

Some recommendations are personalized. We performed this research with an account that had no YouTube viewing history.

Who is behind AlgoTransparency?

AlgoTransparency is led by Guillaume Chaslot, Adrien Montcoudiol, Soline Ledésert, Nicolas Wielonsky, Frédéric Bardolle and Mathieu Grac.