Adam Mosseri, the head of Instagram, has published the first in a series of posts breaking down the technology behind the app, and how it affects you
We may spend an unhealthy chunk of our lives staring at our social media feeds, but most of us have no idea how they actually work. The algorithms that govern what we see are notoriously enigmatic, while practices such as “shadow-banning” — when a user’s posts appear normal to them, but are quietly hidden from the wider public — are similarly opaque, often leading to accusations of bias and censorship.
A recent blog post from head of Instagram Adam Mosseri, however, goes some way to shedding more light on how the app works, and why certain images tend to float to the top of your feed. It also responds to users’ accusations about shadow-banning, seeking to dispel rumours that the platform intentionally hides certain posts.
“We recognize that we haven’t always done enough to explain why we take down content when we do, what is recommendable and what isn’t, and how Instagram works more broadly,” Mosseri says in the June 8 post. “As a result, we understand people are inevitably going to come to their own conclusions about why something happened, and that those conclusions may leave people feeling confused or victimized.”
Admittedly, the post doesn’t expand on the reasons in much detail, but it does note that Instagram fields “millions of reports a day”, meaning that mistakenly acting on even a small percentage of reports affects thousands of people. Mosseri also says that Instagram is working on tools to make the process more transparent. These include better in-app notifications, “so people know in the moment why their post was taken down”, and other ways to let users know when their posts violate guidelines.
The blog post also clears up the misconception that there’s one, all-powerful algorithm. “We use a variety of algorithms… each with its own purpose,” Mosseri explains. He also says that these algorithms were developed to “make the most of your time” and prioritise what you’re most interested in, since most Instagram users apparently look at less than half of their feed.
Going into more depth, he explains that Instagram ranks the content you see by analysing thousands of “signals”, which include information about what was posted, who posted it, and your individual preferences. The way these signals are prioritised works slightly differently for your feed and stories (places to see recent posts from people you follow), and for the “Explore” page (which aims to help you discover new things).
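To make the idea of signal-based ranking a little more concrete, here is a minimal, purely illustrative sketch in Python. It is not Instagram’s code: the signals, weights and account names below are invented, and the real system reportedly draws on thousands of signals with separate processes for Feed, Stories and Explore. The sketch only shows how weighting the same signals differently can surface different posts on different parts of the app.

```python
# Illustrative sketch only -- not Instagram's actual ranking code.
# The signals and weights below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    # Hypothetical example signals, each scored 0.0-1.0
    recency: float             # how recently it was posted
    author_affinity: float     # how often you interact with this account
    predicted_interest: float  # how likely you are to like, save or share it

# Hypothetical weights: a Feed-style surface might lean on your relationship
# with the poster, while an Explore-style surface might lean on predicted interest.
FEED_WEIGHTS = {"recency": 0.3, "author_affinity": 0.4, "predicted_interest": 0.3}
EXPLORE_WEIGHTS = {"recency": 0.1, "author_affinity": 0.0, "predicted_interest": 0.9}

def score(post: Post, weights: dict) -> float:
    # Combine the signals into a single ranking score
    return (weights["recency"] * post.recency
            + weights["author_affinity"] * post.author_affinity
            + weights["predicted_interest"] * post.predicted_interest)

def rank(posts: list[Post], weights: dict) -> list[Post]:
    # Highest-scoring posts float to the top
    return sorted(posts, key=lambda p: score(p, weights), reverse=True)

posts = [
    Post("close_friend", recency=0.2, author_affinity=0.9, predicted_interest=0.5),
    Post("new_account", recency=0.9, author_affinity=0.0, predicted_interest=0.8),
]

print([p.author for p in rank(posts, FEED_WEIGHTS)])     # ['close_friend', 'new_account']
print([p.author for p in rank(posts, EXPLORE_WEIGHTS)])  # ['new_account', 'close_friend']
```

Under these made-up weights, the same two posts come out in a different order depending on whether the Feed-style or Explore-style weighting is applied, which is the broad effect Mosseri describes.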
There are also tighter restrictions on what appears via your “Explore” page, due to the fact that the posts you see there don’t come from people you follow. “If a friend you follow shares something offensive and you see that in your Feed, that’s between you and your friend,” Mosseri says. “If you see something offensive in Explore from someone you’ve never heard of, that’s a different situation.”
It’s hard for people to trust what they don’t understand, which is why we wanted to shed more light on how Instagram works, and why you see what you see. We often get asked about “the algorithm,” so we wanted to break things down a bit more.
— Adam Mosseri 😷 (@mosseri) June 8, 2021
Mosseri promises that this explainer on Instagram’s algorithms is just the first in a series of posts to boost transparency and shed more light on how the platform’s technology works. “Our systems are always evolving, and what you see in this post may change at some point,” he adds in a series of tweets. “But we’re going to share more in real time when there are big changes, so that you can better understand what’s going on.”
Last month, Instagram also detailed changes to its algorithms after it was accused of censoring pro-Palestinian content. Prior to that, it joined Facebook in finally rolling out the option to hide the like count on posts in your feed, as well as your own posts, in an effort to “depressurise people’s experience” on the platform.