It’s refreshing to see a couple major players in the social media sphere attempt to make amends for their egregious mistakes, even if that atonement comes way, way later than it should have.
The errors in question are Snapchat’s recent and massively unpopular redesign and YouTube’s algorithm letting highly inappropriate videos slip into its app for kids.
Let’s unpack Snapchat first.
As a Snapchat user myself, I can personally attest to the outrage felt by Snapchatters everywhere when the platform rolled out their redesign on February 6. Initially, Snapchat CEO Evan Spiegel defended the changes. In his words, “The tech to some degree is a solved problem, the time to…learn is a hard problem to solve. Even the complaints we’re seeing reinforce the philosophy. Even the frustrations we’re seeing really validate those changes. It’ll take time for people to adjust, but for me using it for a couple months I feel way more attached to the service.”
It would appear that Spiegel has since had a change of heart. Could that possibly have been induced by the decrease in users that Snapchat saw in March?
Spiegel’s recent statement reads, “We learned that combining watching Stories and communicating with friends into the same place made it harder to optimize for both competing behaviors. We are currently rolling out an update to address this by sorting communication by recency and moving Stories from friends to the right side of the application while maintaining the structural changes we have made around separating friends from creators and sorting friends’ Stories by relationships.”
It’s a decent first step, but Snapchat is going to have to pay close attention and address the most prevalent complaints among their users, especially with the threat of WhatsApp Status looming on the horizon.
As for YouTube, buckle up everyone, because they’re finally doing something to address Elsagate.
If you need a quick refresher, Elsagate refers to the scandal wherein large numbers of videos involving violence, drugs, sexual content, and curse words were slipping past the YouTube algorithm and into the YouTube Kids app. This disturbing content was often interwoven with the use of beloved childhood figures like Elsa (hence the name), Spider-Man, and Peppa Pig to confuse the algorithm, infuriating parents and traumatizing children.
On April 25, Google announced new parental controls for the YouTube Kids app that allow parents to filter out all channels that have only been reviewed by algorithms rather than humans. Enabling these restrictions will also disable the search feature, which in turn limits all recommended videos to those that have been approved by humans. These controls will be off by default, meaning they will need to be manually enabled. Later this year, the controls will be expanded further, allowing parents to restrict their children to specific channels and videos.
Although Reddit users report seeing disturbing children’s content as far back as 2011, the only action parents could take on the app prior to this was to flag videos for review.
I’m not sure why every video permitted on the YouTube Kids app hasn’t been reviewed by human eyes all along, as algorithms are fallible and dangerous to leave unchecked, but I’m glad YouTube is finally stepping up to protect these kids.
What do you think? Are you willing to forgive Snapchat and YouTube for their errors, or do you think it’s too late for them to redeem themselves?