Spotify must be more transparent about its rules of the road


With the controversy surrounding Joe Rogan’s podcast, Spotify has officially joined the ranks of media platforms publicly defending their governance practices.
Rogan’s podcast is a harbinger of the company’s future — and that of social media. Platforms that didn’t think of themselves as social are now faced with managing user content and interaction. In the industry, we would say that Spotify has a “Trust & Safety” problem.
Spotify, and every other platform hosting user-generated content, is learning the hard way that it can't stand back and rely on users to post appropriate content that doesn't flout company policies or social norms. Platforms are finding that they must become legitimate, active authority figures, not passive publishers. Research shows that they can start by building trust with users and setting expectations of good conduct.
Rogan is just one example. With Spotify’s acquisition of Anchor and its partnership with WordPress, which enable “access to easier creation of podcasts,” user-generated podcasts discussing politics, health and social issues are part of Spotify’s new frontier.
To this, we can add platform integration: Spotify now connects with other platforms, like Facebook, Twitter and Peloton. This means the Spotify user experience is shaped by content created across the internet, on platforms with distinct rules and codes of conduct. Without common industry standards, “misinformation” at, say, Twitter will not always be flagged by Spotify’s algorithms.
Welcome to the future of social media. Companies once believed they could rely on algorithms to catch inappropriate content and intervene with public relations in high-profile cases. Today, the challenges are bigger and more complicated as consumers redefine where and how they socialize online.
Tech companies can adapt by working on two fronts. First, they must establish themselves as legitimate authorities in the eyes of their community. This starts by making the rules readily available, easily understandable and applicable to all users.
Think of this as the rules of driving, another large-scale system that works by ensuring people know the rules and share a common understanding of traffic lights and rights of way. Simple reminders of the rules, like stop signs, can be highly effective. In experiments with Facebook users, reminding people about the rules decreased the likelihood of continued bad behavior. To create safety on platforms serving thousands, if not millions, of users, a company must similarly build out clear, understandable procedures.
Try to find Spotify’s rules. We couldn’t. Imagine driving without stop signs or traffic lights. It’s hard to follow the rules if you can’t find them. Tech companies have historically resisted being responsible authority figures. Silicon Valley’s earliest efforts at managing user content came from spam-fighting teams that blocked actors who hacked their systems for fun and profit. Those teams genuinely believed that disclosing the rules would let users game the platform, and that people change their behavior only when punished.
We call this approach “deterrence.” It works for adversarial actors like spammers, but it is far less effective against more complicated rule-breaking, like racist rants, misinformation and incitement of violence. Here, the rule-breakers are not necessarily motivated by money or the love of hacking. They have a cause, and they may see themselves as rightfully expressing an opinion and building a community.
To influence the content of these users, companies need to drop reactive punishment and instead take up proactive governance — set standards, reward good behavior and, when necessary, enforce rules swiftly and with dignity to avoid the perception of being arbitrary authority figures.
The second key step is to be transparent with the community and set clear expectations for appropriate behavior. Transparency means disclosing what the company is doing, and how well it is doing, to keep things safe. Reinforcing these “platform norms” helps users understand how their actions affect the wider community. The Joe Rogans of the world start to look less attractive as people come to see them as a threat to the safe, healthy experience of that community.
“We’re defining an entirely new space of tech and media,” Spotify founder and CEO Daniel Ek said in a recent employee meeting. “We’re a very different kind of company, and the rules of the road are being written as we innovate.”
That’s just not true. Sorry, Spotify, but you are not that special. There are already proven “rules of the road” for technology platforms — rules that show great promise for building trust and safety. The company just needs to accept them and follow them.
You’ll still have incidents of online “road rage” once in a while, but the public might just be more forgiving when it happens.
