The City Club of Cleveland

Blog

Want to know what is on our minds? Find blog posts written here, by the City Club staff, members, and partners. Every week you can find a new edition of #FreeSpeech in the News — a collection of related stories, commentary, and opinions on free speech in the 21st century that’s making the news. You’ll also find takes on current events, past forums, and issues surrounding Northeast Ohio. Read on for all things City Club.

Wednesday, August 29, 2018

Profit, not free speech, governs media companies’ decisions on controversy

Bliss Davis, Content and Programming Coordinator, The City Club of Cleveland

For decades, U.S. media companies have limited the content they’ve offered based on what’s good for business. The decisions by Apple, Spotify, Facebook and YouTube to remove content from commentator Alex Jones and his InfoWars platform follow this same pattern.

My research on media industries makes clear that government rules and regulations do little to limit what television shows, films, music albums, video games and social media content are available to the public. Business concerns about profitability are much stronger restrictions. Movies are given ratings based on their content not by government officials but by the Motion Picture Association of America, an industry group. Television companies, for their part, often have departments handling what are called “standards and practices” – reviewing content and suggesting or demanding changes to avoid offending audiences or advertisers.

The self-policing by movie studios and TV networks is very similar to YouTube’s and Facebook’s actions: Distributing extremely controversial content is bad for business. Offended viewers will turn away from the program and may choose to boycott the network or service – reducing the size of audiences that can be sold to advertisers. Some alarmed viewers may even urge boycotts of the advertisers whose messages air during controversial programming.

Over the decades, television networks have internalized feedback from advertisers and unintended controversies, learning to steer clear of negative attention. Social media companies are just beginning to understand that these same forces are at work in their own industries.

Self-regulation to avoid government intrusion

The practices of media industries to police themselves arose over many years, as companies tried to appease public concern without triggering formal government supervision. This pleased all sides: Elected and appointed officials avoided having to do much of anything that might look like squashing free speech, companies avoided formal restrictions that might be quite severe, and concerned citizens had their objections heard and acted upon.

When concerns about the amount of sex and violence on broadcast television developed in the 1970s, the networks agreed – with strong encouragement from the federal government – to establish a “Family Hour” during the first hour of prime-time programming that was monitored by the National Association of Broadcasters. Music labels agreed to place “Parental Advisory” labels on albums with explicit lyrics. Inspired by moviemakers, video game developers adopted ratings based on evaluations by an industry group, the Entertainment Software Ratings Board.

There is, though, a key difference between those industries and the situation of YouTube and Facebook. Movie studios, record labels and TV companies are responsible for making their content as well as distributing it – and are legally liable for any problems that might arise.

Online media companies, though, typically don’t create most of what appears on their platforms, and are expressly protected from legal responsibility for the content of the messages others post. But hosting information publicly viewed as hateful can damage a business, even if it doesn’t run afoul of government rules.

Challenges of social media content regulation

Social media companies have achieved their ubiquity and high profits because they do not have to pay for creating the content that attracts attention to their services. They reap the financial rewards of a technological advantage in which billions of users can create, share and look at different messages and pieces of content every day.

They are just beginning to understand the downside to that technological advantage, which is that the public – even if not the law – considers them at least somewhat responsible for what is said on their sites. And it’s extremely difficult to sort through, classify and police all those billions of posts – much less to figure out how to automate some of those tasks.

[Photo caption: Alex Jones, banned from many social media platforms. Michael Zimmermann, CC BY-ND]

So far, social media sites have avoided limiting content except in the most extreme cases, because it is difficult to draw lines of acceptability that don’t produce more controversy themselves. Their decision likely included weighing the effects of the objections that would erupt if they did ban Jones against what might happen to their brands if they didn’t.

In the past, self-regulation often allowed media companies to evade governmental action. It is unclear whether these latest moves by social media companies are the start of lasting self-regulation or a one-off effort to quell current concern. Either way, their decisions are all about what is good for business.

Their response to public outcry may be craven, but it suggests these companies are recognizing the cultural power of their products. Ultimately, social media companies – like other media companies – are showing that they will respond to pressure from their audiences and the marketplace. In the absence of regulation, consumers can push companies to change policies by opting out of social media platforms that enable cesspools of trolling and hate.

Users who want changes made should take note of how audiences have pressured other media industries to make changes in the past. Consumers who want greater privacy controls, environments free of hate speech, and different kinds of algorithms could demand them by leaving flawed services or boycotting the advertisers that support them. As demand for alternatives becomes clearer, services will change or a competitor will arise.

Amanda Lotz, Fellow, Peabody Media Center; Professor of Media Studies, University of Michigan

This article was originally published on The Conversation. Read the original article.

Our New Address

1317 Euclid Avenue, Suite 100
Cleveland, Ohio 44115
