Irony and Outrage
2
Political and Technological Changes That Created Jon Stewart and Bill O’Reilly
Just as the early 1960s marked an important moment for the first generation of American satire and outrage, 1996 was an important year for the second generation of these two competing genres. It was in 1996 that two shows at the center of this story launched on cable television. The first was The Daily Show, a news parody and satire program on Comedy Central that premiered in July. The second, appearing in October of that year, was a news “analysis” show featuring the conservative pundit Bill O’Reilly, which was introduced as an offering of the new 24-hour Fox News Channel.
The Daily Show, created by Lizz Winstead and Madeleine Smithberg, was initially hosted by comedian Craig Kilborn delivering mock news in a fake news studio. The show featured headlines from the day’s pop culture news and introduced fictional correspondents in pretend “field segments” interviewing strange and eccentric people. With Kilborn at the helm, The Daily Show focused more on popular culture and celebrity news than it soon would under the more politically minded Jon Stewart, who was brought on as host in 1999. As Stephen Colbert explains, “it turned from local news, summer kicker stories, and celebrity jokes [under Kilborn], to something with more of a political point of view. Jon has a political point of view. He wanted us to have a political point of view, and for the most part, I found that I had a stronger one than I had imagined.”1 Stewart (and his executive producer and head writer Ben Karlin, a former Onion editor) transformed The Daily Show into a political satire vehicle in 1999, but the news parody atmosphere, aesthetic, and format of the show were created in 1996.
Just three months later, a few channels down (or up) the dial (cable box), a former tabloid television show host got his own show at the behest of media mogul Roger Ailes. On the new Fox News Channel, Bill O’Reilly, who had hosted the celebrity entertainment news show Inside Edition for six years, launched The O’Reilly Report (later renamed The O’Reilly Factor). From its inception, this show was positioned as a hybrid news and opinion program that quickly came to define the conservative television talk genre. It also dominated cable news ratings for well over a decade, up to O’Reilly’s termination by the network following charges of sexual harassment in 2017.2
The twin births of The Daily Show and The O’Reilly Factor were no accident. Both programs are logical outgrowths of simultaneous changes in the economic and regulatory underpinnings of the media industry and the development of new cable and digital technologies. Books have been written on the structural changes in media industries in the 1980s and 1990s. Still more books have been written about the technological revolution caused by cable and the internet. I don’t need to rewrite those books here. However, to explain how and why, within a period of just a few years, satirists Jon Stewart and Bill Maher and outrage hosts Bill O’Reilly and Sean Hannity all began hosting quasi-political entertainment programs (or quasi-entertainment political programs), I’ve got to talk media regulation, media economics, and media technologies.
At least for a minute.
The Reagan Era’s Deregulation of Media
The story goes like this. In the 1980s the Reagan administration pushed to deregulate various industries. From the oil and gas industries to the financial sector, the notion was that reliance on the free market rather than government “interference” was the best way to grow the economy. Among the industries deregulated under Reagan was the rapidly growing media industry. In the 1980s under Reagan, the regulatory powers of the FCC over media content and media industry behaviors all but disappeared. The FCC removed requirements on the amount of informational programming that broadcasters had to supply and repealed the fairness doctrine, which had required broadcasters to present contrasting viewpoints on controversial issues of public importance. Under Reagan, the FCC also reduced limits on media ownership. These limits had previously restricted the number of media outlets that could be owned by a single entity. For example, a single entity was permitted to own no more than seven television stations in 1981. By 1985, that number had increased to 12.3
These changes in the economic and regulatory underpinnings of media fueled a focus on profit—mainly because the potential for profit was just so huge. With few limits on the number of holdings allowed by the government, large conglomerates began to form, as media giants capitalized on economies of scale. Large corporations began to acquire smaller media enterprises at a record pace, thereby vastly increasing profits by increasing efficiency and eliminating redundant positions and departments. Putting multiple media holdings (television networks, movie studios, film distributors, cable networks, radio stations, publishing companies) all under the same roof also reduced costs as it allowed for cross-merchandising across these growing corporate empires.
Take, for example, Disney. In 2019, Disney owns ABC, the Disney Channel, ESPN, Marvel, theme parks, cruise lines, and 30 percent of Hulu (to name just a tiny few). By collapsing all these companies under the Disney umbrella, each of the smaller holdings is able to save money internally. Need marketing done? No problem. Need financing? No problem. Need production studios? Got it. Need capacity for distribution? Got it. Need to get a product in stores? Got it. Want to promote a new show on a family cruise line? Weird request, but sure.
And the real genius of the consolidation of media ownership (from a profit standpoint) is that it maximizes the owners’ ability to promote their brands across their many holdings. Industry folks enthusiastically refer to this as “synergy.” Synergy is when ABC airs the Disney parade, thereby promoting Disney theme parks to a national audience on their television network. Or when an episode of Modern Family (on ABC) features the Pritchett family enjoying themselves in Disneyland. Or when Dancing with the Stars (on ABC) has “Disney Night,” when the dancing couples perform to famous songs from Disney’s giant film archive, from Alice in Wonderland to Beauty and the Beast. Free marketing across the media empire. And yes, all of these cross-merchandising activities have actually happened.
The capacity for profit in media industries is so great that it fueled, through the 1990s and into the 2000s, the consolidation of ownership across our vast media landscape. For the better part of the twentieth century, each individual mass media industry—newspapers, film, television, magazines—was controlled by multiple medium-specific companies. Media historian Robert McChesney writes that each individual media industry was “dominated by anywhere from a few to a dozen or so firms.”4 As recently as 1983, over 50 different corporations owned most of American media. By the mid-1990s that number had dropped to 23.5 And by 2000, the dire prediction by Ben Bagdikian, author of Media Monopoly, that a “half-dozen large corporations would own all the most powerful media outlets in the United States” had come to pass.6
In 2019, over 90 percent of American media is controlled by five corporations: Comcast, Walt Disney Corporation, 21st Century Fox, AT&T–Warner Media, and National Amusements (which includes Viacom and CBS). The latest trend in consolidation of media ownership is in the direction of vertical integration, in which corporations that own the dominant mode of distribution (the internet) are also acquiring content producers (the entities that make the stuff that goes on—or through—the internet). For example, during the writing of this book, AT&T (internet service provider) acquired Time Warner (content producer), making it a direct competitor with Comcast (internet service provider), which entered the content business back in 2009 with the acquisition of NBC Universal (content producer).
In sum: media deregulation means fewer—and much larger—corporate owners of media.
Profit-Oriented “Journalism” and Erosion of Trust in News
In Rich Media, Poor Democracy, McChesney included a section pessimistically titled “Farewell to Journalism.” Here he explained how the “commercialization” fueled by the formation of media conglomerates in the 1980s and 1990s contributed to “the decline and marginalization of any public service values among the media, placing the status of notions of nonmarket public service in jeopardy across society.”7 In other words, in a corporate media world, concerns like “what we should do” or “what would be good for citizens” are trumped by considerations of profit. Hence, in the corporate media world, the practice of journalism itself becomes a bit of an afterthought. According to Bagdikian, “the immense size of the parent firms means that some of their crucial media subsidiaries, like news, have become remote within their complex tables of organization. That remoteness has contributed to the unprecedented degree to which the parent firms have pressed their news subsidiaries to cross ethical lines by selecting news that will promote the needs of the owning corporation rather than serve the traditional ethical striving of journalism.”8 In their canonical work The Elements of Journalism, Bill Kovach and Tom Rosenstiel outline normative obligations of contemporary journalism, or “what we should expect from those who provide the news.”9 Two obligations in particular are central to the problem posed by the consolidation of media ownership. Kovach and Rosenstiel contend that the owner/corporation at the head of a news organization must be committed to citizens first, and that journalists must have final say over the news. They also propose that journalists’ first obligation is to the truth and their first loyalty is to citizens.
By the accounts of media and journalism historians, this normative ideal actually was the model of journalism that dominated throughout much of the twentieth century. In a rare positive description of media practice, McChesney writes: “journalism has been regarded as a public service by all of the commercial media throughout [the twentieth century]. In particular, commercial broadcasters displayed their public service through the establishment of ample news divisions. … Professional journalism was predicated on the notion that its content should not be shaped by the dictates of owners and advertisers or by the biases of the editors and reporters, but rather by core public service values.”10 To be fair, though, William Randolph Hearst and Joseph Pulitzer did make a mockery of the idealized obligations of the newspaper industry with the “yellow journalism” of the 1890s, which included sensationalized and fabricated accounts of war atrocities that contributed to the United States’ involvement in the Spanish-American War. And these practices were driven by a circulation race between these two media moguls. But in the wake of this disgraceful moment, the American Society for Newspaper Editors (ASNE) created a code of ethics for journalists (in 1923), consisting of canons that journalists should follow to protect the integrity of the practice of journalism. The spirit of this code of ethics remained at the heart of the practice of journalism at newspapers and networks for decades. The code says: “the primary function of newspapers is to communicate to the human race what its members do, feel and think. Journalism, therefore, demands of its practitioners the widest range of intelligence, of knowledge, and of experience, as well as natural and trained powers of observation and reasoning. To its opportunities as a chronicle are indissolubly linked its obligations as teacher and interpreter.”11 The code contains the following canons: responsibility to the public welfare; freedom of the press; independence from private or partisan interests; sincerity, truthfulness, and accuracy; impartiality; fair play; and decency.
Based on the high rates of trust in news in the United States through the 1960s and 1970s, it would seem that the ASNE code was successful in fostering public trust in journalism at the time. The Roper Center for Public Opinion Research and the American National Election Study data show that in the 1950s–1960s, 65–70 percent of Americans thought the news was “fair.” In 1972, Gallup reported that 68 percent of the public said they had “a great deal” or “a fair amount of trust” in news organizations to “report the news fully, accurately, and fairly.” The Watergate scandal of 1972–1974 highlighted the “watchdog” capacity of the press, with fierce investigative reporting by Bob Woodward and Carl Bernstein at the Washington Post. When Gallup asked their “trust in news” question in 1976, 72 percent of the American public reported trusting news organizations—a historical high.
Through the 1990s and into the 2000s, though, those numbers took a precipitous turn. By 2016, the portion of people who reported trusting news organizations had dropped to 32 percent. The Pew Center for the People and the Press reports that when they began asking questions about trust in media in 1985, 72 percent of Americans viewed news organizations as “highly professional” and only 34 percent said they believed that “news stories are often inaccurate.”12 By 2011, only 57 percent of Americans saw news organizations as “highly professional,” and the percentage of Americans who believed that “news stories are often inaccurate” had risen to 66 percent. Perhaps most relevant to this discussion is the percentage of Americans who reported believing that news organizations are “influenced by powerful people and organizations,” a Pew statistic that rose from 53 percent in 1985 to a shocking 80 percent in 2011.
It is not a coincidence that this plummeting faith in news and growing sense that news organizations are “influenced by powerful people” happened just as media industries were deregulated and facing intense demands for profit. Put simply: profit-oriented changes in the newsrooms of the 1980s and 1990s degraded the practice of journalism. As Kovach and Rosenstiel explain, with corporate ownership, “business practices were put into the newsroom that ran counter to journalism’s and citizens’ best interests.”13 McChesney laments that “by the 1990s, traditional professional journalism was in marked retreat from its standards of the postwar years, due to the tidal wave of commercial pressure brought on by the corporate media system.”14 Journalism professor Herbert Gans describes the problems of postmodern journalism as “[stemming] largely from the very nature of commercially supplied news in a big country.”15
A quick survey of research on the state of news over the past century includes countless historians, journalists, and media theorists all pointing to the same problem: corporate media. Corporate media’s intense focus on profit undermines the function of journalism. Gans writes: “the crucial question about chains and conglomerates as opposed to traditional news firms is profit: how much profit are the news firms expected to deliver, and what effects do the pursuit and expenditure of profit have on the journalists? The more profit the firm demands, the less money is available to be spent on journalists and news coverage, the more bureaus have to be closed and the more shortcuts taken.”16 What kind of profits are we talking about? According to former Washington Post editors Leonard Downie, Jr., and Robert Kaiser, authors of The News about the News: American Journalism in Peril,17 “media owners are accustomed to profit margins that would be impossible in most traditional industries.”18 Whereas General Motors might consider a profit margin of 5 percent of total revenue to be “a very good year,” Downie and Kaiser describe desired profit margins of 30 percent and even 50 percent at newspapers and local news stations, respectively. “Protecting such high profits,” they write, “can easily undermine the notion that journalism is a public service.”19
Part of this dysfunction is due to the complicated economic model that supports the media industry. Unlike a “normal” economic model in which people go into a store and buy a good or service and money changes hands in exchange for that good or service, the news itself is not the product that is bought and sold in exchange for money. Sure, you pay $2 for USA Today at a newsstand (which itself is a quaint notion), but the actual money that is sustaining the paper comes from advertisers. And the advertisers are paying the newspaper or television network not for journalism but for access to audiences.
Kovach and Rosenstiel write: “in short, the business relationship of journalism is different from traditional consumer marketing, and in some ways more complex. It is a triangle. The audience is not the customer buying goods and services. The advertiser is. Yet the customer/advertiser has to be subordinate in that triangle to the third figure, the citizen.” So if “journalists’ first obligation is to the truth, and journalists’ first loyalty is to the citizens,”20 then any efforts to elevate the importance of advertisers or profits over citizens is antithetical to the purpose, function, and obligation of journalism—and, unfortunately, is what corporations do best. That is not intended to be read as a glib dig at corporations. That is an honest assessment of what corporate institutions are designed to do. They are designed to make profits. Their “fiduciary responsibility” is to their shareholders, full stop. McChesney writes: “the main concern of the media giants is to make journalism directly profitable, and there are a couple of proven ways to do that. First, lay off as many reporters as possible. … Second, concentrate upon stories that are inexpensive and easy to cover, like celebrity lifestyle pieces, court cases, plane crashes, crime stories, and shootouts.”21
Is McChesney right? Is this really what journalism became in the 1990s and 2000s under pressure for corporate owners to cut costs and maximize profits? Celebrities, court cases, plane crashes, crime stories, and shootouts? According to Downie and Kaiser, former Washington Post editors who had been with the paper since the mid-1960s, McChesney isn’t too far off. In their 2002 book The News about the News: American Journalism in Peril, they describe a news industry squeezed for profits, producing at the whim of news “consultants” brought in to newsrooms to tell journalists “what audiences want.” They detail the vast cuts to investigative journalism and a reduction in foreign bureaus and correspondents across the globe—at both newspapers and television news organizations alike. They describe a shift in favor of cheaper content, including the rise of television “pundits,” people talking about news in lieu of journalists investigating and reporting news, or, as they put it, “the substitution of talk, opinion, and argument for news.”22 Downie and Kaiser see this dire situation as the result of corporate motives prevailing unchecked across the media landscape. As they put it, “much of what has happened to news has been the by-product of broader economic, technological, demographic and social changes in the country. Most newspapers, television networks, and local television and radio stations now belong to giant, publicly owned corporations far removed from the communities they serve. They face the unrelenting quarterly profit pressures from Wall Street now typical of American capitalism.”23
The first time I was introduced to these problematic aspects of news was through the work of University of Washington professor W. Lance Bennett, author of News: The Politics of Illusion.24 This book, which I read as an undergraduate at the University of New Hampshire in 1998, was first published in 1983. Its tenth edition was published in 2016. In it, Bennett explores the importance of journalism to democratic health and the latest trends in news dissemination and reception. In spite of the many revisions to Bennett’s text over time, what has remained consistent over most of its 30-year life span is its articulation of four major “information biases” present in news: personalization, dramatization, fragmentation, and the authority-disorder bias.
According to Bennett, stories about individual people and personalities predominate over stories about systems and policies in mainstream news content, a phenomenon called “personalization.” The focus is on stories—narratives constructed with a beginning, middle, and end—to satisfy audiences’ supposed need for conflict, closure, and yes, drama (“dramatization”). Bennett explains that “news dramas emphasize crisis over continuity, the present over the past or future, and the personalities at their center. News dramas downplay complex policy information, the workings of government institutions, and the bases of power.”25 These stories are presented as discrete, self-contained entities, with little, if any, exploration of how they are connected. These little nuggets of chaos are absent historical, economic, political, or cultural context, so “the impression is created of a world of chaotic events and crises that seem to appear and disappear because the news picture offers little explanation of their origins” (“fragmentation”).26 Finally, a dance in news content moves back and forth between disorder and chaos on the one hand and authority and order on the other (the “authority-disorder bias”). As outlined by Murray Edelman, news programs construct “a series of threats and reassurances” that repeatedly scare people and then tell them everything is fine.27 Bennett, heavily influenced by Edelman’s work, puts it a bit more concretely: “writing dramatic endings for fragmented stories often becomes the highest imperative in the newsroom. Sometimes authorities save the day, and order is restored to some corner of society. Sometimes authorities fight valiantly, but the forces of evil are simply overwhelming, and disorder seems to prevail.”28 The four information biases Bennett describes can be viewed as a direct result of the profit pressures already discussed. 
Information biases help illustrate the answer to questions like these: What does news look like when news organizations chase what they think the public “wants,” while trying to simultaneously reduce production costs? What does news look like when news organizations gut their investigative units and so increasingly rely on official sources for information? What happens when journalists are under pressure to entertain audiences, to captivate them in an effort to keep them coming back?
Personalization, dramatization, fragmentation, and authority-disorder bias offer a way to categorize the systemic trends in the selection and framing of news programs that are an outgrowth of an industry focused on ratings and cheap production routines. Bennett discusses these information biases as part of a trend toward commercialism that has contributed to “the fall of journalism.”29 “News organizations are being driven into the ground by profit pressures from big corporations that now own most of them,” he writes, pointing to the fact that this trend “has been in motion for over 20 years, with devastating effects on the reporting of so-called ‘hard news.’”30
Why Political Polarization? (And No, It’s Not All the Media’s Fault)
While the erosion of the practice of journalism was certainly to blame for some of the decline of faith in news that has occurred over the last 30 years, it’s not entirely the fault of a profit-centered media system. At least part of this decline can be attributed to the nation’s political polarization. This trend, characterized by a more consistently liberal Democratic Party platform, a more consistently conservative Republican Party platform, and an eroding ideological middle, has made news viewers—and citizens—harder to please. Political communication scholars have documented a “hostile media effect,” in which viewers perceive balanced news reporting to be hostile toward their own ideological position, an effect that is especially concerning in a sharply divided political climate.31 But political polarization is also the result of a complex series of structural, political, and technological factors.
If you live in the United States, partisan gridlock, “all-or-nothing” politics, and compromise as a “dirty word” characterize your political world. Indeed, the Democratic and Republican parties are farther apart from one another on the issues than they have been in decades.32 The average Democrat and the average Republican are both more homogeneous in their own issue positions than they were in the 1990s, with fewer Democrats holding at least some conservative issue positions and fewer Republicans holding at least some liberal issue positions. According to data from the Pew Research Center, the ideological placement of the average Republican has moved to the right while that of the average Democrat has moved to the left. What this leaves is an ever-shrinking political “middle” and a reduced possibility of bipartisan compromise. And while the parties move farther apart ideologically, an increasing number of Americans are describing themselves as politically independent rather than identifying with a political party. If viewed in the context of polarization as a movement of the two major parties to the extremes, this increase in American “independents” is quite logical. The rise in independent voters should be seen at least in part as an outcome of the polarization of the Democrats and Republicans, which leaves those in the ideological middle without a political home.
The roots of America’s political polarization don’t just go back to the 1990s, however. They don’t start with the partisan cable news networks. Instead, this movement away from ideological moderation in the direction of ideological extremity and homogeneity dates back to important social and cultural shifts, as well as changes in the party nominating processes and in the media environment. In the 1960s, as civil rights took center stage in American politics, the two parties’ positions on issues related to race began to crystallize. This phenomenon, referred to by Edward Carmines and James Stimson as “issue evolution,” transformed the Democratic Party into the party of civil rights and the Republican Party into the party of states’ rights.33 Meanwhile, over the past 40 years, the parties were also in the process of distinguishing themselves on so-called social issues: most notably abortion, gay rights, and other matters relating to the separation of church and state. As described by Geoffrey Layman, Thomas Carsey, and Juliana Horowitz, “cultural polarization began in Congress, in party platforms, and among party activists, and then was translated into growing divisions between the parties’ mass coalitions.”34 In other words, party elites staked out their “policy territories,” and then strong party identifiers followed suit.
Polarization between the parties has also been exacerbated indirectly by changes in the way primary elections are conducted—and in how candidates have changed their behaviors as a result. After years of party nominees being selected by party insiders behind closed doors, progressive reformers in the 1910s and 1920s sought to reduce the power of party bosses and bring transparency to the process. By shifting the selection of party nominees to voters through primary elections, the thought was that the process would become less opaque and would reduce corruption in party politics. It was a laudable goal, and a completely reasonable set of expectations. In practice, though, the primary process, while reducing shady insider dealings, has had the unintended consequences of increasing polarization—in two ways.
First, very few citizens actually vote in primary elections. And those who do are more politically engaged and ideologically extreme than most party members. Less than 30 percent of eligible voters typically participate in primary elections. In 2016, that number was 28.5 percent.35 General election turnout is bad enough (usually between 50 and 60 percent), but less than 30 percent? People who vote in political primaries tend to be highly engaged and attentive to politics (which is great), but they also tend to be strong party identifiers who are farther to the left than the average Democrat and farther to the right than the average Republican. This pushes the pools of Republican and Democratic primary voters farther away from the middle than the rest of the American public. This in turn contributes to the election of party nominees who are farther left and right than average party members. Second, recent research suggests that the “extremity” of the primary electorate is not necessarily fueling polarization on its own.36 Rather, in anticipation of “extreme” primary voters, candidates may change their behavior and positions accordingly. Anticipating more ideological and strident primary voters, candidates strategically adopt issue positions that are more ideologically extreme, hence exacerbating this phenomenon.
All of these historical factors contributing to America’s political polarization have been compounded by the growing influence of outside interest groups. Issue-driven interest groups and super PACs help fund candidates who best represent their positions on the issues—positions that tend to be more extreme and less moderate. The 2010 Citizens United ruling by the US Supreme Court allowed for unlimited funds from individuals, corporations, and unions to flow into campaigns and elections. According to the Center for Responsive Politics, over the last decade, aided by the Citizens United ruling, outside spending in American elections has increased from about $500 million in 2010 to $1.7 billion in 2016, with about 99 percent of those funds coming from groups that are politically liberal (44 percent) or politically conservative (55 percent). In 2016, only 1 percent of outside group spending came from groups that were bipartisan or neither left nor right.37
Of particular concern in this equation is the increasing role played by “nondisclosing groups”: 501(c) nonprofit organizations that are not required to disclose the identities of their donors. In 2004, such nondisclosing groups contributed less than $6 million to US elections. By 2016 that number had risen to $180 million. And the ideological breakdown of these nondisclosing groups is astounding. In 2016, $141.9 million came from conservative nondisclosing groups, $34.4 million from liberal ones, and a paltry $2.1 million from “other” groups. Needless to say, the money that is flooding into elections isn’t coming from moderate, bipartisan groups seeking compromise and middle-of-the-road approaches to public policy. Can you imagine someone willing to throw millions of dollars into a campaign because they are really passionate … about moderate issue positions?
It is worth noting that while ideological political polarization is clearly happening, political scientists are not in full agreement on whether or not this “party sorting” is a bad thing in itself.38 This homogeneous sorting process is actually quite rational. It demonstrates the emergence of more internally consistent and constrained belief systems, which political scientist Phil Converse lamented were largely absent in the American electorate of the 1950s.39 Converse feared that without internally consistent belief systems, Americans’ political decision-making processes were largely random, driven by group loyalties or “issues of the moment” in ways that could be easily manipulated or exploited.
While political scientists disagree on whether party sorting itself is inherently good or bad for democracy, one thing that political scientists generally agree is bad is the affective polarization that has accompanied this party sorting.40 The term “affective polarization” (“affective” meaning emotional or feeling-based) captures Americans’ increasing hostility toward members of the opposing party. Americans rate members of the opposing party less favorably now than they did 50 years ago. They even disapprove of the mere suggestion of their child marrying someone from the opposing political party more than they ever have. And this is true of Democrats and Republicans alike. So polarization isn’t just about the parties moving apart on matters of policy. This is about Americans increasingly loathing members of the opposing political party.
Political scientist Jonathan Ladd suggests that rather than thinking of today’s political media environment as one of low trust and high polarization, one should look at the 1950s–1970s as an era of uniquely high trust and low polarization. By all measures, though, trust in journalistic institutions has declined over the last 70 years. Political polarization has increased over that same time. And these two contemporaneous trends have contributed to some rather unhealthy aspects of contemporary American politics: partisan vitriol, legislative gridlock, a reduction in political participation, and the increasing reach and influence of political disinformation. As I will show, this lack of trust in news also contributed to the emergence of alternative sources of political information in the late 1990s.
Cable and Digital Technologies Create New Programming Opportunities
Media deregulation and political polarization might not have had much of an impact on the political information landscape without concurrent changes in media technologies at the close of the twentieth century. If media deregulation and political polarization contributed to the erosion of public trust in news, it was the advent of cable and digital technologies that made alternative political information sources possible. These new technologies expanded the breadth of the information landscape. With new outlets came increased opportunities for experimental programming—where new hybrid genres (a little news, a little entertainment) could test the waters in a low-risk setting. Cable created a place for politically minded comics and entertainment-minded pundits. It made it possible to have entire networks dedicated to comedy and entire networks dedicated to “news.”
The technology of cable originated as far back as the 1940s in the United States. Community Antenna Television (CATV) was a way for people in rural areas, whose television signals were typically obstructed by natural terrain, to import the signals from distant network affiliates using tall community antennas placed atop mountains or hills. These giant community antennas could pick up television broadcasting signals that were too remote for ordinary home antennas. Coaxial cable lines could bring those amplified signals from the community antennas into local homes, increasing the distance that urban affiliates’ signals could reach.41 Residents of rural areas in the 1960s and 1970s paid for cable subscriptions for access to clear signals from nearby network affiliates. The result? Folks like my mom and dad, tucked in between New Hampshire mountains, previously unable to receive any broadcasting signals with their roof antenna, were finally able to receive a clear picture from WCVB out of Boston.
The possibility of “importing signals” through community antennas opened up programming possibilities. If signals could be amplified and imported from distant locations, why not just import the best-quality programming out of Los Angeles and New York to everywhere else in the country? Well, because the FCC at the time said you couldn’t. Both the 1966 and 1972 FCC rules required that cable systems “must carry” a market’s local network affiliates, and the 1972 rules prohibited cable companies from “‘leapfrogging’ nearby stations in favor of large-market independent stations.”42 But throughout the latter half of the 1970s, many of these regulations were loosened. Networks soon began using cheap new satellite technology to amplify their signals, which were then picked up by community antennas and sent through existing cable lines into homes. The FCC’s Open Skies policy on satellite technology made it possible for just about anyone to launch a communications satellite, “thus leaving cable networks free to use satellite as a means of nationally distributing programming.”43 In the early 1970s, a young Ted Turner owned and operated a small independent television station, WTCG, out of Atlanta. In 1976, he capitalized on the cheap combination of satellite and cable to carry his station’s signal nationwide. This new “superstation,” renamed WTBS in 1979, proved highly lucrative. In 1980, using the same basic model, Turner launched the Cable News Network (CNN), the first 24-hour cable news station.
With the passage of the 1984 Cable Act, which focused on deregulating the cable industry, cable experienced explosive growth. The number of cable programming networks increased from 28 to 70 over the 1980s. The five years from 1980 to 1985 saw the birth of Black Entertainment Television (BET), CNN, Bravo, Showtime, Music Television (MTV), the Disney Channel, Lifetime, Playboy, the Financial News Network, the Weather Channel, the Discovery Channel, the Home Shopping Network (HSN), Arts & Entertainment (A&E), and American Movie Classics (AMC).44
“Breaking Up America”
The cable landscape was vast and growing vaster every year. As it grew, the giant mass audiences that broadcast networks had been able to reach since the 1940s began spreading out and shrinking, a process that media scholars call “media fragmentation.” Instead of a handful of giant audiences consuming the same television fare, cable technology created dozens—soon hundreds—of smaller audiences consuming a whole bunch of different things. In 1951, the beloved CBS sitcom I Love Lucy dominated the ratings, “with 11 million families tuning in every week (and that was when there were only 15 million TV sets in the country).”45 Can you imagine? Seventy-three percent of Americans with televisions all watching the same show at the same time. In 2017, the most watched regularly airing program, according to Nielsen, was Sunday Night Football, with 19 million viewers;46 but with 301.7 million people living in homes with televisions,47 that means that only 6 percent of folks with televisions were actually watching that “top-rated” show—a far cry from the 73 percent watching Lucy and Ricky back in 1951.
Media scholars refer to this shift rather hyperbolically as “the death of the mass audience.”48 As viewers spread themselves across an ever-widening array of programming options, the audiences of each individual program shrink. Joseph Turow explores the consequences of this “media fragmentation” in his book Breaking Up America, positing that because the economics of television relied so heavily on advertising revenue, it was the advertising industry in this “fragmented” media world that came to dictate what programming began to look like.49 Yes, cable technologies—and digital technologies in the late 1990s and 2000s—increased the number of outlets and opportunities for new programming, but as Turow explains, the deliberate segmentation of audiences according to demographics and psychographics was driven by advertisers and media executives.
With hundreds of new media outlets, the question of how and where to advertise to a promising market became exponentially more complicated than it had been in the days of ABC, NBC, and CBS. Advertisers couldn’t count on the efficiency of a national or local ad campaign the way they could in the 1970s. And since the products bought and sold in media economics are audiences themselves, cable technologies rattled the economic underpinnings of the entire television industry. Advertisers scurried to find their customers across this diffuse new landscape. Media executives were stymied as well; the very existence of their networks was contingent on advertising revenue. How do you sell advertisers on the idea of marketing to your cable network’s really tiny audience?
According to Turow’s research, media executives sold the desirability of their smaller audiences to advertisers using two claims: “the claim of efficient separation” and “the claim of a special relationship.”50 The claim of efficient separation suggested that media outlets could promise advertisers a small, homogeneous audience, so that advertisers wasted no money on anyone they didn’t want to reach. And the claim of a “special relationship”? This is the notion that because these new outlets were programmed with specialized “niche” content designed for a “specific kind of person,” their audiences were loyal, engaged, and eager to receive everything that came to them through that trusted outlet—including advertising.
The resulting media and advertising content effectively “signaled divisions” between Americans—based on hobbies and interests, yes, but also on race, class, lifestyle, and culture.51 These “efficient separations” of distinct subgroups with whom networks cultivated “special relationships” certainly helped the specificity and efficiency of advertising campaigns but also contributed to cultural and even political divisions. While programming executives figured out how to put sports fans in one box (ESPN) and home décor hobbyists in another (HGTV), they also figured out how to segment news-obsessed partisans into boxes, by means of ideologically driven 24-hour news networks that provide news and “analysis” all while supporting a particular worldview. Meanwhile, astute program developers at a new network called Comedy Central realized that young, politically knowledgeable, largely male viewers were up for grabs, too. For them, Comedy Central offered foul-mouthed “puppets making crank phone calls,”52 as well as cutting—largely left-leaning—political satire.
Cable television was not created with the explicit purpose of dividing audiences into socially, culturally, and politically distinct enclaves. Those outcomes were merely a by-product of the economics of the new technology. But cable’s emergence against the backdrop of low public trust in news and an increasingly polarized electorate positioned it well as the place where media producers could develop new programming genres that would satisfy their audiences’ political information needs: outrage on the right and satire on the left.