Digital Power & Politics (lecture transcript, posted 29 September 2020)

Digital Power & Politics
Matthew Gentzkow, Stanford University


Most Important News Source (2016)


How do social media affect the distribution of political news and information?

1. “Theory”

2. Experimental evidence

“Theory”


Echo Chambers circa 2008


Media


Face-to-Face Social Networks


Two Key Forces

1. Most people get news from big, brand-name sites.
2. The only people who go to extreme sites are heavy users, and so they also see non-extreme sites as well.
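The second force is ultimately an arithmetic point about usage intensity. A toy calculation (all visit counts are made up for illustration) makes it concrete:

```python
# Toy illustration (made-up numbers): even for a heavy news user who
# visits extreme sites, those visits are a small share of the total diet,
# because heavy users also consume lots of mainstream news.

def extreme_share(extreme_visits, mainstream_visits):
    """Fraction of a user's news visits that go to extreme sites."""
    total = extreme_visits + mainstream_visits
    return extreme_visits / total

# A light user never reaches extreme sites at all.
light = extreme_share(0, 5)    # 0.0

# A heavy user does, but mainstream visits still dominate the diet.
heavy = extreme_share(5, 45)   # 0.1
```

So even the users who do visit extreme sites end up with mostly mainstream exposure, which is why overall online news diets circa 2008 were not very segregated.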


Should social media be any different?

Social media…

• Filters content through your social network, which, as we saw above, is highly segregated

• Makes sources less important

• Exposes even light users to niche content
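The first bullet can be made concrete with a toy calculation (all fractions and rates below are made up): if content reaches a user in proportion to what their friends share, a segregated friend network skews the ideological mix of exposure relative to the overall pool of shared content.

```python
# Toy illustration (all numbers made up): exposure filtered through a
# segregated friend network vs. drawing from the overall content pool.

def exposure_mix(friend_fractions, share_rates):
    """Probability that a randomly encountered story comes from each side.

    friend_fractions: fraction of friends on each side, e.g. {"own": 0.8, "other": 0.2}
    share_rates: average stories shared per friend on each side
    """
    weights = {side: friend_fractions[side] * share_rates[side]
               for side in friend_fractions}
    total = sum(weights.values())
    return {side: w / total for side, w in weights.items()}

# Overall pool: half of all shared stories come from each side.
pooled = exposure_mix({"own": 0.5, "other": 0.5}, {"own": 1.0, "other": 1.0})

# Segregated network: 80% same-side friends, identical sharing behavior.
segregated = exposure_mix({"own": 0.8, "other": 0.2}, {"own": 1.0, "other": 1.0})

# pooled["other"] is 0.5; segregated["other"] drops to 0.2.
```

The point is structural: even with identical content and identical sharing behavior, routing exposure through a homophilous network mechanically reduces cross-cutting exposure.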


POLITICAL SCIENCE

Exposure to ideologically diverse news and opinion on Facebook
Eytan Bakshy,1*† Solomon Messing,1† Lada A. Adamic1,2

Exposure to news, opinion, and civic information increasingly occurs through social media. How do these online networks influence exposure to perspectives that cut across ideological lines? Using deidentified data, we examined how 10.1 million U.S. Facebook users interact with socially shared news. We directly measured ideological homophily in friend networks and examined the extent to which heterogeneous friends could potentially expose individuals to cross-cutting content. We then quantified the extent to which individuals encounter comparatively more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed and further studied users’ choices to click through to ideologically discordant content. Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content.

Exposure to news and civic information is increasingly mediated through online social networks and personalization (1). Information abundance provides individuals with an unprecedented number of options, shifting the function of curating content from newsroom editorial boards to individuals, their social networks, and manual or algorithmic information sorting (2–4). Although these technologies have the potential to expose individuals to more diverse viewpoints (4, 5), they also have the potential to limit exposure to attitude-challenging information (2, 3, 6), which is associated with the adoption of more extreme attitudes over time (7) and misperception of facts about current events (8). This changing environment has led to speculation around the creation of “echo chambers” (in which individuals are exposed only to information from like-minded individuals) and “filter bubbles” (in which content is selected by algorithms according to a viewer’s previous behaviors), which are devoid of attitude-challenging content (3, 9). Empirical attempts to examine these questions have been limited by difficulties in measuring news stories’ ideological leanings (10) and measuring exposure (relying on either error-laden, retrospective self-reports or behavioral data with limited generalizability) and have yielded mixed results (4, 9, 11–15).

We used a large, comprehensive data set from Facebook that allows us to (i) compare the ideological diversity of the broad set of news and opinion shared on Facebook with that shared by individuals’ friend networks, (ii) compare this with the subset of stories that appear in individuals’ algorithmically ranked News Feeds, and (iii) observe what information individuals choose to consume, given exposure on News Feed. We constructed a deidentified data set that includes 10.1 million active U.S. users who self-report their ideological affiliation and 7 million distinct Web links (URLs) shared by U.S. users over a 6-month period between 7 July 2014 and 7 January 2015. We classified stories as either “hard” (such as national news, politics, or world affairs) or “soft” content (such as sports, entertainment, or travel) by training a support vector machine on unigram, bigram, and trigram text features (details are available in the supplementary materials, section S1.4.1). Approximately 13% of these URLs were classified as hard content. We further limited the set of hard news URLs to the 226,000 distinct hard-content URLs shared by at least 20 users who volunteered their ideological affiliation in their profile, so that we could accurately measure ideological alignment. This data set included ~3.8 billion potential exposures (cases in which an individual’s friend shared hard content, regardless of whether it appeared in her News Feed), 903 million exposures (cases in which a link to the content appears on screen in an individual’s News Feed), and 59 million clicks, among users in our study.

We then obtained a measure of content alignment (A) for each hard story by averaging the ideological affiliation of each user who shared the article. Alignment is not a measure of media slant; rather, it captures differences in the kind of content shared among a set of partisans, which can include topic matter, framing, and slant. These scores, averaged over websites, capture key differences in well-known ideologically aligned media sources: FoxNews.com is aligned with conservatives (As = +0.80), whereas the HuffingtonPost.com is aligned with liberals (As = −0.65) (additional detail and validation are provided in the supplementary materials, section S1.4.2). We observed substantial polarization among hard content shared by users, with the most frequently shared links clearly aligned with largely liberal or conservative populations (Fig. 1).

The flow of information on Facebook is structured by how individuals are connected in the network. The interpersonal networks on Facebook are different from the segregated structure of political blogs (16); although there is clustering according to political affiliation on Facebook, there are also many friendships that cut across ideological affiliations. Among friendships with individuals who report their ideological affiliation in their profile, the median proportion of friendships that liberals maintain with conservatives is 0.20, interquartile range (IQR) [0.09, 0.36]. Similarly, the median proportion of friendships that conservatives maintain with liberals is 0.18, IQR [0.09, 0.30] (Fig. 2).

How much cross-cutting content individuals encounter depends on who their friends are and what information those friends share. If individuals acquired information from random others, ~45% of the hard content that liberals would be exposed to would be cross-cutting, compared with 40% for conservatives (Fig. 3B). Of course, individuals do not encounter information at random in offline environments (14) nor on the Internet (9). Despite the slightly higher volume of conservatively aligned articles shared (Fig. 1), liberals tend to be connected to fewer friends who share information from the other side, compared with their conservative counterparts: Of the hard news stories shared by liberals’ friends, 24% are cross-cutting, compared with 35% for conservatives (Fig. 3B).

The media that individuals consume on Facebook depends not only on what their friends share but also on how the News Feed ranking …
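The paper’s two building blocks, the alignment measure (the average self-reported affiliation of a story’s sharers) and cross-cutting exposure (stories whose alignment opposes the reader’s side), lend themselves to a compact sketch. Everything below is illustrative: the affiliation coding (−1 = liberal, +1 = conservative) and the example data are assumptions for demonstration, not the paper’s actual scale or data.

```python
# Illustrative sketch of the paper's two building blocks (made-up data):
#   - alignment A of a story = mean affiliation of its sharers
#   - cross-cutting exposure = stories whose alignment opposes the reader
# Affiliations are coded on an assumed scale: -1 = liberal, +1 = conservative.

def alignment(sharer_affiliations):
    """Content alignment A: average ideological affiliation of sharers."""
    return sum(sharer_affiliations) / len(sharer_affiliations)

def cross_cutting_fraction(reader_affiliation, story_alignments):
    """Fraction of stories whose alignment has the opposite sign to the
    reader's own affiliation (neutral stories count as not cross-cutting)."""
    cross = sum(1 for a in story_alignments if a * reader_affiliation < 0)
    return cross / len(story_alignments)

# A story shared mostly by conservatives gets a positive alignment score.
story = alignment([1.0, 1.0, 0.5, -0.5])   # (1 + 1 + 0.5 - 0.5) / 4 = 0.5

# A liberal reader whose friends share four stories, one conservative-aligned.
frac = cross_cutting_fraction(-1.0, [-0.8, -0.6, 0.7, -0.9])   # 1/4 = 0.25
```

Note that, as the paper stresses, this alignment score reflects who shares a story, not an independent rating of the outlet’s slant.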

Science, Vol. 348, Issue 6239, p. 1130, 5 June 2015. sciencemag.org

1 Facebook, Menlo Park, CA 94025, USA. 2 School of Information, University of Michigan, Ann Arbor, MI, USA. *Corresponding author. E-mail: ebakshy@fb.com. †These authors contributed equally to this work.

Fig. 1. Distribution of ideological alignment of content shared on Facebook, measured as the average affiliation of sharers weighted by the total number of shares. Content was delineated as liberal, conservative, or neutral on the basis of the distribution of alignment scores (details are available in the supplementary materials).

[Figure 1 panel: histogram of alignment scores from −1 to 1 (x-axis: alignment score; y-axis: proportion of shares, 0.00–0.04); legend: Liberal, Neutral, Conservative.]




Content

Bottom Line

• Digital media need not exacerbate segregation and polarization

• But the structure of social media platforms makes them likely to do so


Experiment
Allcott, Braghieri, Eichmeyer & Gentzkow (2019)


Randomized experiment: paid users to deactivate Facebook for 4 weeks before the 2018 US midterm election

Individual effects
• Substitute time uses
• Happiness
• Post-experiment use & valuation

Broader social impacts
• News knowledge
• Voting
• Political polarization
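Because deactivation was randomly assigned, the causal effect on each outcome can be estimated as a simple difference in endline means between the treatment and control groups. A minimal sketch of that logic, using entirely synthetic data and a hypothetical news-knowledge score (not the study’s actual data or estimates):

```python
# Minimal difference-in-means sketch for a randomized experiment.
# All data below are synthetic; 'news knowledge' is a hypothetical score.

def mean(xs):
    return sum(xs) / len(xs)

def ate(treated_outcomes, control_outcomes):
    """Difference-in-means estimate of the average treatment effect."""
    return mean(treated_outcomes) - mean(control_outcomes)

# Endline news-knowledge scores (synthetic):
deactivated = [6.0, 5.5, 6.5, 5.0]   # paid to deactivate Facebook
control     = [7.0, 6.5, 7.5, 7.0]   # kept using Facebook

effect = ate(deactivated, control)   # 5.75 - 7.0 = -1.25
```

A negative effect of deactivation on news knowledge, as in this synthetic example, is what the “Facebook makes people more informed” conclusion corresponds to. The actual study also conditions on baseline measures for precision; this sketch shows only the core randomization logic.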

Timeline (2018)

Sept 24 – Oct 3: Recruitment, pre-screen, and baseline
Oct 11: Midline
Nov 8: Endline
Dec 3: Post-endline


Recruitment


Deactivation


Substitution


News Knowledge


Polarization


Bottom Line

• Facebook makes people more informed

• Facebook makes people more polarized
