Digital Power & Politics
Matthew Gentzkow, Stanford University



Page 1

Digital Power & Politics
Matthew Gentzkow
Stanford University

1

Page 2

Most Important News Source (2016)

2

Page 3

3

How do social media affect the distribution of political news and information?

1. “Theory”

2. Experimental evidence

Page 4

“Theory”

4

1

Page 5

5

Page 6

Echo Chambers circa 2008

6

Page 7

Media

7

Page 8

Face-to-Face Social Networks

8

Page 9

Two Key Forces

1. Most people get news from big, brand-name sites
2. The only people who go to extreme sites are heavy users, and so they also see non-extreme sites as well

9

Page 10

10

Should social media be any different?

Page 11

Social media…

• Filters content through your social network, which, as we saw above, is highly segregated

• Makes sources less important

• Exposes even light users to niche content

11

Page 12

12

POLITICAL SCIENCE

Exposure to ideologically diverse news and opinion on Facebook
Eytan Bakshy,1*† Solomon Messing,1† Lada A. Adamic1,2

Exposure to news, opinion, and civic information increasingly occurs through social media. How do these online networks influence exposure to perspectives that cut across ideological lines? Using deidentified data, we examined how 10.1 million U.S. Facebook users interact with socially shared news. We directly measured ideological homophily in friend networks and examined the extent to which heterogeneous friends could potentially expose individuals to cross-cutting content. We then quantified the extent to which individuals encounter comparatively more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed and further studied users’ choices to click through to ideologically discordant content. Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content.

Exposure to news and civic information is increasingly mediated through online social networks and personalization (1). Information abundance provides individuals with an unprecedented number of options, shifting the function of curating content from newsroom editorial boards to individuals, their social networks, and manual or algorithmic information sorting (2–4). Although these technologies have the potential to expose individuals to more diverse viewpoints (4, 5), they also have the potential to limit exposure to attitude-challenging information (2, 3, 6), which is associated with the adoption of more extreme attitudes over time (7) and misperception of facts about current events (8). This changing environment has led to speculation around the creation of “echo chambers” (in which individuals are exposed only to information from like-minded individuals) and “filter bubbles” (in which content is selected by algorithms according to a viewer’s previous behaviors), which are devoid of attitude-challenging content (3, 9). Empirical attempts to examine these questions have been limited by difficulties in measuring news stories’ ideological leanings (10) and measuring exposure—relying on either error-laden, retrospective self-reports or behavioral data with limited generalizability—and have yielded mixed results (4, 9, 11–15).

We used a large, comprehensive data set from Facebook that allows us to (i) compare the ideological diversity of the broad set of news and opinion shared on Facebook with that shared by individuals’ friend networks, (ii) compare this with the subset of stories that appear in individuals’ algorithmically ranked News Feeds, and (iii) observe what information individuals choose to consume, given exposure on News Feed. We constructed a deidentified data set that includes 10.1 million active U.S. users who self-report their ideological affiliation and 7 million distinct Web links (URLs) shared by U.S. users over a 6-month period between 7 July 2014 and 7 January 2015. We classified stories as either “hard” (such as national news, politics, or world affairs) or “soft” content (such as sports, entertainment, or travel) by training a support vector machine on unigram, bigram, and trigram text features (details are available in the supplementary materials, section S1.4.1). Approximately 13% of these URLs were classified as hard content. We further limited the set of hard news URLs to the 226,000 distinct hard-content URLs shared by at least 20 users who volunteered their ideological affiliation in their profile, so that we could accurately measure ideological alignment. This data set included ~3.8 billion potential exposures (cases in which an individual’s friend shared hard content, regardless of whether it appeared in her News Feed), 903 million exposures (cases in which a link to the content appears on screen in an individual’s News Feed), and 59 million clicks, among users in our study.

We then obtained a measure of content alignment (A) for each hard story by averaging the ideological affiliation of each user who shared the article. Alignment is not a measure of media slant; rather, it captures differences in the kind of content shared among a set of partisans, which can include topic matter, framing, and slant. These scores, averaged over websites, capture key differences in well-known ideologically aligned media sources: FoxNews.com is aligned with conservatives (As = +0.80), whereas HuffingtonPost.com is aligned with liberals (As = –0.65) (additional detail and validation are provided in the supplementary materials, section S1.4.2). We observed substantial polarization among hard content shared by users, with the most frequently shared links clearly aligned with largely liberal or conservative populations (Fig. 1).

The flow of information on Facebook is structured by how individuals are connected in the network. The interpersonal networks on Facebook are different from the segregated structure of political blogs (16); although there is clustering according to political affiliation on Facebook, there are also many friendships that cut across ideological affiliations. Among friendships with individuals who report their ideological affiliation in their profile, the median proportion of friendships that liberals maintain with conservatives is 0.20, interquartile range (IQR) [0.09, 0.36]. Similarly, the median proportion of friendships that conservatives maintain with liberals is 0.18, IQR [0.09, 0.30] (Fig. 2).

How much cross-cutting content individuals encounter depends on who their friends are and what information those friends share. If individuals acquired information from random others, ~45% of the hard content that liberals would be exposed to would be cross-cutting, compared with 40% for conservatives (Fig. 3B). Of course, individuals do not encounter information at random in offline environments (14) nor on the Internet (9). Despite the slightly higher volume of conservatively aligned articles shared (Fig. 1), liberals tend to be connected to fewer friends who share information from the other side, compared with their conservative counterparts: Of the hard news stories shared by liberals’ friends, 24% are cross-cutting, compared with 35% for conservatives (Fig. 3B).

The media that individuals consume on Facebook depends not only on what their friends share but also on how the News Feed ranking …

Science, Vol. 348, Issue 6239, 5 June 2015, p. 1130

1Facebook, Menlo Park, CA 94025, USA. 2School of Information, University of Michigan, Ann Arbor, MI, USA. *Corresponding author. E-mail: [email protected] †These authors contributed equally to this work.
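The content-alignment measure A described above is just an average over sharers. A minimal sketch of that computation; the share data, story URLs, and the ±0.25 classification cutoff are illustrative assumptions, not values from the paper:

```python
from collections import defaultdict

# Each share pairs a story URL with the sharer's self-reported ideology,
# on a -1 (most liberal) to +1 (most conservative) scale. Made-up data.
shares = [
    ("foxnews.com/story-a", 0.9), ("foxnews.com/story-a", 0.7), ("foxnews.com/story-a", 0.8),
    ("huffingtonpost.com/story-b", -0.8), ("huffingtonpost.com/story-b", -0.5),
    ("example.com/travel-c", 0.1), ("example.com/travel-c", -0.1),
]

def alignment_scores(shares):
    """Content alignment A: mean ideological affiliation of each story's sharers."""
    by_url = defaultdict(list)
    for url, ideology in shares:
        by_url[url].append(ideology)
    return {url: sum(vals) / len(vals) for url, vals in by_url.items()}

def classify(a, cutoff=0.25):
    """Bucket an alignment score; this cutoff is an illustrative assumption."""
    if a >= cutoff:
        return "conservative"
    if a <= -cutoff:
        return "liberal"
    return "neutral"

scores = alignment_scores(shares)
labels = {url: classify(a) for url, a in scores.items()}
```

Averaging story-level scores within a site would give site-level scores of the kind quoted above (e.g. FoxNews.com near +0.80).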

Fig. 1. Distribution of ideological alignment of content shared on Facebook, measured as the average affiliation of sharers weighted by the total number of shares. Content was delineated as liberal, conservative, or neutral on the basis of the distribution of alignment scores (details are available in the supplementary materials). [Histogram: x-axis, alignment score (−1 to 1); y-axis, proportion of shares (0.00–0.04); legend: Liberal / Neutral / Conservative.]
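The cross-cutting exposure percentages reported in the article (24% for liberals, 35% for conservatives) reduce to a simple fraction over friends' shares. A toy sketch with made-up numbers, using the same sign convention as the alignment scale:

```python
# Cross-cutting exposure: the fraction of hard-news links shared by a user's
# friends whose alignment score has the opposite sign to the user's own
# ideology. All numbers here are invented for illustration.
def cross_cutting_share(user_ideology, friend_share_alignments):
    """Fraction of friends' shared stories aligned opposite to the user."""
    opposite = sum(1 for a in friend_share_alignments if a * user_ideology < 0)
    return opposite / len(friend_share_alignments)

# A liberal user (ideology -1) whose friends mostly share liberal-aligned stories:
liberal_feed = [-0.7, -0.5, -0.6, 0.4, -0.3]
share = cross_cutting_share(-1.0, liberal_feed)  # one of five stories is cross-cutting
```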

Page 13

13

[Same first page of the Bakshy, Messing & Adamic (2015) article as on the previous slide.]

Page 14

14

Content

Page 15

Bottom Line

• Digital media need not exacerbate segregation and polarization

• But the structure of social media platforms makes them likely to do so

15

Page 16

Experiment
Allcott, Braghieri, Eichmeyer & Gentzkow (2019)

16

2

Page 17

17

Randomized experiment: paid users to deactivate Facebook for 4 weeks before the 2018 US midterm elections

Individual effects
• Substitute time uses
• Happiness
• Post-experiment use & valuation

Broader social impacts
• News knowledge
• Voting
• Political polarization
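Because deactivation was randomly assigned, each of these outcomes can be estimated with a simple difference in endline means between deactivated and control users. A self-contained sketch on synthetic data; the outcome, effect size, and sample sizes are invented, not the study's:

```python
import math
import random

random.seed(0)

# Synthetic illustration of the design: compare an endline outcome (say, a
# standardized news-knowledge index) between users randomized to deactivate
# and controls. The -0.2 "true effect" is an assumption for the simulation.
control = [random.gauss(0.0, 1.0) for _ in range(500)]
treatment = [random.gauss(-0.2, 1.0) for _ in range(500)]

def diff_in_means(t, c):
    """Treatment-effect estimate and its standard error (unequal-variance formula)."""
    mt, mc = sum(t) / len(t), sum(c) / len(c)
    vt = sum((x - mt) ** 2 for x in t) / (len(t) - 1)
    vc = sum((x - mc) ** 2 for x in c) / (len(c) - 1)
    se = math.sqrt(vt / len(t) + vc / len(c))
    return mt - mc, se

effect, se = diff_in_means(treatment, control)
```

A real analysis would typically also adjust for baseline covariates to tighten the estimate; the difference in means is the unadjusted analogue.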

Page 18

Timeline (2018)

Sept 24 – Oct 3: Recruitment, pre-screen, and baseline
Oct 11: Midline
Nov 8: Endline
Dec 3: Post-endline

18

Page 19

Recruitment

19

Page 20

Deactivation

20

Page 21

21

Page 22

22

Substitution

Page 23

23

Page 24

24

Page 25

25

News Knowledge

Page 26

26

Page 27

27

Polarization

Page 28

28

Page 29

Bottom Line

• Facebook makes people more informed

• Facebook makes people more polarized

29