
Bias, Belief and Consensus: Collective opinion formation on fluctuating networks

Vudtiwat Ngampruetikorn1 and Greg J. Stephens1, 2

1Biological Physics Theory Unit, Okinawa Institute of Science and Technology Graduate University, Onna, Okinawa 904-0495, Japan

2Department of Physics & Astronomy, Vrije Universiteit Amsterdam, 1081 HV Amsterdam, The Netherlands

(Dated: December 31, 2015)

With the advent of online networks, societies are substantially more connected with individual members able to easily modify and maintain their own social links. Here, we show that active network maintenance exposes agents to confirmation bias, the tendency to confirm one’s beliefs, and we explore how this affects collective opinion formation. We introduce a model of binary opinion dynamics on a complex network with fast, stochastic rewiring and show that confirmation bias induces a segregation of individuals with different opinions. We use the dynamics of global opinion to generally categorize opinion update rules and find that confirmation bias always stabilizes the consensus state. Finally, we show that the time to reach consensus has a non-monotonic dependence on the magnitude of the bias, suggesting a novel avenue for large-scale opinion engineering.

A society is molded by the opinions of its members and with enough time pervasive opinions become the basis of customs and culture. However, even the most deeply-rooted ideas can be challenged and occasionally, even rapidly, dethroned, as occurred recently with attitudes towards gay marriage in the US [1]. On the other hand, fragmented, polarized opinions can and do persist, so that whether and how a belief emerges as a social norm is a fundamental and important question [2].

While consensus is a collective state, the existence and dynamics of this state depend on how individuals receive, process and respond to information, all of which are influenced by cognitive bias, i.e., systematic errors in subjective judgements [3]. Among various cognitive biases, ‘confirmation bias’ – the tendency to confirm one’s beliefs [4] – is not only particularly relevant to opinion formation but also sensitive to new communication technology [5–9]. In online social networks, for example, it only takes seconds to silence disagreeing views or to seek out new like-minded contacts. Indeed, the increasing ability to fashion and maintain our own social circles regardless of geographic proximity amplifies the effects of a general preference for agreeing information sources, a confirmation bias known as ‘selective exposure’ [6–8, 10]. Individuals are additionally subject to ‘biased interpretation’ [11, 12], the inclination to discount information undermining one’s views. Understanding how these and other cognitive biases affect collective opinion formation is likely to be increasingly important to understanding social discourse in modern societies.

Here, we explore how confirmation bias can affect the formation, stability and dynamics of the consensus state. For discrete opinions, the quantitative analysis of consensus dynamics can be traced to various kinetic Ising models [13–20], including the pioneering work of the voter model [21, 22] which is exactly solvable in all dimensions. The hallmarks of the voter model include (see, e.g., Ref. [23]): inevitability of consensus; probabilistic conservation of average opinion; and how dimensionality d qualitatively affects the time to consensus: TC ∼ N² for d = 1, N ln N for d = 2 and N for d > 2, with N denoting the system size. Later analyses have considered social-network-inspired refinements, including complex, more natural network topologies [18, 19, 24–26] (see, e.g., Ref. [20] for a review) and, more recently, the beginning of a fully dynamical approach [27–31].

In distinction to imposing and refining static network structures, we endow the links with dynamics so that collective opinions and the changing social network in which they sit both emerge and coevolve. Agents update their opinion based on their neighbors, and links form and disappear with differing dynamics depending on whether they connect agreeing or disagreeing agents. We find that confirmation bias in the form of asymmetric link dynamics leads naturally to opinion segregation. In the limit of large bias, we show that opinion polarization is unstable and consensus is stable, and this result is universal within a general family of update rules. We further show that although strong bias in general increases consensus time, weaker bias could speed up consensus formation, a surprising result with implications for active network engineering. Finally we discuss the implications of our results for various social systems.

We introduce a model of opinion dynamics to encompass both bias and opinion changes [Fig. 1]. Here, as in previous approaches [18, 19, 30, 31], agents are represented as nodes and carry a binary opinion ↑ (σ = 1) or ↓ (σ = −1). The network, however, isn’t static [27–31]. Instead, links disappear randomly so that connections between agents with the same opinion (agreeing links) decay at a rate δ+ while those between opposite opinions (disagreeing links) decay at a faster rate δ− > δ+ [Fig. 1(a)]. This difference in rates encodes the social forces that push individuals towards agreement and away from cognitive conflict [33, 34]. As disagreeing links are shorter lived, islands of agreement form spontaneously around each agent and these islands are the manifestation of confirmation bias.



FIG. 1. We model confirmation bias in collective opinion formation as heterogeneous link dynamics coupled with opinion updates on a complex network. Top – Example Opinion Network: Segregation is evident in a network of political weblogs (data from [32]), labelled according to their inclinations: liberal (empty circles) or conservative (filled circles) [a]. Bottom – Model: (a) All links have a finite lifetime. Agreeing links (green) connect like-minded agents and decay at a rate δ+ while disagreeing links (red) decay at a rate δ− > δ+. (b) New links are formed at a rate µ. (c) ‘Spineless’ agents discard their current opinions at a rate ν and immediately adopt the ↑ opinion with a probability p, determined purely by the opinions of their nearest neighbors. Opinion updating and rewiring are intricately coupled; the local network structure, governed by the rewiring dynamics, dictates opinion updating, and the opinion configuration determines the decay rates in the rewiring dynamics.

[a] To represent only the most active blogs, we remove nodes of degree one until the minimum degree is two.

Finally, we allow for spontaneous new connections between any two unconnected nodes, and this occurs at a rate µ [Fig. 1(b)], which we choose proportional to 1/N to maintain a constant mean degree across different network sizes.

All agents reevaluate their current opinions at a rate ν after which they adopt an ↑ opinion with probability p [Fig. 1(c)]. We consider only ‘spineless’ agents – those whose opinion is determined purely by the opinions of their neighbors. We will return to more complex decision rules such as Bayesian updating (see, e.g., Refs. [2, 35]) in the discussion. In distinction to previous work on opinion networks (though see [36–38] on Ising systems), we characterize the update probability p through a generalized family of functions so that commonly analyzed update processes such as ‘majority rule’ and ‘proportional rule’ appear as limiting cases. We note that the sculpting of the network through the dynamics of confirmation bias


FIG. 2. Confirmation bias sculpts local social circles and induces segregation of opinions. As seen by a reference agent with ↑ opinion, we show local opinion as a function of global opinion for biased (η > 1, solid) and unbiased (η = 1, dashed) network dynamics. Snapshots depict exemplar networks and illustrate that unbiased systems (disks) are well-mixed while biased systems (squares) are segregated. The local opinion xσ is the average opinion of neighbors [Eq. (3)] and the global opinion m is the average opinion of all agents [Eq. (4)].

detailed above means that the composition of neighborhoods at the time of opinion reevaluation depends on the previous opinion; agents deposit their memory in the network structure and are effectively less spineless than those in a static network.
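As a concrete illustration of the model just described, the following minimal discrete-time Monte Carlo sketch (not the authors' code) co-evolves opinions and links: agreeing links decay at δ+, disagreeing links at δ− = ηδ+, absent links appear at a rate µ ∝ 1/N, and each agent re-evaluates its opinion at rate ν. The proportion rule is used only as an example update rule, and the function name simulate and all parameter values are illustrative assumptions.

    import numpy as np

    def simulate(N=200, k_mean=20, delta_plus=1.0, eta=5.0, nu=0.01,
                 dt=0.05, steps=4000, seed=0):
        """Discrete-time sketch of the co-evolving opinion/network model.

        Agreeing links decay at rate delta_plus, disagreeing links at
        delta_minus = eta * delta_plus, absent links appear at rate mu
        (chosen so the unbiased mean degree is ~k_mean), and each agent
        re-evaluates its opinion at rate nu using the proportion rule.
        """
        rng = np.random.default_rng(seed)
        delta_minus = eta * delta_plus
        mu = delta_plus * k_mean / N            # P(link) ~ mu/delta_plus = k_mean/N
        sigma = rng.choice([-1, 1], size=N)     # binary opinions, +1 or -1
        A = np.zeros((N, N), dtype=bool)        # adjacency matrix
        iu = np.triu_indices(N, k=1)            # each pair counted once

        m_trace = []
        for _ in range(steps):
            # link rewiring: removal at delta_+/-, creation at mu
            agree = (sigma[:, None] * sigma[None, :] > 0)[iu]
            decay = np.where(agree, delta_plus, delta_minus)
            r = rng.random(iu[0].size)
            on = A[iu]
            new_on = np.where(on, r > decay * dt, r < mu * dt)
            A[iu] = new_on
            A[iu[1], iu[0]] = new_on            # keep A symmetric
            # opinion re-evaluation at rate nu (proportion rule as example)
            update = rng.random(N) < nu * dt
            deg = A.sum(axis=1)
            n_up = A.astype(float) @ (sigma == 1).astype(float)
            frac_up = np.divide(n_up, deg, out=np.full(N, 0.5), where=deg > 0)
            flip_up = rng.random(N) < frac_up
            sigma = np.where(update, np.where(flip_up, 1, -1), sigma)
            m_trace.append(sigma.mean())
        return np.array(m_trace)

    print(simulate()[-1])    # global opinion m at the end of the run

With these illustrative values only a few opinion updates per agent occur, so the run demonstrates the mechanics of the coupled dynamics rather than full consensus; longer runs are needed to see the collective behavior discussed below.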

Real social networks are becoming larger and more connected as links are easy to create and maintain. Facebook, for example, currently consists of over a billion users, each with an average of hundreds of friends [39]. Thus we consider our model in the limits of large population and average degree 1 ≪ k ≪ N where a mean-field approximation is accurate. Online networks also greatly facilitate link maintenance so that it is relatively easy to acquire new neighbors and silence existing ones. Here we assume that network rewiring is much faster than and therefore decoupled from opinion updating: between two opinion updates, there is always enough time for the network to reach ‘equilibrium wiring’.

The wiring is encoded in the adjacency matrix A so that Aij = 1 if there exists a link between agents i and j and Aij = 0 otherwise. Given a static opinion configuration, the master equation for A is given by

Ṗ(Aij = 1) = −δij P(Aij = 1) + µ P(Aij = 0),   (1)

where δij = δ± for σiσj = ±1. This yields a solution

P(Aij = 1) ≈ µ/δij + C e^{−δij t},   (2)

where we have taken the limit N → ∞ and C is the constant of integration. The transient dynamics defines the adiabatic approximation so that opinion updating is slow if min δ± ≫ ν. In equilibrium, the probability that


a link between agreeing and disagreeing agents exists is P+ = µ/δ+ and P− = µ/δ−, respectively. Note that although the equilibrium wiring does not contain explicit time dependence, it is determined by the opinion configuration which does evolve (slowly) in time.
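As a quick numerical check of Eqs. (1)–(2), the short sketch below (illustrative rates, with µ ≪ δ as when µ ∝ 1/N) integrates the single-link master equation and compares the result with the exact steady state µ/(µ + δ) and its large-N form µ/δ.

    # Euler integration of the single-link master equation, Eq. (1),
    # with illustrative rates mu << delta (mimicking mu ~ 1/N).
    mu, delta, dt = 0.01, 1.0, 1e-3
    P = 0.0                                   # start with no link present
    for _ in range(int(10.0 / dt)):           # integrate to t = 10 >> 1/delta
        P += dt * (-delta * P + mu * (1.0 - P))
    print(P, mu / (mu + delta), mu / delta)   # numerical, exact steady state, large-N form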

The mean numbers of agreeing and disagreeing links around a σ-agent (an agent with opinion σ) are given by (Nσ − 1)P+ and N−σP−, respectively. In the mean-field approximation, the average neighbor opinion around this agent is given by

xσ ≈ σ (NσP+ − N−σP−) / (NσP+ + N−σP−) = σ [(σ + m)η − (σ − m)] / [(σ + m)η + (σ − m)],   (3)

where the global opinion m and the parameterisation of confirmation bias η are defined as follows,

m ≡ (N↑ − N↓)/(N↑ + N↓)   and   η ≡ P+/P−.   (4)

The system is biased, exhibiting confirmation bias, if η > 1 and unbiased if η = 1. Our model also allows for anti-confirmation bias (η < 1) but no strong evidence exists for such an anti-social behavior on large networks and we omit this case in the following analysis.

In Figure 2 we show how asymmetric link dynamics (which underlie confirmation bias in our model) affects network structure and local opinion. When η = 1, the system is well-mixed and the local and global opinions are equal, xσ = m. For η > 1, on the other hand, agents of differing opinions segregate as each forms a more agreeable neighborhood, x↓ ≤ m ≤ x↑.
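This segregation can be checked directly from Eqs. (3)–(4); the sketch below (the function name x_sigma and the sampled values are illustrative) reproduces xσ = m for η = 1 and x↑ > 0 > x↓ at m = 0 for η > 1.

    import numpy as np

    def x_sigma(sigma, m, eta):
        """Average neighbor opinion around a sigma-agent, Eq. (3)."""
        return sigma * ((sigma + m) * eta - (sigma - m)) / (
                        (sigma + m) * eta + (sigma - m))

    m = np.linspace(-0.9, 0.9, 7)
    print(np.allclose(x_sigma(+1, m, 1.0), m))           # unbiased: x_sigma = m
    print(x_sigma(+1, 0.0, 5.0), x_sigma(-1, 0.0, 5.0))  # eta = 5: x_up > 0 > x_down at m = 0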

Spineless agents possess neither intrinsic beliefs nor memories so that their update rules depend entirely on local opinions; the probability of adopting the ↑ opinion is given by pσ = p(xσ) for an agent whose previous opinion is σ. Spinelessness also implies invariance under opinion exchange, ↑↔↓, thus p(−x) = 1 − p(x). Once there is consensus among neighbors, spineless agents always adopt the consensus opinion, hence p(−1) = 0 and p(1) = 1. In Fig. 3(a) we illustrate a variety of update rules, including the common examples of majority and proportional rule, and we show later that these rules can be simply classified based on the overall network dynamics. In all cases, we further assume that agents are conforming so that they adopt an opinion with a probability that increases with the local fraction of that opinion, p′(x) ≥ 0.

Collective opinion states emerge as fixed points of the global dynamics,

ṁ = −(1 + m)(1 − p(x↑)) + (1 − m) p(x↓).   (5)

Here m denotes the global opinion and xσ={↑,↓} the local opinion around σ-agents. We have also set ν ≡ 1 so that all other timescales are expressed in these units. The first term arises from ↑-to-↓ conversions and the other describes ↓-to-↑ conversions. We note that the dynamics is invariant under opinion exchange, p(xσ) → 1 − p(x−σ)


FIG. 3. Opinion update rules can be generally classified based on global opinion dynamics in unbiased systems. (a) The probability of adopting the ↑ opinion [see Fig. 1(c)]. Given the constraints of spinelessness (solid disks), conformity (p′(x) ≥ 0) and a mild assumption of a single inflection point, all update rules can be classified as type-I (blue) or type-II (red), based on the resulting dynamics (see panel below). Also shown are the well-known special cases of our classification. The majority rule (dashed) is the extreme limit of type-I rules while the proportion rule (dash-dotted) – also known as the voter model – is at the interface between the two types of rules. (b) Dynamical phase space in unbiased systems for the same update rules as in (a). Consensus is stable for type-I rules and unstable for type-II rules. For the proportion rule, the system is completely static and fluctuations are important in finite systems.

and m → −m, as required by spinelessness. Two fixed points are always present. They correspond to consensus, where m∗ = ±1 and x∗σ = ±1, and the polarized state, where m∗ = 0 and, owing to spinelessness, x∗↑ = −x∗↓. The asterisks denote local and global opinions at the corresponding fixed points. In general, the behavior of the dynamics is determined by the nature (e.g., attractive or repulsive) of these fixed points.
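To make the fixed-point structure explicit, the following sketch evaluates Eq. (5) together with Eq. (3); the proportion rule is used only as an example, and the names mdot and x_sigma and the chosen value of η are illustrative. It confirms that ṁ vanishes at m = 0, ±1, that the unbiased voter limit is completely static in mean field, and that bias drives the flow toward consensus.

    import numpy as np

    def x_sigma(sigma, m, eta):                  # local opinion, Eq. (3)
        return sigma * ((sigma + m) * eta - (sigma - m)) / (
                        (sigma + m) * eta + (sigma - m))

    def mdot(m, eta, p):                         # global dynamics, Eq. (5), nu = 1
        return (-(1 + m) * (1 - p(x_sigma(+1, m, eta)))
                + (1 - m) * p(x_sigma(-1, m, eta)))

    p_prop = lambda x: (1 + x) / 2               # proportion rule (voter model)
    m = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
    print(mdot(m, 1.0, p_prop))                  # unbiased voter model: identically zero
    print(mdot(m, 5.0, p_prop))                  # biased: zero at m = 0, +/-1, flow toward consensus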

The stability of each fixed point depends on both the opinion update rules and the severity of confirmation bias. To separate the role of the update rules, we first focus on the unbiased case (η = 1). Here we find, under the mild assumption of a single inflection point, that all update rules can be classified as type-I or type-II [Fig. 3(a)], depending on the resulting dynamics [Fig. 3(b)]. Concave when x > 0, type-I rules drive the system to consensus, m = ±1. Convex when x > 0, type-II rules favor the polarised state, m = 0. Neither concave nor convex, the interface between the two classes of rules corresponds to the proportion rule, also known as the voter model [21, 22]. This special case leads to completely static systems in our mean-field approximation. The majority rule or zero-temperature Glauber kinetics [40], in which agents always join the majority in


their neighborhoods, is an extreme limit of type-I.

For the biased dynamics (η > 1), we find that strong

confirmation bias stabilizes consensus and destabilizes opinion polarization, and this result is true regardless of the update rules. To see how bias stabilizes consensus, we consider a one-agent perturbation from consensus. The minority agent is always surrounded by majority agents and the minority-to-majority conversion rate is thus fixed, regardless of bias. As biased majority agents discount the presence of the minority, the majority-to-minority conversion rate approaches zero; hence, consensus is stable. More precisely, near the consensus fixed point, we have

dṁ/dm |m=1 = −1 + 2 p′(1) f′(1),   (6)

where xσ ≡ σ f(σm) describes the relation between local and global opinions. In general, confirmation bias implies f′(1) ≤ 1 (see also Fig. 2) and, with strong bias, a small minority fraction will not change local opinions around majority agents, i.e., f′(1) → 0 as η → ∞. Consequently dṁ/dm|m=1 → −1 from above (since we always have p′(1) ≥ 0), i.e., consensus stability increases with bias.
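A quick numerical check of Eq. (6) is possible for the proportion rule, for which p′(1) = 1/2 and, from Eq. (3), f′(1) = 1/η; the sketch below (illustrative values of η and step size) compares a finite-difference slope of ṁ at m = 1 with −1 + 2p′(1)f′(1).

    def f(u, eta):                       # x_sigma = sigma * f(sigma * m), from Eq. (3)
        return ((1 + u) * eta - (1 - u)) / ((1 + u) * eta + (1 - u))

    def mdot(m, eta, p):                 # Eq. (5) with nu = 1
        return -(1 + m) * (1 - p(f(m, eta))) + (1 - m) * p(-f(-m, eta))

    p_prop = lambda x: (1 + x) / 2       # proportion rule, p'(1) = 1/2
    eta, h = 5.0, 1e-6
    slope = (mdot(1.0, eta, p_prop) - mdot(1.0 - h, eta, p_prop)) / h
    print(slope, -1.0 + 2 * 0.5 / eta)   # finite difference vs Eq. (6); both ~ -0.8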

To understand the dynamics near the polarized state, we recall that in general f(m) is given by Eq. (3). In the limit η ≫ 1, this yields

dṁ/dm |m=0 ≈ 4 p′(1)/η + O(1/η²)   for η ≫ 1.   (7)

That is, opinion polarization is always unstable in this limit since p′(1) ≥ 0. We see that, near both fixed points, strong bias leads to universal dynamics, i.e., the nature of the fixed points is independent of the opinion update rules. This is particularly useful given few empirical results for the precise details of the rules employed by real agents [41]. The speed of the dynamics near the polarized state does depend on the form of update rules and

we consider the consensus time TC = ∫₀¹ (ṁ)⁻¹ dm, the time taken by the system to reach consensus from the polarized state. In quenched Ising systems this quantity is known as the saturation time. To circumvent the divergences at the fixed points (where ṁ = 0), we note that the global opinion evolves in steps of 2/N in a finite system and alter the integration limits accordingly, TC ∼ lim_{N→∞} ∫_{2a/N}^{1−2b/N} dm/ṁ. Here a and b denote small positive integers which represent the first and final few steps where the mean-field picture breaks down and the microscopic (stochastic) details are important. Factoring out the logarithmic divergence, we have

TC/ln N ∼ (dṁ/dm)⁻¹ |m=0 + (−dṁ/dm)⁻¹ |m=1.   (8)

This result holds if and only if there is no fixed point at an intermediate global opinion and both terms on the


FIG. 4. The time to consensus depends strongly on both opinion update rules and confirmation bias. (a) The consensus time TC for majority-rule systems (dashed) increases with bias η whereas for proportion-rule systems (solid), TC clearly shows a non-monotonic dependence. (b) Family of opinion update rules pα(x) (exact form in the text) so that rules with α > 0 are type-I and those with α < 0 are type-II. The proportion and majority rules correspond to α = 0 and α = ∞, respectively. (c) For all rules, there exists a minimum in TC at some η > 1, marked by the white curve. For type-II rules (α < 0), we observe three regions (in the order of increasing bias): where the polarised state is the only attractive fixed point (grey shading); where consensus is also attractive (red shading); and where consensus prevails as the only attractive fixed point (colour scale). The minimum in TC exists in the final region.

r.h.s. are positive. Physically, the first term measures the depolarizing time and the second the consensus-approaching time. From Eq. (6), we see that the consensus-approaching time decreases and saturates as bias becomes stronger. In contrast, we see from Eq. (7) that the depolarizing time goes as η for η ≫ 1. Therefore TC ∼ η for η ≫ 1.
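The dependence of TC on η can be reproduced by integrating 1/ṁ between the cutoffs 2a/N and 1 − 2b/N, as above; the sketch below does so for the proportion rule with a simple Riemann sum (the function name consensus_time, the value of N, the grid size and the sampled η are illustrative). It applies only where consensus is the only attractive fixed point, which for the proportion rule holds for any η > 1, and illustrates the non-monotonic trend discussed next.

    import numpy as np

    def f(u, eta):                            # from Eq. (3): x_sigma = sigma * f(sigma*m)
        return ((1 + u) * eta - (1 - u)) / ((1 + u) * eta + (1 - u))

    def mdot(m, eta, p):                      # Eq. (5), nu = 1
        return -(1 + m) * (1 - p(f(m, eta))) + (1 - m) * p(-f(-m, eta))

    def consensus_time(eta, p, N=10_000, a=1, b=1, grid=200_000):
        """T_C as the integral of dm / mdot between 2a/N and 1 - 2b/N (Riemann sum)."""
        m = np.linspace(2 * a / N, 1 - 2 * b / N, grid)
        return np.sum(1.0 / mdot(m, eta, p)) * (m[1] - m[0])

    p_prop = lambda x: (1 + x) / 2            # proportion rule
    for eta in (1.5, 3.0, 10.0, 30.0, 100.0):
        print(eta, consensus_time(eta, p_prop) / np.log(10_000))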

In Fig. 4(a), we sketch the time to consensus in our model for the two well-known update rules: the majority rule and the proportion rule (voter model). Remarkably, for the proportion rule, TC develops a minimum at an intermediate bias, η > 1, and this suggests that there could be an optimal bias for fast consensus formation. Such non-monotonic behavior is absent with the majority rule where TC simply increases with η. To systematically explore a family of update rules, we adopt a functional parameterization pα(x) = 1/2 + (1/π) tan⁻¹(e^α tan(πx/2)), as shown in Fig. 4(b), so that pα(x) is type-I if α > 0 and type-II if α < 0. The proportion rule corresponds to α = 0 and


the majority rule to α = ∞.
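For reference, the sketch below implements this parameterization and checks its limits: α = 0 recovers the proportion rule, large α approaches the majority rule, and p(−x) = 1 − p(x) as required by spinelessness (the name p_alpha and the grid values are illustrative).

    import numpy as np

    def p_alpha(x, alpha):
        """Update-rule family p_alpha(x) = 1/2 + (1/pi) arctan(exp(alpha) tan(pi x / 2))."""
        return 0.5 + np.arctan(np.exp(alpha) * np.tan(np.pi * x / 2)) / np.pi

    x = np.linspace(-0.99, 0.99, 9)
    print(np.allclose(p_alpha(x, 0.0), (1 + x) / 2))           # alpha = 0: proportion rule
    print(np.round(p_alpha(x, 30.0), 3))                       # large alpha: step at x = 0 (majority rule)
    print(np.allclose(p_alpha(-x, 2.0), 1 - p_alpha(x, 2.0)))  # spinelessness: p(-x) = 1 - p(x)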

Figure 4(c) illustrates how TC depends on both the update rules α and the bias η. For type-II rules (α < 0) and η = 1 (no bias), the polarized state m = 0 is the only attractive fixed point (grey shading). As η increases, first the consensus state is stabilized (red shading) and then the polarized state is destabilized (color scale) and the system flows uninterrupted from m = 0 to m = ±1 with TC given by Eq. (8). The minimum of TC shown previously for the proportion rule is now seen to be quite general for all rules. The initial decrease in TC results from disproportionate decreases in forward (minority-to-majority) and backward (majority-to-minority) conversions. Though all agents find themselves in neighborhoods of agreement, majority agents form more agreeing neighborhoods since it is easier to find like-minded agents. In consequence, although the rates of both forward and backward conversions decrease with increasing bias, backward conversions decrease more than forward conversions; hence, the speed up of consensus formation. The increase in TC at larger η results from the fact that the likelihood of a backward conversion cannot drop below zero while forward conversions continue to decrease; therefore, fewer net conversions and a longer TC.

To further elucidate the impact of bias in our opinion model, we consider an extreme case where a fair coin toss decides the outcome of each opinion update, i.e., p(x) = 1/2. In this case, the majority opinion suffers net outward conversions since inbound conversions originate from a smaller population and thus are less frequent than outward conversions. Hence, equal mixing of opinions is stable. Social interactions, as encapsulated in opinion update rules, can however change this picture. The proportion rule, for example, suppresses the probability of a majority-to-minority conversion and enhances the reverse in such a balanced way that each opinion population does not grow or shrink regardless of the global opinion. The majority rule reinforces this effect even further and, as a result, the only conversions left in the system are from minority to majority.

When subject to bias, individuals now form a distorted worldview as they embed themselves among kindred spirits. Agents in the majority overestimate their majority and consequently rarely switch sides. Similarly, those in the minority are less likely to switch, but their social circles are also less likely to be homogeneous, meaning conversions to the majority, though less frequent with bias, are still present. In short, intermediate confirmation bias can result in an increase in net conversions from minority to majority, hence speeding up agreement formation, but strong bias deprives the system of any conversions, resulting in longer consensus time.

A number of generalizations of our model are immediately apparent. When consensus is stable, as for strong bias, a small perturbation in global opinion cannot lead to a permanent global opinion shift, a result at odds with how new, sparsely known ideas can supersede old beliefs and become common knowledge. In our model, a natural way to capture this phenomenon is to break opinion symmetry, making one opinion more stable than the other, as with Ising spins in an external field. Such an asymmetry may also capture other social spreading phenomena such as the propagation of rumors or diseases (see, e.g., Ref. [24]).

The choice of update rules poses an important challenge to opinion dynamics modeling, particularly in the absence of strong empirical constraints. One approach is to rationalize agents so that their update dynamics are the result of individual optimizations, such as fastest consensus formation or the longest time holding the opinion on which the society eventually reaches consensus. With clear individual objectives, each received signal is carefully interpreted and the optimal strategy deduced. It would be very interesting to see whether heuristic opinion update rules, such as in our work, could emerge in a system of rational individuals performing sophisticated calculations, e.g., using Bayesian inference (see Ref. [2] for a review).

Insights into social dynamics could also allow large-scale opinion engineering. Indeed, the idea that collective beliefs can be manipulated is not new; for example, see Ref. [42] where it was empirically shown that the arrival of Fox News affected the Republican vote share. As an illustration, our results suggest that the speed of collective opinion formation could be controlled if one can tune individual dynamics. In online social networks such as Facebook, algorithms exist such that interactions determine information flows between two users, thereby amplifying confirmation bias. Certainly a reverse algorithm could be engineered and this could have important implications for mass opinion manipulation. For example, an authoritarian regime might want to slow down collective opinion formation to allow more time to steer the dynamics in the preferred direction before speeding up consensus formation.

Social regularities, such as common culture or language, and their spontaneous emergence show similarities to collective behavior in physical systems, thus raising the hope that some social phenomena could be understood in the context of models and concepts from statistical physics. Just as magnetism rests on the microscopic dynamics of spins, collective social phenomena arise from human behavior. While we have focused here on confirmation bias, increasingly quantitative approaches in the behavioral sciences offer important new ingredients for understanding collective social phenomena. By incorporating such laws in simple model systems, we expect further advances in the physics of social networks.

This work was supported in part by funding from the Okinawa Institute of Science and Technology Graduate University.


[1] Change is gonna come, The Economist (2015, July 4).
[2] D. Acemoglu and A. Ozdaglar, Opinion dynamics and learning in social networks, Dynamic Games and Applications 1, 3 (2011).
[3] A. Tversky and D. Kahneman, Judgment under uncertainty: Heuristics and biases, Science 185, 1124 (1974).
[4] R. S. Nickerson, Confirmation bias: A ubiquitous phenomenon in many guises, Review of General Psychology 2, 175 (1998).
[5] T. S. Rosenblat and M. M. Mobius, Getting closer or drifting apart? The Quarterly Journal of Economics 119, 971 (2004).
[6] N. J. Stroud, Media use and political predispositions: Revisiting the concept of selective exposure, Political Behavior 30, 341 (2008).
[7] N. A. Valentino, A. J. Banks, V. L. Hutchings, and A. K. Davis, Selective exposure in the internet age: The interaction between anxiety and information utility, Political Psychology 30, 591 (2009).
[8] R. K. Garrett, Politically motivated reinforcement seeking: Reframing the selective exposure debate, Journal of Communication 59, 676 (2009).
[9] E. Bakshy, S. Messing, and L. A. Adamic, Exposure to ideologically diverse news and opinion on Facebook, Science 348, 1130 (2015).
[10] D. Frey, Recent research on selective exposure to information, Advances in Experimental Social Psychology 19, 41 (1986).
[11] C. G. Lord, L. Ross, and M. R. Lepper, Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence, Journal of Personality and Social Psychology 37, 2098 (1979).
[12] D. Griffin and A. Tversky, The weighing of evidence and the determinants of confidence, Cognitive Psychology 24, 411 (1992).
[13] S. Galam, Y. G. Feigenblat, and Y. Shapir, Sociophysics: A new approach of sociological collective behaviour. I. Mean-behaviour description of a strike, The Journal of Mathematical Sociology 9, 1 (1982).
[14] S. Galam and S. Moscovici, Towards a theory of collective phenomena: Consensus and attitude changes in groups, European Journal of Social Psychology 21, 49 (1991).
[15] G. Deffuant, D. Neau, F. Amblard, and G. Weisbuch, Mixing beliefs among interacting agents, Advances in Complex Systems 03, 87 (2000).
[16] P. L. Krapivsky and S. Redner, Dynamics of majority rule in two-state interacting spin systems, Phys. Rev. Lett. 90, 238701 (2003).
[17] M. Mobilia, Does a single zealot affect an infinite group of voters? Phys. Rev. Lett. 91, 028701 (2003).
[18] V. Sood and S. Redner, Voter model on heterogeneous graphs, Phys. Rev. Lett. 94, 178701 (2005).
[19] K. Suchecki, V. M. Eguíluz, and M. S. Miguel, Conservation laws for the voter model in complex networks, EPL (Europhysics Letters) 69, 228 (2005).
[20] C. Castellano, S. Fortunato, and V. Loreto, Statistical physics of social dynamics, Rev. Mod. Phys. 81, 591 (2009).
[21] P. Clifford and A. Sudbury, A model for spatial conflict, Biometrika 60, 581 (1973).
[22] R. A. Holley and T. M. Liggett, Ergodic theorems for weakly interacting infinite systems and the voter model, The Annals of Probability 3, 643 (1975).
[23] P. L. Krapivsky, S. Redner, and E. Ben-Naim, A Kinetic View of Statistical Physics (Cambridge University Press, 2010).
[24] C. Castellano, D. Vilone, and A. Vespignani, Incomplete ordering of the voter model on small-world networks, EPL (Europhysics Letters) 63, 153 (2003).
[25] D. Vilone and C. Castellano, Solution of voter model dynamics on annealed small-world networks, Phys. Rev. E 69, 016109 (2004).
[26] V. Sood, T. Antal, and S. Redner, Voter models on heterogeneous networks, Phys. Rev. E 77, 041121 (2008).
[27] P. Holme and M. E. J. Newman, Nonequilibrium phase transition in the coevolution of networks and opinions, Phys. Rev. E 74, 056108 (2006).
[28] I. J. Benczik, S. Z. Benczik, B. Schmittmann, and R. K. P. Zia, Opinion dynamics on an adaptive random network, Phys. Rev. E 79, 046104 (2009).
[29] G. Zschaler, G. A. Böhme, M. Seißinger, C. Huepe, and T. Gross, Early fragmentation in the adaptive voter model on directed networks, Phys. Rev. E 85, 046107 (2012).
[30] R. Durrett, J. P. Gleeson, A. L. Lloyd, P. J. Mucha, F. Shi, D. Sivakoff, J. E. S. Socolar, and C. Varghese, Graph fission in an evolving voter model, Proc. Nat. Acad. Sci. 109, 3682 (2012).
[31] T. Rogers and T. Gross, Consensus time and conformity in the adaptive voter model, Phys. Rev. E 88, 030102 (2013).
[32] L. A. Adamic and N. Glance, in Proceedings of the 3rd International Workshop on Link Discovery, LinkKDD '05 (ACM, New York, NY, USA, 2005) pp. 36–43.
[33] P. F. Lazarsfeld and R. K. Merton, in Freedom and Control in Modern Society, edited by M. Berger (Van Nostrand, New York, 1954).
[34] M. McPherson, L. Smith-Lovin, and J. M. Cook, Birds of a feather: Homophily in social networks, Annual Review of Sociology 27, 415 (2001).
[35] P. M. DeMarzo, D. Vayanos, and J. Zwiebel, Persuasion bias, social influence, and unidimensional opinions, The Quarterly Journal of Economics 118, 909 (2003).
[36] M. J. de Oliveira, J. F. F. Mendes, and M. A. Santos, Nonequilibrium spin models with Ising universal behaviour, J. Phys. A 26, 2317 (1993).
[37] J.-M. Drouffe and C. Godrèche, Phase ordering and persistence in a class of stochastic processes interpolating between the Ising and voter models, J. Phys. A 32, 249 (1999).
[38] M. Mobilia and S. Redner, Majority versus minority dynamics: Phase transition in an interacting two-state spin system, Phys. Rev. E 68, 046106 (2003).
[39] J. Ugander, B. Karrer, L. Backstrom, and C. Marlow, The anatomy of the Facebook social graph, arXiv:1111.4503 (2011).
[40] R. J. Glauber, Time-dependent statistics of the Ising model, J. Math. Phys. 4, 294 (1963).
[41] J. Fernández-Gracia, K. Suchecki, J. J. Ramasco, M. San Miguel, and V. M. Eguíluz, Is the voter model a model for voters? Phys. Rev. Lett. 112, 158701 (2014).
[42] S. DellaVigna and E. Kaplan, The Fox News effect: Media bias and voting, The Quarterly Journal of Economics 122, 1187 (2007).