As an example, the ongoing events in Tunisia and Egypt appear to exhibit a similar process, according to Szymanski. "In those countries, dictators who were in power for decades were suddenly overthrown in just a few weeks."
The findings were published in the July 22, 2011, early online edition of the journal Physical Review E in an article titled "Social consensus through the influence of committed minorities."
An important aspect of the finding is that the percent of committed opinion holders required to shift majority opinion does not change significantly regardless of the type of network in which the opinion holders are working. In other words, the percentage of committed opinion holders required to influence a society remains at approximately 10 percent, regardless of how or where that opinion starts and spreads in the society.
To reach their conclusion, the scientists developed computer models of various types of social networks. One network connected each person to every other person in the network. The second model included certain individuals who were connected to a large number of people, making them opinion hubs or leaders. The final model gave every person roughly the same number of connections. The initial state of each model was a sea of traditional-view holders. Each of these individuals held a view but, importantly, was also open-minded to other views.
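The paper itself does not include code, but the three network types described are standard constructions. The sketch below, written in Python with the networkx library, shows one way to build them; the network size and the generator parameters are illustrative assumptions, not the values used in the study.

```python
import networkx as nx

N = 1000  # number of simulated individuals (assumed size)

# 1) Everyone connected to everyone else: a complete graph.
complete_net = nx.complete_graph(N)

# 2) A few highly connected "opinion hubs": a scale-free network is one common
#    way to capture this; the attachment parameter m=5 is arbitrary here.
hub_net = nx.barabasi_albert_graph(N, m=5)

# 3) Everyone with roughly the same number of connections: a random regular graph.
regular_net = nx.random_regular_graph(d=10, n=N)
```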
Once the networks were built, the scientists then "sprinkled" in some true believers throughout each of the networks. These people were completely set in their views and unwavering in those beliefs. As the true believers began to converse with those who held the traditional belief system, the tide gradually, and then very abruptly, began to shift.
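Continuing the sketch above, "sprinkling" in the committed minority can be as simple as fixing the opinion of a randomly chosen fraction of nodes. The 10 percent fraction is the tipping point reported in the study; the uniform random placement is an assumption made for illustration.

```python
import random

# Fraction of committed "true believers" to sprinkle in.
committed_fraction = 0.10
committed = set(random.sample(list(regular_net.nodes), int(committed_fraction * N)))

# Committed agents hold only the new opinion "A" and never change it.
# Everyone else starts out holding only the traditional opinion "B".
opinions = {node: {"A"} if node in committed else {"B"} for node in regular_net.nodes}
```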
"In general, people do not like to have an unpopular opinion and are always seeking to try locally to come to consensus. We set up this dynamic in each of our models," said SCNARC Research Associate and corresponding paper author Sameet Sreenivasan. To accomplish this, each of the individuals in the models "talked" to each other about their opinion. If the listener held the same opinions as the speaker, it reinforced the listener's belief. If the opinion was different, the listener considered it and moved on to talk to another person. If that person also held this new belief, the listener then adopted that belief.
"As agents of change start to convince more and more people, the situation begins to change," Sreenivasan said. "People begin to question their own views at first and then completely adopt the new view to spread it even further. If the true believers just influenced their neighbors, that wouldn't change anything within the larger system, as we saw with percentages less than 10."
The research has broad implications for understanding how opinion spreads. "There are clearly situations in which it helps to know how to efficiently spread some opinion or how to suppress a developing opinion," said Associate Professor of Physics and co-author of the paper Gyorgy Korniss. "Some examples might be the need to quickly convince a town to move before a hurricane or spread new information on the prevention of disease in a rural village."
The researchers are now looking for partners within the social sciences and other fields to compare their computational models to historical examples. They are also looking to study how the percentage might change in a model of a polarized society. Instead of simply holding one traditional view, the society would hold two opposing viewpoints, for example Democrat versus Republican.
The research was funded by the Army Research Laboratory (ARL) through SCNARC, part of the Network Science Collaborative Technology Alliance (NS-CTA), the Army Research Office (ARO), and the Office of Naval Research (ONR).