Consider Andy, who is worried about contracting COVID in 2020. Unable to read all the articles he sees on it, he relies on trusted friends for tips. When one opines on Facebook that pandemic fears are overblown, Andy dismisses the idea at first. But then the hotel where he works closes its doors, and with his job at risk, Andy starts wondering how serious the threat from the virus really is.

A colleague posts an article about the COVID “scare” having been created by Big Pharma in collusion with corrupt politicians, which jibes with Andy’s distrust of government. His Web search quickly takes him to articles claiming that COVID is no worse than the flu. Andy joins an online group of people who have been or fear being laid off and soon finds himself asking, like many of them, “What pandemic?” When he learns that several of his new friends are planning to attend a rally demanding an end to lockdowns, he decides to join them. Almost no one at the massive protest, including him, wears a mask. When his sister asks about the rally, Andy shares the conviction that has now become part of his identity: COVID is a hoax.

This example illustrates a minefield of cognitive biases. We prefer information from people we trust, our in-group. We pay attention to and are more likely to share information about risks (for Andy, the risk of losing his job). We search for and remember things that fit well with what we already know and understand. These biases are products of our evolutionary past, and for tens of thousands of years, they served us well. People who behaved in accordance with them (for example, by staying away from the overgrown pond bank where someone said there was a viper) were more likely to survive than those who did not.

Modern technologies are amplifying these biases in harmful ways, however. Search engines direct Andy to sites that inflame his suspicions, and social media connects him with like-minded people, feeding his fears. Making matters worse, bots (automated social media accounts that impersonate humans) enable misguided or malevolent actors to take advantage of these vulnerabilities.

Compounding the problem is the proliferation of online information. Viewing and producing blogs, videos, tweets and other units of information called memes have become so cheap and easy that the information marketplace is inundated. Unable to process all this material, we let our cognitive biases decide what we should pay attention to. These mental shortcuts influence which information we search for, comprehend, remember and repeat to a harmful extent.

The need to understand these cognitive vulnerabilities and how algorithms use or manipulate them has become urgent. At the University of Warwick in England and at Indiana University Bloomington’s Observatory on Social Media (OSoMe, pronounced “awesome”), our teams are using cognitive experiments, simulations, data mining and artificial intelligence to comprehend the cognitive vulnerabilities of social media users. Insights from psychological studies on the evolution of information conducted at Warwick inform the computer models developed at Indiana, and vice versa. We are also developing analytical and machine-learning aids to fight social media manipulation. Some of these tools are already being used by journalists, civil-society organizations and individuals to detect inauthentic actors, map the spread of false narratives and foster news literacy.

The glut of information has generated intense competition for people’s attention. As Nobel Prize–winning economist and psychologist Herbert A. Simon noted, “What information consumes is rather obvious: it consumes the attention of its recipients.” One of the first consequences of the so-called attention economy is the loss of high-quality information.

The OSoMe team demonstrated this result with a set of simple simulations. It represented users of social media such as Andy, called agents, as nodes in a network of online acquaintances. At each time step in the simulation, agents may either create a meme or reshare one that they see in a news feed. To mimic limited attention, agents are allowed to view only a certain number of items near the top of their news feeds. Running this simulation over many time steps, Lilian Weng, now at OpenAI, and researchers at OSoMe found that as agents’ attention became increasingly limited, the propagation of memes came to reflect the power-law distribution of actual social media: the probability that a meme would be shared a given number of times was roughly an inverse power of that number. For example, the likelihood of a meme being shared three times was approximately nine times less than that of its being shared once.
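The limited-attention simulation described above can be sketched in a few dozen lines. This is a minimal toy model, not OSoMe's actual code: the network structure (a random follower graph), the function name, and all parameters (`n_agents`, `feed_len`, `p_new`, and so on) are illustrative assumptions. At each step an agent either posts a new meme or reshares one from the top few items of its feed, and we tally how often each meme is shared.

```python
import random
from collections import Counter

def simulate(n_agents=100, n_friends=5, feed_len=3, p_new=0.3,
             steps=20000, seed=1):
    """Toy meme-diffusion model (illustrative, not the OSoMe code).

    Agents sit on a random acquaintance network. Each step, one agent
    either creates a new meme (probability p_new, or when its feed is
    empty) or reshares a meme drawn from only the top feed_len items
    of its news feed, mimicking limited attention.
    """
    random.seed(seed)
    # Random directed network: followers[a] = agents who see a's posts.
    followers = {a: random.sample([b for b in range(n_agents) if b != a],
                                  n_friends)
                 for a in range(n_agents)}
    feeds = {a: [] for a in range(n_agents)}  # newest item first
    shares = Counter()                        # meme id -> times shared
    next_meme = 0
    for _ in range(steps):
        agent = random.randrange(n_agents)
        if feeds[agent] and random.random() > p_new:
            # Limited attention: only the top of the feed is visible.
            meme = random.choice(feeds[agent][:feed_len])
        else:
            meme, next_meme = next_meme, next_meme + 1
        shares[meme] += 1
        for f in followers[agent]:
            feeds[f].insert(0, meme)  # post lands atop followers' feeds
    return shares

shares = simulate()
# Popularity distribution: how many memes were shared exactly k times.
dist = Counter(shares.values())
```

Plotting `dist` on log–log axes for a run like this typically shows the heavy-tailed pattern the article describes: many memes shared once or twice, and a few shared very many times.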