Hive Mind Online: Collective Sensing in Times of Disinformation

Authors

DOI:

https://doi.org/10.33621/jdsr.v4i4.119

Keywords:

human computer interaction, computer-mediated deception, collective sensing, collective intelligence, human sensor networks, language-action cues, multilevel models, information manipulation, disinformation

Abstract

This study investigates the efficacy of collective sensing as a mechanism for unveiling disinformation in group interaction. Small-group interactions were simulated to examine how groups react to incentivized deceptive behavior initiated by social influencers. We use multilevel modeling to examine individual communication data nested within group interactions. The study advances a computational approach that supports the supposition of collective sensing by analyzing individual social actors' communicative language and interaction within group contexts. Language-action cues, treated as stigmergic signals, were systematically extracted, compared, and analyzed both within and between groups. The results demonstrate that patterns of group communication become more concentrated and expressive after a social influencer becomes deceptive, even when the act of deception itself is not obvious to any individual. That is, individuals in the group characterize deceptive situations differently, but communication patterns reveal the group's ability to collectively sense deception from circulating disinformation. The study supports our postulation that collective sensing can be used to detect deceptive influences in a group.
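
To make the nesting structure concrete, the following minimal sketch (in Python, using the statsmodels library) illustrates one way a per-participant language-action cue rate could be modeled with individuals nested within groups: a random intercept for each group and a fixed effect for the pre- versus post-deception phase. The variable names (cue_rate, phase, group_id, subject_id) and the toy data are hypothetical placeholders, and this random-intercept specification is only an assumed stand-in, not the exact multilevel models fitted in the study.

# A minimal sketch (assumptions noted above, not the authors' exact pipeline):
# individual participants (level 1) nested within groups (level 2),
# with a random intercept per group.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-participant rates of a language-action cue
# (e.g., first-person-plural pronouns per 100 words), measured before
# and after the social influencer turns deceptive.
data = pd.DataFrame({
    "group_id":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 1, 1, 1, 2, 2, 2, 3, 3, 3],
    "subject_id": [1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9],
    "phase":      ["pre"] * 9 + ["post"] * 9,
    "cue_rate":   [2.1, 1.8, 2.4, 1.9, 2.2, 2.0, 2.3, 1.7, 2.5,
                   3.0, 2.9, 3.4, 2.6, 3.1, 2.8, 3.3, 2.7, 3.2],
})

# Random-intercept model: does the cue rate shift after deception begins,
# once between-group variation is accounted for?
model = smf.mixedlm("cue_rate ~ C(phase, Treatment('pre'))",
                    data, groups=data["group_id"])
result = model.fit()
print(result.summary())

Under these assumptions, a significant coefficient on the post-deception phase, net of between-group variance, would correspond to the kind of concentrated, expressive shift in group communication described in the abstract.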

References

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236. doi:10.1257/jep.31.2.211

Benkler, Y., Faris, R., & Roberts, H. (2018). Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics: Oxford University Press.

Bennati, S. (2018). On the role of collective sensing and evolution in group formation. Swarm Intelligence, 12, 267-282. doi:10.1007/s11721-018-0156-y

Berdahl, A., Torney, C. J., Ioannou, C. C., Faria, J. J., & Couzin, I. D. (2013). Emergent sensing of complex environments by mobile animal groups. Science, 339(6119), 574-576.

Berger, J. (2016). Contagious: Why things catch on (1st ed.): Simon & Schuster.

Blaschke, T., Hay, G. J., Weng, Q., & Resch, B. (2011). Collective sensing: Integrating geospatial technologies to understand urban systems—An overview. Remote Sensing, 3(8), 1743-1776. doi:10.3390/rs3081743

Bodrunova, S. S., Blekanov, I., Smoliarova, A., & Litvinenko, A. (2019). Beyond left and right: Real-world political polarization in Twitter discussions on inter-ethnic conflicts. Media and Communication, 7(3), 119-132. doi:10.17645/mac.v7i3.1934

Bouwmeester, H. (2017). Lo and behold: Let the truth be told—Russian deception warfare in Crimea and Ukraine and the return of ‘Maskirovka’ and ‘Reflexive Control Theory’. Netherlands Annual Review of Military Studies 2017, 125-153. doi:10.1007/978-94-6265-189-0_8

Boxwell, R. (2020a, April 4). The blame game: The origins of Covid-19 and the anatomy of a fake news story. South China Morning Post Magazine. Retrieved from https://www.scmp.com/magazines/post-magazine/long-reads/article/3078417/how-chinas-fake-news-machine-rewriting-history

Boxwell, R. (2020b, April 4). How China’s fake news machine is rewriting the history of Covid-19, even as the pandemic unfolds. Politico. Retrieved from https://www.politico.com/news/magazine/2020/04/04/china-fake-news-coronavirus-164652

Brown, C. R., Greitzer, F. L., & Watkins, A. (2013). Toward the development of a psycholinguistic-based measure of insider threat risk focusing on core word categories used in social media. In Proceedings of the 2013 Americas Conference on Information Systems, Chicago, Illinois, 1-8.

Brown, C. R., Watkins, A., & Greitzer, F. L. (2013, January 7-10). Predicting insider threat risks through linguistic analysis of electronic communication. In Proceedings of the 2013 46th Hawaii International Conference on System Sciences, Wailea, Hawaii, 1849-1858. doi:10.1109/HICSS.2013.453

Buller, D. B., & Burgoon, J. K. (1994). Deception: Strategic and nonstrategic communication. Strategic interpersonal communication, 191-223.

Buller, D. B., & Burgoon, J. K. (1996). Interpersonal deception theory. Communication Theory, 6(3), 203-242.

Burgoon, J. K., Blair, J. P., Qin, T., & Nunamaker, J. F. (2003). Detecting deception through linguistic analysis. Intelligence and Security Informatics, 2665, 91-101.

Burgoon, J. K., & Buller, D. B. (1994). Interpersonal deception: III. Effects of deceit on perceived communication and nonverbal behavior dynamics. Journal of Nonverbal Behavior, 18(2), 155-184. doi:10.1007/BF02170076

Burgoon, J. K., Buller, D. B., Dillman, L., & Walther, J. B. (1995). Interpersonal deception. IV. Effects of suspicion on perceived communication and nonverbal behavior dynamics. Human Communication Research, 22(2), 163-196.

Caddell, J. W. (2004). Deception 101--Primer on deception (1-58487-180-6). Strategic Studies Institute, U.S. Army War College. Retrieved from https://ssi.armywarcollege.edu/pubs/display.cfm?pubID=589

Chen, X., Sin, S.-C. J., Theng, Y.-L., & Lee, C. S. (2015). Why students share misinformation on social media: Motivation, gender and study-level differences. The Journal of Academic Librarianship, 41(5), 583-592. doi:10.1016/j.acalib.2015.07.003

Cooper, W. H. (1981). Ubiquitous halo. Psychological Bulletin, 90(2), 218-244. doi:10.1037/0033-2909.90.2.218

Crowston, K., Østerlund, C. S., Howison, J., & Bolici, F. (2017). Work features to support stigmergic coordination in distributed teams. In Academy of Management Annual Meeting Proceedings, Briarcliff Manor, NY, 14409. doi:10.5465/AMBPP.2017.14409abstract

Dansereau, F., Graen, G., & Haga, W. J. (1975). A vertical dyad linkage approach to leadership within formal organizations: A longitudinal investigation of the role making process. Organizational Behavior and Human Performance, 13(1), 46-78. doi:10.1016/0030-5073(75)90005-7

DePaulo, B. M., Kashy, D. A., Kirkendol, S. E., Wyer, M. M., & Epstein, J. A. (1996). Lying in everyday life. Journal of Personality and Social Psychology, 70(5), 979-995. doi:10.1037/0022-3514.70.5.979

Dipple, A., Raymond, K., & Docherty, M. (2014). General theory of stigmergy: Modeling stigma semantics. Cognitive Systems Research, 31-32, 61-92. doi:10.1016/j.cogsys.2014.02.002

Dozier, K., & Bergengruen, V. (2021, January 6). Incited by the President, Pro-Trump rioters violently storm the Capitol. TIME. Retrieved from https://time.com/5926883/trump-supporters-storm-capitol/

Ekman, P., & Friesen, W. V. (1969). Nonverbal leakage and clues to deception. Psychiatry, 32, 88-106.

Ezeakunne, U., Ho, S. M., & Liu, X. (2020, October 18-21). Sentiment and retweet analysis of user response for fake news detection. In Proceedings of the 2020 International Conference on Social Computing, Behavioral-Cultural Modeling & Prediction and Behavior Representation in Modeling and Simulation (SBP-BRiMS’20), Washington D.C., 1-10 (Paper No. 37). Retrieved from http://sbp-brims.org/2020/proceedings/papers/working-papers/SBP-BRiMS_2020_paper_37.pdf

Fetzer, J. H. (2004a). Disinformation: The use of false information. Minds and Machines, 14(2), 231-240. doi:10.1023/B:MIND.0000021683.28604.5b

Fetzer, J. H. (2004b). Information: Does it have to be true? Minds and Machines, 14(2), 223-229. doi:10.1023/B:MIND.0000021682.61365.56

Garrett, R. K. (2017). The "echo chamber" distraction: Disinformation campaigns are the problem, not audience fragmentation. Journal of Applied Research in Memory and Cognition, 6(4), 370-376. doi:10.1016/j.jarmac.2017.09.011

George, J. F., Giordano, G., & Tilley, P. A. (2016). Website credibility and deceiver credibility: Expanding prominence-Interpretation Theory. Computers in Human Behavior, 54, 83-93. doi:10.1016/j.chb.2015.07.065

Gordon, D. M. (2014). The ecology of collective behavior. PLOS Biology, 12(3), e1001805. doi:10.1371/journal.pbio.1001805

Gordon, D. M. (2019). The ecology of collective behavior in ants. Annual Review of Entomology, 64, 35-50. doi:10.1146/annurev-ento-011118-111923

Grassé, P.-P. (1959). La reconstruction du nid et les coordinations interindividuelles chez Bellicositermes natalensis et Cubitermes sp. La théorie de la stigmergie: Essai d'interprétation du comportement des termites constructeurs. Insectes Sociaux, 6(1), 41-80. doi:10.1007/BF02223791

Greitzer, F. L., Kangas, L. J., Noonan, C. F., Brown, C. R., & Ferryman, T. (2013). Psychosocial modeling of insider threat risk based on behavioral and word use analysis. e-Service Journal, 9(1), 106-138. doi:10.2979/eservicej.9.1.106

Greitzer, F. L., Kangas, L. J., Noonan, C. F., Dalton, A. C., & Hohimer, R. E. (2012). Identifying at-risk employees: Modeling psychosocial precursors of potential insider threats. In Proceedings of the 2012 45th Hawaii International Conference on System Sciences, Maui, Hawaii, 2392-2401. doi:10.1109/HICSS.2012.309

Griffith, J. A., Connelly, S., & Thiel, C. E. (2011). Leader deception influences on leader-member exchange and subordinate organizational commitment. Journal of Leadership & Organizational Studies, 18(4), 508-521. doi:10.1177/1548051811403765

Hackman, J. R., & Vidmar, N. (1970). Effects of size and task type on group performance and member reactions. Sociometry, 33(1), 37-54. doi:10.2307/2786271

Hancock, J., Birnholtz, J., Bazarova, N., Guillory, J., Perlin, J., & Amos, B. (2009). Butler lies: Awareness, deception and design. In Proceedings of the CHI'09, Boston, MA.

Hancock, J., Curry, L. E., Goorha, S., & Woodworth, M. (2008). On lying and being lied to: A linguistic analysis of deception in computer-mediated communication. Discourse Processes, 45(1), 1-23. doi:10.1080/01638530701739181

Hernon, P. (1995). Disinformation and misinformation through the Internet: Findings of an exploratory study. Government Information Quarterly, 12(2), 133-139. doi:10.1016/0740-624X(95)90052-7

Heylighen, F. (1999). Collective intelligence and its implementation on the Web: Algorithms to develop a collective mental map. Journal of Computational & Mathematical Organization Theory, 5(3), 253-280. doi:10.1023/A:1009690407292

Ho, S. M. (2009). Behavioral anomaly detection: A socio-technical study of trustworthiness in virtual organizations (Doctoral dissertation, Information Systems). Syracuse University, Syracuse, NY. Retrieved from ProQuest database: http://libezproxy.syr.edu/login?url=http://proquest.umi.com/pqdweb?did=2112815091&sid=1&Fmt=2&clientId=3739&RQT=309&VName=PQD

Ho, S. M. (2019, January 8). Leader member exchange: An interactive framework to uncover a deceptive insider as revealed by human sensors. In Proceedings of the 2019 52nd Hawaii International Conference on System Sciences (HICSS-52), Maui, Hawaii, 3212-3221. Retrieved from https://hdl.handle.net/10125/59757

Ho, S. M., Fu, H., Timmarajus, S. S., Booth, C., Baeg, J. H., & Liu, M. (2015, June 4-6). Insider threat: Language-action cues in group dynamics. In Proceedings of the 2015 ACM SIGMIS Computers and People Research Conference (SIGMIS-CPR'15), Newport Beach, CA, 101-104. doi:10.1145/2751957.2751978

Ho, S. M., & Hancock, J. T. (2018, January 3-6). Computer-mediated deception: Collective language-action cues as stigmergic signals for computational intelligence. In Proceedings of the 2018 51st Hawaii International Conference on System Sciences (HICSS-51), Big Island, Hawaii, 1671-1680. Retrieved from https://hdl.handle.net/10125/50098

Ho, S. M., & Hancock, J. T. (2019). Context in a bottle: Language-action cues in spontaneous computer-mediated deception. Computers in Human Behavior, 91, 33-41. doi:10.1016/j.chb.2018.09.008

Ho, S. M., Hancock, J. T., & Booth, C. (2017). Ethical dilemma: Deception dynamics in computer-mediated group communication. Journal of the Association for Information Science and Technology, 68(12), 2729-2742. doi:10.1002/asi.23849

Ho, S. M., Hancock, J. T., Booth, C., & Liu, X. (2016). Computer-mediated deception: Strategies revealed by language-action cues in spontaneous communication. Journal of Management Information Systems, 33(2), 393-420. doi:10.1080/07421222.2016.1205924

Ho, S. M., & Warkentin, M. (2017). Leader’s dilemma game: An experimental design for cyber insider threat research. Information Systems Frontiers, 19(2), 377-396. doi:10.1007/s10796-015-9599-5

Holland, S., Mason, J., & Landay, J. (2021, January 6). Trump summoned supporters to “wild” protest, and told them to fight. They did. Reuters. Retrieved from https://www.reuters.com/article/us-usa-election-protests/trump-summoned-supporters-to-wild-protest-and-told-them-to-fight-they-did-idUSKBN29B24S

Hosmer, L. T. (1995). Trust: The connecting link between organizational theory and philosophical ethics. Academy of Management Review, 20(2), 379-403. Retrieved from http://www.jstor.org/stable/258851

Kahai, S. S., & Cooper, R. B. (2003). Exploring the core concepts of media richness theory: The impact of cue multiplicity and feedback immediacy on decision quality. Journal of Management Information Systems, 20(1), 263-299.

Kim, A., & Dennis, A. R. (2019). Says who? The effects of presentation format and source rating on fake news in social media. MIS Quarterly, 43(3), 1025-1039. doi:10.25300/MISQ/2019/15188

Kim, A., Moravec, P. L., & Dennis, A. R. (2019). Combating fake news on social media with source ratings: The effects of user and expert reputation ratings. Journal of Management Information Systems, 36(3), 931-968. doi:10.1080/07421222.2019.1628921

Kimmel, A. J. (1998). In defense of deception. American Psychologist, 53(7), 803-805. doi:10.1037/0003-066X.53.7.803

Krumpal, I. (2013). Determinants of social desirability bias in sensitive surveys: a literature review. Quality & Quantity: International Journal of Methodology, 47(4), 2025-2047. doi:10.1007/s11135-011-9640-9

Levine, T. R. (2014). Active deception detection. Policy Insights from the Behavioral and Brain Sciences, 1(1), 122-128. doi:10.1177/2372732214548863

Liden, R. C., Erdogan, B., Wayne, S. J., & Sparrowe, R. T. (2006). Leader-member exchange, differentiation, and task interdependence: Implications for individual and group performance. Journal of Organizational Behavior, 27(6), 723-746. doi:10.1002/job.409

Liden, R. C., & Graen, G. (1980). Generalizability of the vertical dyad linkage model of leadership. Academy of Management Journal, 23(3), 451-465. doi:10.2307/255511

Liden, R. C., Wayne, S. J., & Stilwell, D. (1993). A longitudinal study on the early development of leader-member exchanges. Journal of Applied Psychology, 78(4), 662-674. doi:10.1037/0021-9010.78.4.662

Liu, F., & Li, M. (2019). A game theory-based network rumor spreading model: based on game experiments. International Journal of Machine Learning and Cybernetics, 10(2019), 1449-1457. doi:10.1007/s13042-018-0826-5

Maas, C. J. M., & Hox, J. J. (2005). Sufficient sample sizes for multilevel modeling. Methodology: European Journal of Research Methods for the Behavioral and Social Sciences, 1(3), 86-92. doi:10.1027/1614-2241.1.3.86

Majchrzak, A., Rice, R. E., Malhotra, A., King, N., & Ba, S. (2000). Technology adaptation: The case of a computer-supported inter-organizational virtual team. MIS Quarterly, 24(4), 569-600. doi:10.2307/3250948

Malone, T. W., Laubacher, R., & Dellarocas, C. (2010). The collective intelligence genome. MIT Sloan Management Review, 51(3), 21-31.

McCarthy, T. (2020, April 14). ‘It will disappear’: The disinformation Trump spread about the coronavirus - timeline. The Guardian. Retrieved from https://www.theguardian.com/us-news/2020/apr/14/trump-coronavirus-alerts-disinformation-timeline

McNeish, D., & Wentzel, K. R. (2017). Accommodating small sample sizes in three-level models when the third level is incidental. Multivariate Behavioral Research, 52(2), 200-215. doi:10.1080/00273171.2016.1262236

Mehrabian, A. (1968). Methods & designs: Some referents and measures of nonverbal behavior. Behavior Research Methods & Instrumentation, 1(6), 203-207.

Milman, O. (2020, March 31). Seven of Donald Trump’s most misleading coronavirus claims. The Guardian. Retrieved from https://www.theguardian.com/us-news/2020/mar/28/trump-coronavirus-misleading-claims

Mocanu, D., Rossi, L., Zhang, Q., Karsai, M., & Quattrociocchi, W. (2015). Collective attention in the age of (mis)information. Computers in Human Behavior, 51(Part B), 1198-1204. doi:10.1016/j.chb.2015.01.024

Morrison, E. W., & Robinson, S. L. (1997). When employees feel betrayed: A model of how psychological contract violation develops. Academy of Management Review, 22(1), 226-256.

Negoita, B., Lapointe, L., & Rivard, S. (2018). Collective information system use: A typological theory. MIS Quarterly, 42(4), 1281-1301. doi:10.25300/MISQ/2018/13219

Newman, M. L., Pennebaker, J. W., Berry, D. S., & Richards, J. M. (2003). Lying words: Predicting deception from linguistic styles. Personality and Social Psychology Bulletin, 29(5), 665-675.

Pennebaker, J. W., Chung, C. K., Ireland, M., Gonzales, A., & Booth, R. J. (2007). The development and psychometric properties of LIWC2007. Retrieved from http://www.liwc.net/LIWC2007LanguageManual.pdf

Pennebaker, J. W., & King, L. A. (1999). Linguistic styles: Language use as an individual difference. Journal of Personality and Social Psychology, 77(6), 1296-1312. doi:10.1037/0022-3514.77.6.1296

Pennebaker, J. W., Mehl, M. R., & Niederhoffer, K. G. (2003). Psychological aspects of natural language use: Our words, our selves. Annual Review of Psychology, 54, 547-577. doi:10.1146/annurev.psych.54.101601.145041

Raafat, R. M., Chater, N., & Frith, C. (2009). Herding in humans. Trends in Cognitive Sciences, 13(10), 420-428. doi:10.1016/j.tics.2009.08.002

Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical Linear Models: Applications and Data Analysis Methods (2nd ed.). Thousand Oaks, California: Sage Publication.

Resch, B. (2013). People as sensors and collective sensing-contextual observations complementing geo-sensor network measurements. In J. M. Krisp (Ed.), Progress in Location-Based Services (pp. 391-406): Springer-Verlag Berlin Heidelberg.

Rezgui, A., & Crowston, K. (2018). Stigmergic coordination in Wikipedia. In Proceedings of the 14th International Symposium on Open Collaboration, Paris, France, 1-12. doi:10.1145/3233391.3233543

Robinson, S. L. (1996). Trust and breach of the psychological contract. Administrative Science Quarterly, 41(4), 574-599. doi:10.2307/2393868

Schultz, E. E. (2002). A framework for understanding and predicting insider attacks. Computers & Security, 21(6), 526-531.

Seidel, S., Berente, N., Lindberg, A., Lyytinen, K., Martinez, B., & Nickerson, J. V. (2020). Artificial intelligence and video game creation: A framework for the new logic of autonomous design. Journal of Digital Social Research, 2(3), 126-157. doi:10.33621/jdsr.v2i3.46

Simons, T. (2002). Behavioral integrity: The perceived alignment between managers' words and deeds as a research focus. Organization Science, 13(1), 18-35. doi:10.1287/orsc.13.1.18.543

Street, C. N. H., & Masip, J. (2015). The source of the truth bias: Heuristic processing? Scandinavian Journal of Psychology, 56, 254-263. doi:10.1111/sjop.12204

Tausczik, Y. R., & Pennebaker, J. W. (2010). The psychological meaning of words: LIWC and computerized text analysis methods. Journal of Language and Social Psychology, 29(1), 24-54. doi:10.1177/0261927X09351676

Taylor, P. J., Dando, C. J., Ormerod, T. C., Ball, L. J., Jenkins, M. C., Sandham, A., & Menacere, T. (2013). Detecting insider threats through language change. Law and Human Behavior, 37(4), 267-275. doi:10.1037/lhb0000032

Tandoc Jr., E. C., Lim, Z. W., & Ling, R. (2017). Defining “fake news”: A typology of scholarly definitions. Digital Journalism, 6(2), 137-153. doi:10.1080/21670811.2017.1360143

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. doi:10.1126/science.aap9559

Wayne, S. J., Shore, L. M., & Liden, R. C. (1997). Perceived organizational support and leader-member exchange: A social exchange perspective. The Academy of Management Journal, 40(1), 82-111.

Wheelan, S. A. (2009). Group size, group development and group productivity. Small Group Research, 40(2), 247-262. doi:10.1177/1046496408328703

Woolley, A. W., Aggarwal, I., & Malone, T. W. (2015). Collective intelligence and group performance. Current Directions in Psychological Science, 24(6), 420-424. doi:10.1177/0963721415599543

Woolley, A. W., Chabris, C. F., Pentland, A., Hashmi, N., & Malone, T. W. (2010). Evidence for a collective intelligence factor in the performance of human groups. Science, 330, 686-688. doi:10.1126/science.1193147

Yang, K., Ahn, C. R., Vuran, M. C., & Kim, H. (2017). Collective sensing of workers’ gait patterns to identify fall hazards in construction. Automation in Construction, 82, 166-178. doi:10.1016/j.autcon.2017.04.010

Zhou, L., Burgoon, J. K., Nunamaker Jr., J. F., & Twitchell, D. P. (2004). Automating linguistics-based cues for detecting deception in text-based asynchronous computer-mediated communication. Group Decision and Negotiation, 13(1), 81-106. doi:10.1023/B:GRUP.0000011944.62889.6f

Zhou, L., & Zhang, D. (2004, January 5-8). Can online behavior unveil a deceiver? In Proceedings of the 2004 Hawaii International Conference on System Sciences (HICSS-37), Hilton Waikoloa Village, Big Island, Hawaii.

Published

2023-02-01

Issue

Section

Research Articles
