Coping with Algorithmic Risks: How Internet Users Implement Self-Help Strategies to Reduce Risks Related to Algorithmic Selection

Authors

DOI:

https://doi.org/10.33621/jdsr.v5i1.130

Keywords:

algorithm, algorithmic risks, coping behavior, self-help, governance of algorithms, governance choice, survey

Abstract

Algorithmic selection is omnipresent in many domains of our everyday lives online: it ranks our search results, curates our social media news feeds, and recommends videos to watch and music to listen to. This widespread application of algorithmic selection on the internet can be associated with risks such as feeling surveilled (S), feeling exposed to distorted information (D), or feeling that one uses the internet excessively (O). One way internet users can cope with such algorithmic risks is by applying self-help strategies such as adjusting their privacy settings (S_strat), double-checking information (D_strat), or deliberately ignoring automated recommendations (O_strat). This article determines how the theoretically derived factors risk awareness (1), personal risk affectedness (2), and algorithm skills (3) are associated with these self-help strategies. The findings from structural equation modelling of survey data representative of the Swiss online population (N_2018 = 1,202) show that personal affectedness by algorithmic risks, awareness of algorithmic risks, and algorithm skills are associated with the use of self-help strategies. These results indicate that, besides implementing statutory regulation, policy makers have the option to encourage internet users’ self-help by increasing their awareness of algorithmic risks, clarifying how such risks affect them personally, and promoting their algorithm skills.
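To illustrate the kind of model the abstract describes, the sketch below specifies a simplified structural equation model in Python using the semopy package. This is a minimal sketch under stated assumptions: the package choice and all indicator names (aware1 … strat3) are hypothetical placeholders for illustration, not the instruments or software used in the study.

```python
import pandas as pd
import semopy

# Measurement part: four latent constructs, each measured by three
# placeholder survey items; structural part: self-help strategies
# regressed on the three predictors named in the abstract.
MODEL_DESC = """
RiskAwareness =~ aware1 + aware2 + aware3
Affectedness =~ affect1 + affect2 + affect3
AlgorithmSkills =~ skill1 + skill2 + skill3
SelfHelp =~ strat1 + strat2 + strat3
SelfHelp ~ RiskAwareness + Affectedness + AlgorithmSkills
"""

def fit_sem(survey_df: pd.DataFrame) -> semopy.Model:
    """Fit the sketch model to a DataFrame whose columns match the
    placeholder indicator names above."""
    model = semopy.Model(MODEL_DESC)
    model.fit(survey_df)
    return model

# Usage (survey_df is a hypothetical item-level dataset):
# model = fit_sem(survey_df)
# print(model.inspect())            # loadings and path coefficients
# print(semopy.calc_stats(model))   # fit indices such as CFI and RMSEA
```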

References

Baek, Y. M., Kim, E., & Bae, Y. (2014). My privacy is okay, but theirs is endangered: Why comparative optimism matters in online privacy concerns. Computers in Human Behavior, 31, 48–56. https://doi.org/10.1016/j.chb.2013.10.010

Bakardjieva, M. (2005). Internet Society: The Internet in everyday life. Sage. http://dx.doi.org/10.4135/9781446215616

Bartsch, M., & Dienlin, T. (2016). Control your Facebook: An analysis of online privacy literacy. Computers in Human Behavior, 56, 147–154. https://doi.org/10.1016/j.chb.2015.11.022

Baruh, L., & Popescu, M. (2017). Big data analytics and the limits of privacy self-management. New Media & Society, 19(4), 579–596. https://doi.org/10.1177/1461444815614001

Baruh, L., Secinti, E., & Cemalcilar, Z. (2017). Online privacy concerns and privacy management: A meta-analytical review. Journal of Communication, 67(1), 26–53. https://doi.org/10.1111/jcom.12276

Boerman, S. C., Kruikemeier, S., & Zuiderveen Borgesius, F. J. (2018). Exploring motivations for online privacy protection behavior: Insights from panel data. Communication Research, 1–25. https://doi.org/10.1177/0093650218800915

Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209–227. https://doi.org/10.1007/s10676-013-9321-6

Bucher, T. (2017). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44. https://doi.org/10.1080/1369118X.2016.1154086

Büchi, M., Festic, N., Just, N., & Latzer, M. (2021). Digital inequalities in online privacy protection: Effects of age, education and gender. Handbook of Digital Inequality. https://www.elgaronline.com/view/edcoll/9781788116565/9781788116565.00029.xml

Büchi, M., Festic, N., & Latzer, M. (2019). Digital overuse and subjective well-being in a digitized society. Social Media + Society. https://doi.org/10.1177/2056305119886031

Büchi, M., Festic, N., & Latzer, M. (2022). The chilling effects of digital dataveillance: A theoretical model and an empirical research agenda. Big Data & Society, 9(1), 1–14. https://doi.org/10.1177/20539517211065368

Büchi, M., Fosch-Villaronga, E., Lutz, C., Tamò-Larrieux, A., Velidi, S., & Viljoen, S. (2020). The chilling effects of algorithmic profiling: Mapping the issues. Computer Law & Security Review, 36, 1–15. https://doi.org/10.1016/j.clsr.2019.105367

Büchi, M., Just, N., & Latzer, M. (2017). Caring is not enough: The importance of Internet skills for online privacy protection. Information, Communication & Society, 20(8), 1261–1278. https://doi.org/10.1080/1369118X.2016.1229001

Chen, H., & Atkin, D. (2020). Understanding third-person perception about Internet privacy risks. New Media & Society, 1–19.

Chen, H., Beaudoin, C. E., & Hong, T. (2017). Securing online privacy: An empirical test on Internet scam victimization, online privacy concerns, and privacy protection behaviors. Computers in Human Behavior, 70, 291–302. https://doi.org/10.1016/j.chb.2017.01.003

Chen, H.-T. (2018). Revisiting the privacy paradox on social media with an extended privacy calculus model: The effect of privacy concerns, privacy self-efficacy, and social capital on privacy management. American Behavioral Scientist, 62(10), 1392–1412. https://doi.org/10.1177/0002764218792691

Cho, H., Lee, J.-S., & Chung, S. (2010). Optimistic bias about online privacy risks: Testing the moderating effects of perceived controllability and prior experience. Computers in Human Behavior, 26(5), 987–995. https://doi.org/10.1016/j.chb.2010.02.012

Cotter, K. (2019). Playing the visibility game: How digital influencers and algorithms negotiate influence on Instagram. New Media & Society, 21(4), 895–913. https://doi.org/10.1177/1461444818815684

Cotter, K., & Reisdorf, B. C. (2020). Algorithmic knowledge gaps: A new horizon of (digital) inequality. International Journal of Communication, 14.

Debatin, B., Lovejoy, J. P., Horn, A.-K., & Hughes, B. N. (2009). Facebook and online privacy: Attitudes, behaviors, and unintended consequences. Journal of Computer-Mediated Communication, 15(1), 83–108. https://doi.org/10.1111/j.1083-6101.2009.01494.x

Demertzis, N., Mandenaki, K., & Tsekeris, C. (2021). Privacy attitudes and behaviors in the age of post-privacy: An empirical approach. Journal of Digital Social Research, 3(1), 119–152. https://doi.org/10.33621/jdsr.v3i1.75

Dienlin, T., & Metzger, M. J. (2016). An extended privacy calculus model for SNSs: Analyzing self-disclosure and self-withdrawal in a representative U.S. sample. Journal of Computer-Mediated Communication, 21(5), 368–383. https://doi.org/10.1111/jcc4.12163

Dogruel, L., Facciorusso, D., & Stark, B. (2020). ‘I’m still the master of the machine.’ Internet users’ awareness of algorithmic decision-making and their perception of its effect on their autonomy. Information, Communication & Society, 1–22. https://doi.org/10.1080/1369118X.2020.1863999

Dogruel, L., Masur, P., & Joeckel, S. (2021). Development and validation of an algorithm literacy scale for internet users. Communication Methods and Measures, 1–19. https://doi.org/10.1080/19312458.2021.1968361

Ebbers, F. (2020). How to protect my privacy? Classifying end-user information privacy protection behaviors. In Privacy and Identity Management. Data for Better Living: AI and Privacy (pp. 327–342). https://doi.org/10.1007/978-3-030-42504-3_21

Festic, N. (2020). Same, same, but different! Qualitative evidence on how algorithmic selection applications govern different life domains. Regulation & Governance. https://doi.org/10.1111/rego.12333

Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298–320. https://doi.org/10.1093/poq/nfw006

Gan, C. (2017). Understanding WeChat users’ liking behavior: An empirical study in China. Computers in Human Behavior, 68, 30–39. https://doi.org/10.1016/j.chb.2016.11.002

Gerber, N., Gerber, P., & Volkamer, M. (2018). Explaining the privacy paradox: A systematic review of literature investigating privacy attitude and behavior. Computers & Security, 77, 226–261. https://doi.org/10.1016/j.cose.2018.04.002

Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–194). MIT Press. https://doi.org/10.7551/mitpress/9780262525374.003.0009

Gruber, J., Hargittai, E., Karaoglu, G., & Brombach, L. (2021). Algorithm awareness as an important internet skill: The case of voice assistants. International Journal of Communication, 15.

Gui, M., & Büchi, M. (2019). From use to overuse: Digital inequality in the age of communication abundance. Social Science Computer Review. https://doi.org/10.1177/0894439319851163

Gutierrez, A., O’Leary, S., Rana, N. P., Dwivedi, Y. K., & Calle, T. (2019). Using privacy calculus theory to explore entrepreneurial directions in mobile location-based advertising: Identifying intrusiveness as the critical risk factor. Computers in Human Behavior, 95, 295–306. https://doi.org/10.1016/j.chb.2018.09.015

Ham, C.-D. (2017). Exploring how consumers cope with online behavioral advertising. International Journal of Advertising, 36(4), 632–658. https://doi.org/10.1080/02650487.2016.1239878

Hargittai, E. (2005). Survey measures of web-oriented digital literacy. Social Science Computer Review, 23(3), 371–379. https://doi.org/10.1177/0894439305275911

Hargittai, E., Gruber, J., Djukaric, T., Fuchs, J., & Brombach, L. (2020). Black box measures? How to study people’s algorithm skills. Information, Communication & Society, 23(5), 764–775. https://doi.org/10.1080/1369118X.2020.1713846

Hildebrandt, M. (2008). Defining profiling: A new type of knowledge? In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European citizen: Cross-disciplinary perspectives (pp. 17–45). Springer Netherlands. https://doi.org/10.1007/978-1-4020-6914-7_2

Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118

Illouz, E. (2008). Saving the modern soul: Therapy, emotions, and the culture of self-help. University of California Press. https://www.jstor.org/stable/10.1525/j.ctt1pp4br

Ireland, L. (2020). Predicting online target hardening behaviors: An extension of routine activity theory for privacy-enhancing technologies and techniques. Deviant Behavior, 1–17. https://doi.org/10.1080/01639625.2020.1760418

Islam, A. K. M. N., Laato, S., Talukder, S., & Sutinen, E. (2020). Misinformation sharing and social media fatigue during COVID-19: An affordance and cognitive load perspective. Technological Forecasting and Social Change, 159, 120201. https://doi.org/10.1016/j.techfore.2020.120201

Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157

Kemper, J., & Kolkman, D. (2019). Transparent to whom? No algorithmic accountability without a critical audience. Information, Communication & Society, 22(14), 2081–2096. https://doi.org/10.1080/1369118X.2018.1477967

Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14–29. https://doi.org/10.1080/1369118X.2016.1154087

Groot Kormelink, T., & Costera Meijer, I. (2014). Tailor-made news. Journalism Studies, 15(5), 632–641. https://doi.org/10.1080/1461670X.2014.894367

Larus, J., Hankin, C., Carson, S. G., Christen, M., Crafa, S., Grau, O., Kirchner, C., Knowles, B., McGettrick, A., Tamburri, D. A., & Werthner, H. (2018). When computers decide: European recommendations on machine-learned automated decision making [Technical Report]. Association for Computing Machinery.

Latzer, M., Festic, N., & Kappeler, K. (2020). Awareness of risks related to algorithmic selection in Switzerland. Report 3 from the project: The significance of algorithmic selection for everyday life: The case of Switzerland. Zurich: University of Zurich. http://mediachange.ch/research/algosig

Latzer, M., Büchi, M., & Festic, N. (2019). Internetverbreitung und digitale Bruchlinien in der Schweiz 2019. Themenbericht aus dem World Internet Project—Switzerland 2019. Universität Zürich. http://mediachange.ch/research/wip-ch-2019

Latzer, M., & Festic, N. (2019). A guideline for understanding and measuring algorithmic governance in everyday life. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1415

Latzer, M., Hollnbuchner, K., Just, N., & Saurwein, F. (2016). The economics of algorithmic selection on the internet. Handbook on the economics of the internet. https://www.elgaronline.com/view/edcoll/9780857939845/9780857939845.00028.xml

Latzer, M., & Just, N. (2020). Governance by and of algorithms on the internet: Impact and consequences. In Oxford research encyclopedia of communication. Oxford University Press. https://doi.org/10.1093/acrefore/9780190228613.013.904

Latzer, M., Saurwein, F., & Just, N. (2019). Assessing policy II: Governance-choice method. In H. Van den Bulck, M. Puppis, K. Donders, & L. Van Audenhove (Eds.), The Palgrave handbook of methods for media policy research (pp. 557–574). Springer International Publishing. https://doi.org/10.1007/978-3-030-16065-4_32

Leeder, C. (2019). How college students evaluate and share “fake news” stories. Library & Information Science Research, 41(3), 100967. https://doi.org/10.1016/j.lisr.2019.100967

Litt, E. (2013). Measuring users’ internet skills: A review of past assessments and a look toward the future. New Media & Society, 15(4), 612–630. https://doi.org/10.1177/1461444813475424

Longworth, J. (2018). VPN: From an obscure network to a widespread solution. Computer Fraud & Security, 2018(4), 14–15. https://doi.org/10.1016/S1361-3723(18)30034-4

Lowe-Calverley, E., & Grieve, R. (2018). Thumbs up: A thematic analysis of image-based posting and liking behaviour on social media. Telematics and Informatics, 35(7), 1900–1913. https://doi.org/10.1016/j.tele.2018.06.003

Lutz, C., & Newlands, G. (2021). Privacy and smart speakers: A multi-dimensional approach. The Information Society, 0(0), 1–16. https://doi.org/10.1080/01972243.2021.1897914

Marder, B. (2018). Trumped by context collapse: Examination of ‘Liking’ political candidates in the presence of audience diversity. Computers in Human Behavior, 79, 169–180. https://doi.org/10.1016/j.chb.2017.10.025

Micheli, M., Lutz, C., & Büchi, M. (2018). Digital footprints: An emerging dimension of digital inequality. Journal of Information, Communication and Ethics in Society, 16(3), 242–251. https://doi.org/10.1108/JICES-02-2018-0014

Montaño, D. E., & Kasprzyk, D. (2008). Theory of reasoned action, theory of planned behavior, and the integrated behavioral model. In K. Glanz, B. K. Rimer, & K. Viswanath (Eds.), Health behavior and health education: Theory, research, and practice. Jossey-Bass.

Monzer, C., Moeller, J., Helberger, N., & Eskens, S. (2020). User perspectives on the news personalisation process: Agency, trust and utility as building blocks. Digital Journalism, 8(9), 1142–1162. https://doi.org/10.1080/21670811.2020.1773291

Nissenbaum, H. F. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford Law Books.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. https://doi.org/10.18574/nyu/9781479833641.001.0001

Park, Y. J. (2011). Digital literacy and privacy behavior online. Communication Research, 40(2), 215–236. https://doi.org/10.1177/0093650211418338

Park, Y. J. (2015). Do men and women differ in privacy? Gendered privacy and (in)equality in the Internet. Computers in Human Behavior, 50, 252–258. https://doi.org/10.1016/j.chb.2015.04.011

Pengnate, S. (Fone), & Sarathy, R. (2017). An experimental investigation of the influence of website emotional design features on trust in unfamiliar online vendors. Computers in Human Behavior, 67, 49–60. https://doi.org/10.1016/j.chb.2016.10.018

Petre, C., Duffy, B. E., & Hund, E. (2019). “Gaming the System”: Platform paternalism and the politics of algorithmic visibility. Social Media + Society, 5(4), 1–12. https://doi.org/10.1177/2056305119879995

Quinn, K. (2016). Why we share: A uses and gratifications approach to privacy regulation in social media use. Journal of Broadcasting & Electronic Media, 60(1), 61–86. https://doi.org/10.1080/08838151.2015.1127245

Ramizo, G. J. (2021). Platform playbook: A typology of consumer strategies against algorithmic control in digital platforms. Information, Communication & Society, 1–16. https://doi.org/10.1080/1369118X.2021.1897151

Rogers, R. W. (1975). A protection motivation theory of fear appeals and attitude change. The Journal of Psychology, 91(1), 93–114. https://doi.org/10.1080/00223980.1975.9915803

Rosenstock, I. M. (1974). Historical origins of the health belief model. Health Education Monographs, 2(4), 328–335. https://doi.org/10.1177/109019817400200403

Ruckenstein, M., & Granroth, J. (2020). Algorithms, advertising and the intimacy of surveillance. Journal of Cultural Economy, 13(1), 12–24. https://doi.org/10.1080/17530350.2019.1574866

Sánchez, D., & Viejo, A. (2018). Privacy-preserving and advertising-friendly web surfing. Computer Communications, 130, 113–123. https://doi.org/10.1016/j.comcom.2018.09.002

Saurwein, F., Just, N., & Latzer, M. (2015). Governance of algorithms: Options and limitations. Info, 17(6), 35–49. https://doi.org/10.1108/info-05-2015-0025

Selwyn, N., & Pangrazio, L. (2018). Doing data differently? Developing personal data tactics and strategies amongst young mobile media users. Big Data & Society, 5(1), 2053951718765021. https://doi.org/10.1177/2053951718765021

Seyfert, R. (2021). Algorithms as regulatory objects. Information, Communication & Society, 1–17. https://doi.org/10.1080/1369118X.2021.1874035

Strycharz, J., Kim, E. & Segijn, C. B. (2022). Why people would (not) change their media use in response to perceived corporate surveillance. Telematics and Informatics, 71. https://doi.org/10.1016/j.tele.2022.101838

Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society, 7(2), 20563051211008828. https://doi.org/10.1177/20563051211008828

Syvertsen, T. (2020). Digital detox. Emerald Publishing Limited. https://books.emeraldinsight.com/page/detail/Digital-Detox/?k=9781787693425

van Dijck, J. (2009). Users like you? Theorizing agency in user-generated content. Media, Culture & Society, 31(1), 41–58. https://doi.org/10.1177/0163443708098245

van Dijk, J. (2020). The digital divide. Polity.

Véliz, C. (2020). Privacy is power. Bantam Press.

Vitak, J., & Zimmer, M. (2020). More than just privacy: Using contextual integrity to evaluate the long-term risks from COVID-19 surveillance technologies. Social Media + Society, 6(3),1–4. https://doi.org/10.1177/2056305120948250

Weinberger, M., Bouhnik, D., & Zhitomirsky-Geffet, M. (2017). Factors affecting students’ privacy paradox and privacy protection behavior. Open Information Science, 1(1). https://doi.org/10.1515/opis-2017-0002

Witte, K. (1992). Putting the fear back into fear appeals: The extended parallel process model. Communication Monographs, 59(4), 329–349. https://doi.org/10.1080/03637759209376276

Zarouali, B., Ponnet, K., Walrave, M., & Poels, K. (2017). “Do you like cookies?” Adolescents’ skeptical processing of retargeted Facebook-ads and the moderating role of privacy concern and a textual debriefing. Computers in Human Behavior, 69, 157–165. https://doi.org/10.1016/j.chb.2016.11.050

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.

Published

2023-03-17

Section

Research Articles
