When corporate dataveillance brings beneficial experiences
Service-specific qualitative evidence for YouTube
DOI: https://doi.org/10.33621/jdsr.v7i255041

Keywords: dataveillance, profiling, digital traces, imaginaries, behavior, YouTube

Abstract
Entertainment, information seeking, socialization: internet users are constantly dataveilled when relying on various online services to meet their diverse needs. Yet research that considers online-service peculiarities in shaping personal experiences in response to corporate data collection and analysis is scarce. This study investigates young adults’ dataveillance imaginaries, sense of dataveillance, and behavioral responses on YouTube, which extensively displays personalized content based on digital traces. Our thematic analysis of semi-structured interviews with frequent users demonstrated the perceived self-evidence of dataveillance on this major platform. Users tended to accept and take advantage of, rather than resist, pervasive dataveillance practices. The results also revealed that on YouTube, dataveillance brings greater benefits because it fosters user satisfaction and confirmed that individual attitudes and behaviors related to dataveillance are highly context-dependent. Our fresh service-specific approach contributes to refining user-centered research on everyday dataveillance beyond its expected adverse consequences.
License
Copyright (c) 2025 Sarah Daoust-Braun, Noemi Festic, Michael Latzer

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.