(Un)stable diffusions
The publics, publicities, and publicizations of generative AI
DOI: https://doi.org/10.33621/jdsr.v6i440453
Keywords: AI, generative AI, publicity, public theory, media and communication
Abstract
Generative AI is a uniquely public technology. The large language models behind ChatGPT and other tools that generate text and images are a major development in publicity as much as in technology. Without public data and public participation, these large models could not be trained. Without the attention, hype, and hope around these technologies, the big AI firms probably could not afford the computational costs of training them. Our special issue asks how Critical AI Studies can attend to the publics, publicities, and publicizations of generative AI. We situate AI’s publicity as a mode of publicity – hype, scandals, silences, and inevitability – as well as a mode of participation, seen in the growing importance of technology demonstrations. Within this situation, our contributions offer four different research paths: (1) situating legacy media as an enduring process of legitimation; (2) looking at the ways that AI has a private life in public; (3) questioning the post-democratic future of public participation; and (4) developing new prototypes of public participation through research creation.
License
Copyright (c) 2024 Fenwick McKelvey, Joanna Redden, Jonathan Roberge, Luke Stark

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.