Disinformation after Generative AI and Synthetic Data

Guest editors
Vassilis Galanos, Lecturer, University of Stirling
Maarten Hillebrandt, Assistant Professor, Utrecht University
Call for Papers
Disinformation tops many a list of most pressing issues. In the World Economic Forum's recent Global Risks Report survey, experts placed it at the top of a ranking of the most severe challenges for the coming two years (WEF, 2025). Moreover, the “are you a robot” question has escaped the confinement of Captcha Turing tests, entering our daily human communication: is the text or the image sent to me generated by artificial intelligence (AI)? Since the post-2010 advancements in machine learning, and even more so since the wide dissemination of generative AI technologies at the beginning of this decade, societal and political attention to the risks and threats of multiple information harms has proliferated. Widely expressed concerns coincide with rapid technological advancements such as generative AI, synthetic data, encrypted messaging, adversarial neural networks, and virtual reality technologies. They are often placed against a background of significant global political upheaval, including the conflicts between Ukraine and Russia, Palestine and Israel, and the Sudanese Civil War. Additionally, worldwide, there has been an intensification in the use of digital tools in pre-election campaigns and, more generally, in polarising events and trends. At the centre of this twin acceleration of conflict and of data production and circulation lies a widely shared concern for a deteriorating public information ecosystem and, thereby, a deepening epistemic crisis (Dahlgren, 2018). This process has previously been designated “the disinformation society,” punning on the “information society” hype (Kennedy Jr., 2005; Marshall et al., 2015), but is presently more pressing than ever.
This special issue aims to explore the implications of these developments for the fields relevant to Information Polity. We are interested in critical studies from a wide range of academic disciplines that explore social ecosystems in the nexus of data, information, AI, algorithms, and the internet, and associated phenomena such as the attention economy, the decline of legacy media, and post-truth politics, with implications for governments, industries, militaries, education, civil society, and everyday life. The journal is strongly interdisciplinary, so we welcome contributions from fields such as anthropology, communication studies, computational social analysis, history, hype studies, information science, law, media studies, philosophy, science and technology studies, social policy, the social and political sciences, or sociolinguistics, to name a few. The issue is expected to have a strong qualitative focus; however, we welcome contributions deploying mixed qualitative-quantitative methods, or quantitative research with a strong theoretical background. We are also open to theoretical explorations and conceptual experiments, as long as they are firmly grounded in relevant literature and evidence.
Information harms, in this context, have been related to a notably wide range of political and societal concerns across the spectrum of misinformation, disinformation, and malinformation:
• Under the recently coined term ‘FIMI’ (foreign information manipulation and interference), conflicts between states and military blocs have brought new attention to the risks of (often subliminal) propaganda efforts reminiscent of Soviet-era notions of psy-ops and reflexive control (Mitrakiev & Dimitrov, 2024).
• Within western and other societies, political polarisation appears to be accompanied by an uptick in conspiracy theories and micro-targeted electoral ‘baiting’.
• Current technological leaps in widely accessible generative AI tools augur the potential for a further decentring of disinformation agents, potentially sparking an ‘arms race’ in disinformation efforts, with attendant environmental implications deriving from increased energy needs.
• An ongoing dynamic of social class divide is also evident, as individuals sequester into different trust networks with their own truth-finding criteria and access to specific (digital re)sources (Arsky & Cherny, 1997; Jones et al., 2025). Different social groups also differ in their power to define and respond to potential instances of disinformation (Kuo & Marwick, 2021).
• Finally, a degree of self-reflection also befits academia, an institution in which engagement with truth-finding traditionally forms a central objective. Higher education increasingly requires academic staff to systematically protect and transfer skill sets that instil information literacies across wider ranges of data sources, processing methods, and evaluation techniques (Williamson, 2025; Ravn et al., 2025). In research, too, traditional methodologies increasingly risk becoming caught up in the ‘truth debates’ of late-modern, highly datafied society (Lynch, 2017; Martin & Newell, 2024).
After one and a half decades of technological hypes and regulatory chaos (Belsunces, 2025), we now appear to find ourselves in ‘decades of consequences’ (borrowing a term from political and climate science), in which information harms become all the more pressing. What does this decade, and the more remote future, hold in store for us? Are we overcoming the initial naïve enthusiasm and narcissism associated with ‘social media 1.0’? Will we soon reach the fourth level of Baudrillard’s simulacrum of hyperreality (Baudrillard, 1983)? And, under those circumstances, what criteria can replace the quest for factuality and truth that characterises democracies? How will societies salvage that which is most important to them? And how will individuals within these societies uphold a sense of dignity, stable identity, and grip on the world surrounding them in a context where contextual and informational integrity are under pressure (Nissenbaum, 2011; OECD, 2024)? It is these types of questions that this special issue is concerned with.
Scope of the special issue and focus areas
We invite submissions that offer both social-scientific conceptual/theoretical insights and empirical case studies, laying bare and reflecting critically on the mechanisms underlying disinformation and its governance in a democratic context. These contributions may cover a broad spectrum of topics including, but not limited to:
1. Historical and Contemporary Lessons:
Analysis of the use of fake news, deep fakes, fabricated live reporting, and staged political communication and propaganda in conflicts such as those between Ukraine and Russia, Palestine and Israel, or in the Sudanese Civil War.
2. Methods in Communication Distortion:
Exploration of the revival of old, or the invention of new, methods for fabricating news or distorting messages, such as the application of reflexive control or Rumsfeld matrices in new geopolitical contexts, or the mobilisation of polarisation, hype, and demonisation as tools of political communication.
3. Digital Content Dynamics:
Sociotechnical examinations of the interaction between viral content and echo chamber niches, the role and motives of information originators and content intermediaries (‘foot soldiers,’ ‘trolls,’ ‘information alibis,’ ‘click farms,’ influencers, etc.) and their performative influence on public discourse.
4. Traditional Communication Models and New Technological Affordances:
Insights into the lessons learned from new technological affordances within the complex configurations of platforms, including news platforms, personalised news feeds, tiktokisation, GenAI summaries, and streaming technologies, and critical reflection on the (potentially degrading) effects of such new affordances on traditional sources of quality content (inter alia, the legacy media). This extends to issues of affect, user experience, technological embodiment, public understanding of technology and policy, and the social shaping of information infrastructures.
5. Adversarial Attacks and Algorithmic Vulnerabilities:
Studies on algorithms learning from other algorithms’ limitations and the potential consequences of adversarial attacks. This includes empirical and geopolitical analyses of the drivers and dynamics behind information manipulation of, and interference in, public debates by foreign state actors pursuing geopolitical aims.
6. The Role of Post-Truth and Affective Public Language and Rhetoric:
Analysis of the impact of language in public and intellectual discourse, including strategies such as hype, click-bait, and the use of rhetoric and metaphors (Wyatt, 2021). Concomitantly, an exploration of the changing role of factual information and framing in political communication, and of how these feed into online and offline debates.
7. Legislative, Policy, and Digital Governance Considerations:
Examination of legislative and policy issues surrounding the protection of the information commons and relevant ecosystems, with an emphasis on respecting/balancing freedom of speech, and attention to the wider governance landscape of digital platforms and services. As part of this examination, we particularly encourage evaluative and other studies into the functioning and effectiveness of behavioural policies designed to counter the negative effects of disinformation (Van der Linden et al., 2021).
8. Trust and Harm Assessment:
Development of criteria for assessing harms and trust within the informational ecosystem that transcend traditional classifications of information/misinformation/disinformation/malinformation.
9. Alternative Pathways for Trustworthy Information:
Investigation into alternative methods for acquiring trustworthy information, including user-driven innovation, community-based networks of trust, and bottom-up approaches, and the methods and experiences of different social groups with regard to such approaches (members of the general public, government organisations, political parties, etc.).
Do not hesitate to contact the special issue editors (see below) if you wonder whether your contribution might fit.
Timeline
Important dates for the publication of this special issue are as follows:
• 14 November 2025 - Deadline for abstract submission
• 15 December 2025 - Notification for invitation to submit full manuscript
• 1 March 2026 - Deadline for submission of full manuscript
• March 2026 - October 2026 - Review process
• 1 November 2026 - Final decision on manuscripts
• January 2027 - Anticipated publication
Abstracts should initially be sent to m.z.hillebrandt@uu.nl by 14 November 2025. Abstracts should be up to 700 words and include the names of all envisaged authors and their institutional affiliations. Abstracts will be reviewed by the Guest Editors of the Special Issue. This review will focus on the fit with the special issue theme, feasibility, and potential contribution to knowledge. The authors of accepted abstracts will be invited to submit full manuscripts. Full manuscripts will be double-blind peer reviewed. Please note that initial acceptance of an abstract does not guarantee acceptance and publication of the final manuscript.
Final manuscripts must be submitted directly through IP’s submission system and must adhere to the journal’s submission guidelines: https://informationpolity.com/guidelines
About Information Polity
Information Polity is a tangible expression of the increasing awareness that Information and Communication Technologies (ICTs) have become of deep significance for all polities, as new technology-enabled forms of government, governing, and democratic practice are sought or experienced throughout the world. The journal positions itself in these contexts, seeking to be at the forefront of thought leadership and debate about emerging issues, impacts, and implications of government and democracy in the information age. Information Polity is a multidisciplinary journal that invites contributions from a range of academic disciplines including, but not limited to, media and culture studies, public administration, law, and sociology.
More information: http://informationpolity.com
Author Instructions
Instructions for authors for manuscript format and citation requirements can be found at: https://informationpolity.com/guidelines
Special Issue Guest Editors
Vassilis Galanos, PhD, is Lecturer at the Stirling Business School, University of Stirling, United Kingdom (vassilis.galanos@stir.ac.uk)
Maarten Hillebrandt, PhD, is Assistant Professor at Utrecht University School of Governance, The Netherlands (m.z.hillebrandt@uu.nl)
Information Polity Editors-in-Chief
Professor Albert Meijer, Utrecht University
Professor William Webster, University of Stirling
References
Arsky, J. M., & Cherny, A. I. (1997). The ethno-cultural, linguistic and ethical problems of the “infosphere”. The International Information & Library Review, 29(2), 251-260.
Baudrillard, J. (1983). Simulacra and Simulation. Trans. Sheila Faria Glaser. Ann Arbor: University of Michigan Press.
Belsunces, A. (2025). Sociotechnical fictions: The performative agencies of fiction in technological development. Science & Technology Studies. https://doi.org/10.23987/sts.131333
Dahlgren, P. (2018). Media, knowledge and trust: The deepening epistemic crisis of democracy. Javnost-The Public, 25(1-2), 20-27.
Jones, B., Jones, R., Luger, E., & the International Association of Public Media Researchers (IAPMR). (2025). Power asymmetries in public service journalism: Artificial intelligence and the intelligibility–agency problem. In A. D’Arma, M. Michalis, G. F. Lowe, & M.-B. Zita (Eds.), Challenges and Developments in Public Service Journalism (pp. 148–172). University of Westminster Press.
Kennedy Jr, R.F. (2005). The Disinformation Society. Vanity Fair. May 2005. Accessed 08-09-2025 from: https://archive.vanityfair.com/article/share/8a156e73-93d7-43fa-923c-925445200f60
Kuo, R., & Marwick, A. (2021). Critical disinformation studies: History, power, and politics. Harvard Kennedy School (HKS) Misinformation Review, 2(4).
Linden, S. van der, Roozenbeek, J., Maertens, R., Basol, M., Kácha, O., Rathje, S., & Traberg, C. S. (2021). How can psychological science help counter the spread of fake news? The Spanish Journal of Psychology, 24, e23. https://doi.org/10.1017/sjp.2021.23
Lynch, M. (2017). STS, symmetry and post-truth. Social Studies of Science, 47(4), 593-599.
Marshall, J. P., Goodman, J., Zowghi, D., & Da Rimini, F. (2015). Disorder and the Disinformation Society: The Social Dynamics of Information, Networks and Software. Routledge.
Martin, A., & Newell, B. (2024). Synthetic data, synthetic media, and surveillance. Surveillance & Society, 22(4), 448-452.
Mitrakiev, B., & Dimitrov, N. (2024). Russian reflexive control campaigns targeting political realignment of Ukraine’s democratic allies: Critical review and conceptualization. Information & Security: An International Journal, 55, 299-330.
Nissenbaum, H. (2011). Privacy in context: Technology, policy, and the integrity of social life. Journal of Information Policy, 1, 149-151.
OECD (2024). Mis- and Disinformation. Organisation for Economic Co-operation and Development. Topic. https://www.oecd.org/en/topics/disinformation-and-misinformation.html
Ravn, L., Galanos, V., Archer, M., & Shanley, D. (2025). Unraveling the Regimes of Synthetic Data Metrics: Expectations, Ethics, and Politics. Digital Society, 4(2), 44.
WEF (2025). These are the biggest risks we face now and in the next 10 years. World Economic Forum, 15 January 2025. https://www.weforum.org/stories/2025/01/global-risks-report-2025-bleak-predictions/
Williamson, B. (2025). Re-infrastructuring higher education. Dialogues on Digital Society, 1(1), 41-46.
Wyatt, S. (2021). Metaphors in critical Internet and digital media studies. New Media & Society, 23(2), 406-416.