Abstract
The manipulation of information and the dissemination of “fake news” are practices that trace back to the earliest records of human history. Significant changes in the technological environment, enabling ubiquity, immediacy, and considerable anonymity, have facilitated the spread of misinformation in unforeseen ways, raising concerns about people’s (mis)perception of social issues worldwide. As a wicked problem, limiting the harm caused by misinformation goes beyond technical solutions, requiring regulatory and behavioural changes as well. This workshop proposes to unpack the challenge at hand by bringing together diverse perspectives on the problem. Based on participatory design principles, it will challenge participants to critically reflect on the limits of existing socio-technical approaches and to co-create scenarios in which digital platforms support misinformation resilience.
1 Context
The acknowledged influence of social media on the results of the UK’s Brexit referendum and on Donald Trump’s election in the US exemplifies the magnitude of the power granted to the online world to transform reality [3]. In this context, misleading information (see footnote 1), be it deliberately false or not, continuously harms individuals and societies by threatening democratic political processes and distorting the values that shape public opinion in a variety of sectors, such as health and science (e.g. the anti-vaccine movement [9]) and foreign policy (e.g. the Iraq war [10]), and now on a global scale [4].
Information disorder [2, 8] has long been examined from multiple perspectives, including social science, journalism, psychology, and computer science [6]. Because misinformation is a wicked problem, no single, comprehensive solution can stop it. In Fig. 1, we graphically summarise some key aspects of the spread of misinformation from social (people’s values, beliefs, motivations), regulatory, and technical (social media, detection tools) perspectives, as well as some factors crossing these boundaries: information literacy has regulatory and social components, while social media regulation and fact-checking concern both regulatory and technical aspects.
In such a scenario, social media players, technology designers, policymakers, journalists, educators, and citizens are all stakeholders who share some responsibility for understanding the problem in its complexity and for coming up with the pieces of the solution that will limit the spread and impact of misinformation worldwide.
Examining the limits of human cognition in processing and spreading misinformation [12, 13], nudging users [11], ‘vaccinating’ social media users [1], fact-checking more effectively [5], and automating detection and correction [7] are some of the approaches currently explored in the literature. However, as pointed out in [6], existing approaches are all limited. With few exceptions, they treat technology users as passive consumers rather than as active co-creators, learners, and detectors of misinformation. We argue that more comprehensive solutions can only emerge from the articulation of diverse ideas and approaches, which requires different stakeholders, including end users, social scientists, computer scientists, and educators, to participate in the co-creation of these solutions’ features, user interfaces, and delivery methods.
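To make the limits of purely technical responses concrete, the following sketch shows what ‘automating detection’ often reduces to in practice: a surface-level text classifier. It is illustrative only; the pipeline choice, toy claims, and labels are our assumptions and do not reproduce the method of any work cited above.

```python
# A minimal, illustrative sketch of automated misinformation detection as a
# surface-level text classifier. The claims and labels are hypothetical toy
# data, not drawn from any cited study.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

claims = [
    "Vaccines cause autism in children",               # misleading
    "Vaccines are rigorously tested before approval",  # accurate
    "The moon landing was staged in a studio",         # misleading
    "Apollo 11 landed humans on the moon in 1969",     # accurate
]
labels = [1, 0, 1, 0]  # 1 = misleading, 0 = accurate

# Bag-of-words features plus logistic regression: a common detection baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(claims, labels)

# The model scores word patterns only; it has no notion of context, intent,
# or the reader -- one reason such tools remain limited [6] and users need
# to be engaged as active detectors rather than passive consumers.
print(model.predict_proba(["Vaccines were staged in a studio"]))
```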
2 Objectives
The goal of this workshop is to propose an agenda for interdisciplinary research that critically analyses and aggregates socio-technical solutions that establish fundamental limits to misinformation. To this end, the workshop will engage the participants in:
- Discussing challenges and obstacles related to misinformation;
- Challenging existing approaches to tackling misinformation and identifying their limitations;
- Mapping stakeholders and questioning the relationships between them;
- Co-creating future scenarios where digital platforms support misinformation resilience;
- Identifying criteria for assessing the potential of different solutions to make an impact.
The workshop will be of interest to researchers and practitioners who hope to impact society through the design and development of socio-technical systems in the social media context, with its current struggle over what is considered fact and what fiction. As a longer-term goal, the workshop aims at building a multidisciplinary research community focused on the design of misinformation-resilient societies.
3 Workshop Rationale
The workshop is grounded in the principles of co-creation [14], focusing on where and what value is created with the digital solution [15]. The workshop agenda will engage participants in activities that challenge the status quo and promote creative thinking towards innovative solutions. Participants will be encouraged to ask questions and to be critical, active, and bold in the idea-generation process.
3.1 Participation
Not only interaction design researchers and practitioners will be invited to participate, but also journalists, educators, policymakers, and other related stakeholders. The call for papers will be distributed via the network of a European project on misinformation, to authors of related papers found in digital libraries, through HCI-related mailing lists, and on social media in general.
Participants will be encouraged to submit a 2- to 4-page position paper describing their approach to fighting misinformation, the acknowledged limits of that approach, and how they envision a future in which societies are more resilient to information disorder.
To keep group activities feasible and engaging, the number of participants should be between 12 and 20.
Topics of interest include, but are not limited to: socio-technical empirical studies, motivational and behavioural studies, human values, persuasive technology, games, gamification, information and media literacy, fact-checking, social media policies and regulation, automated tools for misinformation detection and notifications, and legal and ethical aspects.
3.2 Overall Structure
The activities of this one-day workshop will be split into two main parts: during the morning, participants will debate and critically analyse existing approaches and solutions to tackling misinformation, while the afternoon activities will target future scenarios in which digital innovations support misinformation resilience. The core activities of the workshop have already been applied in different contexts and research scenarios. They are:
- Ice-breaker (30 min): Everyone gets a drink labelled with a provocative ‘fact’ designed to spark debate. Participants discuss and validate/check these facts together, with some reporting back to the entire group. After that, a quick round of introductions will take place.
- Setting the stage (15 min): A short opening talk by an inspirational speaker on a topic that underpins the discussions of the day, either fact-checking or social media and misinformation.
- Mapping the Terrain (40 min): In groups, participants storyboard the present and map the perceptions of stakeholders involved in the decision-making processes that spread or stop misinformation, discussing their roles and relations. The activity will involve props, such as the wooden dolls in Fig. 2, representing the stakeholders, their values, and their connections.
- Role Play (60 min): A group exercise to change perspectives, lenses, and orientations. Validating the stakeholder map built previously, participants will swap perspectives by wearing different hats (a fact-checker could take on the role of a policymaker, for example) and identify positive and negative points in the relationships and communication between stakeholders. Each group presents its sketches/maps to the other groups.
- Lightning Talks (60 min): Attendees will present 3-minute lightning talks on their areas of research, providing inspirational content for the second part of the day, which focuses on future research.
- Future Making (120 min): The organisers first introduce the future-thinking part of the workshop by presenting a vision of future research communities as food for thought, for instance through iconic images and artifacts that reflect and provoke future imaginations. Then, the Horns of the Dilemma co-creation exercise will engage participants in identifying criteria for assessing an innovation’s potential for impact, such as tools to resolve the ‘fact or fiction’ dichotomy. Participants pinpoint the most promising points of intervention (‘leverage points’) at each scale in the system, also considering the particular concerns of other stakeholders along the journey. This process reveals the bottlenecks (the limits) and the leverage points when describing futures.
- Wrap up and Next Steps (45 min): The organisers summarise the discussions and insights and weave a red thread through the narrative. They will open up the discussion on next steps and future research, paving the way for collective efforts to publish and advance research in this area.
More details on the workshop programme and call for papers are available at the workshop website: http://events.kmi.open.ac.uk/misinformation/.
4 Organisers
The four organisers share the common challenge of co-designing interactive technology to foster critical thinking and digital literacy for a better-informed and resilient society.
Lara Piccolo investigates interaction design from a socio-technical and inclusive perspective, considering how technology can trigger a positive impact on people’s lives. Community engagement, motivations, and values are important drivers of her research. Her current research looks at voice-based systems to raise awareness of misinformation. Lara is also an Associate Lecturer in Interaction Design and User Experience.
Somya Joshi is an expert in the field of Sustainable Human-Computer Interaction (SHCI). Her specialisation falls within the applied context of technological innovation, particularly in how it translates into transparency in governance, environmental conservation, and citizen engagement. She has experience working with a range of partners from academia, industry, and NGOs, as well as international development organisations, towards the common goal of facilitating inclusive development. Currently, Somya is Head of Research at eGovernance-Lab.
Evangelos Karapanos directs the Persuasive Technologies Lab. Evangelos’ expertise is in the experience-centred design of interaction with technology. His ongoing work explores technology-mediated nudging interventions for misinformation-resilient societies.
Tracie Farrell is a non-formal education specialist. Her research interests focus on technologies for awareness and reflection. In particular, she examines how technology can trigger metacognitive activity.
5 Expected Outcome
The accepted position papers will be published in the official adjunct conference proceedings. Furthermore, the main workshop results will be disseminated to a wider audience via a poster presentation and a video reaching out to the overall INTERACT community. The website will also be updated with the accepted papers and a summary of the workshop outcomes.
The possibility of a special journal issue will be discussed with the participants as a way to strengthen the community.
Notes
1. Misinformation refers to misleading information created without the intention to harm, while disinformation refers to deliberately fabricated information intended to impact social groups or societies. As a simplification, we use the term misinformation to represent the complexity of this information disorder.
References
Van der Linden, S., et al.: Inoculating the public against misinformation about climate change. Glob. Challenges 1(2) (2017). https://doi.org/10.1002/gch2.201600008
Wardle, C., Derakhshan, H.: Information disorder: toward an interdisciplinary framework for research and policymaking. Technical report, Council of Europe (2017). http://rm.coe.int/information-disorder-report-version-august-2018/16808c9c77
DiFranzo, D., Gloria-Garcia, K.: Filter bubbles and fake news. XRDS 23(3), 32–35 (2017). https://doi.org/10.1145/3055153
European Commission - Directorate-General for Communications Networks, Content and Technology: A multi-dimensional approach to disinformation. Technical report, European Commission (2018). https://publications.europa.eu/s/iOLW
Facebook: Hard questions: how is Facebook’s fact-checking program working? (June 2018). https://newsroom.fb.com/news/2018/06/hard-questions-fact-checking/. Accessed 12 Oct 2018
Fernandez, M., Alani, H.: Online misinformation: challenges and future directions. In: Companion Proceedings of the Web Conference 2018 (WWW 2018), pp. 595–602 (2018). https://doi.org/10.1145/3184558.3188730
Garrett, R.K., Weeks, B.E.: The promise and peril of real-time corrections to political misperceptions. In: Proceedings of the 2013 Conference on Computer Supported Cooperative Work, pp. 1047–1058. ACM (2013)
Ireton, C., Posetti, J.: Journalism, ‘Fake News’ and Disinformation: Handbook for Journalism Education and Training. United Nations Educational, Scientific and Cultural Organization (UNESCO) Publishing, Paris (2018)
Kata, A.: A postmodern Pandora’s box: anti-vaccination misinformation on the internet. Vaccine 28(7), 1709–1716 (2010)
Kull, S., Ramsay, C., Lewis, E.: Misperceptions, the media, and the Iraq war. Polit. Sci. Q. 118(4), 569–598 (2003)
Levy, N.: Nudges in a post-truth world. J. Med. Ethics 43(8), 495–500 (2017)
Metaxas, P.: Technology, propaganda, and the limits of human intellect. arXiv preprint arXiv:1806.09541 (2018)
Pennycook, G., Rand, D.G.: Lazy, not biased: susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition 188, 39–50 (2018)
Prahalad, C.K., Ramaswamy, V.: The co-creation connection. Strategy and Business, pp. 50–61 (2002)
Voorberg, W.H., Bekkers, V.J., Tummers, L.G.: A systematic review of co-creation and co-production: embarking on the social innovation journey. Public Manag. Rev. 17(9), 1333–1357 (2015)
Acknowledgment
This workshop proposal has been supported by the EC within the Horizon 2020 programme under grant agreement 770302 (Co-Inform).