What are Cyber Social Threats?

The emergence of online platforms as prime, daily communication tools has coincided with a sharp rise in their misuse, threatening our society at large. These platforms have been implicated in promoting hate speech, radicalization, harassment, cyberbullying, fake news, human trafficking, drug dealing, gender-based stereotyping, and violence, among other ills, with a significant impact on individual and community well-being. Especially problematic in recent years, and of particular interest for CySoc 2022, has been the proliferation of vaccine misinformation. Such content and behaviors are inherently multi-faceted, making the recognition of their narratives challenging for researchers as well as social media companies. The implications for individuals and communities require reliable models and algorithms for detecting, understanding, and countering the malevolent behavior in such communications. These challenges have raised the prominence of analyzing online communications in academia, politics, homeland security, and industry, using computational techniques from natural language processing, statistics, network science, data mining, machine learning, computational linguistics, human-computer interaction, and cognitive science. To meet these challenges, this workshop aims to stimulate research on the social, cultural, emotional, communicative, and linguistic aspects of harmful conversations on online platforms and to develop novel approaches to analyze, interpret, and understand them.

The workshop welcomes papers that employ quantitative, qualitative, analytical, or theoretical approaches to examine a diverse range of issues related to harmful online communications. Papers on resources, data, and tools are also welcome, either as demos or as short or regular talks.

Why attend the CySoc Workshop?

This workshop will bring together researchers and practitioners in the computer and social sciences, from both academia and industry, to exchange ideas on understanding the multi-faceted aspects of harmful content and to lead the discussion on building novel computational methods to reliably detect, derive meaning from, interpret, understand, and counter it. Participants will find opportunities to present and hear about fundamental research and emerging applications, to exchange ideas and experiences, and to identify new opportunities for collaboration across disciplines. Researchers and practitioners from various disciplines are strongly encouraged to attend, including (but not limited to) behavioral science, computer and information sciences, psychology, sociology, political science, cognitive science, cultural studies, information systems, terrorism and counter-terrorism, operations research, communication, medicine, and public health.


Themes & Topics

We are interested in both computing and social science approaches to these research directions, based on quantitative, qualitative, and mixed research methods. We expect to receive submissions and lead discussions on novel analytic methods, tools, and datasets.

Spotlight topic

Alongside the central themes and the broader list of topics below, this year we will feature a spotlight topic, vaccine misinformation, chosen for its relevance and urgency.

Four in ten Americans considered social media an important way of receiving COVID-19 vaccine news [1], and a recent study suggests that exposure to online misinformation is associated with increased health risks and vaccine hesitancy [2], calling for more research on this topic. Despite ongoing research and development efforts to investigate COVID-19 vaccine-related online communications [3,4], the relationships between online misinformation, vaccine adoption [5], and other public health outcomes are not well understood. This workshop aims to provide opportunities to gather expertise from related areas and advance our understanding of this issue.

Themes

The CySoc workshop has three main themes:

  • Detection and prediction of content, users, and communities
  • Countering harmful narratives
  • Ethical considerations and handling bias

Topics

Topics for research and discussion on the challenges of dealing with online harmful content include (but are not limited to):

  • Spotlight topic: Vaccine misinformation
  • Online extremism
  • Harassment and cyberbullying
  • Hate speech
  • Gender-based violence
  • Human trafficking
  • Illicit drug trafficking
  • Mental health implications of social media
  • Ethical considerations on privacy-preserving social media analytics
  • Emotional and psychological support
  • Trust relationships and community dynamics
  • Relationship between the social web and mainstream news media
  • Cultural implications of social web usage
  • Influencer identification and community detection for movements
  • Misinformation and disinformation (e.g., epidemics of fake news, images, and videos during disasters, health issues, and elections)

Important Dates


Paper submissions due: April 3, 2022 (extended from March 27, 2022), Anywhere on Earth
Final decision notification: April 27, 2022 (extended from April 10, 2022)
Camera-ready submissions due: May 4, 2022 (extended from April 17, 2022)

Submission Instructions


We invite research papers (8 pages), position and short papers (4 pages), and demo papers (2 pages). Submissions must be original and should not have been published previously or be under consideration for publication while being evaluated for this workshop. Submissions will be evaluated by the program committee based on the quality of the work and its fit to the workshop themes. All submissions should be double-blind, formatted in the AAAI two-column, camera-ready style, and uploaded as a high-resolution PDF to the EasyChair submission site before the paper submission deadline. Accepted papers will be presented at the CySoc workshop, integrated with the conference, and published in the Proceedings of the ICWSM Workshops.

Workshop Program

All times below are in Eastern Time

8.30 – 8.45 AM – Welcome to the CySoc 2022 workshop.
8.45 – 9.45 AM – Keynote I: Neil Johnson, George Washington University.

Science of online ‘anti-X’ and a scalable solution: From vaccine misinformation to extremism
Calls are escalating for social media platforms to do more against extreme online communities whose views can incite real-world harms, e.g., scientific dis/misinformation about vaccines that increased Covid-19 fatalities and now extends to non-scientific cancer treatments, distrust in climate science, and even untested baby formula alternatives; white replacement theory that inspired the 2022 Buffalo shooter and will likely inspire others; political distrust and anger that threatens elections (e.g., the 2021 U.S. Capitol attack); notions of male supremacy that encourage abuse of women; and anti-Semitic, anti-LGBTQ, and QAnon conspiracy theories. But should 'doing more' mean doing more of the same, or something different? And if so, what? In this talk, I start by showing why platforms doing more of the same will not solve the problem. As a solution, I then introduce and demonstrate empirically a new scalable scheme that helps soften online extremes organically. Guided by maps of the multi-community audience, it can be operationalized immediately across social media platforms with minimal cost. This research was done by our team: Elvira Restrepo, Martin Moreno, Lucia Illari, Minzhang Zheng, Pedro Manrique, Sara El Oud, Nicholas Gabriel, Frank Huo, Chenkai Xia, Yonatan Lupu, and Richard Sear.

9.45 – 10.30 AM – Paper Session I: Three papers will be presented with 15 minutes allocated for each, including Q/A.
10.30 – 10.50 AM – Coffee Break.

10.50 – 12.00 PM – Panel Discussion: Ullrich Ecker (The University of Western Australia), Fabiana Zollo (Ca' Foscari University of Venice), Ed Pertwee (London School of Hygiene & Tropical Medicine).

The Impact of Online (Mis)information on Vaccination Programs.
Moderator: Jeremy Blackburn (Department of Computer Science at Binghamton University).

12.00 – 1.00 PM – Paper Session II: Four papers will be presented with 15 minutes allocated for each, including Q/A.
1.00 – 2.00 PM – Break.

2.00 – 3.00 PM – Keynote II: Alice Marwick, The University of North Carolina at Chapel Hill.

Based and Redpilled: The Sociotechnical Effects of Disinformation
This talk examines how disinformative narratives are created and taken up by two different audiences: adherents to the far-right conspiracy theory known as QAnon, and white supremacists on Discord, Gab, and Reddit. By examining what each group takes as evidence for their counterfactual beliefs, I show how disinformative narratives are reinforced by epistemology. In QAnon, participants reject institutional knowledge produced by journalists, academics, or policymakers in favor of populist expertise, a body of “home-grown” information generated by those who feel disenfranchised from mainstream political participation. Turning to white supremacists, I examine how scientific discourse, visuals, and expertise are marshalled to provide convincing evidence for biological determinism. Framing white supremacy as a “red pill moment” reinforces the supposed significance and incontrovertibility of such evidence. Although QAnon rejects scientific expertise while white supremacists embrace it, both rely on the creation and promotion of disinformation to justify their beliefs and potentially violent actions. This calls into question fact-checking and media literacy as effective solutions to the spread of disinformation.

3.00 PM – 4.00 PM – Demo presentations: Two demos will be presented with 30 minutes allocated for each, including Q/A.
4.00 – 4.45 PM – Paper Session III: Three papers will be presented with 15 minutes allocated for each, including Q/A.
4.45 – 5.15 PM – Synthesis/Brainstorming exercise: The first 15 minutes will be utilized for brainstorming about paper concepts for impactful future research in break-out Zoom rooms. The second 15 minutes will be for discussion with the larger group in the main conference room.

Registration

Online registration information can be found on the ICWSM website here: https://www.icwsm.org/2022/index.html/#registration.

Organizers

Ugur Kursuncu

Georgia State University, GA, USA

Kaicheng Yang

Indiana University, IN, USA

Francesco Pierri

Politecnico di Milano, Milano, Italy

Matthew DeVerna

Indiana University, Bloomington, IN, USA

Megan Squire

Elon University, NC, USA

Yelena Mejova

ISI Foundation, Turin, Italy

Jeremy Blackburn

State University of New York at Binghamton, NY, USA

Steering Committee

Program Committee

References:

  1. Pew Research Center (2021)
  2. Loomba et al. (2021)
  3. Pierri et al. (2021)
  4. DeVerna et al. (2021)
  5. Burki (2019)