Establishing collaboration with LOCARD project

ROXANNE is in touch with the LOCARD H2020 EU project, as both consortia target many similar requirements and challenges.

Child sexual abuse and grooming are a real threat to children and young people of all ages and backgrounds. Tackling this problem is difficult for any single agency, authority, ministry, NGO, or company, and requires strong cooperation.

Specifically, ROXANNE aims to learn more about recent work published by the LOCARD project on “Large-scale analysis of grooming in modern social networks”. The paper is available on arXiv and will soon be published in Expert Systems with Applications (https://www.journals.elsevier.com/expert-systems-with-applications).

One of ROXANNE’s use cases relates to child sexual abuse (CSA), and detecting and identifying such predatory behaviours in social media streams using conventional machine learning technologies is a challenge. More specifically, some of the NLP (Natural Language Processing) based solutions considered in ROXANNE in this direction are:

  • Building a network to identify key characters and classify conspicuous users based on text, topology, and time.
  • Applying Authorship Attribution (AA) to identify the persons involved in a chat conversation based on their style of writing.
  • Using Topic Modelling to analyse the chat messages of sexual predators and infer the topics discussed, which helps in studying child grooming.
  • Classifying predator vs. victim using grooming conversations.
  • Analysing the patterns followed by predators, for example based on psychoacoustics, or on clustering of chat embeddings followed by temporal analysis to capture the topics.
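To illustrate one of the approaches above, the sketch below shows how predator-vs-victim classification of chat messages might look with standard text-classification tools. This is a minimal illustration only, not the ROXANNE implementation: the four chat lines and their labels are invented for the example, and a real system would train on a large annotated corpus. Character n-gram TF-IDF features are used because they are robust to the spelling variation common in chat text.

```python
# Hedged sketch: predator vs. victim classification of chat text.
# The tiny corpus below is invented for illustration; the feature and
# model choices (char n-gram TF-IDF + logistic regression) are one
# common baseline, not the project's actual method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: one concatenated message history per user.
texts = [
    "how old are you? do your parents check your phone?",
    "are you alone at home right now? this is our secret",
    "i have homework today, school was boring",
    "my mom said i can play outside after dinner",
]
labels = ["predator", "predator", "victim", "victim"]

# Character n-grams within word boundaries tolerate typos and slang.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

# Classify an unseen message.
print(model.predict(["keep this between us, are you alone?"]))
```

In practice such a classifier would be one component in a pipeline, feeding its per-user predictions into the network analysis and temporal pattern analysis described above.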

The findings of such research can help LEAs and other researchers working in the field of child sexual abuse and grooming to explore the use of these technologies and to build solutions to prevent such crimes.