Temporal Graph Benchmark


The Temporal Graph Benchmark (TGB) is a comprehensive collection of datasets designed for the realistic, reproducible, and robust evaluation of machine learning models on temporal graphs. These datasets are large-scale, span multiple years, and cover various domains such as social networks, trade networks, transaction networks, and transportation networks. They include both node- and edge-level prediction tasks, providing a diverse set of challenges for researchers.

The follow-up work, TGB 2.0, is a benchmarking framework designed to evaluate methods for predicting future links on Temporal Knowledge Graphs (TKGs) and Temporal Heterogeneous Graphs (THGs). It extends the original Temporal Graph Benchmark with datasets that are significantly larger than existing ones in terms of nodes, edges, or timestamps.

TGB offers an automated machine learning pipeline that facilitates reproducible and accessible research. This pipeline includes tools for data loading, experiment setup, and performance evaluation. The benchmark is regularly maintained and updated, and it welcomes feedback from the community to ensure its continued relevance and improvement.
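As an illustration of the evaluation step, TGB's link prediction tasks are scored with ranking metrics such as Mean Reciprocal Rank (MRR), where each true destination node is ranked against a set of sampled negative candidates. The sketch below is not the TGB API itself; the `mrr` function is illustrative and shows only how the metric is computed:

```python
import numpy as np

def mrr(pos_scores, neg_scores):
    """Mean Reciprocal Rank: rank each positive edge's score against the
    scores of its sampled negative edges (higher score = better rank)."""
    reciprocal_ranks = []
    for pos, negs in zip(pos_scores, neg_scores):
        # Rank of the positive edge among the negatives (1 = best).
        rank = 1 + int(np.sum(np.asarray(negs) > pos))
        reciprocal_ranks.append(1.0 / rank)
    return float(np.mean(reciprocal_ranks))

# One positive edge scored 0.9 against negatives [0.5, 0.95, 0.2]:
# a single negative outranks it, so rank = 2 and MRR = 0.5.
print(mrr([0.9], [[0.5, 0.95, 0.2]]))  # → 0.5
```

In the actual pipeline, the model's scores for each positive edge and its negatives would be fed to TGB's evaluator, which reports this metric across the whole test split.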

Veracity AI


This project combines large language models (LLMs) with web search APIs to create an AI-powered fact-checking tool. The goal is to develop a publicly accessible app that empowers individuals to verify the accuracy of information they encounter online. Beyond delivering a verdict, the app aims to deepen users' understanding by linking them to credible sources.

Designed to be user-friendly and transparent, the app ensures that users can see the reasoning and evidence behind its conclusions. This combination of AI-powered analysis and credible source linking aims to build trust in the tool while addressing the growing challenges of misinformation and disinformation.
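The transparency described above suggests a pipeline that surfaces its evidence alongside its verdict. The following is a minimal hypothetical sketch, not the project's actual implementation; `search_fn` and `judge_fn` stand in for a real web search API and an LLM call:

```python
from dataclasses import dataclass, field

@dataclass
class Verdict:
    claim: str
    label: str          # e.g. "supported", "refuted", "unverifiable"
    reasoning: str      # the model's stated rationale, shown to the user
    sources: list = field(default_factory=list)

def check_claim(claim, search_fn, judge_fn):
    """Hypothetical fact-checking pipeline: retrieve evidence via a web
    search API, ask an LLM to judge the claim against that evidence, and
    return the reasoning and source links so users can inspect them."""
    evidence = search_fn(claim)                 # list of (url, snippet) pairs
    label, reasoning = judge_fn(claim, evidence)
    return Verdict(claim, label, reasoning, [url for url, _ in evidence])

# Stubbed example (a real deployment would call live search and LLM APIs):
fake_search = lambda q: [("https://example.org", "The Earth orbits the Sun.")]
fake_judge = lambda c, ev: ("supported", "Retrieved evidence confirms the claim.")
verdict = check_claim("The Earth orbits the Sun.", fake_search, fake_judge)
print(verdict.label, verdict.sources)  # → supported ['https://example.org']
```

Returning the reasoning and sources as first-class fields, rather than only a label, is what lets the interface show users why a conclusion was reached.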

Infrared


The Infrared project is looking to turn 6+ years of rigorous research into real-world impact. Our algorithms have been combined to create a searchable, AI-powered database that provides human trafficking survivors with access to their digital footprint and that of their trafficker. This digital footprint sheds light on where the victim has been, over what time period, and to which other victims they might be connected. The footprint is checked against common indicators of human trafficking, helping to corroborate and strengthen victims’ stories.

Using the Infrared tool, human trafficking survivors will be armed with new types of high-quality evidence, reducing the burden and trauma of re-sharing their stories. Furthermore, the product is designed to enhance survivors’ chances of success in accessing restorative justice (by way of criminal courts, civil courts, family courts, refugee tribunals, and other venues where evidence is needed).

We are looking to collaborate with non-profit organizations working with human trafficking survivors to enhance the quality of care and the opportunities they have to rebuild their lives.

Misinfo Datasets


Misinformation is a challenging societal issue, and effective mitigations are difficult to build because of data deficiencies. To address this problem, we have curated a growing collection of (mis)information datasets from the literature. From these, we evaluated the quality of all 36 datasets that consist of statements or claims.

Social Simulation

A multi-agent simulator based on DeepMind’s Concordia, a software library for LLM-based multi-agent simulations of real-world human experience.