Concordia

Teacher-Student Neurosymbolic Learning

Concordia is a teacher-student neurosymbolic framework in which the teacher is a probabilistic logical theory rather than a complex deep model. The framework is implemented in PyTorch. Concordia supports supervised, semi-supervised, and unsupervised training and has been applied to a variety of tasks, outperforming the relevant state of the art. In particular, Concordia outperforms DLP, Bi-LSTM, and DistilBERT on entity linking, and IARG and PSL-CAD on collective activity detection when using MobileNet and Inception-v3 as backbone networks. Concordia is also strictly more expressive than DLP and T-S in terms of the types of logical theories it supports.
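To make the teacher-student setup above concrete, the sketch below shows one possible training step in which a neural student is pulled towards the distribution produced by a probabilistic logical teacher, with the supervised term dropped when labels are unavailable. This is a minimal illustration only: the names `LogicalTeacher`, `training_step`, and `alpha` are assumptions made here for exposition and are not the Concordia API.

```python
# Illustrative sketch of a teacher-student training step in the spirit of
# Concordia. All class/function names here are hypothetical, not the
# actual Concordia interface.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LogicalTeacher:
    """Stand-in for a probabilistic logical theory: given the student's
    current beliefs, it returns a distribution over labels."""

    def predict(self, student_probs: torch.Tensor) -> torch.Tensor:
        # A real teacher would run probabilistic logical inference here;
        # as a placeholder we simply return the (detached) student beliefs.
        return student_probs.detach()


def training_step(student: nn.Module,
                  teacher: LogicalTeacher,
                  optimizer: torch.optim.Optimizer,
                  x: torch.Tensor,
                  y: torch.Tensor | None,
                  alpha: float = 0.5) -> float:
    """One step combining an optional supervised loss with a distillation
    term that pulls the student towards the teacher's distribution."""
    optimizer.zero_grad()
    logits = student(x)
    student_probs = F.softmax(logits, dim=-1)
    teacher_probs = teacher.predict(student_probs)

    # Distillation term: KL(teacher || student).
    distill = F.kl_div(F.log_softmax(logits, dim=-1), teacher_probs,
                       reduction="batchmean")
    # Supervised term only when labels are available; semi-supervised and
    # unsupervised training simply drop it.
    supervised = F.cross_entropy(logits, y) if y is not None else 0.0

    loss = alpha * supervised + (1.0 - alpha) * distill
    loss.backward()
    optimizer.step()
    return float(loss)
```

The weighting factor `alpha` (again, an assumption of this sketch) controls how strongly the logical teacher's guidance influences the student relative to the labelled data.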

Repository

Concordia

Relevant publications

2023

  1. ICML
    Parallel Neurosymbolic Integration with Concordia
    Jonathan Feldstein, Modestas Jurcius, and Efthymia Tsamoura
    In International Conference on Machine Learning (ICML), Honolulu, Hawaii, USA, 23-29 July 2023

2024

  1. Review paper
    Mapping the Neuro-Symbolic AI Landscape by Architectures: A Handbook on Augmenting Deep Learning Through Symbolic Reasoning
    Jonathan Feldstein, Paulius Dilkas, Vaishak Belle, and Efthymia Tsamoura
    2024