James Cussens
There appears to me to be a difficulty in this conclusion: that
happenings which depend upon an infinite number of cases cannot be
determined by a finite number of experiments; indeed nature has her
own habits, born from the return of causes, but only 'in general'. And
so, who will say whether a subsequent experiment will not stray
somewhat from the rule of all the preceding experiments, because of
the very mutabilities of things? [Letter from Leibniz to Bernoulli, 3
December 1703. Quoted in: Cussens, 'Probability and Statistics', in Antognazza (ed.), The Oxford Handbook of Leibniz, OUP, 2018.]
Research
Google scholar profile
University of York research
profile
GOBNILP software for exact Bayesian network learning
Recent papers

Kocacoban, D. and Cussens, J. Fast Online Learning in the Presence of Latent Variables. Digitale Welt 4, 37–42, 2020.
Alvaro H. C. Correia, James Cussens and Cassio de Campos. On Pruning for Score-Based Bayesian Network Structure Learning. Proc. AISTATS 2020 (to appear) and arXiv:1905.09943, May 2019.
D. Kocacoban and J. Cussens. Online Causal Structure Learning in the Presence of Latent Variables. 2019 18th IEEE International Conference on Machine Learning and Applications (ICMLA), Boca Raton, FL, USA, 2019, pp. 392-395, doi: 10.1109/ICMLA.2019.00073.
 Christopher M. Hatton, Lewis W. Paton, Dean McMillan, James Cussens, Simon Gilbody and Paul A. Tiffin. Predicting persistent depressive symptoms in older adults: a machine learning approach to personalised mental healthcare. Journal of Affective Disorders, 2018.
Zhenyu A. Liao, Charupriya Sharma, James Cussens and Peter van Beek. Finding All Bayesian Network Structures within a Factor of Optimal. Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19), 2019.

James Cussens. Markov Random Field MAP as Set Partitioning. Proceedings of the Ninth International Conference on Probabilistic Graphical Models (PGM'18), PMLR 72:85-96, 2018.

James Cussens. Finding Minimal Cost Herbrand Models with Branch-Cut-and-Price. arXiv:1808.04758, August 2018.
Milan Studený and James Cussens. Towards using the chordal graph polytope in learning decomposable models. International Journal of Approximate Reasoning, 88:259-281, September 2017.

James Cussens, Matti Järvisalo, Janne H. Korhonen and Mark Bartlett. Bayesian Network Structure Learning with Integer Programming: Polytopes, Facets, and Complexity. Journal of Artificial Intelligence Research, 58:185-229, 2017.
James Cussens, David Haws and Milan Studený. Polyhedral aspects of score equivalence in Bayesian network structure learning. Mathematical Programming, 164(1):285-324, July 2017. (Full-text, view-only version) (arXiv version)
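Several of the papers listed here concern exact Bayesian network structure learning with a decomposable score, the problem GOBNILP solves via integer programming. Purely as a toy illustration of the problem itself (not of GOBNILP's ILP formulation), the hypothetical sketch below brute-forces the highest-scoring DAG over three variables, using made-up local scores:

```python
from itertools import combinations, permutations

# Made-up local scores: scores[v][parent_set] is the score of giving
# variable v that parent set; higher is better. Real local scores would
# be computed from data (e.g. BDeu or BIC).
scores = {
    "A": {(): -10.0, ("B",): -9.0, ("C",): -9.5, ("B", "C"): -8.5},
    "B": {(): -12.0, ("A",): -11.0, ("C",): -10.5, ("A", "C"): -10.8},
    "C": {(): -11.0, ("A",): -10.0, ("B",): -10.2, ("A", "B"): -9.8},
}

def best_dag(scores):
    """Exact structure learning by brute force: for every topological
    order, each variable independently takes its best-scoring parent set
    drawn from its predecessors, which guarantees acyclicity."""
    variables = list(scores)
    best_total, best_parents = float("-inf"), None
    for order in permutations(variables):
        total, parents = 0.0, {}
        for i, v in enumerate(order):
            preds = order[:i]
            # Every subset of the predecessors is an admissible parent set.
            candidates = [tuple(sorted(c))
                          for r in range(len(preds) + 1)
                          for c in combinations(preds, r)]
            ps = max(candidates, key=lambda c: scores[v][c])
            parents[v] = ps
            total += scores[v][ps]
        if total > best_total:
            best_total, best_parents = total, parents
    return best_total, best_parents

total, parents = best_dag(scores)
# With the scores above, the optimum is C -> B, {B, C} -> A, total -30.0.
```

Because the score decomposes over variables, the per-variable choice is independent given an order; brute force over orders is only feasible for a handful of variables, which is what motivates the polytope and integer programming machinery studied in the papers above.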
Recent talks

Markov Random Field MAP as Set
Partitioning, PGM'18, Prague, 13 September 2018.

Towards
the Holy Grail in Machine Learning, CP'18 (Invited talk),
Lille, 29 August 2018.
Optimal Algorithms for Learning Bayesian Network Structures: Integer Linear Programming and Evaluations, UAI 2015 tutorial, Amsterdam, 12 July 2015.
Presentation
(by Changhe and then me) on YouTube
 Bayesian
network model selection using integer programming, Dept
of Statistics,
University of Oxford, 4 June 2015.
 Boole's
mathematical theory of logic and probability, Boole 200,
University of Utrecht, 8 May 2015.
 Leibniz's
new kind of Logic, Free University of Amsterdam, 6 May 2015.
Current PhD students

Zongyu Yin

Andrea Bassich (jointly with Simos Gerasimou)

John Burden (jointly with Victoria Hodge)

Felix Ulrich-Oltean (jointly with Peter Nightingale)

Yajie Gu

Lizzie Vialls

Sorush Lajevardi

Teny Handhayani

Durdane Kocacoban
Former students

Mark Balmer (MSc by Research)

Garo Panikian - Statistical inference of dynamical systems with application to modelling fish populations

Eman Aljohani - Informative priors for learning graphical models

Waleed Alsanie - Learning PRISM programs

Joanne Powell - PrediCtoR: Predicting the Recovery of Ancient DNA and Ancient Proteins (with Matthew Collins, Archaeology)

Adel Aloraini - Extending the graphical representation of KEGG pathways for a better understanding of prostate cancer using machine learning

Barnaby Fisher - Inductive Logic Programming and Mercury (MSc by Research)

Heather Maclaren - Inductive Logic Programming for Software Agents: Algorithms and Implementations
Software
 GOBNILP software for exact Bayesian network learning
 gPy is a collection of
Python modules for manipulating discrete hierarchical models
(including Bayesian nets). It was used to support the
teaching of Algorithms for Graphical
Models.
 The
MCMCMS (Markov chain Monte Carlo over Model Structures)
system
uses Stochastic Logic Programs (SLPs) to define priors for
Bayesian inference. The code was written by
Nicos Angelopoulos.

Pepl is an implementation of the Failure-Adjusted Maximisation (FAM) algorithm. This is an instance of the EM algorithm which produces maximum likelihood estimates for the parameters of SLPs. The code was written by Nicos Angelopoulos.

Aaron Bate, a final year student in this department, has
produced software for animating the construction of Prolog
proof trees (the software draws a graphical representation
of the proof tree). It uses SICStus Prolog and Tcl/Tk. You can download the software as a gzipped
tar file. I have included a simple Prolog SLP
interpreter which allows you to sample from a distribution
over proof trees and where each proof tree determines an
acyclic digraph (the structural element of a Bayesian
net). See the file slp_readme in the distribution
for an explanation.
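The distributed interpreter is written in Prolog, but the core idea, sampling a proof tree by choosing clauses according to their attached probabilities, is easy to sketch. The toy program below is hypothetical (it is not the code in the tar file): a small context-free SLP generating binary strings.

```python
import random

# A toy stochastic logic program: each predicate maps to a list of
# probability-labelled clauses, and each clause body is a list of subgoals.
# This particular program is made up for illustration:
#   0.4: s --> []    0.3: s --> [zero, s]    0.3: s --> [one, s]
slp = {
    "s": [(0.4, []), (0.3, ["zero", "s"]), (0.3, ["one", "s"])],
    "zero": [(1.0, [])],
    "one": [(1.0, [])],
}

def sample_proof(goal, rng):
    """Sample a proof tree for `goal`, choosing a clause for each subgoal
    with its attached probability. Returns (tree, derivation probability),
    where a tree is a pair (goal, list of subtrees)."""
    clauses = slp[goal]
    p, body = rng.choices(clauses, weights=[w for w, _ in clauses])[0]
    prob, subtrees = p, []
    for subgoal in body:
        subtree, q = sample_proof(subgoal, rng)
        subtrees.append(subtree)
        prob *= q
    return (goal, subtrees), prob

tree, prob = sample_proof("s", random.Random(0))
```

Each sampled proof tree comes back with its derivation probability, so the same machinery supports Monte Carlo estimation over structures; in the Bayesian-network application mentioned above, each proof tree is then mapped to an acyclic digraph.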
Projects
Teaching
Currently, at York:
Professional Activities
Programme chair
Editorial duties
Member
Invited speaker

Bilbao Data Science Workshop, Bilbao, November 2019

Graphical
Models: Conditional Independence and Algebraic Structures,
Munich, October 2019
CP 2018, Lille, 27-31 August, 2018
Workshop on Learning with Structured Data and Natural Language, Toulouse, 9-11 December, 2015.

Joint Workshop on Limit Theorems and Algebraic Statistics, Prague Stochastics 2014, August 25-29, 2014
 ICLP
workshop on Probabilistic logic programming, 17 July
2014

ILP-MLG-SRL 09

UKKDD-2007

AC05

ILP-04
Area chair/Senior PC
Co-organiser
PC member / Reviewer

NeurIPS 2020, AAAI-20, AISTATS 2020, UAI 2020, PGM 2020, ILP 2020

AAAI-19, AISTATS 2019, ICML-19, ILP 2019

NIPS-18, ICML-18, AAAI-18, AISTATS 2018, ICLR 2018, PGM 2018, UAI 2018, ILP 2018

NIPS-17, AAAI-17, AISTATS 2017, ICML 2017, ECML/PKDD 2017, UAI 2017, ILP 2017

NIPS-16, AAAI-16, KDD 2016, UAI 2016, IJCAI-16, ECML/PKDD 2016, ECAI 2016, StarAI 2016, PGM 2016, PLP 2016

NIPS-15, ECML/PKDD 2015, UAI 2015, AAAI-15, ILP 2015, PLP 2015

NIPS-14, ICML 2014, UAI 2014, ILP 2014, ECML/PKDD 2014, AAAI-14, KR 2014, ECAI'14, BUDA 2014

NIPS-13, ICML 2013, UAI 2013, ILP 2013, ECML/PKDD 2013, EMNLP 2013, NAACL-HLT 2013, LML workshop at ECML/PKDD 2013

NIPS-12, ICML 2012, UAI 2012, ILP 2012, ECML/PKDD 2012, AAAI-12, KR 2012, StaRAI-12, CoCoMile 2012, ACL 2012, Cognitive 2012

ICML 2011, UAI 2011, ILP 2011, ECML/PKDD 2011

ILP 2010, AAAI-10, ECAI-2010, ECML/PKDD 2010, SBIA 2010

NIPS-09, EACL-09, ICML 09, ILP-09, SRL-09, Terminologie et intelligence artificielle (TIA 2009), IJCAI-09, AISTATS 09, CoNLL 09, NAACL-HLT 09, EACL Cognitive 2009, NAACL-2009 Workshop on Unsupervised and Minimally Supervised Learning of Lexical Semantics

NIPS-08, ICML 08, ILP 08, ECAI 08, SBIA 08, CoNLL 08

ICML 07, UAI-07, ILP-07, ACL-2007 Workshop on Cognitive Aspects of Computational Language Acquisition, TIA'07

EACL-06, UAI-06, ILP-06, AAAI-06, SRL-06, CoNLL-06

IJCAI-05, ICML-05, UAI-05, ILP-05, ECML/PKDD-05, LLLL, CoNLL-05, TIA-05

NIPS-04, ICML-04, UAI-04, ECML-04, CIFT-04, SRL-04, CoNLL-04, Psychocomputational models ...

ICML-03, UAI-03, ILP-03, CoNLL-03, Acquisition, apprentissage et ..., SRL-2003, ECML-03

ICML-02, UAI-02, ILP-02, CIFT-02, CoNLL-02

ILP-01, ECML-01, CoNLL-01, LLL-01

ILP-00, CoNLL-00, LLL-00

ILP-99, LLL-99

ILP-98
Miscellaneous
Administration
 Chair of the Board of Studies
Personal history
Contact information
Address

Dept of Computer Science, University of York, York, YO10 5GE, UK 
Direct phone

+44 (0)1904 325371 
Email

firstname.lastname AT york DOT ac DOT uk
