Šourek, Gustav and Aschenbrenner, Vojtěch and Železný, Filip and Schockaert, Steven and Kuželka, Ondřej

Šourek, G., Aschenbrenner, V., Železný, F., Schockaert, S., & Kuželka, O. (2018). Lifted Relational Neural Networks: Efficient Learning of Latent Relational Structures. J. Artif. Int. Res., 62(1), 69–100.

Abstract

We propose a method to combine the interpretability and expressive power of first-order logic with the effectiveness of neural network learning. In particular, we introduce a lifted framework in which first-order rules are used to describe the structure of a given problem setting. These rules are then used as a template for constructing a number of neural networks, one for each training and testing example. As the different networks corresponding to different examples share their weights, these weights can be efficiently learned using stochastic gradient descent. Our framework provides a flexible way for implementing and combining a wide variety of modelling constructs. In particular, the use of first-order logic allows for a declarative specification of latent relational structures, which can then be efficiently discovered in a given data set using neural network learning. Experiments on 78 relational learning benchmarks clearly demonstrate the effectiveness of the framework.
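As a rough illustration of the template-and-grounding idea summarized above, the sketch below grounds a toy two-rule template into one small network per example and trains the shared template weights by stochastic gradient descent. The rules, the molecule-like toy data, and all names are hypothetical simplifications for illustration, not the authors' actual LRNN implementation.

# Minimal sketch, assuming a toy two-rule template and toy relational data.
# First-order rules act as a template, each example is grounded into its own
# small network, and all grounded networks share the template's weights,
# which are learned jointly with stochastic gradient descent.
import math
import random

# Shared template weights: one weight per rule in the (hypothetical) template, e.g.
#   w1 : class(X) :- bond(X, Y), atom_c(Y)
#   w2 : class(X) :- bond(X, Y), atom_o(Y)
weights = [0.1, -0.1]
bias = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def ground_network(example):
    # Ground the template on one example: each rule contributes a feature
    # counting how often its body is satisfied in the example's relational data.
    bonds, atom_c, atom_o = example["bonds"], example["atom_c"], example["atom_o"]
    f1 = sum(1.0 for (_, y) in bonds if y in atom_c)  # groundings of rule 1
    f2 = sum(1.0 for (_, y) in bonds if y in atom_o)  # groundings of rule 2
    return [f1, f2]

# Toy relational examples (hypothetical "molecules") with binary labels.
examples = [
    {"bonds": [(1, 2), (1, 3)], "atom_c": {2, 3}, "atom_o": set(), "label": 1.0},
    {"bonds": [(1, 2)], "atom_c": set(), "atom_o": {2}, "label": 0.0},
]

# SGD over the shared weights: every example's grounded network
# backpropagates into the same template parameters.
lr = 0.5
for epoch in range(200):
    random.shuffle(examples)
    for ex in examples:
        feats = ground_network(ex)
        y_hat = sigmoid(sum(w * f for w, f in zip(weights, feats)) + bias)
        err = y_hat - ex["label"]  # gradient of cross-entropy loss w.r.t. the logit
        weights = [w - lr * err * f for w, f in zip(weights, feats)]
        bias -= lr * err

for ex in examples:
    score = sigmoid(sum(w * f for w, f in zip(weights, ground_network(ex))) + bias)
    print(ex["label"], round(score, 3))

Because every grounded network reads the same weights list, a gradient step on any single example updates the lifted template itself; this is the weight sharing across per-example networks that the abstract refers to.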

Citation

@article{lrnns2018,
  author = {Šourek, Gustav and Aschenbrenner, Vojtěch and Železný, Filip and Schockaert, Steven and Kuželka, Ondřej},
  title = {Lifted Relational Neural Networks: Efficient Learning of Latent Relational Structures},
  year = {2018},
  issue_date = {May 2018},
  publisher = {AI Access Foundation},
  address = {El Segundo, CA, USA},
  volume = {62},
  number = {1},
  issn = {1076-9757},
  url = {https://doi.org/10.1613/jair.1.11203},
  doi = {10.1613/jair.1.11203},
  journal = {J. Artif. Int. Res.},
  month = may,
  pages = {69--100},
  numpages = {32}
}