Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020), pre-proceedings



Authors

Hongyuan Mei, Guanghui Qin, Minjie Xu, Jason Eisner

Abstract

Learning how to predict future events from patterns of past events is difficult when the set of possible event types is large. Many of the patterns detected in the data by training an everything-affects-everything model will be spurious. To exploit known structure, we propose using a deductive database to track facts over time, where each fact has a time-varying state: a vector computed by a neural net whose topology is determined by the fact's provenance and experience. The possible events at any time correspond to structured facts, whose probabilities are modeled along with their states. In both synthetic and real-world domains, we show that neural models derived from concise Datalog programs achieve better generalization by encoding appropriate domain knowledge into the model architecture.
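To give a flavor of the idea, the following is a minimal, hypothetical sketch (not the authors' implementation, and the class and rule names are invented for illustration): a Datalog-style fact store in which each fact carries a time-varying state vector, observed events update the states of affected facts, and forward chaining derives new facts whose existence depends on the facts currently in the database.

```python
import math

class Fact:
    """A Datalog-style fact carrying a time-varying state vector."""
    def __init__(self, name, dim=4):
        self.name = name
        self.state = [0.0] * dim  # time-varying embedding

    def update(self, event_vec):
        # Toy stand-in for a learned neural update: squash the sum of the
        # current state and the event's input vector through tanh.
        self.state = [math.tanh(s + e) for s, e in zip(self.state, event_vec)]

class Database:
    """Deductive database: facts plus rules deriving new facts from old."""
    def __init__(self):
        self.facts = {}
        self.rules = []  # (head, body): head is derivable when all body facts hold

    def assert_fact(self, name):
        self.facts.setdefault(name, Fact(name))

    def add_rule(self, head, body):
        self.rules.append((head, body))

    def deduce(self):
        # Forward chaining to a fixed point: add any rule head whose
        # body facts are all present in the database.
        changed = True
        while changed:
            changed = False
            for head, body in self.rules:
                if head not in self.facts and all(b in self.facts for b in body):
                    self.assert_fact(head)
                    changed = True

    def observe_event(self, fact_name, event_vec):
        # An observed event updates the state of the corresponding fact,
        # then re-runs deduction so newly derivable facts appear.
        self.assert_fact(fact_name)
        self.facts[fact_name].update(event_vec)
        self.deduce()

db = Database()
db.add_rule("colleagues(a,b)", ["works_at(a,X)", "works_at(b,X)"])
db.assert_fact("works_at(a,X)")
db.observe_event("works_at(b,X)", [0.5, -0.2, 0.1, 0.0])
print(sorted(db.facts))  # "colleagues(a,b)" is deduced once both body facts hold
```

In the paper's actual model, the update and the event probabilities are parameterized neural networks whose topology follows the Datalog program's provenance structure; here both are reduced to a fixed toy transition to keep the sketch self-contained.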