Second International Workshop on Symbolic-Neural Learning (SNL-2018)

July 5-6, 2018
Nagoya Congress Center (Nagoya, Japan)

Learning from Noisy Transition Data

Yin Jun Phua (Tokyo Institute of Technology) and Katsumi Inoue (Tokyo Institute of Technology/National Institute of Informatics)

Abstract:

Real-world data are often noisy and fuzzy. Most traditional logic-based machine learning methods require the data to be discretized or otherwise pre-processed before they can produce useful output. This shortcoming often limits their applicability to real-world data. On the other hand, neural networks are generally known to be robust against noisy data. However, a fully trained neural network does not yield easily interpretable rules that explain the underlying model. In this paper, we propose the Differentiable Learning from Interpretation Transition (∂-LFIT) algorithm, which not only outputs logic programs that fully explain the observed state transitions, but can also learn from data containing noise and error.
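To make the general idea concrete, the following is a minimal, hypothetical sketch (not the authors' ∂-LFIT implementation) of learning rule-like structure from noisy state-transition pairs with a differentiable model trained by gradient descent; the toy program over atoms p, q, r, the noise level, the weight threshold, and all variable names are assumptions made purely for illustration.

```python
# Hypothetical sketch: fit one differentiable (logistic) unit per next-state
# atom on noisy (state, next_state) pairs, then read off rule-like bodies by
# thresholding the learned weights. Illustration only, not ∂-LFIT itself.
import numpy as np

rng = np.random.default_rng(0)

# Assumed ground-truth transition program over atoms {p, q, r}:
#   p(t+1) <- q(t),   q(t+1) <- p(t), r(t),   r(t+1) <- not p(t)
def true_next(s):
    p, q, r = s
    return np.array([q, p * r, 1 - p])

# Generate transitions, then flip each next-state atom with probability 0.05.
states = rng.integers(0, 2, size=(500, 3))
nexts = np.array([true_next(s) for s in states])
flips = rng.random(nexts.shape) < 0.05
nexts = np.abs(nexts - flips.astype(int))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One logistic unit per target atom over the current-state atoms
# (a crude differentiable stand-in for weighted rule bodies).
W = np.zeros((3, 3))
b = np.zeros(3)
lr = 0.5
for _ in range(2000):
    pred = sigmoid(states @ W.T + b)       # predicted next-state probabilities
    grad = (pred - nexts) / len(states)    # gradient of mean cross-entropy
    W -= lr * (grad.T @ states)
    b -= lr * grad.sum(axis=0)

# Extract rule-like structure: a large positive weight suggests a positive
# body literal, a large negative weight a negated one.
atoms = ["p", "q", "r"]
for i, head in enumerate(atoms):
    body = [("" if W[i, j] > 0 else "not ") + atoms[j]
            for j in range(3) if abs(W[i, j]) > 2.0]
    print(f"{head}(t+1) <- " + (", ".join(body) if body else "true"))
```

Because the model is trained by minimizing a smooth loss over all transitions, the occasional flipped atoms only dampen the learned weights rather than breaking the extracted rules, which is the kind of noise tolerance the abstract attributes to the differentiable approach.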