Extending Path-Dependent NJ-ODEs to Noisy Observations and a Dependent Observation Framework

William Andersson · Jakob Heiss · Florian Krach · Josef Teichmann

Abstract

The Path-Dependent Neural Jump Ordinary Differential Equation (PD-NJ-ODE) is a model for predicting continuous-time stochastic processes with irregular and incomplete observations. In particular, the method learns optimal forecasts given irregularly sampled time series of incomplete past observations. So far, the process itself and the coordinate-wise observation times were assumed to be independent, and observations were assumed to be noiseless. In this work we discuss two extensions that lift these restrictions, and we provide theoretical guarantees as well as empirical examples for them. In particular, we lift the independence assumption by extending the theory to the much more realistic setting of conditional independence, without any need to change the algorithm. Moreover, we introduce a new loss function that allows us to deal with noisy observations, and we explain why the previously used loss function did not lead to a consistent estimator.
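
To make the role of the loss function concrete, below is a minimal PyTorch-style sketch contrasting a "square of the sum" jump loss (the structure of the previously used NJ-ODE objective) with a separated "sum of squares" variant in which each term is a plain squared error. The tensor names (obs, mask, y_pre, y_post) are placeholders of our own choosing, and the separated variant is an illustration of the idea rather than necessarily the exact loss introduced in the paper.

import torch


def square_of_sum_loss(obs, y_pre, y_post, mask):
    """(|M*(O - Y_t)| + |M*(O - Y_{t-})|)^2, averaged over observations.

    obs:    observed values at observation times, shape (n_obs, d)
    y_pre:  model output just before the jump,    shape (n_obs, d)
    y_post: model output just after the jump,     shape (n_obs, d)
    mask:   1 for observed coordinates, 0 else,   shape (n_obs, d)
    """
    err_post = torch.linalg.vector_norm(mask * (obs - y_post), dim=-1)
    err_pre = torch.linalg.vector_norm(mask * (obs - y_pre), dim=-1)
    return ((err_post + err_pre) ** 2).mean()


def sum_of_squares_loss(obs, y_pre, y_post, mask):
    """|M*(O - Y_t)|^2 + |M*(O - Y_{t-})|^2, averaged over observations.

    Dropping the cross term 2*|O - Y_t|*|O - Y_{t-}| decouples the two
    objectives: the pre-jump term becomes an ordinary squared error, whose
    minimiser is the conditional expectation of the observation, and hence
    of the underlying process if the noise has conditional mean zero.
    """
    err_post = torch.linalg.vector_norm(mask * (obs - y_post), dim=-1) ** 2
    err_pre = torch.linalg.vector_norm(mask * (obs - y_pre), dim=-1) ** 2
    return (err_post + err_pre).mean()

The point of the comparison is the cross term: in the coupled form, the pre-jump error is multiplied by a noise-dependent factor, which is one way the inconsistency under noisy observations can arise, whereas the separated form keeps the forecast term an unbiased L2 objective.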