Speaker: Omar DeGuchy
Title: Neural Ordinary Differential Equations
Abstract: Deep learning has emerged as an effective tool in a variety of applications, including computer vision, image processing, and natural language processing. Typically, deep learning architectures are composed of either fully connected layers or convolutions. In either case, each layer amounts to some form of matrix multiplication followed by an activation function, and the goal is to model highly non-linear functions by learning the appropriate entries of these matrices. The winners of a 2018 NeurIPS best paper award propose using a differential equation to take the place of these typical neural network structures. We will review and implement their findings using the Julia package DiffEqFlux.
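As a preview of the implementation, below is a minimal sketch of how a neural ODE might be set up with DiffEqFlux, where an ODE solve replaces a stack of discrete layers. The layer sizes, time span, and solver choice here are illustrative assumptions, and the exact API varies across DiffEqFlux versions.

    # A minimal neural ODE sketch, assuming DiffEqFlux with Flux-style layers
    # (API details differ across package versions).
    using DiffEqFlux, OrdinaryDiffEq, Flux

    # The right-hand side du/dt = f(u) is itself a small neural network.
    dudt = Chain(Dense(2, 16, tanh), Dense(16, 2))

    # Integrate the learned dynamics over a fixed time span with an
    # off-the-shelf ODE solver in place of discrete layer-by-layer updates.
    tspan = (0.0f0, 1.0f0)
    node = NeuralODE(dudt, tspan, Tsit5(), saveat = 0.1f0)

    # A forward pass: solve the ODE starting from the input state u0.
    u0 = Float32[2.0, 0.0]
    sol = node(u0)

Training then adjusts the weights of `dudt` so that the solver's output matches the data, in place of learning the weights of fixed discrete layers.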
** Refreshments at 1:30pm; talk at 2:00pm