Signed Cumulative Distribution Transform for Machine Learning in 1D
Classification and estimation problems are at the core of machine learning. In this talk we will see a new mathematical signal transform that renders data easy to classify or estimate, based on the classical theory of mass transportation originated by Monge. We will review the existing Cumulative Distribution Transform and then extend it to a more general measure-theoretic framework to define the new transform, the Signed Cumulative Distribution Transform. We will look at both the forward (analysis) and inverse (synthesis) formulas for the transform, and describe several of its properties, including translation, scaling, convexity, and isometry. Finally, we will demonstrate two applications of the transform: classifying (detecting) signals under random displacements, and estimating signal parameters under such displacements.
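To give a concrete feel for the transform, below is a minimal numerical sketch of the (unsigned) Cumulative Distribution Transform of a nonnegative 1D signal, assuming a uniform reference on (0, 1); the function name `cdt`, the grid, and the quantile-level discretization are illustrative choices of this sketch, not the exact formulation presented in the talk. It demonstrates the translation property mentioned above: displacing a signal by a constant shifts its transform by that same constant, which is why signals differing only by random displacements become easy to separate in transform space.

import numpy as np

def cdt(signal, t, n_levels=500):
    """Cumulative Distribution Transform of a nonnegative 1D signal,
    taken here with respect to a uniform reference on (0, 1).

    Numerical sketch: normalize the signal to a probability density,
    build its CDF by cumulative summation, then invert the CDF at
    uniformly spaced quantile levels (i.e. return the quantile function).
    """
    s = np.asarray(signal, dtype=float)
    dt = t[1] - t[0]                      # assumes a uniform grid
    cdf = np.cumsum(s) * dt
    cdf /= cdf[-1]                        # normalize total mass to 1
    # quantile levels strictly inside (0, 1), avoiding the degenerate
    # endpoints of the CDF on a finite grid
    levels = np.linspace(0.01, 0.99, n_levels)
    return np.interp(levels, cdf, t)      # invert the CDF by interpolation

# Translation property: shifting the signal by mu shifts its transform by mu.
t = np.linspace(-10, 10, 2001)
bump = np.exp(-0.5 * (t - 1.0) ** 2)      # Gaussian bump centered at 1.0
shifted = np.exp(-0.5 * (t - 3.5) ** 2)   # the same bump displaced by 2.5
print(np.allclose(cdt(shifted, t) - cdt(bump, t), 2.5, atol=1e-3))  # True

The signed case discussed in the talk handles signals that take negative values as well, which this sketch does not attempt to cover.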