PHYSICS GRADUATE STUDENT SYMPOSIUM
Information Divergence Estimation in Signal Processing and Machine Learning
Information divergence is a measure of the difference between probability distributions and is important in machine learning, signal processing, statistics, and information theory. Special cases of divergence include the common measures of information known as entropy and mutual information. This talk presents multiple applications of divergence functional estimation, primarily in the context of sunspot image classification, focusing on the problems of dimensionality reduction, extending existing machine learning tasks to probability distributions as features, and estimating the optimal probability of error (the Bayes error) of a classification problem. We then present a simple, computationally tractable non-parametric estimator of a wide variety of divergence functionals that achieves parametric convergence rates under certain smoothness conditions. This estimator is demonstrated by estimating bounds on the Bayes error for a classical machine learning data set.
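To make the idea of estimating a divergence directly from samples concrete, the sketch below shows a standard k-nearest-neighbor plug-in estimate of the Kullback-Leibler divergence between two sample sets. This is an illustrative example of non-parametric divergence estimation in general, not the estimator presented in the talk; the function name `knn_kl_divergence` and the parameter choices are hypothetical.

```python
# Minimal k-nearest-neighbor plug-in estimate of the KL divergence D(P || Q)
# from samples x ~ P and y ~ Q.  Illustrative sketch only (a Wang-Kulkarni-Verdu
# style estimator); it assumes continuous distributions with no repeated points.
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Estimate D(P || Q) from samples x (shape (n, d)) and y (shape (m, d))."""
    x = np.atleast_2d(x)
    y = np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # Distance from each x_i to its k-th nearest neighbor among the other x's
    # (query with k+1 because the point itself is returned first).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # Distance from each x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Plug-in estimate: compare nearest-neighbor radii in the two samples.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

# Usage: two unit-variance Gaussians with means 0 and 1; the true KL
# divergence is (1 - 0)^2 / 2 = 0.5, so the printed value should be near 0.5.
rng = np.random.default_rng(0)
p_samples = rng.normal(0.0, 1.0, size=(5000, 1))
q_samples = rng.normal(1.0, 1.0, size=(5000, 1))
print(knn_kl_divergence(p_samples, q_samples, k=5))
```

Divergence estimates of this kind are what make empirical bounds on the Bayes error possible: the bound is computed from the estimated divergence between the class-conditional sample distributions rather than from any parametric model.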
Lunch will be served in the Don Meyer Common Room at 11:50 A.M.
Speaker: Kevin Moon