Conformal prediction and testing
18th December 2020, 4:00 pm – 5:00 pm
Conformal prediction is an area of machine learning that supplements predictions with provably valid information about their accuracy and reliability. This validity is guaranteed under the IID assumption, standard in machine learning and much of nonparametric statistics: the data are assumed to be generated independently from the same distribution. An interesting application of conformal prediction is testing the IID assumption itself. My plan is to introduce conformal prediction, define some of its validity properties, and describe procedures for testing the IID assumption and for raising an alarm as soon as it ceases to be satisfied. Such procedures can be used to decide when to retrain prediction algorithms.
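To illustrate the kind of validity guarantee the abstract refers to, here is a minimal sketch of split (inductive) conformal prediction for regression, one standard variant of the method. Everything here (the synthetic data, the least-squares model, the absolute-residual nonconformity score) is an illustrative assumption, not the speaker's construction; the key point is that the resulting interval covers the true label with probability at least 1 − α whenever the data are IID (in fact, exchangeable).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic IID regression data (an assumed toy example): y = 2x + noise
n = 200
x = rng.uniform(0.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 0.1, n)

# Split into a proper training set and a calibration set
x_tr, y_tr = x[:100], y[:100]
x_cal, y_cal = x[100:], y[100:]

# Fit any point predictor on the training half; here, a least-squares line
slope, intercept = np.polyfit(x_tr, y_tr, 1)
predict = lambda t: slope * t + intercept

# Nonconformity scores: absolute residuals on the calibration set
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile at miscoverage level alpha: the k-th smallest score
# with k = ceil((m + 1)(1 - alpha)), where m is the calibration size
alpha = 0.1
m = len(scores)
k = int(np.ceil((m + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]

# Prediction interval for a new point; under the IID assumption it
# contains the true label with probability at least 1 - alpha
x_new = 0.5
lo, hi = predict(x_new) - q, predict(x_new) + q
```

If the IID assumption fails after deployment (for example, the data distribution drifts), the empirical coverage of such intervals can fall below 1 − α, which is what the testing procedures in the talk are designed to detect.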