Steve Hanneke




Contact Information:
 Email: steve.hanneke@gmail.com
 Purdue Office: 2116K Lawson
 Address:
 Computer Science Department
 Purdue University
 West Lafayette, IN 47907 USA


I am an Assistant Professor in the Computer Science Department at Purdue University.
I work on topics in statistical learning theory.

Research Interests:
My general research interest is in systems that can improve their performance with experience, a topic known as machine learning. My focus is on the statistical analysis of machine learning. The essential questions I am interested in answering are “what can be learned from empirical observation / experimentation,” and “how much observation / experimentation is necessary and sufficient to learn it?” This overall topic intersects with several academic disciplines, including statistical learning theory, artificial intelligence, statistical inference, algorithmic and statistical information theories, probability theory, philosophy of science, and epistemology.


About me:
I am an Assistant Professor in the Computer Science Department at Purdue University. Prior to joining Purdue, I was a Research Assistant Professor at the Toyota Technological Institute at Chicago (TTIC) from 2018 to 2021, and an independent scientist working in Princeton from 2012 to 2018, aside from a brief one-semester stint as a Visiting Lecturer at Princeton University in 2018. Before that, from 2009 to 2012, I was a Visiting Assistant Professor in the Department of Statistics at Carnegie Mellon University, also affiliated with the Machine Learning Department. I received my PhD in 2009 from the Machine Learning Department at Carnegie Mellon University, co-advised by Eric Xing and Larry Wasserman. My thesis work was on the theoretical foundations of active learning. From 2002 to 2005, I was an undergraduate studying Computer Science at the University of Illinois at Urbana-Champaign (UIUC), where I worked on semi-supervised learning with Prof. Dan Roth and the students in the Cognitive Computation Group. Prior to that, I studied Computer Science at Webster University in St. Louis, MO, where I played around with neural networks and classic AI a bit.
Note: A speaker bio for presentations can be found here.


Recent News and Activities:


Teaching:
Purdue University:
Fall 2022: CS 59300-MLT, Machine Learning Theory.
Spring 2022, 2023: CS 37300, Data Mining and Machine Learning.
Fall 2021: CS 59200-MLT, Machine Learning Theory.
Princeton University:
Spring 2018: ORF 525, Statistical Learning and Nonparametric Estimation.
Carnegie Mellon University:
Spring 2012: 36-752, Advanced Probability Overview.
Fall 2011: 36-755, Advanced Statistical Theory I.
Spring 2011: 36-752, Advanced Probability Overview.
Fall 2010 Mini 1: 36-781, Advanced Statistical Methods I: Active Learning.
Fall 2010 Mini 2: 36-782, Advanced Statistical Methods II: Advanced Topics in Machine Learning Theory.
Spring 2010: 36-754, Advanced Probability II: Stochastic Processes.
Fall 2009: 36-752, Advanced Probability Overview.


A Survey of Theoretical Active Learning:
Theory of Active Learning. [pdf][ps]

This is a survey of some of the recent advances in the theory of active learning, with particular emphasis on label complexity guarantees for disagreement-based methods.
The current version (v1.1) was updated on September 22, 2014.
A few recent significant advances in active learning not yet covered in the survey: [ZC14], [WHE-Y15], [HY15].
An abbreviated version of this survey appeared in the Foundations and Trends in Machine Learning series, Volume 7, Issues 2-3, 2014.
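To give a rough sense of what a disagreement-based method looks like, here is a minimal sketch in the spirit of the CAL algorithm: maintain the set of hypotheses consistent with the labels observed so far, and spend a label query only on points where those hypotheses disagree. The toy hypothesis class (thresholds on [0, 1]), the noiseless oracle, and all function names below are illustrative assumptions for this sketch, not code from the survey.

# A minimal sketch of disagreement-based (CAL-style) active learning,
# illustrated with a toy class of threshold classifiers on [0, 1].
# Everything here (hypothesis class, data, names) is an illustrative assumption.
import random


def cal_active_learner(pool, oracle, thresholds):
    """Query a label only when the hypotheses still consistent disagree on x."""
    version_space = list(thresholds)   # thresholds consistent with labels so far
    num_queries = 0
    for x in pool:
        predictions = {1 if x >= t else 0 for t in version_space}
        if len(predictions) > 1:       # x lies in the disagreement region: query
            y = oracle(x)
            num_queries += 1
            version_space = [t for t in version_space
                             if (1 if x >= t else 0) == y]
        # otherwise every remaining hypothesis agrees, so the label is inferred
    return version_space, num_queries


if __name__ == "__main__":
    random.seed(0)
    true_t = 0.37

    def oracle(x):                     # noiseless labeler realized by a threshold
        return 1 if x >= true_t else 0

    pool = [random.random() for _ in range(2000)]   # unlabeled pool
    grid = [i / 1000 for i in range(1001)]          # finite hypothesis class
    vs, queries = cal_active_learner(pool, oracle, grid)
    print(f"queried {queries} of {len(pool)} points; "
          f"surviving thresholds in [{min(vs):.3f}, {max(vs):.3f}]")

In this toy setting the number of queries grows only logarithmically with the pool size, which is the kind of label complexity saving the disagreement-based analyses in the survey quantify in far greater generality.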


Selected Recent Works:


Articles in Preparation:


All Publications:
(authors are listed in alphabetical order, except sometimes when a student author is listed first).

2023

2022

2021

2020

2019

2018

2016

2015

2014

2013

2012

2011

2010

2009

2008

2007

2006

