Steve Hanneke
Contact Information:
Email: [email protected]
Purdue Office: 3121 DSAI
Address:
Computer Science Department
Purdue University
West Lafayette, IN 47907 USA
I am an Assistant Professor in the Computer Science Department at Purdue University, working on topics in statistical learning theory.
Research Interests:
My general research interest is in systems that can improve their performance with experience, a topic known as machine learning. My focus is on the statistical analysis of machine learning. The essential questions I am interested in answering are “what can be learned from empirical observation / experimentation,” and “how much observation / experimentation is necessary and sufficient to learn it?”
This overall topic intersects with several academic disciplines, including statistical learning theory, artificial intelligence, statistical inference, algorithmic and statistical information theories, probability theory, philosophy of science, and epistemology.
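To make the second question concrete: in the classical PAC model it has an exact answer. The following is a minimal sketch in LaTeX notation, following the bound from my paper “The Optimal Sample Complexity of PAC Learning” (JMLR 2016), listed below; here d denotes the VC dimension of the concept class.

% Number of i.i.d. labeled examples both necessary and sufficient to
% achieve error at most epsilon, with probability at least 1 - delta,
% for realizable PAC learning of a concept class of VC dimension d:
\[
  M(\varepsilon, \delta) \;=\; \Theta\!\left( \frac{1}{\varepsilon}\left( d + \log\frac{1}{\delta} \right) \right).
\]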
About me:
Prior to joining Purdue in Fall 2021, I was a Research Assistant Professor at the Toyota Technological Institute at Chicago (TTIC) from 2018 to 2021, and an independent scientist in Princeton from 2012 to 2018, aside from a brief one-semester stint as a Visiting Lecturer at Princeton University in 2018.
Before that, from 2009 to 2012, I was a Visiting Assistant Professor in the Department of Statistics at Carnegie Mellon University, also affiliated with the Machine Learning Department. I received my PhD in 2009 from the Machine Learning Department at Carnegie Mellon University, co-advised by Eric Xing and Larry Wasserman.
My thesis work was on the theoretical foundations of active learning.
From 2002 to 2005, I was an undergraduate studying Computer Science at the University of Illinois at Urbana-Champaign (UIUC), where I worked on semi-supervised learning with Prof. Dan Roth and the students in the Cognitive Computation Group. Prior to that, I studied Computer Science at Webster University in St. Louis, MO, where I played around with neural networks and classic AI a bit.
Note: A speaker bio for presentations can be found here.
Recent News and Activities:
- Our paper Optimal Mistake Bounds for Transductive Online Learning was runner up for the NeurIPS 2025 Best Paper Award.
- I will serve as Program Committee Chair (with Tor Lattimore) for the 39th Annual Conference on Learning Theory (COLT 2026), to be held in San Diego.
- Our paper Online Learning with Simple Predictors and a Combinatorial Characterization of Minimax in 0/1 Games was runner up for the COLT 2021 Best Paper Award.
- In Fall 2021, I joined Purdue University as an Assistant Professor in Computer Science.
- Received the Best Paper Award at ALT 2021 for our paper “Stable Sample Compression Schemes: New Applications and an Optimal SVM Margin Bound”.
- New manuscript “A Theory of Universal Learning” posted to the arXiv.
- Received the Best Paper Award at COLT 2020 for our paper “Proper Learning, Helly Number, and an Optimal SVM Bound”.
- Presented (with Rob Nowak) an ICML 2019 Tutorial on Active Learning: From Theory to Practice. [slides]
- Organized the ALT 2019 workshop: When Smaller Sample Sizes Suffice for Learning.
- Our paper VC Classes are Adversarially Robustly Learnable, but Only Improperly received a Best Student Paper Award at COLT 2019.
- In Fall 2018, I joined the Toyota Technological Institute at Chicago (TTIC) as a Research Assistant Professor.
- In Spring 2018, I taught ORF 525 “Statistical Learning and Nonparametric Estimation” at Princeton University.
- My ICML 2007 paper “A Bound on the Label Complexity of Agnostic Active Learning” received Honorable Mention for the ICML 2017 Test of Time Award.
- Program Committee Chair (with Lev Reyzin) for the 28th International Conference on Algorithmic Learning Theory (ALT 2017), held October 15-17 in Kyoto, Japan. See our published proceedings.
- New manuscript “Learning Whenever Learning is Possible: Universal Learning under General Stochastic Processes” posted to the arXiv.
Teaching:
Purdue University:
Fall 2022, 2023, 2024, 2025: CS 59300-MLT, Machine Learning Theory.
Spring 2022, 2023, 2024, 2025: CS 37300, Data Mining and Machine Learning.
Fall 2021: CS 59200-MLT, Machine Learning Theory.
Princeton University:
Spring 2018: ORF 525, Statistical Learning and Nonparametric Estimation.
Carnegie Mellon University:
Spring 2012: 36-752, Advanced Probability Overview.
Fall 2011: 36-755, Advanced Statistical Theory I.
Spring 2011: 36-752, Advanced Probability Overview.
Fall 2010 Mini 1: 36-781, Advanced Statistical Methods I: Active Learning.
Fall 2010 Mini 2: 36-782, Advanced Statistical Methods II: Advanced Topics in Machine Learning Theory.
Spring 2010: 36-754, Advanced Probability II: Stochastic Processes.
Fall 2009: 36-752, Advanced Probability Overview.
A Survey of Theoretical Active Learning:
Theory of Active Learning. [pdf][ps]
This is a survey of some of the recent advances in the theory of active learning, with particular emphasis on label complexity guarantees for disagreement-based methods (the central definition behind these guarantees is sketched at the end of this section).
The current version (v1.1) was updated on September 22, 2014.
A few recent significant advances in active learning not yet covered in the survey:
[ZC14], [WHE-Y15], [HY15], [H25].
An abbreviated version of this survey appeared in the Foundations and Trends in Machine Learning series, Volume 7, Issues 2-3, 2014.
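For readers new to this literature, here is a minimal sketch (in LaTeX notation) of the central quantity driving these disagreement-based label complexity guarantees, the disagreement coefficient. The notation below is the standard realizable-case setup; the survey gives the precise statements.

% For a target concept f* in class C, with marginal distribution P over X,
% define the r-ball around f* and the disagreement region of a set of
% classifiers H:
\[
  \mathrm{B}(f^{*}, r) = \{ h \in \mathbb{C} : P(x : h(x) \neq f^{*}(x)) \leq r \}, \quad
  \mathrm{DIS}(\mathcal{H}) = \{ x : \exists h, g \in \mathcal{H},\, h(x) \neq g(x) \}.
\]
% The disagreement coefficient is the worst-case ratio of the probability
% mass of the disagreement region to the radius:
\[
  \theta(\varepsilon) = \sup_{r > \varepsilon} \frac{P\big(\mathrm{DIS}(\mathrm{B}(f^{*}, r))\big)}{r}.
\]
% Roughly, disagreement-based methods such as CAL achieve label complexity
% \tilde{O}(\theta(\varepsilon)\, d \, \log(1/\varepsilon)) in the
% realizable case, where d is the VC dimension.

When θ(ε) is bounded as ε → 0, the log(1/ε) dependence is the sense in which active learning can yield exponential label savings over passive learning.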
Selected Recent Works:
- Cohen, A., Erez, L., Hanneke, S., Koren, T., Mansour, Y., Moran, S., and Zhang, Q. (2025). Sample Complexity of Agnostic Multiclass Classification: Natarajan Dimension Strikes Back.
- Hanneke, S. (2025). Agnostic Active Learning Is Always Better Than Passive Learning. Advances in Neural Information Processing Systems 38 (NeurIPS).
- Chase, Z., Hanneke, S., Moran, S., and Shafer, J. (2025). Optimal Mistake Bounds for Transductive Online Learning. Advances in Neural Information Processing Systems 38 (NeurIPS).
- Hanneke, S. and Yang, L. (2023). Bandit Learnability can be Undecidable. In Proceedings of the 36th Annual Conference on Learning Theory (COLT).
- Hanneke, S. and Kpotufe, S. (2022). A No-Free-Lunch Theorem for MultiTask Learning. The Annals of Statistics. Vol. 50 (6), pp. 3119-3143.
- Alon, N., Hanneke, S., Holzman, R., and Moran, S. (2021). A Theory of PAC Learnability of Partial Concept Classes. In Proceedings of the 62nd Annual Symposium on Foundations of Computer Science (FOCS).
- Hanneke, S. (2021). Learning Whenever Learning is Possible: Universal Learning under General Stochastic Processes. Journal of Machine Learning Research, Vol. 22 (130), pp. 1-116.
- Hanneke, S., Livni, R., and Moran, S. (2021). Online Learning with Simple Predictors and a Combinatorial Characterization of Minimax in 0/1 Games. In Proceedings of the 34th Annual Conference on Learning Theory (COLT).
- Bousquet, O., Hanneke, S., Moran, S., van Handel, R., and Yehudayoff, A. (2021). A Theory of Universal Learning. In Proceedings of the 53rd Annual ACM Symposium on Theory of Computing (STOC).
- Bousquet, O., Hanneke, S., Moran, S., and Zhivotovskiy, N. (2020). Proper Learning, Helly Number, and an Optimal SVM Bound. In Proceedings of the 33rd Annual Conference on Learning Theory (COLT).
- Montasser, O., Hanneke, S., and Srebro, N. (2019). VC Classes are Adversarially Robustly Learnable, but Only Improperly. In Proceedings of the 32nd Annual Conference on Learning Theory (COLT).
- Hanneke, S. (2016). The Optimal Sample Complexity of PAC Learning. Journal of Machine Learning Research, Vol. 17 (38), pp. 1-15.
- Hanneke, S. and Yang, L. (2015). Minimax Analysis of Active Learning. Journal of Machine Learning Research, Vol. 16 (12), pp. 3487-3602.
- Hanneke, S. (2012). Activized Learning: Transforming Passive to Active with Improved Label Complexity. Journal of Machine Learning Research, Vol. 13 (5), pp. 1469-1587.
Articles in Preparation:
- Nonparametric Active Learning, Part 1: Smooth Regression Functions. [pdf][ps].
- Nonparametric Active Learning, Part 2: Smooth Decision Boundaries.
- Active Learning with Identifiable Mixture Models. Joint work with Vittorio Castelli and Liu Yang.
- Blanchard, M., Hanneke, S., and Jaillet, P. (2023). Contextual Bandits and Optimistically Universal Learning. [pdf][arXiv].
- Blanchard, M., Hanneke, S., and Jaillet, P. (2023). Non-stationary Contextual Bandits and Universal Learning. [pdf][arXiv].
- Cohen, A., Erez, L., Hanneke, S., Koren, T., Mansour, Y., Moran, S., and Zhang, Q. (2025). Sample Complexity of Agnostic Multiclass Classification: Natarajan Dimension Strikes Back. [pdf][arXiv].
All Publications:
(Authors are listed in alphabetical order, except in a few cases where a student author is listed first.)
2025
- Hanneke, S. (2025). Agnostic Active Learning Is Always Better Than Passive Learning. Advances in Neural Information Processing Systems 38 (NeurIPS). (Oral presentation). [pdf][oral presentation][poster page].
- Chase, Z., Hanneke, S., Moran, S., and Shafer, J. (2025). Optimal Mistake Bounds for Transductive Online Learning. Advances in Neural Information Processing Systems 38 (NeurIPS). (Oral presentation). [pdf][oral presentation][poster page]. Runner up for the Best Paper Award.
- Hanneke, S., Karbasi, A., Mehrotra, A., and Velegkas, G. (2025). On Union-Closedness of Language Generation. Advances in Neural Information Processing Systems 38 (NeurIPS). [pdf][arXiv][official page].
- Attias, I., Hanneke, S., and Ramaswami, A. (2025). Tradeoffs between Mistakes and ERM Oracle Calls in Online and Transductive Online Learning. Advances in Neural Information Processing Systems 38 (NeurIPS). (Spotlight). [pdf][arXiv][official page].
- Hanneke, S., Shaeiri, A., and Wang, H. (2025). Non-Uniform Multiclass Learning with Bandit Feedback. Advances in Neural Information Processing Systems 38 (NeurIPS). [pdf][official page].
- Hanneke, S., Moran, S., and Thiessen, M. (2025). Marginal-Nonuniform PAC Learnability. Advances in Neural Information Processing Systems 38 (NeurIPS). [pdf][official page].
- Hanneke, S. and Xu, M. (2025). Universal Rates of ERM for Agnostic Learning. In Proceedings of the 38th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Shaeiri, A. and Zhang, Q. (2025). Universal Rates for Multiclass Learning with Bandit Feedback. In Proceedings of the 38th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Blum, A., Hanneke, S., Pabbaraju, C. and Saless, D. (2025). Proofs as Explanations: Short Certificates for Reliable Predictions. In Proceedings of the 38th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Moran, S., Schefler, H. and Tsubari, I. (2025). Private List Learnability vs. Online List Learnability. In Proceedings of the 38th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Moran, S., Shlimovich, A. and Yehudayoff, A. (2025). Data Selection for ERMs. In Proceedings of the 38th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Moran, S., Shlimovich, A. and Yehudayoff, A. (2025). Open Problem: Data Selection for Regression Tasks. In Proceedings of the 38th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Meng, Q. and Shaeiri, A. (2025). Representation Preserving Multiclass Agnostic to Realizable Reduction. In Proceedings of the 42nd International Conference on Machine Learning (ICML). [pdf][official page].
- Hanneke, S. and Shaeiri, A. (2025). A Trichotomy for List Transductive Online Learning. In Proceedings of the 42nd International Conference on Machine Learning (ICML). [pdf][official page].
- Hanneke, S. and Wang, K. (2025). A Complete Characterization of Learnability for Stochastic Noisy Bandits. In Proceedings of the 36th International Conference on Algorithmic Learning Theory (ALT). [pdf][official page].
- Hanneke, S., Shaeiri, A., and Wang, H. (2025). For Universal Multiclass Online Learning, Bandit Feedback and Full Supervision are Equivalent. In Proceedings of the 36th International Conference on Algorithmic Learning Theory (ALT). [pdf][official page].
- Hanneke, S., Yang, L., Wang, G., and Song, Y. (2025). Reliable Active Apprenticeship Learning. In Proceedings of the 36th International Conference on Algorithmic Learning Theory (ALT). [pdf][official page].
- Attias, I., Hanneke, S., and Ramaswami, A. (2025). Sample Compression Scheme Reductions. In Proceedings of the 36th International Conference on Algorithmic Learning Theory (ALT). [pdf][official page].
2024
- Filmus, Y., Hanneke, S., Mehalel, I., and Moran, S. (2024). Bandit-Feedback Online Multiclass Classification: Variants and Tradeoffs. Advances in Neural Information Processing Systems 37 (NeurIPS). [pdf][arXiv][official page].
- Hanneke, S., Karbasi, A., Moran, S., and Velegkas, G. (2024). Universal Rates for Active Learning. Advances in Neural Information Processing Systems 37 (NeurIPS). [pdf][official page].
- Hanneke, S., Raman, V., Shaeiri, A., and Subedi, U. (2024). Multiclass Transductive Online Learning. Advances in Neural Information Processing Systems 37 (NeurIPS). (Spotlight). [pdf][arXiv][official page].
- Hanneke, S. and Xu, M. (2024). Universal Rates of Empirical Risk Minimization. Advances in Neural Information Processing Systems 37 (NeurIPS). [pdf][arXiv][official page].
- Hanneke, S., Moran, S., and Zhang, Q. (2024). Improved Sample Complexity for Multiclass PAC Learning. Advances in Neural Information Processing Systems 37 (NeurIPS). [pdf][official page].
- Hanneke, S. and Wang, H. (2024). A Theory of Optimistically Universal Online Learnability for General Concept Classes. Advances in Neural Information Processing Systems 37 (NeurIPS). [pdf][arXiv][official page].
- Devulapalli, P. and Hanneke, S. (2024). Learning from Snapshots of Discrete and Continuous Data Streams. Advances in Neural Information Processing Systems 37 (NeurIPS). [pdf][arXiv][official page].
- Fioravanti, S., Hanneke, S., Moran, S., Schefler, H. and Tsubari, I. (2024). Ramsey Theorems for Trees and a General ‘Private Learning Implies Online Learning’ Theorem. In Proceedings of the 65th IEEE Symposium on Foundations of Computer Science (FOCS). [pdf][arXiv].
- Hanneke, S., Larsen, K. G. and Zhivotovskiy, N. (2024). Revisiting Agnostic PAC Learning. In Proceedings of the 65th IEEE Symposium on Foundations of Computer Science (FOCS). [pdf][arXiv].
- Attias, I., Hanneke, S., Kontorovich, A., and Sadigurschi, M. (2024). Agnostic Sample Compression Schemes for Regression. In Proceedings of the 41st International Conference on Machine Learning (ICML). (Spotlight). [pdf][arXiv][official page][proceedings][poster].
- Chase, Z., Chornomaz, B., Hanneke, S., Moran, S. and Yehudayoff, A. (2024). Dual VC Dimension Obstructs Sample Compression by Embeddings. In Proceedings of the 37th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S. (2024). The Star Number and Eluder Dimension: Elementary Observations About the Dimensions of Disagreement. In Proceedings of the 37th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Attias, I., Hanneke, S., Kalavasis, A., Karbasi, A. and Velegkas, G. (2024). Universal Rates for Regression: Separations between Cut-Off and Absolute Loss. In Proceedings of the 37th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Moran, S. and Waknine, T. (2024). List Sample Compression and Uniform Convergence. In Proceedings of the 37th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page].
- Hanneke, S., Moran, S. and Waknine, T. (2024). Open problem: Direct Sums in Learning Theory. In Proceedings of the 37th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Kontorovich, A., and Kornowski, G. (2024). Efficient Agnostic Learning with Average Smoothness. In Proceedings of the 35th International Conference on Algorithmic Learning Theory (ALT). [pdf][arXiv][official page].
- Devulapalli, P. and Hanneke, S. (2024). The Dimension of Self-Directed Learning. In Proceedings of the 35th International Conference on Algorithmic Learning Theory (ALT). [pdf][arXiv][official page].
2023
- Attias, I., Hanneke, S., Kalavasis, A., Karbasi, A., and Velegkas, G. (2023). Optimal Learners for Realizable Regression: PAC Learning and Online Learning. Advances in Neural Information Processing Systems 36 (NeurIPS). (Oral presentation). [pdf][arXiv][official page].
- Goel, S., Hanneke, S., Moran, S., and Shetty, A. (2023). Adversarial Resilience in Sequential Prediction via Abstention. Advances in Neural Information Processing Systems 36 (NeurIPS). [pdf][arXiv][official page].
- Hanneke, S., Kontorovich, A., and Kornowski, G. (2023). Near-optimal learning with average Hölder smoothness. Advances in Neural Information Processing Systems 36 (NeurIPS). [pdf][arXiv][official page].
- Hanneke, S., Moran, S., and Shafer, J. (2023). A Trichotomy for Transductive Online Learning. Advances in Neural Information Processing Systems 36 (NeurIPS). [pdf][arXiv][official page].
- Balcan, M.-F., Hanneke, S., Pukdee, R., and Sharma, D. (2023). Reliable Learning in Challenging Environments. Advances in Neural Information Processing Systems 36 (NeurIPS). [pdf][arXiv][official page].
- Hanneke, S. and Yang, L. (2023). Bandit Learnability can be Undecidable. In Proceedings of the 36th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Moran, S., Raman, V., Subedi, U., and Tewari, A. (2023). Multiclass Online Learning and Uniform Convergence. In Proceedings of the 36th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Moran, S., and Zhang, Q. (2023). Universal Rates for Multiclass Learning. In Proceedings of the 36th Annual Conference on Learning Theory (COLT). [pdf][official page].
- Hanneke, S., Kpotufe, S., and Mahdaviyeh, Y. (2023). Limits of Model Selection under Transfer Learning. In Proceedings of the 36th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page].
- Brukhim, N., Hanneke, S., and Moran, S. (2023). Improper Multiclass Boosting. In Proceedings of the 36th Annual Conference on Learning Theory (COLT). [pdf][official page]
- Bousquet, O., Hanneke, S., Moran, S., Shafer, J., and Tolstikhin, I. (2023). Fine-Grained Distribution-Dependent Learning Curves. In Proceedings of the 36th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page].
- Filmus, Y., Hanneke, S., Mehalel, I., and Moran, S. (2023). Optimal Prediction Using Expert Advice and Randomized Littlestone Dimension. In Proceedings of the 36th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page].
- Attias, I. and Hanneke, S. (2023). Adversarially Robust PAC Learnability of Real-Valued Functions. In Proceedings of the 40th International Conference on Machine Learning (ICML). [pdf][arXiv][official page, video].
2022
- Hanneke, S., Karbasi, A., Moran, S., and Velegkas, G. (2022). Universal Rates for Interactive Learning. Advances in Neural Information Processing Systems 35 (NeurIPS). (Oral presentation). [pdf][supplement][official page].
- Montasser, O., Hanneke, S., and Srebro, N. (2022). Adversarially Robust Learning: A Generic Minimax Optimal Learner and Characterization. Advances in Neural Information Processing Systems 35 (NeurIPS). (Oral presentation). [pdf][arXiv].
- Hanneke, S., Karbasi, A., Mahmoody, M., Mehalel, I., and Moran, S. (2022). On Optimal Learning Under Targeted Data Poisoning. Advances in Neural Information Processing Systems 35 (NeurIPS). (Oral presentation). [pdf][arXiv]
- Attias, I., Hanneke, S., and Mansour, Y. (2022). A Characterization of Semi-Supervised Adversarially-Robust PAC Learnability. Advances in Neural Information Processing Systems 35 (NeurIPS). [pdf][arXiv].
- Balcan, M.-F., Blum, A., Hanneke, S., and Sharma, D. (2022). Robustly-reliable Learners Under Poisoning Attacks. In Proceedings of the 35th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page].
- Hanneke, S. and Kpotufe, S. (2022). A No-Free-Lunch Theorem for MultiTask Learning. The Annals of Statistics. Vol. 50 (6), pp. 3119-3143. [pdf][arXiv][journal page]
- Hanneke, S. (2022). Universally consistent online learning with arbitrarily dependent responses. In Proceedings of the 33rd International Conference on Algorithmic Learning Theory (ALT). [pdf][arXiv].
- Blanchard, M., Cosson, R., and Hanneke, S. (2022). Universal Online Learning with Unbounded Losses: Memory Is All You Need. In Proceedings of the 33rd International Conference on Algorithmic Learning Theory (ALT). [pdf][arXiv].
- Montasser, O., Hanneke, S., and Srebro, N. (2022). Transductive Robust Learning Guarantees. In Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][arXiv].
2021
- Alon, N., Hanneke, S., Holzman, R., and Moran, S. (2021). A Theory of PAC Learnability of Partial Concept Classes. In Proceedings of the 62nd Annual Symposium on Foundations of Computer Science (FOCS). [pdf][arXiv][proceedings version][official page].
- Hanneke, S., Livni, R., and Moran, S. (2021). Online Learning with Simple Predictors and a Combinatorial Characterization of Minimax in 0/1 Games. In Proceedings of the 34th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page][videos] Runner up for the Best Paper Award.
- Montasser, O., Hanneke, S., and Srebro, N. (2021). Adversarially Robust Learning with Unknown Perturbation Sets. In Proceedings of the 34th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page][videos]
- Blum, A., Hanneke, S., Qian, J., and Shao, H. (2021). Robust Learning under Clean-Label Attack. In Proceedings of the 34th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page][videos]
- Hanneke, S. (2021). Open Problem: Is There an Online Learning Algorithm That Learns Whenever Online Learning Is Possible?. In Proceedings of the 34th Annual Conference on Learning Theory (COLT). [pdf][arXiv][official page][videos]
- Bousquet, O., Hanneke, S., Moran, S., van Handel, R., and Yehudayoff, A. (2021). A Theory of Universal Learning. In Proceedings of the 53rd Annual ACM Symposium on Theory of Computing (STOC). [pdf][official page][arXiv]
- Hanneke, S. (2021). Learning Whenever Learning is Possible: Universal Learning under General Stochastic Processes. Journal of Machine Learning Research. Vol. 22 (130), pp. 1-116. [pdf][arXiv][journal page]
- Hanneke, S. and Yang, L. (2021). Toward a General Theory of Online Selective Sampling: Trading Off Mistakes and Queries. In Proceedings of the 24th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][official page]
- Hanneke, S. and Kontorovich, A. (2021). Stable Sample Compression Schemes: New Applications and an Optimal SVM Margin Bound. In Proceedings of the 32nd International Conference on Algorithmic Learning Theory (ALT). [pdf][official page][arXiv] Winner of the Best Paper Award.
- Hanneke, S., Kontorovich, A., Sabato, S., and Weiss, R. (2021). Universal Bayes Consistency in Metric Spaces. The Annals of Statistics, Vol. 49 (4), pp. 2129-2150. [pdf][arXiv][official page]
2020
- Montasser, O., Hanneke, S. and Srebro, N. (2020). Reducing Adversarially Robust Learning to Non-Robust PAC Learning. Advances in Neural Information Processing Systems 33 (NeurIPS). [pdf][official page][arXiv]
- Bousquet, O., Hanneke, S., Moran, S., and Zhivotovskiy, N. (2020). Proper Learning, Helly Number, and an Optimal SVM Bound. In Proceedings of the 33rd Annual Conference on Learning Theory (COLT). [pdf][official page][arXiv] Winner of the Best Paper Award.
2019
- Hanneke, S. and Yang, L. (2019). Surrogate Losses in Passive and Active Learning. Electronic Journal of Statistics, Vol. 13 (2), pp. 4646-4708. [pdf][ps][journal page][arXiv].
- Hanneke, S. and Kpotufe, S. (2019). On the Value of Target Data in Transfer Learning. Advances in Neural Information Processing Systems 32 (NeurIPS). [pdf][official page][arXiv].
- Hanneke, S. and Kontorovich, A. (2019). Optimality of SVM: Novel Proofs and Tighter Bounds. Theoretical Computer Science. Volume 796, Pages 99-113. [pdf][journal page]
- Montasser, O., Hanneke, S., and Srebro, N. (2019). VC Classes are Adversarially Robustly Learnable, but Only Improperly. In Proceedings of the 32nd Annual Conference on Learning Theory (COLT). [pdf][official page][arXiv] Winner of a Best Student Paper Award.
- Hanneke, S. and Kontorovich, A. (2019). A Sharp Lower Bound for Agnostic Learning with Sample Compression Schemes. In Proceedings of the 30th International Conference on Algorithmic Learning Theory (ALT). [pdf][official page][arXiv]
- Hanneke, S., Kontorovich, A., and Sadigurschi, M. (2019). Sample Compression for Real-Valued Learners. In Proceedings of the 30th International Conference on Algorithmic Learning Theory (ALT). [pdf][official page][arXiv]
- Hanneke, S. and Yang, L. (2019). Statistical Learning under Nonstationary Mixing Processes. In Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][official page][arXiv]
2018
- Hanneke, S. and Yang, L. (2018). Testing Piecewise Functions. Theoretical Computer Science, Vol. 745, pp. 23-35. [pdf][ps][journal page][arXiv]
- Zhivotovskiy, N. and Hanneke, S. (2018). Localization of VC Classes: Beyond Local Rademacher Complexities. Theoretical Computer Science, Vol. 742, pp. 27-49. [pdf][journal page][arXiv] (Special Issue for ALT 2016; Invited)
- Hanneke, S., Kalai, A., Kamath, G., and Tzamos, C. (2018). Actively Avoiding Nonsense in Generative Models. In Proceedings of the 31st Annual Conference on Learning Theory (COLT). [pdf][official page][arXiv]
- Yang, L., Hanneke, S., and Carbonell, J. (2018). Bounds on the Minimax Rate for Estimating a Prior over a VC Class from Independent Learning Tasks. Theoretical Computer Science, Vol. 716, pp. 124-140. [pdf][ps][journal page][arXiv] (Special Issue for ALT 2015; Invited)
2016
- Zhivotovskiy, N. and Hanneke, S. (2016). Localization of VC Classes: Beyond Local Rademacher Complexities. In Proceedings of the 27th International Conference on Algorithmic Learning Theory (ALT). [pdf][ps][arXiv]
- Hanneke, S. (2016). Refined Error Bounds for Several Learning Algorithms. Journal of Machine Learning Research, Vol. 17 (135), pp. 1-55. [pdf][ps][arXiv][journal page]
- Hanneke, S. (2016). The Optimal Sample Complexity of PAC Learning. Journal of Machine Learning Research, Vol. 17 (38), pp. 1-15. [pdf][ps][arXiv][journal page]
2015
- Hanneke, S. and Yang, L. (2015). Minimax Analysis of Active Learning. Journal of Machine Learning Research, Vol. 16 (12), pp. 3487-3602. [pdf][ps][arXiv][journal page]
- Hanneke, S., Kanade, V., and Yang, L. (2015). Learning with a Drifting Target Concept. In Proceedings of the 26th International Conference on Algorithmic Learning Theory (ALT). [pdf][ps][arXiv] See also this note on a result for the sample complexity of efficient agnostic learning implicit in the above concept drift paper: [pdf]
- Yang, L., Hanneke, S., and Carbonell, J. (2015). Bounds on the Minimax Rate for Estimating a Prior over a VC Class from Independent Learning Tasks. In Proceedings of the 26th International Conference on Algorithmic Learning Theory (ALT). [pdf][ps][arXiv]
- Wiener, Y., Hanneke, S., and El-Yaniv, R. (2015). A Compression Technique for Analyzing Disagreement-Based Active Learning. Journal of Machine Learning Research, Vol. 16 (4), pp. 713-745. [pdf][ps][arXiv][journal page]
2014
- Hanneke, S. (2014). Theory of Disagreement-Based Active Learning. Foundations and Trends in Machine Learning, Vol. 7 (2-3), pp. 131-309. [official][Amazon] There is also an extended version, which I may update from time to time.
2013
- Yang, L. and Hanneke, S. (2013). Activized Learning with Uniform Classification Noise. In Proceedings of the 30th International Conference on Machine Learning (ICML). [pdf][ps][appendix pdf][appendix ps]
- Yang, L., Hanneke, S., and Carbonell, J. (2013). A Theory of Transfer Learning with Applications to Active Learning. Machine Learning, Vol. 90 (2), pp. 161-189. [pdf][ps][journal page]
2012
- Balcan, M.-F. and Hanneke, S. (2012). Robust Interactive Learning. In Proceedings of the 25th Annual Conference on Learning Theory (COLT). [pdf][ps][arXiv]
- Hanneke, S. (2012). Activized Learning: Transforming Passive to Active with Improved Label Complexity. Journal of Machine Learning Research, Vol. 13 (5), pp. 1469-1587. [pdf][ps][arXiv][journal page] Related material: extended abstract, Chapter 4 in my thesis, and various presentations [slides].
2011
- Yang, L., Hanneke, S., and Carbonell, J. (2011). Identifiability of Priors from Bounded Sample Sizes with Applications to Transfer Learning. In Proceedings of the 24th Annual Conference on Learning Theory (COLT). [pdf][ps]
- Yang, L., Hanneke, S., and Carbonell, J. (2011). The Sample Complexity of Self-Verifying Bayesian Active Learning. In Proceedings of the 14th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps]
- Hanneke, S. (2011). Rates of Convergence in Active Learning. The Annals of Statistics, Vol. 39 (1), pp. 333-361. [pdf][ps][journal page]
2010
- Yang, L., Hanneke, S., and Carbonell, J. (2010). Bayesian Active Learning Using Arbitrary Binary Valued Queries. In Proceedings of the 21st International Conference on Algorithmic Learning Theory (ALT). [pdf][ps] Also available in information theory jargon. [pdf][ps]
- Hanneke, S., Fu, W., and Xing, E.P. (2010). Discrete Temporal Models of Social Networks. The Electronic Journal of Statistics, Vol. 4, pp. 585-605. [pdf][journal page]
- Hanneke, S. and Yang, L. (2010). Negative Results for Active Learning with Convex Losses. Proceedings of the 13th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps]
- Balcan, M.-F., Hanneke, S., and Wortman Vaughan, J. (2010). The True Sample Complexity of Active Learning. Machine Learning, Vol. 80 (2-3), pp. 111-139. [pdf][ps][journal page] (Special Issue for COLT 2008; Invited)
2009
- Hanneke, S. (2009). Theoretical Foundations of Active Learning. Doctoral Dissertation. Machine Learning Department. Carnegie Mellon University. [pdf][ps][defense slides]
- Hanneke, S. (2009). Adaptive Rates of Convergence in Active Learning. In Proceedings of the 22nd Annual Conference on Learning Theory (COLT). [pdf][ps][slides] Also available in expanded journal version.
- Hanneke, S. and Xing, E. P. (2009). Network Completion and Survey Sampling. In Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps][slides]
2008
- Balcan, M.-F., Hanneke, S., and Wortman, J. (2008). The True Sample Complexity of Active Learning. In Proceedings of the 21st Annual Conference on Learning Theory (COLT). [pdf][ps][slides] Winner of the Mark Fulk Best Student Paper Award. Also available in an extended journal version.
2007
- Balcan, M.-F., Even-Dar, E., Hanneke, S., Kearns, M., Mansour, Y., and Wortman, J. (2007). Asymptotic Active Learning. NIPS Workshop on Principles of Learning Problem Design. [pdf][ps][spotlight slide] Also available in improved conference version and expanded journal version.
- Hanneke, S. and Xing, E. P. (2007). Network Completion and Survey Sampling. NIPS Workshop on Statistical Network Models. See our later conference publication.
- Hanneke, S. (2007). Teaching Dimension and the Complexity of Active Learning. In Proceedings of the 20th Annual Conference on Learning Theory (COLT). [pdf][ps][slides]
- Hanneke, S. (2007). A Bound on the Label Complexity of Agnostic Active Learning. In Proceedings of the 24th Annual International Conference on Machine Learning (ICML). [pdf][ps][slides] Honorable Mention for the ICML 2017 Test of Time Award.
- Guo, F., Hanneke, S., Fu, W., and Xing, E.P. (2007). Recovering Temporally Rewiring Networks: A Model-based Approach. In Proceedings of the 24th Annual International Conference on Machine Learning (ICML). [pdf] Also see our related earlier work.
- Hanneke, S. (2007). The Complexity of Interactive Machine Learning. KDD Project Report (aka Master’s Thesis). Machine Learning Department, Carnegie Mellon University. [pdf][ps][slides] Includes some interesting results from a class project on The Cost Complexity of Interactive Learning, in addition to my COLT07 and ICML07 papers.
2006
- Hanneke, S. and Xing, E. P. (2006). Discrete Temporal Models of Social Networks. In Proceedings of the ICML Workshop on Statistical Network Analysis. [pdf][ps][slides] Also available in an extended journal version.
- Hanneke, S. (2006). An Analysis of Graph Cut Size for Transductive Learning. In Proceedings of the 23rd International Conference on Machine Learning (ICML). [pdf][ps][slides ppt][slides pdf]
