About Me
I'm a PhD student at the School of Electrical and Computer Engineering and the Center for Machine Learning at Georgia Tech. I am fortunate to be advised by
Mark Davenport and to be an ML@GT Fellow. I graduated with a BSE in Electrical Engineering from the
University of Michigan in May 2019.
In summer 2023, I was an AI Research Intern at Duolingo, where I worked with Will Monroe on zero/few-shot retrieval and recommendation with large language models.
In summer and fall 2022, I was an Applied Scientist Intern at Amazon, where I worked with Arjun Seshadri, Mariya Vasileva, and Achal Dave on synthetic dataset generation with GANs.
My research focuses on human data elicitation from both empirical and theoretical perspectives. Specifically, I am interested in how humans should be queried to provide feedback and when asking for human feedback can be avoided altogether.
Accordingly, my past work has spanned:
- Learning from simple relational queries, such as paired comparisons ("Do you prefer item A or item B?"). How much can we learn when interacting with users through only such simple queries?
- Querying humans with mathematically rigorous guarantees. Can we leverage tools from the rich field of high-dimensional statistics to develop theoretical guarantees for learning from human feedback?
- Learning without human feedback. When can we avoid querying humans for additional information? What can we learn in these scenarios?
In my free time, I enjoy cooking (and eating), reading, running, and watching basketball (NBA and college).
Selected Publications
- HandsOff: Labeled dataset generation with no additional human annotations
Austin Xu, Mariya I. Vasileva, Achal Dave, and Arjun Seshadri
CVPR 2023
Highlight Award (top 2.5% of submissions, 26% conference acceptance rate)
Short version in the NeurIPS 2022 SyntheticData4ML Workshop
[arxiv] [website] [code]
- Large language model augmented exercise retrieval for personalized language learning
Austin Xu, Will Monroe, and Klinton Bicknell
Learning Analytics and Knowledge (LAK) 2024
Short version in the NeurIPS 2023 Generative AI for Education Workshop
- Perceptual adjustment queries and an inverted measurement paradigm for low-rank metric learning
Austin Xu, Andrew D. McRae, Jingyan Wang, Mark A. Davenport, and Ashwin Pananjady
NeurIPS 2023
Short version in the ICML 2023 Many Facets of Preference Learning Workshop
[arxiv - extended version] [code]
- Simultaneous preference and metric learning from paired comparisons
Austin Xu and Mark A. Davenport
NeurIPS 2020
Spotlight Presentation (top 4% of submissions, 20% conference acceptance rate)
[arxiv] [website] [talk]
- Active metric learning and classification using similarity queries
Namrata Nadagouda, Austin Xu, and Mark A. Davenport
UAI 2023
Short version in the NeurIPS 2022 Workshop on Human in the Loop Learning
[arxiv]
Experience
Work Experience
- AI Research Intern at Duolingo (Summer 2023)
- Applied Scientist Intern at Amazon (Summer and Fall 2022)
- R&D Summer Intern at Sandia National Laboratories (Summer 2018)
- Student Intern at General Motors (Summer 2017)
Teaching Experience
- Spring 2022: Head TA, Statistical Machine Learning (ECE 6254) [website]
- Fall 2019/Spring 2020/Summer 2020: TA, Professional and Technical Communications (ECE 3005)
- Fall 2018/Spring 2019: IA (Instructional Aide), Discrete Mathematics (University of Michigan -- EECS 203)