Yiding Jiang
yd <last name> at cmu dot edu
I am a PhD student in the Machine Learning Department at Carnegie
Mellon University, where I work with
Professor Zico Kolter. My research is supported by the Google
PhD Fellowship.
Previously, I was an AI Resident at Google Research.
I received my Bachelor of Science in Electrical
Engineering and Computer Science from UC Berkeley, where I worked on robotics and
generative models advised by Professor Ken Goldberg.
I have also spent time as a research intern at Meta AI
Research and Cerebras Systems.
Google Scholar / GitHub / Twitter / LinkedIn
Research
I am interested in understanding how high-capacity machine learning systems built on deep neural
networks learn and generalize, and in using these insights to improve them further. This entails research
on a spectrum of topics including representation learning, reinforcement learning, non-convex
optimization, and generalization: both concrete generalization bounds and less well-understood
empirical phenomena such as out-of-distribution and zero-shot generalization.
* indicates equal contribution
Learning Options via Compression
Yiding Jiang*,
Evan Z. Liu*,
Benjamin Eysenbach,
J. Zico Kolter,
Chelsea Finn
NeurIPS, 2022
[code]
Agreement-on-the-line: Predicting the Performance of Neural Networks under
Distribution Shift
Christina Baek,
Yiding Jiang,
Aditi Raghunathan,
J. Zico Kolter
NeurIPS, 2022 (oral)
Assessing Generalization of SGD via Disagreement
Yiding Jiang*, Vaishnavh Nagarajan*, Christina Baek, J. Zico Kolter
ICLR, 2022 (spotlight)
[blog post]
Methods and Analysis of The First Competition in Predicting Generalization of Deep Learning
Yiding Jiang, Parth Natekar*, Manik Sharma*, Sumukh K Aithal*, Dhruva Kashyap*,
Natarajan Subramanyam*, Carlos Lassance*, Daniel M. Roy, Gintare Karolina Dziugaite,
Suriya Gunasekar, Isabelle Guyon, Pierre Foret, Scott Yak, Hossein Mobahi, Behnam Neyshabur*, Samy Bengio
PMLR: NeurIPS 2020 Competition and Demonstration Track, 2020
[competition page]
[Codalab]
[competition dataset]
[competition code]
Fantastic Generalization Measures and Where to Find Them
Yiding Jiang*, Behnam Neyshabur*, Hossein Mobahi, Dilip Krishnan, Samy Bengio
ICLR, 2020
"Science meets the Engineering
of Deep Learning" workshop, NeurIPS 2019
(oral)
Observational Overfitting in Reinforcement Learning
Xingyou Song, Yiding Jiang, Stephen Tu, Yilun Du, Behnam Neyshabur
ICLR, 2020
Language as an Abstraction for Hierarchical Deep Reinforcement Learning
Yiding Jiang, Shixiang Gu, Kevin Murphy, Chelsea Finn
NeurIPS, 2019
[project page]
[environment]
Predicting the Generalization Gap in Deep Networks with Margin Distributions
Yiding Jiang, Dilip Krishnan,
Hossein Mobahi, Samy Bengio
ICLR, 2019
[blog post]
- Teaching Assistant, 10-708 Probabilistic Graphical Models. Carnegie Mellon University. Fall 2022.
- Teaching Assistant, 10-725 Convex Optimization. Carnegie Mellon University. Fall 2021.
- Reader, CS170 Efficient Algorithms and Intractable Problems. University of California, Berkeley.
Fall 2017.
CLEVR-Robot Environment
GitHub repository
The CLEVR-Robot environment is a research platform for developing reinforcement learning
agents at the intersection of vision, language, and continuous/discrete control.
Deep Model Generalization Dataset (DEMOGEN)
GitHub repository
The DEMOGEN dataset is a collection of 756 trained neural network models and the code to
use them. This is the same dataset used in our work "Predicting the Generalization Gap in
Deep Networks with Margin Distributions", and is, to our knowledge, the first dataset of
models for studying generalization.
City2City
Project Page
City2City is a project that restyles Google Street View images of one city in the style of another.