I am currently a Research Scientist on the Google Brain team. Previously, I worked at the Information Sciences Institute with Professors Kevin Knight and Daniel Marcu on topics related to neural machine translation.


“Efficient Neural Architecture Search via Parameter Sharing”
(Hieu Pham, Melody Y. Guan, Barret Zoph, Quoc V. Le, Jeff Dean), arXiv, 2018.
“Progressive Neural Architecture Search”
(C. Liu, B. Zoph, J. Shlens, W. Hua, Li-Jia Li, Li Fei-Fei, A. Yuille, J. Huang, K. Murphy), arXiv, 2017.
“Searching for Activation Functions”
(Prajit Ramachandran, Barret Zoph, Quoc V. Le), arXiv, 2017.
“Intriguing Properties of Adversarial Examples”
(Ekin D. Cubuk, Barret Zoph, Samuel S. Schoenholz, Quoc V. Le), arXiv, 2017.
“Learning Transferable Architectures for Scalable Image Recognition”
(Barret Zoph, Vijay Vasudevan, Jonathan Shlens, Quoc V. Le), CVPR, 2018. (Spotlight)
“Neural Optimizer Search with Reinforcement Learning”
(Irwan Bello*, Barret Zoph*, Vijay Vasudevan, Quoc V. Le), ICML, 2017.
“Neural Architecture Search with Reinforcement Learning”
(Barret Zoph, Quoc V. Le), ICLR, 2017. (Oral)
“Transfer Learning for Low-Resource Neural Machine Translation”
(Barret Zoph, Deniz Yuret, Jonathan May, Kevin Knight), EMNLP, 2016.
“Simple, Fast Noise-Contrastive Estimation for Large RNN Vocabularies”
(Barret Zoph, Ashish Vaswani, Jonathan May, Kevin Knight), NAACL, 2016.
“Multi-Source Neural Translation”
(Barret Zoph, Kevin Knight), NAACL, 2016. (Oral)
“How Much Information Does a Human Translator Add to the Original?”
(Barret Zoph, Marjan Ghazvininejad, Kevin Knight), EMNLP, 2015. (Oral)





A GPU Recurrent Neural Network Toolkit (GitHub)

  • Supports multi-layer LSTM RNNs
  • Supports model parallelism across multiple GPUs
  • Supports different loss functions, such as maximum likelihood (MLE) and noise-contrastive estimation (NCE)
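The NCE objective, which the toolkit uses for large-vocabulary training (see the NAACL 2016 paper above), can be sketched as follows. This is an illustrative NumPy sketch, not the toolkit's actual API; the function name and argument shapes are assumptions. Each training step treats the true next word as a positive example and k sampled words from a noise distribution q as negatives, avoiding a softmax over the full vocabulary:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nce_loss(h, emb, target, noise_ids, log_q, k):
    """Noise-contrastive estimation loss for a single time step (sketch).

    h         -- RNN hidden state, shape (d,)
    emb       -- output embedding matrix, shape (V, d)
    target    -- index of the true next word
    noise_ids -- k word indices sampled from the noise distribution q
    log_q     -- log-probabilities of q over the vocabulary, shape (V,)
    k         -- number of noise samples
    """
    # Logit of the true word, shifted by the noise prior log(k * q(w)).
    s_target = h @ emb[target] - (np.log(k) + log_q[target])
    # Logits of the k noise samples, shifted the same way.
    s_noise = emb[noise_ids] @ h - (np.log(k) + log_q[noise_ids])
    # Binary logistic loss: true word labeled 1, noise samples labeled 0.
    return -np.log(sigmoid(s_target)) - np.sum(np.log(sigmoid(-s_noise)))

# Toy usage with a uniform noise distribution over a 100-word vocabulary.
rng = np.random.default_rng(0)
V, d, k = 100, 8, 5
h = rng.normal(size=d)
emb = 0.1 * rng.normal(size=(V, d))
log_q = np.log(np.full(V, 1.0 / V))
loss = nce_loss(h, emb, target=3, noise_ids=rng.integers(0, V, size=k),
                log_q=log_q, k=k)
```

The key property is that the cost of one update depends on k rather than on the vocabulary size V, which is what makes training with large RNN vocabularies fast.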