Chen Liang

(she/her/hers)

Georgia Institute of Technology

Machine Learning in Natural Language Processing, Parameter Efficient Learning

I am a third-year student in the Machine Learning Ph.D. program at the Georgia Institute of Technology. I am very fortunate to be working with Prof. Tuo Zhao in the FLASH (Foundations of LeArning Systems for alcHemy) research group. I received my M.S. degree in Computational Science & Engineering from Georgia Tech, and my B.S. degree in Electrical Engineering from the University of Southern California.

I am generally interested in machine learning for natural language processing. My research mainly focuses on developing methodologies and algorithms to improve parameter efficiency and model generalization of large-scale language models. My interests also include transfer learning and representation learning (e.g., multi-domain and multi-task learning).

Improving Parameter Efficiency in Large Neural Language Models

My research goal is to tackle the existing challenges in deploying large pre-trained deep neural networks (DNNs) in downstream applications. My primary interest is improving the computational and parameter efficiency of large pre-trained DNNs. To achieve this goal, I focus on two major research directions: 1) developing reliable model compression approaches to accelerate computation, and 2) developing optimization and regularization algorithms to improve training efficiency.