
Talk: Learning-Based Program Synthesis: Learning for Program Synthesis and Program Synthesis for Learning

Xinyun Chen: Ph.D. Candidate, UC Berkeley

Event Details

Date: Thursday, April 21, 2022
Time: 4–5 p.m.
Location: Live stream (see link below)

Description

LIVE STREAM: https://uwmadison.zoom.us/j/95711745958?pwd=MFhXckZseWhWdlRXbWkxWXU1bWhFQT09

Abstract: With the advancement of modern technologies, programming has become ubiquitous not only among professional software developers but also among general computer users. However, gaining programming expertise is time-consuming and challenging. Program synthesis, in which the computer automatically synthesizes programs from specifications such as natural language descriptions and input-output examples, therefore has many applications. In this talk, I will present my work on learning-based program synthesis, where I have developed deep learning techniques for various program synthesis problems. Despite the remarkable success of deep neural networks in many domains, including natural language processing and computer vision, existing deep neural networks are still insufficient for handling challenging symbolic reasoning and generalization problems.
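To make the input-output-example setting concrete, here is a minimal sketch (not from the talk) of the simplest possible synthesizer: a brute-force search over a toy space of candidate programs that returns the first candidate consistent with every example. The candidate set and examples below are illustrative assumptions, not part of the speaker's work.

```python
# Minimal programming-by-example sketch: enumerate tiny candidate programs
# and keep the first one that satisfies all given input-output examples.
# The candidate space and examples are purely illustrative.

candidates = {
    "x + 1": lambda x: x + 1,
    "x * 2": lambda x: x * 2,
    "x * x": lambda x: x * x,
    "x - 1": lambda x: x - 1,
}

# Specification given as input-output examples: "double the input".
examples = [(1, 2), (3, 6), (5, 10)]

def synthesize(examples):
    """Return the source of the first candidate matching all examples, or None."""
    for source, program in candidates.items():
        if all(program(x) == y for x, y in examples):
            return source
    return None

print(synthesize(examples))  # -> "x * 2"
```

Real learning-based synthesizers replace this exhaustive enumeration with neural models that guide or directly generate programs, which is the subject of the talk.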

My research on learning-based program synthesis falls into two threads: (1) learning to synthesize programs from potentially ambiguous and complex specifications; and (2) neural-symbolic learning for language understanding. I will first talk about program synthesis applications, where my work demonstrates the applicability of learning-based program synthesizers for production use. I will then present my work on neural-symbolic frameworks that integrate symbolic components into neural networks, achieving better reasoning and generalization capabilities. In closing, I will discuss the challenges and opportunities in further improving the complexity and generalizability of learning-based program synthesis as future work.

Bio: Xinyun Chen is a Ph.D. candidate at UC Berkeley, working with Prof. Dawn Song. Her research lies at the intersection of deep learning, programming languages, and security. Her recent research focuses on learning-based program synthesis and adversarial machine learning. She received the Facebook Fellowship in 2020 and was named a Rising Star in Machine Learning in 2021. Her work SpreadsheetCoder for spreadsheet formula prediction was integrated into Google Sheets, and she was part of the AlphaCode team during her internship at DeepMind.

Cost: Free