
Title: Scaling Deep Learning Up and Down

Zhuang Liu: Research Scientist, Meta AI Research (FAIR)

Event Details

Date
Monday, April 29, 2024
Time
12-1 p.m.
Description

LIVE STREAM: https://uwmadison.zoom.us/j/96331191692?pwd=UlU1UEhIdXQ0OXVDWVpzSXYvVFBrQT09

Abstract: Deep learning with neural networks has emerged as a key approach for discovering patterns and modeling relationships in complex data. AI systems powered by deep learning are used widely in applications across a broad spectrum of scales. There is a strong need to scale deep learning both upward and downward. Scaling up reflects the pursuit of scalability: the ability to exploit increasingly abundant compute and data to achieve superior capabilities while overcoming diminishing returns. Scaling down reflects the demand for efficiency: many application domains have limited data, and deployment often happens in compute-constrained settings.

In this talk, we present several studies in both directions. For scaling up, we first explore the design of scalable neural network architectures that are widely adopted across fields. We then discuss an intriguing observation on modern vision datasets and its implications for scaling training data. For scaling down, we introduce simple, effective, and widely used approaches for compressing convolutional networks and large language models, alongside interesting empirical findings. Notably, a recurring theme in this talk is the careful examination of implicit assumptions in the literature, which often leads to surprising revelations that reshape community understanding. Finally, we discuss exciting avenues for future deep learning and vision research, such as next-generation architectures and dataset modeling.
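For readers unfamiliar with network compression, a common baseline (not necessarily the specific methods covered in the talk) is magnitude pruning: zero out the smallest-magnitude weights of a trained layer to obtain a sparse model. A minimal NumPy sketch, assuming a target sparsity fraction:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude entries, keeping the top (1 - sparsity) fraction."""
    k = int(weights.size * sparsity)  # number of entries to prune
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    mask = np.abs(weights) > threshold  # keep entries strictly above the cutoff
    return weights * mask

# Example: prune 90% of a random 64x64 weight matrix
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))
W_pruned = magnitude_prune(W, sparsity=0.9)
```

In practice, pruning is usually followed by fine-tuning to recover accuracy; this sketch only illustrates the sparsification step itself.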

Bio: Zhuang Liu is currently a Research Scientist at Meta AI Research (FAIR) in New York City. He received his Ph.D. from UC Berkeley EECS in 2022, advised by Trevor Darrell. His research areas include deep learning and computer vision. His work focuses on scaling neural networks both up and down to build capable models and to understand their behavior in different computational and data environments, and is broadly applied across areas of computing and other disciplines. He is a recipient of the CVPR 2017 Best Paper Award.

Cost
Free
