Computer Science Events
[PAST EVENT] Colloquium: Demystifying Unsupervised Feature Learning
March 2, 2012
8:00am - 8:50am
Successful applications of Machine Learning typically require us to expend significant effort engineering new features and representations before applying a learning algorithm. For the most challenging applications in AI, such as Computer Vision, the search for good high-level representations is vast and ongoing. Recently, many researchers have sought to create algorithms that can learn features from data automatically. In particular, a growing variety of Unsupervised Feature Learning (UFL) algorithms are now able to learn useful representations from unlabeled datasets. Yet despite steadily improving benchmark scores, many mysteries remain: with a proliferation of new models and techniques, it has been difficult to explain exactly what makes some methods perform well and others perform poorly.
In this talk, we'll investigate a variety of factors that can affect the performance of feature learning algorithms. Through a detailed study, a surprising picture emerges: we will find that many schemes succeed or fail as a result of a few key factors that are often unrelated to the particular learning method we use. In fact, by focusing solely on these factors and using intuitively simple learning algorithms, it is often possible not only to achieve state-of-the-art performance on common benchmarks but also to discover meaningful, high-level concepts from unlabeled data without any supervision.
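The abstract does not name a specific method, only "intuitively simple learning algorithms." As a purely illustrative sketch of what such a simple unsupervised feature learner might look like, the Python snippet below clusters whitened image patches with K-means and uses the centroids as a feature dictionary; the patch size, number of centroids, function names (extract_patches, encode), and the synthetic data are all assumptions for demonstration, not details taken from the talk.

# Illustrative sketch only; assumes a simple K-means-based UFL recipe.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in for unlabeled data: 500 random 32x32 "images".
images = rng.random((500, 32, 32))

def extract_patches(imgs, patch_size=6, n_patches=10000):
    """Sample random square patches from the unlabeled images."""
    patches = np.empty((n_patches, patch_size * patch_size))
    for i in range(n_patches):
        img = imgs[rng.integers(len(imgs))]
        r = rng.integers(img.shape[0] - patch_size)
        c = rng.integers(img.shape[1] - patch_size)
        patches[i] = img[r:r + patch_size, c:c + patch_size].ravel()
    return patches

patches = extract_patches(images)

# Per-patch brightness/contrast normalization.
patches -= patches.mean(axis=1, keepdims=True)
patches /= patches.std(axis=1, keepdims=True) + 1e-8

# ZCA whitening: decorrelate pixels so clustering is not dominated
# by low-frequency correlations between neighboring pixels.
cov = np.cov(patches, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
zca = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + 0.1)) @ eigvecs.T
patches_white = patches @ zca

# Learn a feature dictionary with plain K-means.
kmeans = KMeans(n_clusters=100, n_init=10, random_state=0).fit(patches_white)
centroids = kmeans.cluster_centers_

def encode(patch_white):
    """Soft 'triangle' encoding: max(0, mean distance - distance to each centroid)."""
    dists = np.linalg.norm(centroids - patch_white, axis=1)
    return np.maximum(0.0, dists.mean() - dists)

features = encode(patches_white[0])
print(features.shape)  # (100,) feature vector for one patch

In practice, such features would be pooled over image regions and fed to a linear classifier; the point of the sketch is only that each stage (patch extraction, normalization, whitening, clustering, encoding) is simple, echoing the abstract's claim that key factors other than the learning algorithm often drive performance.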
----------
Adam Coates is a Ph.D. candidate in Computer Science at Stanford University, where he is advised by Professor Andrew Ng. His current goal is to enable learning algorithms to acquire experience and knowledge from unlabeled data, though his interests and prior work touch on topics in Reinforcement Learning, Robotics, and Computer Vision. He has received Best Student Paper awards from ICML (2008) and ICDAR (2011), but his favorite endorsement is still the Instrument Rating on his pilot's license.
Contact
Department of Computer Science