In handling a wide range of experiences, from data instances, knowledge, and constraints to rewards, adversaries, and lifelong interplay across an ever-growing spectrum of tasks, contemporary ML/AI research has produced thousands of models, learning paradigms, and optimization algorithms, not to mention countless approximation heuristics, tuning tricks, and black-box oracles, plus combinations of all of the above. While pushing the field forward rapidly, these results also make a comprehensive grasp of existing ML techniques increasingly difficult, and make standardized, reusable, repeatable, reliable, and explainable practice and further development of ML/AI products quite costly, if possible at all.
This tutorial presents a systematic, unified perspective on machine learning, offering both a refreshing, holistic understanding of the diverse learning algorithms and guidance on operationalizing machine learning to create problem solutions that integrate all sources of experience.
The tutorial consists of three parts: (1) Theory: a systematic blueprint of ML that provides a unified mathematical formulation for learning with all experiences; (2) Tooling: software that operationalizes the framework and enables easy composition of ML solutions; (3) Infrastructure: computing infrastructure for productive ML, spanning interoperation, automatic tuning, distribution, and scheduling.
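As a preview of Part 1, the Standard Equation can be sketched in the following variational form (a sketch of one common rendering; the exact notation and weighting of the terms vary across presentations of the blueprint):

```latex
\min_{q,\,\theta}\; -\,\alpha\,\mathbb{H}(q)
  \;+\; \beta\,\mathbb{D}\big(q(t),\, p_\theta(t)\big)
  \;-\; \mathbb{E}_{q(t)}\big[ f(t) \big]
```

Here p_θ is the model being trained, q is an auxiliary distribution over the target variable t, H is an entropy (uncertainty) term, D is a divergence such as cross entropy, and f is an "experience function" that scores t against whatever experience is available (data instances, rules, rewards, adversaries). Particular choices of f, D, α, and β recover familiar paradigms: for example, supervised MLE when f measures fit to labeled data, or policy optimization when f is a reward.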
A blueprint of ML paradigms for ALL experiences
Background
The Standard Equation
The zoo of optimization solvers
Applications of the Standard Equation
Compose your ML solutions like playing with Lego
ML solution design by learning with all experiences
Compositionality in ML
Texar: an open-source tool for ML composition (see the first sketch after this outline)
Interoperation, automatic tuning, distribution, and scheduling
Interoperation: Forte
Automatic tuning: TUUN
Automatic distribution: AutoDist
Automatic scheduling: AdaptDL (see the second sketch after this outline)
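To make the Lego analogy of Part 2 concrete, here is a minimal sketch of modular composition, written in plain PyTorch rather than against Texar's own module catalog: components that share an interface can be swapped without touching the rest of the solution, which is the design principle Texar builds on. The encoder classes below are illustrative stand-ins, not Texar modules.

```python
# Minimal sketch of Lego-style composition (plain PyTorch; the encoders
# below are illustrative stand-ins, not Texar modules).
import torch
import torch.nn as nn

class BagOfWordsEncoder(nn.Module):
    """Averages token embeddings into a single vector."""
    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        return self.embed(tokens).mean(dim=1)

class RNNEncoder(nn.Module):
    """Encodes a token sequence with a GRU; same interface as above."""
    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        _, state = self.rnn(self.embed(tokens))
        return state[-1]

def build_classifier(encoder: nn.Module, dim: int, num_classes: int) -> nn.Module:
    # Any encoder with the shared (tokens -> vector) interface plugs in here.
    return nn.Sequential(encoder, nn.Linear(dim, num_classes))

model = build_classifier(RNNEncoder(vocab_size=10000, dim=128), 128, 2)
# Swapping the component is a one-line change:
model = build_classifier(BagOfWordsEncoder(vocab_size=10000, dim=128), 128, 2)
logits = model(torch.randint(0, 10000, (4, 16)))  # (batch=4, classes=2)
```

In Texar itself the interchangeable pieces are ready-made NLP modules (embedders, encoders, decoders, and so on) that plug together in the same way.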
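For Part 3, the following sketch illustrates adaptive scheduling in the style of AdaptDL's PyTorch interface. The wrapper and function names follow AdaptDL's published examples as recalled here; treat the exact API as an assumption to verify against the current documentation.

```python
# Sketch of elastic training with AdaptDL (names per AdaptDL's examples;
# treat the exact API as an assumption and check the current docs).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import TensorDataset
import adaptdl.torch as adl

model = nn.Linear(784, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

adl.init_process_group("gloo")                      # join the elastic job
model = adl.AdaptiveDataParallel(model, optimizer)  # sync state across replicas

dataset = TensorDataset(torch.randn(1024, 784),
                        torch.randint(0, 10, (1024,)))
loader = adl.AdaptiveDataLoader(dataset, batch_size=32, shuffle=True)

# remaining_epochs_until lets a job resume mid-training after the
# scheduler adds or removes replicas.
for epoch in adl.remaining_epochs_until(10):
    for x, y in loader:
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        optimizer.step()
```

Submitted to an AdaptDL-managed cluster, such a job can be grown or shrunk by the scheduler as resources allow, with training resuming from where it left off.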
CTO @ Petuum Inc.
IAS Seminar on Theoretical Machine Learning: "A Blueprint of Standardized and Composable Machine Learning"
Invited Talk @ ODSC 2019: "Compositionality in Machine Learning"
AAAI 2020 Tutorial: "Modularizing Natural Language Processing"
NeurIPS 2019 Workshop: "Learning with Rich Experience (LIRE): Integration of Learning Paradigms"
ICML 2018 Workshop: "Theoretical Foundations and Applications of Deep Generative Models"