Practical Machine Learning: Foundations to Neural Networks

In this course series, you'll progress from formulating machine learning tasks from first principles, drawing on probability theory, statistics, and both Bayesian and frequentist approaches, to building linear and neural network models, using maximum likelihood estimation and Bayesian methods to fit their parameters for regression and classification. The series also lets you sample content from the first course in the online Master of Engineering in Computer Engineering (MEng: CE) degree program.

Faculty

Peter Chin

Professor of Engineering

Director, Learning, Intelligence + Signal Processing (LISP) Lab

See Peter Chin's full profile.

Courses

Join us in a deep dive into the mathematical heart of machine learning. This specialization is designed to bridge the gap between foundational theory and practical mastery, guiding you as you develop the rigor needed to solve complex data challenges. Throughout this series, you will transition from exploring the nuances of probability and statistics to implementing sophisticated neural networks.

True expertise is built through discovery. Our hands-on laboratory exercises move beyond high-level software packages, inviting you to build models from first principles. By implementing linear regression, the perceptron algorithm, and a variety of neural networks from the ground up, you will cultivate a deep, intuitive understanding of the underlying mathematics. This approach ensures that you don't just use machine learning; you understand how to apply it.
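As an illustration of the from-scratch style these labs emphasize, here is a minimal sketch of the classic perceptron learning rule written with nothing but NumPy. The synthetic data, epoch count, and function name are illustrative assumptions, not course materials.

```python
import numpy as np

def perceptron_train(X, y, epochs=20):
    """Classic perceptron learning rule on labels y in {-1, +1} (illustrative sketch)."""
    w = np.zeros(X.shape[1])  # weight vector, one entry per feature
    b = 0.0                   # bias term
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when the point is misclassified: y * (w.x + b) <= 0
            if yi * (np.dot(w, xi) + b) <= 0:
                w += yi * xi
                b += yi
    return w, b

# Toy linearly separable data (assumed for demonstration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

w, b = perceptron_train(X, y)
preds = np.sign(X @ w + b)
print("training accuracy:", (preds == y).mean())
```

Writing the update rule by hand like this, rather than calling a library estimator, is the kind of exercise that builds the intuition described above.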

If you are thinking about applying to the online MEng: CE degree program, this specialization offers a preview: its courses draw from the first few weeks of the program's Machine Learning course, giving you a sample of the content and teaching style.