
RWTHx: Basics of Machine Learning

"Basics of Machine Learning" introduces participants to the fundamental concepts and tools of machine learning, including probability density estimation, linear regression, classification techniques, ensemble methods, and deep neural networks.

8 weeks
6–8 hours per week
Instructor-paced
Instructor-led on a course schedule
This course is archived

About this course


"Basics of Machine Learning" is designed to provide participants with a comprehensive understanding of the fundamental concepts and tools of machine learning. The course covers key topics such as probability density estimation, linear regression, classification techniques like linear discriminants, logistic regression, and support vector machines, as well as ensemble methods such as bagging and boosting. Additionally, the course introduces the basics of deep neural networks, laying the groundwork for more advanced learning techniques.

Throughout the course, students will gain a solid foundation in the fundamental approaches of machine learning. By working on practical exercises, participants will cement their understanding of the techniques covered and gain valuable hands-on experience.

By the end of the course, students will have the knowledge and skills required to confidently utilize machine learning tools and techniques in their own projects, providing a strong foundation for further study or professional development in this rapidly evolving field.

At a glance

  • Institution: RWTHx
  • Subject: Computer Science
  • Level: Intermediate
  • Prerequisites:
    • Basic linear algebra (vectors, matrices)
    • Basics of probability & statistics (mean, variance, random variables, normal distribution)
    • Basic programming skills in Python
  • Language: English
  • Video Transcript: English

What you'll learn

  • Definition of Statistical Machine Learning
  • Probability density estimation
  • Definition and behavior of linear discriminant models
  • Linear regression
  • Logistic regression
  • Support Vector Machines
  • Ensemble Methods
  • Basics of Neural Networks

Week 1: Introduction, Definitions, and Core Principles

In the first week, we will provide an overview of the course and introduce the fundamental concepts of machine learning. Students will learn about the different types of learning, such as supervised, unsupervised, and reinforcement learning, as well as the key steps involved in developing a machine learning model, from data preprocessing to model evaluation and optimization.

Week 2: Probability Density Estimation

In week two, students will delve into probability density estimation, an essential technique for understanding the underlying structure of data. We will cover various methods, such as parametric and non-parametric approaches, and how they can be used for building machine learning models.
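To give a flavor of the two families of methods, here is a minimal NumPy sketch of our own (not taken from the course materials) that contrasts a parametric Gaussian fit, estimated by maximum likelihood, with a non-parametric kernel density estimate on synthetic data; the bandwidth of 0.5 is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=1000)

# Parametric: fit a Gaussian by maximum likelihood (sample mean and std).
mu_hat = data.mean()
sigma_hat = data.std()

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Non-parametric: kernel density estimate, averaging a Gaussian kernel
# placed on every sample.
def kde(x, samples, bandwidth=0.5):
    return gaussian_pdf(x[:, None], samples[None, :], bandwidth).mean(axis=1)

xs = np.linspace(-3.0, 7.0, 5)
parametric_density = gaussian_pdf(xs, mu_hat, sigma_hat)
nonparametric_density = kde(xs, data)
```

Both estimators should place most of their mass near the true mean of 2.0; the parametric fit commits to the Gaussian form, while the KDE lets the samples speak for themselves.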

Week 3: Linear Discriminants

During the third week, we will focus on linear discriminants and their use in classifying data points. Students will learn about the concept of decision boundaries and how to derive them using linear discriminant functions. We will also discuss how to solve decision problems by minimizing a least-squares objective, the limitations of the resulting linear classifiers, and introduce strategies for handling non-linearly separable data.
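The least-squares route to a linear discriminant can be sketched in a few lines of NumPy. This is our own illustration on synthetic two-class data (labels encoded as ±1), not course material; the blob locations and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated Gaussian blobs, labeled -1 and +1.
X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
               rng.normal(2.0, 1.0, size=(50, 2))])
t = np.concatenate([-np.ones(50), np.ones(50)])

# Augment with a bias feature, then minimize the least-squares objective
# ||X_aug w - t||^2 to obtain the discriminant weights.
X_aug = np.hstack([np.ones((100, 1)), X])
w, *_ = np.linalg.lstsq(X_aug, t, rcond=None)

# Decision rule: the sign of the linear discriminant function.
pred = np.sign(X_aug @ w)
accuracy = (pred == t).mean()
```

The decision boundary is the set of points where the discriminant function crosses zero; the course also discusses why this least-squares fit is sensitive to outliers.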

Week 4: Linear Regression

In week four, students will be introduced to linear regression, a fundamental technique for modeling continuous data. We will cover the basics of simple and multiple linear regression, discuss the concept of least squares estimation, and explore regularization as a measure against overfitting.
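As one concrete form of regularized least squares, the ridge solution has a closed form. The sketch below is our own toy example (synthetic data, arbitrary weights and penalty strength), not course material:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=n)

# Augment with a bias column, then solve the regularized normal equations
# (X^T X + lam * I) w = X^T y  -- least squares plus an L2 penalty.
X_aug = np.hstack([np.ones((n, 1)), X])
lam = 0.1
w = np.linalg.solve(X_aug.T @ X_aug + lam * np.eye(4), X_aug.T @ y)
```

With little noise and mild regularization, the recovered weights land close to the true ones; increasing `lam` shrinks them toward zero, which is exactly the overfitting counter-measure the week covers.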

Week 5: Logistic Regression

The fifth week will be dedicated to logistic regression, a powerful technique for binary classification tasks. Students will learn how to derive the logistic regression model, perform iterative optimization using first- and second-order methods, apply regularization, and explore the relations between generative and discriminative methods.
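A first-order version of this fits in a short NumPy sketch: gradient descent on the cross-entropy loss with a small L2 penalty. This is our own toy illustration (synthetic data, arbitrary learning rate and penalty), not course material:

```python
import numpy as np

rng = np.random.default_rng(3)
# Two Gaussian classes with a bias feature prepended, labels 0 / 1.
X = np.hstack([np.ones((200, 1)),
               np.vstack([rng.normal(-1.5, 1.0, size=(100, 2)),
                          rng.normal(1.5, 1.0, size=(100, 2))])])
y = np.concatenate([np.zeros(100), np.ones(100)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent on the regularized cross-entropy loss.
w = np.zeros(3)
lr, lam = 0.1, 0.01
for _ in range(500):
    p = sigmoid(X @ w)                     # predicted class probabilities
    grad = X.T @ (p - y) / len(y) + lam * w
    w -= lr * grad

accuracy = ((sigmoid(X @ w) > 0.5) == y).mean()
```

A second-order method such as Newton's (iteratively reweighted least squares) would replace the fixed step with a Hessian-based one and typically converge in far fewer iterations.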

Week 6: Support Vector Machines

In week six, we will explore support vector machines (SVMs), a versatile and robust algorithm for classification tasks. Students will learn about the key concepts behind SVMs, such as the maximum-margin principle and kernel functions, and gain hands-on experience implementing SVMs using popular machine learning libraries.
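To make the maximum-margin idea concrete without a library, here is a minimal NumPy sketch of our own (not course material): a linear soft-margin SVM trained by batch subgradient descent on the regularized hinge loss. The data, penalty, and step size are arbitrary toy choices:

```python
import numpy as np

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-2.0, 1.0, size=(100, 2)),
               rng.normal(2.0, 1.0, size=(100, 2))])
y = np.concatenate([-np.ones(100), np.ones(100)])

# Minimize  lam/2 * ||w||^2 + mean(max(0, 1 - y * (w.x + b)))
# by subgradient descent; the first term enforces a large margin.
w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.1
for _ in range(1000):
    margins = y * (X @ w + b)
    viol = margins < 1                     # points violating the margin
    grad_w = lam * w - (y[viol][:, None] * X[viol]).sum(axis=0) / len(y)
    grad_b = -y[viol].sum() / len(y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = (np.sign(X @ w + b) == y).mean()
```

Kernel functions extend this linear machine to non-linear boundaries by replacing dot products with kernel evaluations, which the week covers in detail.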

Week 7: Ensemble Methods

During the seventh week, we will delve into ensemble methods, which combine multiple models to improve the overall performance of a machine learning system. Students will explore popular techniques such as bagging and boosting, and learn how to implement the AdaBoost algorithm.
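The AdaBoost loop itself is short: each round fits a weak learner to reweighted data, then upweights the examples it got wrong. Below is our own NumPy sketch with axis-aligned decision stumps as the weak learners (synthetic data and round count are arbitrary), not the course's reference implementation:

```python
import numpy as np

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(-1.5, 1.0, size=(100, 2)),
               rng.normal(1.5, 1.0, size=(100, 2))])
y = np.concatenate([-np.ones(100), np.ones(100)])
n = len(y)

def stump_predict(X, j, thr, s):
    """Axis-aligned decision stump: sign of (x_j - thr), flipped by s."""
    p = s * np.sign(X[:, j] - thr)
    p[p == 0] = s
    return p

def best_stump(X, y, w):
    """Pick the stump with the smallest weighted training error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for s in (1, -1):
                err = w[stump_predict(X, j, thr, s) != y].sum()
                if err < best_err:
                    best, best_err = (j, thr, s), err
    return best, best_err

# AdaBoost: each round upweights the examples previous stumps got wrong.
weights = np.ones(n) / n
ensemble = []
for _ in range(20):
    (j, thr, s), err = best_stump(X, y, weights)
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
    weights *= np.exp(-alpha * y * stump_predict(X, j, thr, s))
    weights /= weights.sum()
    ensemble.append((alpha, j, thr, s))

def predict(X):
    score = sum(alpha * stump_predict(X, j, thr, s)
                for alpha, j, thr, s in ensemble)
    return np.sign(score)

accuracy = (predict(X) == y).mean()
```

The final classifier is a weighted vote of the stumps; bagging, by contrast, would train each stump on a bootstrap resample and give every vote equal weight.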

Week 8: Neural Network Basics

In the final week, students will be introduced to the foundations of deep learning and neural networks. We will cover the basics of artificial neurons, feedforward networks, and backpropagation. This week will provide the groundwork for more advanced topics in deep learning, preparing students for further study or simple practical applications in the field.
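These three ingredients fit in one small sketch: our own toy network (not course material) with one hidden layer of sigmoid units, trained by backpropagation on XOR, the classic task a single linear unit cannot represent. The layer size, learning rate, and iteration count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(6)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])   # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units, trained on squared error.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
lr = 1.0

initial_loss = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)
for _ in range(10000):
    h = sigmoid(X @ W1 + b1)                 # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)               # forward pass, output
    d_out = (out - y) * out * (1 - out)      # chain rule at the output
    d_h = (d_out @ W2.T) * h * (1 - h)       # error propagated backward
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

final_loss = np.mean((out - y) ** 2)
```

Backpropagation here is just the chain rule applied layer by layer; with this setup the network typically learns XOR, and the same machinery scales to the deep architectures that build on this course.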

Who can take this course?

Unfortunately, learners residing in one or more of the following countries or regions will not be able to register for this course: Iran, Cuba and the Crimea region of Ukraine. While edX has sought licenses from the U.S. Office of Foreign Assets Control (OFAC) to offer our courses to learners in these countries and regions, the licenses we have received are not broad enough to allow us to offer this course in all locations. edX truly regrets that U.S. sanctions prevent us from offering all of our courses to everyone, no matter where they live.
