There is a growing interest in running software at the edge. This course takes a deep dive into the use cases and applications of Kubernetes at the edge using examples, labs, and a technical overview of the K3s project and the cloud native edge ecosystem.
This course is designed for those interested in learning more about Kubernetes, as well as in deploying applications or embedded sensors to edge locations. While learners do not need a Kubernetes certification for this course, experience with a Linux operating system and shell scripting will be beneficial. Programming experience is also not strictly required. Learners will need to be able to run Docker on their computer.
In this course, you will learn the use cases for running compute in edge locations and about supporting projects and foundations such as LF Edge and the CNCF. The course covers how to deploy applications to the edge with open source tools such as K3s and k3sup, and how those tools can be applied to low-power hardware such as the Raspberry Pi. You will learn about the challenges of edge compute, such as partial availability and the need for remote access. Through practical examples, you will gain experience deploying applications to Kubernetes and get hands-on with object storage, MQTT, and OpenFaaS. The course also introduces the fleet management and GitOps models of deployment, and covers messaging and how to interface with sensors and real hardware.
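As a taste of the messaging side, a sensor application at the edge might publish its readings to an MQTT broker running on the cluster. The minimal Python sketch below uses the paho-mqtt library to publish a single reading; the broker address, topic name, and payload shape are illustrative assumptions, not values from the course.

    import json
    import time

    import paho.mqtt.publish as publish  # pip install paho-mqtt

    # Hypothetical broker address and topic; replace with your own.
    BROKER_HOST = "broker.local"
    TOPIC = "sensors/room-1/temperature"

    def publish_reading(celsius: float) -> None:
        # Package a single reading as JSON and send it to the broker.
        payload = json.dumps({"celsius": celsius, "timestamp": time.time()})
        publish.single(TOPIC, payload, hostname=BROKER_HOST, qos=1)

    if __name__ == "__main__":
        publish_reading(21.5)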
This course will enable developers to learn about the growing impact the cloud native movement is having on modernizing edge deployments. They will also learn about the challenges of deploying Kubernetes at the edge through a concrete example: the K3s project.
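For a feel of what working against a K3s cluster looks like, the sketch below uses the official Kubernetes Python client to list the cluster's nodes and report whether each is Ready, the kind of check that matters when edge nodes are only intermittently reachable. It assumes a kubeconfig is already available locally, for example one fetched by k3sup or copied from /etc/rancher/k3s/k3s.yaml on the server.

    from kubernetes import client, config  # pip install kubernetes

    # Load cluster credentials from the local kubeconfig (e.g. fetched by k3sup).
    config.load_kube_config()

    v1 = client.CoreV1Api()
    for node in v1.list_node().items:
        # The "Ready" condition indicates whether the node is currently serving.
        ready = next(
            (c.status for c in node.status.conditions if c.type == "Ready"),
            "Unknown",
        )
        print(f"{node.metadata.name}: Ready={ready}")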
You should be familiar with the Linux operating system and with common CLI commands: passing arguments, using configuration files, and configuring networking.
A basic understanding of, or some prior experience with, deploying applications to Kubernetes will also be helpful.
You will need to be able to run Docker on your computer.
Welcome
Ch 1. The Case for Edge Compute
Ch 2. The Edge Compute Landscape
Ch 3. Scaling Down and System-on-Chip Devices
Ch 4. What Is K3s and Why Is It Needed?
Ch 5. Setting Up Your Lab Environment
Ch 6. Kubernetes API Primitives
Ch 7. Functions at the Edge
Ch 8. Command & Control and Remote Access
Ch 9. Deployment Strategies for Applications at the Edge
Ch 10. Challenges with Edge
Ch 11. Further Resources
Final Exam (verified track only)