Project 4: Microservices at Scale using Kubernetes — AWS Cloud DevOps Engineer Nanodegree Program
The cloud is perfect for operationalizing microservices at scale using Kubernetes. In this fourth project of the AWS Cloud DevOps Engineer Nanodegree program, I will show how to do exactly that by deploying an elastic, fault-tolerant Machine Learning inference API on Kubernetes.
By the end of this project, you will have learned to create and deploy a Kubernetes cluster, configure Kubernetes autoscaling, and load test a Kubernetes application. The main steps are listed below, followed by rough sketches of what each can look like:
1. Test your project code using linting (covered by the CircleCI sketch after this list)
2. Complete a Dockerfile to containerize this application (Dockerfile sketch below)
3. Deploy your containerized application using Docker and make a prediction (sketch below)
4. Improve the log statements in the source code for this application (logging sketch below)
5. Configure Kubernetes and create a Kubernetes cluster (kubectl sketch below)
6. Deploy a container using Kubernetes and make a prediction (kubectl sketch below)
7. Upload a complete GitHub repo with CircleCI to indicate that your code has been tested (CircleCI sketch below)
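Here is a minimal sketch of the Dockerfile for step 2. The base image, the file names (requirements.txt, app.py), and the exposed port are assumptions about the project layout, not the official starter code:

```dockerfile
# Minimal sketch of a Dockerfile for a Python/Flask inference API.
# Base image, file names, and port are assumptions, not the official scaffold.
FROM python:3.10-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application source (and any pre-trained model artifacts)
COPY . .

# The Flask app is assumed to listen on port 80 inside the container
EXPOSE 80

CMD ["python", "app.py"]
```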
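For step 3, deploying locally with Docker and making a prediction can look roughly like this; the image name, ports, /predict endpoint, and JSON payload are placeholders:

```bash
#!/usr/bin/env bash
# Sketch: build the image, run it locally, and send a test prediction.
# Image name, ports, endpoint, and payload are assumptions.

docker build -t ml-inference-api .

# Run in the background, mapping container port 80 to localhost:8000
docker run -d --rm -p 8000:80 --name ml-api ml-inference-api

# Give the API a moment to start, then post a placeholder JSON payload
sleep 3
curl -s -X POST http://localhost:8000/predict \
  -H "Content-Type: application/json" \
  -d '{"features": [0.1, 0.2, 0.3]}'

docker stop ml-api

# Before the Kubernetes step, the image would also be tagged and pushed
# to a registry such as Docker Hub with `docker tag` and `docker push`.
```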
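Step 4 is about making the prediction path observable. A minimal Flask sketch of improved log statements, assuming a /predict endpoint and a stand-in for the real model:

```python
import logging

from flask import Flask, jsonify, request

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)
LOG = logging.getLogger(__name__)


def fake_predict(features):
    """Placeholder for the real model; returns a dummy score."""
    return sum(features) / max(len(features), 1)


@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    # Log the incoming payload so failed requests can be traced in the container logs
    LOG.info("JSON payload received: %s", payload)

    features = payload.get("features", [])
    prediction = fake_predict(features)

    # Log the output as well, not just the input
    LOG.info("Prediction output: %s", prediction)
    return jsonify({"prediction": prediction})


if __name__ == "__main__":
    # Port 80 matches the EXPOSE in the Dockerfile sketch above
    app.run(host="0.0.0.0", port=80)
```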
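For steps 5 and 6, one way to create a cluster, deploy the image, and make a prediction is sketched below with minikube and kubectl; a managed cluster such as Amazon EKS works just as well. The deployment name, image reference, and ports are assumptions, and the autoscaling line ties back to the autoscaling goal mentioned above:

```bash
#!/usr/bin/env bash
# Sketch: stand up a local cluster, deploy the containerized API, and test it.
# Deployment name, image reference, and ports are assumptions.

# Create a local Kubernetes cluster
minikube start

# Deploy the pushed image as a single-pod deployment
# (DOCKERHUB_USER is a placeholder for your Docker Hub id)
kubectl create deployment ml-api --image="docker.io/${DOCKERHUB_USER}/ml-inference-api"

# Optional: let Kubernetes scale the API out under load (requires metrics-server)
kubectl autoscale deployment ml-api --min=1 --max=5 --cpu-percent=70

# Wait for the pod, then forward local port 8000 to container port 80
kubectl wait --for=condition=Available deployment/ml-api --timeout=120s
kubectl port-forward deployment/ml-api 8000:80 &
sleep 3

# Make a prediction through the forwarded port (placeholder payload)
curl -s -X POST http://localhost:8000/predict \
  -H "Content-Type: application/json" \
  -d '{"features": [0.1, 0.2, 0.3]}'
```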
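Finally, wiring the GitHub repo to CircleCI keeps the badge green only when the lint checks from step 1 pass. A minimal .circleci/config.yml sketch, assuming a pylint-based lint step and an app.py entry point:

```yaml
# Sketch of .circleci/config.yml: run the lint checks on every push.
# Job name, image tag, and linted file names are assumptions about the layout.
version: 2.1

jobs:
  lint:
    docker:
      - image: cimg/python:3.10
    steps:
      - checkout
      - run:
          name: Install dependencies and lint tooling
          command: |
            python -m venv ~/.devops
            . ~/.devops/bin/activate
            pip install -r requirements.txt pylint
      - run:
          name: Lint application code
          command: |
            . ~/.devops/bin/activate
            # A Dockerfile linter such as hadolint could be added here too
            pylint --disable=R,C app.py

workflows:
  build:
    jobs:
      - lint
```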
Along the way, you will have learned to:
- Operationalize both existing and new microservices and apply container best practices
- Deploy Machine Learning microservices that are elastic and fault tolerant
- Pick the appropriate abstraction for microservices: Serverless (AWS Lambda) or Container Orchestration (Kubernetes)
You are prepared to continue your learning journey with me in the Cloud Developer Nanodegree program of the Udacity Technology Scholarship powered by Bertelsmann.
Happy coding!
@ClaytonWert Palak Sadani @GabrielDalporto #30DaysofUdacity @Udacity #nanodegree #UdacityTechScholars @ThomasRabe #Bertelsmann #50000chances #PoweredByBertelsmann #femaleleaders #tech #fintech #womeninstem #womeninfintech #womenwhocode #womenleaders #womenleadership #womenspeakers #womeninbusiness