I just completed the Udemy course with the same title as this blog post.
The course covers three machine learning examples, each using a different library:
- scikit-learn: a regression model trained on the California housing dataset for house price prediction.
- spaCy: a small English model from the spaCy NLP framework for named entity recognition.
- Keras: the ResNet50 computer vision deep learning model from the Keras framework for image recognition.
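To give a concrete feel for the first example, here is a minimal sketch of training a house-price regression model on the California housing dataset with scikit-learn. The model choice (LinearRegression) and the split parameters are my own assumptions for illustration, not necessarily what the course uses:

```python
from sklearn.datasets import fetch_california_housing
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Load the California housing dataset (target is price in units of $100,000)
X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train a simple linear regression model
model = LinearRegression()
model.fit(X_train, y_train)

# Predict the price for one held-out house
pred = model.predict(X_test[:1])
print(f"Predicted price: ${pred[0] * 100_000:,.0f}")
```

Once a model like this is trained, deploying it behind an API is where the rest of this post comes in.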
Let me highlight the value-added changes in my code compared to the original source code from the course:
- The examples use Python 3.8 instead of Python 3.6 (as in the course) and the latest versions of the libraries, except the scikit-learn example, which uses an earlier version of scikit-learn to stay compatible with m2cgen. m2cgen makes it possible to convert a scikit-learn model to native code, including C code that can run on a microcontroller with emlearn.
- To deploy all the examples, you only need git; no other installations are required. All examples are set up with a continuous deployment pipeline using GitHub Actions. The following diagram shows how it works at a high level:
- The developer pushes changes to the repository with a git push command.
- The git push triggers the GitHub Actions workflow, which runs the serverless deploy command based on the configuration in the serverless.yml file.
- A CloudFormation stack consisting of API Gateway, AWS Lambda, S3, ECR, etc. is created on AWS.
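The pipeline above can be sketched as a GitHub Actions workflow along these lines (the file name, branch, and step details are my assumptions for illustration, not necessarily the exact workflow in the repo):

```yaml
# .github/workflows/deploy.yml (sketch)
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
      - run: npm install -g serverless
      - name: serverless deploy
        run: serverless deploy
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_KEY }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET }}
```

The AWS credentials come from the repository secrets described below, so no keys ever live in the repository itself.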
After you fork my repository, you just need to add AWS_KEY and AWS_SECRET to the Repository secrets under Settings/Secrets of your GitHub repo, as shown in the following image. Then you can deploy all the examples to AWS Lambda with a git push command.
- All examples are set up with "warm start" and a request throttling limit of 1 to protect your account and your wallet (you don't want surprises in your monthly credit card bill for demo projects, right?).
- All examples include a final step, "Test Lambda functions", to verify that the public API is working post-deployment.
- The spaCy and Keras examples are deployed to AWS Lambda as Docker container images. Thanks to Jan Bauer, whose blog post Using container images to run TensorFlow models in AWS Lambda was published at just the right time to help me get the Keras example working.
- For the Keras example, the image is uploaded directly to the Lambda endpoint and stored as a temporary file, instead of being uploaded to an S3 bucket.
- The code for each example is located in its own branch.
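As a rough sketch of how the Keras example's handler might receive an image directly and stash it as a temp file: the handler below decodes the base64-encoded request body and writes it to /tmp (the only writable path in Lambda). The `classify` function is a stub standing in for the actual ResNet50 inference, so this is an illustration of the upload path, not the repo's real handler:

```python
import base64
import tempfile

def handler(event, context):
    # The request body carries the image bytes, base64-encoded by API Gateway
    image_bytes = base64.b64decode(event["body"])

    # Store the upload as a temp file; /tmp is the only writable path in Lambda
    with tempfile.NamedTemporaryFile(dir="/tmp", suffix=".jpg", delete=False) as f:
        f.write(image_bytes)
        image_path = f.name

    # In the real example, a Keras ResNet50 model would classify image_path;
    # stubbed here to keep the sketch self-contained
    label = classify(image_path)
    return {"statusCode": 200, "body": label}

def classify(image_path):
    # Placeholder for ResNet50 inference (model.predict + decode_predictions)
    return "tabby_cat"
```

Skipping the S3 round-trip keeps the demo simpler and avoids paying for bucket storage you then have to remember to clean up.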
Please take note of the storage costs of S3 and Elastic Container Registry (ECR); make sure you clean up unnecessary files and Docker images there.
Lastly, if you are looking for someone to help deploy your machine learning model to AWS Lambda, I think I can help here. :)
Oh yeah, in case you missed it, here is the GitHub repo URL: https://github.com/limcheekin/serverless-ml
I think the scope is broad enough for now; next, let's go deep.
Stay tuned for the next post!