Assignment 4
Topic: Containerized Machine Learning Model Application Deployment
Submitted by: Mohammed Samiuddin
USN: 3KC21CS030
College: KCT Engineering College, Gulbarga
______________________________________________________________________
Assignment completion steps:
1. Build and Train the Machine Learning Model
Prepare the Directory Structure
ml-app/
├── app/
│   ├── model.py
│   ├── app.py
│   └── requirements.txt
├── Dockerfile
└── kubernetes/
    ├── deployment.yaml
    └── service.yaml
Write the Model Code (model.py)
1. Write the following Python script to train and save the model:
import pickle

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier


def train_model():
    # Load the Iris dataset and train a random forest classifier
    data = load_iris()
    X, y = data.data, data.target
    model = RandomForestClassifier()
    model.fit(X, y)

    # Serialize the trained model to disk
    with open("model.pkl", "wb") as file:
        pickle.dump(model, file)
    print("Model saved as 'model.pkl'")


if __name__ == "__main__":
    train_model()
Train the Model
1. From the project root, run the script to generate model.pkl (it is written to the current working directory, which is where the Dockerfile expects it):
python app/model.py
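Before wiring the model into the API, the saved file can be checked with a quick load-and-predict sketch like the one below (the sample feature values are arbitrary Iris-style measurements used only for illustration):

import pickle

# Load the model saved by model.py and run a single prediction
with open("model.pkl", "rb") as file:
    model = pickle.load(file)

# Four features per sample: sepal length, sepal width, petal length, petal width
sample = [[5.1, 3.5, 1.4, 0.2]]
print("Predicted class:", model.predict(sample))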
Write the Flask API (app.py)
1. Create a Flask app to serve the model:
import pickle

from flask import Flask, request, jsonify

app = Flask(__name__)

# Load the trained model once at startup
with open("model.pkl", "rb") as file:
    model = pickle.load(file)


@app.route("/")
def home():
    return "Welcome to ML App by samiuddin!"


@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body of the form {"features": [x1, x2, x3, x4]}
    data = request.json
    prediction = model.predict([data["features"]])
    return jsonify({"prediction": prediction.tolist()})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
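As a quick local check before containerizing, the API can be exercised with Flask's built-in test client. This is only a sketch and assumes it is run from the project root (where model.pkl was saved) so that app/app.py is importable:

from app.app import app as ml_app

client = ml_app.test_client()

# Home route should return the welcome message
print(client.get("/").data)

# /predict expects a JSON body with a "features" list
response = client.post("/predict", json={"features": [5.1, 3.5, 1.4, 0.2]})
print(response.get_json())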
Add Dependencies in requirements.txt
1. Add the following dependencies:
Flask==2.3.3
scikit-learn==1.3.2
______________________________________________________________________
2. Containerize the Application Using Docker
Write the Dockerfile
1. Create a Dockerfile with the following content:
FROM python:3.9-slim

WORKDIR /app

# Copy the application code and the trained model into the image
COPY app/ /app/
COPY model.pkl /app/

# Install Python dependencies
RUN pip install -r requirements.txt

EXPOSE 5000
CMD ["python", "app.py"]
Build the Docker Image
1. Run the following command from the project root to build the image:
docker build -t ml-app:0.1 .
Test the Docker Image
1. In Docker Desktop, open the Images tab and click Run on the newly built ml-app image.
2. Click Optional settings.
3. Set the host port to 5000 and click Run. The container status then shows as running.
4. Open a browser and go to http://127.0.0.1:5000. The containerized application responds with the welcome message.
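With the container running and host port 5000 mapped (for example via the Docker Desktop settings above, or docker run -p 5000:5000 ml-app:0.1), the API can also be exercised from a short Python script. This sketch assumes the requests library is installed on the host (pip install requests):

import requests

BASE_URL = "http://127.0.0.1:5000"  # host port mapped to the container's port 5000

# Home route returns the welcome message
print(requests.get(BASE_URL).text)

# /predict expects {"features": [...]} and returns the predicted class
payload = {"features": [5.1, 3.5, 1.4, 0.2]}
print(requests.post(f"{BASE_URL}/predict", json=payload).json())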
3. Deploy the Application Using Kubernetes
Start Minikube (alternatively, Kubernetes can be enabled from Docker Desktop's settings).
1. Initialize a local Kubernetes cluster:
minikube start
2. Verify the cluster is running:
kubectl cluster-info
Create Kubernetes Manifests
1. Write deployment.yaml (the image field points to the copy of ml-app pushed to Docker Hub as lazzyxbug/ml-app; a locally built image must first be pushed to a registry the cluster can pull from):
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ml-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ml-app
  template:
    metadata:
      labels:
        app: ml-app
    spec:
      containers:
        - name: ml-app
          image: lazzyxbug/ml-app:latest
          ports:
            - containerPort: 5000
2. Write service.yaml:
apiVersion: v1
kind: Service
metadata:
  name: ml-app-service
spec:
  selector:
    app: ml-app
  ports:
    - protocol: TCP
      port: 80
      targetPort: 5000
  type: NodePort
Apply the Kubernetes Manifests
1. Deploy the application:
kubectl apply -f kubernetes/deployment.yaml
kubectl apply -f kubernetes/service.yaml
Verify Deployment
1. Check the status of the Deployment, Pods, and Service:
kubectl get deployments
kubectl get pods
kubectl get svc
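The same checks can be scripted with the official Kubernetes Python client. The sketch below is only illustrative and assumes the client is installed (pip install kubernetes), the default namespace is used, and the same kubeconfig that kubectl uses is available:

from kubernetes import client, config

# Use the kubeconfig that kubectl is already configured with
config.load_kube_config()

apps = client.AppsV1Api()
core = client.CoreV1Api()

for dep in apps.list_namespaced_deployment("default").items:
    print("Deployment:", dep.metadata.name, "ready replicas:", dep.status.ready_replicas)

for pod in core.list_namespaced_pod("default").items:
    print("Pod:", pod.metadata.name, pod.status.phase)

for svc in core.list_namespaced_service("default").items:
    print("Service:", svc.metadata.name, svc.spec.type)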
Access the Application
1. Use Minikube to access the service:
minikube service ml-app-service
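Once minikube service prints the service URL, the same /predict request used when testing the Docker image can be sent to it. The URL below is only a placeholder and should be replaced with the one Minikube reports:

import requests  # assumes requests is installed on the host

SERVICE_URL = "http://192.168.49.2:30000"  # placeholder; use the URL printed by minikube

payload = {"features": [5.1, 3.5, 1.4, 0.2]}
print(requests.post(f"{SERVICE_URL}/predict", json=payload).json())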