Using AI Services in AWS, Azure, and GCP
Supervised by: Mr. Outair
By: Boutoutla Radwane and EL Garte Mouhcine
Table of Contents
1. Introduction
   1.1. Introduction to the Topic
   1.2. AI in the Cloud Context
   1.3. Objective of the Report
2. Overview of Cloud AI Platforms
   2.1. Amazon SageMaker
   2.2. Azure Machine Learning Studio
   2.3. Google Vertex AI
3. Key Features
   3.1. AutoML Capabilities
   3.2. Model Deployment
   3.3. Notebooks and Visual Tools
   3.4. Service Integration
4. Real-World Use Cases
   4.1. Healthcare
   4.2. Finance
   4.3. Retail
   4.4. Transportation
5. Platform Comparison
   5.1. Strengths and Weaknesses
   5.2. Scalability and Cost Considerations
6. Choosing the Right Platform
   6.1. Based on Skill Level
   6.2. Based on Project Goals
   6.3. Based on Ecosystem
7. Trends and Future Outlook
   7.1. Generative AI
   7.2. End-to-End Automation
   7.3. Democratization of AI
   7.4. Responsible and Explainable AI
8. Conclusion
1. Introduction
1.1. Introduction to the Topic
Artificial Intelligence (AI) is transforming every industry, enabling organizations to
leverage massive volumes of data, improve decision-making, and automate complex
tasks. With the growing demand for AI capabilities, cloud providers have stepped in to
offer powerful platforms that simplify the development and deployment of machine
learning (ML) solutions.
AI is no longer the exclusive domain of large tech companies with specialized teams.
Thanks to cloud-based tools, organizations of all sizes and sectors can now experiment
with and implement AI-powered solutions at a fraction of traditional costs and
complexity.
1.2. AI in the Cloud Context
Cloud-based AI services provide scalability, flexibility, and integration with other cloud-
native tools, making them ideal for both startups and enterprises. Leading the market
are Amazon Web Services (AWS) with SageMaker, Microsoft Azure with Machine
Learning Studio, and Google Cloud Platform (GCP) with Vertex AI.
These platforms not only offer machine learning capabilities but also come with a suite
of tools for data preparation, model training, deployment, monitoring, and governance.
The shift to the cloud has accelerated AI adoption, enabling faster time-to-market and
more resilient AI systems.
1.3. Objective of the Report
This report aims to explore and compare these three major platforms in terms of their
features, functionalities, real-world applications, and suitability for different users and
business needs. It will guide the reader in selecting the best platform according to their
technical context and objectives.
The report is structured into eight sections, each focusing on a key aspect:
introduction, presentation of the platforms, functionalities, practical use cases,
comparison, criteria-based platform selection, future trends, and conclusion.
2. Overview of Cloud AI Platforms
2.1. Amazon SageMaker
Description: SageMaker is a comprehensive machine learning service that enables
developers and data scientists to build, train, and deploy ML models at scale. It
supports the full machine learning lifecycle.
Target Audience: Experienced developers, data scientists, and enterprises with mature
DevOps practices.
Integration: Deep integration with AWS ecosystem including S3, Lambda, IAM,
CloudWatch, and more.
Key Offerings:
• SageMaker Studio: A web-based IDE for ML development.
• SageMaker Autopilot: For AutoML workflows.
• SageMaker Model Monitor: For monitoring model behavior in production.
• SageMaker Pipelines: Workflow automation for MLOps.
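To give a concrete sense of the developer experience, the sketch below shows how a training job and managed endpoint might be created with the SageMaker Python SDK. It is a minimal, hypothetical example: the IAM role ARN, S3 paths, instance types, and algorithm version are placeholders that would need to be adapted to a real account.

```python
# Minimal sketch (not a complete project): train a built-in XGBoost model
# and deploy it to a managed endpoint with the SageMaker Python SDK.
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder role ARN

# Retrieve the URI of the managed XGBoost training container for this region.
image_uri = sagemaker.image_uris.retrieve(
    "xgboost", session.boto_region_name, version="1.7-1"
)

estimator = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/models/",        # placeholder S3 bucket
    sagemaker_session=session,
)

# Launch a managed training job on data already staged in S3.
estimator.fit({"train": "s3://my-bucket/data/train.csv"})

# Deploy the trained model to a scalable, managed HTTPS endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```

SageMaker Studio, Autopilot, and Pipelines build on the same Python SDK and service APIs, which is one reason the platform suits teams with established DevOps practices.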
2.2. Azure Machine Learning Studio
Description: Azure ML Studio offers a low-code/no-code interface with drag-and-drop
components for rapid ML model development and deployment.
Target Audience: Beginners, business analysts, and developers needing quick
prototyping.
Integration: Strong integration with Microsoft services such as Azure DevOps, GitHub,
Excel, and Power BI.
Key Offerings:
• Designer: Visual pipeline editor.
• Responsible AI Toolbox: Tools for fairness, explainability, and error analysis.
• Automated ML: Automated training and hyperparameter tuning.
• Azure ML Compute: Scalable training environments.
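For comparison, a similarly minimal sketch of submitting a training job through the Azure Machine Learning Python SDK (v2) is shown below; the subscription, resource group, workspace, compute cluster, and curated environment names are placeholders.

```python
# Minimal sketch: submit a script-based training job with the Azure ML SDK v2.
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",         # placeholder
    resource_group_name="<resource-group>",      # placeholder
    workspace_name="<workspace>",                # placeholder
)

# Define a command job: a script, an environment, and a compute target.
job = command(
    code="./src",                                # folder containing train.py
    command="python train.py --epochs 10",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",  # assumed curated env
    compute="cpu-cluster",                       # assumed pre-created cluster
    display_name="demo-training-job",
)

# Azure ML handles scheduling, logging, and artifact tracking for the run.
returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)
```

The same kind of workflow can also be assembled visually in the Designer, which is typically the entry point for beginners and business analysts.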
2.3. Google Vertex AI
Description: Vertex AI unifies Google’s ML offerings under one umbrella, enabling
seamless transitions from experimentation to production.
Target Audience: Data scientists and ML engineers with a focus on scalable data
workflows.
Integration: Natively integrated with BigQuery, TensorFlow, and Google Cloud services.
Key Offerings:
• Vertex AI Workbench: Jupyter environment with collaborative features.
• Vertex Pipelines: For orchestration of ML workflows.
• Vertex AI Model Registry: Centralized model management.
• Vertex Explainable AI: Interpretation tools for black-box models.
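As with the other platforms, a short hypothetical sketch illustrates the Vertex AI SDK; the project ID, region, Cloud Storage paths, and serving container are placeholders.

```python
# Minimal sketch: register a trained model and deploy it with the Vertex AI SDK.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")  # placeholders

# Register a model artifact stored in Cloud Storage with a prebuilt serving image.
model = aiplatform.Model.upload(
    display_name="demo-model",
    artifact_uri="gs://my-bucket/model/",        # placeholder GCS path
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

# Deploy to a managed endpoint with autoscaling between one and three replicas.
endpoint = model.deploy(
    machine_type="n1-standard-2",
    min_replica_count=1,
    max_replica_count=3,
)

# Request an online prediction from the deployed endpoint.
prediction = endpoint.predict(instances=[[5.1, 3.5, 1.4, 0.2]])
print(prediction.predictions)
```

Because Vertex AI is natively integrated with BigQuery, training data in practice often comes straight from SQL queries rather than exported files.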
3. Key Features
3.1. AutoML Capabilities
AutoML simplifies the model creation process, allowing users to focus on business
outcomes rather than algorithm selection or tuning.
• SageMaker Autopilot: Automatically explores models, tunes hyperparameters,
and generates notebooks.
• Azure AutoML: Easy setup, model interpretability, and integrations with visual
tools.
[Figure: AutoML workflow. Candidate configurations (features, algorithms, and hyperparameters) are trained, scored against user-defined metrics, and the best-performing model is selected.]
• Vertex AutoML: Advanced support for tabular, vision, and NLP models, tightly
coupled with BigQuery.
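To make the AutoML workflow more concrete, the sketch below uses Vertex AI's AutoML tabular training as an example; the dataset location, target column, and training budget are placeholder values.

```python
# Minimal sketch: an AutoML tabular classification job on Vertex AI.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")  # placeholders

# Create a managed tabular dataset from a CSV file in Cloud Storage.
dataset = aiplatform.TabularDataset.create(
    display_name="churn-data",
    gcs_source="gs://my-bucket/churn.csv",       # placeholder dataset
)

# AutoML searches over feature transformations, architectures, and
# hyperparameters, keeping the best candidate for the chosen objective.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)

model = job.run(
    dataset=dataset,
    target_column="churned",                     # placeholder label column
    budget_milli_node_hours=1000,                # roughly one node-hour of search
)
```

SageMaker Autopilot and Azure Automated ML expose the same idea through their own SDKs and web consoles.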
3.2. Model Deployment
Deployment options across platforms are designed to ensure scalability and reliability.
• SageMaker: Provides scalable, managed endpoints and inference pipelines.
[Figure: Typical SageMaker deployment flow using Autopilot and Lambda functions, covering training, evaluation, model registration, and deployment.]
• Azure: Offers deployment via containers, Azure Container Instances (ACI), and Azure Kubernetes Service (AKS).
• Vertex AI: Enables model deployment on demand with autoscaling and
monitoring.
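Once a model sits behind a managed endpoint, applications typically call it over HTTPS. The sketch below shows this for a SageMaker endpoint; the endpoint name and payload schema are placeholders and depend on the serving container.

```python
# Minimal sketch: call an already-deployed SageMaker endpoint from an application.
import json
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

response = runtime.invoke_endpoint(
    EndpointName="demo-endpoint",                # placeholder endpoint name
    ContentType="application/json",
    Body=json.dumps({"instances": [[5.1, 3.5, 1.4, 0.2]]}),  # placeholder payload
)

# The endpoint returns the model's prediction in the response body.
result = json.loads(response["Body"].read())
print(result)
```

Azure ML and Vertex AI endpoints are invoked in much the same way, through REST requests or their respective SDKs.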
3.3. Notebooks and Visual Tools
These platforms support both code-centric and visual development.
• SageMaker Notebooks: Preconfigured environments with secure IAM policies.
• Azure Designer: No-code ML with step-by-step workflow design.
• Vertex AI Workbench: Shared, collaborative JupyterLab environments.
3.4. Service Integration
Each platform connects to a wider cloud ecosystem for end-to-end workflows:
• AWS: Redshift, Glue, Athena, SageMaker Clarify for bias detection.
• Azure: Synapse Analytics, Power BI, Cognitive Services.
• GCP: Looker Studio, Cloud Storage, Dataflow, AI Hub.
4. Real-World Use Cases
4.1. Healthcare
AI in healthcare is revolutionizing diagnostics, personalized treatment, and operational
efficiency.
• Vertex AI: Used for patient sentiment analysis and radiology image
classification.
• SageMaker: Powers real-time diagnostic tools integrating hospital IoT devices.
• Azure ML: Deployed for capacity forecasting and EHR data modeling.
4.2. Finance
Financial institutions use AI for fraud prevention, customer profiling, and asset
management.
• SageMaker: Trains fraud detection models using transaction data and behavior
analysis.
• Azure ML: Helps in credit risk modeling and algorithmic trading.
• Vertex AI: Implements sentiment-driven trading bots and loan approval models.
4.3. Retail
Retailers harness AI to personalize services and optimize supply chains.
• Azure ML: Customer churn and loyalty scoring models.
• SageMaker: Collaborative filtering-based recommendation engines.
• Vertex AI: Inventory prediction using sales history and external data.
4.4. Transportation
Transport industries benefit from predictive analytics and real-time optimization.
• Vertex AI: Powers fleet route optimization and accident prediction.
• Azure ML: Enables dynamic pricing models for ride-sharing services.
• SageMaker: Predictive maintenance of heavy-duty vehicle components.
5. Platform Comparison
Feature               | SageMaker           | Azure ML Studio      | Vertex AI
AutoML Support        | Yes (Autopilot)     | Yes                  | Yes
No-Code Tools         | Limited             | Strong               | Moderate
MLOps Integration     | Strong              | Strong               | Moderate
Ease of Use           | Moderate            | High                 | Moderate
Integration Ecosystem | AWS native services | Microsoft ecosystem  | Google Cloud stack
Notebooks Support     | Jupyter             | Designer & Notebooks | AI Workbench
Pricing Flexibility   | Fine-grained        | Predictable          | Consumption-based
5.1. Strengths and Weaknesses
• SageMaker: Excellent for production but complex for beginners.
• Azure ML Studio: Easiest to get started with; its documentation is well suited to non-technical users.
• Vertex AI: Most cohesive with Google tools, ideal for big data workflows.
5.2. Scalability and Cost Considerations
• AWS offers flexibility in instance types and spot pricing.
• Azure offers cost-effective plans for small to mid-sized projects.
• GCP simplifies scaling through managed services and usage-based billing.
6. Choosing the Right Platform
6.1. Based on Skill Level
• Beginners: Azure ML Studio due to drag-and-drop tools.
• Intermediate Users: Vertex AI with BigQuery data and AutoML.
• Experts: SageMaker offers full control, advanced tuning, and MLOps tools.
6.2. Based on Project Goals
• Prototyping: Azure and Vertex are ideal for experimentation.
• Production Deployment: SageMaker and Azure ensure better CI/CD integration.
• Big Data Analytics: GCP (Vertex AI + BigQuery) offers the most seamless
experience.
6.3. Based on Ecosystem
• Choose AWS if already using EC2, S3, and other AWS services.
• Choose Azure if your organization is built around Office 365 or the broader Microsoft stack.
• Choose GCP if you're deeply integrated with Google Workspace or using Looker.
7. Trends and Future Outlook
As the field of Artificial Intelligence continues to evolve rapidly, cloud-based AI
platforms are not only keeping pace—they're actively shaping the future of
intelligent systems. The following trends are at the forefront of this
transformation:
7.1. Generative AI
• Generative AI refers to models that can produce new content—text, images,
code, audio, etc.—based on patterns learned from data. These models go
beyond traditional machine learning by not only making predictions but also
generating realistic outputs.
• All major cloud providers are integrating generative AI into their platforms:
• AWS: Offers Amazon Bedrock, a fully managed service for building generative AI
applications with foundation models from leading providers (such as Anthropic, AI21
Labs, Stability AI, and Meta). AWS also supports Hugging Face models through
SageMaker for users who want to fine-tune or deploy open-source transformers. A
minimal call sketch appears after this list.
• Microsoft Azure: Has partnered with OpenAI to provide seamless access to
models like GPT-3.5, GPT-4, and DALL·E via Azure OpenAI Service. This makes it
possible for Azure customers to embed advanced language, image, and code
generation capabilities into their apps with security and compliance built-in.
• Google Cloud Platform (GCP): Delivers cutting-edge generative AI through
Vertex AI. Users can access Google’s PaLM models (for language tasks),
Imagen (for image generation), and Gemini APIs—powerful multi-modal models
capable of understanding and generating across text, image, and audio inputs.
• These developments allow organizations to build chatbots, code assistants,
content generators, and more—without developing models from scratch.
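As a rough illustration of how these services are consumed, the sketch below calls a foundation model through Amazon Bedrock. The model ID and request schema follow Anthropic's Messages format on Bedrock; both are assumptions and vary by provider and model.

```python
# Minimal sketch: invoke a foundation model via Amazon Bedrock.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body in Anthropic's Messages format (assumed; differs per provider).
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize the benefits of managed ML platforms."}
    ],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```

Azure OpenAI Service and the Gemini APIs on Vertex AI follow similar request/response patterns through their own SDKs.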
7.2. End-to-End Automation
• End-to-end automation in ML refers to the ability to manage the entire machine
learning lifecycle—from data ingestion and preprocessing to model training,
evaluation, deployment, and monitoring—through automated pipelines.
• Each platform offers robust tools for this:
• SageMaker Pipelines (AWS): Enables the automation of ML workflows with
support for step-by-step execution, reuse of steps, versioning, and secure
sharing. Integrates with CloudWatch and EventBridge for monitoring and alerts.
• Azure ML Pipelines: Offers visual and code-based pipelines that support data
movement, model training, and deployment. Integrated with Azure DevOps and
GitHub Actions to streamline MLOps workflows.
• Vertex AI Pipelines (GCP): Based on Kubeflow Pipelines, they allow modular,
scalable automation using containers, orchestrated via Cloud Build, Cloud
Composer, and other GCP-native services. A minimal pipeline sketch appears after
this list.
• This trend is vital for improving efficiency, consistency, and reproducibility in
ML projects.
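The sketch below gives a flavor of pipeline automation using the Kubeflow Pipelines (kfp) SDK, which Vertex AI Pipelines executes; the component logic and storage path are placeholders, and SageMaker Pipelines and Azure ML Pipelines follow a similar step-and-graph model through their own SDKs.

```python
# Minimal sketch: a two-step pipeline defined with the Kubeflow Pipelines SDK (v2),
# compiled into a definition that Vertex AI Pipelines can run.
from kfp import dsl, compiler

@dsl.component
def prepare_data(raw_path: str) -> str:
    # Placeholder: a real step would clean, validate, and split the dataset.
    return raw_path

@dsl.component
def train_model(data_path: str) -> str:
    # Placeholder: a real step would fit a model and export the artifact.
    return f"trained on {data_path}"

@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(raw_path: str = "gs://my-bucket/data.csv"):  # placeholder path
    data = prepare_data(raw_path=raw_path)
    train_model(data_path=data.output)

# Compile the pipeline graph; the resulting file can then be submitted to
# Vertex AI Pipelines, where each step runs in its own container.
compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")
```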
7.3. Democratization of AI
• AI is becoming more accessible to a wider range of users—not just data
scientists and developers.
• Pre-trained models: Cloud platforms provide ready-to-use APIs and endpoints for
vision, NLP, and tabular models that users can plug into their applications with
minimal effort (see the sketch after this list).
• No-code/low-code interfaces: Tools like Azure ML Designer, SageMaker
Canvas, and Vertex AI AutoML UI enable non-technical users (e.g., business
analysts, marketers) to train and deploy models through simple interfaces.
• Citizen developers: Platforms are supporting more cross-functional
collaboration, allowing people from non-technical backgrounds to build smart
applications that solve real-world business problems.
• This democratization fosters innovation and accelerates digital
transformation across industries.
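The pre-trained-model route can be as simple as a single API call. The sketch below labels an image with Google's Cloud Vision API; the image file is a placeholder, and equivalent managed vision services exist on AWS (Rekognition) and Azure (Cognitive Services).

```python
# Minimal sketch: use a pre-trained vision model through the Cloud Vision API,
# with no model training at all.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("product_photo.jpg", "rb") as f:   # placeholder image file
    image = vision.Image(content=f.read())

# The managed service returns labels predicted by Google's pre-trained model.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score, 2))
```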
7.4. Responsible and Explainable AI
• As AI adoption increases, so does the need for transparency, fairness, and
accountability. Regulatory bodies and ethical guidelines now require systems
to be auditable and explainable.
• Fairness & Bias Detection:
o SageMaker Clarify, Azure Responsible AI dashboard, and Vertex
Explainable AI offer bias detection in datasets and models.
o These tools help ensure that models do not produce unfair outcomes for
different groups.
• Explainability:
o Techniques such as SHAP, LIME, and integrated explainers provide
insight into how models make decisions, especially in high-stakes
domains (e.g., healthcare, finance). A short SHAP sketch appears after this list.
• Auditability and Governance:
o Model versioning, metadata tracking, and lineage tools help meet
compliance requirements (e.g., GDPR, HIPAA).
o Platforms now emphasize responsible AI toolkits and governance
workflows as default components of any ML project.
• This trend ensures ethical use of AI and builds trust in intelligent systems—
crucial for societal acceptance and regulatory compliance.
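As a small, self-contained example of the explainability techniques mentioned above, the sketch below applies SHAP to a toy scikit-learn model; in a cloud setting, the same kind of analysis is surfaced through SageMaker Clarify, the Azure Responsible AI dashboard, or Vertex Explainable AI.

```python
# Minimal sketch: explain a model's predictions with SHAP on a toy dataset.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes per-feature contributions to each prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X.iloc[:200])

# Summary plot ranks features by their overall impact on the model output.
shap.summary_plot(shap_values, X.iloc[:200])
```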
8. Conclusion
Cloud-based AI platforms like AWS SageMaker, Azure ML Studio, and Google Vertex AI
are reshaping how organizations design, deploy, and maintain intelligent systems. Each
platform has its own strengths:
• SageMaker: Best for enterprise-scale deployments and MLOps.
• Azure ML Studio: Excellent for beginners and rapid development.
• Vertex AI: Ideal for data-centric environments with heavy analytics workloads.
Ultimately, the best choice depends on your team’s expertise, project scope,
integration needs, and budget. With rapid innovation in AI and cloud services, these
platforms will continue to evolve, offering more automation, better accessibility, and
tighter integration across ecosystems.
Organizations must keep up with these trends to maintain competitiveness and unlock
the full potential of AI.