
AWS Lambda Functions for Data Engineers Using Python


This course comprehensively explores AWS Lambda Functions, focusing on building an end-to-end data pipeline with Python, the Boto3 SDK, and essential AWS Services such as S3, DynamoDB, ECR, CloudWatch, Glue Catalog, Athena, and more.
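
To give a sense of the style of code the course works toward, here is a minimal sketch of a Python Lambda handler that writes its incoming event to S3 with Boto3. The bucket name and object key are placeholders, not values from the course project.

```python
import json

import boto3

# Placeholder bucket; the course project defines its own bucket layout.
BUCKET = "my-data-pipeline-bucket"

s3 = boto3.client("s3")


def lambda_handler(event, context):
    """Minimal Lambda entry point: persist the incoming event to S3 as JSON."""
    key = "landing/sample-event.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event).encode("utf-8"))
    return {"statusCode": 200, "body": f"wrote s3://{BUCKET}/{key}"}
```

The same function can be exercised locally, for example with `lambda_handler({"source": "test"}, None)`, once AWS credentials have been set up via `aws configure`.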

Here's a breakdown of what you'll cover:

1. **Setting Up the Development Environment**: Learn to set up the essential tools on Windows for developing ETL Data Pipelines using Python and AWS Services, including Ubuntu via WSL, Docker Desktop, and Visual Studio Code with the Remote Development extension pack.

2. **Getting Started with AWS**: Create an AWS account, configure the AWS CLI, and review the datasets used for the project.

3. **Developing Core Logic**: Ingest data from source systems into AWS S3 using Python and Boto3, use Pandas for date arithmetic, and interact with DynamoDB.

4. **AWS Lambda Functions**: Understand the basics of AWS Lambda Functions using the Python 3 runtime.

5. **Deployment and Validation**: Refactor the application, build a zip file for deployment, create Lambda Functions using zip files, and troubleshoot issues using AWS CloudWatch.

6. **Custom Docker Image for Application**: Build a custom Docker image for the application, push it to AWS ECR, and create Lambda Functions using the custom Docker image.

7. **S3 Event Notifications**: Gain insights into AWS S3 Event Notifications, which act as S3-based triggers for Lambda Functions.

8. **Data Transformation**: Develop a Python application that uses Pandas to transform data and write it to S3 in Parquet format (a minimal sketch follows this outline).

9. **Orchestrated Pipeline**: Chain the Lambda Functions into an orchestrated pipeline using AWS S3 Event Notifications, and schedule the first Lambda Function with Amazon EventBridge.

10. **Data Validation with Glue Catalog and Athena**: Create an AWS Glue Catalog table over the S3 location containing the Parquet files and validate the data by running SQL queries using AWS Athena.

11. **Lambda Layers**: Understand how to use Layers for Lambda Functions.
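
As an illustration of the transformation step in item 8, the sketch below reads a CSV object from S3 with Pandas and writes it back as Parquet. The bucket paths and the `order_date` column are hypothetical, and it assumes `pandas`, `pyarrow`, and `s3fs` are available in the function's environment (for example, via the custom Docker image or a Lambda Layer).

```python
import pandas as pd

# Hypothetical locations; the course project uses its own buckets and datasets.
SOURCE = "s3://my-data-pipeline-bucket/landing/orders/2024-01-01.csv"
TARGET = "s3://my-data-pipeline-bucket/curated/orders/2024-01-01.parquet"


def transform(event=None, context=None):
    """Read raw CSV from S3, derive a month column, and write Parquet back to S3."""
    df = pd.read_csv(SOURCE)                        # s3:// paths require s3fs
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
    df.to_parquet(TARGET, index=False)              # requires pyarrow or fastparquet
    return {"rows": len(df), "target": TARGET}
```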

**What You'll Learn**:

- Set up tools for ETL Data Pipelines using Python and AWS Services on Windows
- Develop applications using Python and AWS Services
- Create Lambda Functions, troubleshoot issues, and monitor logs using AWS CloudWatch
- Build custom Docker images for applications and push them to AWS ECR
- Develop Lambda Functions using zip bundles and custom Docker images
- Use Python modules as Lambda Layers (see the sketch after this list)
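
On the last point, a Lambda Layer for Python is simply a zip file whose packages sit under a top-level `python/` directory; Lambda unpacks attached layers to `/opt`, and `/opt/python` is on the import path at runtime. The sketch below shows a handler using a package shipped in a layer; `mymodule` and `helper.greet` are hypothetical names used only for illustration.

```python
# Expected layer layout (zipped and published as a Lambda Layer):
#
#   layer.zip
#   └── python/
#       └── mymodule/          # hypothetical shared package
#           ├── __init__.py
#           └── helper.py
#
# Because /opt/python is on sys.path at runtime, the handler imports it directly.

from mymodule import helper  # provided by the attached layer, not the function zip


def lambda_handler(event, context):
    """Keep the handler thin; shared utilities come from the layer."""
    return {"statusCode": 200, "body": helper.greet(event.get("name", "world"))}
```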

**Requirements**:

- A computer science or IT degree, or 1-2 years of IT experience
- Basic Linux skills for running commands in a terminal
- Proficiency in Python programming
- A valid AWS account to use AWS Services for building Data Pipelines

**Who Should Take This Course**:

- Individuals seeking hands-on experience with AWS Lambda Functions
- Aspiring Data Engineers and Data Scientists aiming to master AWS Lambda Functions for data processing
- Entry-level IT Professionals with basic programming experience who are eager to build complete projects using AWS and Python
- Experienced Application Developers interested in building end-to-end applications with Python and AWS Services
- Experienced Data Engineers keen on building end-to-end data pipelines using Python and AWS Services

AWS Lambda Functions for Data Engineers Using Python

Price: $695.00
  • Any pre-loaded packaged materials or subscription-based products, including device-based training programs and courses that include a device, are non-refundable. Digital products, including DVDs, may be returned for replacement if found defective.

  • Free shipping on all orders within the US. International shipping is available.
