How to Deploy Lambda Functions

Nov 6, 2025 - 10:17

Amazon Web Services (AWS) Lambda is a serverless compute service that lets you run code without provisioning or managing servers. It automatically scales your applications in response to incoming requests and charges only for the compute time consumed. Deploying Lambda functions is a foundational skill for modern cloud developers, DevOps engineers, and infrastructure architects aiming to build scalable, cost-efficient, and highly available applications.

Deploying Lambda functions involves packaging your code, configuring execution roles, setting triggers, and publishing versions, often through multiple environments (development, staging, production). While the AWS Management Console provides a simple interface for beginners, professional deployments require automation, version control, and infrastructure-as-code (IaC) practices to ensure reliability and repeatability.

This guide provides a comprehensive, step-by-step walkthrough of deploying AWS Lambda functions using industry-standard methods. Whether you're deploying a simple HTTP endpoint, a data processing pipeline, or an event-driven microservice, this tutorial covers everything from initial setup to advanced optimization techniques. By the end, you'll understand not only how to deploy Lambda functions, but how to do so securely, efficiently, and at scale.

Step-by-Step Guide

Prerequisites

Before deploying your first Lambda function, ensure you have the following:

  • An AWS account with appropriate permissions (preferably with an IAM user configured for programmatic access)
  • AWS CLI installed and configured on your local machine
  • Node.js, Python, or another supported runtime installed (depending on your function's language)
  • A code editor (e.g., VS Code, Sublime Text)
  • Basic understanding of JSON, YAML, and command-line interfaces

Verify your AWS CLI configuration by running:

aws configure

Enter your AWS Access Key ID, Secret Access Key, default region (e.g., us-east-1), and output format (json). This ensures the CLI can interact with your AWS environment.

Step 1: Write Your Lambda Function Code

Lambda functions are written in supported languages including Node.js, Python, Java, C#, Go, and Ruby. For this guide, we'll use Python 3.12, as it's widely adopted and easy to read.

Create a new directory for your project:

mkdir my-lambda-function

cd my-lambda-function

Create a file named lambda_function.py:

import json

def lambda_handler(event, context):
    # Log the incoming event
    print("Received event: " + str(event))

    # Return a simple response; API Gateway proxy integrations
    # require 'body' to be a string, so serialize it with json.dumps
    return {
        'statusCode': 200,
        'headers': {
            'Content-Type': 'application/json'
        },
        'body': json.dumps({
            'message': 'Hello from AWS Lambda!',
            'input': event
        })
    }

This function accepts an event object (e.g., from API Gateway, S3, or CloudWatch) and returns a structured HTTP-like response. The lambda_handler function is the entry point AWS looks for when invoking your code.
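Because the handler is a plain Python function, you can sanity-check it locally with a hand-built event before packaging. This sketch inlines equivalent handler logic and passes None for the unused context (real invocations receive AWS-defined event and context objects):

```python
import json

# Stand-in for the handler in lambda_function.py, invoked locally
# with a hand-built event instead of a real AWS trigger.
def lambda_handler(event, context):
    print("Received event: " + str(event))
    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        # API Gateway proxy integrations expect 'body' to be a string
        'body': json.dumps({'message': 'Hello from AWS Lambda!', 'input': event})
    }

result = lambda_handler({'key': 'value'}, None)
body = json.loads(result['body'])
```

Running this locally catches import errors and obvious logic bugs before you pay for a deployment round-trip.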

Step 2: Package Your Function

Lambda requires your code to be packaged as a ZIP file. If your function uses external libraries (e.g., requests, boto3), you must include them in the package.

Install dependencies locally:

pip install requests -t .

This installs the requests library into the current directory. Now, zip the entire directory contents so the installed packages ship alongside your handler:

zip -r function.zip .

If you're using additional files (e.g., configuration files, static assets), include them:

zip -r function.zip lambda_function.py requirements.txt config/

Ensure your ZIP file does not exceed 50 MB for direct uploads; the unzipped contents must stay under 250 MB. For larger deployments, use Amazon S3 as a staging location.
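The packaging step can also be scripted. A sketch, using only the standard library, that zips a source directory with handler files at the ZIP root (as Lambda requires) and enforces the 50 MB direct-upload limit; the directory and file names are illustrative:

```python
import os
import tempfile
import zipfile

def package(source_dir, zip_path, limit_bytes=50 * 1024 * 1024):
    """Zip source_dir into zip_path, keeping paths relative to source_dir."""
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                full = os.path.join(root, name)
                # Relative paths put lambda_function.py at the ZIP root,
                # which is where the runtime looks for the handler module
                zf.write(full, os.path.relpath(full, source_dir))
    size = os.path.getsize(zip_path)
    if size > limit_bytes:
        raise ValueError(f"ZIP is {size} bytes; exceeds the direct-upload limit")
    return size

# Demo against a throwaway directory
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, 'src')
    os.makedirs(src)
    with open(os.path.join(src, 'lambda_function.py'), 'w') as f:
        f.write("def lambda_handler(event, context):\n    return {'statusCode': 200}\n")
    zip_path = os.path.join(d, 'function.zip')
    size = package(src, zip_path)
    names = zipfile.ZipFile(zip_path).namelist()
```

The same script slots into a CI job in place of the manual zip command.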

Step 3: Create an IAM Execution Role

Lambda functions require an IAM role to interact with other AWS services. This role defines permissions via attached policies.

Use the AWS CLI to create a role with the minimum required permissions:

aws iam create-role --role-name lambda-execution-role --assume-role-policy-document '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}'

Attach the AWS-managed policy for basic Lambda execution:

aws iam attach-role-policy --role-name lambda-execution-role --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

For functions that need to access S3, DynamoDB, or other services, attach additional policies as needed:

aws iam attach-role-policy --role-name lambda-execution-role --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

Step 4: Deploy the Function Using AWS CLI

Use the create-function command to deploy your ZIP file:

aws lambda create-function \
  --function-name my-first-lambda \
  --runtime python3.12 \
  --role arn:aws:iam::YOUR_ACCOUNT_ID:role/lambda-execution-role \
  --handler lambda_function.lambda_handler \
  --zip-file fileb://function.zip \
  --description "A sample Lambda function for deployment tutorial" \
  --timeout 30 \
  --memory-size 128

Replace YOUR_ACCOUNT_ID with your actual AWS account ID. Key parameters:

  • --function-name: Unique name for your function
  • --runtime: The execution environment (e.g., python3.12, nodejs18.x)
  • --role: ARN of the IAM role created earlier
  • --handler: Format: filename.function_name
  • --zip-file: Path to your ZIP file (use fileb:// for binary)
  • --timeout: Maximum execution time in seconds (1–900)
  • --memory-size: Memory allocated (128–10,240 MB)

Upon success, AWS returns a JSON response containing the function's ARN, version, and configuration details.

Step 5: Test the Function

Test your deployed function using the AWS CLI:

aws lambda invoke \
  --function-name my-first-lambda \
  --cli-binary-format raw-in-base64-out \
  --payload '{"key": "value"}' \
  response.json

(The --cli-binary-format flag is required by AWS CLI v2 to pass a raw JSON payload.)

View the output:

cat response.json

You should see the JSON response you defined in your code. To view logs, use CloudWatch:

aws logs tail /aws/lambda/my-first-lambda --follow

Step 6: Set Up an API Gateway Trigger (Optional)

To expose your Lambda function via HTTP, connect it to Amazon API Gateway.

Create a REST API:

aws apigateway create-rest-api --name "My Lambda API" --description "API for my Lambda function"

Save the returned id (e.g., abc123).

Get the root resource ID:

aws apigateway get-resources --rest-api-id abc123

Create a POST method on the root resource:

aws apigateway put-method \
  --rest-api-id abc123 \
  --resource-id YOUR_ROOT_RESOURCE_ID \
  --http-method POST \
  --authorization-type NONE

Integrate the method with your Lambda function:

aws apigateway put-integration \
  --rest-api-id abc123 \
  --resource-id YOUR_ROOT_RESOURCE_ID \
  --http-method POST \
  --type AWS_PROXY \
  --integration-http-method POST \
  --uri arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/arn:aws:lambda:us-east-1:YOUR_ACCOUNT_ID:function:my-first-lambda/invocations

Deploy the API:

aws apigateway create-deployment \
  --rest-api-id abc123 \
  --stage-name prod

Before testing, grant API Gateway permission to invoke your function; without this, requests fail with an internal server error:

aws lambda add-permission \
  --function-name my-first-lambda \
  --statement-id apigateway-invoke \
  --action lambda:InvokeFunction \
  --principal apigateway.amazonaws.com \
  --source-arn "arn:aws:execute-api:us-east-1:YOUR_ACCOUNT_ID:abc123/*/POST/"

Your function is now accessible via a public URL:

https://abc123.execute-api.us-east-1.amazonaws.com/prod

Test it with curl:

curl -X POST https://abc123.execute-api.us-east-1.amazonaws.com/prod -d '{"key": "value"}'

Step 7: Use AWS SAM or CDK for Advanced Deployments

For production-grade deployments, avoid manual CLI commands. Use AWS Serverless Application Model (SAM) or AWS Cloud Development Kit (CDK) to define infrastructure as code.

Install AWS SAM CLI:

pip install aws-sam-cli

Create a template.yaml file:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  MyLambdaFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: lambda_function.lambda_handler
      Runtime: python3.12
      Events:
        Api:
          Type: Api
          Properties:
            Path: /hello
            Method: post
      MemorySize: 128
      Timeout: 30
      Environment:
        Variables:
          ENV: production

Build and deploy:

sam build

sam deploy --guided

SAM automates packaging, IAM role creation, and API Gateway setup. It also supports local testing with sam local invoke and sam local start-api.

Best Practices

1. Use Infrastructure as Code (IaC)

Manual deployments via the AWS console are error-prone and unrepeatable. Use IaC tools like AWS SAM, CDK, or Terraform to define your Lambda functions, triggers, and permissions in version-controlled code. This ensures consistency across environments and enables CI/CD pipelines.

2. Minimize Deployment Package Size

Large ZIP files increase deployment time and cold start latency. Only include necessary dependencies. Use tools like pip install --target to install only required packages. For Python, consider using pip-tools to lock dependencies and avoid bloating your package.

For Node.js, use serverless-bundle or webpack to tree-shake unused code. For Go, compile a single binary with no external dependencies.

3. Set Appropriate Memory and Timeout Values

Lambda allocates CPU power proportionally to memory. Increasing memory from 128 MB to 512 MB can reduce execution time by up to 50%. Use AWS Lambda Power Tuning to find the optimal memory configuration for cost and performance.

Set timeouts conservatively: no more than about 10% above your typical execution time. Note that 15 minutes is the hard upper limit; workloads that need longer are better suited to services like AWS Batch or ECS.

4. Implement Environment Variables for Configuration

Store sensitive or environment-specific values (e.g., API keys, database URLs) in Lambda environment variables, not in code. Use AWS Systems Manager Parameter Store or AWS Secrets Manager for sensitive data.

Example in template.yaml:

Environment:
  Variables:
    DATABASE_URL: !Ref DatabaseUrlParameter
    API_KEY: !Ref ApiKeySecret
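On the consuming side, a sketch of reading these variables in the function itself, failing fast on required values (DATABASE_URL and ENV are illustrative names from the example above, not Lambda built-ins):

```python
import os

def get_config():
    """Read configuration from environment variables set on the function."""
    database_url = os.environ.get('DATABASE_URL')
    if database_url is None:
        # Fail at startup rather than midway through a request
        raise RuntimeError('DATABASE_URL is not set')
    env_name = os.environ.get('ENV', 'development')  # optional, with a default
    return {'database_url': database_url, 'env': env_name}

# Simulate the variables Lambda injects from the function configuration
os.environ['DATABASE_URL'] = 'postgres://example-host/db'
os.environ.pop('ENV', None)
config = get_config()
```

Calling get_config() once at module scope keeps the lookup out of the per-request path.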

5. Enable Versioning and Aliases

Always publish versions after deployment. Use aliases (e.g., dev, prod) to point to specific versions. This allows safe rollbacks and blue-green deployments.

Deploy a new version:

aws lambda publish-version --function-name my-function

Update an alias:

aws lambda update-alias --function-name my-function --name prod --function-version 2

6. Monitor and Log Effectively

Enable CloudWatch Logs for every Lambda function. Use structured logging (JSON) to enable filtering and analysis:

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)  # Lambda pre-configures a root handler

def lambda_handler(event, context):
    logger.info(json.dumps({
        "event": event,
        "function_name": context.function_name,
        "request_id": context.aws_request_id
    }))
    return {"status": "success"}

Set up CloudWatch Alarms for errors, throttles, and duration spikes. Use AWS X-Ray for distributed tracing in complex serverless architectures.

7. Secure Your Functions

Apply the principle of least privilege to IAM roles. Avoid granting broad permissions like lambda:* or iam:*. Use custom policies that restrict access to specific resources.

Enable VPC access only when required (e.g., to reach RDS or ElastiCache). Functions inside a VPC have slower cold starts and require NAT gateways for outbound internet access.

Use AWS WAF and API Gateway authorizers (Cognito, Lambda Authorizers) to secure HTTP endpoints.

8. Handle Errors Gracefully

Never let uncaught exceptions crash your function. Wrap logic in try-catch blocks and return meaningful error responses:

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    try:
        # Business logic here
        result = process_data(event)
        return {
            'statusCode': 200,
            'body': json.dumps(result)
        }
    except Exception as e:
        logger.error(f"Error processing request: {str(e)}")
        return {
            'statusCode': 500,
            'body': json.dumps({'error': 'Internal server error'})
        }

Configure Dead Letter Queues (DLQs) for asynchronous invocations to capture failed events for retry or analysis.

9. Optimize for Cold Starts

Cold starts occur when Lambda initializes a new execution environment. Reduce them by:

  • Using smaller deployment packages
  • Choosing runtimes with faster startup (e.g., Python, Go over Java)
  • Enabling Provisioned Concurrency for critical functions
  • Avoiding heavy initialization in global scope (e.g., database connections)

Initialize connections outside the handler:

import boto3

# Initialize once, outside the handler
s3_client = boto3.client('s3')

def lambda_handler(event, context):
    # Reuse the client across warm invocations
    response = s3_client.list_buckets()
    return {"buckets": len(response['Buckets'])}

10. Implement CI/CD Pipelines

Integrate Lambda deployments into CI/CD pipelines using GitHub Actions, GitLab CI, or AWS CodePipeline. Automate testing, linting, packaging, and deployment on every push to main.

Example GitHub Actions workflow:

name: Deploy Lambda

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.12'
      - run: pip install -r requirements.txt -t .
      - run: zip -r function.zip .
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - run: aws s3 cp function.zip s3://my-lambda-deploy-bucket/function.zip
      - run: aws lambda update-function-code --function-name my-function --s3-bucket my-lambda-deploy-bucket --s3-key function.zip

Tools and Resources

AWS Serverless Application Model (SAM)

AWS SAM is an open-source framework for building serverless applications. It extends AWS CloudFormation with simplified syntax for defining Lambda functions, APIs, and event sources. SAM CLI allows local testing, making it ideal for development workflows.

AWS Cloud Development Kit (CDK)

CDK lets you define infrastructure using familiar programming languages (TypeScript, Python, Java, C#). It's ideal for teams already using object-oriented languages and wanting full control over resource definitions.

Serverless Framework

A popular third-party framework that supports multiple cloud providers. It abstracts away cloud-specific details and offers plugins for deployment, monitoring, and testing.

VS Code + AWS Toolkit

The AWS Toolkit for VS Code provides a graphical interface to deploy, debug, and monitor Lambda functions directly from your editor. It integrates with SAM, CloudWatch, and S3.

Thundra, Datadog, and New Relic

Third-party observability platforms offer enhanced monitoring, tracing, and alerting for serverless applications beyond CloudWatch.

Serverless Stack (SST)

A modern framework built on CDK that simplifies development with live reloading and local testing for Lambda, API Gateway, and DynamoDB.

GitHub Actions / AWS CodePipeline

Automate your deployment pipeline. Use GitHub Actions for open-source or public repos; use CodePipeline for enterprise AWS-native workflows.

Chalice (Python-only)

A microframework by AWS for building serverless applications in Python. It auto-generates API Gateway and Lambda configurations from simple decorators.

Layer Management

Use Lambda Layers to share code and dependencies across multiple functions. Create a layer for shared utilities, logging libraries, or SDKs. This reduces duplication and simplifies updates.
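For Python runtimes, a layer ZIP must place modules under a top-level python/ directory so the runtime adds them to sys.path. A minimal sketch that builds a toy layer (the module name and contents are illustrative):

```python
import os
import tempfile
import zipfile

def build_layer(zip_path, modules):
    """Build a Lambda layer ZIP. modules maps filename -> source code."""
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for filename, source in modules.items():
            # The 'python/' prefix is what Lambda's Python runtimes expect
            zf.writestr('python/' + filename, source)

tmp = tempfile.mkdtemp()
layer_zip = os.path.join(tmp, 'shared-utils-layer.zip')
build_layer(layer_zip, {'shared_logging.py': 'def log(msg):\n    print(msg)\n'})
layer_names = zipfile.ZipFile(layer_zip).namelist()
```

The resulting ZIP can be published with aws lambda publish-layer-version and attached to any function that needs the shared code.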

Amazon S3 for Large Artifacts

For packages larger than 50 MB, upload ZIP files to S3 and reference them during deployment:

aws lambda update-function-code \
  --function-name my-function \
  --s3-bucket my-deployment-bucket \
  --s3-key function-v2.zip

OpenTelemetry and AWS X-Ray

Use AWS X-Ray for end-to-end tracing of Lambda functions and downstream services. Integrate OpenTelemetry for cross-platform observability.

Real Examples

Example 1: Image Processing with S3 Trigger

Scenario: When a user uploads an image to an S3 bucket, resize it and store the thumbnail.

Code (resize_image.py):

import io

import boto3
# Pillow must be included in the deployment package or a layer
from PIL import Image

s3 = boto3.client('s3')

def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']

    # Download the original image
    response = s3.get_object(Bucket=bucket, Key=key)
    image_data = response['Body'].read()

    # Resize in memory
    image = Image.open(io.BytesIO(image_data))
    image.thumbnail((200, 200))

    # Upload the thumbnail
    buffer = io.BytesIO()
    image.save(buffer, 'JPEG')
    buffer.seek(0)
    thumbnail_key = 'thumbnails/' + key
    s3.put_object(
        Bucket=bucket,
        Key=thumbnail_key,
        Body=buffer,
        ContentType='image/jpeg'
    )

    return {'status': 'Thumbnail created', 'key': thumbnail_key}

Configure S3 event trigger in AWS Console or via SAM:

Events:
  S3Trigger:
    Type: S3
    Properties:
      Bucket: !Ref ImageBucket
      Events:
        - s3:ObjectCreated:*
      Filter:
        S3Key:
          Rules:
            - Name: suffix
              Value: .jpg

Example 2: Scheduled Data Cleanup

Scenario: Delete logs older than 30 days from DynamoDB every night.

Code (cleanup_logs.py):

import boto3
from datetime import datetime, timedelta

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('user_logs')

def lambda_handler(event, context):
    cutoff_date = (datetime.utcnow() - timedelta(days=30)).isoformat()

    # Scan and delete old items (use pagination for large datasets)
    response = table.scan()
    items_to_delete = [
        item for item in response['Items']
        if item['timestamp'] < cutoff_date
    ]

    for item in items_to_delete:
        table.delete_item(Key={'id': item['id']})

    return {'deleted_count': len(items_to_delete)}

Trigger via CloudWatch Events (EventBridge):

Events:
  Schedule:
    Type: Schedule
    Properties:
      Schedule: rate(24 hours)

Example 3: REST API for User Authentication

Scenario: Authenticate users via JWT tokens and return profile data.

Code (auth_handler.py):

import json
import os

import boto3
import jwt

def lambda_handler(event, context):
    token = event['headers'].get('Authorization', '').replace('Bearer ', '')
    try:
        payload = jwt.decode(token, os.environ['JWT_SECRET'], algorithms=['HS256'])
        user_id = payload['sub']

        # Fetch the user from DynamoDB
        dynamodb = boto3.resource('dynamodb')
        table = dynamodb.Table('users')
        response = table.get_item(Key={'id': user_id})

        if 'Item' not in response:
            return {'statusCode': 404, 'body': 'User not found'}

        # default=str handles DynamoDB's Decimal values during serialization
        return {
            'statusCode': 200,
            'body': json.dumps(response['Item'], default=str)
        }
    except jwt.ExpiredSignatureError:
        return {'statusCode': 401, 'body': 'Token expired'}
    except jwt.InvalidTokenError:
        return {'statusCode': 401, 'body': 'Invalid token'}

Deploy with API Gateway and attach a Lambda Authorizer to validate tokens before reaching the function.

FAQs

What is the maximum size for a Lambda deployment package?

Direct ZIP uploads via the console or CLI are limited to 50 MB (zipped). Deploying from Amazon S3 allows larger ZIP files, but the unzipped contents must still be under 250 MB. For larger dependencies, use Lambda Layers or container images.

Can I use Docker to deploy Lambda functions?

Yes. AWS Lambda supports container images as a deployment format. You can package your function as a Docker image (up to 10 GB) and push it to Amazon ECR. This is ideal for complex applications requiring custom runtimes or large libraries.

How do I handle secrets in Lambda functions?

Never hardcode secrets. Use AWS Secrets Manager or Systems Manager Parameter Store with encryption. Reference them via environment variables. Lambda automatically decrypts secrets at runtime if the IAM role has permission.

Why is my Lambda function timing out?

Timeouts occur when your code runs longer than the configured limit. Check for infinite loops, slow external calls (e.g., unresponsive APIs), or excessive data processing. Increase the timeout setting or optimize your logic. Use CloudWatch Logs to identify bottlenecks.
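One defensive pattern is to check the context object's get_remaining_time_in_millis() inside long loops and stop cleanly before the hard cutoff. A sketch with a stub context standing in for the one AWS passes (the doubling "work" and margin are illustrative):

```python
class FakeContext:
    """Test stand-in for the Lambda context object."""
    def __init__(self, remaining_ms):
        self._remaining_ms = remaining_ms

    def get_remaining_time_in_millis(self):
        return self._remaining_ms

def process_batch(items, context, safety_margin_ms=2000):
    processed = []
    for item in items:
        if context.get_remaining_time_in_millis() < safety_margin_ms:
            break  # stop early; unprocessed items can be re-queued
        processed.append(item * 2)  # placeholder for real work
    return processed

done = process_batch([1, 2, 3], FakeContext(remaining_ms=60000))
stopped = process_batch([1, 2, 3], FakeContext(remaining_ms=500))
```

Stopping early and re-queueing the remainder turns a hard timeout into a graceful partial result.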

How do I roll back a Lambda deployment?

Use versioning and aliases. Publish a new version after each deployment. If an issue arises, update the alias to point to a previous version. For example, change the prod alias from version 5 to version 4.

Can I run multiple functions in one deployment?

Yes. Use AWS SAM or CDK to define multiple functions in a single template. Each function is deployed independently but can share layers, environment variables, and infrastructure.

Do I need to restart Lambda after deployment?

No. AWS Lambda automatically handles updates. When you update the function code or configuration, AWS replaces the execution environment on the next invocation. Cold starts may occur, but no manual restart is required.

How does Lambda pricing work?

Lambda charges based on the number of requests and the duration of execution (billed in 1 ms increments). The first 1 million requests per month are free. After that, you pay per million requests and per GB-second of compute time. Memory allocation affects cost: higher memory means a higher price per millisecond.
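The pricing model can be turned into a quick estimate. The rates below are the commonly published x86 prices and may change; verify them on the AWS pricing page before relying on the numbers (free tier ignored here):

```python
# Assumed published rates; check the current AWS pricing page
REQUEST_RATE = 0.20 / 1_000_000   # USD per request
GB_SECOND_RATE = 0.0000166667     # USD per GB-second

def monthly_cost(requests, avg_duration_ms, memory_mb):
    """Rough monthly Lambda compute cost, ignoring the free tier."""
    gb_seconds = requests * (avg_duration_ms / 1000) * (memory_mb / 1024)
    return requests * REQUEST_RATE + gb_seconds * GB_SECOND_RATE

# Example: 10M requests/month, 120 ms average duration, 128 MB memory
cost = monthly_cost(10_000_000, 120, 128)  # ~ $4.50
```

Doubling the memory roughly doubles the GB-second term, which is why right-sizing memory matters for cost.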

Can I use Lambda with on-premises systems?

Lambda runs in AWS cloud. To interact with on-premises systems, use AWS Direct Connect or AWS Site-to-Site VPN to establish a secure connection. Alternatively, use AWS App Runner or EC2 as a proxy.

What happens if my Lambda function fails repeatedly?

For synchronous invocations, AWS returns an error to the caller. For asynchronous invocations, AWS retries twice. If all retries fail, the event can be sent to a Dead Letter Queue (DLQ) if configured. Use DLQs to capture and analyze failed events.

Conclusion

Deploying AWS Lambda functions is more than just uploading code; it's about building resilient, scalable, and maintainable serverless applications. From writing clean, minimal code to automating deployments with CI/CD pipelines, every step in this process contributes to the reliability and performance of your system.

By following the practices outlined in this guide (using Infrastructure as Code, minimizing package sizes, securing permissions, enabling monitoring, and leveraging versioning) you position yourself to deploy Lambda functions with confidence in production environments.

Serverless architecture is not a trend; it's the future of cloud-native development. As AWS continues to expand Lambda's capabilities, such as increased memory limits, faster cold starts, and native support for more runtimes, the importance of mastering deployment techniques grows.

Start small: deploy a single function with a simple HTTP trigger. Then expand to event-driven workflows, multi-function applications, and full CI/CD pipelines. The journey from manual CLI commands to automated, production-grade deployments is one of the most valuable skills you can develop in modern cloud engineering.

Now that you understand how to deploy Lambda functions, the next step is to scale them: securely, efficiently, and intelligently. Your serverless applications are ready to run.