Learn how to build a production-grade serverless thumbnail generator using AWS Lambda, S3, and CloudWatch. Complete setup with advanced monitoring, security, error handling, and cost optimization.
Building a production-grade serverless thumbnail generator requires more than just basic Lambda and S3 integration. This comprehensive guide walks you through creating a robust, scalable, and secure image processing pipeline with advanced monitoring, error handling, and cost optimization strategies.
Our production-grade serverless thumbnail generator follows AWS Well-Architected Framework principles:
┌────────────────────────────────────────┐
│             Internet/Users             │
└────────────────────┬───────────────────┘
                     │
┌────────────────────▼───────────────────┐
│             CloudFront CDN             │
│        (Global Content Delivery)       │
└────────────────────┬───────────────────┘
                     │
┌────────────────────▼───────────────────┐
│            S3 Source Bucket            │
│             (Image Uploads)            │
└────────────────────┬───────────────────┘
                     │  (S3 Event Notification)
┌────────────────────▼───────────────────┐
│             Lambda Function            │
│         (Thumbnail Processing)         │
└────────────────────┬───────────────────┘
                     │
┌────────────────────▼───────────────────┐
│          S3 Destination Bucket         │
│         (Processed Thumbnails)         │
└────────────────────┬───────────────────┘
                     │
┌────────────────────▼───────────────────┐
│             CloudWatch Logs            │
│         (Monitoring & Alerting)        │
└────────────────────────────────────────┘
This architecture provides global content delivery through CloudFront, event-driven processing with Lambda, durable storage for uploads and thumbnails in S3, and centralized monitoring and alerting through CloudWatch.
# Create source bucket with advanced security
aws s3api create-bucket \
--bucket amodhbh-image-uploads-prod \
--region ap-south-1 \
--create-bucket-configuration LocationConstraint=ap-south-1
# Enable versioning for data protection
aws s3api put-bucket-versioning \
--bucket amodhbh-image-uploads-prod \
--versioning-configuration Status=Enabled
# Enable server-side encryption
aws s3api put-bucket-encryption \
--bucket amodhbh-image-uploads-prod \
--server-side-encryption-configuration '{
"Rules": [
{
"ApplyServerSideEncryptionByDefault": {
"SSEAlgorithm": "AES256"
}
}
]
}'
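Because the thumbnails are served through CloudFront rather than directly from S3, it also makes sense to block public access at the bucket level. This hardening step is not part of the original walkthrough; a minimal sketch:
# Block all public access on the source bucket
aws s3api put-public-access-block \
  --bucket amodhbh-image-uploads-prod \
  --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true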
# Configure lifecycle policy for cost optimization
aws s3api put-bucket-lifecycle-configuration \
--bucket amodhbh-image-uploads-prod \
--lifecycle-configuration '{
"Rules": [
{
"ID": "DeleteIncompleteMultipartUploads",
"Status": "Enabled",
"Filter": {},
"AbortIncompleteMultipartUpload": {
"DaysAfterInitiation": 7
}
},
{
"ID": "TransitionToIA",
"Status": "Enabled",
"Filter": {},
"Transitions": [
{
"Days": 30,
"StorageClass": "STANDARD_IA"
}
]
}
]
}'
# Create destination bucket
aws s3api create-bucket \
--bucket amodhbh-image-thumbnails-prod \
--region ap-south-1 \
--create-bucket-configuration LocationConstraint=ap-south-1
# Enable versioning
aws s3api put-bucket-versioning \
--bucket amodhbh-image-thumbnails-prod \
--versioning-configuration Status=Enabled
# Configure for CloudFront distribution
aws s3api put-bucket-policy \
--bucket amodhbh-image-thumbnails-prod \
--policy '{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "AllowCloudFrontServicePrincipal",
"Effect": "Allow",
"Principal": {
"Service": "cloudfront.amazonaws.com"
},
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::amodhbh-image-thumbnails-prod/*",
"Condition": {
"StringEquals": {
"AWS:SourceArn": "arn:aws:cloudfront::ACCOUNT_ID:distribution/DISTRIBUTION_ID"
}
}
}
]
}'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "logs:DescribeLogGroups",
        "logs:DescribeLogStreams"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:GetObjectVersion"],
      "Resource": "arn:aws:s3:::amodhbh-image-uploads-prod/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::amodhbh-image-thumbnails-prod/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::amodhbh-image-uploads-prod",
        "arn:aws:s3:::amodhbh-image-thumbnails-prod"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["sqs:SendMessage", "sqs:GetQueueAttributes"],
      "Resource": "arn:aws:sqs:ap-south-1:*:thumbnail-dlq"
    },
    {
      "Effect": "Allow",
      "Action": ["cloudwatch:PutMetricData", "cloudwatch:GetMetricStatistics"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": ["xray:PutTraceSegments", "xray:PutTelemetryRecords"],
      "Resource": "*"
    }
  ]
}
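The walkthrough does not show creating the execution role itself, so here is a minimal sketch that saves the policy document above as lambda-thumbnail-policy.json (the file name is illustrative), creates the LambdaThumbnailRole referenced later, and attaches the policy:
# Create the execution role with a Lambda trust policy
aws iam create-role \
  --role-name LambdaThumbnailRole \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
      }
    ]
  }'

# Create the permissions policy from the JSON document above
aws iam create-policy \
  --policy-name LambdaThumbnailPolicy \
  --policy-document file://lambda-thumbnail-policy.json

# Attach the policy to the role
aws iam attach-role-policy \
  --role-name LambdaThumbnailRole \
  --policy-arn arn:aws:iam::ACCOUNT_ID:policy/LambdaThumbnailPolicy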
# Create SQS Dead Letter Queue
aws sqs create-queue \
--queue-name thumbnail-dlq \
--region ap-south-1 \
--attributes '{
"MessageRetentionPeriod": "1209600",
"VisibilityTimeoutSeconds": "30"
}'
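The RedrivePolicy on the main queue below needs the DLQ's ARN. Rather than assembling it by hand, you can look it up; a quick sketch:
# Look up the DLQ URL and ARN for use in the RedrivePolicy
DLQ_URL=$(aws sqs get-queue-url \
  --queue-name thumbnail-dlq \
  --region ap-south-1 \
  --query QueueUrl --output text)
aws sqs get-queue-attributes \
  --queue-url "$DLQ_URL" \
  --attribute-names QueueArn \
  --region ap-south-1 \
  --query 'Attributes.QueueArn' --output text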
# Create main SQS queue with DLQ configuration
aws sqs create-queue \
--queue-name thumbnail-processing-queue \
--region ap-south-1 \
--attributes '{
"MessageRetentionPeriod": "1209600",
"VisibilityTimeoutSeconds": "30",
"RedrivePolicy": "{\"deadLetterTargetArn\":\"arn:aws:sqs:ap-south-1:ACCOUNT_ID:thumbnail-dlq\",\"maxReceiveCount\":3}"
}'
import boto3
import os
import json
import time
import logging
import traceback
from io import BytesIO
from typing import Dict, Any
from urllib.parse import unquote_plus

from PIL import Image

# Configure logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Initialize AWS clients
s3_client = boto3.client('s3')
sqs_client = boto3.client('sqs')
cloudwatch = boto3.client('cloudwatch')

# Environment variables
DESTINATION_BUCKET = os.environ['DESTINATION_BUCKET']
DLQ_URL = os.environ.get('DLQ_URL')
THUMBNAIL_SIZES = json.loads(os.environ.get('THUMBNAIL_SIZES', '["128x128", "256x256", "512x512"]'))
ENABLE_WEBP = os.environ.get('ENABLE_WEBP', 'true').lower() == 'true'


class ThumbnailProcessor:
    def __init__(self):
        self.processed_count = 0
        self.error_count = 0
        self.start_time = time.time()

    def process_image(self, source_bucket: str, source_key: str) -> Dict[str, Any]:
        """Process image and create multiple thumbnail sizes"""
        try:
            # Download original image
            response = s3_client.get_object(Bucket=source_bucket, Key=source_key)
            image_data = response['Body'].read()

            # Open image with PIL
            with Image.open(BytesIO(image_data)) as image:
                # Convert to RGB if necessary (for JPEG output)
                if image.mode in ('RGBA', 'LA', 'P'):
                    # Normalize palette and greyscale-alpha images to RGBA first
                    if image.mode in ('P', 'LA'):
                        image = image.convert('RGBA')
                    # Create white background for transparency
                    background = Image.new('RGB', image.size, (255, 255, 255))
                    background.paste(image, mask=image.split()[-1])
                    image = background
                elif image.mode != 'RGB':
                    image = image.convert('RGB')

                results = []

                # Create thumbnails for each size
                for size_str in THUMBNAIL_SIZES:
                    width, height = map(int, size_str.split('x'))
                    thumbnail = self.create_thumbnail(image, width, height)

                    # Generate thumbnail key
                    base_name = os.path.splitext(os.path.basename(source_key))[0]
                    thumbnail_key = f"thumbnails/{size_str}/{base_name}.jpg"

                    # Save as JPEG
                    self.save_thumbnail(thumbnail, thumbnail_key, 'JPEG')
                    results.append(f"s3://{DESTINATION_BUCKET}/{thumbnail_key}")

                    # Save as WebP if enabled
                    if ENABLE_WEBP:
                        webp_key = f"thumbnails/{size_str}/webp/{base_name}.webp"
                        self.save_thumbnail(thumbnail, webp_key, 'WEBP')
                        results.append(f"s3://{DESTINATION_BUCKET}/{webp_key}")

            self.processed_count += 1
            return {
                'status': 'success',
                'thumbnails': results,
                'processing_time': time.time() - self.start_time
            }

        except Exception as e:
            self.error_count += 1
            error_msg = f"Error processing {source_key}: {str(e)}"
            logger.error(error_msg)
            logger.error(traceback.format_exc())

            # Send to DLQ if configured
            if DLQ_URL:
                self.send_to_dlq(source_bucket, source_key, str(e))

            raise e

    def create_thumbnail(self, image: Image.Image, width: int, height: int) -> Image.Image:
        """Create thumbnail with smart cropping"""
        # Use thumbnail method for aspect ratio preservation
        thumbnail = image.copy()
        thumbnail.thumbnail((width, height), Image.Resampling.LANCZOS)

        # If thumbnail is smaller than target, center it
        if thumbnail.size[0] < width or thumbnail.size[1] < height:
            # Create canvas with target size
            canvas = Image.new('RGB', (width, height), (255, 255, 255))
            # Center the thumbnail
            x = (width - thumbnail.size[0]) // 2
            y = (height - thumbnail.size[1]) // 2
            canvas.paste(thumbnail, (x, y))
            return canvas

        return thumbnail

    def save_thumbnail(self, thumbnail: Image.Image, key: str, format: str):
        """Save thumbnail to S3 with appropriate settings"""
        buffer = BytesIO()

        if format == 'JPEG':
            thumbnail.save(buffer, format='JPEG', quality=85, optimize=True)
            content_type = 'image/jpeg'
        elif format == 'WEBP':
            thumbnail.save(buffer, format='WEBP', quality=80)
            content_type = 'image/webp'
        else:
            raise ValueError(f"Unsupported format: {format}")

        buffer.seek(0)

        s3_client.put_object(
            Bucket=DESTINATION_BUCKET,
            Key=key,
            Body=buffer,
            ContentType=content_type,
            CacheControl='max-age=31536000',  # 1 year cache
            Metadata={
                'processed-by': 'lambda-thumbnail-generator',
                'processing-time': str(time.time())
            }
        )

    def send_to_dlq(self, source_bucket: str, source_key: str, error: str):
        """Send failed processing details to Dead Letter Queue"""
        try:
            message = {
                'source_bucket': source_bucket,
                'source_key': source_key,
                'error': error,
                'timestamp': time.time(),
                'retry_count': 0
            }

            sqs_client.send_message(
                QueueUrl=DLQ_URL,
                MessageBody=json.dumps(message),
                MessageAttributes={
                    'ErrorType': {
                        'StringValue': 'ProcessingError',
                        'DataType': 'String'
                    }
                }
            )
            logger.info(f"Sent failed processing to DLQ: {source_key}")
        except Exception as e:
            logger.error(f"Failed to send to DLQ: {e}")

    def publish_metrics(self):
        """Publish custom metrics to CloudWatch"""
        try:
            cloudwatch.put_metric_data(
                Namespace='ThumbnailGenerator',
                MetricData=[
                    {
                        'MetricName': 'ImagesProcessed',
                        'Value': self.processed_count,
                        'Unit': 'Count'
                    },
                    {
                        'MetricName': 'ProcessingErrors',
                        'Value': self.error_count,
                        'Unit': 'Count'
                    },
                    {
                        'MetricName': 'ProcessingTime',
                        'Value': time.time() - self.start_time,
                        'Unit': 'Seconds'
                    }
                ]
            )
        except Exception as e:
            logger.error(f"Failed to publish metrics: {e}")


def lambda_handler(event, context):
    """Main Lambda handler with comprehensive error handling"""
    processor = ThumbnailProcessor()

    try:
        # Process S3 event
        for record in event['Records']:
            if record['eventSource'] == 'aws:s3':
                source_bucket = record['s3']['bucket']['name']
                # S3 event keys are URL-encoded (e.g. spaces arrive as '+')
                source_key = unquote_plus(record['s3']['object']['key'])

                logger.info(f"Processing: s3://{source_bucket}/{source_key}")
                result = processor.process_image(source_bucket, source_key)
                logger.info(f"Success: {result}")

        # Publish metrics
        processor.publish_metrics()

        return {
            'statusCode': 200,
            'body': json.dumps({
                'message': 'Processing completed successfully',
                'processed_count': processor.processed_count,
                'error_count': processor.error_count
            })
        }

    except Exception as e:
        logger.error(f"Lambda execution failed: {str(e)}")
        logger.error(traceback.format_exc())

        # Publish error metrics
        processor.publish_metrics()

        return {
            'statusCode': 500,
            'body': json.dumps({
                'error': str(e),
                'processed_count': processor.processed_count,
                'error_count': processor.error_count
            })
        }
# Create deployment package
mkdir -p lambda-package && cd lambda-package
# Create requirements.txt
cat > requirements.txt << 'EOF'
Pillow==10.1.0
boto3==1.34.0
botocore==1.34.0
EOF
# Install dependencies
# Pillow ships native binaries, so build the package on Linux (or pass pip's
# --platform manylinux2014_x86_64 --only-binary=:all: flags) so the wheels
# match the Lambda runtime
pip install -r requirements.txt -t .
# Create Lambda function code
cat > lambda_function.py << 'EOF'
# [Previous Python code here]
EOF
# Create deployment package
zip -r deployment-package.zip .
# Create Lambda function
aws lambda create-function \
--function-name thumbnail-generator-prod \
--runtime python3.11 \
--role arn:aws:iam::ACCOUNT_ID:role/LambdaThumbnailRole \
--handler lambda_function.lambda_handler \
--zip-file fileb://deployment-package.zip \
--timeout 300 \
--memory-size 1024 \
--environment Variables='{
"DESTINATION_BUCKET":"amodhbh-image-thumbnails-prod",
"DLQ_URL":"https://sqs.ap-south-1.amazonaws.com/ACCOUNT_ID/thumbnail-dlq",
"THUMBNAIL_SIZES":"[\"128x128\", \"256x256\", \"512x512\"]",
"ENABLE_WEBP":"true"
}' \
--dead-letter-config TargetArn=arn:aws:sqs:ap-south-1:ACCOUNT_ID:thumbnail-dlq \
--tracing-config Mode=Active \
--region ap-south-1
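Before S3 can invoke the function, the bucket must be granted permission to call it; without this, the notification configuration below is typically rejected. A minimal sketch (the statement id is arbitrary):
# Allow the source bucket to invoke the Lambda function
aws lambda add-permission \
  --function-name thumbnail-generator-prod \
  --statement-id s3-thumbnail-trigger \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::amodhbh-image-uploads-prod \
  --source-account ACCOUNT_ID \
  --region ap-south-1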
# Create S3 event notification configuration
cat > s3-notification-config.json << 'EOF'
{
"LambdaConfigurations": [
{
"Id": "thumbnail-generation-trigger",
"LambdaFunctionArn": "arn:aws:lambda:ap-south-1:ACCOUNT_ID:function:thumbnail-generator-prod",
"Events": ["s3:ObjectCreated:*"],
"Filter": {
"Key": {
"FilterRules": [
{
"Name": "suffix",
"Value": ".jpg"
}
]
}
}
}
]
}
EOF
# Apply notification configuration
aws s3api put-bucket-notification-configuration \
--bucket amodhbh-image-uploads-prod \
--notification-configuration file://s3-notification-config.json
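Note that the filter above only matches keys ending in .jpg; add parallel LambdaConfigurations entries for other extensions such as .png or .jpeg if you need them. To confirm the trigger is wired up, you can read the configuration back:
# Verify the event notification was applied
aws s3api get-bucket-notification-configuration \
  --bucket amodhbh-image-uploads-prod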
# Create CloudWatch Dashboard
aws cloudwatch put-dashboard \
--dashboard-name ThumbnailGeneratorDashboard \
--dashboard-body '{
"widgets": [
{
"type": "metric",
"x": 0,
"y": 0,
"width": 12,
"height": 6,
"properties": {
"metrics": [
["AWS/Lambda", "Invocations", "FunctionName", "thumbnail-generator-prod"],
["AWS/Lambda", "Errors", "FunctionName", "thumbnail-generator-prod"],
["AWS/Lambda", "Duration", "FunctionName", "thumbnail-generator-prod"]
],
"period": 300,
"stat": "Sum",
"region": "ap-south-1",
"title": "Lambda Performance"
}
},
{
"type": "metric",
"x": 12,
"y": 0,
"width": 12,
"height": 6,
"properties": {
"metrics": [
["ThumbnailGenerator", "ImagesProcessed"],
["ThumbnailGenerator", "ProcessingErrors"],
["ThumbnailGenerator", "ProcessingTime"]
],
"period": 300,
"stat": "Sum",
"region": "ap-south-1",
"title": "Custom Metrics"
}
}
]
}'
# Create error rate alarm
aws cloudwatch put-metric-alarm \
--alarm-name ThumbnailGenerator-HighErrorRate \
--alarm-description "High error rate in thumbnail generation" \
--metric-name Errors \
--namespace AWS/Lambda \
--statistic Sum \
--period 300 \
--threshold 5 \
--comparison-operator GreaterThanThreshold \
--evaluation-periods 2 \
--dimensions Name=FunctionName,Value=thumbnail-generator-prod \
--alarm-actions arn:aws:sns:ap-south-1:ACCOUNT_ID:thumbnail-alerts
# Create duration alarm
aws cloudwatch put-metric-alarm \
--alarm-name ThumbnailGenerator-HighDuration \
--alarm-description "High processing duration" \
--metric-name Duration \
--namespace AWS/Lambda \
--statistic Average \
--period 300 \
--threshold 60000 \
--comparison-operator GreaterThanThreshold \
--evaluation-periods 2 \
--dimensions Name=FunctionName,Value=thumbnail-generator-prod \
--alarm-actions arn:aws:sns:ap-south-1:ACCOUNT_ID:thumbnail-alerts
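Beyond the Lambda metrics, it is also worth alarming on the dead letter queue itself so failed images are noticed quickly; a sketch, assuming the same SNS topic:
# Alarm when any messages land in the dead letter queue
aws cloudwatch put-metric-alarm \
  --alarm-name ThumbnailGenerator-DLQMessages \
  --alarm-description "Messages present in thumbnail DLQ" \
  --metric-name ApproximateNumberOfMessagesVisible \
  --namespace AWS/SQS \
  --statistic Maximum \
  --period 300 \
  --threshold 0 \
  --comparison-operator GreaterThanThreshold \
  --evaluation-periods 1 \
  --dimensions Name=QueueName,Value=thumbnail-dlq \
  --alarm-actions arn:aws:sns:ap-south-1:ACCOUNT_ID:thumbnail-alerts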
# Create CloudFront distribution
aws cloudfront create-distribution \
--distribution-config '{
"CallerReference": "thumbnail-cdn-'$(date +%s)'",
"Comment": "Thumbnail CDN Distribution",
"DefaultCacheBehavior": {
"TargetOriginId": "S3-thumbnails",
"ViewerProtocolPolicy": "redirect-to-https",
"TrustedSigners": {
"Enabled": false,
"Quantity": 0
},
"ForwardedValues": {
"QueryString": false,
"Cookies": {
"Forward": "none"
}
},
"MinTTL": 0,
"DefaultTTL": 86400,
"MaxTTL": 31536000
},
"Origins": {
"Quantity": 1,
"Items": [
{
"Id": "S3-thumbnails",
"DomainName": "amodhbh-image-thumbnails-prod.s3.ap-south-1.amazonaws.com",
"S3OriginConfig": {
"OriginAccessIdentity": ""
}
}
]
},
"Enabled": true,
"PriceClass": "PriceClass_100"
}'
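The destination bucket policy created earlier grants access to the CloudFront service principal with a SourceArn condition, which is the Origin Access Control (OAC) pattern, while the distribution above uses an empty legacy OriginAccessIdentity. To match the policy, you would create an OAC and reference its Id on the S3 origin; a minimal sketch (the OAC name is illustrative):
# Create an Origin Access Control and note the Id returned
aws cloudfront create-origin-access-control \
  --origin-access-control-config '{
    "Name": "thumbnails-oac",
    "Description": "OAC for thumbnail bucket",
    "SigningProtocol": "sigv4",
    "SigningBehavior": "always",
    "OriginAccessControlOriginType": "s3"
  }'
# Then set "OriginAccessControlId" on the S3 origin in the distribution config
# and leave OriginAccessIdentity empty.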
Create .github/workflows/deploy-thumbnail-generator.yml:
name: Deploy Thumbnail Generator

on:
  push:
    branches: [main]
    paths: ["lambda/**"]
  pull_request:
    branches: [main]
    paths: ["lambda/**"]

env:
  AWS_REGION: ap-south-1
  FUNCTION_NAME: thumbnail-generator-prod

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.11"

      - name: Install dependencies
        run: |
          cd lambda
          pip install -r requirements.txt
          pip install pytest pytest-cov

      - name: Run tests
        run: |
          cd lambda
          pytest tests/ --cov=. --cov-report=xml

      - name: Upload coverage
        uses: codecov/codecov-action@v3
        with:
          file: ./lambda/coverage.xml

  security-scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: "fs"
          scan-ref: "."
          format: "sarif"
          output: "trivy-results.sarif"

      - name: Upload Trivy scan results
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: "trivy-results.sarif"

  deploy:
    needs: [test, security-scan]
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/main'
    steps:
      - uses: actions/checkout@v4

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Build deployment package
        run: |
          cd lambda
          pip install -r requirements.txt -t .
          zip -r ../deployment-package.zip .

      - name: Deploy Lambda function
        run: |
          aws lambda update-function-code \
            --function-name ${{ env.FUNCTION_NAME }} \
            --zip-file fileb://deployment-package.zip

      - name: Update function configuration
        run: |
          aws lambda update-function-configuration \
            --function-name ${{ env.FUNCTION_NAME }} \
            --environment Variables='{
              "DESTINATION_BUCKET":"amodhbh-image-thumbnails-prod",
              "DLQ_URL":"https://sqs.ap-south-1.amazonaws.com/${{ secrets.AWS_ACCOUNT_ID }}/thumbnail-dlq",
              "THUMBNAIL_SIZES":"[\"128x128\", \"256x256\", \"512x512\"]",
              "ENABLE_WEBP":"true"
            }'

      - name: Run integration tests
        run: |
          # Upload test image
          aws s3 cp test-image.jpg s3://amodhbh-image-uploads-prod/
          # Wait for processing
          sleep 30
          # Verify thumbnails were created
          aws s3 ls s3://amodhbh-image-thumbnails-prod/thumbnails/ --recursive
# Set up provisioned concurrency for consistent performance
# (it must target a published version or alias, not $LATEST;
# the "live" alias is created in the sketch that follows)
aws lambda put-provisioned-concurrency-config \
  --function-name thumbnail-generator-prod \
  --qualifier live \
  --provisioned-concurrent-executions 5
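Provisioned concurrency can only be attached to a published version or an alias. A sketch that publishes a version and points a "live" alias at it (the alias name and version number are illustrative); note that the S3 trigger configured earlier invokes the unqualified function ARN, so to benefit from provisioned concurrency the notification would need to target the alias ARN instead:
# Publish an immutable version of the function
aws lambda publish-version \
  --function-name thumbnail-generator-prod

# Create the "live" alias pointing at that version
aws lambda create-alias \
  --function-name thumbnail-generator-prod \
  --name live \
  --function-version 1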
# Configure lifecycle policies for cost optimization
aws s3api put-bucket-lifecycle-configuration \
--bucket amodhbh-image-thumbnails-prod \
--lifecycle-configuration '{
"Rules": [
{
"ID": "TransitionToIA",
"Status": "Enabled",
"Filter": {},
"Transitions": [
{
"Days": 30,
"StorageClass": "STANDARD_IA"
},
{
"Days": 90,
"StorageClass": "GLACIER"
}
]
},
{
"ID": "DeleteOldVersions",
"Status": "Enabled",
"Filter": {},
"NoncurrentVersionTransitions": [
{
"NoncurrentDays": 30,
"StorageClass": "STANDARD_IA"
}
],
"NoncurrentVersionExpiration": {
"NoncurrentDays": 90
}
}
]
}'
# Upload test image
aws s3 cp test-image.jpg s3://amodhbh-image-uploads-prod/
# Monitor CloudWatch logs
aws logs tail /aws/lambda/thumbnail-generator-prod --follow
# Verify thumbnails were created
aws s3 ls s3://amodhbh-image-thumbnails-prod/thumbnails/ --recursive
# Test with multiple images
for i in {1..10}; do
aws s3 cp test-image-$i.jpg s3://amodhbh-image-uploads-prod/
done
# Monitor metrics
aws cloudwatch get-metric-statistics \
--namespace AWS/Lambda \
--metric-name Duration \
--dimensions Name=FunctionName,Value=thumbnail-generator-prod \
--start-time $(date -u -d '1 hour ago' +%Y-%m-%dT%H:%M:%S) \
--end-time $(date -u +%Y-%m-%dT%H:%M:%S) \
--period 300 \
--statistics Average,Maximum
# Upload unsupported file type
aws s3 cp test-file.txt s3://amodhbh-image-uploads-prod/
# Check DLQ for failed messages
aws sqs receive-message \
--queue-url https://sqs.ap-south-1.amazonaws.com/ACCOUNT_ID/thumbnail-dlq
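You can also exercise the handler without uploading anything by invoking the function directly with a synthetic S3 event; a sketch (file names are illustrative):
# Build a minimal S3 event payload
cat > test-event.json << 'EOF'
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "s3": {
        "bucket": { "name": "amodhbh-image-uploads-prod" },
        "object": { "key": "test-image.jpg" }
      }
    }
  ]
}
EOF

# Invoke the function synchronously and inspect the response
aws lambda invoke \
  --function-name thumbnail-generator-prod \
  --payload file://test-event.json \
  --cli-binary-format raw-in-base64-out \
  response.json
cat response.json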
Key metric categories to track: Lambda performance, S3 operations, custom business metrics, and cost metrics.
# Create SNS topic for alerts
aws sns create-topic --name thumbnail-alerts
# Subscribe to email notifications
aws sns subscribe \
--topic-arn arn:aws:sns:ap-south-1:ACCOUNT_ID:thumbnail-alerts \
--protocol email \
--notification-endpoint your-email@example.com
# Create VPC endpoint for S3 (if using VPC)
aws ec2 create-vpc-endpoint \
--vpc-id vpc-xxxxxxxxx \
--service-name com.amazonaws.ap-south-1.s3 \
--route-table-ids rtb-xxxxxxxxx
Symptoms: the function times out before completing. Solutions: raise the function timeout (set to 300 seconds above), increase the memory allocation (which also increases CPU), and check whether unusually large source images are being uploaded.
Symptoms: out-of-memory errors. Solutions: increase --memory-size beyond the 1024 MB configured above, or reduce the number of thumbnail sizes generated per invocation.
Symptoms: AccessDenied errors when accessing S3. Solutions: confirm the function uses LambdaThumbnailRole and that the policy's bucket ARNs exactly match the source and destination bucket names.
Symptoms: unexpected AWS charges. Solutions: review the S3 lifecycle policies, set a retention period on the CloudWatch log group, and remove provisioned concurrency if it is not needed.
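For any of these, the function's CloudWatch logs are the first place to look; a quick sketch for pulling error lines from the last hour:
# Show recent error lines from the function's log group
aws logs filter-log-events \
  --log-group-name /aws/lambda/thumbnail-generator-prod \
  --filter-pattern "ERROR" \
  --start-time $(($(date +%s) - 3600))000 \
  --query 'events[].message' \
  --output text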
# Remove S3 trigger from Lambda
aws s3api put-bucket-notification-configuration \
--bucket amodhbh-image-uploads-prod \
--notification-configuration '{}'
aws lambda delete-function \
--function-name thumbnail-generator-prod \
--region ap-south-1
# Empty buckets
aws s3 rm s3://amodhbh-image-uploads-prod --recursive
aws s3 rm s3://amodhbh-image-thumbnails-prod --recursive
# Delete buckets
aws s3api delete-bucket --bucket amodhbh-image-uploads-prod
aws s3api delete-bucket --bucket amodhbh-image-thumbnails-prod
# Delete log groups
aws logs delete-log-group \
--log-group-name /aws/lambda/thumbnail-generator-prod
# Delete dashboards
aws cloudwatch delete-dashboards \
--dashboard-names ThumbnailGeneratorDashboard
# Delete alarms
aws cloudwatch delete-alarms \
--alarm-names ThumbnailGenerator-HighErrorRate ThumbnailGenerator-HighDuration
# Delete queues
aws sqs delete-queue --queue-url https://sqs.ap-south-1.amazonaws.com/ACCOUNT_ID/thumbnail-dlq
aws sqs delete-queue --queue-url https://sqs.ap-south-1.amazonaws.com/ACCOUNT_ID/thumbnail-processing-queue
# Detach policies and delete role
aws iam detach-role-policy \
--role-name LambdaThumbnailRole \
--policy-arn arn:aws:iam::ACCOUNT_ID:policy/LambdaThumbnailPolicy
aws iam delete-role --role-name LambdaThumbnailRole
aws iam delete-policy --policy-arn arn:aws:iam::ACCOUNT_ID:policy/LambdaThumbnailPolicy
For a moderate workload, the total estimated cost is roughly $10-25/month.
This production-grade serverless thumbnail generator provides automated, event-driven image processing with multi-size JPEG and WebP output, custom monitoring and alerting, dead-letter-queue error handling, CDN delivery, and lifecycle-based cost optimization.
The solution follows AWS Well-Architected Framework principles and is ready for production workloads. Regular monitoring, cost optimization, and security reviews ensure long-term success.
Note: Remember to replace ACCOUNT_ID with your actual AWS account ID and adjust resource names as needed. Always test in a development environment before deploying to production.