Content safety image moderation using Amazon S3 and AWS Lambda

Amazon API Gateway → Amazon S3 (presigned URL) → AWS Lambda → Amazon Rekognition → Amazon SNS

Securely upload images to Amazon S3 and detect inappropriate content with Amazon Rekognition

With this sample pattern, users can securely upload images to an Amazon S3 bucket by requesting a presigned URL through Amazon API Gateway. The presigned URL grants temporary, scoped access for uploading a file directly to S3.
When an image is uploaded, an S3 event notification invokes an AWS Lambda function that analyzes the image with the Amazon Rekognition DetectModerationLabels API. If the image is flagged as inappropriate, a notification is published to an Amazon SNS topic, providing automated content moderation and alerting.
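The presigned-URL step can be sketched as a Lambda function behind API Gateway that signs a PUT request for the client. This is an illustrative sketch, not the repo's exact code: the bucket name and the `file_name` request field are assumptions.

```python
# Sketch: Lambda handler behind API Gateway that returns a presigned PUT URL
# so the client can upload an image directly to S3. Bucket name and the
# "file_name" body field are hypothetical placeholders.
import json


def requested_key(event, default="upload.jpg"):
    """Pull the desired object key from the API Gateway request body."""
    body = json.loads(event.get("body") or "{}")
    return body.get("file_name", default)


def lambda_handler(event, context):
    import boto3  # AWS SDK; preinstalled in the Lambda Python runtime

    s3 = boto3.client("s3")
    url = s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": "my-upload-bucket", "Key": requested_key(event)},  # hypothetical bucket
        ExpiresIn=300,  # URL stays valid for 5 minutes
    )
    return {"statusCode": 200, "body": json.dumps({"upload_url": url})}
```

The short expiry is the usual trade-off here: the URL is only a temporary capability, so a leaked link goes stale quickly.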
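The moderation step could look like the following sketch: the S3 event hands the function a bucket and key, Rekognition's DetectModerationLabels scores the image, and any label above a confidence cutoff triggers an SNS publish. The 80% threshold and the topic ARN are assumptions, not values from the repo.

```python
# Sketch: Lambda handler invoked by an S3 event that runs Rekognition
# content moderation and alerts via SNS. Threshold and topic ARN are
# hypothetical.
import json

CONFIDENCE_THRESHOLD = 80  # assumed cutoff for flagging content


def flagged_labels(response, min_confidence=CONFIDENCE_THRESHOLD):
    """Return label names from a DetectModerationLabels response that
    meet the confidence threshold."""
    return [
        label["Name"]
        for label in response.get("ModerationLabels", [])
        if label["Confidence"] >= min_confidence
    ]


def lambda_handler(event, context):
    import boto3  # AWS SDK; preinstalled in the Lambda Python runtime

    rekognition = boto3.client("rekognition")
    sns = boto3.client("sns")

    # The S3 event notification carries the bucket and key of the new object.
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]

    response = rekognition.detect_moderation_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MinConfidence=CONFIDENCE_THRESHOLD,
    )
    labels = flagged_labels(response)
    if labels:
        sns.publish(
            TopicArn="arn:aws:sns:us-east-1:123456789012:moderation-alerts",  # hypothetical ARN
            Subject="Inappropriate image detected",
            Message=json.dumps({"object": f"s3://{bucket}/{key}", "labels": labels}),
        )
    return {"flagged": bool(labels), "labels": labels}
```

Passing `MinConfidence` to the API already filters server-side; the local `flagged_labels` check simply keeps the alerting logic explicit and testable.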


Download this pattern (.zip)

View this pattern on GitHub


Clone repo

git clone https://github.com/aws-samples/serverless-patterns
cd serverless-patterns/apigw-lambda-rekognition

Deploy

terraform init
terraform apply


Testing

See the GitHub repo for detailed testing instructions.

Cleanup

terraform destroy
terraform show

Created by:

Archana V

Solutions Architect at AWS
