Distributed Systems Practice Notes

Cloud Web Apps - AWS Lambda Lab

October 14, 2018

AWS Lambda is a compute service that runs your code in response to events and automatically manages the compute resources for you, making it easy to build applications that respond quickly to new information. This lab creates a Lambda function that handles S3 image uploads by resizing them into thumbnails and storing the thumbnails in another S3 bucket.

Official Links

QwikLab: Intro to AWS Lambda

Operations

1. Create Two Amazon S3 Buckets as Input and Output Destinations

  • On the Services menu, select S3
  • Create a bucket named images-1234 as the source bucket for original uploads
  • Create another bucket named images-1234-resized as the output bucket for thumbnails
  • Upload HappyFace.jpg to the source bucket
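
The same setup can be done from the AWS CLI instead of the console (a sketch; S3 bucket names must be globally unique, so images-1234 stands in for whatever suffix your lab assigns):

```shell
# Create the source bucket for original uploads
aws s3 mb s3://images-1234

# Create the output bucket for thumbnails
aws s3 mb s3://images-1234-resized

# Upload the sample image to the source bucket
aws s3 cp HappyFace.jpg s3://images-1234/
```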

2. Create an AWS Lambda Function

  • On the Services menu, select Lambda

  • Create function and configure

    • Name: Create-Thumbnail
    • Runtime: Python 3.6
    • Existing role: lambda-execution-role

    This role grants the Lambda function permission to read and write images in S3

  • Finish the remaining configuration by providing the URL of the zipped Python script, which handles the upload event and creates the thumbnail in the output bucket
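
    The zipped script itself is provided by the lab, but its shape is roughly the following sketch. The `-resized` output-bucket suffix matches the bucket names above; the 128 px maximum dimension and the helper names are assumptions for illustration, not the lab's exact values. boto3 and Pillow are available in the Lambda runtime or the deployment package, so they are imported lazily inside the handler:

    ```python
    # Assumed conventions (not confirmed by the lab script): the output
    # bucket is the source bucket plus "-resized", and thumbnails are
    # capped at 128 px on the longest side.
    OUTPUT_SUFFIX = "-resized"
    MAX_DIMENSION = 128

    def thumbnail_size(width, height, max_dim=MAX_DIMENSION):
        """Scale (width, height) so the longest side equals max_dim,
        preserving aspect ratio. Smaller images are left unchanged."""
        longest = max(width, height)
        if longest <= max_dim:
            return (width, height)
        scale = max_dim / longest
        return (max(1, round(width * scale)), max(1, round(height * scale)))

    def handler(event, context):
        """Entry point for the S3 put trigger: read each uploaded image,
        resize it, and write the thumbnail to the output bucket."""
        import io
        import boto3              # available in the Lambda runtime
        from PIL import Image     # assumed bundled in the deployment zip

        s3 = boto3.client("s3")
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Download the original, resize in memory, upload the result
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            image = Image.open(io.BytesIO(body))
            image.thumbnail(thumbnail_size(*image.size))

            buffer = io.BytesIO()
            image.save(buffer, format=image.format or "JPEG")
            buffer.seek(0)
            s3.put_object(Bucket=bucket + OUTPUT_SUFFIX, Key=key, Body=buffer)
    ```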

3. Trigger Your Function by Uploads

  • Click Test button and configure

    • Event template: Amazon S3 put
    • Event name: Upload
  • Modify the template

    • replace example-bucket with images-1234
    • replace test.key with HappyFace.jpg
  • Save and run

  • If successful, the thumbnail image can be found in the output bucket
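
The two replacements above can be sketched in code: starting from a trimmed stand-in for the "Amazon S3 put" template (the full template carries more fields), swap in this lab's bucket and object names. `build_upload_event` is a hypothetical helper name:

```python
import json

# Trimmed stand-in for the console's "Amazon S3 put" event template;
# the placeholder values are the ones the lab tells you to replace.
TEMPLATE = {
    "Records": [
        {
            "eventSource": "aws:s3",
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "example-bucket"},
                "object": {"key": "test.key"},
            },
        }
    ]
}

def build_upload_event(bucket, key):
    """Return a copy of the template pointing at the real upload."""
    event = json.loads(json.dumps(TEMPLATE))  # cheap deep copy
    event["Records"][0]["s3"]["bucket"]["name"] = bucket
    event["Records"][0]["s3"]["object"]["key"] = key
    return event

event = build_upload_event("images-1234", "HappyFace.jpg")
print(json.dumps(event, indent=2))
```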

4. Monitoring and Logging

  • Monitoring tab displays graphs showing:

    • Invocations: The number of times the function has been invoked.
    • Duration: How long the function took to execute (in milliseconds).
    • Errors: How many times the function failed.
    • Throttles: Invocations rejected because too many functions were running simultaneously. The default limit is 1000 concurrent executions.
    • Iterator Age: Measures the age of the last record processed from streaming triggers (Amazon Kinesis and Amazon DynamoDB Streams).
    • Dead Letter Errors: Failures when sending messages to the Dead Letter Queue.
  • Amazon CloudWatch Logs stores detailed log messages for each invocation in log streams
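
To get a feel for when throttling kicks in, concurrency can be estimated with Little's law: concurrent executions ≈ invocation rate × average duration. A small sketch with made-up numbers (the helper names are illustrative, not an AWS API):

```python
DEFAULT_CONCURRENCY_LIMIT = 1000  # default limit noted above

def estimated_concurrency(invocations_per_second, avg_duration_ms):
    """Little's law: concurrent executions ~= arrival rate * duration."""
    return invocations_per_second * (avg_duration_ms / 1000.0)

def will_throttle(invocations_per_second, avg_duration_ms,
                  limit=DEFAULT_CONCURRENCY_LIMIT):
    """True if the estimated concurrency exceeds the account limit."""
    return estimated_concurrency(invocations_per_second, avg_duration_ms) > limit

# 500 uploads/s at 3 s average duration needs ~1500 concurrent
# executions, which exceeds the default limit of 1000.
print(will_throttle(500, 3000))   # True
print(will_throttle(500, 1000))   # False: only ~500 concurrent executions
```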

Warren

Written by Warren, who studies distributed systems at George Washington University. You might want to follow him on GitHub