AWS SAM CLI

AWS Serverless Application Model (AWS SAM) is an open-source framework for building serverless applications. It provides a simple syntax to define your Lambda functions, APIs, databases, and event source mappings. With just a few lines of YAML, you can model your serverless infrastructure. During deployment, SAM transforms the template into AWS CloudFormation.
Benefits of using AWS SAM
- Single Deployment Configuration – You can manage all the resources your application needs in a single repository and deploy them together.
- Easy way to use AWS CloudFormation – While deploying to AWS, the SAM template is transformed into CloudFormation, so you get all of its capabilities out of the box.
- Local Debugging – It gives you a convenient way to run and debug your functions locally.
Installing the SAM CLI
1. Setting up AWS Credentials
First you need an AWS account. You can create a free tier account using this link: https://aws.amazon.com/console/
Once you have created an account, you need to set up programmatic access to it. AWS will give you an AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, which need to be configured on your computer.
Download the AWS CLI from the link below and install it on your machine:
https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html
If you already have the AWS CLI installed, run the aws configure
command and follow the prompts:
$ aws configure
AWS Access Key ID [None]: your_access_key_id
AWS Secret Access Key [None]: your_secret_access_key
Default region name [None]:
Default output format [None]:
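To confirm the credentials work, you can ask AWS which identity the CLI is currently using:
$ aws sts get-caller-identity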
2. Setting up SAM CLI
The SAM CLI provides command line tools for macOS, Linux, and Windows. You can download the installation files from this link:
Download SAM CLI
Optional: If you need the local debugging feature, or your image needs to be deployed to AWS ECR, you have to install Docker. You can download it from https://www.docker.com/products/docker-desktop
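Once the SAM CLI is installed, you can verify that it is available on your path:
sam --version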
3. Initialise new project
You can run the following command, which gives you an interactive workflow to generate a base template in the folder of your choice.
sam init
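If you prefer to skip the prompts, sam init also accepts flags directly. This is only a sketch: the project name sam-app is a placeholder, and the exact flags can vary between SAM CLI versions.
# project name and flags are illustrative; check sam init --help for your version
sam init --name sam-app --runtime python3.8 --app-template hello-world --no-interactive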
Use Case 1
Let's deploy a simple Hello World Lambda function to AWS. For this we are going to use the Python runtime.
1. Create base template.
sam init





Once you complete the above steps, you will have the hello-world application in your folder.
Note: The default YAML template includes an API Gateway event to call the Lambda directly from your browser. Since this is a very basic application, I have trimmed most of the generated code to keep it simple.
Now the YAML file looks like this
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  Hello World sample application

Globals:
  Function:
    Timeout: 3

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: app.lambda_handler
      Runtime: python3.8
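If you want to check the template for syntax errors before going further, the SAM CLI has a validate command (run it from the project folder):
sam validate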
In the app.py file, I have done some code clean-up and now it looks like this:
def lambda_handler(event, context):
    print("Hello World")
Now you can use the SAM CLI deploy command to deploy this application to AWS.
To start the deployment, run:
sam deploy --guided
First, it will check for a .toml config file (samconfig.toml). If no configuration file is found, the following configuration details will be asked for before the deployment starts.

After generating the .toml config file, the SAM CLI will show you a CloudFormation change set, which tells you which resources are going to be created or modified.

Once you confirm the change set, it will start creating the resources in the given AWS account.

You can see the same result in the CloudFormation console under the Stacks section.
At the time of writing this post, the SAM CLI does not provide a command to remove the deployed CloudFormation stack; we can expect such a command in future versions. So the only way to remove a deployed template is either the Delete action on the stack in the AWS CloudFormation console, or the AWS CLI:
aws cloudformation delete-stack --stack-name YOUR-STACK-NAME
Use Case 2
Let's build and deploy the following scenario: when a user uploads a file to the input-files folder in an S3 bucket, a Lambda will be triggered by the S3 event, read the file content, and send it to SQS for further processing.

As in the use case above, I have used the AWS quick start template and added functionality on top of it.
Now the YAML file looks like this
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  S3 Event handler to process files

Parameters:
  Environment:
    Type: String
    Default: dev
    AllowedValues:
      - dev
      - prod

Globals:
  Function:
    Timeout: 60

Resources:
  S3FileProcessingFifoQueue:
    Type: AWS::SQS::Queue
    Properties:
      QueueName: !Sub "S3-File-Processing-Queue-${Environment}.fifo"
      FifoQueue: True
      ContentBasedDeduplication: true

  S3FileProcessingBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "file-processing-use-case-2-${Environment}"
      Tags:
        - Key: name
          Value: fileUpload

  S3FileProcessingLambdaLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      ContentUri: python.zip
      CompatibleRuntimes:
        - python3.8

  S3FileProcessingEventHandler:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: !Sub "S3-File-Processing-Event-Handler-${Environment}"
      CodeUri: src/
      Handler: app.lambda_handler
      Runtime: python3.8
      Layers:
        - !Ref S3FileProcessingLambdaLayer
      Environment:
        Variables:
          QUEUE_URL: !Ref "S3FileProcessingFifoQueue"
      Events:
        S3PutObjectEvent:
          Type: S3
          Properties:
            Bucket: !Ref S3FileProcessingBucket
            Events: s3:ObjectCreated:*
            Filter:
              S3Key:
                Rules:
                  - Name: prefix
                    Value: input-files/
                  - Name: suffix
                    Value: .txt
      Policies:
        - S3FullAccessPolicy:
            BucketName: !Sub "file-processing-use-case-2-${Environment}"
        - SQSSendMessagePolicy:
            QueueName: !Sub "S3-File-Processing-Queue-${Environment}.fifo"

  LambdaInvokePermission:
    Type: 'AWS::Lambda::Permission'
    Properties:
      FunctionName: !GetAtt S3FileProcessingEventHandler.Arn
      Action: 'lambda:InvokeFunction'
      Principal: 's3.amazonaws.com'
      SourceAccount: !Sub ${AWS::AccountId}
      SourceArn: !GetAtt S3FileProcessingBucket.Arn
The app.py file looks like this:
import boto3
import os
import urllib.parse
import json

def lambda_handler(event, context):
    # Get the bucket name and object key from the S3 event record
    BUCKET_NAME = urllib.parse.unquote_plus(event['Records'][0]['s3']['bucket']['name'])
    FILE_NAME = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    QUEUE_URL = os.environ.get('QUEUE_URL')

    # Read the uploaded file's content from S3
    client = boto3.client('s3')
    result = client.get_object(Bucket=BUCKET_NAME, Key=FILE_NAME)
    data = result["Body"].read().decode('utf-8')  # decode bytes so the payload is JSON serializable

    # Send the file content to the FIFO queue for further processing
    queue = boto3.client('sqs')
    queue.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"data": data}),
        MessageGroupId='S3FileProcessginGroup',
    )
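To try the handler locally, you can generate a sample S3 event with the SAM CLI and invoke the function with it. This is only a sketch: the bucket and key values are placeholders, sam local invoke needs Docker, and the SQS call will still go to your real queue unless you stub it.
# bucket and key below are placeholders for your own test values
sam local generate-event s3 put --bucket file-processing-use-case-2-dev --key input-files/test.txt > event.json
sam local invoke S3FileProcessingEventHandler --event event.json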
If you look at the YAML file, the S3 bucket, the queue, and the Lambda are all defined in the Resources section. Once you deploy this template, those resources will be created for you.
The Policies tag is used to give the Lambda permission to access the S3 bucket and the queue.
The LambdaInvokePermission resource is used to give S3 permission to trigger the Lambda. Once you deploy this template, you can see the new Lambda in the AWS console as below.

The prefix and suffix rules can be used to filter the object key: the prefix is basically the folder name or a unique part of the key, and the suffix can be used to match the file extension.
The Parameters tag can be used to define different stages, e.g. dev, stage, prod. When you deploy, you can specify the stage as a parameter on the command line. For this template, the dev environment is set as the default.
sam build --config-env prod
sam deploy
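The --config-env option picks up the values for a named environment from samconfig.toml. If you have not set one up, you can also override the Environment parameter directly at deploy time:
sam build
sam deploy --parameter-overrides Environment=prod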
In this Lambda, I'm using a few other Python libraries such as boto3, so I packaged them into a separate Lambda layer and deployed it with this template.

Note: Read this article about how to create custom lambda layer https://www.kamprasad.com/create-python-lambda-layer-using-docker/
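The linked article covers the Docker-based approach. Purely as a rough sketch, a python.zip layer of this shape could also be assembled by hand, assuming your dependencies are listed in a requirements.txt (a placeholder name here); Lambda expects Python layer libraries under a top-level python/ directory:
# requirements.txt is a placeholder for your dependency list
mkdir -p python
pip install -r requirements.txt -t python/
zip -r python.zip python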
You can download this code from https://github.com/kamprasad2007/s3-event-handler-use-case-2
Lambda environment variables can be added under S3FileProcessingEventHandler -> Properties -> Environment -> Variables.
In Python, you can read environment variables using os.environ.get.