AWS Chalice + Terraform Part 3: Testing Your App

This is the third part of the Chalice + Terraform series. You can check out parts 1 and 2 here:

AWS Chalice + Terraform: A serverless codebase that makes sense
AWS Chalice + Terraform Part 2: Local development with LocalStack

This part is dedicated to testing our application, focusing on unit and integration testing. Let’s get started!

Writing unit tests

We will follow Chalice’s testing guide for the basics, then add additional tests on top of it.

First, let’s install pytest, our test runner of choice:

pip install pytest

You can also add it to a requirements-ci.txt file for CI tooling purposes:

requirements-ci.txt
-r requirements.txt
pytest

Then, let’s create our test folders and the files we will need:

mkdir -p tests/unit/
touch tests/unit/{__init__.py,test_app.py}

As you can see, we created a unit subfolder within tests, that way we can separate the integration tests that we will be writing later on.

The Chalice test client

Chalice has a built-in test client, located at chalice.test.Client, capable of calling the lambda functions created within the project in multiple ways:

Testing REST functions

The following test checks the status code and payload of a REST enabled lambda:

app.py
@app.route('/')
def index():
    return {'hello': 'world'}
tests/unit/test_app.py
from chalice.test import Client

from app import app


def test_index_endpoint():
    with Client(app) as client:
        response = client.http.get('/')
        assert response.status_code == 200
        assert response.json_body == {'hello': 'world'}

Testing a bare lambda function

In case you have a simple lambda without triggers, you can test it using client.lambda_.invoke:

app.py
@app.lambda_function()
def bar(event, context):
    return {'event': event}
tests/unit/test_app.py
def test_bar_function():
    with Client(app) as client:
        result = client.lambda_.invoke('bar', {'my': 'event'})
        assert result.payload == {'event': {'my': 'event'}}

Note: If you run into an error like the following:

FAILED tests/unit/test_app.py::test_index_function - TypeError: index() takes 0 positional arguments but 2 were given

It could mean that you are trying to test a function triggered by @app.route using client.lambda_.invoke, which is reserved for functions declared with @app.lambda_function(). Use client.http instead.

Testing SNS and SQS lambdas

Let’s test the SNS and SQS functions created in Part 1 of the series, like so:

tests/unit/test_app.py
def test_sns_handler():
    with Client(app) as client:
        response = client.lambda_.invoke(
            "handle_sns_message",
            client.events.generate_sns_event(subject="test", message="hello from sns")
        )
        assert response.payload == {'subject': 'test', 'message': 'hello from sns'}


def test_sqs_handler():
    with Client(app) as client:
        response = client.lambda_.invoke(
            "handle_sqs_message",
            client.events.generate_sqs_event(message_bodies=["hello from sqs"])
        )
        assert response.payload == {'message': 'hello from sqs'}

Testing a function under a certain stage

To use the configuration of a specific stage during tests, use:

from chalice.test import Client

from app import app


def test_foo_function():
    with Client(app, stage_name='production') as client:
        result = client.lambda_.invoke('foo')
        assert result.payload == {'value': 'bar'}

See https://aws.github.io/chalice/topics/testing.html#environment-variables for more details.
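Stage-specific settings live in .chalice/config.json. For example, a production stage might define its own environment variables like this (the variable names and values here are purely hypothetical):

```json
{
  "version": "2.0",
  "app_name": "chalice-tf",
  "stages": {
    "local": {
      "environment_variables": {
        "TABLE_NAME": "chalice-tf-local"
      }
    },
    "production": {
      "environment_variables": {
        "TABLE_NAME": "chalice-tf-production"
      }
    }
  }
}
```

Passing stage_name='production' to the test client makes os.environ reflect the production block during the test.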

See also

Check Chalice’s documentation on testing for more topics, like mocking boto3 and using pytest fixtures.

Writing integration tests with LocalStack

Now that we have unit tests in place, let’s go one step further: using LocalStack, we can trigger our functions like we would in a real environment, then check that the function actually ran. This way we can verify that our application is wired up correctly. If you need a refresher on how to set up your Chalice project to run against LocalStack, check Part 2 of the series.

Let’s start by creating a separate folder for our integration tests:

mkdir tests/integration
touch tests/integration/{__init__.py,test_app.py}

Testing strategy

AWS Lambda doesn’t provide a straightforward way of checking that a function has run, so we need to automate what you would normally do with the AWS Console: trigger the function, then check CloudWatch for execution logs. In this case, we will send a random UUID and then check that the same payload was logged in our output, but feel free to adjust the assertion for your use case.

boto3 clients for LocalStack

We will be manipulating AWS resources created locally in LocalStack, so we need to tell boto3 to hit our local endpoint instead of the real AWS servers. We also need to provide mock AWS access keys:

tests/integration/test_app.py
import boto3

LOCALSTACK_URL = "http://localhost:4566"

# Common kwargs for boto3 client init
boto3_kwargs = {
    "region_name": "us-east-1",
    "aws_access_key_id": "aws_access_key_id",
    "aws_secret_access_key": "aws_secret_access_key",
    "endpoint_url": LOCALSTACK_URL,
}

sns = boto3.client('sns', **boto3_kwargs)
sqs = boto3.client('sqs', **boto3_kwargs)
logs = boto3.client('logs', **boto3_kwargs)

We will be using the SNS and SQS clients to publish content to our topics and queues respectively, and the CloudWatch logs client to check the content of our log streams.
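If you want to point the tests at a different LocalStack endpoint (for instance in CI, where LocalStack may run under another hostname), a small helper can build these kwargs from an environment variable. This helper and the LOCALSTACK_URL variable it reads are our own addition, not part of the code above:

```python
import os


def localstack_boto3_kwargs(default_url="http://localhost:4566"):
    """Build common boto3 client kwargs for LocalStack.

    LocalStack ignores the credential values, but boto3 requires them
    to be present, so any placeholder string works.
    """
    return {
        "region_name": "us-east-1",
        "aws_access_key_id": "aws_access_key_id",
        "aws_secret_access_key": "aws_secret_access_key",
        # Allow overriding the endpoint without touching the test code
        "endpoint_url": os.environ.get("LOCALSTACK_URL", default_url),
    }
```

With this in place, `boto3.client('sns', **localstack_boto3_kwargs())` behaves exactly like the hard-coded version until you export LOCALSTACK_URL.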

Triggering our Lambdas within tests

We are all set to write our integration tests:

tests/integration/test_app.py
from time import sleep
from uuid import uuid4

import boto3

LOCALSTACK_URL = "http://localhost:4566"

# We can get these from the Terraform output
topic_arn = "arn:aws:sns:us-east-1:000000000000:chalice-tf-topic"
queue_url = "http://localhost:4566/000000000000/chalice-tf-queue"

# Common kwargs for boto3 client init
boto3_kwargs = {
    "region_name": "us-east-1",
    "aws_access_key_id": "aws_access_key_id",
    "aws_secret_access_key": "aws_secret_access_key",
    "endpoint_url": LOCALSTACK_URL,
}

sns = boto3.client('sns', **boto3_kwargs)
sqs = boto3.client('sqs', **boto3_kwargs)
logs = boto3.client('logs', **boto3_kwargs)


def get_log_events_from_log_group(log_group_name):
    """Return the events of the most recent log stream in the specified log group"""

    response = logs.describe_log_streams(
        logGroupName=log_group_name,
        orderBy='LastEventTime',
        descending=True,
    )

    log_stream_name = response["logStreams"][0]["logStreamName"]

    response = logs.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
    )

    return response


def test_sns_execution():
    _id = str(uuid4())  # Generate a unique ID to assert the execution of the function
    sns.publish(
        TargetArn=topic_arn,
        Message=_id,
    )
    sleep(3)  # Wait for LocalStack to execute the function and log to CloudWatch

    response = get_log_events_from_log_group("/aws/lambda/chalice-tf-local-handle_sns_message")
    log_message = '\n'.join(event["message"] for event in response["events"])  # Join all log events into one string

    assert _id in log_message  # Look for our previously generated unique payload in the execution logs


def test_sqs_execution():
    _id = str(uuid4())  # Generate a unique ID to assert the execution of the function
    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody=_id,
    )
    sleep(3)  # Wait for LocalStack to execute the function and log to CloudWatch

    response = get_log_events_from_log_group("/aws/lambda/chalice-tf-local-handle_sqs_message")
    log_message = '\n'.join(event["message"] for event in response["events"])  # Join all log events into one string

    assert _id in log_message  # Look for our previously generated unique payload in the execution logs

Note: These tests expect that LocalStack is up and running, and that the Terraform infrastructure has been applied.
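One caveat: the fixed sleep(3) can be flaky if LocalStack takes longer than expected to execute the function and flush the logs. A small polling helper makes the tests more robust; this is our own sketch, not part of the code above:

```python
import time


def wait_for(predicate, timeout=15.0, interval=0.5):
    """Poll `predicate` until it returns a truthy value or `timeout` elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = predicate()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError(f"condition not met within {timeout} seconds")
```

In the tests above, the sleep plus assertion could then become something like `wait_for(lambda: _id in fetch_logs())`, where fetch_logs is a small wrapper around get_log_events_from_log_group.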

What’s next

It’s time to get ready for a production release, with tips for bigger codebases, logging, tracing, secrets management, and more. To be published soon!