AWS DynamoDB | 9 Min Read

Master DynamoDB Integration Testing with Vitest and Docker: A Step-by-Step Guide

Learn how to master DynamoDB integration testing using Vitest and Docker with this easy, step-by-step guide. Enhance your testing skills today!

Testing is important for every application, regardless of how small or large it may be. Tests allow us to ensure things are working as intended, find bugs sooner, create a higher-quality product, and, most importantly for this post, develop faster and more effectively.

Tests enable us to do all of these things efficiently because when we make changes, we’re able to quickly run the related test suite(s) and ensure everything is working as intended. This workflow is ideal for developing on AWS with the CDK because it lets us iterate quickly where we’d otherwise need to wait for CDK stack redeploys.

This is important because if you’ve ever worked on a large CDK project before, you’ll know redeploys can take up a lot of time depending on the size of your CDK stack. And, for every minute you’re waiting for a CDK stack deployment, you’re further out of your development flow and wasting time that could’ve been spent on improving your application.

So, in this tutorial, we’re going to look at how to implement integration testing in a TypeScript project using Vitest against a DynamoDB database running in a local Docker container, as well as write an example test for a CRUD action in a fictional books project!

Configuring DynamoDB in Docker

The first thing you’ll need for this project is Docker, so if you don’t already have Docker installed on your machine, you’ll need to install it. The easiest way to do this is to install Docker Desktop which automatically configures everything else for you.

Once you have Docker installed, we can begin working on our project. The first thing you’ll want to do is create a `docker-compose.yml` file in the root of your project and add the below code to it.

docker-compose.yml

```yaml
version: "3.7"

services:
  testing-db:
    container_name: dynamodb
    image: amazon/dynamodb-local:latest
    ports:
      - "8000:8000"
    command: "-jar DynamoDBLocal.jar -sharedDb -dbPath ."
```

If you’ve never written a `docker-compose.yml` file before, this may look a bit strange, but essentially what we’ve added to this file is a Docker container definition that uses Amazon’s DynamoDB Local Docker image. We’ve also configured our container to run on port `8000` of our machine and to run the defined command when the container starts, which starts the database.
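As an aside, DynamoDB Local also supports running purely in memory, which can be handy for tests since nothing needs to survive a container restart. A variant of the command using DynamoDB Local’s `-inMemory` flag (which can’t be combined with `-dbPath`) would look like this:

```yaml
# Alternative command: keep all data in memory instead of writing it to disk
command: "-jar DynamoDBLocal.jar -sharedDb -inMemory"
```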

After defining our Docker container, the final thing we need to do for our Docker setup is to add a command to our `package.json` that will allow us to start up our local DynamoDB container when we want to use it for our tests.

To add this command, add the below line inside your `scripts` property in your `package.json` file.

package.json

```json
"scripts": {
  "start:testing:db": "docker compose up testing-db"
},
```

With all of this configured, you can now run `npm run start:testing:db` in your terminal. After a few seconds, you should have a local DynamoDB database running in Docker that we can test against in the coming steps.

Configuring Vitest

With Docker now configured and our testing database ready to go, let’s turn our attention to configuring Vitest in our project and getting ready to write the tests themselves. The first thing you’ll want to do is install Vitest itself, which can be done by running `npm i -D vitest` in your terminal.

Once Vitest is installed in your project, we’ll want to quickly update our `package.json` to add the command we’ll need to run the tests in our project. To do this, add the command below to your `package.json` file under the one we added previously for starting the testing database.

package.json

```json
"scripts": {
  "start:testing:db": "docker compose up testing-db",
  "test": "vitest"
},
```
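One thing worth knowing: by default, `vitest` starts in watch mode and re-runs tests whenever files change. If you’d also like a one-shot run (for CI, for example), you could add a second script that uses Vitest’s `vitest run` command:

```json
"scripts": {
  "start:testing:db": "docker compose up testing-db",
  "test": "vitest",
  "test:run": "vitest run"
},
```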

Then, with our `package.json` file configured, we can create the configuration file for Vitest: make a new file in the root of your project called `vitest.config.ts` and add the code below to it.

vitest.config.ts

```ts
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    globalSetup: "./__tests__/global.setup.js",
  },
});
```

In this configuration file, we mostly use Vitest’s default configuration; the only thing we add is a custom `globalSetup` file, which we’re now going to create.

However, before we can create the `global.setup.js` file, we first need to install the AWS DynamoDB SDKs to allow us to interact with DynamoDB from our code. To install them, run `npm i @aws-sdk/client-dynamodb @aws-sdk/lib-dynamodb` in your terminal.
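If you’re wondering why we need two packages: `@aws-sdk/client-dynamodb` is the low-level client, which represents every value as a typed `AttributeValue` object like `{ S: "..." }`, while `@aws-sdk/lib-dynamodb` adds a document layer that converts those to and from plain JavaScript values for us. A rough sketch of the kind of conversion the document layer performs (my own simplified illustration, not the SDK’s actual implementation):

```typescript
// Simplified stand-in for the SDK's AttributeValue union
type AttributeValue = { S: string } | { N: string } | { BOOL: boolean };

// Convert a single typed attribute value to a plain JS value
function unmarshallValue(av: AttributeValue): string | number | boolean {
  if ("S" in av) return av.S;
  if ("N" in av) return Number(av.N); // DynamoDB stores numbers as strings
  return av.BOOL;
}

// Convert a whole low-level item into a plain object
function unmarshallItem(item: Record<string, AttributeValue>) {
  return Object.fromEntries(
    Object.entries(item).map(([key, value]) => [key, unmarshallValue(value)])
  );
}

const raw = { pk: { S: "USER#1" }, title: { S: "Example" }, pages: { N: "320" } };
console.log(unmarshallItem(raw)); // { pk: "USER#1", title: "Example", pages: 320 }
```

The document client does this (and the reverse marshalling on writes) automatically, which is why the test we’ll write later can pass plain objects to `PutCommand`.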

Global Setup

With the DynamoDB SDKs installed, we can turn our attention back to creating our `global.setup.js` file. To create it, make a new file at `./__tests__/global.setup.js` and add the below code to it.

./__tests__/global.setup.js

```js
import {
  CreateTableCommand,
  DeleteTableCommand,
  DescribeTableCommand,
  DynamoDB,
  ListTablesCommand,
} from "@aws-sdk/client-dynamodb";

// NOTE: Using the local endpoint for DynamoDB to connect to our Docker container.
// DynamoDB Local accepts any credentials, so the dummy values below save us from
// needing a real AWS profile configured on the machine.
const dynamodb = new DynamoDB({
  endpoint: "http://localhost:8000",
  region: "local",
  credentials: {
    accessKeyId: "local",
    secretAccessKey: "local",
  },
});
const TableName = "test-db";

// Small delay between polls so we don't hammer the local database
const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function pollTablesList(tableName) {
  // Fetch a list of the current tables in the database
  const tables = await dynamodb.send(new ListTablesCommand({}));

  // If the current tables still include our table name, keep checking
  if (tables.TableNames.includes(tableName)) {
    await wait(250);
    await pollTablesList(tableName);
  }
}

async function pollTable(status) {
  // Fetch the current table from the database
  const table = await dynamodb.send(
    new DescribeTableCommand({
      TableName,
    })
  );

  // If the current status is not the provided status ("ACTIVE") then check again
  if (table.Table.TableStatus !== status) {
    await wait(250);
    await pollTable(status);
  }
}

export async function setup() {
  console.log("Running Setup");
  try {
    // When setting up the tests, create a new table on our Docker database
    await dynamodb.send(
      new CreateTableCommand({
        TableName,
        AttributeDefinitions: [
          {
            AttributeName: "pk",
            AttributeType: "S",
          },
          {
            AttributeName: "sk",
            AttributeType: "S",
          },
        ],
        KeySchema: [
          {
            AttributeName: "pk",
            KeyType: "HASH",
          },
          {
            AttributeName: "sk",
            KeyType: "RANGE",
          },
        ],
        BillingMode: "PAY_PER_REQUEST",
      })
    );

    // Poll the table status to ensure it's ACTIVE before allowing tests to run
    await pollTable("ACTIVE");
  } catch (e) {
    console.log(e);
  }
}

export async function teardown() {
  console.log("Running Teardown");

  // Delete the table we created in the setup function
  await dynamodb.send(
    new DeleteTableCommand({
      TableName,
    })
  );

  // Poll the tables list to ensure the table has been deleted before exiting
  await pollTablesList(TableName);
}
```

A fair amount is happening in this file, so let’s dive into it and see what’s going on. From this setup file, we export two functions, `setup` and `teardown`, which Vitest runs when setting up the test environment at the beginning and tearing it down at the end.

Setup

Inside the `setup` function, at the start of the testing process, we create a new table in our DynamoDB database for us to test against in our tests. This is required because although our database is running in Docker, it contains no tables, so we need to create one before we can test against it.

When we create the table, we still need to define the partition and sort keys as you would when creating any DynamoDB table. For this project, we create the table with a partition key of `pk` and a sort key of `sk`, as is popular in single-table designs.
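To make composite key values like `"USER#3"` / `"BOOK#3"` less error-prone to assemble by hand, you could wrap them in small helper functions. These helpers are hypothetical (they aren’t part of the tutorial’s repository), but they show the idea:

```typescript
// Hypothetical helpers that build the composite pk/sk values used in
// this single-table design, e.g. { pk: "USER#3", sk: "BOOK#3" }
function bookKey(userId: string, bookId: string) {
  return { pk: `USER#${userId}`, sk: `BOOK#${bookId}` };
}

// Partition-only key, useful for querying all items belonging to a user
function userPartition(userId: string) {
  return { pk: `USER#${userId}` };
}

console.log(bookKey("3", "3")); // { pk: "USER#3", sk: "BOOK#3" }
```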

Finally, in our `setup` function, because `CreateTableCommand` returns as soon as the command has been sent (not when the table is ready), we run a function (`pollTable`) to repeatedly poll our table’s status until it reaches the status we want (`ACTIVE`). Once the table is in the `ACTIVE` status, we exit the function.
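The recursive polling here can also be generalized into a reusable helper with an explicit delay between attempts and a retry cap. This is a sketch of my own, not something from the AWS SDK:

```typescript
// Repeatedly run `check` until it resolves to true, waiting `intervalMs`
// between attempts; give up after `maxAttempts` tries
async function pollUntil(
  check: () => Promise<boolean>,
  intervalMs = 250,
  maxAttempts = 40
): Promise<void> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    if (await check()) return;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`pollUntil: condition not met after ${maxAttempts} attempts`);
}
```

With a helper like this, the setup step could become a single call, e.g. `await pollUntil(async () => (await dynamodb.send(new DescribeTableCommand({ TableName }))).Table?.TableStatus === "ACTIVE")`, and the cap ensures a misconfigured database fails the run instead of polling forever.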

Teardown

In our `teardown` function, we do something similar to the `setup` function, but instead of creating the database table, this time we remove it using the `DeleteTableCommand` SDK command.

Then, just as the `setup` function polls for the table’s status, the `teardown` function uses `pollTablesList` to poll until the table has been removed from the database. Once the table has been successfully removed, we exit the `teardown` function and finish the testing process.

Writing Our Tests

With all of the above setup now finished, we’re ready to turn our attention to writing the tests themselves. For this tutorial, we’re going to focus on a single test: the creation of a book in our fictional project. But if you’re interested in seeing the other CRUD action tests I’ve written for this example project, check out the entire repository on GitHub.

Finally, before we jump into our example test, I wanted to note that I’ll be using Zod in the test for parsing data that comes back from the database. So, if you would like to follow along line for line, you’ll need to install Zod using `npm i zod`.

To create our example test, add a new file at `./__tests__/books/create-book.test.ts` and add the below code to it.

./__tests__/books/create-book.test.ts

```ts
import { DynamoDB } from "@aws-sdk/client-dynamodb";
import { GetCommand, PutCommand } from "@aws-sdk/lib-dynamodb";
import { expect, describe, it } from "vitest";
import { z } from "zod";

const bookSchema = z.object({
  pk: z.string(),
  sk: z.string(),
  author: z.string(),
  title: z.string(),
});

// NOTE: Using the local endpoint for DynamoDB to connect to our Docker container.
// As in our setup file, DynamoDB Local accepts any credentials.
const dynamodb = new DynamoDB({
  endpoint: "http://localhost:8000",
  region: "local",
  credentials: {
    accessKeyId: "local",
    secretAccessKey: "local",
  },
});
const TableName = "test-db";

describe("create-book", () => {
  describe("SUCCESS", () => {
    it("creates a book", async () => {
      await dynamodb.send(
        new PutCommand({
          TableName,
          Item: {
            pk: "USER#3",
            sk: "BOOK#3",
            author: "Example Author",
            title: "Example Title",
          },
        })
      );

      const { Item: book } = await dynamodb.send(
        new GetCommand({
          TableName,
          Key: {
            pk: "USER#3",
            sk: "BOOK#3",
          },
        })
      );

      const parsedBook = bookSchema.parse(book);

      expect(parsedBook).toMatchObject({
        pk: "USER#3",
        sk: "BOOK#3",
        author: "Example Author",
        title: "Example Title",
      });
    });
  });
});
```

In this code, we define a couple of nested `describe` blocks from Vitest and then add an example `it` statement, which contains our test for creating a book in our DynamoDB database.

Inside the test itself, we first create a new book in our table using the `PutCommand` SDK command. Then, to check the data was stored correctly, we use the `GetCommand` SDK command to retrieve it before parsing it with Zod and asserting against it.
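If you’re curious what `bookSchema.parse` buys us over a plain type assertion: it validates the shape at runtime and throws if the retrieved item doesn’t match, rather than silently trusting the database. A rough sketch of that behaviour (my own simplified illustration, not Zod’s actual implementation):

```typescript
interface Book {
  pk: string;
  sk: string;
  author: string;
  title: string;
}

// Rough equivalent of bookSchema.parse: throw if any expected field is
// missing or not a string, otherwise return the value with a narrowed type
function parseBook(value: unknown): Book {
  const record = value as Record<string, unknown> | null;
  for (const field of ["pk", "sk", "author", "title"] as const) {
    if (typeof record?.[field] !== "string") {
      throw new Error(`Invalid book: expected string field "${field}"`);
    }
  }
  return record as unknown as Book;
}
```

This matters in integration tests because a `GetCommand` miss returns `Item: undefined`, and parsing catches that immediately with a clear error instead of a confusing assertion failure later.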

With our test now written, we can run it using the `test` command we added to our `package.json` file earlier on. So, in one terminal, ensure your testing database is still running, and then in another, run `npm run test`. After the test runs, you should see confirmation that it passed successfully!

Closing Thoughts

And, with our test now passing, we’ve finished our tutorial! In this tutorial, we’ve written a Vitest test against a local DynamoDB database running in a Docker container, which allows us to develop more efficiently and effectively without needing to wait for time-consuming CDK deployments to check whether our changes worked.

If you’re interested in seeing the full code for this tutorial, make sure to check out the GitHub repository for it here. And, if you’re interested in seeing my other AWS CDK tutorials, make sure to check out the GitHub repository for them here.

Finally, if you’re interested in learning about more things you can do with AWS and Docker, why not check out my tutorial showing how to migrate a Lambda function to an ECS/Fargate instance?

Thank you for reading.


