Unit Testing with Mocha, a local instance of dynamoDB & Promises

Pete Barber from C#, C++, Windows & other ramblings

I'm writing the backend for my current iOS App in JavaScript using node.js and AWS Lambda, along with DynamoDB.

My AWS Lambda code is mostly AWS Lambda agnostic except for the initial handler methods, which makes it fairly testable outside of AWS. However, it depends on DynamoDB. It's quite easy to write Unit Tests that run against a live version of DynamoDB, but I wanted to run against a local instance, ideally an in-memory one, so that it would be quick (not that running against a real instance is that slow) and so that I could have a clean dB each time.

NOTES

  • As these tests are running against a dB it might be more accurate to call them Integration Tests implemented using a Unit Testing framework but I'll refer to them as Unit Tests (UTs).
  • This is running on macOS
  • I don't Unit Test the actual AWS Lambda function. Instead I export the underlying functions & objects that the AWS Lambda uses and Unit Test these.
  • I'm a JavaScript, Node & AWS n00b, so if you spot something wrong or bad please comment.
  • I don't like the callback pyramid so I use Promises where I can. I would use async and await but the latest version of Node that AWS Lambda supports doesn't support them :-(

In order to run this locally you'll need:

  • A local copy of DynamoDB (it's a Java program, so a JRE is needed to run it)
  • node.js & npm
  • The npm packages used by the tests: mocha, aws-sdk & sleep

My goal was to have a clean dB for each individual Unit Test. The simplest way to achieve this, I thought, was to create an in-memory instance of DynamoDB and destroy it after each Unit Test.


The first attempt used Node's child_process module to create an instance before each test, store the handle to the process in a local-ish variable and, following the tests, just kill it. The important point here is that the '-inMemory' option is used, meaning that when the dB instance is killed and another re-started everything is effectively wiped without having to do anything else.
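A minimal sketch of that first approach, assuming DynamoDBLocal.jar and its native library directory sit in the working directory (adjust the paths to wherever the local DynamoDB download lives):

'use strict';

const spawn = require('child_process').spawn;

let dynamoProcess; // handle to the local DynamoDB process

beforeEach(function() {
    // Start a fresh in-memory instance; killing it discards all tables & data.
    dynamoProcess = spawn('java', [
        '-Djava.library.path=./DynamoDBLocal_lib',
        '-jar', 'DynamoDBLocal.jar',
        '-inMemory'
    ]);
});

afterEach(function() {
    // Destroy the instance; the next test gets a completely clean dB.
    dynamoProcess.kill();
});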

The problem I had with this approach is that in addition to creating the dB each time I also needed to create a table. Whilst the documentation for local DynamoDB says that one of the differences between the AWS-hosted & local versions is that CreateTable completes immediately, it seems that although the function does indeed complete immediately, the table isn't immediately available. This meant the UT using the table often failed with:

1) awsLambdaToTest The querying an empty db for a user Returns{}:
     ResourceNotFoundException: Cannot do operations on a non-existent table

I'm going to jump ahead and describe the completed Unit Test file, explaining the various things I had to do in order to get it working.
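The full listing isn't reproduced here, but its overall shape is roughly as follows (each hook is covered in the sections below):

'use strict';

describe('awsLambdaToTest', function() {
    this.timeout(5000); // see the timeout section at the end

    before(function()     { /* spawn the in-memory DynamoDB instance */ });
    after(function()      { /* kill it */ });

    beforeEach(function() { /* create the table; return a Promise */ });
    afterEach(function()  { /* delete the table; return a Promise */ });

    describe('The querying an empty db for a user', function() {
        it('Returns {}', function() {
            // invoke the function under test and return its Promise
        });
    });
});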


before()/after() - creating/destroying DynamoDB


Rather than attempting to create & destroy the dB for each Unit Test I settled for creating it once per Unit Test file. This is handled in the before() & after() functions. Here the local instance of DynamoDB is spawned using the child_process module and a reference to the process retained; this is then used to kill it afterwards. The important point to note here is the use of the sleep package & function.

I found that when I had multiple test files, each with their own before() & after() functions that did the same as these, even though kill purported to have killed the process (I checked the killed flag) it seemed the process hadn't died immediately. This meant that the before() function in the next set of tests would successfully connect to the dying instance of DynamoDB; then later, when any operation was performed, it would just hang until Mocha timed out the Unit Test/before handler. I tried various ways to detect that the process was really dead but none worked, so I settled for a sleep.
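A sketch of what these hooks look like, assuming the sleep npm package (whose sleep() call blocks for the given number of seconds) and the same jar locations as in the earlier sketch:

'use strict';

const spawn = require('child_process').spawn;
const sleep = require('sleep');
const AWS = require('aws-sdk');

let dynamoProcess;
let dynamoDB;

before(function() {
    // One in-memory instance for the whole test file.
    dynamoProcess = spawn('java', [
        '-Djava.library.path=./DynamoDBLocal_lib',
        '-jar', 'DynamoDBLocal.jar',
        '-inMemory'
    ]);

    // Give any instance killed by a previous test file time to die, and this
    // one time to start listening, before connecting to it.
    sleep.sleep(1);

    dynamoDB = new AWS.DynamoDB({
        region: 'us-east-1',               // any value will do locally
        endpoint: 'http://localhost:8000', // DynamoDB Local's default port
        accessKeyId: 'fake',               // dummy credentials are accepted
        secretAccessKey: 'fake'            // by the local instance
    });
});

after(function() {
    dynamoProcess.kill();
});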

beforeEach()/afterEach() - creating/destroying the table

Where possible I use Promises. Mocha handles promises quite simply for both hooks (the before*/after* functions) and Unit Tests. The key is to make sure to return the final promise (or pass in the done parameter & call it - though I don't use this mechanism).

Looking at the beforeEach() function, createTable() is called, which returns a promise (from the AWS.Request type that aws-sdk's DynamoDB.createTable() returns). This promise is then chained to by the waitFor method, which polls the dB for the state of the table. The returned promise will not resolve until the table has been created and waitFor has completed.
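Roughly, with an illustrative table name and key schema (the real table definition will differ):

const TABLE_NAME = 'Users'; // illustrative

function createTable() {
    return dynamoDB.createTable({
        TableName: TABLE_NAME,
        AttributeDefinitions: [{ AttributeName: 'userId', AttributeType: 'S' }],
        KeySchema: [{ AttributeName: 'userId', KeyType: 'HASH' }],
        ProvisionedThroughput: { ReadCapacityUnits: 1, WriteCapacityUnits: 1 }
    }).promise();
}

beforeEach(function() {
    // The return is essential: it hands the Promise chain to Mocha.
    return createTable().then(function() {
        return dynamoDB.waitFor('tableExists', { TableName: TABLE_NAME }).promise();
    });
});

afterEach(function() {
    return dynamoDB.deleteTable({ TableName: TABLE_NAME }).promise();
});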

I am not convinced that waitFor is needed. According to the AWS vs local DynamoDB guide, for local instances tables are created immediately. I added this check as occasionally I was getting resource errors like the one earlier. However, I think the cause of that was that I forgot the return statement before the call to createTable(), meaning no Promise was returned to Mocha so it thought the beforeEach() function had completed. I have since removed the waitFor call in my real Unit Tests and they all seem to work.

Unit Tests

That's really it. The hard part wasn't writing the UTs but getting a local instance of DynamoDB running, with the table that the functions under test use in the correct state. Again, as the functions being tested usually return promises themselves, it is necessary to return a Promise. The assertion(s) are made synchronously in a then continuation chained to the Promise returned from the function being tested, and the Promise from the whole chain is returned.

If an assertion fails then, even though it's within a continuation, Mocha detects this (the chain is rejected) and the test fails. If the function under test throws then Mocha also catches this and the test fails.
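For example, a test along the lines of the failing one shown earlier might look like this (queryUser standing in for whichever exported function is under test):

const assert = require('assert');

describe('The querying an empty db for a user', function() {
    it('Returns {}', function() {
        // Return the whole chain so Mocha waits on it; a failing assertion
        // rejects the chain and the test fails.
        return queryUser('nosuchuser').then(function(result) {
            assert.deepEqual(result, {});
        });
    });
});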

And finally

timeout

There's also the this.timeout(5000); at the top of the tests for this file. By default Mocha has a 2 second timeout for all tests and hooks. With the 1 second sleep used when starting the dB it's possible for a hook to time out, causing everything else to fail. The 5 second timeout protects against this.
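i.e. at the top of the outer describe; note the use of a regular function rather than an arrow function, so that this refers to Mocha's context:

describe('awsLambdaToTest', function() {
    this.timeout(5000); // hooks & tests in this file may take up to 5 seconds
    // before()/after(), beforeEach()/afterEach() and the tests go here
});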

localhost

When created, the local instance uses the default port of 8000 and is accessible via the localhost hostname alias. This is used to access the dB when creating the reference to it in before(). A local aws.config is also constructed in each of the functions that access the dB in the actual code under test below.

The actual code being tested
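The real code isn't reproduced here, but the functions under test follow this general pattern (queryUser, the table and the key are illustrative):

'use strict';

const AWS = require('aws-sdk');

// Each function that touches the dB builds its own local config, so the tests
// (and the lambdas when run locally) hit the instance on localhost:8000.
// Credentials come from the normal AWS config; dummy values also work locally.
function newDocumentClient() {
    return new AWS.DynamoDB.DocumentClient({
        region: 'us-east-1',
        endpoint: 'http://localhost:8000'
    });
}

// Returns a Promise that resolves to the stored user, or {} when there is no
// matching item.
function queryUser(userId) {
    const params = { TableName: 'Users', Key: { userId: userId } };

    return newDocumentClient().get(params).promise().then(function(data) {
        return data.Item || {};
    });
}

// Exported so the Unit Tests can exercise the functions directly; the AWS
// Lambda handler uses the same exports.
module.exports = { queryUser: queryUser };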


Debugging AWS Lambda functions locally using VS Code and lambda-local

Pete Barber from C#, C++, Windows & other ramblings

I've just started using AWS Lambda with node.js. I was able to develop these locally using the lambda-local npm package; e.g. with node.js installed (via brew) and lambda-local installed (using npm), the following "hello, world" example can be run as follows:

hellolambda.js

'use strict';

console.log('Loading function');

exports.handler = (event, context, callback) => {
    console.log('Received event:', JSON.stringify(event, null, 2));
    console.log('value1 =', event.key1);
    console.log('value2 =', event.key2);
    console.log('value3 =', event.key3);
    callback(null, event.key1);  // Echo back the first key value
    //callback('Something went wrong');

};

defaultevent.js

module.exports =
{
    "key1": "hello",
    "key2": "lambda",
    "key3": "node"
};

/usr/local/bin/lambda-local -l hellolambda.js -e defaultevent.js

Loading function
info: Logs
info: ------
info: START RequestId: d683128b-ac14-93c3-b2c1-5541f3bb3fda
Received event: {
  "key1": "hello",
  "key2": "lambda",
  "key3": "node"
}
value1 = hello
value2 = lambda
value3 = node
info: END
info: Message
info: ------
info: hello
info: -----

info: lambda-local successfully complete.

Rather than use bash and vi (I'm running on macOS) I wanted to use some sort of IDE. VS Code seemed ideal as it's free and it also has built-in node.js debugging. Using it for editing is very simple: just open the folder containing the source, in this case ~/tmp/hellolambda.

However, switching to the debugging section and creating the default launch configuration, where VS Code will launch node with the specified file as the program, doesn't do much good.



This is because when running a lambda locally using lambda-local, the program that node needs to run is the lambda-local environment, which in turn launches the lambda function.

This can be simply configured by specifying the lambda-local script as the program (it's a node script) and then passing the lambda script and the event data as arguments using the args key (which isn't included when using the VS Code option to add a configuration). The original example above can be launched using the following configuration.
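The configuration itself isn't reproduced here; a .vscode/launch.json along these lines should work (the program path assumes lambda-local was installed globally via npm, as in the command line above):

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "node",
            "request": "launch",
            "name": "Debug hellolambda",
            "program": "/usr/local/bin/lambda-local",
            "cwd": "${workspaceRoot}",
            "args": [
                "-l", "${workspaceRoot}/hellolambda.js",
                "-e", "${workspaceRoot}/defaultevent.js"
            ]
        }
    ]
}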

In the output window at the bottom the results of executing the lambda are shown. Breakpoints can be set and hit.

It's important that each command line argument, i.e. the option and its value, is specified separately. Even though '-l' and its value are a pair, they are separate command line arguments (2 in total), whereas "-l ${workspaceRoot}/hellolambda.js" would be a single argument.

NOTE: The lambdas I'm writing also use AWS DynamoDB. Using a local instance of DynamoDB, along with installing the AWS SDK via npm, I've been able to successfully invoke local lambdas that use the local instance of the DB.