Testcontainers & LocalStack for Spring Boot functional tests

We all know the importance of unit and functional testing when developing new applications. In particular, functional testing helps us see the gaps between our code and the functionality we believe our application should have. One major drawback of functional tests is that they often depend on external services, whether databases, APIs, or AWS components. These dependencies incur additional overhead and costs that are unnecessary from a code testing point of view. As a result, I’ve focused within my development team on making our functional tests more lightweight and flexible. Through the power of Docker containerization and an open-source library called Testcontainers, I’ve been able to make our functional tests more portable and economical without sacrificing testing quality.

This guide aims to show a use case (with Java Spring Boot and Cucumber) that can be extended to most applications. We will mock a PostgreSQL database and an AWS S3 bucket for our tests, take a look at Testcontainers and another library called LocalStack, and see how Docker containerization in general can reduce the time and costs of writing functional tests for Java applications.

Prerequisites
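
To follow along, you’ll roughly need the following; exact dependency versions are up to you:

  • Docker installed and running wherever the tests will execute
  • A Java Spring Boot application built with Maven or Gradle
  • The Testcontainers library, including its PostgreSQL and LocalStack modules
  • Cucumber (or your functional testing framework of choice)
  • The AWS SDK for Java, since the application under test reads from S3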

Background

Testcontainers originally started as a way to programmatically create throwaway Docker containers. Because of the flexibility of Docker images, Testcontainers modules can now spin up containers with databases, web browsers, or mock AWS endpoints. This is what makes Testcontainers powerful for functional testing — it provides a flexible way of standing up dependencies before your code starts.

Testcontainers’ LocalStack module in particular allows you to remove AWS dependencies from functional tests. LocalStack provides a containerized way of mocking AWS components, eliminating any need to connect to your real AWS infrastructure during testing. As we’ll see in the next section, combining these two technologies allows you to write functional tests without connecting to external dependencies or changing the application code.

So why use Testcontainers? Here’s a rundown of how the library has improved my testing suite:

  • Improved portability - Mocking your dependencies with Testcontainers allows you to run your tests anywhere Docker is installed. It removes the need for username/password credentials, open ports to the external network, and the overall overhead associated with those resources.
  • Reduced reliance on the dev region - Because our tests are more portable, they can run locally and in DevOps pipelines without touching a development region. This allows for significant cost savings on AWS resources.
  • Greater knowledge of dependencies - While there is higher initial overhead in mocking your external dependencies, you gain a better understanding of their behavior as you build the mocks.
  • Known stateful setup - Using Testcontainers gives you the same consistent state before every run of your functional tests. This reduces the reliance on a proper test state in your real dependencies.

The scenario

Let’s say we have a Java Spring Boot application that processes information from a file in S3 and places it in a PostgreSQL database. All told, we have two components that we want to mock in functional testing: the S3 bucket and the PostgreSQL database. From here, the idea is the same no matter what testing framework you’re using, but we’re going to use Cucumber in this example:

  1. Set up the containers for the test.
  2. Run the application and its associated tests.
  3. Shut down the containers.

Database container setup

First we instantiate the database container and start it:

    PostgreSQLContainer dbContainer = new PostgreSQLContainer();
    dbContainer.start(); // starts the Docker container
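
(Depending on your Testcontainers version, the no-argument constructor may be deprecated in favor of pinning an explicit image tag; the image name below is only an example.)

    PostgreSQLContainer<?> dbContainer = new PostgreSQLContainer<>("postgres:13-alpine"); // pin a specific Postgres image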
  

We then connect to the database and run a startup script, using a connection provided by Testcontainers:

    Connection postgreSQLConnection = dbContainer.createConnection("");
    ScriptRunner scriptExecutor = new ScriptRunner(postgreSQLConnection);
    Resource startupScript = new ClassPathResource("startupScript.sql");
    Reader scriptReader = new BufferedReader(new FileReader(startupScript.getFile()));
    scriptExecutor.runScript(scriptReader); // runs the startup script against the container database
  

Finally, we change our Spring properties to point our JDBC datasource to the Testcontainers database:

    System.setProperty("jdbc.datasource.url", postgreSQLConnection.getMetaData().getURL());
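
If your configuration reads the standard Spring Boot datasource properties rather than a custom key like jdbc.datasource.url, the container can hand you everything directly. A minimal sketch, assuming the stock spring.datasource.* property names (adjust to whatever your application actually reads):

    System.setProperty("spring.datasource.url", dbContainer.getJdbcUrl());      // JDBC URL of the container
    System.setProperty("spring.datasource.username", dbContainer.getUsername()); // generated test username
    System.setProperty("spring.datasource.password", dbContainer.getPassword()); // generated test password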
  

And that’s it! We’ve set up a mock database for our code to seamlessly connect to. 
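
As an aside, recent versions of Testcontainers can also run the startup script for you through withInitScript, which takes a classpath-relative path, making the manual ScriptRunner steps above optional. A minimal sketch:

    PostgreSQLContainer<?> dbContainer = new PostgreSQLContainer<>("postgres:13-alpine");
    dbContainer.withInitScript("startupScript.sql"); // classpath resource, executed once when the container starts
    dbContainer.start();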

S3 container setup

Setting up the S3 container is as easy as setting up the database. Through the AWS SDK that we use in our application, we can seamlessly point our code at a mock S3 bucket. Start the LocalStack container while specifying S3 as the service that we’re using:

    LocalStackContainer s3Container = new LocalStackContainer().withServices(S3);
    s3Container.start();
  

Now we change our Spring properties to use our mock S3 container. Because our application uses the Java AWS SDK, this means changing the AWS access key, secret key, region, and most importantly the endpoint (without changing the endpoint, the S3 client will still try to connect to the real AWS):

    System.setProperty("aws.accessKeyId", s3Container.getDefaultCredentialsProvider().getCredentials().getAWSAccessKeyId());
    System.setProperty("aws.secretKey", s3Container.getDefaultCredentialsProvider().getCredentials().getAWSSecretKey());
    System.setProperty("s3.endpoint", s3Container.getEndpointConfiguration(S3).getServiceEndpoint());
    System.setProperty("s3.region", s3Container.getEndpointConfiguration(S3).getSigningRegion());
  

Finally, we can create an S3 client and a bucket with any files we want inside for our test:

    AmazonS3 s3 = AmazonS3ClientBuilder
        .standard()
        .withEndpointConfiguration(s3Container.getEndpointConfiguration(S3))
        .withCredentials(s3Container.getDefaultCredentialsProvider())
        .build();
    s3.createBucket("testbucket");
    s3.putObject("testbucket", "testfolder/test.txt", new File("src/main/resources/test.csv")); // putObject(bucket, key, file)
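
(Optionally, you can sanity-check the seeded object by reading it straight back from the mock bucket before the application runs; this step is purely for verification.)

    String seeded = s3.getObjectAsString("testbucket", "testfolder/test.txt"); // reads the object back from LocalStack
    System.out.println("Seeded S3 object length: " + seeded.length());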
  

In the application code we can also set test and non-test configurations based on the Spring profile. This way, our code will grab the mock AWS endpoint only when the test profile is on:

    @Bean
    @Profile("!test")
    public AmazonS3 amazonS3() {
        return AmazonS3ClientBuilder.standard()
            .withRegion(Regions.US_EAST_1)
            .withCredentials(new DefaultAWSCredentialsProviderChain())
            .build();
    }

    @Bean
    @Profile("test")
    public AmazonS3 amazonTestS3() {
        return AmazonS3ClientBuilder.standard()
            .withEndpointConfiguration(new EndpointConfiguration(endpoint, region)) // uses the Testcontainers endpoint
            .withCredentials(new DefaultAWSCredentialsProviderChain())
            .build();
    }
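
The endpoint and region fields above have to come from somewhere. Since we exported them as the s3.endpoint and s3.region system properties earlier, one option is to inject them into the configuration class; the field names and defaults here are just an illustration:

    @Value("${s3.endpoint:}")
    private String endpoint; // populated from the s3.endpoint property set by the test; empty outside tests

    @Value("${s3.region:us-east-1}")
    private String region;   // falls back to a real region when no test property is set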
  

With that, we’re all done with the setup before the test.

Cucumber test setup

Now we’re all ready to run our Cucumber test. I’m going to assume that you’re already familiar with Cucumber tests so we can focus on integrating them with Testcontainers. Simply define a first step to set up the containers as above and run your Spring Boot app:

    @Given("^that the application is running$")
    public void containerSetup() {
        postgreSQLContainerSetup();
        S3ContainerSetup();
        SpringApplication.run(TestApp.class, new String[0]);
    }

To finish testing, wait for the application to do its thing and run your asserts thereafter:

    @And("^wait for the spring boot app process to complete$")
    public void wait_for_1_minute() throws Throwable {
        TimeUnit.MINUTES.sleep(1);
    }

    @Then("^data is available in database$")
    public void test_results_available_in_database() {
        assertDataIsPresent();
        ...
    }
  

And voila! Your testing is done.
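
A quick note on the fixed sleep: a hard one-minute wait works, but it can make the suite slower than necessary, or flaky if processing runs long. One alternative is to poll the database through the Testcontainers connection until the data shows up or a timeout expires. This is just a sketch, assuming you keep the postgreSQLConnection from the setup step around as a field; the table name is made up for illustration:

    @And("^wait for the spring boot app process to complete$")
    public void wait_for_processing() throws Exception {
        long deadline = System.currentTimeMillis() + TimeUnit.MINUTES.toMillis(2); // overall timeout
        try (Statement stmt = postgreSQLConnection.createStatement()) {
            while (System.currentTimeMillis() < deadline) {
                ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM processed_records"); // hypothetical table
                rs.next();
                if (rs.getInt(1) > 0) {
                    return; // data has arrived; let the assertion steps run
                }
                TimeUnit.SECONDS.sleep(5); // wait a bit before polling again
            }
        }
        throw new AssertionError("Timed out waiting for the application to process the file");
    }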

While the Testcontainers reaper container (called Ryuk) should automatically destroy all of your containers when the tests conclude, it’s best practice to shut them down in a final Cucumber step as well:

    @When("^the test is complete$")
    public void containerTeardown() {
        postgreSQLContainer.stop();
        s3Container.stop();
    }
  

Conclusion

Testcontainers provides a portable way of doing functional testing without connecting to external services or changing application code. For my own team, this has removed our dependency on an AWS development environment while still being able to test our application code from beginning to end. It’s ensured that we have a consistent state in our dependencies before testing even begins. It’s allowed us to gain a greater working knowledge of the dependencies through the mocks that we build. Finally, bugs have become faster to find and easier to fix within the code, as Testcontainers tests can be run from anywhere that Docker is installed.

This is just a first simple pass at working with Testcontainers and Spring Boot, as there is so much more that you can mock. Lambdas, web browsers, Elasticsearch, and Kafka can all be created and torn down within your application testing. Hopefully this tutorial can help you get started!


Brian Levine, Software Engineer

Software engineer at Capital One. Passionate about finding ways to make software more elegant, efficient, and secure for customers and engineers alike.
