Streamlining AWS deployments with Python & Ansible, part II

Unit testing your Ansible modules

As explained in Part I, the pressing question of “How do we deploy stuff in AWS?” can be answered by the combination of Python, Ansible, and AWS.

But what if we encounter a follow-up question of equal importance: “How do we test it without actually deploying stuff to AWS?” After all, nobody loves the complexity of managing cloud infrastructure, much less the cost implications.

In the case of Python, the de facto testing framework is PyTest. To install and execute all unit tests, we simply run the following commands:

    $ pipenv install pytest
    $ pipenv run pytest

Specifically, this will recursively discover and run all test files whose names begin with test_ (e.g. tests/test_foo.py).
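For instance, here is a minimal, hypothetical file that PyTest would pick up automatically (the file name, helper, and test function are invented for illustration — the only requirement is the test_ prefix on both):

```python
# tests/test_naming.py — PyTest collects files matching test_*.py
# and runs every function inside them matching test_*

def normalize_elb_name(name):
    # toy helper under test, for illustration only
    return name.strip().lower()

def test_normalize_elb_name():
    assert normalize_elb_name("  Test-ELB ") == "test-elb"
```

Running `pipenv run pytest` from the project root is enough — no registration or configuration required.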

This leads us into our next question: what kind of unit tests might be useful when testing the Python files that comprise your Ansible playbook?

Part II — Unit testing your Ansible modules

My mother told me never to mock, but I make an exception when it comes to the following three (3) strategies for unit testing Ansible modules written in Python:

  1. Mock out the module’s behavior (using unittest.mock)
  2. Mock out the AWS calls (using moto)
  3. Mock the entire stack (with localstack)
(Image: the relationship between mock, moto, and localstack.)

unittest.mock fakes everything. Moto fakes only the AWS client. Localstack fakes nothing except the AWS cloud itself.

Before we get started, do you remember the create_elb.py file from Part 1? Here it is again (with a few upgrades that I will explain below):

    #!/usr/bin/env python
    from ansible.module_utils.aws.core import AnsibleAWSModule, is_boto3_error_code
    from botocore.exceptions import BotoCoreError, ClientError
    from boto3 import Session


    class CreateELB:
        """Class used to create a shiny new ELB"""

        def __init__(self, module):
            self.module = module
            session = Session(region_name="us-east-1")
            self.elbv2 = session.client("elbv2")

        def create_elb(self, elb_name, subnet_id):
            try:
                results = self.elbv2.create_load_balancer(Name=elb_name, Subnets=[subnet_id])
                if "LoadBalancers" not in results:
                    self.module.fail_json(msg="Not created!", elb_name=elb_name)
                else:
                    self.module.exit_json(msg="Created!", elb_name=elb_name)

            # Don't declare failure if the ELB already exists!
            except is_boto3_error_code('DuplicateLoadBalancerNameException'):
                self.module.exit_json(msg="Already existed!", elb_name=elb_name)

            # Otherwise, fail if AWS returns an error of any kind
            except (BotoCoreError, ClientError) as e:
                self.module.fail_json_aws(e, msg=f"Error while creating ELB: {e}")

            # And don't forget to also handle non-AWS errors
            except Exception as e:
                self.module.fail_json(msg=f"Unexpected non-AWS error: {e}", changed=False)


    def main():
        module = AnsibleAWSModule(
            argument_spec=dict(
                region=dict(default='us-east-1', choices=['us-east-1', 'us-west-2']),
                elb_name=dict(required=True, type="str"),
                subnet_id=dict(required=False, type="str")
            )
        )
        creator = CreateELB(module)
        creator.create_elb(module.params["elb_name"], module.params['subnet_id'])


    if __name__ == "__main__":
        main()

This will be our source file for the following examples. OK, let’s dive in! 🤿

Option #1: Mocking The Module’s Behavior

Mocking begins with a simple install (only necessary on Python versions below 3.3 — from 3.3 onward, mock ships in the standard library as unittest.mock):

    pip install mock
  

Next, type python3 to kick off a Python session. Then, create a “fake” Ansible module (i.e. an empty Mock object):

    >>> from unittest.mock import Mock
    >>> module = Mock()
    >>> module
    <Mock id='...'>
    >>> exit()

Easy, right? Time to up the ante and mock the actual source code above.

A good place to start is the Ansible module’s output, which is always emitted via either the exit_json method (on success) or fail_json (on failure).

Using unittest.mock, we can pass our “fake” Ansible module from above into the CreateELB source code (which will obviously fail). Then, we can check that fail_json was called (verifying failure) with Mock’s built-in assert_called_once() method:

    class TestMocks(unittest.TestCase):
        def test_mock_module(self):
            self.module = Mock()
            creator = lib.CreateELB(self.module)
            creator.create_elb('test-elb', 'subnet-12345')
            # note the parentheses — without them, the assertion never runs!
            creator.module.fail_json.assert_called_once()

This is progress! 👏 But let’s get a deeper level of validation by mocking out the calls to AWS, which allows us to verify how our code is processing various responses from the cloud. In this case, we’ll just confirm the output to exit_json:

    def test_mock_aws_call(self):

        # pass a fake module to our function, just like before
        self.module = Mock()
        creator = lib.CreateELB(self.module)

        # overwrite the AWS connection with a fake one
        creator.elbv2 = Mock()
        creator.elbv2.create_load_balancer.return_value = {
            'LoadBalancers': [
                {'LoadBalancerName': 'test-elb'}
            ]
        }
        creator.create_elb('test-elb', 'subnet-12345')

        # verify everything runs as expected!
        assert "Created!" in creator.module.exit_json.call_args[1]['msg']
        assert "test-elb" in creator.module.exit_json.call_args[1]['elb_name']

Another easy-to-use mocking tool is botocore’s Stubber, which is designed specifically for stubbing Python calls to AWS. There is an abundance of examples, and they use the same approach as outlined above with unittest.mock.

Whether you use unittest.mock or Stubber, your test code can become cumbersome and unreadable when you’re trying to verify the success of a complex multi-part network dialogue with AWS. We need another option!

Option #2 — Mocking the Cloud Connection

Instead of using unittest.mock to mock specific AWS responses in your Ansible unit testing, you can simply use moto to mock AWS’s botocore client itself! This allows your test code to interact locally with the latest version of the botocore client as if it were representing actual cloud resources.

The beauty of moto is that all you need to do is import the library and annotate your tests, with the added option of “setting up” scenarios requiring existing cloud resources. A perfect example is our CreateELB code, which we could test like so:

    from unittest.mock import MagicMock

    from boto3 import Session
    from create_elb import CreateELB
    from moto import mock_ec2, mock_elbv2


    @mock_elbv2
    @mock_ec2
    def test_create_load_balancer(self):

        # mock the existence of a vpc and subnet
        session = Session(region_name="us-east-1")
        ec2 = session.client("ec2")
        vpc = ec2.create_vpc(CidrBlock="0.0.0.0/16")
        subnet = ec2.create_subnet(CidrBlock="0.0.0.0/24", VpcId=vpc["Vpc"]["VpcId"])

        # feed a fake Ansible module to our CreateELB class
        creator = CreateELB(MagicMock())

        # run our regular Ansible command, without any explicit mocking
        creator.create_elb('test-elb', subnet["Subnet"]["SubnetId"])

        assert 'Created!' in creator.module.exit_json.call_args[1]['msg']
        assert 'test-elb' in creator.module.exit_json.call_args[1]['elb_name']

By tagging our tests with @mock_elbv2, we essentially monkeypatch all boto commands with moto commands (see what they did there?), even when those commands originate inside our source code. This way, you (a) don’t need to maintain hundreds of lines of handcrafted, hardcoded return JSON, and (b) your tests are suddenly more robust because they’re now actually talking to botocore — including return payloads you didn’t anticipate.

But wait…that only covers 50% of the code! 😱

Good catch! Indeed, we have not yet interacted with the main() function, which is a significant part of our file. As a refresher, it looks like this:

    def main():
        module = AnsibleAWSModule(
            argument_spec=dict(
                elb_name=dict(required=True, type="str"),
                subnet_id=dict(required=False, type="str")
            )
        )
        creator = CreateELB(module)
        creator.create_elb(module.params["elb_name"], module.params['subnet_id'])


    if __name__ == "__main__":
        main()

As you may recall from Part 1, the purpose of the main() function is to expose logic to the shell environment where Ansible will be running your playbook. Alas, this presents two (2) distinct problems.

  1. The first problem is that we can’t pass any test configurations into the main() function, because it doesn’t accept arguments. This prevents us from mutating our module to accommodate different testing scenarios, which from a testability perspective is a non-starter.
  2. Even if we do figure out a way to customize our Ansible modules in test, the second problem is that PyTest is not designed to run files via shell command, which is what Ansible expects. Specifically, PyTest will raise a SystemExit error as Ansible attempts to write the exit_json or fail_json to STDOUT from within our Python testing session. That is, our test will be marked as a failure even if the STDOUT reads { msg: "Success!" }.
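That SystemExit behavior is easy to reproduce in isolation. Here’s a simplified, stdlib-only sketch — an assumption about the internals, not Ansible’s actual source — of what exit_json does and how a test can survive it:

```python
import contextlib
import io
import json
import sys

def exit_json(**result):
    # simplified stand-in for AnsibleModule.exit_json: dump the result
    # as JSON to STDOUT, then terminate the process — the exit is the
    # part that trips up PyTest
    print(json.dumps(result))
    sys.exit(0)

def run_and_capture(fn, **kwargs):
    # swallow the SystemExit and hand back whatever hit STDOUT
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        try:
            fn(**kwargs)
        except SystemExit:
            pass
    return buf.getvalue()

output = run_and_capture(exit_json, msg="Success!")
```

The capsys-based workaround below applies the same idea — catch the exit, then inspect the captured STDOUT.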

Do not be afraid! To work around these limitations, we simply need to add special setup and error handling in our test code. 🙌

Customizing our Ansible Module

To customize our Ansible module — this was Problem #1 — we need to inject our test configurations before the PyTest testing begins. Luckily, the unittest framework (whose TestCase classes PyTest runs happily) provides a special built-in method called setUp(), which we can use to create and customize a “pre-existing” Ansible module. An easy-to-use pattern is to define a set_module_args function that temporarily overwrites the default arguments that will get picked up during module creation.

Resolving issues with STDOUT and PyTest

The second issue — that STDOUT doesn’t play nice with PyTest — is solved with a PyTest built-in fixture called capsys, which we will wire onto our test class and use to capture the STDOUT written just before the SystemExit we will inevitably encounter.

Here’s a complete example (minus the imports):

    def set_module_args(args):
        args = json.dumps({"ANSIBLE_MODULE_ARGS": args})
        basic._ANSIBLE_ARGS = to_bytes(args)


    class TestMotoWithModule(unittest.TestCase):

        @fixture(autouse=True)
        def capsys(self, capsys):
            self.capsys = capsys

        @mock_elbv2
        @mock_ec2
        def setUp(self):

            # create vpc and subnet & mock client
            session = Session(region_name="us-east-1")
            ec2 = session.client("ec2")
            vpc = ec2.create_vpc(CidrBlock="0.0.0.0/16")
            subnet = ec2.create_subnet(CidrBlock="0.0.0.0/24", VpcId=vpc["Vpc"]["VpcId"])
            subnet_id = subnet["Subnet"]["SubnetId"]
            self.elbv2 = session.client("elbv2")

            set_module_args(
                {
                    "elb_name": "test-elb",
                    "subnet_id": f"{subnet_id}"
                }
            )

            self.module = AnsibleAWSModule(
                argument_spec=dict(
                    elb_name=dict(required=True, type="str"),
                    subnet_id=dict(required=True, type="str"),
                )
            )

        @mock_sts
        def test_main(self):
            try:
                lib.main()
            except SystemExit:
                stdout = self.capsys.readouterr()
                self.assertIn("Error while creating ELB", str(stdout))

Together, this will provide upwards of 90% test coverage, and perhaps a slightly better night’s sleep. ✅

Option #3 — Mocking an Actual Cloud

Option #3 comes into play when you have lots of cloud resources that need to be tested in unison.

For example, let’s assume you have boto3 code that sets up a CloudWatch cron that triggers a Lambda function, which publishes to an SNS topic that feeds an SQS queue. 😱

Let’s also assume you have fully vetted the notion of automated integration testing in a sandboxed AWS environment, and it has been found lacking in some significant way. How, then, might you approach this with unit testing?

Even using moto — this was Option #2 — it can be tricky to set up a half dozen client connections in memory for each testing scenario. To solve for this complexity, we can leverage localstack, which creates an entire “fake” AWS cloud for you to test in.

To get started with our ELB example, run pip install pytest-localstack and add the following PyTest file:

    from unittest.mock import Mock

    import create_elb
    import pytest_localstack

    localstack = pytest_localstack.patch_fixture(
        services=["ec2", "elbv2"],
        scope='module',
        autouse=True
    )


    # add boto3 code here to create setup resources
    def test_create_elb():
        creator = create_elb.CreateELB(Mock())
        creator.create_elb('test-elb', 'subnet-12345')

Behind the scenes, PyTest will execute localstack start, which, in turn, will spin up a fake AWS cloud in a Docker container and make its APIs available on http://localhost:4566.

Thanks to the pytest_localstack.patch_fixture, all the botocore requests coming from your Ansible modules to AWS will re-route to this Docker container, which will start isolated processes for each service you are mocking (e.g. EC2 & ELBv2), allowing those services to communicate with each other over HTTP (just like they would in the cloud).

The free version of this tool is still relatively young in OSS years, so I’m excited to see it evolve over time to meet more complex testing needs in AWS. For example, imagine a world where even cloud integration testing can be run locally! 🎊 If this gets you excited, check out LocalStack.

In closing…

You may still have pressing questions regarding testing your Ansible modules — for example, what about the problem of integration testing? — but hopefully these three (3) options will get you off the ground. ✈️

What’s next?

In the next post — Part 3 — we will wrap up this series with a summary of Ansible best practices, some tips for test-writing, and a few suggestions for keeping your code of the highest quality.


Ford Prior, Principal DevOps Engineer

Ford Prior is a Principal DevOps Engineer who works on delivery experience, inner-sourcing, and CICD pipelines. He’s passionate about productivity engineering, tech education, and the community of Richmond, where he lives with his partner and kids. Between Ansible deployments, he enjoys spending time trail running or in his homemade sauna.
