Supercharging Python-based Lambda function development

With its simple syntax and rich ecosystem of libraries, Python is both easy to learn and powerful. Not coincidentally, it's one of the most popular languages for developing serverless Lambda functions. That said, simple syntax doesn't necessarily translate into effective functions. Managing the scale and complexity of your Lambda functions, not to mention testing and deployment, is a challenge. In this article, we'll explore tips and tools that can help you supercharge your Lambda function development. The practical advice and best practices below will boost your productivity and the quality of your Lambda functions.

Reuse common resources during function initialization

To optimize performance and reduce cold start times in your AWS Lambda functions, it's smart to reuse common resources across invocations. For example, if your function interacts with DynamoDB, RDS, SQS, or some other resource, declare the connection to those resources once, on cold start.

While you're only billed for invocations and their runtime, the execution environment (a microVM) that runs your function is reused until Lambda scales out, scales in, or replaces it. Reusing resources created at initialization significantly reduces response time per invocation, especially at higher traffic levels.

Declaring the SDK resource or client in the global scope creates the connection during a cold start. After that, the resource is available for reuse on every subsequent invocation. Because the connection persists between invocations, the SDK setup cost is removed from each call. This decreases Lambda runtime and response time, similar to an IoC-style injection, minus the overhead.

This is easier to do than it sounds. You simply move the client connection call (boto3 is a good example) outside the Lambda handler and use it inside the function instead. Then you pass the resource to any controllers used to organize your code. As a bonus, the resource is easier to mock in tests, because you can inject a stub through a pytest fixture or the standard library's mocking tools.

import boto3

from controllers import queue_request

# Declared in the global scope, so the client is created once per cold start
sqs_client = boto3.client("sqs")

def lambda_function(event, context):
    # The already-initialized client is reused on every invocation
    result = queue_request(sqs_client, event)
    # response logic
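Because the client is passed into the controller rather than created inside it, tests can substitute a fake with no AWS access at all. Here's a minimal sketch using the standard library's unittest.mock; the controller body and queue URL are illustrative, not from a real project:

```python
from unittest.mock import MagicMock


def queue_request(sqs_client, event):
    # Illustrative controller: forward the event body to an SQS queue
    return sqs_client.send_message(
        QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/demo-queue",
        MessageBody=str(event),
    )


# In a test, call the controller with a mock instead of a real boto3 client
mock_sqs = MagicMock()
mock_sqs.send_message.return_value = {"MessageId": "test-id"}

result = queue_request(mock_sqs, {"order": 42})
```

The controller never knows the difference, so the same code path runs in production with a real boto3 client and in tests with a MagicMock.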

Use functools to write transparent pass-through function wrappers

Transparent pass-through function wrappers are incredibly useful for developing Lambda functions because they allow you to add functionality without modifying the original function’s code. This lets you handle logging or error handling without changing the way a function is called by other parts of your application. Moreover, wrappers keep your Lambda functions modular and reusable, which is exactly what you want from them. 

In Python, you can use the functools module to write transparent pass-through function wrappers. These wrappers can intercept function calls, modify function behavior, and return the original or modified results. 

Here’s an example:

import functools


def pass_through_wrapper(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper


@pass_through_wrapper
def my_func(x):
    """My original function."""
    return x
  

In this example, the pass_through_wrapper is a decorator that takes a function as its input and returns a new function as the wrapper. The wrapper then calls the original function. Meanwhile, the functools.wraps decorator preserves the metadata of the original function. You can use the function just like any other function, and its name, attributes, and docstring are preserved. 
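The pattern pays off once the wrapper actually does something before or after the call. As one possible extension of the example above, here's a sketch of a timing-and-logging decorator; the function names are illustrative:

```python
import functools
import time


def log_invocation(func):
    """Log each call's duration without changing the function's interface."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{func.__name__} completed in {elapsed_ms:.2f} ms")
        return result
    return wrapper


@log_invocation
def handler(event, context):
    """Process an event."""
    return {"statusCode": 200, "body": event}


response = handler({"id": 1}, None)
```

Callers still invoke handler exactly as before, and because of functools.wraps, its name and docstring survive the wrapping.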

Working with AWS Lambda Powertools for function development

Maintained by AWS under the MIT-0 open-source license, the aws-lambda-powertools package is a framework that helps you get more done with less code while focusing on business value. The package implements best practices for Lambda function development in idiomatic Python with native AWS integration.

Out of the box, the Lambda Powertools package supports:

  • Event source data classes. Typed representations of common event payloads that let you test your Lambda functions locally with the AWS SAM CLI and an AWS API Gateway event, then deploy them to any gateway in the cloud environment

  • Middleware. Reusable, packageable code for cross-cutting concerns, such as extracting common data from a JSON Web Token

  • Batch processing. Processing records from SQS queues or DynamoDB Streams, with intuitive support for partial successes

  • Static typing. Type annotations that increase coding velocity in your IDE

  • Data quality. Validation of incoming events and responses; supports the fastjsonschema and Pydantic libraries for parsing and validation, leveraging familiar standards from the Python community

  • Feature flags. Simple evaluation of one or more features within the scope of a function, enabling seamless decoupled deployments and rollbacks

Powertools for event handlers

You can also use event handlers to process events from different sources — using AppSync to process events from a serverless GraphQL request, for example. Meanwhile, you can use the API Gateway or ALB event handlers to process events from REST APIs. These event handlers come with a routing library, compression support, dynamic path expressions, and event, context, and response structures.

Lastly, Powertools supports Lambda functions developed in TypeScript, .NET, and Java.

Better logging with AWS Lambda Powertools

Python's standard logging library acquires and releases locks to process each log record, and its output is plain text by default. Getting structured output from it requires manual configuration to ensure you're emitting proper JSON that CloudWatch Logs Insights can query, so that your fields are auto-discovered and instantly searchable.
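To illustrate the configuration burden, here is roughly what a hand-rolled JSON formatter looks like with stdlib logging. This is a minimal sketch; the logger name and field names are illustrative, and a production version would also handle exceptions and extra fields:

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line."""

    def format(self, record):
        payload = {
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        return json.dumps(payload)


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("orders")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Each record is now one JSON object per line, queryable in CloudWatch Logs Insights
logger.info("order received")
```

Every field you want queryable has to be wired up by hand like this, which is exactly the boilerplate a purpose-built serverless logger can absorb.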

The AWS Lambda Powertools logger provides the flexibility you would typically get from configuring the stdlib logging library, but removes the configuration work needed to output compliant records. Pass a native dictionary you'd like to log, and serialization, tracing, and stream configuration happen automatically, in line with serverless best practices.

Type hinting your Lambda functions

We often create Lambda functions to integrate processing across distributed services. A simple example would be a CRUD component using an ALB, DynamoDB global tables, and a Lambda function to build a low-latency, active serverless system.

In this simplified example, the DynamoDB table might feed two Lambda functions:

  • A listener function that processes the table's data change stream into an SQS queue
  • A worker function that encodes the data change events and sends them on

Each component has its role to play, and there are good reasons for decoupling and separating responsibilities. That said, remembering all the method calls and their arguments, not to mention context switching between the elements of your application, can lead to excess developer overhead.

Fortunately, there’s a solution: the boto3-stubs project leverages mypy type hinting in standard Python, and it works automatically with VSCode, PyCharm, Emacs, Sublime Text, mypy, and pyright. By including and type hinting your boto3 clients and resources, you inherit fully typed annotations, code auto-completion, and type checking for all AWS services.

If you’re working with DynamoDB, simply install the mypy-boto3-dynamodb dependency and any contributing developers will have the same low barrier to entry and contribution for your Lambda functions. This functionality works with all Python code using the AWS Boto3 SDK, and it truly shines for Lambda function development, maintenance, and contribution.
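A sketch of the pattern, assuming mypy-boto3-dynamodb is installed as a dev dependency; its DynamoDBClient stub is imported only at type-check time, so there's no runtime cost, and the table and function names here are illustrative:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Stub type from mypy-boto3-dynamodb; never imported at runtime
    from mypy_boto3_dynamodb import DynamoDBClient


def get_order(dynamodb: "DynamoDBClient", table_name: str, order_id: str) -> dict:
    """Fetch one order item; the annotation drives IDE auto-completion."""
    response = dynamodb.get_item(
        TableName=table_name,
        Key={"order_id": {"S": order_id}},
    )
    return response.get("Item", {})
```

In production you'd pass in boto3.client("dynamodb"); in tests, any object exposing a matching get_item works, and mypy or pyright will flag calls that don't match the real SDK signature.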

Maximizing Lambda function efficiency

AWS Lambda is a flexible, cost-effective way to build scalable and reliable applications. By following the tips and best practices outlined above, you'll increase your Lambda functions' performance, reduce business costs, and improve the quality of your AWS serverless apps. Just as importantly, by supercharging your Python-based Lambda function development, you'll streamline your development workflow.


Dan Furman, Distinguished Engineer

Dan is a solutions architect, open source enthusiast, and cloud native advocate. With 15 years of experience, he is inspired by the innovation, speed, and trends that become best practices across programming languages. Dan's on a mission to make software delivery approachable, strategic, cost effective, and timely by thoughtfully building our toolbox.
