Performance and Concurrency with Lambda
Pletratech is an AWS consulting partner specializing in developing applications using AWS services. Performance is an important factor for any serverless service using Lambda. In this article, we are going to look into performance and concurrency with Lambda.
When AWS runs your Lambda function, it is running it on servers somewhere in its infrastructure, despite the serverless nomenclature. These servers are actually EC2 instances, meaning your function is running on a virtual machine that matches one of the EC2 instance types. If you want to run your Lambda function on beefier hardware, you have a single parameter you can use to make that change: memory. Each Lambda is configured with a certain memory allocation. If you look at the detail of your Lambda under the Basic settings section, there's a Memory slider. This slider not only affects the memory available to your function, it also affects the CPU resources available to it. As you increase the memory, the CPU increases proportionally.
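You can change this setting outside the console as well. As a sketch, raising the memory allocation with the AWS CLI (the function name here is a placeholder) also raises the CPU share:

```shell
# Raise the memory allocation (and, proportionally, the CPU share)
# for a function. "my-function" is a placeholder name.
aws lambda update-function-configuration \
  --function-name my-function \
  --memory-size 1024
```
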
The exact CPU being provisioned isn't published, but the equation of more memory equals more CPU is easy enough to remember. So if the execution time of your Lambda is higher than you'd like, consider raising the allocated memory to get better hardware. When AWS is running your Lambda function and receives another invocation, it runs both concurrently. Again, Amazon handles the infrastructure and configuration for how all this works, so you don't need to worry about how the concurrency happens. You do, however, need to be aware of how many Lambdas are running concurrently, because AWS limits how many concurrent Lambdas can run at a time in a single account. It's a soft limit, so it can be raised, but it's really there to keep you from unintentionally being driven to the poorhouse by Lambda executions. Because you're charged by the millisecond of Lambda execution time, a limit on the number of concurrent invocations means there is a finite maximum cost that you can be billed for. If for some reason you have a rogue Lambda running like crazy, these are guardrails to help you out. You have even more control over concurrency.
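The "finite maximum cost" point can be sketched with a bit of arithmetic. The price per GB-second below is an assumption for illustration; check current AWS pricing for your region.

```python
# Back-of-the-envelope: with a concurrency limit in place, there is a
# finite maximum Lambda bill per unit of time, because at most
# `concurrency_limit` invocations can be running at once.
PRICE_PER_GB_SECOND = 0.0000166667  # assumed on-demand price in USD

def max_cost_per_hour(concurrency_limit: int, memory_mb: int) -> float:
    """Worst case: every concurrent slot runs non-stop for a full hour."""
    gb = memory_mb / 1024
    gb_seconds = concurrency_limit * gb * 3600
    return gb_seconds * PRICE_PER_GB_SECOND

# e.g. 1,000 concurrent executions at 128 MB caps out around $7.50/hour
cost = max_cost_per_hour(1000, 128)
```

Even a rogue Lambda invoked in a tight loop cannot exceed this ceiling, which is exactly the guardrail the account-level limit provides.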
In the detail for one of your Lambdas, at the bottom, there's a Concurrency section where you can reserve concurrent executions. This both reserves the concurrency, ensuring it's available to this Lambda when other functions are also executing concurrently, and sets a limit on the maximum number of concurrent invocations for this Lambda. If that limit is reached, the Lambda will retry the invocation a certain number of times, and if the retry limit is reached, the event will be sent to a dead-letter queue. If a Lambda invocation is stopped because a concurrency limit was hit, that run was throttled. The Throttles graph in the Monitoring section of your Lambda shows this activity, and the metric is also available through CloudWatch, so you can set an alarm to notify you if a Lambda begins to get throttled a lot, meaning it might be time to raise your concurrency limit.
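Both pieces of this — reserving concurrency and alarming on throttles — can be sketched with the AWS CLI. Function and alarm names here are placeholders, and the threshold is an arbitrary example:

```shell
# Reserve (and cap) concurrency for a single function.
aws lambda put-function-concurrency \
  --function-name my-function \
  --reserved-concurrent-executions 50

# Alarm when the function is throttled more than 5 times in a minute,
# using the Throttles metric in the AWS/Lambda CloudWatch namespace.
aws cloudwatch put-metric-alarm \
  --alarm-name my-function-throttles \
  --namespace AWS/Lambda \
  --metric-name Throttles \
  --dimensions Name=FunctionName,Value=my-function \
  --statistic Sum \
  --period 60 \
  --threshold 5 \
  --comparison-operator GreaterThanThreshold \
  --evaluation-periods 1
```
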
The Serverless Application Model
If you ever question how much AWS has invested in serverless architecture, look no further than the Serverless Application Model. Also called SAM, this open-source software provides many of the pieces you need to create a fully serverless application in AWS. Using the configuration template and a command-line application, you can define all the pieces of your serverless application, such as functions, event sources, and more, and then deploy them up to AWS. SAM is still heavily in development and is missing many features needed for a smooth serverless experience. The Serverless Framework, by comparison, not only does deployment but also has a plug-in architecture with many more capabilities. The core of SAM is a configuration template, quite similar to a CloudFormation template, that defines the serverless application. The template is actually deployed by CloudFormation, so a key property of this template is the Transform property. This tells CloudFormation that the template is written in the SAM syntax and instructs it to transform certain resources. The additional resource types available to declare in a SAM template are Serverless::Function, Serverless::API, and Serverless::SimpleTable.
Let’s take a look at each one of these.
A Serverless::Function is exactly what you would think, a Lambda function. What SAM does beyond creating a normal Lambda function is to also create an IAM execution role and create and configure any event source mappings for the function.
Serverless::API creates an API, resources, and methods for you based on a Swagger configuration. If you've never heard of Swagger, it's an API documentation and design framework that can be used to model an API completely. The Serverless::API resource type in a SAM template will take that design and create the real resources it defines.
Serverless::SimpleTable is a basic DynamoDB table that has no secondary indexes defined. If you just want a Dynamo table with basic primary key retrieval, you can use this to set it up quickly. If you need to configure your table beyond what’s available in SAM, you should just use the regular CloudFormation DynamoDB type. You can also define other normal CloudFormation resource types in your SAM template. When CloudFormation is doing the transform, it will pass over them without modification. Once your SAM template is written, here are the steps for deploying your SAM application to AWS.
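To make the template structure concrete, here is a minimal sketch of a SAM template combining a function with an API event source and a simple table. The resource names, handler, and runtime are hypothetical, and the Transform value is the standard SAM transform identifier:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31   # tells CloudFormation to apply the SAM transform
Resources:
  HelloFunction:                        # hypothetical function name
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler            # hypothetical handler
      Runtime: nodejs18.x               # assumed runtime
      CodeUri: ./src
      Events:
        HelloApi:                       # event source: HTTP GET /hello
          Type: Api
          Properties:
            Path: /hello
            Method: get
  ItemsTable:                           # hypothetical table name
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: id
        Type: String
```

During deployment, CloudFormation expands these SAM resource types into the underlying Lambda, API Gateway, IAM, and DynamoDB resources, while any regular CloudFormation resources in the template pass through untouched.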
Use the AWS CLI and run the package command. This takes your code location and SAM template, uploads the code to S3, and modifies the template to include the path to your code in S3.
Deploy that modified template to CloudFormation using the deploy command. CloudFormation takes over from there, creating the resources and deploying your Lambda code. If you locally make a change to your code and need to redeploy, you'd follow the same steps from the beginning. As you can probably tell, the Serverless Framework is a much smoother experience, which is why I used it for this course. Still, SAM shows promise, and as AWS updates it and continues to develop the SAM CLI tool, which is currently in beta, it may become the go-to utility for serverless applications in AWS.
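The two deployment steps above might look like this with the AWS CLI. The bucket name, template file names, and stack name are placeholders:

```shell
# Step 1: package -- uploads local code to S3 and rewrites the
# template so CodeUri points at the uploaded artifact.
aws cloudformation package \
  --template-file template.yaml \
  --s3-bucket my-deploy-bucket \
  --output-template-file packaged.yaml

# Step 2: deploy the rewritten template as a CloudFormation stack.
# CAPABILITY_IAM is needed because SAM creates IAM execution roles.
aws cloudformation deploy \
  --template-file packaged.yaml \
  --stack-name my-serverless-app \
  --capabilities CAPABILITY_IAM
```

Rerunning both commands after a local code change redeploys the updated function, which is the redeploy-from-the-beginning flow described above.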