Solving AWS Lambda Cold Start Problems
August 12, 2025
8 min read

Cold starts were killing my serverless API's performance. Here's how I got them from 3 seconds down to 400ms without spending extra money.

Tags: AWS, Lambda, Serverless, DevOps

My Lambda function was taking 3 seconds to respond on cold starts. For an API that's supposed to be fast, that's embarrassing.

Users were complaining. I was losing potential customers. And I couldn't afford provisioned concurrency (that stuff gets expensive fast).

Here's how I fixed it without breaking the bank.

What's Actually Happening

Cold starts happen when AWS needs to spin up a new container for your function. This happens when:

  • Your function hasn't run in a while (usually 15-20 minutes)
  • Traffic spikes beyond current capacity
  • AWS does infrastructure updates

For my Node.js function, cold starts were taking 2-3 seconds. Warm starts? 80ms. Huge difference.

What Worked: Reduce Package Size

This was the biggest win. My deployment package was 45MB because I was bundling everything.

I switched to esbuild and got it down to 2.8MB:

```javascript
// esbuild.config.js — run with: node esbuild.config.js
require('esbuild').build({
  entryPoints: ['src/index.js'],
  bundle: true,
  minify: true,
  platform: 'node',
  target: 'node18',
  outfile: 'dist/index.js',
  external: ['@aws-sdk/*'] // The v3 SDK ships with the node18 runtime, so don't bundle it
}).catch(() => process.exit(1))
```

Cold starts dropped from 3 seconds to 800ms. Just from making the package smaller.

Lazy Load Heavy Dependencies

I was loading everything at the top of my file:

```javascript
// Bad: load everything upfront
const { S3Client } = require('@aws-sdk/client-s3')
const axios = require('axios')
const sharp = require('sharp')      // Image processing - 8MB!
const pdf = require('pdf-lib')      // PDF generation - 5MB!

exports.handler = async (event) => {
  // Most requests don't need sharp or pdf-lib,
  // but every cold start pays to load them anyway
}
```

Changed it to load only when needed:

```javascript
// Good: load on demand
exports.handler = async (event) => {
  if (event.action === 'resize-image') {
    const sharp = require('sharp')
    // ... use sharp
  }
  if (event.action === 'generate-pdf') {
    const pdf = require('pdf-lib')
    // ... use pdf-lib
  }
}
```

Cold starts dropped to 400ms. That's acceptable.
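Worth knowing: Node's `require()` caches modules after the first load, which is what makes this pattern cheap on warm invocations — repeat requests don't pay the load cost again. A minimal sketch (all names illustrative) that makes the "load on first use" intent explicit and easy to stub in tests:

```javascript
// Sketch: an explicit lazy-loader cache. require() already memoizes
// modules, but a wrapper documents the intent and is easy to stub.
const loaded = {}

function lazy(name, loader) {
  if (!(name in loaded)) {
    loaded[name] = loader() // runs only on first use
  }
  return loaded[name]
}

// Inside a handler you'd write:
//   const sharp = lazy('sharp', () => require('sharp'))
```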

The Free Way to Keep Functions Warm

Provisioned concurrency costs money. Like, $35/month per function minimum. I'm running 4 functions. That's $140/month just to avoid cold starts.

Instead, I use an EventBridge (formerly CloudWatch Events) schedule to ping my functions every 5 minutes:

```javascript
// warmer.js — uses the v3 SDK bundled with the node18 runtime
const { LambdaClient, InvokeCommand } = require('@aws-sdk/client-lambda')
const lambda = new LambdaClient({})

exports.handler = async () => {
  const functions = [
    'api-handler',
    'image-processor',
    'email-sender'
  ]
  await Promise.all(
    functions.map(fn =>
      lambda.send(new InvokeCommand({
        FunctionName: fn,
        InvocationType: 'Event', // fire-and-forget: don't wait for a response
        Payload: JSON.stringify({ warmer: true })
      }))
    )
  )
}
```

Cost: $0.20/month. Way better than $140/month.
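For the every-5-minutes trigger itself, here's a sketch of what the wiring could look like as a Serverless Framework fragment (the handler path and function name are assumptions; any tool that creates an EventBridge schedule rule works the same way):

```yaml
# serverless.yml fragment (illustrative): invoke the warmer on a schedule
functions:
  warmer:
    handler: warmer.handler
    events:
      - schedule: rate(5 minutes)
```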

Handle Warmer Requests

Your function needs to recognize warmer pings and exit early:

```javascript
exports.handler = async (event) => {
  // Exit early for warmer pings
  if (event.warmer) {
    return { statusCode: 200, body: 'warmed' }
  }

  // Actual logic
  const result = await processRequest(event)
  return result
}
```
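If you run several functions, you can factor the early exit into a small wrapper instead of repeating it. This is a sketch, not the post's actual code, and the names are illustrative:

```javascript
// Sketch: wrap any handler so warmer pings short-circuit before real work.
function withWarmer(handler) {
  return async (event, context) => {
    if (event && event.warmer) {
      return { statusCode: 200, body: 'warmed' }
    }
    return handler(event, context)
  }
}

// Then each function exports its wrapped handler:
// exports.handler = withWarmer(async (event) => { /* actual logic */ })
```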

Track Cold Starts

I added simple logging to see how often cold starts happen:

```javascript
let isWarm = false // module scope: survives across warm invocations

exports.handler = async (event) => {
  const isColdStart = !isWarm
  isWarm = true
  const start = Date.now()

  // Your logic here

  console.log(JSON.stringify({
    coldStart: isColdStart,
    duration: Date.now() - start
  }))
}
```

Now I can see in CloudWatch Logs how my optimizations are working.
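Because each log line is a single JSON object, it's also easy to post-process offline. A sketch (assuming you've already exported the matching log lines, one JSON object per line) that computes the cold-start rate:

```javascript
// Sketch: compute the cold-start rate from exported structured log lines.
// Assumes one JSON object per line, shaped like the console.log above.
function coldStartRate(lines) {
  const entries = lines
    .map(line => JSON.parse(line))
    .filter(entry => typeof entry.coldStart === 'boolean')
  if (entries.length === 0) return 0
  const cold = entries.filter(entry => entry.coldStart).length
  return cold / entries.length
}
```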

What Didn't Work

Provisioned Concurrency: Too expensive for my budget. Maybe worth it if you're making serious money, but not for side projects.

Switching to containers: Considered moving to ECS, but that's even more expensive and complex.

Using a different runtime: Tried Python thinking it'd be faster. It wasn't. Node.js is fine.

The Results

Before:

  • Cold starts: 3 seconds
  • Warm starts: 80ms
  • User complaints: weekly

After:

  • Cold starts: 400ms
  • Warm starts: 60ms
  • User complaints: none

Cost: $0.20/month for the warmer function.

Should You Bother?

If your Lambda functions are internal tools that run on a schedule, cold starts don't matter. Don't optimize.

If you're building a user-facing API, cold starts matter a lot. A 3-second delay feels broken.

The good news: you can fix most cold start issues without spending extra money. Just make your package smaller and keep functions warm with a simple ping.


Fighting Lambda cold starts? I'd love to hear what worked for you. Hit me up on LinkedIn.