Our REST API (Lambda + API Gateway) had a cold start problem. Every morning, the first request took 8-10 seconds. Users complained. We tried provisioned concurrency, but the bill was too high. Here is what actually worked.
The Problem
Cold starts happen when Lambda creates a new execution environment. This includes:
- Downloading the function code
- Starting the runtime
- Running the initialization code
- Establishing external connections (database, API clients)
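To see where the time goes, you can time the module scope yourself — everything above the handler runs exactly once per cold start. A minimal sketch (the `initMs` bookkeeping is our own, not a Lambda API):

```javascript
// Everything at module scope runs once, during the cold start.
const initStart = Date.now();

// ...imports and client setup would go here; each adds to init time...

const initMs = Date.now() - initStart;

export const handler = async () => {
  // On warm invocations the module scope does not run again,
  // so initMs reports the one-time cold-start cost.
  console.log(`init: ${initMs}ms`);
  return { initMs };
};
```

CloudWatch reports the same figure as `Init Duration` in the `REPORT` log line of cold invocations, so you can confirm your numbers there.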
Solution 1: Keep Connections Outside Handler
This was the biggest win. We moved database connections outside the handler:
// BEFORE - New connection on every request (pg client shown)
import { Client } from 'pg';

export const handler = async (event) => {
  const client = new Client({ connectionString: process.env.DB_URL });
  await client.connect(); // TCP + TLS + auth handshake, every invocation
  // use client
};
// AFTER - Reuse the connection across warm invocations
import { Client } from 'pg';

const client = new Client({ connectionString: process.env.DB_URL });
const connected = client.connect(); // starts during init, once per environment

export const handler = async (event) => {
  await connected; // already resolved on warm invocations
  // use client
};
Solution 2: Lazy Load Heavy Dependencies
We had this at the top of our file:
import heavyLibrary from 'heavy-library'; // 2MB
import anotherLib from 'another-lib'; // 1MB

export const handler = async (event) => {
  // These are loaded during init, even if this request never uses them!
};
We changed to:
export const handler = async (event) => {
  if (event.needsHeavyLib) {
    // dynamic import() returns a module namespace; grab the default export
    const { default: heavyLibrary } = await import('heavy-library');
    // use it only when needed
  }
};
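Node caches modules, so repeated `import()` calls for the same specifier are cheap, but you can make the caching explicit by memoizing the promise at module scope. A sketch of that pattern — here `node:zlib` stands in for the heavy dependency so the example is self-contained:

```javascript
// Memoize the dynamic import: the first request that needs the library
// pays the load cost; later warm requests await the same settled promise.
let heavyPromise;
const getHeavy = () => (heavyPromise ??= import('node:zlib'));

export const handler = async (event) => {
  if (event.needsHeavyLib) {
    const heavy = await getHeavy();
    return typeof heavy.gzipSync === 'function';
  }
  return false; // cold starts that skip this path never load the library
};
```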
Solution 3: HTTP Keep-Alive
For external API calls, we enabled keep-alive:
import https from 'https';

// Must be an https.Agent for HTTPS endpoints; an http.Agent is ignored there.
const agent = new https.Agent({ keepAlive: true });
const client = new FetchClient({ agent });
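The agent lives at module scope, so sockets (and their TLS sessions) survive between warm invocations. A fuller self-contained sketch using Node's built-in `https` module directly — `fetchJson` and the tuning values are illustrative, not from the original setup:

```javascript
import https from 'node:https';

// One shared agent keeps TCP + TLS sessions open between invocations
// in the same execution environment.
export const agent = new https.Agent({
  keepAlive: true,      // reuse sockets instead of closing after each request
  maxSockets: 50,       // cap concurrent sockets
  keepAliveMsecs: 1000, // initial delay before TCP keep-alive probes
});

// https.get accepts an `agent` option per request.
export const fetchJson = (url) =>
  new Promise((resolve, reject) => {
    https
      .get(url, { agent }, (res) => {
        let body = '';
        res.on('data', (chunk) => (body += chunk));
        res.on('end', () => resolve(JSON.parse(body)));
      })
      .on('error', reject);
  });
```

The first request to a host still pays the handshake; every subsequent request on a warm environment skips it.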
The Results
Our P99 latency dropped from 10s to 200ms. The morning peak (9-10 AM) is now smooth, and we didn't pay for provisioned concurrency.
When to Use Provisioned
If your function takes >30 seconds to start, or you have strict latency requirements, provisioned concurrency is worth it. But try these optimizations first.
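If you do go that route, provisioned concurrency is configured per version or alias. A sketch with the AWS CLI — the function name, alias, and count below are placeholders:

```shell
# Keep 5 execution environments pre-initialized for the 'live' alias
aws lambda put-provisioned-concurrency-config \
  --function-name my-api-fn \
  --qualifier live \
  --provisioned-concurrent-executions 5
```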