
Claude Code for Serverless: AWS Lambda, Cloudflare Workers, and Vercel Functions

Published: May 17, 2026
Read time: 9 min read
By: Claude Skills 360

Serverless functions come with platform-specific constraints that trip up developers: Lambda execution limits, Cloudflare Workers’ V8 isolate restrictions, cold starts, and the differences between Node.js and edge runtimes. Claude Code generates serverless code that respects these constraints — it knows which APIs are unavailable in edge runtimes, how to structure Lambda handlers, and how to work around cold start issues.

This guide covers serverless development with Claude Code: Lambda functions, Cloudflare Workers, Vercel Edge Functions, event processing, and deployment patterns.

Setting Up Claude Code for Serverless

Platform context prevents using unavailable APIs:

# Serverless Project Context

## Platform
- AWS Lambda (Node.js 20.x runtime), deployed via SAM
- Some functions as Vercel Edge Functions (V8 isolate — no Node.js APIs)
- Database: RDS PostgreSQL via RDS Proxy (Lambda) and Supabase API (Edge)

## Lambda Constraints
- Max timeout: 15 minutes (our timeout: 30s for API, 5min for batch)
- Max memory: 10GB (we use 512MB for API, 1GB for batch)
- Cold start budget: 500ms (keep handler.ts top-level code minimal)
- VPC: yes (for RDS access) — adds ~10ms warm / ~2s cold start

## Edge Runtime Constraints  
- No Node.js built-ins (no fs, no path, no crypto from Node)
- Web Crypto API only (crypto.subtle)
- No npm packages that use Node.js internals
- Max execution: 30s

## Conventions
- Structured logging: JSON with requestId, level, function name
- Error responses: { statusCode, body: JSON.stringify({ error: ... }) }
- Secrets via environment variables (not hardcoded)

See the CLAUDE.md setup guide for complete configuration.

AWS Lambda Functions

API Gateway Handler

Write a Lambda function for a REST endpoint.
POST /orders — create an order.
Input validation. Database write. Return created order.
Handle errors with proper status codes.
import type { APIGatewayProxyHandlerV2 } from 'aws-lambda';
import { Pool } from 'pg';
import { z } from 'zod';

// Connection pool — initialized once, reused across warm invocations
let db: Pool | undefined;

function getDb(): Pool {
  if (!db) {
    db = new Pool({
      connectionString: process.env.DATABASE_URL,
      max: 2, // Low max — Lambda functions share the proxy connection pool
    });
  }
  return db;
}

const CreateOrderSchema = z.object({
  items: z.array(z.object({
    productId: z.string().uuid(),
    quantity: z.number().int().min(1).max(100),
  })).min(1),
  shippingAddress: z.string().min(1),
});

export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  const requestId = event.requestContext.requestId;
  const log = (level: string, msg: string, data?: object) =>
    console.log(JSON.stringify({ level, msg, requestId, ...data }));
  
  try {
    // Parse and validate body
    let body: unknown;
    try {
      body = JSON.parse(event.body ?? '{}');
    } catch {
      return { statusCode: 400, body: JSON.stringify({ error: 'Invalid JSON' }) };
    }
    
    const result = CreateOrderSchema.safeParse(body);
    if (!result.success) {
      return {
        statusCode: 422,
        body: JSON.stringify({ error: 'Validation failed', details: result.error.issues }),
      };
    }
    
    const db = getDb();
    const order = await createOrder(db, result.data);
    
    log('info', 'Order created', { orderId: order.id });
    
    return {
      statusCode: 201,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(order),
    };
  } catch (error) {
    log('error', 'Unhandled error', { error: (error as Error).message });
    return {
      statusCode: 500,
      body: JSON.stringify({ error: 'Internal server error' }),
    };
  }
};

Key Lambda patterns Claude generates: module-level connection pooling (connections survive warm invocations), low pool size (Lambda + RDS Proxy recommends max 2-5 per function), structured JSON logging with requestId for CloudWatch correlation.

Event Processing (SQS)

Write a Lambda that processes SQS messages.
Each message is a job to process.
Handle partial failures — don't fail the whole batch on one error.
import type { SQSHandler, SQSBatchItemFailure } from 'aws-lambda';

export const handler: SQSHandler = async (event) => {
  const failures: SQSBatchItemFailure[] = [];
  
  // Process messages concurrently (up to 5 at a time)
  const batchSize = 5;
  for (let i = 0; i < event.Records.length; i += batchSize) {
    const batch = event.Records.slice(i, i + batchSize);
    
    await Promise.all(batch.map(async (record) => {
      try {
        const job = JSON.parse(record.body);
        await processJob(job);
      } catch (error) {
        console.error(JSON.stringify({
          level: 'error',
          msg: 'Job processing failed',
          messageId: record.messageId,
          error: (error as Error).message,
        }));
        
        // Report this message as failed — SQS will retry it
        failures.push({ itemIdentifier: record.messageId });
      }
    }));
  }
  
  // Return failed messages — SQS won't delete them, will retry
  return { batchItemFailures: failures };
};

batchItemFailures is SQS’s partial batch response — failed messages return to the queue for retry while successful ones are deleted. Without it, a single message failure would cause the entire batch to be retried (including the messages that succeeded). Note that partial batch responses only take effect when ReportBatchItemFailures is enabled on the function’s SQS event source mapping (FunctionResponseTypes in SAM); otherwise the return value is ignored.

Cold Start Optimization

My Lambda cold starts take 3 seconds.
Here's my handler file.
What's causing it and how do I fix it?

Claude identifies cold start culprits:

  • Large imports at module level (prefer modular AWS SDK v3 clients like @aws-sdk/client-s3 over monolithic imports)
  • Initializing database connections in the function body (move them to module level so warm invocations reuse them)
  • Large deployment bundles (use esbuild bundling with tree-shaking; on nodejs20.x the AWS SDK v3 is included in the runtime and can be marked external)
  • Unnecessary dependencies in the deployment package

It also recommends moving to arm64 (Architectures: [arm64] in SAM — Graviton-based Lambdas are cheaper and often faster) and Lambda SnapStart for JVM-based functions.
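The common thread in these fixes is deferring expensive setup and reusing it across warm invocations. A minimal sketch of the lazy module-level pattern — ExpensiveClient is a hypothetical stand-in for a real SDK client or connection pool, and the URL is a placeholder:

```typescript
// Hypothetical stand-in for a costly-to-construct client (DB pool, SDK client).
class ExpensiveClient {
  constructor(public readonly url: string) {}
}

// Module scope: survives across warm invocations of the same Lambda instance.
let client: ExpensiveClient | undefined;

function getClient(): ExpensiveClient {
  // The first (cold) invocation pays the construction cost; warm ones reuse it.
  client ??= new ExpensiveClient('https://example.internal'); // placeholder URL
  return client;
}

export const handler = async (): Promise<{ statusCode: number }> => {
  const c = getClient();
  void c; // use the client here
  return { statusCode: 200 };
};
```

The same shape applies to anything expensive to build: construct lazily, cache at module level, and keep the top of the handler file free of heavyweight work.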

Cloudflare Workers

Edge API Handler

Write a Cloudflare Worker that handles API requests.
Routes: GET /api/data, POST /api/process.
No Node.js APIs available — use Web APIs only.
export interface Env {
  DATABASE_URL: string;
  API_KEY: string;
  DB: D1Database; // Cloudflare D1
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);
    
    // CORS preflight
    if (request.method === 'OPTIONS') {
      return new Response(null, {
        headers: {
          'Access-Control-Allow-Origin': 'https://myfrontend.com',
          'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
          'Access-Control-Allow-Headers': 'Content-Type, Authorization',
        },
      });
    }
    
    try {
      if (url.pathname === '/api/data' && request.method === 'GET') {
        return handleGetData(request, env);
      }
      
      if (url.pathname === '/api/process' && request.method === 'POST') {
        return handleProcess(request, env, ctx);
      }
      
      return Response.json({ error: 'Not found' }, { status: 404 });
    } catch (error) {
      console.error('Worker error:', error);
      return Response.json({ error: 'Internal server error' }, { status: 500 });
    }
  },
};

async function handleProcess(
  request: Request, 
  env: Env, 
  ctx: ExecutionContext
): Promise<Response> {
  const body = await request.json();
  
  // Start background work without blocking the response
  ctx.waitUntil(processInBackground(body, env));
  
  return Response.json({ success: true, message: 'Processing started' });
}

ctx.waitUntil() runs background work after the response returns — the Worker stays alive to complete it. Without waitUntil, async work after return is killed when the request finishes.

Cloudflare D1 Queries

Query a D1 database from a Worker.
Get users with pagination.
async function getUsers(env: Env, page: number, pageSize: number): Promise<Response> {
  const offset = (page - 1) * pageSize;
  
  const { results } = await env.DB.prepare(
    'SELECT id, email, name, created_at FROM users ORDER BY created_at DESC LIMIT ? OFFSET ?'
  ).bind(pageSize, offset).all();
  
  const countRow = await env.DB.prepare('SELECT COUNT(*) as total FROM users').first<{ total: number }>();
  const total = countRow?.total ?? 0;
  
  return Response.json({
    users: results,
    pagination: { page, pageSize, total, pages: Math.ceil(total / pageSize) },
  });
}

D1 uses prepared statements (.prepare().bind()) to prevent SQL injection. Claude always uses parameterized queries.

Web Crypto API

Sign a webhook payload in a Cloudflare Worker.
HMAC-SHA256 signature. No Node.js crypto available.
async function generateWebhookSignature(payload: string, secret: string): Promise<string> {
  const encoder = new TextEncoder();
  
  const key = await crypto.subtle.importKey(
    'raw',
    encoder.encode(secret),
    { name: 'HMAC', hash: 'SHA-256' },
    false,
    ['sign']
  );
  
  const signature = await crypto.subtle.sign(
    'HMAC',
    key,
    encoder.encode(payload)
  );
  
  return Array.from(new Uint8Array(signature))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');
}

crypto.subtle is the Web Crypto API, available in all edge runtimes. Claude uses it instead of the Node.js crypto module when targeting edge functions.

Vercel Edge Functions

Write a Vercel Edge Function middleware that checks authentication
and redirects unauthenticated requests to /login.
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

export async function middleware(request: NextRequest) {
  const token = request.cookies.get('access_token')?.value;
  
  const isPublicPath = ['/login', '/signup', '/api/auth'].some(
    path => request.nextUrl.pathname.startsWith(path)
  );
  
  if (!isPublicPath && !token) {
    return NextResponse.redirect(new URL('/login', request.url));
  }
  
  if (token) {
    try {
      // Edge-compatible JWT verification (using Web Crypto)
      const payload = await verifyEdgeJwt(token, process.env.JWT_SECRET!);
      
      // Pass user ID to downstream handlers via header
      const response = NextResponse.next();
      response.headers.set('x-user-id', payload.sub);
      return response;
    } catch {
      // Invalid token — clear cookie and redirect
      const response = NextResponse.redirect(new URL('/login', request.url));
      response.cookies.delete('access_token');
      return response;
    }
  }
  
  return NextResponse.next();
}

export const config = {
  matcher: ['/((?!_next/static|_next/image|favicon.ico).*)'],
};

SAM Template for Lambda Deployment

Write a SAM template for a Lambda API with:
3 functions, API Gateway, environment variables from SSM,
VPC configuration for RDS access.

Claude generates the full template.yaml with AWS::Serverless::Function resources, AWS::Serverless::Api with CORS configuration, VpcConfig referencing Parameter Store values (not hardcoded VPC IDs), and proper IAM roles with least-privilege permissions.
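A trimmed sketch of what such a template looks like — one of the three functions shown, with hypothetical SSM parameter paths (/myapp/...) standing in for your own:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Parameters:
  SubnetIds:
    Type: AWS::SSM::Parameter::Value<List<String>>
    Default: /myapp/vpc/private-subnet-ids        # hypothetical SSM path
  SecurityGroupIds:
    Type: AWS::SSM::Parameter::Value<List<String>>
    Default: /myapp/vpc/lambda-security-group-ids # hypothetical SSM path

Resources:
  CreateOrderFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: handlers/orders.handler
      Runtime: nodejs20.x
      Architectures: [arm64]
      MemorySize: 512
      Timeout: 30
      Environment:
        Variables:
          DATABASE_URL: '{{resolve:ssm:/myapp/database-url}}'
      VpcConfig:
        SubnetIds: !Ref SubnetIds
        SecurityGroupIds: !Ref SecurityGroupIds
      Events:
        CreateOrder:
          Type: HttpApi
          Properties:
            Path: /orders
            Method: POST
```

Resolving VPC IDs through Parameter Store keeps the template portable across accounts and stages instead of hardcoding environment-specific identifiers.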

Serverless Patterns with Claude Code

Serverless has platform-specific nuances that are hard to keep straight across Lambda, Workers, and Edge Functions. The most effective approach: state your platform and its constraints in CLAUDE.md, make clear whether you’re targeting the Node.js Lambda runtime or an edge runtime, and name your deployment tool (SAM, Serverless Framework, Wrangler).

For the testing side of serverless, see the testing guide — Lambda functions test well as plain functions before deploying. For CI/CD that deploys serverless functions on merge, see the CI/CD guide. The Claude Skills 360 bundle includes serverless skill sets for Lambda event processing, Workers KV/D1/R2 patterns, and edge function middleware. Start with the free tier.

Put these ideas into practice

Claude Skills 360 gives you production-ready skills for everything in this article — and 2,350+ more. Start free or go all-in.
