Serverless computing is a cloud model where you don’t manage any servers at all. You upload your code, and the platform handles everything: provisioning machines, scaling up during traffic spikes, scaling down to zero when idle, and even applying security patches. You only pay for the compute time your code actually uses. The term “serverless” is a bit misleading: there are still servers, you just don’t see or manage them. It’s like electricity: you don’t maintain a power plant, you just use what you need and pay for consumption. This guide covers deploying Mizu applications to AWS Lambda, Google Cloud Functions, and other serverless environments.

Understanding Serverless with Go

Serverless platforms work differently from traditional servers. Understanding these characteristics helps you design better serverless applications:
Aspect | Impact on Mizu Apps
Cold starts | When your function hasn’t run recently, the platform needs to start a new instance. This “cold start” adds latency (100-500ms). Warm instances respond immediately.
Stateless | Each request might run on a different instance, so you can’t store data in memory between requests. Use databases or caches instead.
Timeout limits | Functions have maximum execution times (15-30 minutes). Long-running tasks need different architectures.
Concurrency | Each instance typically handles one request at a time. High traffic means many parallel instances.
Why Go is great for serverless: Go’s compiled binary starts in ~50ms (compared to 500ms+ for Java or 200ms+ for Node.js), making cold starts much less painful. Go also uses little memory, which reduces costs since you pay for memory usage.
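To make the stateless constraint concrete, the sketch below keeps a package-level map only as a best-effort cache inside one warm instance and treats an external store as the source of truth. The Store interface and lookup helper are hypothetical names used for illustration:
package main

import (
    "context"
    "sync"
)

// Store stands in for whatever external database or cache service you use.
type Store interface {
    Get(ctx context.Context, key string) (string, error)
}

var (
    mu    sync.Mutex
    cache = map[string]string{} // survives only within one warm instance
)

// lookup treats the in-memory map as an optimization: a cold start or a
// request routed to a different instance simply falls through to the store.
func lookup(ctx context.Context, store Store, key string) (string, error) {
    mu.Lock()
    v, ok := cache[key]
    mu.Unlock()
    if ok {
        return v, nil
    }

    v, err := store.Get(ctx, key)
    if err != nil {
        return "", err
    }

    mu.Lock()
    cache[key] = v
    mu.Unlock()
    return v, nil
}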

AWS Lambda

AWS Lambda is the original and most popular serverless platform. You upload a function, and AWS runs it in response to events—HTTP requests, database changes, file uploads, or scheduled triggers. For web applications, you need a way to receive HTTP requests. AWS offers two options: Function URLs (simpler, direct HTTP access) or API Gateway (more features like rate limiting, API keys, and request transformation).

Lambda with Function URLs

Function URLs are the simplest way to invoke Lambda via HTTP. AWS gives you a URL like https://xyz123.lambda-url.us-east-1.on.aws/ that directly invokes your function. No API Gateway setup required. main.go:
package main

import (
    "bytes"
    "context"
    "net/http"
    "strings"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/go-mizu/mizu"
)

var app *mizu.App

func init() {
    // Initialize once, reuse across invocations
    app = mizu.New()

    app.Get("/", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"message": "Hello from Lambda!"})
    })

    app.Get("/users/{id}", func(c *mizu.Ctx) error {
        id := c.Param("id")
        return c.JSON(200, map[string]string{"id": id})
    })
}

func handler(ctx context.Context, req events.LambdaFunctionURLRequest) (events.LambdaFunctionURLResponse, error) {
    // Convert Lambda request to http.Request
    httpReq, err := convertRequest(ctx, req)
    if err != nil {
        return events.LambdaFunctionURLResponse{StatusCode: 500}, err
    }

    // Create response recorder
    recorder := &responseRecorder{
        statusCode: http.StatusOK, // default if the handler never calls WriteHeader
        headers:    make(http.Header),
        body:       &bytes.Buffer{},
    }

    // Handle request
    app.ServeHTTP(recorder, httpReq)

    // Convert to Lambda response
    return events.LambdaFunctionURLResponse{
        StatusCode:      recorder.statusCode,
        Headers:         flattenHeaders(recorder.headers),
        Body:            recorder.body.String(),
        IsBase64Encoded: false,
    }, nil
}

func main() {
    lambda.Start(handler)
}

// Helper types and functions...
type responseRecorder struct {
    statusCode int
    headers    http.Header
    body       *bytes.Buffer
}

func (r *responseRecorder) Header() http.Header         { return r.headers }
func (r *responseRecorder) Write(b []byte) (int, error) { return r.body.Write(b) }
func (r *responseRecorder) WriteHeader(code int)        { r.statusCode = code }

func convertRequest(ctx context.Context, req events.LambdaFunctionURLRequest) (*http.Request, error) {
    httpReq, err := http.NewRequestWithContext(ctx, req.RequestContext.HTTP.Method, req.RawPath, strings.NewReader(req.Body))
    if err != nil {
        return nil, err
    }
    httpReq.URL.RawQuery = req.RawQueryString
    for k, v := range req.Headers {
        httpReq.Header.Set(k, v)
    }
    return httpReq, nil
}

func flattenHeaders(h http.Header) map[string]string {
    flat := make(map[string]string)
    for k, v := range h {
        flat[k] = strings.Join(v, ",")
    }
    return flat
}
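Before deploying, you can exercise the handler with an ordinary Go test; a minimal sketch using the request types from aws-lambda-go’s events package. main_test.go:
package main

import (
    "context"
    "testing"

    "github.com/aws/aws-lambda-go/events"
)

func TestHandlerRoot(t *testing.T) {
    req := events.LambdaFunctionURLRequest{
        RawPath: "/",
        RequestContext: events.LambdaFunctionURLRequestContext{
            HTTP: events.LambdaFunctionURLRequestContextHTTPDescription{
                Method: "GET",
                Path:   "/",
            },
        },
    }

    resp, err := handler(context.Background(), req)
    if err != nil {
        t.Fatalf("handler returned error: %v", err)
    }
    if resp.StatusCode != 200 {
        t.Fatalf("expected status 200, got %d", resp.StatusCode)
    }
}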
Build and deploy:
# Build for Lambda
GOOS=linux GOARCH=amd64 CGO_ENABLED=0 go build -ldflags="-s -w" -o bootstrap main.go

# Create zip
zip function.zip bootstrap

# Create function
aws lambda create-function \
    --function-name myapp \
    --runtime provided.al2023 \
    --handler bootstrap \
    --zip-file fileb://function.zip \
    --role arn:aws:iam::123456789:role/lambda-role

# Create function URL
aws lambda create-function-url-config \
    --function-name myapp \
    --auth-type NONE

Lambda with API Gateway

API Gateway adds features on top of your function, such as rate limiting, API keys, and request transformation. template.yaml (AWS SAM):
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 30
    MemorySize: 256
    Runtime: provided.al2023
    Architectures:
      - x86_64

Resources:
  MyAppFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: bootstrap
      CodeUri: ./
      Events:
        CatchAll:
          Type: HttpApi
          Properties:
            Path: /{proxy+}
            Method: ANY
      Environment:
        Variables:
          ENV: production

Outputs:
  ApiUrl:
    Description: API Gateway URL
    Value: !Sub "https://${ServerlessHttpApi}.execute-api.${AWS::Region}.amazonaws.com"
Deploy with SAM:
# Build
sam build

# Deploy
sam deploy --guided
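With an HTTP API route, Lambda receives events.APIGatewayV2HTTPRequest values rather than Function URL events. A minimal handler sketch, reusing the responseRecorder and flattenHeaders helpers from the Function URL example above:
func apiGatewayHandler(ctx context.Context, req events.APIGatewayV2HTTPRequest) (events.APIGatewayV2HTTPResponse, error) {
    httpReq, err := http.NewRequestWithContext(ctx, req.RequestContext.HTTP.Method, req.RawPath, strings.NewReader(req.Body))
    if err != nil {
        return events.APIGatewayV2HTTPResponse{StatusCode: 500}, err
    }
    httpReq.URL.RawQuery = req.RawQueryString
    for k, v := range req.Headers {
        httpReq.Header.Set(k, v)
    }

    recorder := &responseRecorder{
        statusCode: http.StatusOK,
        headers:    make(http.Header),
        body:       &bytes.Buffer{},
    }
    app.ServeHTTP(recorder, httpReq)

    return events.APIGatewayV2HTTPResponse{
        StatusCode: recorder.statusCode,
        Headers:    flattenHeaders(recorder.headers),
        Body:       recorder.body.String(),
    }, nil
}
The adapter library in the next section does this conversion for you.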

Using aws-lambda-go-api-proxy

A library that simplifies Lambda + Mizu integration:
package main

import (
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/awslabs/aws-lambda-go-api-proxy/httpadapter"
    "github.com/go-mizu/mizu"
)

var adapter *httpadapter.HandlerAdapter

func init() {
    app := mizu.New()

    app.Get("/", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"status": "ok"})
    })

    app.Get("/users/{id}", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"id": c.Param("id")})
    })

    adapter = httpadapter.New(app)
}

func main() {
    lambda.Start(adapter.ProxyWithContext)
}

Google Cloud Functions

Google Cloud Functions is Google’s serverless platform. It’s tightly integrated with other Google Cloud services and has a simpler deployment model than Lambda—you can deploy directly from source code without building a container.

HTTP Function

The simplest approach is an HTTP-triggered function. Your function receives the standard http.ResponseWriter and *http.Request parameters, so Mizu works seamlessly. function.go:
package function

import (
    "net/http"
    "sync"

    "github.com/go-mizu/mizu"
)

var (
    app  *mizu.App
    once sync.Once
)

func initApp() {
    app = mizu.New()

    app.Get("/", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"message": "Hello from Cloud Functions!"})
    })

    app.Get("/users/{id}", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"id": c.Param("id")})
    })
}

// MyFunction is the Cloud Functions entry point
func MyFunction(w http.ResponseWriter, r *http.Request) {
    once.Do(initApp)
    app.ServeHTTP(w, r)
}
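Because the entry point is a plain http.HandlerFunc, you can run it locally with the standard library before deploying. A minimal sketch; the cmd/local/main.go location and the example.com/myapp module path are assumptions:
package main

import (
    "log"
    "net/http"

    function "example.com/myapp" // assumed module path of the package above
)

func main() {
    // Serve the Cloud Functions entry point on a local port for testing.
    log.Fatal(http.ListenAndServe(":8080", http.HandlerFunc(function.MyFunction)))
}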
Deploy:
gcloud functions deploy myapp \
    --runtime go122 \
    --trigger-http \
    --allow-unauthenticated \
    --entry-point MyFunction \
    --memory 256MB \
    --timeout 60s \
    --region us-central1
Cloud Functions 2nd gen is built on Cloud Run and offers better performance and higher limits. Deploy with the --gen2 flag:
gcloud functions deploy myapp \
    --gen2 \
    --runtime go122 \
    --trigger-http \
    --allow-unauthenticated \
    --entry-point MyFunction \
    --memory 256MB \
    --timeout 60s \
    --region us-central1 \
    --min-instances 1  # Avoid cold starts

Azure Functions

Azure Functions runs Go through custom handlers: your app is an ordinary HTTP server, and the Functions host forwards requests to the port it provides in FUNCTIONS_CUSTOMHANDLER_PORT. main.go:
package main

import (
    "log"
    "net/http"
    "os"

    "github.com/go-mizu/mizu"
)

func main() {
    app := mizu.New()

    app.Get("/api/hello", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"message": "Hello from Azure!"})
    })

    port := os.Getenv("FUNCTIONS_CUSTOMHANDLER_PORT")
    if port == "" {
        port = "8080"
    }

    log.Printf("Starting on port %s", port)
    log.Fatal(http.ListenAndServe(":"+port, app))
}
host.json:
{
  "version": "2.0",
  "customHandler": {
    "description": {
      "defaultExecutablePath": "myapp",
      "workingDirectory": "",
      "arguments": []
    },
    "enableForwardingHttpRequest": true
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[4.*, 5.0.0)"
  }
}
function.json (each function lives in its own folder containing this file):
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post", "put", "delete"],
      "route": "{*route}"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}

Vercel

Vercel supports Go with their Serverless Functions. api/index.go:
package handler

import (
    "net/http"

    "github.com/go-mizu/mizu"
)

var app *mizu.App

func init() {
    // init runs once per instance, so routes are registered a single time.
    app = mizu.New()

    app.Get("/api", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"status": "ok"})
    })

    app.Get("/api/users/{id}", func(c *mizu.Ctx) error {
        return c.JSON(200, map[string]string{"id": c.Param("id")})
    })
}

func Handler(w http.ResponseWriter, r *http.Request) {
    app.ServeHTTP(w, r)
}
vercel.json:
{
  "functions": {
    "api/index.go": {
      "runtime": "[email protected]"
    }
  },
  "rewrites": [
    {
      "source": "/api/(.*)",
      "destination": "/api"
    }
  ]
}
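You can exercise the handler locally with Go’s httptest package before deploying; a minimal sketch. api/index_test.go:
package handler

import (
    "net/http/httptest"
    "testing"
)

func TestHandler(t *testing.T) {
    req := httptest.NewRequest("GET", "/api/users/42", nil)
    rec := httptest.NewRecorder()

    Handler(rec, req)

    if rec.Code != 200 {
        t.Fatalf("expected status 200, got %d", rec.Code)
    }
}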

Optimizing for Serverless

Serverless performance is all about reducing cold starts and minimizing execution time (since you pay per millisecond). These optimizations can significantly reduce costs and improve user experience.

Reduce Cold Start Time

Cold starts happen when there’s no warm instance available to handle your request. The key insight is that package-level initialization runs once per instance, not once per request. Build the router and other always-needed setup in init() so it only runs during cold starts, and defer expensive, optional setup (like database connections) until the first request that needs it:
var (
    app *mizu.App
    db  *sql.DB
)

func init() {
    // Initialize during cold start, not per request
    app = mizu.New()
    setupRoutes(app)

    // Lazy database connection
    // Don't connect in init() - connect on first request
}

func setupRoutes(app *mizu.App) {
    app.Get("/", handleHome)
    app.Get("/users/{id}", handleGetUser)
}

func getDB() *sql.DB {
    if db == nil {
        var err error
        db, err = sql.Open("postgres", os.Getenv("DATABASE_URL"))
        if err != nil {
            log.Fatal(err)
        }
    }
    return db
}
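Most platforms send a warm instance one request at a time, so the lazy check above is usually fine. If your platform can deliver concurrent requests to a single instance (Cloud Run, or Cloud Functions gen2 with concurrency enabled), guard the initialization with sync.Once; a minimal sketch reusing the package-level db variable:
var (
    dbOnce sync.Once
    dbErr  error
)

// getDBOnce opens the pool exactly once, even under concurrent requests.
func getDBOnce() (*sql.DB, error) {
    dbOnce.Do(func() {
        db, dbErr = sql.Open("postgres", os.Getenv("DATABASE_URL"))
    })
    return db, dbErr
}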

Keep Connections Alive

// Package-level client so warm instances reuse TCP and TLS connections across invocations.
var httpClient = &http.Client{
    Timeout: 10 * time.Second,
    Transport: &http.Transport{
        MaxIdleConns:        100,
        MaxIdleConnsPerHost: 100,
        IdleConnTimeout:     90 * time.Second,
    },
}

func callExternalAPI(ctx context.Context) error {
    req, err := http.NewRequestWithContext(ctx, "GET", "https://api.example.com", nil)
    if err != nil {
        return err
    }
    resp, err := httpClient.Do(req)
    if err != nil {
        return err
    }
    defer resp.Body.Close()
    // ... read and handle the response
    return nil
}

Binary Size Optimization

# Minimal binary
CGO_ENABLED=0 go build -ldflags="-s -w" -o bootstrap

# Further compression with upx
upx --best bootstrap

Comparison

Platform | Cold Start | Max Timeout | Pricing
AWS Lambda | 100-500ms | 15 min | Per invocation
Cloud Functions | 100-500ms | 9 min (gen1), 60 min (gen2) | Per invocation
Azure Functions | 100-500ms | 10 min | Per invocation
Vercel | 100-300ms | 10 sec (hobby), 60 sec (pro) | Per invocation
Cloud Run | 0ms (min instances) | 60 min | Per request

When to Use Serverless

Good for:
  • Variable/unpredictable traffic
  • Event-driven workloads
  • Cost optimization at low scale
  • APIs with bursty traffic
Avoid for:
  • Long-running processes
  • WebSocket connections
  • High-volume, consistent traffic
  • Latency-sensitive applications

Next Steps