Go Performance Guide
Memory Management

Stack vs Heap Allocation

Understand Go's escape analysis and how to keep variables on the stack for better performance.

One of Go's most powerful optimizations happens automatically: escape analysis. The compiler determines whether variables should live on the fast, cache-friendly stack or the slower, garbage-collected heap. Understanding this mechanism lets you write code that naturally allocates efficiently.

Why Stack vs Heap Matters

Stack allocation is nearly free: it's just incrementing a pointer. Memory is automatically reclaimed when the function returns. It's also cache-friendly—the stack is usually in CPU cache.

Heap allocation requires the garbage collector to eventually free the memory. Heap allocations add pressure to GC, potentially causing pauses. They're also more scattered in memory, causing cache misses.

Go automatically decides where variables go, but you can influence these decisions with your code patterns.

How Go's Escape Analysis Works

The compiler performs static analysis to determine if a variable:

  • Outlives its creating function
  • Escapes to unknown places (interfaces, goroutines)
  • Is too large for the stack

If a variable doesn't escape, it's allocated on the stack. If it does escape, it's allocated on the heap.

What Causes Variables to Escape

1. Returning Pointers

The classic case: returning a pointer to a local variable forces heap allocation.

package main

type Point struct {
    X, Y int
}

// ESCAPES: Returning pointer to local
func NewPointEscape() *Point {
    p := Point{X: 10, Y: 20}
    return &p // p must escape to heap
}

// STAYS ON STACK: Returning by value
func NewPointStack() Point {
    p := Point{X: 10, Y: 20}
    return p // p is copied; no heap needed
}

// ESCAPES: Pointer stored in slice (persists beyond function)
func AddToSlice() []*Point {
    var points []*Point
    p := Point{X: 10, Y: 20}
    points = append(points, &p) // p escapes
    return points
}

2. Closures and Goroutines

Variables referenced by closures escape because the closure might outlive the containing function:

// ESCAPES: Variable captured by closure
func NewCounter() func() int {
    count := 0
    return func() int {
        count++
        return count
    }
}

// ESCAPES: Variable captured by goroutine
func StartMonitoring() chan int {
    results := make(chan int)
    data := make([]int, 100)

    go func() {
        // Closure captures 'data' and 'results'
        results <- len(data)
        close(results)
    }()

    return results
}

3. Interface Conversions

Storing a concrete value in an interface typically boxes it, which usually means a heap allocation, unless the compiler can prove the interface value does not escape:

// The interface{} parameter can force callers' arguments to be boxed
func ProcessInterface(i interface{}) {
    switch i.(type) {
    case int:
        // ...
    }
}

func main() {
    p := Point{X: 10, Y: 20}
    // p is boxed for the interface; it moves to the heap unless the
    // compiler can prove the interface value stays within the call
    ProcessInterface(p)
}

4. Slices of Pointers

Taking the address of a variable and storing it in a slice that outlives the function forces the variable onto the heap:

// ESCAPES: Address taken and stored
func CollectPointers() []*int {
    x := 42
    var ptrs []*int
    ptrs = append(ptrs, &x)
    return ptrs // x must escape
}

// STAYS: Addresses not taken
func CollectValues() []int {
    x := 42
    var vals []int
    vals = append(vals, x)
    return vals // No escape
}

5. Too-Large Objects

Objects above the compiler's size limits are heap-allocated no matter how they are used. The exact thresholds are compiler internals, but on current compilers they are on the order of 10 MB for declared variables and 64 KB for implicit allocations such as make with a non-constant size:

// ESCAPES: Too large for the stack, even though it never leaves the function
func LargeArray() {
    var arr [10_000_000]int // ~80 MB; exceeds the stack-size budget
    arr[0] = 1
}

// STAYS: Modest size, and its address never leaves the function
func SmallArray() {
    var arr [100]int
    arr[0] = 1
}

6. Returning Interface Results

Returning an interface causes the underlying value to escape:

// ESCAPES: The concrete *LogBuffer escapes into the returned io.Writer
type LogBuffer struct {
    data []byte
}

func (b *LogBuffer) Write(p []byte) (int, error) {
    b.data = append(b.data, p...)
    return len(p), nil
}

func NewWriter() io.Writer {
    buf := &LogBuffer{}
    return buf // escapes due to interface return
}

Checking Escape Analysis

Pass the compiler's -m flag via -gcflags with go build or go test to see its escape-analysis decisions:

# Show escape analysis decisions
go build -gcflags="-m" ./main.go

# More verbose output
go build -gcflags="-m -m" ./main.go

# Even more detail
go build -gcflags="-m=3" ./main.go

Example Output Analysis

// main.go
package main

type Point struct {
    X, Y int
}

func NewPointEscape() *Point {
    p := Point{X: 10, Y: 20}
    return &p
}

func NewPointStack() Point {
    p := Point{X: 10, Y: 20}
    return p
}

func main() {
    p1 := NewPointEscape()
    p2 := NewPointStack()
    _, _ = p1, p2
}

Run: go build -gcflags="-m" main.go

Output (exact line and column numbers depend on your file):

./main.go:9:9: &p escapes to heap
./main.go:8:2: moved to heap: p

This shows:

  • p in NewPointEscape → moved to the heap
  • p in NewPointStack → no message, meaning it stays on the stack

Patterns to Keep Variables on Stack

1. Return Values, Not Pointers

// GOOD: Returns value, stays on stack
func ComputeResult() Point {
    return Point{X: 10, Y: 20}
}

// AVOID: Returns pointer, goes to heap
func ComputeResultPtr() *Point {
    p := Point{X: 10, Y: 20}
    return &p
}

2. Use Value Receivers for Small Types

type Point struct {
    X, Y int
}

// GOOD: Value receiver; the receiver is copied and never forced to the heap
func (p Point) Distance() float64 {
    return math.Sqrt(float64(p.X*p.X + p.Y*p.Y))
}

// CAREFUL: A pointer receiver needs an addressable value and can make the
// receiver escape when the method is called through an interface
func (p *Point) DistancePtr() float64 {
    return math.Sqrt(float64(p.X*p.X + p.Y*p.Y))
}

3. Preallocate Slices Without Escaping

// GOOD: The backing array can stay on the stack when the capacity is a
// small compile-time constant and the slice never leaves the function
func ProcessItems() {
    items := make([]int, 0, 64)
    for i := 0; i < 64; i++ {
        items = append(items, i)
    }
    // Use items...
}

// AVOID: Returning slice header requires heap for backing array
func ProcessItemsReturn(count int) []int {
    items := make([]int, 0, count)
    for i := 0; i < count; i++ {
        items = append(items, i)
    }
    return items // Array escapes
}

4. Minimize Interface Conversions

// GOOD: Work with concrete types; nothing is boxed
func Process(x int) int {
    return x * 2
}

// AVOID: Assigning to interface{} boxes the value
// (fmt.Println would box its arguments too: its parameters are ...interface{})
func ProcessInterface(x int) {
    var i interface{} = x
    _ = i
}

Stack vs Heap Benchmark

package stackheap

import (
    "testing"
)

type DataPoint struct {
    X, Y, Z int64
}

func BenchmarkStackAlloc(b *testing.B) {
    b.ReportAllocs()
    b.ResetTimer()

    for i := 0; i < b.N; i++ {
        // Variables stay on stack
        p1 := DataPoint{X: 1, Y: 2, Z: 3}
        p2 := DataPoint{X: 4, Y: 5, Z: 6}
        _ = p1.X + p2.X
    }
}

var sinkPtr *DataPoint
var sinkIface interface{}

func BenchmarkHeapAlloc(b *testing.B) {
    b.ReportAllocs()
    b.ResetTimer()

    for i := 0; i < b.N; i++ {
        // Storing the pointer in a package-level sink forces a real heap
        // allocation; without it the compiler may keep the value on the stack
        p := &DataPoint{X: 1, Y: 2, Z: 3}
        sinkPtr = p
    }
}

func BenchmarkInterfaceEscape(b *testing.B) {
    b.ReportAllocs()
    b.ResetTimer()

    for i := 0; i < b.N; i++ {
        sinkIface = DataPoint{X: 1, Y: 2, Z: 3} // boxing allocates
    }
}

Representative results (numbers vary by machine and Go version):

BenchmarkStackAlloc-8      1000000000  0.5 ns/op   0 allocs/op
BenchmarkHeapAlloc-8       50000000    20 ns/op    1 allocs/op
BenchmarkInterfaceEscape-8 100000000   10 ns/op    1 allocs/op

In these runs, stack allocation is roughly 40x faster than heap allocation.

Escape Analysis Examples in Real Code

Example 1: HTTP Request Handler

type Handler struct {
    cache map[string]interface{}
}

// GOOD: Parse result stays local
func (h *Handler) parseRequest(r *http.Request) {
    // Config struct stays on stack
    type config struct {
        timeout time.Duration
        retries int
    }
    cfg := config{
        timeout: 30 * time.Second,
        retries: 3,
    }
    _ = cfg // use cfg locally
}

// AVOID: Storing in interface{}
func (h *Handler) cacheRequest(r *http.Request, data interface{}) {
    h.cache[r.URL.String()] = data // Interface boxed
}

Example 2: Temporary Buffers

// GOOD: Buffer stays on stack if size is small
func ProcessBuffer() {
    buf := make([]byte, 1024) // Small, preallocated buffer
    _ = buf // use buf locally
}

// If needed to return, buffer escapes
func ProcessBufferReturn() []byte {
    buf := make([]byte, 1024)
    // ... fill buffer ...
    return buf // Buffer escapes
}

Example 3: Error Handling

// GOOD: Error value doesn't escape
func Divide(a, b int) (int, error) {
    if b == 0 {
        return 0, errors.New("division by zero")
    }
    return a / b, nil
}

// AVOID: Wrapping in interface
type ErrorWrapper struct {
    err error
}

func DivideWrapped(a, b int) interface{} {
    if b == 0 {
        return ErrorWrapper{err: errors.New("division by zero")}
    }
    return a / b // Boxing to interface
}

Common Misconceptions

Myth 1: Pointers Are Always Slower

Not true. Dereferencing a pointer is cheap; it is the heap allocation and the GC pressure it creates that cost you. A pointer to a stack-allocated object is fast; a pointer into the heap may be slower due to cache misses.

Myth 2: Small Structs Always Stay on Stack

False. A small struct can escape if you return a pointer to it, store its address, or box it in an interface.

Myth 3: You Must Manually Manage Stack/Heap

Not in Go! The compiler handles it automatically. You just follow idiomatic patterns (value receivers, return values, etc.) and the compiler optimizes.

Profiling with Escape Analysis

Combine escape analysis with profiling:

# Build with escape analysis
go build -gcflags="-m" ./main.go 2>&1 | grep escapes

# Profile memory allocations
go test -bench=. -benchmem -cpuprofile=cpu.prof -memprofile=mem.prof
go tool pprof mem.prof
(pprof) top
(pprof) list FunctionName

Summary

Understanding stack vs heap allocation in Go:

  • Stack allocation: Nearly free, cache-friendly, automatic cleanup
  • Heap allocation: Requires GC, slower access patterns
  • Escape analysis: Compiler automatically chooses for you
  • Common escapes: Returning pointers, closures, interfaces, goroutines
  • Stack-friendly patterns: Value receivers, return values, avoid unnecessary pointers
  • Verify with: go build -gcflags="-m" ... to see compiler decisions
  • Benchmark: Use -benchmem to measure allocation impact

Write idiomatic Go code following these patterns, and the compiler will automatically optimize variable placement for you.
