Gravity (Stacks)
We enter Phase IV. Programming books usually introduce data structures as "Abstract Data Types" or "Containers".
This is a lie.
A Data Structure is not a container. It is a negotiation with the machine. You promise to restrict your behavior, and in exchange, the machine grants you speed.
The Stack is the strictest negotiation of all.
The Contract: Gravity
The Stack enforces a brutal rule: You can only touch the top. You cannot move sideways. You cannot jump to the bottom. Gravity pulls everything down.
Why would anyone accept such a limited tool? Because in exchange, the machine gives you everything.
The Four Lenses
Let's audit the Stack against our hardware realities.
- Cache Locality (The Win): The "Top" of the stack is always hot. If you push X, then push Y, they sit next to each other in the same Cache Line. You almost never miss.
- TLB Pressure (The Win): A Stack grows linearly, staying within the same few pages. It reuses the same translation entries thousands of times. Page walks are vanishingly rare.
- Prefetching (The Win): Can the hardware guess what's next? Yes. It's either the next slot down or the previous slot up. Trivial.
- Coherence (The Win): Stacks are almost always thread-local. Core A's stack is invisible to Core B. No MESI traffic. No bus wars.
The Stack is the closest thing to a survivor the memory hierarchy allows. It is the only structure that the machine truly loves.
What Gravity Forbids
The Stack is fast because it forbids freedom.
- No Search: You cannot find arbitrary elements. You are blind to everything but the Top.
- No Waiting: You cannot delay consumption. Last In must be First Out.
- No Sharing: The stack resists coordination. It is private by definition.
- No Reordering: History is irreversible.
These are not design flaws. They are the price of speed.
The stack is not just for data. It is how the CPU remembers where it was. Without the stack, there is no call, no return, no recursion, no structured execution.
Why do recursive functions crash the stack?
Because the contract of "Gravity" assumes you will eventually stop falling. The Stack has finite physical space (often 1 MB to 8 MB). Infinite recursion breaks the negotiated limit.
Why are "Stack Allocations" faster than "Heap Allocations"?
malloc (Heap) has to find a suitable free block and update allocator metadata, which costs anywhere from dozens to thousands of instructions. A stack allocation (alloca, or a local variable) just subtracts a number from a register (SP). One instruction versus hundreds.
Why not put everything on the stack?
Because the stack assumes strict lifetime ordering. If data must outlive its creator, be shared, or be accessed out of order, the stack contract breaks. The machine gives speed only if you give up freedom.
What programmers usually get wrong here
They think the Stack is small because RAM is expensive. No. The Stack is small because it *must* stay hot in L1 cache to be fast. If the stack were 100GB, you would lose the locality benefit.
This works — until we need to wait. The Stack implies "Last In, First Out". But the real world often works "First In, First Out". We need a Line.