Stack time complexity

Let n be the number of elements currently stored in the stack. The classic stack ADT only touches the top, so the standard operations run in constant time under the usual unit-cost model. What does change is space, overflow handling on a fixed buffer, and the occasional O(n) resize of a dynamic backing array, usually analyzed as O(1) amortized per push when you double capacity.

On this page
  • Core operations — push, pop, peek, isEmpty, isFull
  • By implementation — fixed array, linked list, dynamic array
  • Patterns — when a whole scan is O(n) (monotonic stack, parsing)
  • Space — what grows with n

1) Core ADT operations

These are the operations introduced in Stack operations. Each does a bounded amount of work: update one index or pointer, read one slot, or compare the size to zero.

Time complexity for standard stack operations (n = current size)

Operation      | Role                                                        | Time
push(x)        | Add x at the top (if there is room or allocation succeeds). | O(1) worst case on a fixed-capacity array or a singly linked list (insert at the head).
pop()          | Remove and return the top element (if not empty).           | O(1)
peek() / top() | Read the top without removing it.                           | O(1)
isEmpty()      | Test whether the stack has zero elements.                   | O(1)
isFull()       | Test against a fixed capacity (arrays only).                | O(1)

None of these requires scanning all n elements, which is why a stack is no substitute for a queue or a general list in "search anywhere" use cases.

2) By implementation (C-style)

Compare implementations before you code in an interview
Same operations, three backing structures

Operation | Fixed array                           | Linked list (top at head)                               | Dynamic array
push      | O(1) if not full; overflow otherwise. | O(1): allocate one node and relink the head.            | O(1) amortized; O(n) worst case when a resize copies all elements.
pop       | O(1)                                  | O(1): unlink the head.                                  | O(1) (shrink strategies are optional and less common).
peek      | O(1)                                  | O(1)                                                    | O(1)
isEmpty   | O(1) (top == -1)                      | O(1) (head == NULL)                                     | O(1)
isFull    | O(1)                                  | Usually not defined (unbounded until allocation fails). | O(1): compare size to capacity before growing.
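The linked-list column can be sketched in a few lines. This is an illustrative sketch (names like `ll_push` are mine, not from the page): with the top kept at the head, both operations touch exactly one node, so each is O(1).

```c
#include <stdlib.h>

typedef struct Node {
    int value;
    struct Node *next;
} Node;

/* O(1): allocate one node and relink the head. */
int ll_push(Node **head, int v) {
    Node *n = malloc(sizeof *n);
    if (!n)
        return 0;               /* allocation failed */
    n->value = v;
    n->next = *head;
    *head = n;
    return 1;
}

/* O(1): unlink the head node. */
int ll_pop(Node **head, int *out) {
    if (!*head)
        return 0;               /* underflow */
    Node *n = *head;
    *out = n->value;
    *head = n->next;
    free(n);
    return 1;
}
```

Note that neither function contains a loop: the cost is independent of how many nodes sit below the head.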

Why it stays O(1)

Push only talks to the top: update one slot and one index or pointer. No loop over all stored values runs for a normal push or pop.

Whole-array algorithms: a different question

Scans such as the monotonic-stack pattern run in O(n) total because each index is pushed and popped a bounded number of times, not because a single push costs n.
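As a concrete instance of that accounting, here is a sketch of the classic next-greater-element scan (function and array names are illustrative, not from the page). Each index is pushed once and popped at most once, so the nested loops still do O(n) total work.

```c
/* For each i, out[i] = the first value to the right of a[i] that is
   larger, or -1 if none exists. Total O(n): every index enters the
   stack once and leaves at most once. */
void next_greater(const int *a, int n, int *out) {
    int stack[64];              /* stores indices; sketch assumes n <= 64 */
    int top = -1;
    for (int i = 0; i < n; i++) {
        /* Pop every index whose answer is a[i]. */
        while (top >= 0 && a[stack[top]] < a[i])
            out[stack[top--]] = a[i];
        stack[++top] = i;
    }
    while (top >= 0)            /* leftovers have no greater element */
        out[stack[top--]] = -1;
}
```

The inner while looks quadratic, but the pop count across the whole run is bounded by the push count, which is exactly n.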

3) Space complexity

Extra memory besides the stored elements themselves
Structure      | Elements                                                              | Auxiliary
Fixed array    | O(C) slots reserved up to capacity C, regardless of n (n ≤ C stored). | O(1) for top (and perhaps the capacity).
Linked list    | O(n) nodes for n pushed elements.                                     | O(1) head pointer; each node also stores one next pointer.
Dynamic array  | O(n) occupied; capacity ≥ n (within a constant factor after resizes). | O(1) size and capacity integers.
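The dynamic-array rows assume a grow-by-doubling policy. A minimal sketch (struct and function names are illustrative, not from the page): most pushes hit the O(1) fast path, and the rare O(n) resize amortizes to O(1) per push.

```c
#include <stdlib.h>

typedef struct {
    int *data;
    int size, cap;
} DynStack;

/* O(1) amortized: doubling on the rare full case means n pushes
   copy at most about 2n elements in total. */
int dyn_push(DynStack *s, int v) {
    if (s->size == s->cap) {                    /* rare O(n) path */
        int newcap = s->cap ? s->cap * 2 : 1;
        int *p = realloc(s->data, newcap * sizeof *s->data);
        if (!p)
            return 0;                           /* grow failed */
        s->data = p;
        s->cap = newcap;
    }
    s->data[s->size++] = v;                     /* common O(1) path */
    return 1;
}
```

Initializing with `DynStack s = {0};` works because `realloc(NULL, ...)` behaves like `malloc`.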

4) O(1) in code (array top)

A typical array stack updates only top and one slot—constant work.

#define CAP 64
int stack[CAP];
int top = -1;

/* Time O(1): one compare, one index write. */
int push(int v) {
    if (top + 1 >= CAP)
        return 0; /* overflow */
    stack[++top] = v;
    return 1;
}

/* Time O(1): one compare, one decrement. */
int pop(int *out) {
    if (top < 0)
        return 0;
    *out = stack[top--];
    return 1;
}

See Stack using array and Stack using dynamic array for full implementations and overflow/grow policy.

5) Interview phrasing

  • “What is the complexity of stack operations?” — Usually: push, pop, peek in O(1) time; space O(n) for n stored items.
  • “Dynamic array push?” — Say O(1) amortized if you double capacity; mention the rare O(n) resize.
  • “Monotonic stack on an array?” — Total time O(n) for the scan, not O(n) per single ADT push.

Key takeaway

All standard stack ADT operations are O(1) time when you only touch the top. Total O(n) shows up in algorithms that use a stack over data (parsing, monotonic passes)—count pushes and pops across the whole run, not one call.

See also: Stack operations · Monotonic stack · Stack tutorial