
AVL Tree

A self-balancing BST that restores balance after every insert or delete
search: \(O(\log n)\) · insert: \(O(\log n)\) · rotate: \(O(1)\) · space: \(O(n)\)

An AVL tree is a Binary Search Tree that enforces a strict balance rule after every modification. Named after its inventors Adelson-Velsky and Landis (1962), it was the first self-balancing BST ever invented.

The rule is simple: for every node, the heights of its left and right subtrees must differ by at most 1. This guarantee keeps the tree height at \(O(\log n)\) always — no degenerate cases possible.
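To see what goes wrong without that rule, here is a minimal sketch of a plain BST with no rebalancing (names like `bst_insert` are illustrative, not from a library): feeding it sorted input produces a linked-list shape whose height equals the number of keys.

```python
# A plain BST with no rebalancing: sorted inserts degenerate into a chain.
class BSTNode:
    def __init__(self, val):
        self.val = val
        self.left = self.right = None

def bst_insert(node, val):
    if node is None:
        return BSTNode(val)
    if val < node.val:
        node.left = bst_insert(node.left, val)
    else:
        node.right = bst_insert(node.right, val)
    return node

def bst_height(node):
    if node is None:
        return 0
    return 1 + max(bst_height(node.left), bst_height(node.right))

root = None
for v in range(1, 11):       # sorted input: worst case for a plain BST
    root = bst_insert(root, v)

print(bst_height(root))      # 10 — height equals n; an AVL tree of 10 keys stays at height 4
```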

The balance factor

Every node stores its balance factor:

pseudocode

balance_factor = height(left_subtree) - height(right_subtree)

In a valid AVL tree, every node's balance factor must be one of: -1, 0, or +1.

  • -1 — right subtree is one level taller
  • 0 — perfectly balanced at this node
  • +1 — left subtree is one level taller

After any insert or delete, if any node's balance factor becomes ±2, the tree must be repaired using rotations.
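A concrete illustration of the ±2 trigger (a hand-built sketch; `balance_factor` here recomputes heights recursively rather than storing them): adding a third key to a two-node chain pushes the root outside the legal range.

```python
# Build the right-right chain 10 -> 20 -> 30 by hand, with no rotation applied yet.
class Node:
    def __init__(self, val):
        self.val = val
        self.left = self.right = None

def height(n):
    return 1 + max(height(n.left), height(n.right)) if n else 0

def balance_factor(n):
    return height(n.left) - height(n.right)

root = Node(10)
root.right = Node(20)
root.right.right = Node(30)

print(balance_factor(root))  # -2 — outside {-1, 0, +1}, so a rotation is required
```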


AVL Tree: Insert with Rotations

class Node:
    def __init__(self, val):
        self.val = val
        self.left = self.right = None
        self.height = 1  # height of the subtree rooted at this node

def height(n): return n.height if n else 0
def bf(n): return height(n.left) - height(n.right) if n else 0
def update(n): n.height = 1 + max(height(n.left), height(n.right))

def rot_right(y):
    x = y.left; y.left = x.right; x.right = y
    update(y); update(x); return x

def rot_left(x):
    y = x.right; x.right = y.left; y.left = x
    update(x); update(y); return y

def insert(node, val):
    if not node: return Node(val)
    if val < node.val: node.left = insert(node.left, val)
    elif val > node.val: node.right = insert(node.right, val)
    else: return node  # ignore duplicates
    update(node)
    b = bf(node)
    if b > 1 and val < node.left.val: return rot_right(node)    # Left-Left
    if b < -1 and val > node.right.val: return rot_left(node)   # Right-Right
    if b > 1 and val > node.left.val:                           # Left-Right
        node.left = rot_left(node.left); return rot_right(node)
    if b < -1 and val < node.right.val:                         # Right-Left
        node.right = rot_right(node.right); return rot_left(node)
    return node

def inorder(n): return inorder(n.left) + [n.val] + inorder(n.right) if n else []

root = None
for v in [10, 20, 30, 40, 50, 25]:
    root = insert(root, v)
print(inorder(root))  # [10, 20, 25, 30, 40, 50]
print(height(root))   # 3
Output
[10, 20, 25, 30, 40, 50]
3
Note: AVL trees guarantee height ≤ 1.44 × \(\log_2(n)\). That means even in the worst case, a perfectly valid AVL tree of 1 million nodes has at most ~29 levels — search is always fast.
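The ~29-level figure follows directly from that bound; a quick arithmetic check:

```python
import math

# Upper bound on AVL height for n = 1,000,000 keys: 1.44 * log2(n).
n = 1_000_000
bound = 1.44 * math.log2(n)
print(round(bound, 1))  # 28.7 — so at most ~29 levels
```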

Rotations — the repair tool

A rotation restructures a small part of the tree to restore balance without violating the BST property. There are four rotation types, but only two fundamental ones:

Left Rotation

Used when the right subtree is too tall (balance factor = -2) due to a right-right insertion:

pseudocode
A              B
 \            / \
  B     →    A   C
   \
    C

B becomes the new root, A becomes B's left child, and B's old left child (if any) becomes A's right child.
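A minimal sketch of that left rotation on the diagram's A → B → C chain (bare `Node` class and string labels are illustrative; heights are omitted to isolate the pointer moves):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def rot_left(a):
    # B becomes the new root; B's old left child becomes A's right child.
    b = a.right
    a.right = b.left
    b.left = a
    return b

# Right-right chain from the diagram: A -> B -> C
a = Node('A', right=Node('B', right=Node('C')))
root = rot_left(a)
print(root.val, root.left.val, root.right.val)  # B A C
```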

Right Rotation

The mirror case — used when the left subtree is too tall (balance factor = +2) due to a left-left insertion:

pseudocode
    C          B
   /          / \
  B     →    A   C
 /
A

Double rotations

For left-right and right-left cases (the imbalance is "zigzag" shaped), you need two rotations:

  • Left-Right: Left rotate the child, then right rotate the root
  • Right-Left: Right rotate the child, then left rotate the root

When to use AVL vs Red-Black

AVL trees are more strictly balanced than Red-Black trees, making lookups slightly faster (shorter maximum path). However, they perform more rotations on insert and delete, making writes slightly slower. Choose AVL when read-heavy workloads dominate. Most language standard libraries (Java TreeMap, C++ std::map) use Red-Black trees for their better write performance.

Complexity

  • Search: \(O(\log n)\). Height guaranteed ≤ 1.44 × \(\log_2 n\)
  • Insert: \(O(\log n)\). Walk down + at most \(O(\log n)\) rebalances
  • Delete: \(O(\log n)\). Walk down + rebalance on the way up
  • Rotation: \(O(1)\). Just pointer reassignments, constant work
  • Space: \(O(n)\). One node per value, plus height metadata

AVL Balance Factor Check

class Node:
    def __init__(self, val, h=1, left=None, right=None):
        self.val = val; self.h = h; self.left = left; self.right = right

def height(n): return n.h if n else 0
def balance_factor(n): return height(n.left) - height(n.right) if n else 0

def check_avl(n):
    if not n: return True
    bf = balance_factor(n)
    if abs(bf) > 1:
        print(f"Unbalanced at {n.val}: bf={bf}")
        return False
    return check_avl(n.left) and check_avl(n.right)

# Manually build a balanced tree (heights stored):
#              2 (h=3)
#             /       \
#         1 (h=2)    4 (h=2)
#         /          /     \
#     0 (h=1)    3 (h=1)  5 (h=1)
n0 = Node(0); n1 = Node(1, 2, n0); n3 = Node(3); n5 = Node(5)
n4 = Node(4, 2, n3, n5); root = Node(2, 3, n1, n4)
print(check_avl(root))  # True

# Skewed chain (balance factor = 2 at root)
skew_l = Node(1); skew_m = Node(2, 2, skew_l); skew_r = Node(3, 3, skew_m)
print(check_avl(skew_r))  # prints Unbalanced + False
Output
True
Unbalanced at 3: bf=2
False

Quick check

A node in an AVL tree has balance factor +2. What must happen?

Continue reading

  • BST Balancing: an unbalanced BST is just a slow linked list in disguise
  • Red-Black Tree: a self-balancing BST used inside Java's TreeMap and C++'s std::map