8 Priority Queues

A Priority Queue S is a dynamic set that supports the following operations:

▶ S.build(x1, . . . , xn): Creates a data-structure that contains just the elements x1, . . . , xn.
▶ S.insert(x): Adds element x to the data-structure.
▶ element S.minimum(): Returns an element x ∈ S with minimum key-value key[x].
▶ element S.delete-min(): Deletes the element with minimum key-value from S and returns it.
▶ boolean S.is-empty(): Returns true if the data-structure is empty and false otherwise.

An addressable Priority Queue also supports:

▶ handle S.insert(x): Adds element x to the data-structure, and returns a handle to the object for future reference.
▶ S.delete(h): Deletes the element specified through handle h.
▶ S.decrease-key(h, k): Decreases the key of the element specified by handle h to k. Assumes that the key is at least k before the operation.

Sometimes we also have

▶ S.merge(S′): S := S ∪ S′; S′ := ∅.


Dijkstra's Shortest Path Algorithm

Algorithm 18 Shortest-Path(G = (V, E, d), s ∈ V)
 1: Input: weighted graph G = (V, E, d); start vertex s;
 2: Output: key-field of every node contains distance from s;
 3: S.build();  // build empty priority queue
 4: for all v ∈ V \ {s} do
 5:     v.key ← ∞;
 6:     hv ← S.insert(v);
 7: s.key ← 0; S.insert(s);
 8: while S.is-empty() = false do
 9:     v ← S.delete-min();
10:     for all x ∈ V s.t. (v, x) ∈ E do
11:         if x.key > v.key + d(v, x) then
12:             S.decrease-key(hx, v.key + d(v, x));
13:             x.key ← v.key + d(v, x);

Prim's Minimum Spanning Tree Algorithm

Algorithm 19 Prim-MST(G = (V, E, d), s ∈ V)
 1: Input: weighted graph G = (V, E, d); start vertex s;
 2: Output: pred-fields encode MST;
 3: S.build();  // build empty priority queue
 4: for all v ∈ V \ {s} do
 5:     v.key ← ∞;
 6:     hv ← S.insert(v);
 7: s.key ← 0; S.insert(s);
 8: while S.is-empty() = false do
 9:     v ← S.delete-min();
10:     for all x ∈ V s.t. {v, x} ∈ E do
11:         if x.key > d(v, x) then
12:             S.decrease-key(hx, d(v, x));
13:             x.key ← d(v, x);
14:             x.pred ← v;
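The following is a minimal runnable sketch of Dijkstra's algorithm in Python, not the slides' own code. Since Python's heapq module has no decrease-key, the sketch uses the common lazy-deletion variant (re-inserting an entry and skipping stale ones), which is functionally equivalent here; the graph representation and all names are my own assumptions.

import heapq

def dijkstra(graph, s):
    """graph: dict mapping vertex -> list of (neighbor, edge_weight) pairs."""
    dist = {v: float('inf') for v in graph}
    dist[s] = 0
    pq = [(0, s)]                      # (key, vertex); plays the role of S
    while pq:                          # S.is-empty() = false
        d, v = heapq.heappop(pq)       # S.delete-min()
        if d > dist[v]:                # stale entry: decrease-key was emulated by re-insert
            continue
        for x, w in graph[v]:
            if dist[x] > dist[v] + w:  # relaxation as in lines 11-13
                dist[x] = dist[v] + w
                heapq.heappush(pq, (dist[x], x))
    return dist

# example usage on a small weighted graph
g = {'s': [('a', 2), ('b', 5)], 'a': [('b', 1)], 'b': []}
print(dijkstra(g, 's'))   # {'s': 0, 'a': 2, 'b': 3}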

Analysis of Dijkstra and Prim

Both algorithms require:

▶ 1 build() operation
▶ |V| insert() operations
▶ |V| delete-min() operations
▶ |V| is-empty() operations
▶ |E| decrease-key() operations

How good a running time can we obtain?

Operation      Binary Heap   BST       Binomial Heap   Fibonacci Heap*
build          n             n log n   n log n         n
minimum        1             log n     log n           1
is-empty       1             1         1               1
insert         log n         log n     log n           1
delete         log n**       log n     log n           log n
delete-min     log n         log n     log n           log n
decrease-key   log n         log n     log n           1
merge          n             n log n   log n           1

Note that most applications use build() only to create an empty heap, which then costs time 1.

* Fibonacci heaps only give an amortized guarantee.
** The standard version of binary heaps is not addressable. Hence, it does not support a delete.


8.1 Binary Heaps

▶ Nearly complete binary tree; only the last level is not full, and this one is filled from left to right.

▶ Heap property: A node's key is not larger than the keys of its children.

(Figure: an example binary min-heap with root 7.)

Using Binary Heaps, Prim and Dijkstra run in time O((|V| + |E|) log |V|).

Using Fibonacci Heaps, Prim and Dijkstra run in time O(|V| log |V| + |E|).

Binary Heaps

Maintain a pointer to the last element x.

▶ We can compute the predecessor of x (the element that becomes the last element when x is deleted) in time O(log n):
  Go up until the last edge used was a right edge. Then go left, and go right until you reach a leaf. If you hit the root on the way up, go to the rightmost element.

Operations:

▶ minimum(): return the root-element. Time O(1).
▶ is-empty(): check whether the root-pointer is null. Time O(1).

(Figure: an example binary heap with the last-element pointer x marking the rightmost leaf on the lowest level.)


Maintain a pointer to the last element x.

▶ We can compute the successor of x (the position of the last element after an element is inserted) in time O(log n):
  Go up until the last edge used was a left edge. Then go right, and go left until you reach a null-pointer. If you hit the root on the way up, go to the leftmost element and insert the new element as a left child.

Insert

1. Insert the new element at the successor of x.
2. Exchange it with its parent until the heap property is fulfilled.

(Figure: insert example; the new element is placed at the successor of x and bubbles up.)

Note that an exchange can either be done by moving the data or by changing pointers. The latter method leads to an addressable priority queue.

Delete

1. Exchange the element to be deleted with the element e pointed to by x.

2. Restore the heap-property for the element e.

At its new position e may either travel up or down in the tree (but not both directions).

(Figure: the deleted element is replaced by the last element e, which then sifts down.)

Operations:

▶ minimum(): return the root-element. Time O(1).
▶ is-empty(): check whether the root-pointer is null. Time O(1).
▶ insert(k): insert at x and bubble up. Time O(log n).
▶ delete(h): swap with x and bubble up or sift-down. Time O(log n).


Build Heap

We can build a heap in linear time:

(Figure: a complete binary tree on arbitrary keys; sift-down operations are performed level by level, starting with the lowest layer.)

Σ_{levels ℓ} 2^ℓ · (h − ℓ) = O(2^h) = O(n)

Operations:

▶ minimum(): Return the root-element. Time O(1).
▶ is-empty(): Check whether the root-pointer is null. Time O(1).
▶ insert(k): Insert at x and bubble up. Time O(log n).
▶ delete(h): Swap with x and bubble up or sift-down. Time O(log n).
▶ build(x1, . . . , xn): Insert elements arbitrarily; then do sift-down operations starting with the lowest layer in the tree. Time O(n).

Binary Heaps

The standard implementation of binary heaps is via arrays. Let A[0, . . . , n − 1] be an array.

▶ The parent of the i-th element is at position ⌊(i − 1)/2⌋.
▶ The left child of the i-th element is at position 2i + 1.
▶ The right child of the i-th element is at position 2i + 2.

Finding the successor of x is much easier than in the description on the previous slide: simply increase or decrease x.

The resulting binary heap is not addressable. The elements don't maintain their positions and therefore there are no stable handles.
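Below is a minimal array-based min-heap sketch in Python along the lines of this description (parent at ⌊(i − 1)/2⌋, children at 2i + 1 and 2i + 2). It is an illustration under these assumptions, not the course's reference implementation, and it omits the handle bookkeeping needed for an addressable heap.

class BinaryHeap:
    """Array-based min-heap; A[0] is the minimum."""

    def __init__(self, elements=()):
        # build(): place elements arbitrarily, then sift down from the lowest
        # inner layer upwards; O(n) overall.
        self.A = list(elements)
        for i in range(len(self.A) // 2 - 1, -1, -1):
            self._sift_down(i)

    def is_empty(self):
        return not self.A

    def minimum(self):
        return self.A[0]

    def insert(self, key):
        # insert at the next free array slot (the successor of x), then bubble up
        self.A.append(key)
        i = len(self.A) - 1
        while i > 0 and self.A[(i - 1) // 2] > self.A[i]:
            self.A[(i - 1) // 2], self.A[i] = self.A[i], self.A[(i - 1) // 2]
            i = (i - 1) // 2

    def delete_min(self):
        # swap the root with the last element x, shrink, then sift down
        root = self.A[0]
        last = self.A.pop()
        if self.A:
            self.A[0] = last
            self._sift_down(0)
        return root

    def _sift_down(self, i):
        n = len(self.A)
        while True:
            smallest = i
            for c in (2 * i + 1, 2 * i + 2):   # left and right child positions
                if c < n and self.A[c] < self.A[smallest]:
                    smallest = c
            if smallest == i:
                return
            self.A[i], self.A[smallest] = self.A[smallest], self.A[i]
            i = smallest

h = BinaryHeap([31, 30, 2, 13, 9, 5, 3])
h.insert(1)
print([h.delete_min() for _ in range(4)])   # [1, 2, 3, 5]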

8.2 Binomial Heaps

Binomial Trees

(Figure: the binomial trees B0, B1, B2, B3, B4; Bt consists of two copies of B_{t−1}, one attached as a child of the other's root.)

Properties of Binomial Trees

▶ Bk has 2^k nodes.
▶ Bk has height k.
▶ The root of Bk has degree k.
▶ Bk has (k choose ℓ) nodes on level ℓ.
▶ Deleting the root of Bk gives trees B0, B1, . . . , B_{k−1}.


Deleting the root of B5 leaves sub-trees B4, B3, B2, B1, and B0. Deleting the leaf furthest from the root (in B5) leaves a path that connects the roots of sub-trees B4, B3, B2, B1, and B0.

(Figure: the hypercube Hk with nodes labelled by bit-strings; the binomial tree Bk is embedded in it.)

The binomial tree Bk is a sub-graph of the hypercube Hk.

The parent of a node with label b_n, . . . , b1, b0 is obtained by setting the least significant 1-bit to 0.

The ℓ-th level contains the nodes that have ℓ 1's in their label.

The number of nodes on level ℓ in tree Bk is therefore

(k−1 choose ℓ−1) + (k−1 choose ℓ) = (k choose ℓ)


How do we implement trees with non-constant degree?

▶ The children of a node are arranged in a circular linked list.
▶ A child-pointer points to an arbitrary node within the list.
▶ A parent-pointer points to the parent node.
▶ Pointers x.left and x.right point to the left and right sibling of x (if x does not have siblings then x.left = x.right = x).

▶ Given a pointer to a node x we can splice out the sub-tree rooted at x in constant time.
▶ We can add a child-tree T to a node x in constant time if we are given a pointer to x and a pointer to the root of T.

(Figure: a node x with parent-, left-, right-, and child-pointers; its children a, b, c, d form a circular list.)
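A minimal Python sketch of this node layout (the names Node, splice_out, and add_child are my own, not from the slides): children form a circular doubly-linked list via left/right pointers, so cutting out a sub-tree and adding a child both take constant time.

class Node:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.child = None                # pointer to an arbitrary child
        self.degree = 0                  # number of children
        self.left = self.right = self    # circular sibling list of length 1

def splice_out(x):
    """Remove x (and the sub-tree below it) from its sibling list in O(1)."""
    if x.parent is not None:
        if x.parent.child is x:
            # the parent must not keep pointing at the removed child
            x.parent.child = x.right if x.right is not x else None
        x.parent.degree -= 1
    x.left.right = x.right
    x.right.left = x.left
    x.left = x.right = x
    x.parent = None

def add_child(x, t):
    """Make the root t of a tree a child of x in O(1)."""
    t.parent = x
    if x.child is None:
        x.child = t
        t.left = t.right = t
    else:
        # insert t into the circular list next to x.child
        t.right = x.child.right
        t.left = x.child
        x.child.right.left = t
        x.child.right = t
    x.degree += 1

# tiny usage example
r = Node(2)
for k in (13, 11, 8):
    add_child(r, Node(k))
print(r.degree)        # 3
splice_out(r.child)    # cut one child sub-tree out
print(r.degree)        # 2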

Binomial Heap

In a binomial heap the keys are arranged in a collection of binomial trees.

Every tree fulfills the heap-property.

There is at most one tree for every dimension/order. For example, the heap in the figure contains trees B0, B1, and B4.

(Figure: an example binomial heap consisting of trees B0, B1, and B4 with roots 2, 12, and 7.)

Binomial Heap: Merge

Given the number n of keys to be stored in a binomial heap we can deduce the binomial trees that will be contained in the collection.

Let B_{k1}, B_{k2}, B_{k3}, . . . with ki < k_{i+1} denote the binomial trees in the collection, and recall that every tree may be contained at most once.

Then n = Σ_i 2^{ki} must hold. But since the ki are all distinct this means that the ki define the non-zero bit-positions in the binary representation of n.

Properties of a heap with n keys:

▶ Let n = b_d b_{d−1} . . . b_0 denote the binary representation of n.
▶ The heap contains tree Bi iff bi = 1.
▶ Hence, there are at most ⌊log n⌋ + 1 trees.
▶ The minimum must be contained in one of the roots.
▶ The height of the largest tree is at most ⌊log n⌋.
▶ The trees are stored in a single-linked list, ordered by dimension/size.

Binomial Heap: Merge

The merge-operation is instrumental for binomial heaps.

A merge is easy if the two heaps contain only binomial trees of different orders. We can simply merge the tree-lists. (Note that we do not just do a concatenation, as we want to keep the trees in the list sorted according to size.)

Otherwise, we cannot do this because the merged heap is not allowed to contain two trees of the same order.

Merging two trees of the same size: add the tree with the larger root-value as a child to the other tree.

For more trees the technique is analogous to binary addition.
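The following sketch is my own illustration of the two ingredients, not the slides' code: linking two binomial trees of the same order, and merging two heaps. For brevity a tree is represented as a (key, order, children) tuple, and the merge uses a table indexed by order instead of walking the two sorted lists as in the binary-addition picture; the result is the same.

def link(t1, t2):
    """Link two binomial trees of the same order; the larger root becomes a child."""
    (k1, d1, c1), (k2, d2, c2) = t1, t2
    assert d1 == d2
    if k1 <= k2:
        return (k1, d1 + 1, c1 + [t2])   # t2 hangs below the smaller root
    return (k2, d2 + 1, c2 + [t1])

def merge(h1, h2):
    """Merge two binomial heaps (lists of trees); returns a list sorted by order."""
    by_order = {}
    for t in h1 + h2:
        while t[1] in by_order:              # a tree of this order already exists:
            t = link(t, by_order.pop(t[1]))  # link them, producing one order higher (a carry)
        by_order[t[1]] = t
    return [by_order[d] for d in sorted(by_order)]

h1 = [(5, 0, []), (2, 1, [(9, 0, [])])]   # a B0 and a B1 (3 keys)
h2 = [(7, 0, [])]                         # a single B0 (1 key)
merged = merge(h1, h2)
print([(root, order) for root, order, _ in merged])   # [(2, 2)]: one B2 holding all 4 keys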

S1.merge(S2):

▶ Analogous to binary addition.
▶ Time is proportional to the number of trees in both heaps.
▶ Time: O(log n).

(Figure: a worked example that merges two binomial heaps step by step; whenever two trees of the same order meet, they are linked, like a carry in binary addition.)

All other operations can be reduced to merge().

S.minimum():

▶ Find the minimum key-value among all roots.
▶ Time: O(log n).

S.insert(x):

▶ Create a new heap S′ that contains just the element x.
▶ Execute S.merge(S′).
▶ Time: O(log n).



S.delete-min():

▶ Find the minimum key-value among all roots.
▶ Remove the corresponding tree Tmin from the heap.
▶ Create a new heap S′ that contains the trees obtained from Tmin after deleting the root (note that these are just O(log n) trees).
▶ Compute S.merge(S′).
▶ Time: O(log n).

S.decrease-key(handle h):

▶ Decrease the key of the element pointed to by h.
▶ Bubble the element up in the tree until the heap property is fulfilled.
▶ Time: O(log n), since the trees have height O(log n).

8.2 Binomial Heaps 8.2 Binomial Heaps © Harald Räcke 337 © Harald Räcke 338 8.2 Binomial Heaps Amortized Analysis

S. delete(handle h): ñ Execute S. decrease-key(h, ). Definition 1 −∞ ñ Execute S. delete-min(). A data structure with operations op1(), . . . , opk() has amortized running times t1, . . . , t for these operations if the following ñ Time: (log n). k O holds.

Suppose you are given a sequence of operations (starting with an empty data-structure) that operate on at most n elements,

and let ki denote the number of occurences of opi() within this sequence. Then the actual running time must be at most k t (n). i i · i P


Potential Method

Introduce a potential Φ for the data structure.

▶ Φ(Di) is the potential after the i-th operation.
▶ The amortized cost of the i-th operation is

    ĉi = ci + Φ(Di) − Φ(Di−1).

▶ Show that Φ(Di) ≥ Φ(D0). Then

    Σ_{i=1}^{k} ci ≤ Σ_{i=1}^{k} ci + Φ(Dk) − Φ(D0) = Σ_{i=1}^{k} ĉi.

  This means the amortized costs can be used to derive a bound on the total cost.

Example: Stack

A stack S supports the operations

▶ S.push()
▶ S.pop()
▶ S.multipop(k): removes k items from the stack. If the stack currently contains less than k items it empties the stack.

The user has to ensure that pop and multipop do not generate an underflow.

Actual cost:

▶ S.push(): cost 1.
▶ S.pop(): cost 1.
▶ S.multipop(k): cost min{size, k} = k.
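A small self-contained sketch (my own, for illustration) that implements these three operations and tallies the actual cost. Running it on a random operation sequence shows the total actual cost staying below 2 × (number of push operations), which is exactly the bound the potential argument below certifies via the amortized costs 2, 0, 0.

import random

class MultipopStack:
    def __init__(self):
        self.items = []
        self.actual_cost = 0       # total actual cost of all operations so far

    def push(self, x):
        self.items.append(x)
        self.actual_cost += 1                       # cost 1

    def pop(self):
        self.actual_cost += 1                       # cost 1 (caller avoids underflow)
        return self.items.pop()

    def multipop(self, k):
        removed = min(len(self.items), k)
        del self.items[len(self.items) - removed:]
        self.actual_cost += removed                 # cost min{size, k}

random.seed(0)
s = MultipopStack()
pushes = 0
for _ in range(10_000):
    op = random.random()
    if op < 0.6 or not s.items:
        s.push(0); pushes += 1
    elif op < 0.8:
        s.pop()
    else:
        s.multipop(random.randint(1, 5))

print(s.actual_cost, "<=", 2 * pushes)   # total actual cost vs. sum of amortized costs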


Use the potential function Φ(S) = number of elements on the stack.

Amortized cost:

▶ S.push(): cost Ĉpush = Cpush + ∆Φ = 1 + 1 ≤ 2.
▶ S.pop(): cost Ĉpop = Cpop + ∆Φ = 1 − 1 ≤ 0.
▶ S.multipop(k): cost Ĉmp = Cmp + ∆Φ = min{size, k} − min{size, k} ≤ 0.

Note that the analysis becomes wrong if pop() or multipop() are called on an empty stack.

Example: Binary Counter

Incrementing a binary counter: consider a computational model where each bit-operation costs one time-unit.

Incrementing an n-bit binary counter may require to examine n bits, and maybe change them.

Actual cost:

▶ Changing a bit from 0 to 1: cost 1.
▶ Changing a bit from 1 to 0: cost 1.
▶ Increment: cost is k + 1, where k is the number of consecutive ones in the least significant bit-positions (e.g., 001101 has k = 1).


Choose the potential function Φ(x) = k, where k denotes the number of ones in the binary representation of x.

Amortized cost:

▶ Changing a bit from 0 to 1: Ĉ0→1 = C0→1 + ∆Φ = 1 + 1 ≤ 2.
▶ Changing a bit from 1 to 0: Ĉ1→0 = C1→0 + ∆Φ = 1 − 1 ≤ 0.
▶ Increment: Let k denote the number of consecutive ones in the least significant bit-positions. An increment involves k (1 → 0)-operations and one (0 → 1)-operation. Hence, the amortized cost is k · Ĉ1→0 + Ĉ0→1 ≤ 2.
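A short sketch (my own) that performs n increments on a bit-array counter and counts every bit flip; the total is always at most 2n, matching the amortized bound of 2 per increment derived above.

def increment(bits):
    """Increment a little-endian bit list; return the number of bit flips."""
    flips = 0
    i = 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0              # 1 -> 0 flips for the trailing ones
        flips += 1
        i += 1
    if i == len(bits):
        bits.append(1)           # counter grows by one bit
    else:
        bits[i] = 1              # the single 0 -> 1 flip
    flips += 1
    return flips

bits, total = [], 0
n = 1000
for _ in range(n):
    total += increment(bits)

print(total, "<=", 2 * n)        # 1994 <= 2000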

8.3 Fibonacci Heaps

Collection of trees that fulfill the heap property.

The structure is much more relaxed than in binomial heaps.

(Figure: an example Fibonacci heap with roots 7, 3, 23, 24, 17; the min-pointer points to the root 3.)

The potential function:

▶ t(S) denotes the number of trees in the heap.
▶ m(S) denotes the number of marked nodes.
▶ We use the potential function Φ(S) = t(S) + 2m(S).

(For the example heap with 5 trees and 3 marked nodes the potential is Φ(S) = 5 + 2 · 3 = 11.)

Additional implementation details:

▶ Every node x stores its degree in a field x.degree. Note that this can be updated in constant time when adding a child to x.
▶ Every node stores a boolean value x.marked that specifies whether x is marked or not.



We assume that one unit of potential can pay for a constant amount of work, where the constant is chosen "big enough" (to take care of the constants that occur). To make this more explicit we use c to denote the amount of work that a unit of potential can pay for.

S.minimum()

▶ Access through the min-pointer.
▶ Actual cost O(1).
▶ No change in potential.
▶ Amortized cost O(1).

S.merge(S′)

▶ Merge the root lists.
▶ Adjust the min-pointer.

(The new min-pointer is the smaller of the two old minima; in the slides' figure the root-list of S′ is spliced in next to the min-pointer, which serves as the entry point into the root-list.)

Running time:

▶ Actual cost O(1).
▶ No change in potential.
▶ Hence, the amortized cost is O(1).

S.insert(x)

▶ Create a new tree containing x.
▶ Insert x into the root-list.
▶ Update the min-pointer, if necessary.

(x is inserted next to the min-pointer, as this is our entry point into the root-list.)

Running time:

▶ Actual cost O(1).
▶ Change in potential is +1.
▶ Amortized cost is c + O(1) = O(1).

S.delete-min(x)

(D(min) is the number of children of the node that stores the minimum.)

▶ Delete the minimum; add its child-trees to the heap; time: D(min) · O(1).
▶ Update the min-pointer; time: (t + D(min)) · O(1).
▶ Consolidate the root-list so that no roots have the same degree. Time t · O(1) (see below).

(Figure: the minimum root 3 is removed; its children 18, 41, 52 join the root-list, which is then consolidated step by step until all root degrees are distinct.)

During the consolidation we traverse the root list. Whenever we discover two trees that have the same degree we merge these trees. In order to efficiently check whether two trees have the same degree, we use an array that contains, for every degree value d, a pointer to a tree left of the current pointer whose root has degree d (if such a tree exists).
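A sketch of this consolidation step in Python (my own illustration; real Fibonacci-heap code works on the linked node structure, while this version uses plain (key, children) tuples and a dictionary indexed by degree in place of the array of pointers).

def link(t1, t2):
    """Make the tree with the larger root key a child of the other root."""
    if t1[0] <= t2[0]:
        return (t1[0], t1[1] + [t2])
    return (t2[0], t2[1] + [t1])

def consolidate(roots):
    """Merge root trees until no two roots have the same degree."""
    by_degree = {}                      # degree -> tree seen so far with that degree
    for tree in roots:
        d = len(tree[1])                # degree = number of children
        while d in by_degree:           # same degree discovered twice: link the trees
            tree = link(tree, by_degree.pop(d))
            d += 1
        by_degree[d] = tree
    return list(by_degree.values())

# a root-list after a delete-min: several trees, some with equal degree
roots = [(7, []), (18, [(39, []), (44, [])]), (41, []), (52, []),
         (23, []), (24, [(26, []), (46, [])]), (17, [(30, [])])]
result = consolidate(roots)
print(sorted(len(t[1]) for t in result))   # all root degrees distinct: [1, 2, 3]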


Actual cost for delete-min()

(t and t′ denote the number of trees before and after the delete-min() operation, respectively. Dn is an upper bound on the degree, i.e., the number of children, of a tree node.)

▶ At most Dn + t elements are in the root-list before the consolidation.
▶ The actual cost for a delete-min is at most O(1) · (Dn + t). Hence, there exists c1 such that the actual cost is at most c1 · (Dn + t).

Amortized cost for delete-min()

▶ t′ ≤ Dn + 1, as degrees are different after consolidating.
▶ Therefore ∆Φ ≤ Dn + 1 − t.
▶ We can pay c · (t − Dn − 1) from the potential decrease.
▶ The amortized cost is

    c1 · (Dn + t) − c · (t − Dn − 1) = (c1 + c)Dn + (c1 − c)t + c ≤ 2c(Dn + 1) = O(Dn)

  for c ≥ c1.


If the input trees of the consolidation procedure are binomial trees (for example only singleton vertices), then the output will be a set of distinct binomial trees, and, hence, the Fibonacci heap will be (more or less) a binomial heap right after the consolidation.

If we do not have delete or decrease-key operations, then Dn ≤ log n.

Fibonacci Heaps: decrease-key(handle h, v)

Case 1: decrease-key does not violate the heap-property

▶ Just decrease the key-value of the element referenced by h. Nothing else to do.

8.3 Fibonacci Heaps 8.3 Fibonacci Heaps © Harald Räcke 356 © Harald Räcke 357 24 26 4 19 24 26 4

74 40 72 74 40

19

4 4

24 26 4

74 40

19 19

4

Case 2: the heap-property is violated, but the parent is not marked

▶ Decrease the key-value of the element x referenced by h.
▶ If the heap-property is violated, cut the parent edge of x, and make x into a root.
▶ Adjust the min-pointer, if necessary.
▶ Mark the (previous) parent of x (unless it is a root).

(Figure: the sub-tree rooted at the decreased element is cut out and appended to the root-list; its former parent is marked.)


Case 3: the heap-property is violated, and the parent is marked

▶ Decrease the key-value of the element x referenced by h.
▶ Cut the parent edge of x, and make x into a root.
▶ Adjust the min-pointer, if necessary.
▶ Continue cutting the parent until you arrive at an unmarked node.

(Figure: cascading cuts; the cut propagates upwards through marked ancestors, each of which becomes a new root.)

8.3 Fibonacci Heaps 8.3 Fibonacci Heaps © Harald Räcke 357 © Harald Räcke 357 Fibonacci Heaps: decrease-key(handle h, v) Fibonacci Heaps: decrease-key(handle h, v)

Actual cost:

Case 3: heap-property is violated, and parent is marked ñ Constant cost for decreasing the value. ñ Decrease key-value of element x reference by h. ñ Constant cost for each of ` cuts. ñ Cut the parent edge of x, and make x into a root. ñ Hence, cost is at most c2 (` 1), for some constant c2. · + ñ Adjust min-pointers, if necessary. Marking a node can be viewed as a Amortized cost: first step towards becoming a ñ Execute the following: root. The first time x loses a child ñ t0 t `, as every cut creates one new root. p parent[x]; it is marked; the second time it = + ← ñ m m (` 1) 1 m ` 2, since all but the first cut while (p is marked) loses a child it is made into a root. 0 ≤ − − + = − + unmarks a node; the last cut may mark a node. pp parent[p]; ← ñ ∆Φ ` 2( ` 2) 4 ` cut of p; make it into a root; unmark it; ≤ + − + = − t and t0: number of p pp; ñ Amortized cost is at most trees before and after ← operation. if is unmarked and not a root mark it; p c2(` 1) c(4 `) (c2 c)` 4c c2 (1), m and m0: number of + + − ≤ − + + = O marked nodes before if c c2. and after operation. ≥
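A compact Python rendering of this cut / cascading-cut logic (my own sketch; the Node and Heap classes and all names are assumptions, not the slides' data structure).

class Node:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []
        self.marked = False

class Heap:
    def __init__(self):
        self.roots = []
        self.min = None

def cut(heap, x, p):
    """Cut the parent edge of x and make x into a (unmarked) root."""
    p.children.remove(x)
    x.parent = None
    x.marked = False
    heap.roots.append(x)

def cascading_cut(heap, p):
    """Keep cutting marked ancestors until an unmarked node (or a root) is reached."""
    while p.parent is not None:
        if not p.marked:
            p.marked = True                   # first lost child: just mark
            return
        pp = p.parent
        cut(heap, p, pp)                      # second lost child: p becomes a root
        p = pp

def decrease_key(heap, x, new_key):
    assert new_key <= x.key
    x.key = new_key
    p = x.parent
    if p is not None and x.key < p.key:       # heap property violated (Cases 2 and 3)
        cut(heap, x, p)
        cascading_cut(heap, p)
    if x.key < heap.min.key:                  # adjust min-pointer if necessary
        heap.min = x

# tiny usage: chain 7 -> 24 -> 26 (marked) -> 35; decreasing 35 triggers a cascading cut
h = Heap()
r = Node(7); h.roots.append(r); h.min = r
c = Node(24); c.parent = r; r.children.append(c)
g = Node(26); g.parent = c; c.children.append(g); g.marked = True
leaf = Node(35); leaf.parent = g; g.children.append(leaf)

decrease_key(h, leaf, 5)
print([n.key for n in h.roots], h.min.key)   # [7, 5, 26] 5  (26 was cut too; 24 is now marked)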


Delete node

H.delete(x):

▶ decrease the value of x to −∞.
▶ delete-min.

Amortized cost: O(Dn)

▶ O(1) for decrease-key.
▶ O(Dn) for delete-min.

Lemma 2
Let x be a node with degree k and let y1, . . . , yk denote the children of x in the order that they were linked to x. Then

    degree(yi) ≥ 0 if i = 1, and degree(yi) ≥ i − 2 if i > 1.

The marking process is very important for the proof of this lemma. It ensures that a node can have lost at most one child since the last time it became a non-root node. When losing a first child the node gets marked; when losing the second child it is cut from the parent and made into a root.


Proof (of Lemma 2)

▶ When yi was linked to x, at least y1, . . . , y_{i−1} were already linked to x.
▶ Hence, at this time degree(x) ≥ i − 1, and therefore also degree(yi) ≥ i − 1, as the algorithm links only nodes of equal degree.
▶ Since then, yi has lost at most one child.
▶ Therefore, degree(yi) ≥ i − 2.

▶ Let sk be the minimum possible size of a sub-tree rooted at a node of degree k that can occur in a Fibonacci heap.
▶ sk monotonically increases with k.
▶ s0 = 1 and s1 = 2.

Let x be a degree-k node of size sk and let y1, . . . , yk be its children. Then

    sk = 2 + Σ_{i=2}^{k} size(yi)
       ≥ 2 + Σ_{i=2}^{k} s_{i−2}
       = 2 + Σ_{i=0}^{k−2} si



Definition 3

Consider the following non-standard Fibonacci-type sequence:

    Fk = 1                    if k = 0
    Fk = 2                    if k = 1
    Fk = F_{k−1} + F_{k−2}    if k ≥ 2

Facts:

1. Fk ≥ φ^k, where φ ≈ 1.61 is the golden ratio (φ² = φ + 1).
2. For k ≥ 2: Fk = 2 + Σ_{i=0}^{k−2} Fi.

The above facts can be easily proved by induction:

▶ k = 0: 1 = F0 ≥ φ^0 = 1.
▶ k = 1: 2 = F1 ≥ φ^1 ≈ 1.61.
▶ k − 2, k − 1 → k: Fk = F_{k−1} + F_{k−2} ≥ φ^{k−1} + φ^{k−2} = φ^{k−2}(φ + 1) = φ^k.
▶ k = 2: F2 = 3 = 2 + 1 = 2 + F0.
▶ k − 1 → k: Fk = F_{k−1} + F_{k−2} = 2 + Σ_{i=0}^{k−3} Fi + F_{k−2} = 2 + Σ_{i=0}^{k−2} Fi.

From this it follows that sk ≥ Fk ≥ φ^k, which gives that the maximum degree in a Fibonacci heap is logarithmic.
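The two facts are also easy to check numerically; a few lines of Python (my own) verify them for small k:

phi = (1 + 5 ** 0.5) / 2          # golden ratio, phi**2 == phi + 1

F = [1, 2]                         # F_0 = 1, F_1 = 2
for k in range(2, 20):
    F.append(F[k - 1] + F[k - 2])

for k in range(20):
    assert F[k] >= phi ** k                    # Fact 1
    if k >= 2:
        assert F[k] == 2 + sum(F[:k - 1])      # Fact 2: F_k = 2 + sum_{i=0}^{k-2} F_i
print("both facts hold for k < 20")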

Priority Queues

Bibliography

[CLRS90] Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein: Introduction to Algorithms (3rd ed.), MIT Press and McGraw-Hill, 2009
[MS08]   Kurt Mehlhorn, Peter Sanders: Algorithms and Data Structures — The Basic Toolbox, Springer, 2008

Binary heaps are covered in [CLRS90] in combination with the heapsort algorithm in Chapter 6. Fibonacci heaps are covered in detail in Chapter 19. Problem 19-2 in this chapter introduces Binomial heaps.

Chapter 6 in [MS08] covers Priority Queues. Chapter 6.2.2 discusses Fibonacci heaps. Binomial heaps are dealt with in Exercise 6.11.
