Priority Queue / Heap

• Stores (key, data) pairs (like a dictionary)
• But supports a different set of operations:

• Initialize‐Heap: creates a new empty heap
• Is‐Empty: returns true if the heap is empty
• Insert(key, data): inserts the (key, data) pair, returns a pointer to the entry
• Get‐Min: returns the (key, data) pair with minimum key
• Delete‐Min: deletes the minimum (key, data) pair
• Decrease‐Key(entry, newkey): decreases the key of entry to newkey
• Merge: merges two heaps into one

Algorithm Theory, WS 2012/13, Fabian Kuhn

Implementation of Dijkstra’s Algorithm

Store nodes in a priority queue, use d(s, v) as keys:
1. Initialize d(s, s) := 0 and d(s, v) := ∞ for all v ≠ s
2. All nodes are unmarked

3. Get unmarked node v which minimizes d(s, v)

4. Mark node v

5. For all e = (v, u) ∈ E: d(s, u) := min{d(s, u), d(s, v) + w(v, u)}

6. Until all nodes are marked
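As a sketch, the loop above in Python, assuming the graph is given as an adjacency dict mapping each node to a list of (neighbor, weight) pairs — a format chosen here for illustration. Python's heapq has no decrease‐key, so instead of decreasing, the sketch pushes a fresh entry and skips stale ones (lazy deletion):

```python
import heapq

def dijkstra(graph, s):
    # graph: dict mapping node -> list of (neighbor, edge weight) pairs
    dist = {v: float("inf") for v in graph}   # step 1: d(s, v) = infinity
    dist[s] = 0                               # step 1: d(s, s) = 0
    marked = set()                            # step 2: all nodes unmarked
    pq = [(0, s)]                             # priority queue keyed by d(s, v)
    while pq:
        d, v = heapq.heappop(pq)              # step 3: unmarked node minimizing d(s, v)
        if v in marked:
            continue                          # stale entry (lazy deletion)
        marked.add(v)                         # step 4: mark node v
        for u, w in graph[v]:                 # step 5: relax edges (v, u)
            if d + w < dist[u]:
                dist[u] = d + w
                heapq.heappush(pq, (dist[u], u))
    return dist                               # step 6: all reachable nodes marked
```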

Analysis

Number of priority queue operations for Dijkstra:
• Initialize‐Heap: 1

• Is‐Empty: |V|

• Insert: |V|

• Get‐Min: |V|

• Delete‐Min: |V|

• Decrease‐Key: |E|

• Merge: 0

Priority Queue Implementation

Implementation as min‐heap:

• complete binary tree, e.g., stored in an array
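A minimal sketch of such an array‐backed min‐heap (the parent of index i sits at index (i − 1) // 2; decrease‐key is omitted because it additionally needs back‐pointers from entries to array positions):

```python
class BinaryMinHeap:
    """Complete binary tree stored in an array (Python list)."""
    def __init__(self):
        self.a = []  # list of (key, data) pairs

    def is_empty(self):
        return not self.a

    def insert(self, key, data):
        self.a.append((key, data))
        self._sift_up(len(self.a) - 1)

    def get_min(self):
        return self.a[0]

    def delete_min(self):
        last = self.a.pop()
        if self.a:
            self.a[0] = last      # move last entry to the root ...
            self._sift_down(0)    # ... and restore the heap property

    def _sift_up(self, i):
        while i > 0 and self.a[(i - 1) // 2][0] > self.a[i][0]:
            p = (i - 1) // 2
            self.a[i], self.a[p] = self.a[p], self.a[i]
            i = p

    def _sift_down(self, i):
        n = len(self.a)
        while True:
            c = 2 * i + 1                 # left child
            if c >= n:
                return
            if c + 1 < n and self.a[c + 1][0] < self.a[c][0]:
                c += 1                    # pick the smaller child
            if self.a[i][0] <= self.a[c][0]:
                return
            self.a[i], self.a[c] = self.a[c], self.a[i]
            i = c
```

Insert and delete‐min walk one root‐to‐leaf path, hence the O(log n) bounds listed below.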

• Initialize‐Heap: O(1)
• Is‐Empty: O(1)
• Insert: O(log n)
• Get‐Min: O(1)
• Delete‐Min: O(log n)
• Decrease‐Key: O(log n)
• Merge (heaps of size m and n, m ≤ n): O(m ⋅ log n)

Better Implementation

• Can we do better?

• Cost of Dijkstra with complete binary min‐heap implementation: O(|E| ⋅ log |V|)

• Can be improved if we can make decrease‐key cheaper…

• Cost of merging two heaps is expensive

• We will get there in two steps:

Binomial heap → Fibonacci heap

Definition: Binomial Tree

Binomial tree B_0 of order 0: a single node. Binomial tree B_{i+1} of order i+1: two binomial trees B_i, where the root of one tree becomes a child of the root of the other.

Binomial Trees

(figure: the binomial trees of small order)

Properties

1. Tree B_i has 2^i nodes

2. Height of tree B_i is i

3. Root degree of B_i is i

4. In B_i, there are exactly C(i, d) nodes at depth d

Binomial Coefficients

• Binomial coefficient: C(n, k): number of k‐element subsets of a set of size n

• Property: Pascal triangle: C(n, k) = C(n−1, k−1) + C(n−1, k)
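The recurrence can be written down directly (an exponential‐time sketch, fine for the small orders that occur here):

```python
def binomial(n, k):
    """C(n, k) via the Pascal-triangle recurrence
    C(n, k) = C(n-1, k-1) + C(n-1, k)."""
    if k < 0 or k > n:
        return 0          # no k-element subsets outside 0 <= k <= n
    if k == 0 or k == n:
        return 1          # base cases of the triangle
    return binomial(n - 1, k - 1) + binomial(n - 1, k)
```

Summing C(i, d) over all depths d recovers the 2^i nodes of B_i from property 1.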

Number of Nodes at Depth d in B_i

Claim: In B_i, there are exactly C(i, d) nodes at depth d

Proof idea: B_i consists of two copies of B_{i−1}, one rooted at depth 0 and one rooted at depth 1; by induction, the number of nodes at depth d is C(i−1, d) + C(i−1, d−1) = C(i, d).

Binomial Heap

• Keys are stored in the nodes of binomial trees of different order

• n nodes: the binomial tree B_i is contained in the heap iff bit i of the base‐2 representation of n is 1.

• Min‐Heap Property: Key of a node v ≤ keys of all nodes in the sub‐tree of v
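The tree decomposition follows directly from the bits of n; a small helper (function name is illustrative):

```python
def trees_present(n):
    """Orders i of the binomial trees in a heap with n nodes:
    B_i is present iff bit i of the binary representation of n is 1."""
    return [i for i in range(n.bit_length()) if (n >> i) & 1]
```

For n = 11 = (1011)₂ this yields orders 0, 1, and 3.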

Example

• 11 keys: 2, 5, 8, 9, 12, 14, 17, 18, 20, 22, 25

• Binary representation of n: 11 = (1011)₂ → trees B_0, B_1, and B_3 present

(figure: the three binomial trees holding the 11 keys, with the minimum key 2 at a root)

Child‐Sibling Representation

Structure of a node:
• parent
• key
• degree
• child
• sibling

Link Operation

• Unite two binomial trees of the same order into one tree: B_i ⊕ B_i ⇒ B_{i+1}
  (the root with the larger key becomes a child of the root with the smaller key)

• Time: O(1)

(figure: linking two binomial trees of order 2)
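A sketch of the link operation in the child‐sibling representation (the Node layout follows the slide above; only constant work is done):

```python
class Node:
    """Binomial tree node in child-sibling representation."""
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.child = None    # first child (highest order)
        self.sibling = None  # next sibling in the child list
        self.degree = 0

def link(a, b):
    """Link two binomial trees of equal order: the root with the larger
    key becomes the first child of the other root. Time O(1)."""
    if b.key < a.key:
        a, b = b, a          # make a the root with the smaller key
    b.parent = a
    b.sibling = a.child      # prepend b to a's child list
    a.child = b
    a.degree += 1
    return a
```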

Merge Operation

Merging two binomial heaps:
• For i := 0, 1, …, log n: If there are 2 or 3 binomial trees B_i: apply the link operation to merge 2 of the trees into one binomial tree B_{i+1}

Time: O(log n)
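The merge skeleton mirrors binary addition. In this sketch each heap is abstracted to just the orders of its trees (the 1‐bits of its size); linking two B_i trees corresponds to a carry into position i + 1:

```python
def merge_orders(orders1, orders2):
    """Merge skeleton on tree orders only: whenever 2 or 3 trees of
    order i meet (including a carry), two of them link into one B_{i+1}."""
    count = {}
    for i in list(orders1) + list(orders2):
        count[i] = count.get(i, 0) + 1
    result, carry, i = [], 0, 0
    top = max(list(count) + [0]) + len(count) + 1  # safe upper bound
    while i <= top:
        c = count.get(i, 0) + carry   # trees of order i: at most 2 + carry
        if c % 2 == 1:
            result.append(i)          # one tree of order i stays
        carry = c // 2                # two trees link into one B_{i+1}
        i += 1
    return result
```

Merging heaps of sizes 11 and 1 yields trees of orders 2 and 3 (size 12 = (1100)₂).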

Example

(figure: step‐by‐step merge of two binomial heaps)

Operations

Initialize: create an empty list of trees

Get minimum of queue: time O(1) (if we maintain a pointer to the tree with the minimum)

Decrease‐key at node v:
• Set the key of node v to the new key
• Swap the entry with its parent until the min‐heap property is restored
• Time: O(log n)

Insert key x into queue Q:
1. Create queue Q′ of size 1 containing only x
2. Merge Q and Q′
• Time for insert: O(log n)

Operations

Delete‐Min Operation:

1. Find the tree T with the minimum root

2. Remove T from the queue ⟹ queue Q′

3. The children of the root of T form a new queue Q′′

4. Merge queues Q′ and Q′′

• Overall time: O(log n)

Delete‐Min Example

(figure: delete‐min on the example heap — the root with minimum key 2 is removed, and its children form a new queue that is merged back)

Complexities Binomial Heap

• Initialize‐Heap: O(1)
• Is‐Empty: O(1)
• Insert: O(log n)
• Get‐Min: O(1)
• Delete‐Min: O(log n)
• Decrease‐Key: O(log n)
• Merge (heaps of size m and n, m ≤ n): O(log n)

Can We Do Better?

• Binomial heap: insert, delete‐min, and decrease‐key cost O(log n)

• One of the operations insert or delete‐min must cost Ω(log n):
– Heap‐Sort: Insert n elements into the heap, then take out the minimum n times
– (Comparison‐based) sorting costs at least Ω(n log n).

• But maybe we can improve decrease‐key and one of the other two operations?

• Structure of the binomial heap is not flexible:
– Simplifies the analysis, allows us to get strong worst‐case bounds
– But the operations almost inherently need at least logarithmic time

Fibonacci Heaps

Lazy‐merge variant of binomial heaps:
• Do not merge trees as long as possible…

Structure: A Fibonacci heap H consists of a collection of trees satisfying the min‐heap property.

Variables:
• H.min: root of the tree containing the (a) minimum key
• H.rootlist: circular, doubly linked, unordered list containing the roots of all trees
• H.size: number of nodes currently in H

Trees in Fibonacci Heaps

Structure of a single node v:
• parent, left, right, child (pointers), key, degree, mark

• v.child: points to a circular, doubly linked and unordered list of the children of v
• v.left, v.right: pointers to the siblings (in the doubly linked list)
• v.mark: will be used later…

Advantages of circular, doubly linked lists:
• Deleting an element takes constant time
• Concatenating two lists takes constant time

Example

Figure: Cormen et al., Introduction to Algorithms

Simple (Lazy) Operations

Initialize‐Heap H:
• H.rootlist := H.min := null

Merge heaps H and H′:
• concatenate the root lists
• update H.min

Insert element e into H:
• create a new one‐node tree containing e ⟹ heap H′
• merge heaps H and H′

Get minimum element of H:
• return H.min
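A sketch of these lazy operations; for brevity, plain Python lists stand in for the circular, doubly linked lists, so the O(1) concatenation guarantee is lost (illustration only):

```python
class FibNode:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []   # stands in for the circular child list
        self.mark = False

class FibHeap:
    def __init__(self):          # Initialize-Heap
        self.rootlist = []
        self.min = None
        self.size = 0

    def insert(self, key):       # merge with a one-node heap
        node = FibNode(key)
        self.rootlist.append(node)
        if self.min is None or key < self.min.key:
            self.min = node
        self.size += 1
        return node

    def get_min(self):
        return self.min

    def merge(self, other):      # concatenate root lists, update min
        self.rootlist += other.rootlist
        self.size += other.size
        if other.min is not None and (
                self.min is None or other.min.key < self.min.key):
            self.min = other.min
```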

Operation Delete‐Min

Delete the node with minimum key from H and return its element:

1. m := H.min;
2. if H.size > 0 then
3.   remove H.min from H.rootlist;
4.   add H.min.childlist (list) to H.rootlist;
5.   H.consolidate();
     // Repeatedly merge nodes with equal degree in the root list
     // until degrees of nodes in the root list are distinct.
     // Determine the element with minimum key
6. return m

Rank and Maximum Degree

Ranks of nodes, trees, heap:

Node v:
• rank(v): degree of v

Tree T:
• rank(T): rank (degree) of the root node of T

Heap H:
• rank(H): maximum degree of any node in H

Assumption (n: number of nodes in H):
• rank(H) ≤ D(n), for a known function D(n)

Merging Two Trees

Given: heap‐ordered trees T, T′ with rank(T) = rank(T′)
• Assume: min‐key of T ≤ min‐key of T′

Operation link(T, T′):
• Removes tree T′ from the root list and adds T′ to the child list of T

• rank(T) := rank(T) + 1
• (T′.root).mark := false

Consolidation of Root List

Array A pointing to the roots: find roots with the same rank:

A[0] A[1] A[2] …

Consolidate:
1. for i := 0 to D(n) do A[i] := null;
2. while H.rootlist ≠ null do
3.   T := “delete and return first element of H.rootlist”
4.   while A[rank(T)] ≠ null do
5.     T′ := A[rank(T)];
6.     A[rank(T)] := null;
7.     T := link(T, T′)
8.   A[rank(T)] := T
9. Create new H.rootlist and H.min

Time: O(D(n) + |H.rootlist|)

Consolidate Example
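A sketch of consolidate, using a dict keyed by rank in place of the array A (the Root class is a minimal stand‐in for a real root node):

```python
class Root:
    """Minimal root node: key, rank, and list of children."""
    def __init__(self, key, rank=0, children=None):
        self.key = key
        self.rank = rank
        self.children = children or []

def link(t1, t2):
    """The root with the smaller key becomes the parent; its rank grows."""
    if t2.key < t1.key:
        t1, t2 = t2, t1
    t1.children.append(t2)
    t1.rank += 1
    return t1

def consolidate(rootlist):
    """Repeatedly link roots of equal rank until all ranks are distinct;
    returns the new root list and the minimum root."""
    a = {}                          # a[r]: the one root of rank r seen so far
    for t in rootlist:
        while t.rank in a:          # conflict: two roots of the same rank
            t = link(t, a.pop(t.rank))
        a[t.rank] = t
    roots = list(a.values())
    return roots, min(roots, key=lambda t: t.key)
```

Starting from four rank‐0 roots, three links leave a single tree of rank 2.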

(figure: consolidate run on an example root list — equal‐rank trees are linked step by step until all root ranks are distinct)

Operation Decrease‐Key

Decrease‐Key(v, x): (decrease the key of node v to the new value x)

1. if x ≥ v.key then return;
2. v.key := x; update H.min;
3. if v ∈ H.rootlist ∨ v.parent.key ≤ x then return
4. repeat
5.   p := v.parent;
6.   H.cut(v);
7.   v := p;
8. until ¬v.mark ∨ v ∈ H.rootlist;
9. if v ∉ H.rootlist then v.mark := true;
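A sketch of decrease‐key with cascading cuts, using a minimal node class and a plain list as root list (updating H.min is omitted here):

```python
class FNode:
    """Minimal node for illustrating cascading cuts."""
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []
        self.mark = False

def cut(rootlist, v):
    """Move v's sub-tree from its parent to the root list."""
    p = v.parent
    if p is not None:
        p.children.remove(v)
        v.parent = None
        v.mark = False          # nodes enter the root list unmarked
        rootlist.append(v)

def decrease_key(rootlist, v, x):
    """Decrease v.key to x; if the min-heap property breaks, cut v,
    then cascade: every marked ancestor on the path is cut as well."""
    if x >= v.key:
        return
    v.key = x
    if v.parent is None or v.parent.key <= x:
        return
    while True:
        p = v.parent
        cut(rootlist, v)
        v = p
        if v.parent is None or not v.mark:
            break               # reached the root list or an unmarked node
    if v.parent is not None:
        v.mark = True           # v has now lost a child
```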

Operation Cut(v)

Operation H.cut(v):
• Cuts v’s sub‐tree from its parent and adds v to the rootlist

1. if v ∉ H.rootlist then
2.   // cut the link between v and its parent
3.   v.parent.rank := v.parent.rank − 1;
4.   remove v from v.parent.childlist (list)
5.   v.parent := null;
6.   add v to H.rootlist

(figure: before/after of a cut — the cut sub‐tree moves to the root list)

Decrease‐Key Example

• Green nodes are marked

(figure: before/after of a decrease‐key with cascading cuts on the example heap)

Fibonacci Heap Marks

History of a node v:

• v is being linked to a node ⟹ v.mark := false
• a child of v is cut ⟹ v.mark := true
• a second child of v is cut ⟹ H.cut(v)

• Hence, the boolean value v.mark indicates whether node v has lost a child since the last time v was made the child of another node.

Cost of Delete‐Min & Decrease‐Key

Delete‐Min:
1. Delete the minimum root and add its child list to H.rootlist — time: O(1)
2. Consolidate H.rootlist — time: O(length of H.rootlist + D(n))
• Step 2 can potentially be linear in n (the size of H)

Decrease‐Key (at node v):
1. If the new key < parent key, cut the sub‐tree of node v — time: O(1)
2. Cascading cuts up the tree as long as nodes are marked — time: O(number of consecutive marked nodes)
• Step 2 can potentially be linear in n

Exercises: Both operations can take Θ(n) time in the worst case!

Cost of Delete‐Min & Decrease‐Key

• The cost of delete‐min and decrease‐key can be Θ(n)…
– Seems a large price to pay to get insert and merge in O(1) time

• Maybe the operations are efficient most of the time?
– It seems to require a lot of operations to get a long rootlist and thus an expensive consolidate operation
– In each decrease‐key operation, at most one node gets marked: we need a lot of decrease‐key operations to get an expensive decrease‐key operation

• Can we show that the average cost per operation is small?

• We can ⟹ requires amortized analysis

Amortization

• Consider a sequence o_1, o_2, …, o_k of operations (typically performed on some data structure D)

• t_i: execution time of operation o_i
• T := t_1 + t_2 + ⋯ + t_k: total execution time

• The execution time of a single operation might vary within a large range (e.g., t_i ∈ [1, O(n)])

• The worst‐case overall execution time might still be small
⟹ the average execution time per operation might be small in the worst case, even if single operations can be expensive

Analysis of Algorithms

• Best case

• Worst case

• Average case

• Amortized worst case

What is the average cost of an operation in a worst‐case sequence of operations?

Example: Binary Counter

Incrementing a binary counter: determine the bit flip cost:

Operation   Counter Value   Cost
–           00000           –
1           00001           1
2           00010           2
3           00011           1
4           00100           3
5           00101           1
6           00110           2
7           00111           1
8           01000           4
9           01001           1
10          01010           2
11          01011           1
12          01100           3
13          01101           1
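The table can be reproduced by simulating the counter (bits stored least‐significant first):

```python
def increment_cost(counter):
    """Increment a binary counter (list of bits, LSB first);
    return the number of bit flips performed."""
    flips = 0
    i = 0
    while i < len(counter) and counter[i] == 1:
        counter[i] = 0          # flip a trailing 1 to 0
        flips += 1
        i += 1
    if i == len(counter):
        counter.append(0)       # grow the counter if needed
    counter[i] = 1              # flip exactly one 0 to 1
    return flips + 1

# Total cost of k increments stays below 2k (amortized cost 2 each).
counter = [0] * 5
total = sum(increment_cost(counter) for _ in range(13))
```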

Accounting Method

Observation:
• Each increment flips exactly one 0 into a 1

  001001111 ⟹ 001010000

Idea:
• Have a bank account (with initial amount 0)
• Paying x to the bank account costs x
• Take “money” from the account to pay for expensive operations

Applied to the binary counter:
• Flip from 0 to 1: pay 1 to the bank account (cost: 2)
• Flip from 1 to 0: take 1 from the bank account (cost: 0)
• Amount on the bank account = number of ones
⟹ We always have enough “money” to pay!

Accounting Method

Op.   Counter   Cost   To Bank   From Bank   Net Cost   Credit
0     00000     –      –         –           –          0
1     00001     1      1         0           2          1
2     00010     2      1         1           2          1
3     00011     1      1         0           2          2
4     00100     3      1         2           2          1
5     00101     1      1         0           2          2
6     00110     2      1         1           2          2
7     00111     1      1         0           2          3
8     01000     4      1         3           2          1
9     01001     1      1         0           2          2
10    01010     2      1         1           2          2

Potential Function Method

• Most generic and elegant way to do amortized analysis!
– But also more abstract than the others…

• State of the data structure / system: S ∈ 𝒮 (state space)

• Potential function Φ: 𝒮 → ℝ≥0

• Operation i:
– t_i: actual cost of operation i
– S_i: state after execution of operation i (S_0: initial state)
– Φ_i := Φ(S_i): potential after execution of operation i
– a_i: amortized cost of operation i:

  a_i := t_i + Φ_i − Φ_{i−1}

Potential Function Method

Operation i:

• actual cost: t_i
• amortized cost: a_i = t_i + Φ_i − Φ_{i−1}

Overall cost:

T := Σ_i t_i = Σ_i a_i + Φ_0 − Φ_k

Binary Counter: Potential Method

• Potential function: Φ := number of ones in the current counter

• Clearly, Φ_0 = 0 and Φ_i ≥ 0 for all i ≥ 0

• Actual cost t_i:
– 1 flip from 0 to 1
– t_i − 1 flips from 1 to 0

• Potential difference: Φ_i − Φ_{i−1} = 1 − (t_i − 1) = 2 − t_i

• Amortized cost: a_i = t_i + Φ_i − Φ_{i−1} = 2
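The identity a_i = 2 can be checked by simulation; the increment helper below repeats the counter sketch from above so the snippet is self‐contained:

```python
def increment(counter):
    """Increment a bit list (LSB first); return the flip cost t_i."""
    i = 0
    while i < len(counter) and counter[i] == 1:
        counter[i] = 0
        i += 1
    if i == len(counter):
        counter.append(0)
    counter[i] = 1
    return i + 1   # i flips 1 -> 0 plus one flip 0 -> 1

# Check a_i = t_i + Phi_i - Phi_{i-1} = 2 for every increment,
# where Phi = number of ones in the counter.
counter = [0] * 8
phi_prev = 0
amortized = []
for _ in range(50):
    t = increment(counter)
    phi = sum(counter)             # potential after the operation
    amortized.append(t + phi - phi_prev)
    phi_prev = phi
```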

Back to Fibonacci Heaps

• Worst‐case cost of a single delete‐min or decrease‐key operation is Ω(n)

• Can we prove a small worst‐case amortized cost for delete‐min and decrease‐key operations?

Remark:

• Data structure that allows operations O_1, …, O_k

• We say that operation O_p has amortized cost a_p if for every execution the total time is

  T ≤ Σ_{p=1}^{k} n_p ⋅ a_p,

where n_p is the number of operations of type O_p

Amortized Cost of Fibonacci Heaps

• Initialize‐heap, is‐empty, get‐min, insert, and merge have worst‐case cost O(1)

• Delete‐min has amortized cost O(log n)
• Decrease‐key has amortized cost O(1)

• Starting with an empty heap, any sequence of n operations with at most n_d delete‐min operations has total cost (time) O(n + n_d ⋅ log n).

• Cost for Dijkstra: O(|E| + |V| ⋅ log |V|)
