AVL/Splay Trees

S. Thiel
Department of Computer Science & Software Engineering, Concordia University
July 17, 2019

Outline
- Self-Organizing Lists
- Balanced Trees
  - AVL Trees
  - Splay Trees
- References

Self-Organizing Lists [1, p.307]
- We have looked at organizing lists by value.
- That's not the only option.
- If we want to find something quickly, are there faster options?

Organizing by Frequency 1 [1, p.307]
- If we just want fast access to what we're looking for, we can organize by frequency, as this is often cheap.
- This applies when we know the frequencies in advance: e.g., for elements k_i where each value is accessed with probability p_i.
- When the list is stored linearly (assuming, for convenience, that the values are ordered by these frequencies), the total number of comparisons to find each value once is
  C_n = 1·p_0 + 2·p_1 + ... + n·p_(n-1).
- In general C_n is on the order of n/2, and thus each find is Θ(n) in the average case (the case where accesses actually follow these probabilities).

Organizing by Frequency 2 [1, p.307-308]
- What if the p_i form a geometric series?
- What if p_i = 1/2^(i+1)?
- Then C_n = 1·p_0 + 2·p_1 + ... + n·p_(n-1), which is just
  Σ_{i=1}^{n-1} i/2^i = 2 − (n+1)/2^(n−1),
  and as n approaches infinity this tends to 2.

Access Patterns [1, p.309]
- The 80/20 rule for data access: it turns out this is common.
- When 80/20 applies, sorting by frequency generally yields good results.

How Else To Self-Organize? 1/2 [1, p.310-311]
- Count: store access counts, and move a record ahead of the record in front of it whenever its count exceeds that record's count.
  - must store the counts
  - doesn't deal well with changing frequency of access
- Move-to-front: if something is accessed, move it to the front (a short code sketch appears below).
  - does not work well in arrays
  - responds well to localized frequency changes over short periods of time
  - responds poorly when things are accessed sequentially, or repeatedly sequentially
  - bounded: searching for n values takes at worst twice as long as searching for those n values in a conveniently ordered list, so it cannot make things much worse

How Else To Self-Organize? 2/2 [1, p.311]
- Transpose: swap the record that was found with the record immediately before it.
  - works well with both arrays and lists
  - frequently used records tend towards the front
  - records that stop being accessed frequently drift towards the back
  - pathological access sequences without repeats can make every access long, but these are rare
  - a common variation swaps with a record a few steps ahead

Self-Organizing List Performance [1, p.312]
- Self-organizing lists don't beat Θ(log n) performance in general.
- They do well in certain cases, so they should not be overlooked.
- They can often behave like Θ(1) because of access patterns.
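To make the move-to-front heuristic concrete, here is a minimal C++ sketch. It is not from the slides; the class and member names (MoveToFrontList, insert, find) are illustrative only, and it assumes a linked representation, where moving a node to the front is a constant-time splice.

```cpp
#include <initializer_list>
#include <iostream>
#include <list>

// Minimal move-to-front self-organizing list (illustrative sketch).
// Every successful search splices the found element to the front, so
// recently and frequently accessed values become cheap to reach again.
template <typename T>
class MoveToFrontList {
    std::list<T> items;
public:
    void insert(const T& value) { items.push_front(value); }

    // Linear search; on a hit, move the element to the front in O(1)
    // using splice (pointer rewiring only, no copies).
    bool find(const T& value) {
        for (auto it = items.begin(); it != items.end(); ++it) {
            if (*it == value) {
                items.splice(items.begin(), items, it);
                return true;
            }
        }
        return false;
    }
};

int main() {
    MoveToFrontList<int> mtf;
    for (int v : {5, 3, 9, 7}) mtf.insert(v);
    mtf.find(9);                                          // 9 moves to the front
    std::cout << std::boolalpha << mtf.find(9) << "\n";   // the repeat access is cheap
}
```

The transpose heuristic would instead swap the found element with its immediate predecessor; for an array-backed list that single swap is the natural choice, since moving an element all the way to the front would shift every intervening entry.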
Self-Organizing List Example [1, p.312]
Message passing example:
- A sender and a receiver exchange text messages.
- For each word:
  - if it has never been seen, put it on the front of the list and send the word itself;
  - if it has been seen before, send its index in the list and move it to the front.
- In this way, both sender and receiver keep matching lists and can decode properly.
- The order things are sent in matters (out-of-order delivery would break this).
- Example: "The car on the left hit the car I left." is sent as "The car on 3 left hit 3 5 I 5."
- While the above seems trivial, variants of it are common file/text encoding strategies wherever duplication is expected.

More Reliable Performance
- Self-organizing lists are useful in special cases, but they have problem areas.
- BSTs have problem areas too, namely when they become unbalanced.
- Can we apply self-organizing approaches to trees?
- First, let's consider the implications of balancing.

Balanced Trees
- BSTs can become unbalanced, which makes search expensive.
- We can adjust the tree on access to improve performance.
- Ensuring a complete binary tree is too expensive.
- Relax the requirement to "good enough" balance: can we cheaply mostly fix it?

AVL Tree vs. Splay Tree
- AVL constrains the difference between the depths of sub-trees.
- Splay improves balance on each access.
- Both have advantages.

AVL Tree
- Named for its inventors, Adelson-Velskii and Landis.
- Left and right sub-trees differ in depth by at most 1.
- Maximum depth is O(log n), so search is O(log n).
- Updates can be done in time proportional to search.
- Must be cautious with duplicate values, as rotations may break the BST property [1, p.454 E13.8].

AVL Tree Balance
- Each node knows its level of unbalance: −1, 0 or 1.
- The tree starts out balanced according to this measure.
- Changes propagate up towards the root.
- When can we stop early? Propagation can stop as soon as a sub-tree's height no longer changes.

AVL Tree Insertion [1, p.435]
Figure: 24 = −2, 7 = −2: an AVL tree made unbalanced by an insertion [1, p.435].

AVL Tree Rotations
- What are these rebalancing changes? We call them rotations.
- An AVL tree needs either a single rotation or a double rotation.
- We rotate about the node with the bad balance.
- If the signs of imbalance are the same, a single rotation suffices.
- If the signs of imbalance differ, a double rotation is needed: it starts by making the signs of imbalance the same, then rotates on the node with the awkward balance.
- A code sketch of both rotations follows the two examples below.

Single Rotation Example [1, p.436]
Figure: X = −1, S = −2: Single Rotation Fix [1, p.436].

Double Rotation Example [1, p.436]
Figure: X = +2, S = −2: Double Rotation Fix [1, p.436].
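The following is a minimal sketch of the single and double rotations and of insertion; it is not the slides' (or the textbook's) code. It stores a height in each node and recomputes the balance factor from it, using the slides' sign convention (negative means the left sub-tree is deeper). All names here (Node, rebalance, insert, ...) are illustrative.

```cpp
#include <algorithm>
#include <initializer_list>

// Illustrative AVL node: stores its subtree height so the balance factor
// can be recomputed on the way back up after an update.
struct Node {
    int key;
    Node *left = nullptr, *right = nullptr;
    int height = 1;
    explicit Node(int k) : key(k) {}
};

int height(Node* n)  { return n ? n->height : 0; }
void update(Node* n) { n->height = 1 + std::max(height(n->left), height(n->right)); }

// Balance factor, using the slides' sign convention:
// negative = left sub-tree deeper, positive = right sub-tree deeper.
int balance(Node* n) { return n ? height(n->right) - height(n->left) : 0; }

// Single rotation to the right: the left child becomes the new subtree root.
Node* rotateRight(Node* y) {
    Node* x = y->left;
    y->left = x->right;
    x->right = y;
    update(y); update(x);
    return x;
}

// Single rotation to the left (the mirror image).
Node* rotateLeft(Node* x) {
    Node* y = x->right;
    x->right = y->left;
    y->left = x;
    update(x); update(y);
    return y;
}

// Rebalance one node after a change below it. If the signs of the node's
// and its taller child's imbalance differ, a first rotation on the child
// makes them agree -- that is the "double rotation" of the slides.
Node* rebalance(Node* n) {
    update(n);
    if (balance(n) < -1) {                     // left-heavy
        if (balance(n->left) > 0)              // left-right case: double rotation
            n->left = rotateLeft(n->left);
        return rotateRight(n);
    }
    if (balance(n) > 1) {                      // right-heavy
        if (balance(n->right) < 0)             // right-left case: double rotation
            n->right = rotateRight(n->right);
        return rotateLeft(n);
    }
    return n;                                  // already within -1..1
}

// Recursive insertion: descend as in an ordinary BST, then rebalance every
// node on the path back up (the propagation towards the root).
Node* insert(Node* n, int key) {
    if (!n) return new Node(key);
    if (key < n->key)       n->left  = insert(n->left, key);
    else if (key > n->key)  n->right = insert(n->right, key);
    // Equal keys are ignored here: the slides warn that duplicates can
    // break the BST property under rotation.
    return rebalance(n);
}

int main() {
    Node* root = nullptr;
    for (int k : {37, 24, 42, 7, 32, 40, 43, 2, 120, 1})  // keys from the worked example
        root = insert(root, k);
    // The insertion of 1 makes the chain 7 -> 2 -> 1 too deep; rebalance
    // fixes it with a single right rotation at 7.
}
```

In the worked Single Rotation example that follows, rebalance fires at node 7 (balance −2, with its child 2 also leaning left) and performs one rotateRight; in the Double Rotation example, node 7 is again at −2 but its child 2 leans right, so the child is rotated left first.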
Single Rotation 1
Figure: the tree from the insertion example (root 37; 24 and 42 below it; then 7, 32, 40, 43; then 2, 120; then 1). After inserting 1, nodes 7 and 24 report balance −2: the chain 7 → 2 → 1 is too deep on the left.

Single Rotation 2
Figure: after a single rotation at 7, node 2 takes its place with children 1 and 7, and every balance factor is back in {−1, 0, 1}.

Double Rotation 1
Figure: the same tree, but with 5 inserted as the right child of 2. Node 7 again has balance −2, while its child 2 leans the other way (+1), so the signs of imbalance differ.

Double Rotation 2
Figure: after the double rotation, node 5 takes 7's place with children 2 and 7, and the tree is balanced again.

But Shaffer's Example has a Duplicate Node?
- Duplicate nodes are problematic.
- Rotation can break the BST ≥ rule.

Duplicate Doom 1
Figure: the same example with the key 5 inserted several times, before rebalancing. Is this a BST?

Duplicate Doom 2
Figure: the same tree after the rotation, with the duplicate 5s rearranged. Is this a BST?

AVL Tree Deletion
- Addition looks easy. It is!
- Deletion is only slightly trickier.
- Can we optimize? If the balance is +1, grab the smallest value in the right subtree; if the balance is −1, grab the biggest value in the left subtree.
- At the very least, taking the replacement from the taller side cannot hurt.
- Then work back up from this secondary deletion. A deletion sketch along these lines follows below.

AVL Deletion 1 try 1
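To round out the deletion discussion, here is a hedged sketch of recursive AVL deletion. It is not the slides' code: it reuses the Node, balance, and rebalance helpers from the rotation sketch above, assumes all keys are distinct, and takes the replacement from the taller sub-tree exactly as suggested above, so the secondary deletion shortens the deeper side.

```cpp
// Assumes the Node, height, balance, update, and rebalance helpers from the
// previous sketch, and that all keys in the tree are distinct.

Node* findMin(Node* n) { while (n->left)  n = n->left;  return n; }
Node* findMax(Node* n) { while (n->right) n = n->right; return n; }

// Remove 'key' from the subtree rooted at n; returns the new subtree root.
// Every node on the search path is rebalanced on the way back up.
Node* remove(Node* n, int key) {
    if (!n) return nullptr;                       // key not present
    if (key < n->key) {
        n->left = remove(n->left, key);
    } else if (key > n->key) {
        n->right = remove(n->right, key);
    } else if (!n->left || !n->right) {
        // Zero or one child: splice the node out.
        Node* child = n->left ? n->left : n->right;
        delete n;
        return child;                             // may be nullptr
    } else if (balance(n) < 0) {
        // Left sub-tree is deeper: replace with the biggest key on the left,
        // then perform the "secondary deletion" in that sub-tree.
        n->key = findMax(n->left)->key;
        n->left = remove(n->left, n->key);
    } else {
        // Right sub-tree is at least as deep: take the smallest key on the right.
        n->key = findMin(n->right)->key;
        n->right = remove(n->right, n->key);
    }
    return rebalance(n);                          // fix height, rotate if needed
}
```

Unlike insertion, a single deletion can require rotations at several ancestors, which is why rebalance is called at every level on the way back up rather than stopping after the first rotation.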