CSE 589 Applied Algorithms
Autumn 2001
Lecture 6: Golomb Coding, LZW, Sequitur

Binary Golomb Codes
• Binary source with 0's much more frequent than 1's.
• Variable-to-variable length code.
• Golomb code of order 4:

    input    output
    0000     0
    0001     100
    001      101
    01       110
    1        111


Example
• Golomb code of order 4:

    input    output
    0000     0
    0001     100
    001      101
    01       110
    1        111

• Encoding the string 000000001001000000010001, parsed as 0000 | 0000 | 1 | 001 | 0000 | 0001 | 0001:

    output: 0 0 111 101 0 100 100 = 001111010100100

Constructing a Binary Golomb Code
• Let p be the probability of 0.
• Choose the order k such that p^(k+1) ≤ 1 − p^k < p^(k−1).
• The code tree has one leaf for the full run 0^k and one leaf for each terminated run 0^(k−1)1, ..., 001, 01, 1, ordered from least likely to most likely.
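A minimal encoder sketch (mine, not the lecture's) in Python: it applies the order-4 codebook above by longest-match parsing and reproduces the example output. The names ORDER4 and golomb_encode are illustrative.

    # Order-4 Golomb codebook from the slide; encoding is longest-match parsing.
    ORDER4 = {"0000": "0", "0001": "100", "001": "101", "01": "110", "1": "111"}

    def golomb_encode(bits, codebook=ORDER4):
        out, i = [], 0
        patterns = sorted(codebook, key=len, reverse=True)   # try the longest entry first
        while i < len(bits):
            for p in patterns:
                if bits.startswith(p, i):
                    out.append(codebook[p])
                    i += len(p)
                    break
            else:
                raise ValueError("input ends inside a run of 0s shorter than the order")
        return "".join(out)

    print(golomb_encode("000000001001000000010001"))   # -> 001111010100100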


Example
• p = .89: p^7 = .442, 1 − p^6 = .503, p^5 = .558, so p^(k+1) ≤ 1 − p^k < p^(k−1) holds for order k = 6.
• Golomb code of order 6:

    input     output
    000000    0
    000001    1000
    00001     1001
    0001      1010
    001       1011
    01        110
    1         111

Comparison of GC with Entropy
[Plot: bits per input bit versus the probability p, comparing the Golomb code (GC) rate with the entropy lower bound.]
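The order can be computed directly from p. A short sketch (again mine, not the slides') that searches for the k satisfying the inequality from the previous slide; for p = .89 it returns the order 6 used above.

    def golomb_order(p):
        """The k with p**(k+1) <= 1 - p**k < p**(k-1)."""
        k = 1
        while not (p ** (k + 1) <= 1 - p ** k < p ** (k - 1)):
            k += 1
        return k

    print(golomb_order(0.89))   # -> 6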


Notes on Binary Golomb Code
• Very effective as a binary run-length code.
• As p goes to 1, its effectiveness approaches the entropy.
• There are adaptive Golomb codes, so that the order can change according to the observed frequency of 0's.

Arithmetic Coding
• Prefix codes such as Huffman work well for larger alphabets and get to within one bit of the entropy lower bound. Can we do better? Yes.
• Basic idea in arithmetic coding:
  – Represent each string x of length n by an interval [l,r) in [0,1). We assume n is known.
  – The width r − l of the interval [l,r) represents the probability of x occurring.
  – The interval [l,r) can itself be represented by any number, called a tag, within the half-open interval.
  – The k significant bits of the tag .t1t2t3... form the code of x. That is, .t1t2t3...tk000... is in the interval [l,r).


Example of Arithmetic Coding (1)
• P(a) = 1/3, P(b) = 2/3.
• Nested intervals (a before b at each level):

    b   = [1/3, 1)
    bb  = [15/27, 1)            15/27 = .100011100...
    bba = [15/27, 19/27)        19/27 = .101101000...

• Using tag = (l+r)/2 = 17/27 = .101000010..., code = 101.

Some Tags are Better than Others
1. The tag must be in the half-open interval.
2. The tag can be chosen to be (l+r)/2.
3. The code is the significant bits of the tag.
• For bab:

    ba  = [1/3, 15/27)
    bab = [11/27, 15/27)        11/27 = .011010000...,  15/27 = .100011100...

  Using tag = (l+r)/2 = 13/27 = .011110110..., code = 0111.
  Alternative tag = 14/27 = .100001001..., code = 1.


Code Generation from Tag
• If the binary tag is .t1t2t3... = (l+r)/2 in [l,r), then we want to choose k to form the code t1t2...tk.
• Short code:
  – Choose k as small as possible so that l ≤ .t1t2...tk000... < r.
  – Good if n is known to the decoder.
• Guaranteed code:
  – Choose k = ⌈− log2(r − l)⌉ + 1.
  – Then l ≤ .t1t2...tkb1b2b3... < r for any bits b1b2b3...
  – For fixed-length strings this provides a good prefix code.
  – Example: [.000000000..., .000010010...), tag = .000001001...
      Short code: 0        Guaranteed code: 000001

Example of Codes
• P(a) = 1/3, P(b) = 2/3.

    string   interval          tag = (l+r)/2    code
    aaa      [0/27, 1/27)      .000001001...    0
    aab      [1/27, 3/27)      .000100110...    0001
    aba      [3/27, 5/27)      .001001100...    001
    abb      [5/27, 9/27)      .010000101...    01
    baa      [9/27, 11/27)     .010111110...    01011
    bab      [11/27, 15/27)    .011110111...    0111
    bba      [15/27, 19/27)    .101000010...    101
    bbb      [19/27, 27/27)    .110110100...    11

    Average: .95 bits/symbol (entropy lower bound: .92 bits/symbol).
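A small sketch (not from the slides) of the guaranteed-code rule using exact fractions; it reproduces the example above, where the interval [0, 1/27) for aaa gives the 6-bit guaranteed code 000001.

    from fractions import Fraction
    from math import ceil, log2

    def guaranteed_code(l, r):
        """First k bits of the midpoint tag, with k = ceil(-log2(r - l)) + 1."""
        k = ceil(-log2(r - l)) + 1
        tag = (l + r) / 2                       # Fraction arithmetic stays exact
        bits = []
        for _ in range(k):
            tag *= 2
            bits.append("1" if tag >= 1 else "0")
            if tag >= 1:
                tag -= 1
        return "".join(bits)

    print(guaranteed_code(Fraction(0, 27), Fraction(1, 27)))   # aaa -> 000001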

Guaranteed Code Example
• P(a) = 1/3, P(b) = 2/3.

    string   interval          tag = (l+r)/2    short code   prefix code
    aaa      [0/27, 1/27)      .000001001...    0            0000
    aab      [1/27, 3/27)      .000100110...    0001         0001
    aba      [3/27, 5/27)      .001001100...    001          001
    abb      [5/27, 9/27)      .010000101...    01           0100
    baa      [9/27, 11/27)     .010111110...    01011        01011
    bab      [11/27, 15/27)    .011110111...    0111         0111
    bba      [15/27, 19/27)    .101000010...    101          101
    bbb      [19/27, 27/27)    .110110100...    11           11

Arithmetic Coding Algorithm
• P(a1), P(a2), ..., P(am)
• C(ai) = P(a1) + P(a2) + ... + P(ai-1)
• Encode x1x2...xn:

    Initialize l := 0 and r := 1;
    for i = 1 to n do
        w := r - l;
        l := l + w * C(xi);
        r := l + w * P(xi);
    t := (l + r)/2;
    choose the code for the tag t.
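A runnable sketch of this encoding loop (my code; the exact fractions are an implementation choice). With P(a) = 1/3, P(b) = 2/3 it reproduces the tag 17/27 = .101000010... computed for bba on the Example of Arithmetic Coding slide.

    from fractions import Fraction

    def arith_encode_tag(x, P, C):
        """Return the midpoint tag (l+r)/2 of the interval for string x."""
        l, r = Fraction(0), Fraction(1)
        for sym in x:
            w = r - l
            l = l + w * C[sym]
            r = l + w * P[sym]
        return (l + r) / 2

    P = {"a": Fraction(1, 3), "b": Fraction(2, 3)}
    C = {"a": Fraction(0), "b": Fraction(1, 3)}      # cumulative probabilities
    print(arith_encode_tag("bba", P, C))             # -> 17/27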


Arithmetic Coding Example
• P(a) = 1/4, P(b) = 1/2, P(c) = 1/4
• C(a) = 0, C(b) = 1/4, C(c) = 3/4
• Encode abca (update: w := r - l; l := l + w C(x); r := l + w P(x)):

    symbol   w       l       r
             -       0       1
    a        1       0       1/4
    b        1/4     1/16    3/16
    c        1/8     5/32    6/32
    a        1/32    5/32    21/128

    tag = (5/32 + 21/128)/2 = 41/256 = .001010010...
    l = .001010000...
    r = .001010100...
    code = 00101
    prefix code = 00101001

Arithmetic Coding Exercise
• P(a) = 1/4, P(b) = 1/2, P(c) = 1/4
• C(a) = 0, C(b) = 1/4, C(c) = 3/4
• Encode bbbb (update: w := r - l; l := l + w C(x); r := l + w P(x)):

    symbol   w       l       r
             -       0       1
    b
    b
    b
    b

    tag =
    l =
    r =
    code =
    prefix code =


Decoding (1)
• Assume the length is known to be 3.
• The code 0001 converts to the tag .0001000...
• The interval [0,1) is split into the subintervals for a and b in proportion to their probabilities; the tag .0001000... lands in a's subinterval, so output a.

Decoding (2)
• Assume the length is known to be 3.
• The code 0001 converts to the tag .0001000...
• The interval for a is split into the subintervals aa and ab; the tag .0001000... lands in aa, so output a.

Decoding (3)
• Assume the length is known to be 3.
• The code 0001 converts to the tag .0001000...
• The interval for aa is split into aaa and aab; the tag .0001000... lands in aab, so output b. The decoded string is aab.

Arithmetic Decoding Algorithm
• P(a1), P(a2), ..., P(am)
• C(ai) = P(a1) + P(a2) + ... + P(ai-1)
• Decode b1b2...bm, where the number of symbols is n:

    Initialize l := 0 and r := 1;
    t := .b1b2...bm000...
    for i = 1 to n do
        w := r - l;
        find j such that l + w * C(aj) <= t < l + w * (C(aj) + P(aj));
        output aj;
        l := l + w * C(aj);
        r := l + w * P(aj);
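A matching decoder sketch (mine, not the slides'). Decoding the code 0001 with n = 3 and P(a) = 1/3, P(b) = 2/3 reproduces the aab worked out in the Decoding (1)-(3) figures.

    from fractions import Fraction

    def arith_decode(code_bits, n, P, C):
        """Decode n symbols from the tag .code_bits000..."""
        t = Fraction(int(code_bits, 2), 2 ** len(code_bits))
        l, r = Fraction(0), Fraction(1)
        out = []
        for _ in range(n):
            w = r - l
            for sym in P:                        # find j with the tag in aj's subinterval
                if l + w * C[sym] <= t < l + w * (C[sym] + P[sym]):
                    out.append(sym)
                    l = l + w * C[sym]
                    r = l + w * P[sym]
                    break
        return "".join(out)

    P = {"a": Fraction(1, 3), "b": Fraction(2, 3)}
    C = {"a": Fraction(0), "b": Fraction(1, 3)}
    print(arith_decode("0001", 3, P, C))         # -> aab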


Decoding Example
• P(a) = 1/4, P(b) = 1/2, P(c) = 1/4
• C(a) = 0, C(b) = 1/4, C(c) = 3/4
• Code 00101, tag = .00101000... = 5/32:

    w       l       r        output
    -       0       1
    1       0       1/4      a
    1/4     1/16    3/16     b
    1/8     5/32    6/32     c
    1/32    5/32    21/128   a

Practical Arithmetic Coding
• Scaling:
  – By scaling we can keep l and r in a reasonable range of values so that w = r - l does not underflow.
  – The code can be produced progressively, not at the end.
  – Complicates decoding some.
• Integer arithmetic coding avoids floating point altogether.


Notes on Arithmetic Coding
• Arithmetic codes come close to the entropy lower bound.
• Grouping symbols is effective for arithmetic coding.
• Arithmetic codes can be used effectively on small symbol sets. Advantage over Huffman.
• Context can be added so that more than one probability distribution can be used.
  – The best coders in the world use this method.
• There are very effective adaptive arithmetic coding methods.

Dictionary Coding
• Does not use statistical knowledge of the data.
• Encoder: As the input is processed, develop a dictionary and transmit the index of strings found in the dictionary.
• Decoder: As the code is processed, reconstruct the dictionary to invert the process of encoding.
• Examples: LZW, LZ77, Unix Compress, gzip, GIF


LZW Encoding Algorithm

    Repeat
        find the longest match w in the dictionary;
        output the index of w;
        put wa in the dictionary, where a is the unmatched symbol.

LZW Encoding Example (1)
• Input: a b a b a b a b a

    Dictionary
    0  a
    1  b
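A compact sketch of this loop (my code, not the slides'), with the dictionary kept as a Python dict of strings; on the example input ababababa it produces the indices worked out on the following slides.

    def lzw_encode(text, alphabet):
        """LZW: output dictionary indices, adding (longest match + next symbol)."""
        dictionary = {s: i for i, s in enumerate(alphabet)}
        out, i = [], 0
        while i < len(text):
            w = text[i]                       # grow the longest match greedily
            while i + len(w) < len(text) and w + text[i + len(w)] in dictionary:
                w += text[i + len(w)]
            out.append(dictionary[w])
            if i + len(w) < len(text):        # add w + unmatched symbol
                dictionary[w + text[i + len(w)]] = len(dictionary)
            i += len(w)
        return out

    print(lzw_encode("ababababa", "ab"))      # -> [0, 1, 2, 4, 3]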


LZW Encoding Example (2)
• Input: a b a b a b a b a    Output so far: 0
• Dictionary: 0 a, 1 b, 2 ab

LZW Encoding Example (3)
• Input: a b a b a b a b a    Output so far: 0 1
• Dictionary: 0 a, 1 b, 2 ab, 3 ba


LZW Encoding Example (4)
• Input: a b a b a b a b a    Output so far: 0 1 2
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba

LZW Encoding Example (5)
• Input: a b a b a b a b a    Output so far: 0 1 2 4
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba, 5 abab


LZW Encoding Example (6)
• Input: a b a b a b a b a    Output: 0 1 2 4 3
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba, 5 abab

LZW Decoding Algorithm
• Emulate the encoder in building the dictionary. The decoder is slightly behind the encoder.

    initialize dictionary;
    decode first index to w;
    put w? in dictionary;
    repeat
        decode the first symbol s of the index;
        complete the previous dictionary entry with s;
        finish decoding the remainder of the index;
        put w? in the dictionary, where w was just decoded;
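A decoder sketch (mine, not the slides') that emulates the encoder one step behind, as described above. The only subtle case is an index that refers to the entry still being completed; the standard fix used below is to take w + w[0], which plays the role of the slides' "complete the previous dictionary entry" step.

    def lzw_decode(indices, alphabet):
        """Rebuild the encoder's dictionary while decoding."""
        dictionary = list(alphabet)
        w = dictionary[indices[0]]
        out = [w]
        for idx in indices[1:]:
            if idx < len(dictionary):
                entry = dictionary[idx]
            else:                             # index not finished yet: w + first symbol of w
                entry = w + w[0]
            out.append(entry)
            dictionary.append(w + entry[0])   # complete the previous entry
            w = entry
        return "".join(out)

    print(lzw_decode([0, 1, 2, 4, 3], "ab"))      # -> ababababa, inverting the encoding example
    print(lzw_decode([0, 1, 2, 4, 3, 6], "ab"))   # -> abababababab (the decoding example on the following slides)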


LZW Decoding Example (1)
• Code: 0 1 2 4 3 6    Decoded so far: a
• Dictionary: 0 a, 1 b, 2 a?

LZW Decoding Example (2a)
• Code: 0 1 2 4 3 6    Decoded so far: a b
• Dictionary: 0 a, 1 b, 2 ab


LZW Decoding Example (2b)
• Code: 0 1 2 4 3 6    Decoded so far: a b
• Dictionary: 0 a, 1 b, 2 ab, 3 b?

LZW Decoding Example (3a)
• Code: 0 1 2 4 3 6    Decoded so far: a b a
• Dictionary: 0 a, 1 b, 2 ab, 3 ba


LZW Decoding Example (3b)
• Code: 0 1 2 4 3 6    Decoded so far: a b ab
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 ab?

LZW Decoding Example (4a)
• Code: 0 1 2 4 3 6    Decoded so far: a b ab a
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba


LZW Decoding Example (4b)
• Code: 0 1 2 4 3 6    Decoded so far: a b ab aba
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba, 5 aba?

LZW Decoding Example (5a)
• Code: 0 1 2 4 3 6    Decoded so far: a b ab aba b
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba, 5 abab


LZW Decoding Example (5b)
• Code: 0 1 2 4 3 6    Decoded so far: a b ab aba ba
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba, 5 abab, 6 ba?

LZW Decoding Example (6a)
• Code: 0 1 2 4 3 6    Decoded so far: a b ab aba ba b
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba, 5 abab, 6 bab


LZW Decoding Example (6b)
• Code: 0 1 2 4 3 6    Decoded: a b ab aba ba bab
• Dictionary: 0 a, 1 b, 2 ab, 3 ba, 4 aba, 5 abab, 6 bab, 7 bab?

Trie Data Structure for Encoder's Dictionary
• Fredkin (1960)
• Dictionary: 0 a, 1 b, 2 c, 3 d, 4 r, 5 ab, 6 br, 7 ra, 8 ac, 9 ca, 10 ad, 11 da, 12 abr, 13 raa, 14 abra
• Stored as a trie: the root has children a(0), b(1), c(2), d(3), r(4); node 0 has children b(5), c(8), d(10); node 1 has child r(6); node 2 has child a(9); node 3 has child a(11); node 4 has child a(7); node 5 has child r(12); node 7 has child a(13); node 12 has child a(14).


Encoder Uses a Trie (1)
• Input: a b r a c a d a b r a a b r a c a d a b r a
• Output so far: 0 1 4 0 2 0 3 5 7 12
• The trie contains nodes 0-14 (up to 14 = abra); each match is found by walking down from the root, and the new dictionary entry is added as one new child node.

Encoder Uses a Trie (2)
• Input: a b r a c a d a b r a a b r a c a d a b r a
• Output so far: 0 1 4 0 2 0 3 5 7 12 8
• After the match ac (index 8), the new node 15 (aca) is added as a child of node 8.
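A sketch of a trie-based encoder (my code; the node layout is an assumption, not the lecture's implementation): each node maps a symbol to a child, so finding the longest match and inserting the new entry are a single walk down from the root. On abracadabraabracadabra it reproduces, and continues, the output shown above.

    class TrieNode:
        def __init__(self, index):
            self.index = index        # dictionary index of the string ending here
            self.children = {}        # symbol -> TrieNode

    def lzw_encode_trie(text, alphabet):
        root = TrieNode(None)
        for i, s in enumerate(alphabet):
            root.children[s] = TrieNode(i)
        next_index = len(alphabet)
        out, i = [], 0
        while i < len(text):
            node = root
            while i < len(text) and text[i] in node.children:
                node = node.children[text[i]]      # walk down the longest match
                i += 1
            out.append(node.index)
            if i < len(text):                      # add match + unmatched symbol as a new leaf
                node.children[text[i]] = TrieNode(next_index)
                next_index += 1
        return out

    print(lzw_encode_trie("abracadabraabracadabra", "abcdr"))
    # -> [0, 1, 4, 0, 2, 0, 3, 5, 7, 12, 8, 10, 14]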


Decoder's Data Structure
• Simply an array of strings:
  0 a, 1 b, 2 c, 3 d, 4 r, 5 ab, 6 br, 7 ra, 8 ac, 9 ca, 10 ad, 11 da, 12 abr, 13 raa, 14 abr?
• Code: 0 1 4 0 2 0 3 5 7 12 8 ...
  Decoded so far: a b r a c a d ab ra abr

Notes on Dictionary Coding
• Extremely effective when there are repeated patterns in the data that are widely spread, and where local context is not as significant.
  – text
  – some graphics
  – program sources or binaries
• Variants of LZW are pervasive.


Sequitur
• Nevill-Manning and Witten, 1996.
• Uses a context-free grammar (without recursion) to represent a string.
• The grammar is inferred from the string.
• If there is structure and repetition in the string, then the grammar may be very small compared to the original string.
• Clever encoding of the grammar yields impressive compression ratios.
• Compression plus structure!

Context-Free Grammars
• Invented by Chomsky in 1959 to explain the grammar of natural languages.
• Also invented by Backus in 1959 to generate and parse Fortran.
• Example:
  – terminals: b, e
  – nonterminals: S, A
  – production rules: S -> SA, S -> A, A -> bSe, A -> be
  – S is the start symbol


Arithmetic Expressions
• Grammar:
    S -> S + T
    S -> T
    T -> T * F
    T -> F
    F -> a
    F -> (S)
• Derivation of a * (a + a) + a:
    S => S+T => T+T => T*F+T => F*F+T => a*F+T => a*(S)+T => a*(S+F)+T => a*(T+F)+T => a*(F+F)+T => a*(a+F)+T => a*(a+a)+T => a*(a+a)+F => a*(a+a)+a
  [Parse tree of a*(a+a)+a omitted.]

Context-Free Grammar Example
• Grammar: S -> SA, S -> A, A -> bSe, A -> be; S is the start symbol.
• Derivation of bbebee:
    S => A => bSe => bSAe => bAAe => bbeAe => bbebee
  [Hierarchical parse tree omitted.]
• Example: b and e are matched as parentheses.
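Because b and e behave like matched parentheses, S derives exactly the nonempty balanced strings over {b, e}; a tiny membership check (my sketch, not part of the slides):

    def derivable_from_S(s):
        """True iff s is a nonempty balanced string over {b, e}."""
        depth = 0
        for c in s:
            if c == "b":
                depth += 1
            elif c == "e":
                depth -= 1
                if depth < 0:           # an e with no matching b
                    return False
            else:
                return False
        return depth == 0 and len(s) > 0

    print(derivable_from_S("bbebee"))   # -> True, as derived on the slide
    print(derivable_from_S("bee"))      # -> False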


Sequitur Principles
• Digram Uniqueness:
  – No pair of adjacent symbols (digram) appears more than once in the grammar.
• Rule Utility:
  – Every production rule is used more than once.
• These two principles are maintained as an invariant while inferring a grammar for the input string.

Sequitur Example (1)
Input: bbebeebebebbebee

  S -> b
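A tiny checker for the two invariants (my sketch; it treats uppercase letters as nonterminals and does not treat overlapping digram occurrences specially). Applied to the final grammar reached at the end of this example, it confirms both properties.

    def check_invariants(grammar, start="S"):
        digram_counts, uses = {}, {}
        for nt, rhs in grammar.items():
            for pair in zip(rhs, rhs[1:]):                  # count every digram
                digram_counts[pair] = digram_counts.get(pair, 0) + 1
            for sym in rhs:
                if sym.isupper():                           # count nonterminal uses
                    uses[sym] = uses.get(sym, 0) + 1
        digram_uniqueness = all(c == 1 for c in digram_counts.values())
        rule_utility = all(uses.get(nt, 0) >= 2 for nt in grammar if nt != start)
        return digram_uniqueness, rule_utility

    final = {"S": "DBD", "A": "be", "B": "AA", "D": "bBe"}
    print(check_invariants(final))                          # -> (True, True)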


Sequitur Example (2)
  S -> bb

Sequitur Example (3)
  S -> bbe


Sequitur Example (4)
  S -> bbeb

Sequitur Example (5)
  S -> bbebe
  Enforce digram uniqueness: be occurs twice. Create new rule A -> be.


Sequitur Example (6)
  S -> bAA
  A -> be

Sequitur Example (7)
  S -> bAAe
  A -> be


Sequitur Example (8)
  S -> bAAeb
  A -> be

Sequitur Example (9)
  S -> bAAebe
  A -> be
  Enforce digram uniqueness: be occurs twice. Use existing rule A -> be.


Sequitur Example (10)
  S -> bAAeA
  A -> be

Sequitur Example (11)
  S -> bAAeAb
  A -> be


Sequitur Example (12)
  S -> bAAeAbe
  A -> be
  Enforce digram uniqueness: be occurs twice. Use existing rule A -> be.

Sequitur Example (13)
  S -> bAAeAA
  A -> be
  Enforce digram uniqueness: AA occurs twice. Create new rule B -> AA.


Sequitur Example (14)
  S -> bBeB
  A -> be
  B -> AA

Sequitur Example (15)
  S -> bBeBb
  A -> be
  B -> AA


Sequitur Example (16)
  S -> bBeBbb
  A -> be
  B -> AA

Sequitur Example (17)
  S -> bBeBbbe
  A -> be
  B -> AA
  Enforce digram uniqueness: be occurs twice. Use existing rule A -> be.


Sequitur Example (18)
  S -> bBeBbA
  A -> be
  B -> AA

Sequitur Example (19)
  S -> bBeBbAb
  A -> be
  B -> AA


Sequitur Example (20)
  S -> bBeBbAbe
  A -> be
  B -> AA
  Enforce digram uniqueness: be occurs twice. Use existing rule A -> be.

Sequitur Example (21)
  S -> bBeBbAA
  A -> be
  B -> AA
  Enforce digram uniqueness: AA occurs twice. Use existing rule B -> AA.


Sequitur Example (22)
  S -> bBeBbB
  A -> be
  B -> AA
  Enforce digram uniqueness: bB occurs twice. Create new rule C -> bB.

Sequitur Example (23)
  S -> CeBC
  A -> be
  B -> AA
  C -> bB


Sequitur Example (24)
  S -> CeBCe
  A -> be
  B -> AA
  C -> bB
  Enforce digram uniqueness: Ce occurs twice. Create new rule D -> Ce.

Sequitur Example (25)
  S -> DBD
  A -> be
  B -> AA
  C -> bB
  D -> Ce
  Enforce rule utility: C occurs only once. Remove C -> bB.


Sequitur Example (26)
  S -> DBD
  A -> be
  B -> AA
  D -> bBe

The Hierarchy
Input: bbebeebebebbebee

  S -> DBD
  A -> be
  B -> AA
  D -> bBe

The grammar gives the string a hierarchical parse tree: S has children D B D, each D expands to b B e, each B to A A, and each A to b e.

Is there compression? In this small example, probably not.
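A quick check (my sketch, not from the slides) that the final grammar expands back to the original string:

    def expand(sym, rules):
        """Recursively expand a Sequitur grammar (uppercase = nonterminal)."""
        if sym.islower():
            return sym
        return "".join(expand(s, rules) for s in rules[sym])

    rules = {"S": "DBD", "A": "be", "B": "AA", "D": "bBe"}
    print(expand("S", rules))                          # -> bbebeebebebbebee
    print(expand("S", rules) == "bbebeebebebbebee")    # -> True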


Sequitur Algorithm

    Input the first symbol s to create the production S -> s;
    repeat
        match an existing rule:
            A -> ....XY....        A -> ....B....
            B -> XY           =>   B -> XY
        create a new rule:
            A -> ....XY....        A -> ....C....
            B -> ....XY....   =>   B -> ....C....
                                   C -> XY
        remove a rule:
            A -> ....B....    =>   A -> ....X1X2...Xk....
            B -> X1X2...Xk
        input a new symbol:
            S -> X1X2...Xk    =>   S -> X1X2...Xk s
    until no symbols left

Complexity
• The number of non-input sequitur operations applied is < 2n, where n is the input length.
• Amortized Complexity Argument:
  – Let s = the sum of the right-hand side lengths of all the production rules. Let r = the number of rules.
  – We evaluate 2s - r.
  – Initially 2s - r = 1 because s = 1 and r = 1.
  – 2s - r > 0 at all times because each rule has at least 1 symbol on the right-hand side.
  – 2s - r increases by 2 for every input operation.
  – 2s - r decreases by at least 1 for each non-input sequitur rule applied.


Sequitur Rule Complexity
• Digram Uniqueness - match an existing rule:
    A -> ....XY....        A -> ....B....
    B -> XY           =>   B -> XY              Δs = -1, Δr = 0, Δ(2s - r) = -2
• Digram Uniqueness - create a new rule:
    A -> ....XY....        A -> ....C....
    B -> ....XY....   =>   B -> ....C....
                           C -> XY              Δs = 0, Δr = 1, Δ(2s - r) = -1
• Rule Utility - remove a rule:
    A -> ....B....    =>   A -> ....X1X2...Xk....
    B -> X1X2...Xk                               Δs = -1, Δr = -1, Δ(2s - r) = -1

Linear Time Algorithm
• There is a data structure to implement all the sequitur operations in constant time.
  – Production rules in an array of doubly linked lists.
  – Each production rule has a reference count of the number of times it is used.
  – Each nonterminal points to its production rule.
  – Digrams stored in a hash table for quick lookup.
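A skeleton of such a data structure (my sketch of the idea, not Nevill-Manning and Witten's code): symbols sit in circular doubly linked lists owned by rules, rules carry reference counts, and a hash table maps each digram to its unique location.

    class Symbol:
        """One occurrence of a terminal or nonterminal on a right-hand side."""
        def __init__(self, value):
            self.value = value             # a character, or a Rule object
            self.prev = self.next = None

    class Rule:
        """A production rule; its right-hand side is a circular doubly linked list."""
        def __init__(self, rule_id):
            self.id = rule_id
            self.refcount = 0              # how many times the rule is used (rule utility)
            self.guard = Symbol(self)      # sentinel node; guard.next is the first symbol
            self.guard.prev = self.guard.next = self.guard

        def append(self, value):
            sym = Symbol(value)
            last = self.guard.prev
            last.next = sym; sym.prev = last
            sym.next = self.guard; self.guard.prev = sym
            if isinstance(value, Rule):
                value.refcount += 1
            return sym

    # digram table: (value, value) -> the Symbol starting that digram,
    # giving constant-time digram-uniqueness checks.
    digram_table = {}

    S = Rule(0)
    S.append("b"); S.append("b")           # S -> bb, as in Example (2)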


Basic Encoding a Grammar

    Grammar          Symbol   Code
    S -> DBD         b        000
    A -> be          e        001
    B -> AA          S        010
    D -> bBe         A        011
                     B        100
                     D        101
                     #        110

    Grammar Code (right-hand sides separated by #):
    D B D # b e # A A # b B e
    101 100 101 110 000 001 110 011 011 110 000 100 001 = 39 bits

    |Grammar Code| = (s + r - 1) * ⌈log2(r + a + 1)⌉
    where r = number of rules, s = sum of the right-hand side lengths, a = number of symbols in the original alphabet.

Data Structure Example

    Grammar (with reference counts):      Digram table:
    S -> CeBCe   (S: 0)                   Ce, eB, BC, be, AA, bB
    A -> be      (A: 2)
    B -> AA      (B: 2)
    C -> bB      (C: 2)

    The current digram is the second occurrence of Ce in S -> CeBCe, which triggers the digram-uniqueness step shown in Example (24).

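A sketch of the fixed-length encoding just described (my code, not the lecture's); it reproduces the 3-bit symbol codes and the 39-bit total above, and checks them against the formula.

    from math import ceil, log2

    grammar = {"S": "DBD", "A": "be", "B": "AA", "D": "bBe"}
    alphabet = ["b", "e"]

    symbols = alphabet + list(grammar) + ["#"]           # b e S A B D #
    width = ceil(log2(len(symbols)))                     # 3 bits per symbol
    code = {sym: format(i, "0{}b".format(width)) for i, sym in enumerate(symbols)}

    stream = "#".join(grammar[nt] for nt in grammar)     # DBD#be#AA#bBe
    bits = "".join(code[c] for c in stream)

    s = sum(len(rhs) for rhs in grammar.values())        # 10
    r, a = len(grammar), len(alphabet)                   # 4, 2
    print(len(bits), (s + r - 1) * ceil(log2(r + a + 1)))    # -> 39 39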

Better Encoding of the Grammar
• Nevill-Manning and Witten suggest a more efficient encoding of the grammar that resembles LZ77.
  – The first time a nonterminal is sent, its right-hand side is transmitted instead.
  – The second time a nonterminal is sent, the new production rule is established with a pointer to the previous occurrence, sent along with the length of the rule.
  – Subsequently, the nonterminal is represented by the index of the production rule.

Compression Quality
• Nevill-Manning and Witten, 1997.

    file     size     compress   gzip   sequitur   PPMC
    bib      111261   3.35       2.51   2.48       2.12
    book     768771   3.46       3.35   2.82       2.52
    geo      102400   6.08       5.34   4.74       5.01
    obj2     246814   4.17       2.63   2.68       2.77
    pic      513216   0.97       0.82   0.90       0.98
    progc    38611    3.87       2.68   2.83       2.49

    Units: bits per character (8-bit characters). Files are from the Calgary corpus.
    compress - based on LZW; gzip - based on LZ77; PPMC - adaptive arithmetic coding with context.


Notes on Sequitur

• Very new and different from the standards.
• Yields compression and hierarchical structure simultaneously.
• With clever encoding it is competitive with the best of the standards.
• Practical linear-time encoding and decoding.
• Alternatives - off-line algorithms: (i) find the most frequent digram, (ii) find the longest repeated substring.

