Lecture 8: Assignment Algorithms

Prof. Krishna R. Pattipati Dept. of Electrical and Computer Engineering University of Connecticut Contact: [email protected]; (860) 486-2890

© K. R. Pattipati, 2001-2016

Outline
• Examples of assignment problems
• Assignment algorithms
  . Auction and variants
  . Hungarian algorithm (also called the Kuhn-Munkres algorithm) … easy to understand, but not for practical applications
  . Successive shortest path algorithms (Hungarian; Jonker, Volgenant and Castanon (JVC))
  . Signature … not efficient computationally
• Special cases
• M-Best assignment algorithms
  . Murty (1968)
  . Stone & Cox (1995)
  . Popp, Pattipati & Bar-Shalom (1999)

VUGRAPH 2 Examples of assignment problems
• Assignment problem
  . Also known as the weighted bipartite matching problem
• Bipartite graph
  . Has two sets of nodes 푆, 푇 ⇒ 푉 = 푆 ∪ 푇
  . And a set of edges 퐸 connecting them
• A matching on a bipartite graph G = (S, T, E) is a subset of edges 푋 ⊆ 퐸 ∋ no two edges in 푋 are incident to the same node
  [Figure: a matching on a bipartite graph with nodes 1–5 in S and 1–5 in T]
  . Nodes 1, 2, 3, 4 of 푆 are matched or covered
  . Node 5 is uncovered or exposed

VUGRAPH 3 Matching problems • Two types of matching problems: m = |S|; n = |T| . Cardinality matching problem o Find a matching with maximum number of arcs

  max Σ_{(i,j)∈E} x_ij
  s.t. Σ_j x_ij ≤ 1  ∀ i
       Σ_i x_ij ≤ 1  ∀ j
       x_ij ∈ {0, 1}

. Weighted matching problem or the assignment problem

maxwij x ij or min c ij x ij , c ij w ij (,)(,)i j E i j E

s.t.  xij  1  i  1,  , n (or =) (,)i j E

 xij  1  j  1,  , m (,)i j E (or =)

xij 0,1 o Problem can also be written as σ max훼 푖 푤푖훼푖 훼 ~ permutation of columns of 푊

(= assignment of object i to person i)
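To make the permutation form concrete, here is a minimal brute-force sketch (assuming Python; the 4×4 benefit matrix is the one used later in the auction example, and exhaustive enumeration is only meant to illustrate the n! search space, not to compete with the algorithms discussed below):

```python
# Brute-force illustration of max_alpha sum_i w[i][alpha(i)] -- only practical for tiny n.
from itertools import permutations

def best_assignment_bruteforce(W):
    n = len(W)
    best_value, best_perm = float("-inf"), None
    for alpha in permutations(range(n)):           # alpha maps person i -> object alpha[i]
        value = sum(W[i][alpha[i]] for i in range(n))
        if value > best_value:
            best_value, best_perm = value, alpha
    return best_value, best_perm

# Benefit matrix from the Gauss-Seidel auction example (w_ij = -c_ij):
W = [[-14, -5, -8, -7],
     [-2, -12, -6, -5],
     [-7,  -8, -3, -9],
     [-2,  -4, -6, -10]]
print(best_assignment_bruteforce(W))   # (-15, (1, 3, 2, 0)), i.e., x12 = x24 = x33 = x41 = 1
```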

VUGRAPH 4 Examples of assignment problems
. Examples
  o Assigning people to machines
  o Correlating reports from two sensors
  o System of distinct representatives (cardinality matching problem)
  o Cardinality matching ~ maximum flow ("~" ≡ analogous to)

[Figure: a cardinality matching instance (m + n nodes) and the equivalent maximum-flow network (m + n + 2 nodes): source s → S arcs and T → sink t arcs have capacity 1, and the original S–T edges have capacity ∞ (Lecture 9)]

VUGRAPH 5 Examples of assignment problems

o 푛 × 푛 assignment or bipartite matching ~ minimum cost network flow problem … (Lecture 10)
  [Figure: a bipartite matching instance (2n nodes) and the equivalent MCNF network (2n + 2 nodes); source and sink arcs carry (capacity, cost) = (1, 0), and the original edges carry (∞, c_ij)]

[Example 3 × 3 cost matrix c_ij for the network above]
⇒ Can use RELAX to solve the assignment problem (Lecture 10)
⇒ In this particular case, even ε-relaxation works as well as RELAX, even on sequential computers (Lecture 10)

VUGRAPH 6 Optimality conditions • Dual of the assignment problem . Assume equality constraints & 푚 = 푛 w/o loss of generality

Primal:
  max Σ_{(i,j)∈E} w_ij x_ij
  s.t. Σ_{(i,j)∈E} x_ij = 1  ∀ i = 1, …, n
       Σ_{(i,j)∈E} x_ij = 1  ∀ j = 1, …, n
       x_ij ∈ {0, 1}  ∀ (i, j) ∈ E

Dual:
  min Σ_{i=1}^n π_i + Σ_{j=1}^n p_j
  s.t. π_i + p_j ≥ w_ij  ∀ (i, j) ∈ E

CS conditions imply:
  x_ij > 0 ⇒ π_i + p_j = w_ij
  x_ij = 0 ⇒ π_i + p_j ≥ w_ij
at optimum: π_i = max_{k∈out(i)} (w_ik − p_k) = w_ij − p_j for the assigned pair (i, j)
. To provide a physical interpretation, let 푖 = person and 푗 = object

o 푝푗 ~ price of object 푗 = amount of money that a person is willing to pay when assigned to 푗

o (푤푖푗 − 푝푗) ~ profit margin if person 푖 is assigned to object 푗 = benefit person 푖 associates with being assigned to object 푗
o So, if (푖, 푗) is an optimal pair (i.e., part of an optimal solution), then the profit margin 휋푖 is
  π_i = w_ij − p_j = max_{k∈out(i)} (w_ik − p_k)

VUGRAPH 7 Optimality conditions
. Using this fact, we can simplify the dual problem considerably
  min_{π,p} q(π, p) = min Σ_{i=1}^n π_i + Σ_{j=1}^n p_j
  s.t. π_i + p_j ≥ w_ij  ∀ (i, j) ∈ E
  (or) π_i ≥ w_ij − p_j  ∀ j ∈ out(i); i = 1, 2, …, n
  Note that every π_i can be decreased by a constant and every p_j increased by the same constant without affecting q.

o Suppose that somebody gave us the prices of objects {푝푗}
o Then, for a given set of {푝푗}, the optimal
  π_i = max_{j∈out(i)} {w_ij − p_j}
o Dual problem is equivalent to
  min_p q(p) = min_p [ Σ_{j=1}^n p_j + Σ_{i=1}^n max_{j∈out(i)} {w_ij − p_j} ]

o Note: no constraints on 푝푗
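This unconstrained dual is easy to evaluate directly; the following is a minimal sketch (assuming Python; the function name and the benefit matrix, taken from the auction example later in the deck, are illustrative):

```python
# Evaluate q(p) = sum_j p_j + sum_i max_j (w_ij - p_j); by weak duality, q(p) bounds the
# optimal assignment value from above for any prices p.
def dual_cost(W, p):
    n = len(W)
    return sum(p) + sum(max(W[i][j] - p[j] for j in range(n)) for i in range(n))

W = [[-14, -5, -8, -7],
     [-2, -12, -6, -5],
     [-7,  -8, -3, -9],
     [-2,  -4, -6, -10]]
print(dual_cost(W, [0, 0, 0, 0]))            # -12, an upper bound on the optimal value -15
print(dual_cost(W, [4.4, 2.2, 6.2, 1.6]))    # approx. -14.6, at the prices the auction example reaches
```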

VUGRAPH 8 Auction algorithm for the assignment problem

. Start with an initial set of object prices, 푝푗: 1 ≤ 푗 ≤ 푛 (e.g., 푝푗 = 0 or max푖 푤푖푗) . Initially, either no objects are assigned or else have 휖-complementary slackness satisfied

  π_i − ε = max_k (w_ik − p_k) − ε ≤ w_ij − p_j,   j = arg max_k (w_ik − p_k)
  π_i − ε ≤ w_ij − p_j = π_i,   ∀ (i, j) with j ∈ out(i) ∋ j is assigned to 푖

⇒ π_i = w_ij − p_j ≥ max_k (w_ik − p_k) − ε ⇒ ε-CS conditions
⇒ Partial ε-optimal assignment

• Auction algorithm consists of two phases
  . Bidding phase:
    o Each unassigned person 푖 computes the "current value" of object 푗 (i.e., potential profit if 푖 is assigned to 푗) & bids for the object of his choice
  . Assignment phase:
    o Each object (temporarily) offers itself to the highest bidder
• Bidding phase
  . Each unassigned person 푖 computes the value of each object 푗 ∈ out(푖): 푣푖푗 = 푤푖푗 − 푝푗

VUGRAPH 9 Auction has Two phases: Bidding & Assignment

. Let π_i = max value = max_{j∈out(i)} v_ij = v_{ij*}
. Find the next best value (second best profit)
  φ_i = max_{j∈out(i), j≠j*} v_ij
. Person 푖 then bids for object 푗* at a price of b_{ij*} = p_{j*} + π_i − φ_i + ε = w_{ij*} − φ_i + ε
. Note: actually, can have b_{ij*} ∈ [p_{j*} + ε, p_{j*} + π_i − φ_i + ε] … but the above choice provides the best performance (more later, from the dual cost structure)
. Q: what if 푗* is the only object? ⇒ set φ_i = −∞ or a number ≪ π_i
. This process is highly parallelizable
• Assignment phase
  . For each object 푗, if 푃(푗) is the set of persons from whom 푗 received a bid, then p_j = max_{i∈P(j)} b_ij = b_{i*j}
  . Announce the new price to all persons
  . Assign person 푖* to object 푗 ⇒ x_{i*j} = 1
  . De-assign any previous assignment 푖′ to 푗 ⇒ x_{i′j} = 0
  . If there was no bid, don't do anything
  . This process is also highly parallelizable

VUGRAPH 10 Properties of Auction algorithm
• Properties
  . If p_j is the price of object 푗 before the assignment phase & p′_j after it, we have p′_j ≥ p_j ∀ j ⇒ prices are nondecreasing
    ⇒ p′_{j*} = p_{j*} + π_i − φ_i + ε ⇒ p′_{j*} ≥ p_{j*}
  . Maintains ε-complementary slackness for assigned objects
  . Suppose object 푗* accepts the bid of person 푖
    w_{ij*} − p′_{j*} = φ_i − ε = max_{j∈out(i), j≠j*} (w_ij − p_j) − ε ≥ max_{j∈out(i), j≠j*} (w_ij − p′_j) − ε
    ⇒ w_{ij*} − p′_{j*} ≥ max_{j∈out(i)} (w_ij − p′_j) − ε
    ⇒ ε-CS conditions continue to hold
  . Profit margin of 푖 after the assignment phase
    π′_i = max_{j∈out(i)} (w_ij − p′_j) = max[ w_{ij*} − p′_{j*}, max_{j∈out(i), j≠j*} (w_ij − p′_j) ]
         ≤ max[ w_{ij*} − p′_{j*}, max_{j≠j*} (w_ij − p_j) ] = max[ φ_i − ε, φ_i ] = φ_i
  . p′_{j*} = p_{j*} + π_i − φ_i + ε ⇒ the price increases by at least ε
  . w_{ij*} − p′_{j*} = φ_i − ε ≤ π_i − ε ⇒ the winning person's profit goes down by at least ε

VUGRAPH 11 Coordinate descent interpretation . Coordinate descent interpretation of bidding and assignment phases o Jacobi-like relaxation step for minimizing the dual function 푞(푝)

 In each bidding and assignment phase, the price 푝푗 of each object 푗 bidded upon is increased to either a value that minimizes 푞(푝) when all other prices are kept constant, or else exceeds the largest such value by no more than 휖

[Figure: the dual cost along p_j is piecewise linear with breakpoints at w_{i1 j} − φ_{i1}, w_{i2 j} − φ_{i2}, w_{i3 j} − φ_{i3}, w_{i4 j} − φ_{i4}; the slope increases by one at each breakpoint (e.g., −3, −2, −1, 0, 1), and the range of possible minimizing p_j lies where the slope changes sign]

Recall: q(p) = Σ_{j=1}^n p_j + Σ_{i=1}^n max_{j∈out(i)} (w_ij − p_j)

⇒ max_{j′∈out(i)} (w_ij′ − p_j′) = max[ w_ij − p_j, max_{j′∈out(i), j′≠j} (w_ij′ − p_j′) ] = max[ w_ij − p_j, φ_i ]

VUGRAPH 12 Coordinate descent interpretation

• If 푝푗 < 푤푖푗 − 휙푖, 푖 bids for object 푗 the amount 푤푖푗 − 휙푖 + 휖

• Accepted bid by 푗: max_i (w_ij − φ_i) + ε

• Also, note that the right directional derivative of 푞(푝) along the coordinate direction e_j is
  d_j^+ = 1 − (# of persons 푖 with 푗 ∈ out(푖) ∋ p_j < w_ij − φ_i)
• This is because we can usefully increase p_j only if φ_i = w_ij′ − p_j′ ≤ w_ij − p_j, where 푗′ is the second best object
. This interpretation leads to two other variations of auction

• Each assigned person 푖 also bids for his own assigned object 푗, 푏푖푗 = 푤푖푗 − 휙푖 + 휖

[Figure: values w_{i1 j}, w_{i2 j}, w_{i3 j}, w_{i4 j} and second-best profits φ_{i1}, φ_{i2}, φ_{i3}, φ_{i4} of four persons bidding on object j, and the resulting breakpoints w_{ik j} − φ_{ik} along the p_j axis]

VUGRAPH 13 Gauss-Seidel method
• Gauss-Seidel
  . Only one person bids at a time
  . Price rise is taken into account before the next person bids
  . Problem: can't parallelize, but has much faster convergence on sequential implementations
  . ∃ variations in between Jacobi and Gauss-Seidel (e.g., bids by a subset of persons) … research problem
  . Q: Can we maintain optimality even in the presence of ε?
  . A: Yes!!

• Suppose 푖 is assigned object 푗푖, ∀ 푖 = 1, 2, …, n
• Each step of the algorithm maintains (ε-CS)
  w_{i j_i} ≥ π_i + p_{j_i} − ε
  ⇒ Σ_i w_{i j_i} ≥ Σ_i π_i + Σ_i p_{j_i} − nε
• If f* is the optimal primal value (note: maximization)
  f* ≥ Σ_i w_{i j_i} ≥ Σ_i π_i + Σ_i p_{j_i} − nε ≥ f* − nε
• If the benefits w_ij are integer & ε < 1/n ⇒ nε < 1 ⇒ optimality
. Does the algorithm terminate?
• Yes, if ∃ at least one feasible solution

VUGRAPH 14 Illustration of Gauss-Seidel Auction algorithm
• Initialize prices to zero; ε = 0.2

  W = − [ 14  5  8  7 ]
        [  2 12  6  5 ]
        [  7  8  3  9 ]
        [  2  4  6 10 ]

Iter | Prices               | X                        | Bidder | Object | Bid
  1  | (0, 0, 0, 0)         | -                        |   1    |   2    | 2.2
  2  | (0, 2.2, 0, 0)       | (1,2)                    |   2    |   1    | 3.2
  3  | (3.2, 2.2, 0, 0)     | (1,2),(2,1)              |   3    |   3    | 6.2
  4  | (3.2, 2.2, 6.2, 0)   | (1,2),(2,1),(3,3)        |   4    |   1    | 4.4
  5  | (4.4, 2.2, 6.2, 0)   | (1,2),(3,3),(4,1)        |   2    |   4    | 1.6
  6  | (4.4, 2.2, 6.2, 1.6) | (1,2),(3,3),(4,1),(2,4)  |   -    |   -    | -

• Assignments: x12 = 1; x33 = 1; x41 = 1; x24 = 1;  p^T = (4.4, 2.2, 6.2, 1.6)
• f* = w12 + w33 + w41 + w24 = −15
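The iteration table above can be reproduced with a short Gauss-Seidel auction sketch (assuming Python; the function and variable names are illustrative, and only the dense, symmetric m = n case is handled):

```python
# Minimal Gauss-Seidel auction for max-benefit assignment (dense W, fixed eps > 0).
def gauss_seidel_auction(W, eps, prices=None):
    n = len(W)
    prices = list(prices) if prices is not None else [0.0] * n
    assigned = [None] * n        # assigned[i] = object currently held by person i
    owner = [None] * n           # owner[j]    = person currently holding object j
    while None in assigned:
        i = assigned.index(None)                              # next unassigned person bids
        values = [W[i][j] - prices[j] for j in range(n)]
        j_star = max(range(n), key=lambda j: values[j])       # best object
        second = max(v for j, v in enumerate(values) if j != j_star)
        prices[j_star] += values[j_star] - second + eps       # bid = w_ij* - phi_i + eps
        if owner[j_star] is not None:                         # de-assign previous owner
            assigned[owner[j_star]] = None
        owner[j_star], assigned[i] = i, j_star
    return assigned, prices

W = [[-14, -5, -8, -7], [-2, -12, -6, -5], [-7, -8, -3, -9], [-2, -4, -6, -10]]
assigned, prices = gauss_seidel_auction(W, eps=0.2)
print(assigned)                                   # [1, 3, 2, 0] -> x12 = x24 = x33 = x41 = 1
print(prices)                                     # approximately (4.4, 2.2, 6.2, 1.6)
print(sum(W[i][assigned[i]] for i in range(4)))   # -15
```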

VUGRAPH 15 Auction with a different price initialization

• Initialize prices: p_j = max_i (w_ij), i.e., p^T = (−2, −4, −3, −5); ε = 0.2
  W = − [ 14  5  8  7 ]
        [  2 12  6  5 ]
        [  7  8  3  9 ]
        [  2  4  6 10 ]

Iter | Prices                 | X                        | Bidder | Object | Bid
  1  | (−2, −4, −3, −5)       | (2,1),(4,2),(3,3)        |   1    |   2    | −1.8
  2  | (−2, −1.8, −3, −5)     | (2,1),(3,3),(1,2)        |   4    |   1    | 0.2
  3  | (0.2, −1.8, −3, −5)    | (3,3),(1,2),(4,1)        |   2    |   4    | −1.8
  4  | (0.2, −1.8, −3, −1.8)  | (3,3),(1,2),(4,1),(2,4)  |   -    |   -    | -

• Assignments: x12 = 1; x33 = 1; x41 = 1; x24 = 1;  p^T = (0.2, −1.8, −3, −1.8)
• f* = w12 + w33 + w41 + w24 = −15

VUGRAPH 16 Proof of convergence

• Proof relies on the following facts
  . If an object is assigned, it remains assigned throughout
  . Each time an object is bid upon, its price increases by at least ε
  . If an object is bid upon an infinite number of times, its price → ∞
  . Recall: unbounded dual → infeasible primal

• If a person bids more than |out(푖)| times, π_i decreases by at least ε
  . Each time a person bids, π_i decreases or remains the same
  . If the number of bids exceeds |out(푖)|, π_i decreases by at least ε
  . If a person bids an infinite # of times, π_i → −∞
• Can show (see Bertsekas's book) that, if the problem is feasible,

  π_i ≥ −(2n − 1)C − (n − 1) max_j p_j^0,   where C = max_{i,j} |w_ij| and p_j^0 are the initial prices

If π_i < this bound during the auction, declare the primal infeasible. Unfortunately, it may take many iterations before the bound is crossed. Alternately, add an artificial link (푖, 푗) with
  w_ij < −(2n − 1)C

VUGRAPH 17 Algorithm complexity
o Scaling problems
  [Figure: the price p_j climbs in increments of ε (0, ε, 2ε, …) up to a value c, so the # of bidding phases ≈ c/ε]

o For some examples, the # of bidding phases is proportional to nC/ε ⇒ performance is very sensitive to C

o Scale all costs w_ij by (n + 1) & apply the algorithm with progressively lower values of ε until ε ≈ 1 or ε < 1
   Use ε_k = max(1, Δ/τ^k), k = 0, 1, 2, …; Δ = nC & τ = 2 is suggested
   Alternately, ε = nC/2 initially & reduce ε ← max(ε/6, 1)
   With these values, the complexity of the algorithm is O(n|E| log(nC))
   Proof of convergence for the asynchronous version is similar to the shortest path algorithm
. Computational results
  o Comparable to any existing assignment algorithm … average complexity O(n^1.8)
  o Parallelizable
  o Especially fast for sparse problems
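A minimal ε-scaling wrapper around the earlier auction sketch (assuming Python, the gauss_seidel_auction function from the Gauss-Seidel example, integer benefits, and the ε_k = max(1, Δ/τ^k) schedule with Δ = nC, τ = 2):

```python
# Hypothetical eps-scaling wrapper. Benefits are scaled by (n + 1), so that once eps <= 1
# the eps-CS assignment (within n*eps <= n < n + 1 of optimal) must be exactly optimal.
def auction_with_scaling(W, tau=2.0):
    n = len(W)
    W = [[(n + 1) * w for w in row] for row in W]      # scale integer benefits by (n + 1)
    C = max(abs(w) for row in W for w in row)
    eps, prices = n * C, None                          # Delta = nC
    while True:
        eps = max(1.0, eps / tau)                      # eps_k = max(1, Delta / tau^k)
        assigned, prices = gauss_seidel_auction(W, eps, prices)   # warm-start the prices
        if eps <= 1.0:
            return assigned

W = [[-14, -5, -8, -7], [-2, -12, -6, -5], [-7, -8, -3, -9], [-2, -4, -6, -10]]
print(auction_with_scaling(W))   # [1, 3, 2, 0], the same optimal assignment as before
```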

VUGRAPH 18 Auction algorithm variants
• Recent variants:
  . Reverse auction ⇒ objects bid for persons
  . Forward/reverse auction ⇒ alternate cycles of person-object biddings
  . Look-back auction
  . Extensions to inequality constraints
  . Asymmetric assignment problem ⇒ m ≠ n
  . Multi-assignment problem ⇒ a person may be assigned to more than one object
• Reverse auction
  . Objects compete for persons by offering discounts
  . Dual: q(π) = Σ_{i=1}^n π_i + Σ_{j=1}^n max_{i∈In(j)} (w_ij − π_i)
  . Reverse auction iteration
    o Find the "best" person i* = arg max_{i∈In(j)} (w_ij − π_i)

Let 훽푗 = 푤푖∗푗 − 휋푖∗

  δ_j = max_{i∈In(j), i≠i*} (w_ij − π_i)
  If i* is the only person, set δ_j = −∞ or δ_j ≪ β_j
o Bid for person i*: b_{i*j} = w_{i*j} − δ_j + ε
o For each person 푖 receiving at least one bid, π_i = max_{j∈B(i)} b_ij = b_{ij*}

B(i) = set of objects from which 푖 received a bid. De-assign any previous assignment for 푖 and set x_{ij*} = 1.
VUGRAPH 19 Auction algorithm variants
• Combined forward and reverse auction
  . Maintain ε-CS conditions on both π, p

⇒ 휋푖 + 푝푗 ≥ 푤푖푗 − 휖 ∀(푖, 푗)

  π_i + p_j = w_ij  ∀ (i, j) ∈ X = solution
. Execute alternately the forward auction for a few iterations and the reverse auction for a few iterations
  o Run forward auction: persons bid on objects. At the end of each iteration, set π_i = w_{ij*} − p_{j*} if x_{ij*} = 1; stop if all are assigned, else go to reverse auction (do until the number of assignments increases by at least one)
  o Run reverse auction: objects bid on persons. At the end of each iteration, set p_j = w_{i*j} − π_{i*} if x_{i*j} = 1; stop if all are assigned, else go to forward auction (do until the number of assignments increases by at least one)
. No need to do ε-scaling
• Look-back auction
  . Number of objects a person bids for is typically small (≈3)
  . Keep track of the biddings of each person 푖 in a small list
• Asymmetric assignment
  . Number of objects n > number of persons m
  . All persons should be assigned, but allow objects to remain unassigned

VUGRAPH 20 Asymmetric Auction algorithm

Primal:
  max Σ_{(i,j)∈E} w_ij x_ij
  s.t. Σ_{(i,j)∈E} x_ij = 1,  i = 1, 2, …, m
       Σ_{(i,j)∈E} x_ij ≤ 1,  j = 1, 2, …, n
       x_ij ∈ {0, 1}  ∀ (i, j) ∈ E

Equivalent primal (add a supersource s with w_sj = 0):
  max Σ_{(i,j)∈E} w_ij x_ij,  w_sj = 0
  s.t. Σ_{(i,j)∈E} x_ij = 1,  i = 1, 2, …, m
       Σ_{(i,j)∈E} x_ij + x_sj = 1,  j = 1, 2, …, n
       Σ_{j=1}^n x_sj = n − m
       0 ≤ x_ij  ∀ (i, j) ∈ E
       0 ≤ x_sj,  j = 1, 2, …, n

Dual:
  min Σ_{i=1}^m π_i + Σ_{j=1}^n p_j − (n − m)λ
  s.t. π_i + p_j ≥ w_ij  ∀ (i, j) ∈ E
       λ ≤ p_j,  j = 1, 2, …, n

Eliminating p (with p_j = max_{i∈In(j)} (w_ij − π_i) and λ = min_j p_j):
  min_π [ Σ_{i=1}^m π_i + Σ_{j=1}^n max_{i∈In(j)} (w_ij − π_i) − (n − m) min_j max_{i∈In(j)} (w_ij − π_i) ]

VUGRAPH 21 Reverse Auction algorithm

. Use a modified reverse auction

. Select an object 푗 which is unassigned and satisfies p_j > λ. If no such object is found, terminate the algorithm.
. Find the best person i*:
  i* = arg max_{i∈In(j)} (w_ij − π_i),   β_j = w_{i*j} − π_{i*}
  δ_j = max_{i∈In(j), i≠i*} (w_ij − π_i)
. If λ ≥ β_j − ε, set p_j = λ and go to the next iteration. Otherwise, let
  p_j = max{λ, δ_j − ε}
  π_{i*} = π_{i*} + β_j − p_j ( = w_{i*j} − p_j )
  Remove any prior assignment of i* and set x_{i*j} = 1
• The algorithm terminates with an assignment that is within mε of being optimal (ε ≤ 1/m). See Bertsekas's book.
• Can use forward-reverse auction. Initialize the forward auction with object prices equal to zero.
• Multi-assignment ⇒ it is possible to assign more than one object to a single person (e.g., tracking clusters of objects) … see Exercise 7.10 of Bertsekas's book.
• Asymmetric assignment where there is no need for every person, nor for every object, to be assigned: see Exercise 7.11 of Bertsekas's book.

VUGRAPH 22 Hungarian algorithm for the assignment problem

• Typically done as minimization. Fact: adding a constant to a row or a column does not change the optimal solution, because with Σ_j x_ij = 1 (and Σ_i x_ij = 1) the objective changes only by that constant:
  min Σ_{i=1}^n Σ_{j=1}^n c_ij x_ij − Σ_{i=1}^n α_i Σ_{j=1}^n x_ij = min Σ_{i=1}^n Σ_{j=1}^n (c_ij − α_i) x_ij;   c_ij = −w_ij
  s.t. Σ_{j=1}^n x_ij = 1  ∀ i = 1, …, n
       Σ_{i=1}^n x_ij = 1  ∀ j = 1, …, n
       x_ij ∈ {0, 1}  ∀ i, j

• Konig-Egervary theorem: if we have n zeros in different rows and columns of the matrix C, then we can construct a "perfect" matching. Such a matrix is called an "ideal" ("perfect") cost matrix.
• The Kuhn-Munkres algorithm systematically converts any C matrix into an "ideal" cost matrix.
• The algorithm is called Hungarian in view of the Konig-Egervary theorem.

VUGRAPH 23 Hungarian algorithm steps for minimization problem

• Step 1: For each row, subtract the minimum number in that row from all numbers in that row.
• Step 2: For each column, subtract the minimum number in that column from all numbers in that column.
• Step 3: Draw the minimum number of lines to cover all zeroes. If this number = n, done: an assignment can be made.
• Step 4: Determine the minimum uncovered number (call it δ).

  . Subtract δ from uncovered numbers.
  . Add δ to numbers covered by two lines.
  . Numbers covered by one line remain the same.
  . Then, go to Step 3.
• Note: essentially, we came to Step 4 because we only partially matched a subset of rows I and the corresponding subset of columns, J. What if we subtract δ > 0 from every row that is not in I, and we add δ to every column in J? Then the only entries that get decreased are the entries that are not covered. The total decrease = (n − |I| − |J|)δ > 0.

VUGRAPH 24 Illustration of Hungarian algorithm

14 5 8 7 9 0 3 2 9 0 3 0 2 12 6 5 Step 1 Step 2 0 10 4 1 0 10 4 3  7 8 3 9 4 5 0 6 4 5 0 4  2 4 6 10 0 2 4 8 0 2 4 6

Step 3: the zeros can be covered with 3 lines (e.g., row 1, column 1, column 3) < n = 4 ⇒ go to Step 4.
Step 4: minimum uncovered number δ = 1, giving

  [ 10  0  4  0 ]
  [  0  9  4  0 ]
  [  4  4  0  3 ]
  [  0  1  4  5 ]

• Assignments: x33 = 1; x41 = 1; x24 = 1; x12 = 1

• Cost = c12 + c24 + c33 + c41 = 15

• Row-column heuristic: at each step, pick min element, assign, and remove corresponding row and column.

• x21=1; x33=1; x42=1; x14=1; cost= 2+3+4+7=16
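The worked example can be cross-checked with an off-the-shelf solver; this sketch assumes SciPy is available (its linear_sum_assignment routine solves the same min-cost assignment problem):

```python
# Cross-check of the Hungarian worked example with SciPy's linear_sum_assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment

C = np.array([[14, 5, 8, 7],
              [2, 12, 6, 5],
              [7, 8, 3, 9],
              [2, 4, 6, 10]])
rows, cols = linear_sum_assignment(C)                      # minimizes total cost by default
pairs = [(int(i) + 1, int(j) + 1) for i, j in zip(rows, cols)]
print(pairs)                                               # [(1, 2), (2, 4), (3, 3), (4, 1)]
print(int(C[rows, cols].sum()))                            # 15, matching the Hungarian result above
```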

VUGRAPH 25 Graph theoretic ideas behind Hungarian algorithm

• An alternating path in a bipartite graph G = (S, T, E) with respect to a matching M is a path whose edges are alternately in M and not in M. See (a).
• An alternating tree Q (forest F if the graph is disconnected) with respect to a matching M has a root node r in S, and all other nodes except r are matched. See (b).
• An augmenting path P is a simple alternating path joining two free vertices u ∈ S and v ∈ T. See (c).
• A matching M is maximum (perfect if it covers every node) if there is no augmenting path with respect to M. See (a).
  [Figure: example matching {(u1, v3), (u2, vj), (ui, v1), (un, vn)} with dual multipliers, and an alternating path such as ui → v1 → … → vj (or ui → vn → un → vj)]
• The Hungarian algorithm successively builds maximum matchings on an equality graph G′ that satisfies the CS conditions, i.e., r_ij = c_ij − u_i − v_j = 0

VUGRAPH 26 Connection to Duality

• Recall dual:
  max q(u, v) = max Σ_{i=1}^n u_i + Σ_{j=1}^n v_j
  s.t. u_i + v_j ≤ c_ij,  ∀ (i, j) ∈ E

• Hungarian algorithm is a primal-dual algorithm that starts with

uimin c ij ; i 1,2,.., n j out() i

vjmin ( c ij  u i ); j  1,2,.., n i In() j

• Maintains dual feasibility: r_ij = c_ij − u_i − v_j ≥ 0.
• Works with the equality graph G′, for which r_ij = c_ij − u_i − v_j = 0, i.e., preserves the CS conditions. Solves the maximum cardinality bipartite matching problem on the graph G′: it either finds a perfect matching, or we get a vertex cover (I, J) of size < n.
• If (I, J) is not a perfect cover, it finds the direction of dual increase

  δ = min_{i∉I, j∉J} r_ij
  u_i ← u_i − δ,  i ∈ I
  v_j ← v_j + δ,  j ∉ J

VUGRAPH 27 Illustration of Hungarian algorithm

  C = [ 14  5  8  7 ]   u = (5, 2, 3, 2)^T;  v^T = (0, 0, 0, 2);   R = [ 9  0  3  0 ];   Dual cost = 14
      [  2 12  6  5 ]                                                  [ 0 10  4  1 ]
      [  7  8  3  9 ]                                                  [ 4  5  0  4 ]
      [  2  4  6 10 ]                                                  [ 0  2  4  6 ]

  G′ = ({1,2,3,4}, {1,2,3,4}, {(1,2), (1,4), (2,1), (3,3), (4,1)})
  Matching M′ = {(1,2), (2,1), (3,3)} is not perfect. I = {1}; J = {1,3}. Node cover = 3 < 4.

  δ = min_{i∉I, j∉J} r_ij = 1;   u = (4, 2, 3, 2)^T;   v^T = (0, 1, 0, 3);   R′ = [ 10  0  4  0 ]
                                                                                  [  0  9  4  0 ]
                                                                                  [  4  4  0  3 ]
                                                                                  [  0  1  4  5 ]

• Assignments: x33= 1; x41= 1; x24= 1; x12 = 1

• Cost = c12 + c24 + c33 + c41 = 15

Dual cost = Σ_{i=1}^n u_i + Σ_{j=1}^n v_j = 15

VUGRAPH 28 Class of Shortest Augmenting Path Algorithms
• Fact: assignment is a special case of the transportation problem and the more general minimum cost network flow (MCNF) problem (Lecture 10)
• In assignment, each node in S transmits one unit and each node in T must receive one unit ⇒ multi-source, multi-sink problem
• Indeed, an optimal solution can be found by considering one source in S at a time and finding the shortest path emanating from it to an unassigned node in T
• While the Hungarian algorithm finds any feasible augmenting path, Jonker, Volgenant and Castanon (JVC) and a number of other algorithms find the shortest augmenting paths
  . JVC ⇒ a clever shortest augmenting path implementation of Hungarian plus a number of pre-processing techniques, including column reduction, reduction transfer, and augmenting reduction of unassigned rows, which is basically auction
• Basic idea
  . Select an unassigned node in S

. Construct the residual (auxiliary, incremental) graph, G’ with costs {rij}

. Find the shortest augmenting path via Dijkstra (recall r_ij ≥ 0)
. Augment the solution ⇒ improve the match
. Update the dual variables so that CS conditions hold
VUGRAPH 29 Jonker, Volgenant and Castanon (JVC) algorithm

• JVC algorithm for primal minimization (c_ij = −w_ij; code is available on the net)
  Primal:
    min Σ_{ij} c_ij x_ij
    s.t. Σ_j x_ij = 1  ∀ i
         Σ_i x_ij = 1  ∀ j
         x_ij ≥ 0,  ∀ (i, j) ∈ E
  Dual:
    max Σ_i u_i + Σ_j v_j
    s.t. c_ij − u_i − v_j ≥ 0

. Indices 푖 and 푗 refer to rows and columns, respectively

. 푥푖(푦푗) is the column (row) index assigned to row 푖 (column 푗), with 푥푖 = 0 (푦푗 = 0) for an unassigned row 푖 (column 푗)

. The dual variables (prices) 푢푖 and 푣푗 correspond to row 푖 and column 푗, respectively, with 푟푖푗 = 푐푖푗 − 푢푖 − 푣푗 denoting the reduced costs
• Basic outline of JVC algorithm
  Step 1: initialization … column reduction
  Step 2: termination, if all rows are assigned
  Step 3: augmentation … construct the auxiliary network and determine an alternating path of minimal total reduced cost from an unassigned row 푖 to an unassigned column 푗 … use it to augment the solution
  Step 4: update the dual solution … restore complementary slackness and go to Step 2

VUGRAPH 30 JVC algorithm procedure – initialization Column Reduction: • Initialization 14 5 8 7 . Column reduction 2 12 6 5 for 푗 = 푛 … 1 do C   7 8 3 9 푐푗 = 푗; ℎ = 푐1푗; 푖1 = 1  for 푖 = 2 … 푛 do 2 4 6 10 0 if 푐푖푗 < ℎ then ℎ = 푐푖푗; 푖1 = 푖  0 푣푗 = ℎ T  v 2 4 3 5 ; u  ; x21  x 42  x 33  1 if 푥푖 = 0 then 푥푖 = 푗; 푦푗 = 푖1 0 1 1  else 푥푖 = −푥푖; 푦푗 = 0 0 end do Dualcos t  14 end do Row1 and column 4 unassigned .

⇒ 푣푗 = min(푐푖푗) Reduction Transfer: 푖 Row 2:  =min(8,3,0), vu 2; 0 .. each column is assigned to minimal row element 11 Row 3:  =min(5,4,4),vu  1;  4 .. some rows may not be assigned 33 Row 4:  =min(0,3,5),vu 4; 0 . Reduction transfer from unassigned to assigned rows 24 0 for each assigned row 푖 do  T 0 푗1 = 푥푖; 휇 = min{푐푖푗 − 푣푗: 푗 = 1, … , 푛; 푗 ≠ 푗1} vu 2 4  1 5 ;   푣 = 푣 − 휇 − 푢 ; 푢 = 휇 4 푗1 푗1 푖 푖  end do 0 … similar to 2nd best profit in auction Dualcos t  14
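A small sketch of these two initialization steps (assuming Python, a dense cost matrix, 0-based indices instead of the 1-based bookkeeping above, and None for "unassigned"):

```python
# Column reduction + reduction transfer sketch; mirrors the initialization steps above,
# not the full JVC code (names and the None-for-unassigned convention are assumptions).
def jvc_initialize(C):
    n = len(C)
    u, v = [0] * n, [0] * n
    row_to_col = [None] * n                 # x_i: column assigned to row i
    col_to_row = [None] * n                 # y_j: row assigned to column j
    # Column reduction: each column's price is its minimal element
    for j in range(n):
        i1 = min(range(n), key=lambda i: C[i][j])
        v[j] = C[i1][j]
        if row_to_col[i1] is None:
            row_to_col[i1], col_to_row[j] = j, i1
    # Reduction transfer: move slack from assigned rows into u (2nd-best slack, as in auction)
    for i in range(n):
        j1 = row_to_col[i]
        if j1 is None:
            continue
        mu = min(C[i][j] - v[j] for j in range(n) if j != j1)
        v[j1] -= mu - u[i]
        u[i] = mu
    return u, v, row_to_col, col_to_row

C = [[14, 5, 8, 7], [2, 12, 6, 5], [7, 8, 3, 9], [2, 4, 6, 10]]
u, v, x, y = jvc_initialize(C)
print(v, u)   # [2, 4, -1, 5] [0, 0, 4, 0] -> dual cost 14, as in the example above
```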

VUGRAPH 31 JVC algorithm procedure – pre-processing Augmentation Reduction:

. Augmenting reduction of unassigned rows (… auction)
    LIST = {all unassigned rows}
    ∀ i ∈ LIST do
      repeat
        k1 = min{ c_ij − v_j : j = 1, …, n };  select j1 with c_ij1 − v_j1 = k1
        k2 = min{ c_ij − v_j : j = 1, …, n; j ≠ j1 };  select j2 with c_ij2 − v_j2 = k2, j2 ≠ j1
        u_i = k2
        if k1 < k2 then v_j1 = v_j1 − (k2 − k1)
        else if j1 is assigned, then j1 = j2
        r = y_j1;  if r > 0 then x_r = 0
        x_i = j1;  y_j1 = i;  i = r
      until k1 = k2 or r = 0
    end do
  Augmenting reduction on the example:
    Row 1: slacks c_1j − v_j = (12, 1, 9, 2) ⇒ k1 = 1, k2 = 2;  v_2 = 3;  u_1 = 2;  x12 = 1;  x42 = 0
           v^T = (2, 3, −1, 5);  u = (2, 0, 4, 0)^T;  Dual cost = 15
    Row 4: slacks = (0, 1, 7, 5) ⇒ k1 = 0, k2 = 1;  v_1 = 1;  u_4 = 1;  x41 = 1;  x21 = 0
           v^T = (1, 3, −1, 5);  u = (2, 0, 4, 1)^T
    Row 2: slacks = (1, 9, 7, 0) ⇒ k1 = 0, k2 = 1;  v_4 = 4;  u_2 = 1;  x24 = 1;  r = 0
           v^T = (1, 3, −1, 4);  u = (2, 1, 4, 1)^T;  Dual cost = 15
    Primal and dual feasible. Done!
. Complexity of the first two initialization procedures is O(n²), and it can be shown that the augmenting reduction procedure has complexity O(Rn²), with R the range of the cost coefficients

VUGRAPH 32 JVC algorithm procedure – augmentation + update

• Augmentation
  . Modified version of Dijkstra's shortest augmenting path method
      ∀ i* unassigned do
        TOSCAN = {1, …, n}
        for j = 1 … n do d_j = ∞ end do
        i = i*;  d_{i*} = 0;  μ = 0
        repeat
          ∀ j ∈ OUT(i) ∩ TOSCAN do                    (don't need to execute this too many times)
            if μ + c_ij − u_i − v_j < d_j then d_j = μ + c_ij − u_i − v_j; pred(j) = i
          end do
          μ = ∞
          ∀ j ∈ TOSCAN do
            if d_j < μ then μ = d_j; μ_j = j
          end do
          i = y_{μ_j};  TOSCAN = TOSCAN − {μ_j}
        until y_{μ_j} = 0
      end do
  . Complexity of the augmentation phase is O(n³), so this holds for the entire JVC algorithm
• Update of the dual solution
  . After augmentation of the partial assignment, the values of the dual variables must be updated to restore complementary slackness, that is,

  o c_ik − u_i − v_k = 0, if x_i = k for assigned column k, i = 1, …, n, and
  o c_ik − u_i − v_k ≥ 0 for all other (i, k)

VUGRAPH 33 Murty’s Clever Partitioning Idea to get m-best solutions

• Suppose you have three binary variables, x1, x2, x3, and the best solution is (1, 1, 0)

  [Table: all 8 combinations of (x1, x2, x3); (1, 1, 0) is the best solution and (0, 1, 1) the 2nd best]

• To get the 2nd best solution:
  • Fix x1 = 0 ⇒ now you are searching over (0,1,1), (0,1,0), (0,0,1), (0,0,0)
  • Fix x1 = 1 and x2 = 0 ⇒ now you are searching over (1,0,1), (1,0,0)
  • Fix x1 = 1, x2 = 1 and x3 = 1 ⇒ now you evaluate (1,1,1)
  ⇒ You searched over all (x1, x2, x3) except the best (1,1,0)
• Pick the 2nd best solution (cost and variables) from the solutions of these subproblems. Keep the other two in a list.
• To get the 3rd best, do the partitioning on the 2nd best solution:
  • Fix x1 = 0 and x2 = 0, and search over (0,0,1), (0,0,0)
  • Fix x1 = 0, x2 = 1 and x3 = 0
  • Pick the 3rd best from these and the others in the list.
VUGRAPH 45 Murty's Partitioning for Assignment

. Suppose we express the assignment problem 푃 of size 푛 as a bipartite graph, represented as a list of triples (푦, 푧, 푙), with an assignment 퐴^푘 from the (feasible) solution space 퐴 denoted as a set of triples in which each 푦 and 푧 appear exactly once, that is,
  A = ∪_{k=1}^{n!} A^k,  where A^k = { (y_kj, z_kj, l_kj), j = 1, …, n }
. The cost 퐶(퐴^푘) of the assignment 퐴^푘 is simply the sum of the 푙's in its triples, that is,
  C(A^k) = Σ_{j=1}^n l_kj
. The single best assignment 퐴* to 푃 can be found using any of the classical methods of solving the assignment problem (e.g., auction, JVC)
. Subsequent assignments to 푃 are found by solving a succession of assignment problems, where 푃 is partitioned into a series of sub-problems 푃1, 푃2, …, 푃푛, having solution spaces 퐴1, 퐴2, …, 퐴푛 ∋
  ∪_{i=1}^n A_i = A − A*

  A_i ∩ A_j = ∅,  for i, j = 1, …, n, i ≠ j
. Subproblem 푃1 is 푃 less the 1st triple (푦1, 푧1, 푙1) in the best assignment 퐴* of 푃
  ⇒ A_1 = { A^i ∈ A : (y_ij, z_ij, l_ij) ≠ (y_1, z_1, l_1), j = 1, …, n }

. In subproblems 푃2, … , 푃푛, we want to force 푦1, 푧1, 푙1 to be in all assignments of 퐴2, … , 퐴푛

VUGRAPH 46 Murty’s M-Best assignment algorithm

. Hence, 푃2 is 푃1, plus the triple (푦1, 푧1, 푙1), less all triples (푦1, 푧푗, 푙푗) & (푦푗, 푧1, 푙푗), 푗 = 2, …, 푛, and less the 2nd triple (푦2, 푧2, 푙2) in the best assignment 퐴* of 푃
  A_2 = { A^i ∈ A : (y_ij, z_ij, l_ij) ≠ (y_2, z_2, l_2), j = 1, …, n  &  (y_1, z_1, l_1) ∈ A^i }

. In the construction of subproblems 푃푘, …, 푃푛, we force (푦푙, 푧푙, 푙푙), 푙 = 1, …, 푘 − 1, to be in all assignments of 퐴푘, …, 퐴푛
. In general, subproblem 푃푘, 1 < 푘 ≤ 푛, is 푃푘−1 plus the triple (푦푘−1, 푧푘−1, 푙푘−1), less all triples (푦푘−1, 푧푗, 푙푗) & (푦푗, 푧푘−1, 푙푗), 푗 ≠ 푘 − 1, and less the 푘th triple (푦푘, 푧푘, 푙푘) in the best assignment 퐴* of 푃
  A_k = { A^i ∈ A : (y_ij, z_ij, l_ij) ≠ (y_k, z_k, l_k), j = 1, …, n  &  (y_l, z_l, l_l) ∈ A^i, l = 1, …, k − 1 }

. Note that the solution spaces 퐴푖 for subproblems 푃푖, 푖 = 1, …, 푛 are disjoint and their union is exactly the solution space of 푃 less its optimal assignment (i.e., 퐴 − 퐴*)
. Once we partition 푃 according to its optimal assignment 퐴*, we place the resulting subproblems 푃1, …, 푃푛 together with their optimal assignments 퐴1*, …, 퐴푛* on a priority queue of (problem, assignment) pairs (e.g., Queue ← (푃푖, 퐴푖*), 푖 = 1, …, 푛)
. We then find the problem 푃′ in the queue having the best assignment … the best assignment 퐴′* to this problem is the 2nd best assignment to 푃
. We then remove 푃′ from the queue and replace it by its partitioning (according to its optimal assignment 퐴′*) ⇒ the best assignment found in the queue will be the 3rd best assignment to 푃
. Complexity: since we perform one partitioning for each of the M-best assignments, each partitioning (worst case) creating O(n) new problems ⇒ O(Mn) assignment problems and queue insertions … each assignment takes O(n³), each insertion takes at most O(Mn); hence, Murty's algorithm takes O(Mn⁴)
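A compact sketch of this partition-and-queue loop (assuming Python with SciPy; forbidden pairs are encoded with a large cost and forced pairs by blocking the rest of their row and column; this is the plain O(Mn⁴) version, without the inheritance optimizations discussed on the next slides):

```python
# Murty-style M-best assignment sketch (dense costs; BIG marks forbidden entries).
import heapq
import numpy as np
from scipy.optimize import linear_sum_assignment

BIG = 1e9

def solve(C):
    r, c = linear_sum_assignment(C)
    cost = C[r, c].sum()
    return (cost, list(zip(r, c))) if cost < BIG / 2 else None   # None if infeasible

def murty(C, M):
    C = np.asarray(C, dtype=float)
    cost, assignment = solve(C)
    queue, solutions, counter = [(cost, 0, assignment, C)], [], 1
    while queue and len(solutions) < M:
        cost, _, assignment, Csub = heapq.heappop(queue)    # next best over all subproblems
        solutions.append((cost, assignment))
        for k, (i, j) in enumerate(assignment):             # partition around this solution
            Ck = Csub.copy()
            Ck[i, j] = BIG                                   # forbid the k-th pair
            for (fi, fj) in assignment[:k]:                  # force the first k-1 pairs
                Ck[fi, :] = BIG
                Ck[:, fj] = BIG
                Ck[fi, fj] = Csub[fi, fj]
            sol = solve(Ck)
            if sol is not None:
                heapq.heappush(queue, (sol[0], counter, sol[1], Ck))
                counter += 1                                 # tie-breaker for heap comparisons
    return solutions

C = [[14, 5, 8, 7], [2, 12, 6, 5], [7, 8, 3, 9], [2, 4, 6, 10]]
for cost, assignment in murty(C, 3):
    print(cost, assignment)     # costs come out in nondecreasing order, starting at 15.0
```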

VUGRAPH 47 Optimization of Murty's algorithm
• (Stone & Cox)
  . Recall that Murty's algorithm is independent of the assignment algorithm chosen for solving the assignment problem
  . Suppose we use the JVC algorithm to find the optimal assignment
    o 2 important properties of the JVC algorithm
      . The optimal assignment will be found as long as the partial binary variable assignment X and the dual variables u and v satisfy the following two criteria:

        ∀ (i, j): x_ij = 1 ⇒ c_ij = u_i + v_j
        ∀ (i, j): x_ij = 0 ⇒ c_ij ≥ u_i + v_j
      . The slack or reduced cost (i.e., c_ij − u_i − v_j) between any z and y is useful in estimating the cost of the best assignment in which they are assigned to each other

      . If the current lower bound on the cost of the partial assignment, with either z_i or y_j unassigned, is C, then the cost of the optimal assignment that assigns z_i to y_j will be at least C + c_ij − u_i − v_j
      . A new lower bound can be computed by finding the minimum slack of a partial assignment
  When used in Murty's algorithm, Stone & Cox noted that the JVC assignment algorithm allows for a # of optimizations that dramatically improve both average-case and worst-case complexity
  . 3 optimizations proposed
    o Dual variables & partial solution inheritance during partitioning
    o Sorting subproblems by lower cost bounds before solving
    o Partitioning in an optimized order
  . Only inheritance reduces the worst-case complexity, so we'll discuss it

VUGRAPH 48 Dual variables • Dual variables & partial solution inheritance during partitioning . When a problem is partitioned, considerable work has been expended in finding its best assignment . . . we exploit this computation when finding best assignments in subproblems resulting from partitioning . Consider a problem 푃 being partitioned, with dual variables 푢(푃) and 푣(푃), binary variable assignment 푥(푃), and cost matrix 푐(푃)

o Subproblem 푃1 can inherit part of 푃's computation by assigning c^(P1) = c^(P), setting c^(P1)_{i1 j1} = ∞ and x^(P1)_{i1 j1} = 0 corresponding to the arc (y_{i1}, z_{i1}, l_{i1}) ∈ 퐴*, and initializing x^(P1) = x^(P), u^(P1) = u^(P), and v^(P1) = v^(P)

o In general, subproblem 푃푘 can inherit part of the computation at 푃푘−1 by assigning c^(Pk) = c^(Pk−1), setting c^(Pk)_{ik jk} = ∞ and x^(Pk)_{ik jk} = 0 corresponding to (y_{ik}, z_{ik}, l_{ik}) ∈ 퐴*, and initializing x^(Pk) = x^(Pk−1), u^(Pk) = u^(Pk−1), and v^(Pk) = v^(Pk−1)
The value of this optimization is that, after the first assignment to the original assignment problem is found, no more than one augmentation needs to be performed for each new subproblem
. Complexity: the augmentation step of JVC takes O(n²), so Murty's algorithm with this optimization is O(Mn³)

VUGRAPH 49 Correlation problem

• A solvable case of assignment problem: correlation problem

 j w f y dy, if  ij   j i i

i w g y dy, if  ij   j i  j . f(y) = g(y) = 1 ⇒ wij = |αi – βj| 훼1 ≥ 훼2 … ≥ 훼푛 . Matching

훽1 ≥ 훽2 … ≥ 훽푛 . This can be interpreted as the following assignment problem o Suppose have n in S and n in T such that

    . Attractiveness of 푖 ∈ 푆 = α_i
    . Attractiveness of 푗 ∈ 푇 = β_j
    . Define w_ij = |α_i − β_j|

min x w ijij ij sort     s.t. xj 1, 12 n i ij sort 12     n xi1,   j ij assign(ij , )

xij  0

This is also min-max optimal matching … ∃ several other solvable cases…see Lawler
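A tiny sketch of this solvable special case (assuming Python; the names and the example data are illustrative):

```python
# w_ij = |alpha_i - beta_j|: sorting both sequences and matching them in order is optimal.
def correlation_matching(alphas, betas):
    order_a = sorted(range(len(alphas)), key=lambda i: alphas[i], reverse=True)
    order_b = sorted(range(len(betas)), key=lambda j: betas[j], reverse=True)
    pairs = list(zip(order_a, order_b))          # i-th largest alpha with i-th largest beta
    cost = sum(abs(alphas[i] - betas[j]) for i, j in pairs)
    return pairs, cost

alphas = [9, 1, 5]
betas = [4, 8, 2]
print(correlation_matching(alphas, betas))       # ([(0, 1), (2, 0), (1, 2)], 3)
```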

VUGRAPH 50 Summary
• Examples of assignment problems
• Auction algorithm
  . Coordinate dual descent
  . Scaling is critical to the success of the algorithm
• Hungarian method
  . O(n³) algorithm
  . Not as fast as auction or JVC in practice
• Successive shortest path method
  . JVC algorithm
  . Fastest algorithm in practice to date (?)
• M-Best assignment algorithms

Recent Book: R. Burkard, M. Dell’Amico and S. Martello, Assignment Problems, SIAM, 2009.

VUGRAPH 51