
6117CIT - Adv Topics in Computing Sci at Nathan
Algorithms: The intelligence behind the hardware
© V. Estivill-Castro

Outline
• Approximation algorithms: the class APX
• Some complexity classes, like PTAS and FPTAS
• Illustration of some PTASs
• Based on:
  - P. Schuurman and G. Woeginger (2001), Approximation Schemes - A Tutorial
  - M. Mastrolilli, course notes

The class APX
• APX (an abbreviation of "approximable") is the set of NP optimization problems that allow polynomial-time approximation algorithms with approximation ratio bounded by a constant (constant-factor approximation algorithms, for short).
• Problems in this class have efficient algorithms that can find an answer within some fixed percentage of the optimal answer.
• An approximation algorithm is called a ρ-approximation algorithm for some constant ρ if it can be proven that the solution the algorithm finds is at most ρ times worse than the optimal solution.

Review from week 2
• The vertex cover problem and the traveling salesman problem with the triangle inequality each have simple 2-approximation algorithms.
• The traveling salesman problem with arbitrary edge lengths cannot be approximated with an approximation ratio bounded by a constant, as long as the Hamiltonian-path problem cannot be solved in polynomial time.

Alternative view on PTAS
• If there is a polynomial-time algorithm that solves a problem within every fixed percentage (one algorithm for each percentage), then the problem is said to have a polynomial-time approximation scheme (PTAS).
  - Unless P = NP, it can be shown that there are problems that are in APX but not in PTAS; that is, problems that can be approximated within some constant factor, but not within every constant factor.

More on APX
• A problem is APX-hard if there is a PTAS reduction from every problem in APX to that problem.
• A problem is APX-complete if it is APX-hard and also in APX.
• As a consequence of PTAS ⊆ APX, no APX-hard problem is in PTAS (unless P = NP).

Focus on optimization problems
• Notation:
  - I denotes an instance of an optimization problem
  - |I| = n denotes the length of the input instance
  - Opt(I) denotes the value of the optimal solution
• We focus on minimization problems in this lecture, but all concepts are symmetric for maximization problems.

ρ-approximation algorithm
• We denote by A(I) the value of the solution that algorithm A produces on instance I.
• A is a "ρ-approximation algorithm" if A(I) ≤ ρ·Opt(I) for all instances and the running time is polynomial in |I|.
• ρ is the worst-case approximation ratio; ρ ≥ 1, and good means ρ close to 1.

PTAS: polynomial-time approximation scheme
• A family {A_ε}_{ε>0} of (1+ε)-approximation algorithms with running time polynomial in |I|.
  - As observed in the last lecture, the scheme still fits the definition if the running time is exponential in 1/ε, e.g. O(|I|^(1/ε)).
• NOTE: an FPTAS (fully polynomial-time approximation scheme) has running time also polynomial in 1/ε, e.g. O(|I|/ε³).

Strongly and weakly NP-hard
• If a problem is NP-hard even if the input is encoded in unary, then it is called strongly NP-hard (= NP-hard in the strong sense = unary NP-hard).
• If a problem is polynomially solvable under a unary encoding, then it is solvable in pseudo-polynomial time.
• The problems NP-hard in the strong sense are contained within those NP-hard in the weak sense.

Complexity classes relationships
[Figure: containment diagram - FPTAS ⊆ PTAS ⊆ APX, inside NP; P and the pseudo-polynomially solvable problems are also shown.]

Some known approximation algorithms
• Non-constant worst-case ratio:
  - Graph coloring: O(n^(1/2-ε))
  - Total flow time: O(n^(1/2))
  - Set covering: O(log n)
  - Vertex cover: O(log n)
• Constant worst-case ratio:
  - TSP with triangle inequality: 3/2
  - Max Sat: 1.2987
• PTAS: bin packing
• FPTAS: makespan on 2 machines

The first approximation algorithm (Graham '66)
• P||Cmax: makespan minimization on m identical machines (strongly NP-hard).
• Instance I: m identical machines; n jobs with lengths p1, p2, …, pn.
• Objective: smallest makespan Cmax.
[Figure: Gantt chart of machines M1, M2, …, Mm with makespan Cmax.]

Algorithm: List Scheduling (LS)
• LS: schedule the jobs, in any given order, on the first available (i.e. idle) machine.
• List: J1, J2, J3, J4, J5, J6.
[Figure: example schedule - M1 runs J1, J4, J5; M2 runs J2, J3, J6.]

Analysis of the LS algorithm
• Define the lower bound LB = max{ max_j p_j , (Σ_j p_j)/m }.
• Let s_f be the starting time of the FINAL job, i.e. the job that completes last; p_f is its length.
• Observation (for the schedule produced by LS): Cmax^LS = s_f + p_f.
• Let E_i be the completion (end) time of machine M_i.

LS analysis (cont.)
• LS places the last job on the machine that is available first:
  - s_f ≤ E_i for all other machines i ≠ f
  - s_f = E_f - p_f
• This implies (summing over all machines) that:
  - m·s_f ≤ [Σ_{i=1..m} E_i] - p_f
  - s_f ≤ (1/m)([Σ_i E_i] - p_f) = (1/m)([Σ_j p_j] - p_f)
• But then for LS: Cmax^LS = s_f + p_f ≤ (1/m)·Σ_j p_j + p_f·(1 - 1/m).
• Thus Cmax^LS ≤ [2 - (1/m)]·Opt.

LS: analysis
• Theorem: LS is a (2 - 1/m)-approximation algorithm, and the approximation ratio is tight.
• Example (m = 2): p1 = p2 = 1 and p3 = 2.
[Figure: LS puts J1 and J3 on M1 and J2 on M2, giving Cmax = 3; the optimum pairs J1, J2 on one machine and J3 on the other, giving Cmax = 2.]

Linear-programming-based approximation algorithms
• IDEA: the ILP is difficult; relax it to an LP solvable in polynomial time, with Opt_LP ≤ Opt; then round the LP solution to integral values to obtain A(I) ≤ ρ·Opt.

Example: R2||Cmax
• R2||Cmax: makespan minimization on 2 unrelated machines (weakly NP-hard).
• Instance I: 2 unrelated machines; n jobs; job j has length p1j on machine M1 and p2j on machine M2.
[Figure: Gantt chart of M1 and M2 with makespan Cmax.]

Integer linear program (ILP)
• We encode by xij the fact that job j is placed on machine i.
• Then the ILP looks as follows:
  - Minimize Cmax
  - Subject to:
    x1j + x2j = 1, for j = 1, …, n (each job is assigned once)
    Σ_{j=1..n} p1j·x1j ≤ Cmax
    Σ_{j=1..n} p2j·x2j ≤ Cmax
    x1j, x2j ∈ {0, 1}, for j = 1, …, n (each job must be on one machine or the other)

Linear program (LP) relaxation
• Same encoding; only the integrality constraints are relaxed:
  - Minimize Cmax
  - Subject to:
    x1j + x2j = 1, for j = 1, …, n (each job is assigned once)
    Σ_{j=1..n} p1j·x1j ≤ Cmax
    Σ_{j=1..n} p2j·x2j ≤ Cmax
    x1j, x2j ≥ 0, for j = 1, …, n (a job may be on some machine, or split between the two)

Analysis of the number of fractional jobs
• Known: a basic optimal LP solution has the property that the number of variables with positive values is at most the number of rows of the constraint matrix.
  - Thus there are at most n + 2 variables with positive values.
• Since Cmax is always positive, it accounts for one of them, so at most n + 1 of the xij variables are positive.
• Every job has at least one positive variable associated with it, because x1j + x2j = 1.
• CONCLUSION: at most 1 (ONE) job has been split onto two machines.

Rounding
[Figure: in the LP schedule, the fractional job J5 is split between M1 and M2; rounding places J5 entirely on one machine.]
• Opt_LP ≤ Opt, and the rounded job has length at most Opt, so after ROUNDING: A(I) ≤ 2·Opt.

How to get a PTAS
• Input I → Algorithm A → output A(I), a feasible solution for I.
• IDEA: add more structure to the input (depending on ε):
  - as ε → 0, the additional structure → 0;
  - as ε → 0, A(I) → Opt(I), but the running time grows.

Structuring the input
• Transform the difficult instance I into a structured instance I^ε that is solvable in polynomial time; translate Opt^ε back, in polynomial time, into a feasible solution for I with A(I) ≤ (1 + ε)·Opt.

Example: P2||Cmax
• P2||Cmax: makespan minimization on 2 identical machines (weakly NP-hard).
• Instance I: 2 identical machines; n jobs with lengths p1, p2, …, pn.
• The lower bound is again LB = max{ max_j p_j , (Σ_j p_j)/2 }. Thus LB ≤ Opt ≤ 2·LB.

How to round the input (I → I^ε)
• "Big" jobs (p_j > ε·LB): keep them, p_j := p_j.
• "Small" jobs (p_j ≤ ε·LB): replace them by ⌈S/(ε·LB)⌉ jobs of length ε·LB, where S = Σ_{small} p_j.

Analysis of the rounded instance I^ε
• Recall that Σ_j p_j ≤ 2·LB.
• How many big jobs? A big job has p_j > ε·LB. This implies #(big) ≤ 2/ε.
• How many conglomerates of small jobs? A conglomerate has length ε·LB. This implies #(conglomerates) ≤ 2/ε.
• LEMMA: the rounded instance has a constant(ε) number of jobs.
• COROLLARY: we can find its optimal solution in constant time!
• PROOF: use exhaustive search.

Back to a feasible solution
[Figure: in the optimal schedule of I^ε, the conglomerates of length ε·LB are replaced by the original small jobs; the sum of the small jobs fits into the conglomerates' slots, overflowing by at most ε·LB.]
• Hence Cmax ≤ Opt^ε + ε·LB ≤ (1 + ε)·Opt^ε.

How much error is introduced?
• Cmax ≤ Opt^ε + ε·LB.
• Opt^ε ≤ Opt + ε·LB (wait till the next slide).
• Hence Cmax ≤ Opt + 2·ε·LB ≤ (1 + 2ε)·Opt.

Opt vs. Opt^ε
• Case 1 (LUCKY): the optimal schedule of I packs the small jobs exactly into conglomerate slots, so Opt^ε ≤ Opt.
• Case 2: the optimal solution of I^ε here has to be as good as LS:
  Opt^ε ≤ Cmax^LS = s_f + p_f ≤ (1/m)·Σ_j p_j + p_f·(1 - 1/m).
  Since m = 2, (1/m)·Σ_j p_j ≤ Opt, and p_f/2 ≤ ε·LB, we have Opt^ε ≤ Opt + ε·LB.

Structuring the execution of an algorithm
• IDEA: take an exact but slow algorithm A and interact with it while it is working.