Scheduling with step-improving processing times



Operations Research Letters 34 (2006) 37–40


T.C. Edwin Cheng (a), Yong He (b), Han Hoogeveen (c,∗), Min Ji (b), Gerhard J. Woeginger (d)

(a) Department of Logistics, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong SAR, China
(b) Department of Mathematics, State Key Lab of CAD & CG, Zhejiang University, Hangzhou 310027, China
(c) Department of Computer Science, Utrecht University, P.O. Box 80089, 3508 TB Utrecht, The Netherlands
(d) Department of Mathematics and Computer Science, TU Eindhoven, P.O. Box 513, 5600 MB Eindhoven, The Netherlands

Received 10 March 2005; accepted 18 March 2005
Available online 31 May 2005


We consider the scheduling problem of minimizing the makespan on a single machine with step-improving job processing times around a common critical date. For this problem we give an NP-hardness proof, a fast pseudo-polynomial time algorithm, an FPTAS, and an on-line algorithm with best possible competitive ratio. © 2005 Elsevier B.V. All rights reserved.

    Keywords: Scheduling; Knapsack problem; Approximation scheme; Competitive analysis.

    1. Introduction

Recent years have shown a growing interest in the area of scheduling with time-dependent processing times; we refer the reader to the survey paper [1] by Cheng, Ding and Lin for more information on this area. In this short technical note, we will concentrate on the following single machine scheduling problem with time-dependent processing times: there are n independent jobs J1, . . . , Jn with a common critical date D. All jobs are available for processing at time 0. The

∗ Corresponding author.

0167-6377/$ - see front matter © 2005 Elsevier B.V. All rights reserved.
doi:10.1016/j.orl.2005.03.002

processing time of job Jj (j = 1, . . . , n) is specified by two integers aj and bj with 0 ≤ bj ≤ aj. If job Jj is started at some time t < D, then its processing time equals aj; if it is started at some time t ≥ D, then its processing time equals aj − bj.


We prove that the scheduling problem is NP-hard in the ordinary sense; see Section 3. Finally, we construct an on-line algorithm with the best possible worst-case ratio 2 for a natural on-line version of this scheduling problem; see Section 4. Our results provide a complete picture of this scheduling problem.

    2. The two underlying knapsack problems

This section translates the scheduling problem into two corresponding knapsack problems. For every job Jj, we denote by cj = aj − bj ≥ 0 the difference between aj and bj. We let J = {1, 2, . . . , n} denote the index set of all jobs. For an index set I ⊆ J, we will write a(I) as a short-hand notation for Σ_{i∈I} ai, b(I) for Σ_{i∈I} bi, and c(I) for Σ_{i∈I} ci. Furthermore, we will assume

D < a(J). (1)

Otherwise, the problem instance would be trivial: the optimal schedule processes all jobs before the critical date with a makespan of a(J). Next, let us consider an optimal schedule σ for any given instance. Let X ⊆ J denote the index set of the jobs with starting times smaller than D in σ, and let Y = J \ X denote the index set of the jobs with starting times greater than or equal to D. Schedule σ belongs to one of the two possible scenarios a(X) ≤ D − 1 and a(X) ≥ D.

In the first scenario, the constraint a(X) ≤ D − 1 may be rewritten as a(J) − a(Y) ≤ D − 1. In this scenario, all jobs starting before the critical date D also complete before the critical date D. Without loss of generality, we may assume that the jobs in X are processed during the time interval [0; a(X)], that the machine then stands idle during [a(X); D], and that the remaining jobs in Y are executed contiguously from time D onwards. Because of (1), the corresponding makespan equals D + c(Y). The best schedule in the first scenario corresponds to the solution of the following problem:

Z1 := min c(Y) subject to a(Y) ≥ a(J) − D + 1; Y ⊆ J. (2)

Note that the optimization problem in (2) is a knapsack problem subject to a covering constraint; see also Section 3 below.

In the second scenario with a(X) ≥ D, we may assume that the jobs in X are processed during the time interval [0; a(X)], and that the remaining jobs in Y are processed during [a(X); a(X) + c(Y)]. The corresponding makespan equals

a(X) + c(Y) = b(X) + c(X) + c(Y) = c(J) + b(X).

Hence, the best schedule under the second scenario corresponds to the optimal solution of

Z2 := min b(X) subject to a(X) ≥ D; X ⊆ J. (3)

Note that (3) again is a knapsack problem subject to a covering constraint. The optimal makespan for the scheduling problem equals min{D + Z1, c(J) + Z2}, that is, the better makespan found under the two scenarios.
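This two-scenario characterization can be checked by brute force on small instances. The sketch below is not from the paper: the function name optimal_makespan and the subset enumeration are ours, and the enumeration is exponential in n; Section 3 gives the actual pseudo-polynomial algorithm.

```python
from itertools import combinations

def optimal_makespan(a, b, D):
    # Evaluate min{D + Z1, c(J) + Z2} by enumerating all candidate sets.
    # Z1: min c(Y) over late sets Y with a(Y) >= a(J) - D + 1, as in (2).
    # Z2: min b(X) over early sets X with a(X) >= D, as in (3).
    n = len(a)
    c = [ai - bi for ai, bi in zip(a, b)]
    assert D < sum(a), "assumption (1); otherwise the instance is trivial"
    best = float("inf")
    for r in range(n + 1):
        for S in combinations(range(n), r):
            a_S = sum(a[i] for i in S)
            if a_S >= sum(a) - D + 1:   # S is feasible as the late set Y in (2)
                best = min(best, D + sum(c[i] for i in S))
            if a_S >= D:                # S is feasible as the early set X in (3)
                best = min(best, sum(c) + sum(b[i] for i in S))
    return best
```

For instance, a single job with a = 10, b = 9 and D = 5 is better started at the critical date (completion 5 + 1 = 6) than at time 0 (completion 10).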

    3. Results on the off-line version

This section deduces a number of results from the knapsack characterization. We first prove the ordinary NP-hardness of the problem. For this, we use a reduction from PARTITION: we are given n positive integers p1, . . . , pn with Σ_{j=1}^{n} pj = 2P, and we are asked whether there exists a set I ⊆ {1, . . . , n} with Σ_{j∈I} pj = P. We construct the following instance of the scheduling problem: there are n jobs, where job Jj is specified by aj = 2pj and bj = pj, and the critical date is D = 2P. It is easily verified that the answer to PARTITION is YES if and only if the optimal makespan in the scheduling instance equals 3P. As a consequence, we find the following theorem.

Theorem 1. Makespan minimization for jobs with step-improving processing times and a common critical date on a single machine is NP-hard in the ordinary sense.
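The reduction itself is mechanical; a minimal sketch (the function name is ours):

```python
def partition_to_scheduling(p):
    # The reduction of Section 3: each integer p_j becomes a job with
    # a_j = 2 * p_j and b_j = p_j; the critical date is D = 2P, where
    # p_1 + ... + p_n = 2P.
    total = sum(p)
    assert total % 2 == 0, "a PARTITION instance has even total 2P"
    return [2 * pj for pj in p], list(p), total
```

The PARTITION instance (1, 2, 3), for example, yields jobs a = (2, 4, 6), b = (1, 2, 3) with D = 6, and indeed admits a schedule of makespan 3P = 9 because {3} versus {1, 2} is a partition.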

Recall that the knapsack problem subject to a covering constraint has as its input n items with weights w1, . . . , wn and profits p1, . . . , pn, and a bound P. The goal is to find a subset of the items that has total profit at least P and that has the smallest possible weight. The knapsack problem can be solved in pseudo-polynomial time by dynamic programming with running time O(n Σ_{j=1}^{n} wj) or O(n Σ_{j=1}^{n} pj); see for instance Kellerer et al. [2]. For our knapsack


problems in (2) and (3), this translates into a time complexity of O(n Σ_{j=1}^{n} aj).

Theorem 2. Makespan minimization for jobs with step-improving processing times and a common critical date on a single machine can be solved in O(n Σ_{j=1}^{n} aj) time.
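A profit-indexed version of this dynamic program can be sketched as follows; the function name min_weight_cover is ours, and profits are capped at the bound P so the table has P + 1 entries.

```python
def min_weight_cover(weights, profits, P):
    # dp[q] = minimum total weight of a subset whose total profit,
    # capped at P, equals q; dp[P] is then the optimal covering weight
    # (float("inf") if no subset reaches profit P).
    # Running time O(n * P), matching the O(n * sum of profits) bound.
    INF = float("inf")
    dp = [0] + [INF] * P
    for w, p in zip(weights, profits):
        # q runs downwards so that each item is used at most once
        for q in range(P, -1, -1):
            if dp[q] < INF:
                nq = min(P, q + p)
                dp[nq] = min(dp[nq], dp[q] + w)
    return dp[P]
```

In this notation, problem (3) is min_weight_cover(b, a, D) and problem (2) is min_weight_cover(c, a, a(J) − D + 1): in both cases the "profits" are the values aj appearing in the covering constraint, which is how the O(n Σ aj) bound arises.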

The knapsack problem subject to a covering constraint also possesses a fully polynomial time approximation scheme (FPTAS); see for instance Kellerer et al. [2]. This means that for any ε > 0, there is an approximation algorithm that yields a feasible solution with total weight at most 1 + ε times the optimal weight. The running time of this approximation algorithm is polynomially bounded in the input size and in 1/ε. This yields an FPTAS for our knapsack problems in (2) and (3). In (2), the FPTAS gives us an approximation YA of the optimal solution Y∗ such that c(YA) ≤ (1 + ε)c(Y∗). Consequently, the corresponding approximate makespan D + c(YA) is at most 1 + ε times the optimal makespan D + c(Y∗). And in (3), the FPTAS gives us an approximation XA of the optimal solution X∗ with b(XA) ≤ (1 + ε)b(X∗). Consequently, the corresponding approximate makespan c(J) + b(XA) is at most 1 + ε times the optimal makespan c(J) + b(X∗). We summarize these observations in the following theorem.

Theorem 3. Makespan minimization for jobs with step-improving processing times and a common critical date on a single machine possesses an FPTAS.

If we apply other fast knapsack approximation algorithms to problems (2) and (3), we get corresponding approximation algorithms with corresponding approximation guarantees for our scheduling problem in a straightforward way.

    4. Results on the on-line version

In the on-line version of our scheduling problem, the jobs J1, . . . , Jn are revealed one by one. As soon as the on-line algorithm learns the values aj and bj for job Jj, it must assign the job to an appropriate time interval; this decision is irrevocable and must not depend on later arriving jobs. We consider an extremely simple on-line algorithm ON that schedules the jobs in their given ordering J1, . . . , Jn and without introducing unnecessary idle time: algorithm ON schedules every new job Jj after all the jobs J1, . . . , Jj−1, so that the completion time of Jj becomes as small as possible.
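Under the processing-time model used in this note (aj before the critical date, aj − bj from D onwards), algorithm ON admits a few-line sketch; the function name is ours.

```python
def online_makespan(jobs, D):
    # jobs is the sequence of pairs (a_j, b_j), revealed one by one.
    # ON appends each job after all previously scheduled jobs and picks
    # the start time that minimizes this job's own completion time:
    # either start immediately, or idle until the critical date D and
    # run at the improved processing time a_j - b_j.
    t = 0  # completion time of the schedule built so far
    for a, b in jobs:
        if t >= D:
            t += a - b                   # started late: improved time
        else:
            t = min(t + a, D + (a - b))  # start now vs. wait until D
    return t
```

For a single job with a = 10, b = 9 and D = 5, ON waits until the critical date and completes at time 6, which is also optimal here.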

For analyzing algorithm ON, let k be the unique index with Σ_{j=1}^{k−1} aj < D ≤ Σ_{j=1}^{k} aj.


    to an interval [x; x +D] with x

