
Computers ind. Engng Vol. 22, No. 4, pp. 495-499, 1992. 0360-8352/92 $5.00 + 0.00. Printed in Great Britain. All rights reserved. Copyright © 1992 Pergamon Press Ltd

EFFICIENT IMPLEMENTATION OF JOHNSON'S RULE FOR THE n/2/F/Fmax SCHEDULING PROBLEM

T. C. E. CHENG, Department of Actuarial and Management Sciences, University of Manitoba, Winnipeg, Manitoba,

Canada R3T 2N2

(Received for publication 29 January 1992)

Abstract--Several efficient algorithms, of O(n log n) computational complexity, for Johnson's rule to schedule a set of simultaneously available jobs on two machines in a flowshop to minimize the maximum job flow time have appeared in the literature. A modified version of one of these algorithms is presented which not only simplifies the programming effort for implementation but is also able to generate all possible optimal sequences obtainable from Johnson's rule.

INTRODUCTION

The formulation of the n/2/F/Fmax scheduling problem is as follows. A set N = {1, 2, ..., n} of simultaneously available jobs is given to be processed on a set M = {A, B} of machines. Each job has two operations, which must be processed in the following manner: (i) the first operation must be processed on machine A and the second operation on machine B, and (ii) the second operation cannot begin until the first operation is finished. Job k requires an amount of processing time a_k on machine A and processing time b_k on machine B, for all k ∈ N. Let σ: N → N be a permutation of the jobs and Π be the set of all possible permutations of the jobs, of which there are n!. Given any σ, let F_k(σ) denote the flow time of job k in σ. The problem is to find an optimal σ* ∈ Π such that the maximum job flow time, F_max(σ) (i.e. the makespan), is minimized. That is

F_max(σ*) = max_{k∈N} {F_k(σ*)} = min_{σ∈Π} max_{k∈N} {F_k(σ)}.
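The makespan of any given sequence follows a simple recurrence: machine A processes the jobs back to back, while each job's completion on machine B is delayed until both its own A-operation and B's previous job are finished. A minimal sketch (the function name is ours, not the paper's):

```python
def makespan(seq, a, b):
    """Two-machine flow-shop makespan F_max of job sequence `seq`.

    `a[k]` and `b[k]` are the processing times of job k on machines A and B.
    """
    t_a = 0  # completion time of the last job so far on machine A
    t_b = 0  # completion time of the last job so far on machine B
    for k in seq:
        t_a += a[k]                 # A works on the jobs without idle time
        t_b = max(t_a, t_b) + b[k]  # B waits for A's output and its own queue
    return t_b
```

The value returned for an optimal permutation is the F_max(σ*) defined above.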

Johnson [1] has presented an elegant (polynomial-time) algorithm for solving this problem. Johnson's algorithm is based on the following sufficient optimality condition for sequencing a pair of jobs, commonly known as Johnson's rule: Johnson's rule: There exists an optimal sequence in which job i precedes job j if

min(a_i, b_j) ≤ min(a_j, b_i).

More recently, Kusiak [2] has presented an efficient algorithm for implementing Johnson's rule. He has shown that the computational complexity of his algorithm is O(n log n). We note that Kusiak's algorithm closely resembles a number of algorithms for the n-job two-machine flowshop problem presented by various authors, including Baker [3, p. 144], Bellman et al. [4, p. 146] and French [5, p. 71]. It is easy to show that the algorithms presented by these authors also have O(n log n) computational complexity.

We present in this paper a modified version of an algorithm discussed in Bellman et al. [4, p. 146]. This algorithm can also be regarded as a realization of the idea of Page [6, p. 487]. Specifically, the modification takes the form of augmenting the evaluation of the job priority indices with a small quantity. Such a modification enables generation of all possible optimal sequences obtainable from Johnson's rule, as will be shown in the sequel. In addition, this modified algorithm greatly simplifies the programming effort required to implement the algorithm on a computer. We remark that, to the best of our knowledge, the problem of generating all optimal sequences for the n/2/F/Fmax scheduling problem is open, i.e. we do not know whether it is NP-hard or admits a polynomial-time solution algorithm.



Algorithm_J

Step 1. For each job k, calculate a priority index p_k expressed as

p_k = sgn(a_k − b_k − ε) / min(a_k, b_k)

where sgn(·) is the signum function and ε is an infinitesimal quantity added to help generate all optimal sequences obtainable from Johnson's rule. Note that ε can be any non-zero real number whose magnitude is smaller than every non-zero |a_k − b_k|, so that its only effect is to decide the sign of p_k when a_k = b_k.

Step 2. Generate a set of optimal job sequences, each of which is formed by arranging the jobs in non-decreasing order of their priority indices. For any σ, let σ(k) denote the job in the k-th position of σ; then an optimal sequence is σ* = (σ(1), σ(2), ..., σ(n)) with p_{σ(1)} ≤ p_{σ(2)} ≤ ... ≤ p_{σ(n)}.
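The two steps above can be sketched in code as follows. This is our illustrative rendering (the paper gives no program); ε is passed as a parameter whose sign decides whether a job with a_k = b_k is sequenced early or late:

```python
def sgn(x):
    """Signum function: -1, 0 or +1 according to the sign of x."""
    return (x > 0) - (x < 0)

def algorithm_j(a, b, eps=1e-9):
    """One optimal sequence from algorithm_J.

    a, b: dicts mapping job -> processing time on machines A and B.
    eps:  small non-zero tie-breaker; eps > 0 puts jobs with a[k] == b[k]
          early in the sequence, eps < 0 puts them late.
    """
    # Step 1: priority index p_k = sgn(a_k - b_k - eps) / min(a_k, b_k)
    p = {k: sgn(a[k] - b[k] - eps) / min(a[k], b[k]) for k in a}
    # Step 2: sort the jobs in non-decreasing order of their priority indices
    return sorted(a, key=lambda k: p[k])
```

With ε > 0 every tied job receives a negative index and joins the front (ascending-a_k) block; with ε < 0 it receives a positive index and joins the back (descending-b_k) block.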

To establish the computational complexity of algorithm_J to generate one optimal sequence, we note that Step 1, which involves elementary algebraic operations, takes O(n) time. Step 2, used to generate a sequence (σ(1), σ(2), ..., σ(n)), is essentially a sorting operation, which runs in O(n log n) time by the most efficient sorting algorithms known to exist; see, for example, Horowitz and Sahni [7] and Wilf [8]. Hence, the computational complexity, O_A, of algorithm_J to produce one optimal sequence is as shown in Table 1.

Correctness of algorithm_J

The correctness of algorithm_J is based on the following proposition, which asserts that Step 2 of algorithm_J, sequencing the jobs in non-decreasing order of their priority indices as defined in Step 1, is indeed in accordance with Johnson's rule.

Proposition: Given a pair of jobs i and j with priority indices p_i and p_j, respectively, as defined in Step 1 of algorithm_J, there exists an optimal sequence in which i precedes j if p_i ≤ p_j.

Proof: For any given pair of jobs i and j, one of the following five cases holds.

Case I: a_i ≠ b_i and a_j ≠ b_j (i.e. no ε is used).

There are three sub-cases to consider.

Case I(i): Both p_i < 0 and p_j < 0. By definition,

p_i < 0 ⇒ a_i < b_i,

p_j < 0 ⇒ a_j < b_j.

Since

p_i ≤ p_j ⇒ −1/a_i ≤ −1/a_j ⇒ a_i ≤ a_j,

it follows that

min(a_i, b_j) ≤ min(a_j, b_i)

and so, according to Johnson's rule, there exists an optimal sequence in which job i precedes job j.

Table 1
Step number    Computational complexity
1              O(n)
2              O(n log n)
               O_A = O(n log n)

Case I(ii): Both p_i > 0 and p_j > 0. By definition,

p_i > 0 ⇒ a_i > b_i,

p_j > 0 ⇒ a_j > b_j.

Since

p_i ≤ p_j ⇒ 1/b_i ≤ 1/b_j ⇒ b_j ≤ b_i,

it follows that

min(a_i, b_j) ≤ min(a_j, b_i)

and so, according to Johnson's rule, there exists an optimal sequence in which job i precedes job j.

Case I(iii): p_i < 0 and p_j > 0. By definition,

p_i < 0 ⇒ a_i < b_i,

p_j > 0 ⇒ a_j > b_j.

In this case, it is easy to see that, regardless of whether a_i ≤ b_j or a_i > b_j,

min(a_i, b_j) ≤ min(a_j, b_i)

and so, according to Johnson's rule, there exists an optimal sequence in which job i precedes job j.

Case II: a_i = b_i and ε > 0

There are three sub-cases to consider. Case II(i): a_j > b_j. This implies p_i < 0 and p_j > 0, so this case reduces to Case I(iii). Case II(ii): a_j = b_j. This implies p_i < 0 and p_j < 0, so this case reduces to Case I(i). Case II(iii): a_j < b_j. This implies p_i < 0 and p_j < 0, so this case reduces to Case I(i).

Case III: a_i = b_i and ε < 0

There are two sub-cases to consider. Case III(i): a_j > b_j. This implies p_i > 0 and p_j > 0, so this case reduces to Case I(ii). Case III(ii): a_j = b_j. This implies p_i > 0 and p_j > 0, so this case reduces to Case I(ii).

Case IV: a_j = b_j and ε > 0

There are two sub-cases to consider. Case IV(i): a_i = b_i. This implies p_i < 0 and p_j < 0, so this case reduces to Case I(i). Case IV(ii): a_i < b_i. This implies p_i < 0 and p_j < 0, so this case reduces to Case I(i).

Case V: a_j = b_j and ε < 0

There are three sub-cases to consider. Case V(i): a_i > b_i. This implies p_i > 0 and p_j > 0, so this case reduces to Case I(ii). Case V(ii): a_i = b_i. This implies p_i > 0 and p_j > 0, so this case reduces to Case I(ii). Case V(iii): a_i < b_i. This implies p_i < 0 and p_j > 0, so this case reduces to Case I(iii).

We have shown that, for any given pair of jobs i and j, if p_i ≤ p_j then i precedes j under all possible situations. So algorithm_J is correct and the proof is complete.
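As an empirical complement to the proof (our check, not part of the paper), one can verify on small random instances that the sequence produced by the priority-index rule always attains the minimum makespan found by exhaustive enumeration:

```python
import itertools
import random

def sgn(x):
    return (x > 0) - (x < 0)

def makespan(seq, a, b):
    # Two-machine flow-shop makespan of job sequence `seq`
    t_a = t_b = 0
    for k in seq:
        t_a += a[k]
        t_b = max(t_a, t_b) + b[k]
    return t_b

def algorithm_j(a, b, eps=1e-9):
    # Priority index of Step 1, then the sort of Step 2
    p = {k: sgn(a[k] - b[k] - eps) / min(a[k], b[k]) for k in a}
    return sorted(a, key=lambda k: p[k])

random.seed(0)
for _ in range(200):
    n = random.randint(2, 6)
    a = {k: random.randint(1, 9) for k in range(1, n + 1)}
    b = {k: random.randint(1, 9) for k in range(1, n + 1)}
    best = min(makespan(list(s), a, b) for s in itertools.permutations(a))
    # The priority-index sequence is never worse than the exhaustive optimum
    assert makespan(algorithm_j(a, b), a, b) == best
```

With integer processing times, ε = 10^-9 is smaller than every non-zero |a_k − b_k|, so the sign of each index is unaffected except on ties, exactly as intended.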

Example 1

To demonstrate the working procedures of algorithm_J, we apply it to solve the 7/2/F/Fmax scheduling problem presented in Kusiak [2]. The problem data and solutions are as shown in Table 2.

Following Step 2 of algorithm_J, we obtain two optimal sequences (4, 2, 6, 7, 1, 3, 5) and (4, 2, 6, 7, 3, 1, 5), both of which yield F_max = 36. It is interesting to note that algorithm_J is

498 T . C . E . Cr~r~G

Table 2
Job k    a_k    b_k    p_k
1        6      3      1/3
2        2      9      −1/2
3        4      3      1/3
4        1      8      −1
5        7      1      1
6        4      5      −1/4
7        7      6      1/6

Table 3
Job k    a_k    b_k    p_k (ε < 0)    p_k (ε > 0)
1        7      7      1/7            −1/7
2        4      4      1/4            −1/4
3        9      2      1/2            1/2
4        10     5      1/5            1/5
5        8      10     −1/8           −1/8
6        10     6      1/6            1/6
7        4      3      1/3            1/3
8        9      10     −1/9           −1/9

computationally more efficient than both Johnson's and Kusiak's algorithms in terms of obtaining alternative optimal sequences. This is because the entire Johnson's or Kusiak's algorithm must be repeated to obtain alternative optimal solutions, while doing so using algorithm_J only requires repeating Step 2.
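The two sequences of Example 1 are easy to check mechanically; a sketch using the Table 2 job data (`makespan` is our helper, not the paper's):

```python
def makespan(seq, a, b):
    # Two-machine flow-shop makespan of job sequence `seq`
    t_a = t_b = 0
    for k in seq:
        t_a += a[k]                 # machine A works without idle time
        t_b = max(t_a, t_b) + b[k]  # machine B waits for A and for itself
    return t_b

# Job data of Table 2 (Kusiak's 7-job example)
a = {1: 6, 2: 2, 3: 4, 4: 1, 5: 7, 6: 4, 7: 7}
b = {1: 3, 2: 9, 3: 3, 4: 8, 5: 1, 6: 5, 7: 6}

print(makespan([4, 2, 6, 7, 1, 3, 5], a, b))  # 36
print(makespan([4, 2, 6, 7, 3, 1, 5], a, b))  # 36
```

Jobs 1 and 3 share the priority index 1/3, which is why they may be interchanged without affecting the makespan.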

Example 2

We now solve an 8/2/F/Fmax problem to demonstrate how our proposed change in the algorithm of Bellman et al. [4] (i.e. introducing ε in the evaluation of the job priority indices) can help to generate all possible optimal sequences obtainable from Johnson's rule. The problem data and solutions are as shown in Table 3.

The set of optimal sequences is easily determined from Step 2 of algorithm_J as containing (2, 1, 5, 8, 6, 4, 7, 3), (2, 5, 8, 1, 6, 4, 7, 3), (1, 5, 8, 6, 4, 2, 7, 3) and (5, 8, 1, 6, 4, 2, 7, 3), and the minimum F_max is 63. It is also easy to verify that applying Johnson's rule will generate this same set of optimal sequences.
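Under our reading of the paper's counting argument, the sign of ε is chosen independently for each tied job, and each choice yields one optimal sequence. A sketch with the Table 3 data (the per-job ε dictionary is our construction):

```python
import itertools

def sgn(x):
    return (x > 0) - (x < 0)

def makespan(seq, a, b):
    # Two-machine flow-shop makespan of job sequence `seq`
    t_a = t_b = 0
    for k in seq:
        t_a += a[k]
        t_b = max(t_a, t_b) + b[k]
    return t_b

# Job data of Table 3 (Example 2)
a = {1: 7, 2: 4, 3: 9, 4: 10, 5: 8, 6: 10, 7: 4, 8: 9}
b = {1: 7, 2: 4, 3: 2, 4: 5, 5: 10, 6: 6, 7: 3, 8: 10}

ties = [k for k in a if a[k] == b[k]]  # jobs 1 and 2
sequences = set()
for signs in itertools.product((1e-9, -1e-9), repeat=len(ties)):
    eps = dict(zip(ties, signs))  # one sign of epsilon per tied job
    p = {k: sgn(a[k] - b[k] - eps.get(k, 0.0)) / min(a[k], b[k]) for k in a}
    sequences.add(tuple(sorted(a, key=lambda k: p[k])))

print(sorted(sequences))                            # the four optimal sequences
print({makespan(s, a, b) for s in sequences})       # {63}
```

The 2^2 = 4 sign choices reproduce exactly the four sequences listed above, all with makespan 63.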

DISCUSSION

It should be noted that the use of ε is only needed when a job k has equal processing times on the two machines, i.e. a_k = b_k. Under such a circumstance, according to Johnson's rule, job k may be sequenced first or last, depending on whether a_k or b_k is regarded as the smaller among the jobs yet to be sequenced. The use of ε here is to help prioritize the jobs, which enables Step 2 of algorithm_J to generate alternative optimal sequences. For instance, while no ε is needed in Example 1, jobs 1 and 2 in Example 2 require ε because a_1 = b_1 = 7 and a_2 = b_2 = 4. It is clear that in this example the positions of jobs 1 and 2 in an optimal sequence are determined by their calculated priority indices, which in turn depend on whether ε > 0 or ε < 0 is used in the priority evaluation process.

Finally, since a new sequence is generated by algorithm_J each time the sign of ε is changed for a job, the total number of optimal sequences for a given problem is 2^{n(ε)}, where n(ε) is the number of jobs for which ε is used, provided that all resultant priority indices are different. In addition, if some of the priority indices are equal, the number of alternative sequences will increase factorially with the number of equal priority indices, as dictated by Step 2 of algorithm_J.

Acknowledgement--This research was supported in part by the Natural Sciences and Engineering Research Council of Canada under Grant OPCJ~36424.


REFERENCES

1. S. M. Johnson. Optimal two- and three-stage production schedules with setup times included. Naval Res. Log. Q. 1, 61-68 (1954).
2. A. Kusiak. Efficient implementation of Johnson's scheduling algorithm. IIE Trans. 18, 215-216 (1986).
3. K. R. Baker. Introduction to Sequencing and Scheduling. Wiley, New York (1974).
4. R. Bellman, A. O. Esogbue and I. Nabeshima. Mathematical Aspects of Scheduling and Applications. Pergamon Press, Oxford (1983).
5. S. French. Sequencing and Scheduling: An Introduction to the Mathematics of the Job-shop. Ellis Horwood, Chichester (1982).
6. E. S. Page. An approach to the scheduling of jobs on machines. J. R. Statist. Soc. 23, 484-492 (1961).
7. E. Horowitz and S. Sahni. Fundamentals of Data Structures. Computer Science Press, Rockville (1983).
8. H. S. Wilf. Algorithms and Complexity. Prentice-Hall, New Jersey (1986).
