
Computers ind. Engng Vol. 14, No. 4, pp. 387-393, 1988 0360-8352/88 $3.00+0.00 Printed in Great Britain. All rights reserved Copyright © 1988 Pergamon Press plc

SINGLE FACILITY SCHEDULING WITH NONLINEAR PROCESSING TIMES

JATINDER N. D. GUPTA¹ and SUSHIL K. GUPTA²

¹Department of Management Science, Ball State University, Muncie, IN 47303 and ²Department of Decision Sciences, Florida International University, Miami, FL 33199, U.S.A.

(Received for publication 10 December 1987)

Abstract--This paper considers the static single facility scheduling problem in which the processing times of jobs are a monotonically increasing function of their starting (waiting) times and the objective is to minimize the total elapsed time (called the makespan) in which all jobs complete their processing. Based on a combinatorial analysis of the problem, an exact optimization algorithm is developed for the general processing time function and then specialized for the linear case. In view of the excessive computational burden of the exact optimization algorithm for nonlinear processing time functions, heuristic algorithms are proposed. The proposed algorithms are evaluated empirically; the results indicate that the heuristic algorithms yield optimal or near-optimal schedules in many cases.

INTRODUCTION

Consider the following static single facility problem: a set of n independent, single-operation jobs is ready for processing at time zero on a single machine. Neither job splitting nor machine idleness is allowed. The processing time of each job depends on its starting (or waiting) time in the sequence. It is desired to find the processing order (schedule) which minimizes the makespan, defined as the total elapsed time in which all jobs complete their processing. Such situations often occur in chemical and metallurgical processes. For example, in steel rolling mills, ingots are heated to the required temperature before rolling. In this case, the furnace is the single facility and the ingots are the independent jobs to be processed (heated). Heating time depends upon the ingot's current temperature, which depends upon the time it has been waiting. During the waiting period, the ingot cools down and thus requires more heating time in the furnace. It is desired to minimize the total time spent by all available ingots in the heating shop.

For each job i, let P_i and t_i be its processing and starting times, respectively. Since the processing time of job i depends on its starting time, the relationship between P_i and t_i can be represented as:

P_i = f(t_i).  (1)

The exact form of the function f in equation (1) depends upon the specific production process under consideration. Mathematically, the linear, the quadratic, and the general forms are expressed as follows:

P_i = a_i + b_i t_i  (linear)  (2)

P_i = a_i + b_i t_i + c_i t_i^2  (quadratic)  (3)

P_i = a_i + b_i t_i + c_i t_i^2 + ... + m_i t_i^m  (general)  (4)

where a_i, b_i, c_i, ..., m_i are non-negative constants. As stated before, the objective is to find the schedule (or order) S = ([1], [2], ..., [K], ..., [n]), where [K] is the job in the Kth position, that minimizes the makespan T(S), computed as follows:

T(S) = Σ_{i=1}^{n} P_[i].  (5)
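To make equations (3) and (5) concrete, the following Python sketch (not part of the original paper; the function name and data layout are illustrative) computes the makespan of a given schedule when each job's processing time is the quadratic function of its starting time:

def makespan(schedule, a, b, c):
    """Makespan T(S) under quadratic start-time-dependent processing times."""
    t = 0.0                                    # start time of the next job
    for i in schedule:
        p = a[i] + b[i] * t + c[i] * t ** 2    # P_i = a_i + b_i*t_i + c_i*t_i^2
        t += p                                 # job i completes; the next job starts here
    return t                                   # total elapsed time T(S)

# Example with the four-job data used later in Table 1, schedule S = (1, 2, 3, 4):
a = {1: 0.23, 2: 0.27, 3: 0.13, 4: 0.29}
b = {1: 0.80, 2: 0.88, 3: 0.09, 4: 0.60}
c = {1: 0.75, 2: 0.85, 3: 0.35, 4: 0.98}
print(round(makespan([1, 2, 3, 4], a, b, c), 4))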



Minimizing makespan on a single machine is trivial if the processing times of jobs are independent of their starting times, since all schedules then give the same makespan. However, for the scenario presented above, finding a schedule that minimizes makespan is neither trivial nor easy. In fact, the problem appears to be NP-complete, implying that it is very unlikely that polynomially bounded solution techniques will be found for its solution [1]. Branch and bound procedures are not useful since it is very difficult (if not impossible) to develop general expressions for lower bounds on the makespan. Therefore, combinatorial approaches based on sequence dominance conditions are used to describe an exact optimization algorithm for the general processing time function and a specialized version for the linear case. In view of the relative inefficiency of combinatorial approaches, several heuristic approaches are discussed to find approximate solutions to the problem.

COMBINATORIAL APPROACH

Consider two schedules S = PQ and S' = P'Q where the partial schedules P and P' are different permutations of the same subset of jobs. Let the completion time of partial schedule P be represented as T(P). The developments in flowshop scheduling [2, 3] lead to the following result.

Theorem 1. For two partial schedules P and P' that are different permutations of the same subset of jobs, T(P) ≤ T(P') implies that T(PQ) ≤ T(P'Q).

The condition in Theorem 1 above is both necessary and sufficient for developing optimization algorithms for the general case considered here.

Combinatorial algorithm for the general case

Based on Gupta's [4] and Schild and Fredman's [5] algorithms for scheduling problems with deferral costs, the combinatorial algorithm for the general case of the present problem can be described as follows:

(1) Start the list with n partial schedules each containing only one job. Compute their completion times.

(2) For each of these partial schedules, generate new partial schedules by augmenting each of the unscheduled jobs to the end of the partial schedule. Compute the completion times of all partial schedules thus obtained.

(3) Group the partial schedules such that the partial schedules that are different permutations of the same subset of jobs belong to the same group. In each group of such partial schedules, retain the one with minimum completion time.

(4) Repeat steps 2 and 3 until a complete schedule containing n jobs is retained in step 3. This is an optimal schedule since it minimizes makespan.
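Steps (1)-(4) can be read as a dynamic program over subsets of jobs: for every subset, only the permutation with the smallest completion time is retained. The Python sketch below (not from the paper; names are illustrative) implements this for the quadratic processing time function of equation (3); with the data of Table 1 below it should recover the schedule 1-4-3-2 reported in Table 2, with the printed makespan differing from 3.2844 only because the paper rounds intermediate values.

def quad_time(j, t, a, b, c):
    """Processing time of job j when it starts at time t (equation (3))."""
    return a[j] + b[j] * t + c[j] * t ** 2

def combinatorial_schedule(jobs, a, b, c):
    """Exact minimum-makespan schedule via subset dominance (steps 1-4)."""
    # best[subset] = (completion time, retained partial schedule)
    best = {frozenset([j]): (quad_time(j, 0.0, a, b, c), (j,)) for j in jobs}
    for _ in range(len(jobs) - 1):
        new_best = {}
        for subset, (t, seq) in best.items():
            for j in jobs:
                if j in subset:
                    continue
                t_new = t + quad_time(j, t, a, b, c)     # augment j at the end (step 2)
                key = subset | {j}
                if key not in new_best or t_new < new_best[key][0]:
                    new_best[key] = (t_new, seq + (j,))  # keep the dominant permutation (step 3)
        best = new_best
    return best[frozenset(jobs)]                         # (makespan, optimal schedule)

# Data of Table 1
a = {1: 0.23, 2: 0.27, 3: 0.13, 4: 0.29}
b = {1: 0.80, 2: 0.88, 3: 0.09, 4: 0.60}
c = {1: 0.75, 2: 0.85, 3: 0.35, 4: 0.98}
T, S = combinatorial_schedule([1, 2, 3, 4], a, b, c)
print(S, round(T, 4))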

As an illustration of the above proposed combinatorial algorithm, consider the four-job problem where the processing time of a job is a quadratic function of its starting time, as in equation (3) above. Table 1 below depicts the values of various parameters for each of the four jobs.

The steps of the combinatorial algorithm are best performed in a tabular fashion as shown in Table 2. To start the process, each job is considered at the first sequence position as shown in Table 2 under the column Iteration 1. Then, pairs are formed and only the partial schedules with

Table 1. A four-job example problem

Job (i)    a_i     b_i     c_i

1          0.23    0.80    0.75
2          0.27    0.88    0.85
3          0.13    0.09    0.35
4          0.29    0.60    0.98

Table 2. Illustration of the combinatorial algorithm

Iteration 1        Iteration 2          Iteration 3            Iteration 4
P    T(P)          P     T(P)           P      T(P)            P         T(P)
1    0.23          1-2   0.7473*        1-2-3  1.1395*         1-2-3-4   3.3856
2    0.27          1-3   0.399*         1-2-4  2.0330          1-4-2-3   3.7852
3    0.13          1-4   0.709*         1-3-2  1.1554          1-4-3-2   3.2844
4    0.29          2-1   0.770          1-3-4  1.0844          2-3-4-1   3.4988
                   2-3   0.4498*        1-4-2  2.0301*
                   2-4   0.7934*        1-4-3  1.0788*
                   3-1   0.4766         2-3-1  1.1919
                   3-2   0.5280         2-3-4  1.2085*
                   3-4   0.5145         2-4-1  2.1290
                   4-1   0.8150         2-4-3  1.2145
                   4-2   0.8867         4-3-1  1.2567
                   4-3   0.4755*        4-3-2  1.3574

*This shows the partial schedule retained.


least completion times are retained as shown in the column labelled Iteration 2. This process is continued until complete schedules are obtained as shown in Table 2.

From the calculations in Table 2 above, it follows that the optimal schedule for the example problem of Table 1 is 1-4-3-2 with a makespan of 3.2844.

Linear processing time function case

The processing time of any job i, as a linear function of its starting (or waiting) time, is given by equation (2). For each job i, define h_i = a_i/b_i.

Theorem 2. For the linear case, the schedule obtained by arranging jobs in ascending order of the h_i values is optimal.

Proof. Consider two schedules S = PijQ and S' = PjiQ. It is obvious that:

T(Pij) = T(P) + a_i + b_i T(P) + a_j + b_j{T(P) + a_i + b_i T(P)}.

Similarly:

T(Pji) = T(P) + a_j + b_j T(P) + a_i + b_i{T(P) + a_j + b_j T(P)}.

From these two equations, it follows that T(Pij) ≤ T(Pji) if b_j a_i ≤ b_i a_j. Thus a_i/b_i ≤ a_j/b_j (or h_i ≤ h_j) implies that T(Pij) ≤ T(Pji). Using Theorem 1 then shows that T(S) ≤ T(S'). If P is empty, it follows that a job with least h_i will be the first job in the sequence since it dominates all other jobs. When repeated for the other sequence positions, this argument shows that the schedule obtained by arranging jobs in ascending order of the ratio h_i minimizes makespan.

The above theorem also shows that the linear processing time case can be solved in polynomial time, since the computational effort involved is of the order O(n log n).
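A minimal sketch of the resulting O(n log n) rule for the linear case follows (illustrative names, assuming every b_i > 0):

def linear_optimal_schedule(a, b):
    """Ascending order of h_i = a_i / b_i; optimal for P_i = a_i + b_i * t_i."""
    return sorted(a, key=lambda i: a[i] / b[i])

def linear_makespan(schedule, a, b):
    t = 0.0
    for i in schedule:
        t += a[i] + b[i] * t       # job i starts at t and takes a_i + b_i * t
    return t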

HEURISTIC ALGORITHMS

In view of the excessive computational burden of the combinatorial approach, it is desirable to seek heuristic algorithms for this problem. Two heuristic approaches are explored here, namely: static and dynamic heuristic algorithms. The main difference between the two algorithms is the manner in which the relative priority of a job is calculated. In the dynamic heuristic algorithm, job priority is calculated dynamically and hence it may change depending on the sequence position being considered whereas in the static case it does not change.

Static heuristic algorithm

Since the linear case can be optimized by arranging jobs in ascending order of the ratios a_i/b_i, it is natural to extend this approach to nonlinear processing times. To do so, for each job i, define:

h_i(a) = a_i
h_i(b) = a_i/b_i
h_i(c) = a_i/c_i
...
h_i(m) = a_i/m_i

Then the static heuristic algorithm has the following form:

(1) For each job i, calculate h_i(a), h_i(b), h_i(c), ..., h_i(m) using the above equations.
(2) Generate m schedules by arranging jobs in ascending order of the h_i(x) values, x = a, b, c, ..., m. Break ties by using x = b or c for a, x = a or c for b, and x = a or b for c, etc.
(3) Find the makespan of the schedules so obtained and accept the one with minimum makespan as an approximate solution.
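A sketch of the static heuristic for the quadratic case appears below (an illustrative implementation, not the authors' code; tie-breaking is omitted for brevity). With the data of Table 1 it should return schedule 1-2-4-3, matching Table 4 below.

def quadratic_makespan(schedule, a, b, c):
    t = 0.0
    for i in schedule:
        t += a[i] + b[i] * t + c[i] * t ** 2
    return t

def static_heuristic(jobs, a, b, c):
    """Best of the orderings by h_i(a) = a_i, h_i(b) = a_i/b_i and h_i(c) = a_i/c_i."""
    rules = {
        'a': lambda i: a[i],
        'b': lambda i: a[i] / b[i],
        'c': lambda i: a[i] / c[i],
    }
    best = None
    for h in rules.values():
        schedule = sorted(jobs, key=h)                 # ascending order of h_i(x), step (2)
        T = quadratic_makespan(schedule, a, b, c)      # step (3)
        if best is None or T < best[0]:
            best = (T, schedule)
    return best                                        # (approximate makespan, schedule)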


Table 3. h_i(x) for the example problem

Job (i)    h_i(a)    h_i(b)    h_i(c)

1          0.23      0.29      0.31
2          0.27      0.31      0.32
3          0.13      1.45      0.37
4          0.29      0.48      0.30

Table 4. Schedules obtained by static heuristic algorithm

                        Processing times for jobs in positions
Rule      Schedule      1        2         3         4        Makespan

h_i(a)    3-1-2-4     0.13     0.3466    0.8825    2.9150      4.274
h_i(b)    1-2-4-3     0.23     0.5174    1.2858    1.7598      3.793
h_i(c)    4-1-2-3     0.29     0.5250    1.5518    2.3039      4.670

As an illustration of the static heuristic algorithm, consider the four-job problem of Table 1 above. The ratios h_i(x) are given in Table 3.

The schedules obtained by arranging jobs in ascending order of these ratios in Table 3 are given in Table 4.

Schedule 1-2-4-3 is the best schedule with a makespan of 3.793.

Dynamic heuristic algorithm

Partition all jobs into two mutually exclusive subsets P and Q. The jobs in subset P have already been scheduled. Let the completion time of the last job in P be T(P). For each job i ∈ Q, find the ratio H_i(x), x = a, b, c, ..., m. For the quadratic processing time function given in equation (3), the H_i(x) are defined as follows:

H_i(a) = a_i/{b_i + c_i T(P)}
H_i(b) = {a_i + b_i T(P) + c_i T(P)^2}/{b_i + 2 c_i T(P)}
H_i(c) = a_i + b_i T(P) + c_i T(P)^2

Similar expressions for Hi(x) can be developed for the general non-linear processing time function given in equation (4).

The dynamic and static heuristic algorithms follow the same steps except that, in the dynamic case, step 2 is implemented as follows:

(i) Find job j with minimum H_j(x), j ∈ Q.
(ii) Append job j to sequence P and find T(Pj).
(iii) Let P = Pj, Q = Q − {j}, and T(P) = T(Pj).
(iv) If Q is empty, go to step (v); otherwise, find H_j(x) for each j ∈ Q and go to step (i).
(v) P represents the heuristic schedule and T(P) is the makespan.
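The dynamic counterpart can be sketched as follows for the quadratic case (illustrative, not the authors' code); the priority H_j(x) is recomputed from the current completion time T(P) before every selection, as in steps (i)-(iv):

def dynamic_heuristic(jobs, a, b, c, rule='b'):
    """Repeatedly append the unscheduled job with minimum H_j(x), x in {a, b, c}."""
    def H(i, t):
        if rule == 'a':
            return a[i] / (b[i] + c[i] * t)
        if rule == 'b':
            return (a[i] + b[i] * t + c[i] * t ** 2) / (b[i] + 2 * c[i] * t)
        return a[i] + b[i] * t + c[i] * t ** 2           # rule 'c'

    P, Q, t = [], set(jobs), 0.0
    while Q:
        j = min(Q, key=lambda i: H(i, t))                # step (i)
        t += a[j] + b[j] * t + c[j] * t ** 2             # steps (ii)-(iii): T(Pj)
        P.append(j)
        Q.remove(j)
    return P, t                                          # heuristic schedule and its makespan

Running the three rules and keeping the best schedule should reproduce Table 5: for the example of Table 1, rules a and b lead to 1-2-4-3 and rule c to 3-1-4-2.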

To illustrate the dynamic heuristic algorithm, consider the example problem of Table 1 again. The calculations are shown in Table 5 below and result in the same solution as the static heuristic algorithm.

Improving the solution

The above heuristic algorithms were found to be rather unpredictable in finding an optimal (or close to optimal) makespan schedule. Therefore, the following neighborhood search technique was used to improve their effectiveness:

(0) Let i = 1 and j = 2.
(1) Let the best schedule generated by the heuristic algorithm be S = ([1], [2], ..., [k], ..., [n]), where [k] is the job in the kth position. Let T(S) be the makespan for S.

Table 5. Schedules obtained by dynamic heuristic algorithm

Schedule for x = a
T(P)      H_1(a)   H_2(a)   H_3(a)   H_4(a)   Job selected   Processing time   Schedule   T(P)
0.0000    0.29     0.310    1.450    0.480         1             0.2300          1         0.2300
0.2300     *       0.251    0.762    0.3513        2             0.5173          1-2       0.7473
0.7473     *        *       0.3698   0.2176        4             1.2858          1-2-4     2.0331
2.0331     *        *       0.1622    *            3             1.7598          1-2-4-3   3.7929

Schedule for x = b
T(P)      H_1(b)   H_2(b)   H_3(b)   H_4(b)   Job selected   Processing time   Schedule   T(P)
0.0000    0.29     0.310    1.450    0.480         1             0.2300          1         0.2300
0.2300     *       0.407    0.674    0.4566        2             0.5173          1-2       0.7473
0.7473     *        *       0.640    0.6227        4             1.2858          1-2-4     2.0331
2.0331     *        *       1.163     *            3             1.7598          1-2-4-3   3.7929

Schedule for x = c
T(P)      H_1(c)   H_2(c)   H_3(c)   H_4(c)   Job selected   Processing time   Schedule   T(P)
0.000     0.230    0.270    0.13     0.290         3             0.130           3         0.130
0.130     0.347    0.399     *       0.385         1             0.347           3-1       0.477
0.477      *       0.883     *       0.799         4             0.799           3-1-4     1.276
1.276      *       2.775     *        *            2             2.775           3-1-4-2   4.051

*Not necessary to find these values.

Table 6. Comparative evaluation of the proposed heuristic algorithms

          Static heuristic algorithm           Dynamic heuristic algorithm
n    # OPT    MIN     AVR      MAX        # OPT    MIN     AVR       MAX
4     17      0.0     0.026    0.155       14      0.0     0.0196     0.153
5      9      0.0     0.112    1.440        7      0.0     0.1534     1.534
6      7      0.0     0.302    0.951        6      0.0     0.3712     2.952
7      2      0.0     1.444    5.618        3      0.0    12.0285   134.540

Table 7. Experimental results of using neighborhood search technique

          Single pass                          Double pass
n    # OPT    MIN     AVR      MAX        # OPT    MIN     AVR      MAX
4     18      0.0     0.003    0.048       19      0.0     0.002    0.048
5     18      0.0     0.004    0.050       20      0.0     0.000    0.000
6     17      0.0     0.016    0.230       19      0.0     0.000    0.004
7     10      0.0     0.385    5.618       15      0.0     0.324    5.618


(2) Let T(S') be the makespan of the schedule obtained by interchanging [i] and [j]. If T(S') < T(S), let S = S', i = j and j = j + 1. If j < n, return to step 1; otherwise enter step 3.
(3) Set i = i + 1. If i < n, set j = i + 1 and return to step 1; otherwise enter step 4.
(4) Accept schedule S with makespan T(S) as an approximate solution.

The schedule S may still be improved by applying the neighborhood search once more (i.e. repeating steps 1 through 3 above with the schedule obtained at step 4 as the seed).
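A minimal sketch of one pass of this pairwise interchange search follows (a straightforward first-improvement reading of steps (0)-(4); the exact index bookkeeping of the printed steps may differ slightly):

def improve_by_interchange(schedule, makespan_fn):
    """One pass of pairwise-interchange neighborhood search over a seed schedule."""
    S = list(schedule)
    best_T = makespan_fn(S)
    for i in range(len(S) - 1):
        for j in range(i + 1, len(S)):
            S2 = S[:]
            S2[i], S2[j] = S2[j], S2[i]      # interchange the jobs in positions i and j
            T2 = makespan_fn(S2)
            if T2 < best_T:                  # keep an improving interchange
                S, best_T = S2, T2
    return S, best_T

# Double pass: reapply the procedure to the schedule from the first pass, e.g. with
#   f = lambda s: quadratic_makespan(s, a, b, c)   # makespan function from the static sketch
#   S1, T1 = improve_by_interchange(seed_schedule, f)
#   S2, T2 = improve_by_interchange(S1, f)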

COMPUTATIONAL EXPERIENCE

To test the effectiveness of the proposed heuristic algorithms in finding optimal or near-optimal schedules, the proposed optimization algorithm, both heuristic algorithms and the neighborhood search technique were used to solve problems varying in size from 4 to 7 jobs. A set of 20 problems was solved for each problem size. The a_i, b_i, and c_i values were generated from a uniform distribution in the range [0, 1]. For each problem, the effectiveness of the kth heuristic algorithm, q_k, is defined as:

q_k = (P_k − P_0)/P_0

where P_k and P_0 are the kth heuristic and optimal solutions, respectively. Table 6 depicts the number of optimal solutions (# OPT) and the minimum (MIN), average (AVR) and maximum (MAX) values of q_k for each problem size and for each algorithm.

The difference in the effectiveness of the two heuristic algorithms is not appreciable. In fact, the static heuristic algorithm seems to be more effective than the dynamic one, especially for larger problems. Moreover, the static rules require less computational effort than the dynamic heuristic algorithm. Therefore, further experimentation was done with the static heuristic algorithm. The neighborhood search technique was used to improve the solution obtained by the static heuristic algorithm. Table 7 shows the results of these experiments with a single and a double pass of the neighborhood search procedure.

The results in Table 7 above show that the neighborhood search technique improves the quality of the heuristic solution considerably. However, the computational effort is much greater than that of the static heuristic algorithm alone. The tradeoff depends on other managerial considerations and the value assigned to a reduction in makespan.

CONCLUSIONS

This paper has described exact and heuristic algorithms for finding a minimum makespan schedule for a single facility scheduling problem when the processing times of jobs are nonlinear functions of their starting times. While the problem appears to be in the NP-complete category, the proposed algorithms do provide a practical way to approach the problem. Further refinements and improvements may be possible to provide even more efficient and effective solution procedures. Nevertheless, the suggested approaches are useful in finding practical and workable schedules for these difficult problems. Even if branch and bound approaches to solve these problems become available in the future, the proposed heuristic algorithms can be used to find an initial solution to act as an upper bound to curtail the domain of search.

REFERENCES

1. M. R. Garey and D. S. Johnson. Computers and Intractability: A Guide to the Theory of NP-Completeness. Freeman, San Francisco (1979).

2. J. N. D. Gupta. An improved combinatorial algorithm for the flowshop scheduling problem. Opns Res. 20, 1753-1758 (1971).

3. J. N. D. Gupta. A review of flowshop scheduling research. Disaggregation: Problems in Manufacturing and Service Organizations (Edited by L. P. Ritzman et al.) Martinus Nijhoff, The Hague (1979).

4. J. N. D. Gupta. Optimal scheduling in a multi-stage flowshop. AIIE Trans. 4, 238-243 (1972).
5. A. Schild and I. J. Fredman. Scheduling tasks with deadlines and nonlinear loss functions. Mgmt Sci. 9, 73-81 (1962).