Identical-machines scheduling
Identical-machines scheduling is an optimization problem in computer science and operations research. We are given n jobs J1, J2, ..., Jn of varying processing times, which need to be scheduled on m identical machines, such that a certain objective function is optimized, for example, the makespan is minimized.
Identical machine scheduling is a special case of uniform machine scheduling, which is itself a special case of optimal job scheduling. In the general case, the processing time of each job may be different on different machines; in the case of identical machine scheduling, the processing time of each job is the same on each machine. Therefore, identical machine scheduling is equivalent to multiway number partitioning. A special case of identical machine scheduling is single-machine scheduling.
In the standard three-field notation for optimal job scheduling problems, the identical-machines variant is denoted by P in the first field. For example, "P||$C_{\max}$" is an identical-machines scheduling problem with no constraints, where the goal is to minimize the maximum completion time.
In some variants of the problem, instead of minimizing the maximum completion time, it is desired to minimize the average completion time (averaged over all n jobs); this variant is denoted by P||$\sum C_i$. More generally, when some jobs are more important than others, it may be desired to minimize a weighted average of the completion times, where each job has a different weight. This is denoted by P||$\sum w_i C_i$.
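As a concrete illustration of these three objectives, the following sketch (with made-up job lengths, weights, and an arbitrary fixed assignment; none of the values come from a reference) computes the makespan, the sum of completion times, and the weighted sum of completion times:

```python
# A schedule is a list of machines; each machine is a list of (length, weight)
# pairs in the order the jobs are processed. All values are hypothetical.
schedule = [
    [(3, 1.0), (5, 2.0)],   # machine 1 runs a length-3 job, then a length-5 job
    [(4, 1.0), (2, 0.5)],   # machine 2 runs a length-4 job, then a length-2 job
]

makespan = 0          # C_max: latest completion time over all machines
sum_completion = 0    # sum of C_i over all jobs
weighted_sum = 0      # sum of w_i * C_i over all jobs

for machine in schedule:
    t = 0
    for length, weight in machine:
        t += length             # completion time of this job on its machine
        sum_completion += t
        weighted_sum += weight * t
    makespan = max(makespan, t)

print(makespan, sum_completion, weighted_sum)  # 8, 21, 26 for this instance
```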
Algorithms
= Minimizing average and weighted-average completion time =
Minimizing the average completion time (P||$\sum C_i$) can be done in polynomial time. The SPT algorithm (Shortest Processing Time first) sorts the jobs by length, shortest first, and then assigns each job to the machine with the earliest end time so far. It runs in time O(n log n) and minimizes the average completion time on identical machines, P||$\sum C_i$.
There can be many SPT schedules; finding the SPT schedule with the smallest finish time (also called OMFT – optimal mean finish time) is NP-hard.
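A minimal sketch of the SPT rule in Python (the function name and the example data are illustrative, not taken from a specific reference):

```python
import heapq

def spt_schedule(jobs, m):
    """Shortest Processing Time first: sort the jobs in ascending order and
    repeatedly assign the next job to the machine that becomes free earliest.
    Returns the per-machine job lists and the sum of completion times."""
    machines = [[] for _ in range(m)]
    heap = [(0, i) for i in range(m)]      # (current end time, machine index)
    total_completion = 0
    for p in sorted(jobs):                 # shortest job first
        end, i = heapq.heappop(heap)       # machine with the earliest end time
        end += p                           # the job completes at the new end time
        machines[i].append(p)
        total_completion += end
        heapq.heappush(heap, (end, i))
    return machines, total_completion

# Example: 5 jobs on 2 machines (hypothetical data)
print(spt_schedule([2, 14, 4, 16, 6], m=2))   # ([[2, 6, 16], [4, 14]], 56)
```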
Minimizing the weighted average completion time is NP-hard even on identical machines, by reduction from the knapsack problem. It is NP-hard even if the number of machines is fixed and at least 2, by reduction from the partition problem.
Sahni presented an exponential-time algorithm and a polynomial-time approximation scheme for solving both of these NP-hard problems on identical machines: the optimal average-completion-time problem and the weighted-average-completion-time problem.
= Minimizing the maximum completion time (makespan) =
Minimizing the maximum completion time (P||$C_{\max}$) is NP-hard even for identical machines, by reduction from the partition problem: numbers summing to 2T can be split into two sets of sum T if and only if the corresponding jobs can be scheduled on two machines with makespan T. Many exact and approximation algorithms are known.
Graham proved that:
- Any list scheduling algorithm (an algorithm that processes the jobs in an arbitrary fixed order and schedules each job to the first available machine) is a $2 - 1/m$ approximation for identical machines. The bound is tight for any m. This algorithm runs in time O(n).
- The specific list-scheduling algorithm called Longest Processing Time first (LPT), which sorts the jobs by descending length, is a $4/3 - 1/(3m)$ approximation for identical machines. It is also called greedy number partitioning; a sketch of both rules follows this list.
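A minimal Python sketch of both results: a generic list-scheduling loop, with LPT obtained simply by pre-sorting the jobs (names and data are illustrative assumptions):

```python
def list_schedule(jobs, m):
    """Greedy list scheduling: assign each job, in the given order,
    to the machine with the smallest current load; return the loads."""
    loads = [0] * m
    for p in jobs:
        i = loads.index(min(loads))   # first machine to become available
        loads[i] += p
    return loads

def lpt_schedule(jobs, m):
    """LPT: list scheduling with the jobs sorted by descending length."""
    return list_schedule(sorted(jobs, reverse=True), m)

jobs = [1, 1, 2]                      # hypothetical instance with m = 2
print(max(list_schedule(jobs, 2)))    # 3: the given order hits the 2 - 1/m bound
print(max(lpt_schedule(jobs, 2)))     # 2, which is optimal here
```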
Coffman, Garey and Johnson presented a different algorithm, called the MultiFit algorithm, which uses techniques from bin packing and has an approximation factor of 13/11 ≈ 1.182.
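A rough sketch of the MultiFit idea, assuming the usual First-Fit-Decreasing (FFD) bin-packing subroutine and a binary search over the common machine capacity (the bounds and iteration count follow the standard textbook presentation, not a specific implementation):

```python
def ffd(jobs, capacity):
    """First-Fit Decreasing: pack the jobs into bins of the given capacity
    and return the resulting bin loads (one bin per machine)."""
    bins = []
    for p in sorted(jobs, reverse=True):
        for i, load in enumerate(bins):
            if load + p <= capacity:
                bins[i] += p
                break
        else:
            bins.append(p)                      # no existing bin fits: open a new one
    return bins

def multifit(jobs, m, iterations=20):
    """MultiFit sketch: binary-search the machine capacity, using FFD to test
    whether all jobs fit on m machines with that capacity."""
    lower = max(max(jobs), sum(jobs) / m)       # no smaller capacity can suffice
    upper = max(max(jobs), 2 * sum(jobs) / m)   # FFD is known to fit within this
    for _ in range(iterations):
        c = (lower + upper) / 2
        if len(ffd(jobs, c)) <= m:
            upper = c                           # feasible: try a smaller capacity
        else:
            lower = c                           # infeasible: need a larger capacity
    return ffd(jobs, upper)                     # loads for the last feasible capacity

print(multifit([5, 5, 4, 4, 3, 3, 3], m=3))     # hypothetical instance -> [9, 9, 9]
```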
Huang and Lu presented a simple polynomial-time algorithm that attains an 11/9≈1.222 approximation in time O(m log m + n), through the more general problem of maximin-share allocation of chores.
Sahni presented a PTAS that attains (1+ε)OPT in time $O(n\cdot (n^{2}/\varepsilon )^{m-1})$. It is an FPTAS if m is fixed. For m = 2, the run-time improves to $O(n^{2}/\varepsilon )$. The algorithm uses a technique called interval partitioning.
Hochbaum and Shmoys presented several approximation algorithms for any number of identical machines (even when the number of machines is not fixed):
- For any r > 0, an algorithm with approximation ratio at most $6/5 + 2^{-r}$ in time $O(n(r+\log n))$.
- For any r > 0, an algorithm with approximation ratio at most $7/6 + 2^{-r}$ in time $O(n(rm^{4}+\log n))$.
- For any ε > 0, an algorithm with approximation ratio at most (1+ε) in time $O((n/\varepsilon )^{1/\varepsilon ^{2}})$. This is a PTAS. Note that, when the number of machines is a part of the input, the problem is strongly NP-hard, so no FPTAS is possible.
Leung improved the run-time of this algorithm to $O\left((n/\varepsilon )^{(1/\varepsilon )\log (1/\varepsilon )}\right)$.
= Maximizing the minimum completion time =
Maximizing the minimum completion time (P||$C_{\min}$) is applicable when the "jobs" are actually spare parts that are required to keep the machines running, and they have different lifetimes. The goal is to keep machines running for as long as possible. The LPT algorithm attains at least $\frac{3m-1}{4m-2}$ of the optimum.
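The same greedy LPT rule can be evaluated against this objective; a small sketch with hypothetical data that reports the minimum machine load:

```python
def lpt_min_load(jobs, m):
    """Assign jobs in decreasing order to the least-loaded machine and
    return the minimum load, i.e. how long every machine keeps running."""
    loads = [0] * m
    for p in sorted(jobs, reverse=True):
        i = loads.index(min(loads))
        loads[i] += p
    return min(loads), loads

# Spare-part lifetimes (hypothetical): LPT gives loads [17, 13], so every
# machine runs for at least 13; the optimum here is 15 ({8,7} and {6,5,4}),
# and 13 >= (3*2-1)/(4*2-2) * 15 = 12.5, consistent with the bound above.
print(lpt_min_load([8, 7, 6, 5, 4], m=2))
```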
Woeginger presented a PTAS that attains an approximation factor of $1-\varepsilon$ in time $O(c_{\varepsilon }\,n\log k)$, where $c_{\varepsilon }$ is a huge constant that is exponential in the required approximation factor ε. The algorithm uses Lenstra's algorithm for integer linear programming.
= General objective functions =
Alon, Azar, Woeginger and Yadid consider a more general objective function. Given a positive real function f, which depends only on the completion times $C_i$, they consider the objectives of minimizing $\sum _{i=1}^{m}f(C_{i})$, minimizing $\max _{i=1}^{m}f(C_{i})$, maximizing $\sum _{i=1}^{m}f(C_{i})$, and maximizing $\min _{i=1}^{m}f(C_{i})$. They prove that, if f is non-negative, convex, and satisfies a strong continuity assumption that they call "F*", then both minimization problems have a PTAS. Similarly, if f is non-negative, concave, and satisfies F*, then both maximization problems have a PTAS. In both cases, the run-time of the PTAS is O(n), but with constants that are exponential in 1/ε.
See also
Fernandez's method
External links
Summary of parallel machine problems without preemption