TOPSIS
The Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) is a multi-criteria decision analysis method, originally developed by Ching-Lai Hwang and Yoon in 1981, with further developments by Yoon in 1987 and by Hwang, Lai and Liu in 1993.
TOPSIS is based on the concept that the chosen alternative should have the shortest geometric distance from the positive ideal solution (PIS) and the longest geometric distance from the negative ideal solution (NIS). A dedicated book on TOPSIS in the fuzzy context was published in 2021.
Description
It is a method of compensatory aggregation that compares a set of alternatives by normalising scores for each criterion and calculating the geometric distance between each alternative and the ideal alternative, which has the best score in each criterion. The weights of the criteria in the TOPSIS method can be calculated using the Ordinal Priority Approach, the Analytic Hierarchy Process, or similar methods. An assumption of TOPSIS is that the criteria are monotonically increasing or decreasing.
Normalisation is usually required, as the parameters or criteria are often of incongruous dimensions in multi-criteria problems. Compensatory methods such as TOPSIS allow trade-offs between criteria, where a poor result in one criterion can be negated by a good result in another criterion. This provides a more realistic form of modelling than non-compensatory methods, which include or exclude alternative solutions based on hard cut-offs. An example of its application to nuclear power plants is available in the literature.
TOPSIS method
The TOPSIS process is carried out as follows:
Step 1
Create an evaluation matrix consisting of m alternatives and n criteria, with the intersection of each alternative and criterion given as $x_{ij}$; we therefore have a matrix $(x_{ij})_{m\times n}$.
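For illustration, the evaluation matrix can be represented as a two-dimensional array with alternatives as rows and criteria as columns. The following minimal NumPy sketch uses made-up values; the matrix X and its contents are assumptions for illustration, not taken from the source.

```python
import numpy as np

# Illustrative evaluation matrix (values are made up): 4 alternatives (rows)
# scored against 3 criteria (columns).
X = np.array([
    [250.0, 16.0, 12.0],
    [200.0, 16.0,  8.0],
    [300.0, 32.0, 16.0],
    [275.0, 32.0,  8.0],
])

m, n = X.shape  # m = 4 alternatives, n = 3 criteria
```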
Step 2
The matrix $(x_{ij})_{m\times n}$ is then normalised to form the matrix $R=(r_{ij})_{m\times n}$, using the normalisation method

$$r_{ij}=\frac{x_{ij}}{\sqrt{\sum_{k=1}^{m}x_{kj}^{2}}},\quad i=1,2,\ldots,m,\quad j=1,2,\ldots,n$$
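A minimal NumPy sketch of this normalisation, continuing the made-up evaluation matrix from the Step 1 sketch (each column is divided by its Euclidean norm):

```python
import numpy as np

# Illustrative evaluation matrix from the Step 1 sketch (made-up values).
X = np.array([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0],
              [300.0, 32.0, 16.0], [275.0, 32.0, 8.0]])

# r_ij = x_ij / sqrt(sum_k x_kj^2): divide each column by its Euclidean norm.
R = X / np.linalg.norm(X, axis=0)
```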
Step 3
Calculate the weighted normalised decision matrix
$$t_{ij}=r_{ij}\cdot w_{j},\quad i=1,2,\ldots,m,\quad j=1,2,\ldots,n$$

where

$$w_{j}=W_{j}\Big/\sum_{k=1}^{n}W_{k},\quad j=1,2,\ldots,n$$

so that $\sum_{j=1}^{n}w_{j}=1$, and $W_{j}$ is the original weight given to the indicator $v_{j},\ j=1,2,\ldots,n$.
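A minimal NumPy sketch of the weighting step, again with made-up values; the raw weights W are an assumption chosen only for illustration:

```python
import numpy as np

# Normalised matrix R as in the Step 2 sketch (made-up values).
X = np.array([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0],
              [300.0, 32.0, 16.0], [275.0, 32.0, 8.0]])
R = X / np.linalg.norm(X, axis=0)

# Raw criterion weights W_j (an illustrative assumption), rescaled so that
# the normalised weights w_j sum to 1.
W = np.array([3.0, 5.0, 2.0])
w = W / W.sum()

# Weighted normalised decision matrix: t_ij = r_ij * w_j.
T = R * w
```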
Step 4
Determine the worst alternative $(A_{w})$ and the best alternative $(A_{b})$:

$$A_{w}=\{\langle \max(t_{ij}\mid i=1,2,\ldots,m)\mid j\in J_{-}\rangle,\ \langle \min(t_{ij}\mid i=1,2,\ldots,m)\mid j\in J_{+}\rangle\}\equiv\{t_{wj}\mid j=1,2,\ldots,n\},$$

$$A_{b}=\{\langle \min(t_{ij}\mid i=1,2,\ldots,m)\mid j\in J_{-}\rangle,\ \langle \max(t_{ij}\mid i=1,2,\ldots,m)\mid j\in J_{+}\rangle\}\equiv\{t_{bj}\mid j=1,2,\ldots,n\},$$

where $J_{+}=\{\,j=1,2,\ldots,n \mid j \text{ is associated with the criteria having a positive impact}\,\}$ and $J_{-}=\{\,j=1,2,\ldots,n \mid j \text{ is associated with the criteria having a negative impact}\,\}$.
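A minimal NumPy sketch of this step, with a made-up choice of which criteria belong to $J_{+}$ (positive impact) and $J_{-}$ (negative impact):

```python
import numpy as np

# Weighted normalised matrix T as in the Step 3 sketch (made-up values).
X = np.array([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0],
              [300.0, 32.0, 16.0], [275.0, 32.0, 8.0]])
T = (X / np.linalg.norm(X, axis=0)) * np.array([0.3, 0.5, 0.2])

# Assumed criterion directions: True for J+ (positive impact),
# False for J- (negative impact).
benefit = np.array([False, True, False])

# Best alternative A_b: max over J+ columns, min over J- columns.
# Worst alternative A_w: the opposite.
A_b = np.where(benefit, T.max(axis=0), T.min(axis=0))
A_w = np.where(benefit, T.min(axis=0), T.max(axis=0))
```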
Step 5
Calculate the L2-distance between the target alternative $i$ and the worst condition $A_{w}$:

$$d_{iw}=\sqrt{\sum_{j=1}^{n}(t_{ij}-t_{wj})^{2}},\quad i=1,2,\ldots,m,$$

and the distance between the alternative $i$ and the best condition $A_{b}$:

$$d_{ib}=\sqrt{\sum_{j=1}^{n}(t_{ij}-t_{bj})^{2}},\quad i=1,2,\ldots,m,$$

where $d_{iw}$ and $d_{ib}$ are L2-norm distances from the target alternative $i$ to the worst and best conditions, respectively.
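A minimal NumPy sketch of the distance computation, recomputing the made-up arrays from the earlier sketches:

```python
import numpy as np

# T, A_b and A_w as in the Step 4 sketch (made-up values).
X = np.array([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0],
              [300.0, 32.0, 16.0], [275.0, 32.0, 8.0]])
T = (X / np.linalg.norm(X, axis=0)) * np.array([0.3, 0.5, 0.2])
benefit = np.array([False, True, False])
A_b = np.where(benefit, T.max(axis=0), T.min(axis=0))
A_w = np.where(benefit, T.min(axis=0), T.max(axis=0))

# d_iw and d_ib: L2 distance of each alternative (row of T) to A_w and A_b.
d_w = np.linalg.norm(T - A_w, axis=1)
d_b = np.linalg.norm(T - A_b, axis=1)
```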
Step 6
Calculate the similarity to the worst condition:

$$s_{iw}=\frac{d_{iw}}{d_{iw}+d_{ib}},\quad 0\leq s_{iw}\leq 1,\quad i=1,2,\ldots,m.$$

$s_{iw}=1$ if and only if the alternative solution has the best condition; and $s_{iw}=0$ if and only if the alternative solution has the worst condition.
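A minimal NumPy sketch of the similarity score, continuing the same made-up example:

```python
import numpy as np

# Recompute the arrays from the previous sketches (made-up example values).
X = np.array([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0],
              [300.0, 32.0, 16.0], [275.0, 32.0, 8.0]])
T = (X / np.linalg.norm(X, axis=0)) * np.array([0.3, 0.5, 0.2])
benefit = np.array([False, True, False])
A_b = np.where(benefit, T.max(axis=0), T.min(axis=0))
A_w = np.where(benefit, T.min(axis=0), T.max(axis=0))
d_w = np.linalg.norm(T - A_w, axis=1)
d_b = np.linalg.norm(T - A_b, axis=1)

# s_iw = d_iw / (d_iw + d_ib): 1 only at the best condition, 0 only at the worst.
s = d_w / (d_w + d_b)
```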
Step 7
Rank the alternatives according to $s_{iw}\ (i=1,2,\ldots,m)$.
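The steps above can be collected into a single routine. The following sketch is illustrative only: the function name topsis_rank, the example data, the weights, and the choice of criterion directions are all assumptions, not part of the source.

```python
import numpy as np

def topsis_rank(X, weights, benefit):
    """Rank alternatives following the TOPSIS steps described above.

    X       : (m, n) evaluation matrix (alternatives x criteria)
    weights : (n,) raw criterion weights W_j (rescaled internally to sum to 1)
    benefit : (n,) booleans, True for positive-impact (J+) criteria
    Returns alternative indices ordered from best to worst.
    """
    R = X / np.linalg.norm(X, axis=0)                          # Step 2: vector normalisation
    T = R * (weights / np.sum(weights))                        # Step 3: weighted matrix
    A_b = np.where(benefit, T.max(axis=0), T.min(axis=0))      # Step 4: best alternative
    A_w = np.where(benefit, T.min(axis=0), T.max(axis=0))      # Step 4: worst alternative
    d_b = np.linalg.norm(T - A_b, axis=1)                      # Step 5: distance to best
    d_w = np.linalg.norm(T - A_w, axis=1)                      # Step 5: distance to worst
    s = d_w / (d_w + d_b)                                      # Step 6: similarity to worst
    return np.argsort(-s)                                      # Step 7: rank, best first

# Example with the made-up data used in the step-by-step sketches above:
X = np.array([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0],
              [300.0, 32.0, 16.0], [275.0, 32.0, 8.0]])
print(topsis_rank(X, weights=np.array([3.0, 5.0, 2.0]),
                  benefit=np.array([False, True, False])))
```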
Normalisation
Two methods of normalisation that have been used to deal with incongruous criteria dimensions are linear normalisation and vector normalisation.
Vector normalisation was incorporated with the original development of the TOPSIS method and is the method shown in Step 2 of the TOPSIS process above; it is calculated using the following formula:
$$r_{ij}=\frac{x_{ij}}{\sqrt{\sum_{k=1}^{m}x_{kj}^{2}}},\quad i=1,2,\ldots,m,\quad j=1,2,\ldots,n$$
When vector normalisation is used, the non-linear distances between single-dimension scores and ratios should produce smoother trade-offs.
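A minimal NumPy sketch contrasting the two approaches on made-up data; the linear variant shown, which divides each column by its maximum, is only one of several linear normalisation conventions used in the literature:

```python
import numpy as np

# Made-up evaluation matrix, as in the sketches above.
X = np.array([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0],
              [300.0, 32.0, 16.0], [275.0, 32.0, 8.0]])

# Vector normalisation: each column is divided by its Euclidean (L2) norm,
# as in Step 2 above.
R_vector = X / np.linalg.norm(X, axis=0)

# One common linear normalisation: each column is divided by its maximum
# (other linear variants, e.g. min-max scaling, also appear in the literature).
R_linear = X / X.max(axis=0)
```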
Online tools
DeciGen : A free MCDA plugin for Grasshopper 3D.
Decisional : An online tool for real estate comparison with TOPSIS
Decision Radar : A free online TOPSIS calculator written in Python.
Yadav, Vinay; Karmakar, Subhankar; Kalbar, Pradip P.; Dikshit, A.K. (January 2019). "PyTOPS: A Python based tool for TOPSIS". SoftwareX. 9: 217–222. Bibcode:2019SoftX...9..217Y. doi:10.1016/j.softx.2019.02.004.