
Hesitant Fuzzy Entropy and Cross-Entropy and Their Use in Multiattribute Decision-Making

Zeshui Xu,1,∗ Meimei Xia2,†
1 Institute of Sciences, PLA University of Science and Technology, Nanjing 210007, PR China
2 School of Economics and Management, Tsinghua University, Beijing 100084, PR China

We introduce the concepts of entropy and cross-entropy for hesitant fuzzy information and discuss their desirable properties. Several measure formulas are further developed, and the relationships among the proposed entropy, cross-entropy, and similarity measures are analyzed, from which we find that the three measures are interchangeable under certain conditions. We then develop two multiattribute decision-making methods in which the attribute values are given in the form of hesitant fuzzy sets, reflecting humans' hesitant thinking comprehensively. In one method, the weight vector is determined by the hesitant fuzzy entropy measure, and the optimal alternative is obtained by comparing the hesitant fuzzy cross-entropies between the alternatives and the ideal solutions; in the other method, the weight vector is derived from the maximizing deviation method, and the optimal alternative is obtained by using the TOPSIS method. An actual example is provided to compare our methods with existing ones. © 2012 Wiley Periodicals, Inc.

1. INTRODUCTION

Entropy, cross-entropy, and similarity measures are three important research topics in fuzzy set theory, and they have been widely used in practical applications such as pattern recognition, medical diagnosis, clustering analysis, image processing, and decision-making. Entropy is a measure of fuzziness.1 Since its appearance, entropy has received great attention. De Luca and Termini2 put forward some axioms to describe the fuzziness degree of a fuzzy set,3 and proposed several entropy formulas based on Shannon's function. Kaufmann4 introduced an entropy formula for a fuzzy set based on the metric distance between its membership function and the membership function of its nearest crisp set. Another method, presented by Yager,5 views the fuzziness degree of a fuzzy set in terms of a lack of distinction between the fuzzy set and its complement. Later on, other entropies for fuzzy sets have been given from different views.6−11 Since the concepts of the interval-valued fuzzy set,12 the intuitionistic fuzzy set,13 and the rough set14 were introduced, the corresponding entropy theories have been investigated over the last decades. Burillo and Bustince15 presented an entropy measure on interval-valued fuzzy sets and intuitionistic fuzzy sets. Zeng and Li16 proposed a new concept of entropy for interval-valued fuzzy sets from a view different from that of Burillo and Bustince.15 Zhang et al.17 introduced an axiomatic definition of entropy for an interval-valued fuzzy set based on a distance measure. Szmidt and Kacprzyk18 proposed a nonprobabilistic entropy measure for intuitionistic fuzzy sets. Sen and Pal19 proposed classes of entropy measures based on rough set theory and certain of its generalizations, and performed rigorous theoretical analysis of the properties they satisfy.

∗Author to whom all correspondence should be addressed: e-mail: [email protected].
†e-mail: [email protected].

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, VOL. 27, 799–822 (2012). © 2012 Wiley Periodicals, Inc. View this article online at wileyonlinelibrary.com. DOI 10.1002/int.21548

Cross-entropy and similarity measures are mainly used to measure discrimination information, and much research has been devoted to this issue.20−33 Vlachos and Sergiadis34 introduced the concepts of discrimination information and cross-entropy for intuitionistic fuzzy sets, and revealed the connection between the notions of entropy for fuzzy sets and for intuitionistic fuzzy sets in terms of fuzziness and intuitionism. Hung and Yang35 constructed the J-divergence of intuitionistic fuzzy sets, introduced some useful distance and similarity measures between two intuitionistic fuzzy sets, and applied them to clustering analysis and pattern recognition. Based on these ideas, Xia and Xu36 proposed some cross-entropy and entropy formulas for intuitionistic fuzzy sets and applied them to group decision-making. The relationships among entropy, cross-entropy, and similarity measures have also attracted much attention.9,16,37,38 For example, Liu9 gave axiomatic definitions of the entropy, distance measure, and similarity measure of fuzzy sets and systematically discussed their basic relations. Zeng and Li16 discussed in detail the relationship between the similarity measure and the entropy of interval-valued fuzzy sets, and proved three theorems showing that the similarity measure and the entropy of interval-valued fuzzy sets can be transformed into each other based on their axiomatic definitions. For interval-valued intuitionistic fuzzy sets,39 Zhang and Jiang38 proposed entropy and cross-entropy concepts and discussed the connections among some important information measures.

From the above analysis, we can see that all the existing entropy, cross-entropy, and similarity measures are based on fuzzy sets, interval-valued fuzzy sets, intuitionistic fuzzy sets, or interval-valued intuitionistic fuzzy sets. However, when people make a decision, they are usually hesitant and irresolute for one reason or another, which makes it difficult to reach a final agreement. The difficulty of establishing a common membership degree is not that we have one possible value (fuzzy set) or a margin of error (intuitionistic fuzzy set, interval-valued fuzzy set), but that we have a set of possible values. In most cases, to get a more reasonable decision result, a decision organization containing a number of experts is authorized to provide the preference information about a set of alternatives. Usually, when it estimates the degrees to which an alternative satisfies a criterion, the decision organization is not sure about a single value but hesitates among several possible values. For example, some experts in the decision organization provide 0.3, some provide 0.5, and the others provide 0.6; when these three parts cannot persuade each other, the degrees to which the alternative satisfies the criterion can be represented by a hesitant fuzzy element (HFE) {0.3, 0.5, 0.6}. It is noted that the HFE {0.3, 0.5, 0.6}, the basic unit of a hesitant fuzzy set, describes this situation more comprehensively than the crisp number 0.3 (or 0.6), the interval-valued fuzzy number [0.3, 0.6], or the intuitionistic fuzzy number (0.3, 0.4), because the degrees to which the alternative satisfies the criterion are neither a convex combination of 0.3 and 0.6 nor the whole interval between 0.3 and 0.6, but just the three possible values 0.3, 0.5, and 0.6. To deal with such cases, Torra and Narukawa40,41 introduced the concept of the hesitant fuzzy set, considered as a generalization of the fuzzy set. Xu and Xia42,43 defined some similarity, distance, and correlation measures for hesitant fuzzy sets. Since hesitancy is a very common problem in actual decision-making, as mentioned above, it is necessary to develop some entropy and cross-entropy measures for hesitant fuzzy sets.

To do this, the remainder of the paper is organized as follows. In Section 2, we give some axiomatic definitions of entropy for HFEs (the basic units of hesitant fuzzy sets) and prove that the similarity measure and the entropy of an HFE can be transformed into each other based on their axiomatic definitions. Section 3 proposes two cross-entropy formulas for HFEs, from which two entropy formulas are also derived. In Section 4, based on the proposed entropy and cross-entropy measures, we give a multiattribute decision-making method under a hesitant fuzzy environment, and, finally, we conclude the paper in Section 5.

2. ENTROPY FOR HESITANT FUZZY ELEMENTS

Since the fuzzy set was introduced by Zadeh,3 several extensions have been developed, including the interval-valued fuzzy set, the intuitionistic fuzzy set, the type-2 fuzzy set,44,45 and so on. Recently, Torra40 and Torra and Narukawa41 gave another generalization of the fuzzy set, the hesitant fuzzy set, which describes the hesitant situation when people make a decision. In this section, we shall develop some entropy measures for hesitant fuzzy information and discuss their relationships.

DEFINITION 1.40,41 Let X be a fixed set; a hesitant fuzzy set on X is defined in terms of a function α that, when applied to X, returns a subset of [0, 1], which can be represented by the following mathematical symbol:

$$
A=\{\langle x,\alpha(x)\rangle\mid x\in X\}\tag{1}
$$

where α(x) is a set of values in [0, 1], denoting the possible membership degrees of the element x ∈ X to the set A. For convenience, Xia and Xu46 named α(x) an HFE and H the set of all HFEs. In particular, if there is only one value in α(x), then the hesitant fuzzy set reduces to a fuzzy set,3 which indicates that fuzzy sets are a special type of hesitant fuzzy sets; therefore, the theory for hesitant fuzzy sets can also be applied to fuzzy sets.

It is noted that the number of values in different HFEs may be different. Let lα(x) be the number of values in α(x). We arrange the elements of α(x) in increasing order, and let α(x)σ(i) (i = 1, 2, . . . , lα(x)) be the ith smallest value in α(x).

Given an HFE α, Torra and Narukawa41 defined the complement of α as αc = ∪γ∈α{1 − γ}. In this paper, to operate correctly, we assume that the HFEs α and β have the same length l when we compare them. If there is only one value in α, we extend α by repeating that value until it has the same length as β.
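The conventions above can be sketched numerically. In the following Python sketch (the names `hfe`, `complement`, and `extend` are mine, not from the paper), an HFE is modeled as a sorted tuple of membership degrees:

```python
# Sketch of the HFE conventions above; an HFE is a sorted tuple of membership
# degrees in [0, 1]. `hfe`, `complement`, and `extend` are illustrative names.

def hfe(values):
    """Normalize a collection of membership degrees into a sorted HFE."""
    return tuple(sorted(values))

def complement(alpha):
    """alpha^c = union over gamma in alpha of {1 - gamma}."""
    return hfe(1 - g for g in alpha)

def extend(alpha, length):
    """Repeat the (single or maximal) value until the HFE has `length` values."""
    return hfe(list(alpha) + [max(alpha)] * (length - len(alpha)))

a = hfe([0.3, 0.5, 0.6])
print([round(v, 10) for v in complement(a)])  # [0.4, 0.5, 0.7]
print(extend(hfe([0.2]), 3))                  # (0.2, 0.2, 0.2)
```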

From the literature review above, we can see that little research has been done on hesitant fuzzy information. As entropy measures have wide applications in pattern recognition,25 clustering analysis,47 approximate reasoning,48 image processing,49 and decision-making,50 it is necessary to develop some entropy measures for the hesitant fuzzy environment. In the following, we first give the axiomatic definition of entropy for HFEs.

DEFINITION 2. An entropy on an HFE α is a real-valued function E : H → [0, 1] satisfying the following axiomatic requirements:

(1) E(α) = 0 if and only if α = 0 or α = 1;
(2) E(α) = 1 if and only if ασ(i) + ασ(l−i+1) = 1 for i = 1, 2, . . . , lα;
(3) E(α) ≤ E(β) if ασ(i) ≤ βσ(i) for βσ(i) + βσ(l−i+1) ≤ 1, or ασ(i) ≥ βσ(i) for βσ(i) + βσ(l−i+1) ≥ 1, i = 1, 2, . . . , l;
(4) E(α) = E(αc).

Definition 2 is developed based on the axiomatic definition of entropy for fuzzy sets. Motivated by the entropy measures for fuzzy sets,7,10 we can construct some entropy formulas satisfying Definition 2 as follows:

$$
E_1(\alpha)=\frac{1}{l_\alpha\left(\sqrt{2}-1\right)}\sum_{i=1}^{l_\alpha}\left(\sin\frac{\pi\left(\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}\right)}{4}+\sin\frac{\pi\left(2-\alpha_{\sigma(i)}-\alpha_{\sigma(l_\alpha-i+1)}\right)}{4}-1\right)\tag{2}
$$

$$
E_2(\alpha)=\frac{1}{l_\alpha\left(\sqrt{2}-1\right)}\sum_{i=1}^{l_\alpha}\left(\cos\frac{\pi\left(\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}\right)}{4}+\cos\frac{\pi\left(2-\alpha_{\sigma(i)}-\alpha_{\sigma(l_\alpha-i+1)}\right)}{4}-1\right)\tag{3}
$$

$$
E_3(\alpha)=-\frac{1}{l_\alpha\ln 2}\sum_{i=1}^{l_\alpha}\left(\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}\ln\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}+\frac{2-\alpha_{\sigma(i)}-\alpha_{\sigma(l_\alpha-i+1)}}{2}\ln\frac{2-\alpha_{\sigma(i)}-\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)\tag{4}
$$

$$
E_4(\alpha)=\frac{1}{l_\alpha\left(2^{(1-s)t}-1\right)}\sum_{i=1}^{l_\alpha}\left(\left(\left(\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{s}+\left(1-\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{s}\right)^{t}-1\right),\quad t\neq 0,\ s\neq 1,\ s>0\tag{5}
$$

Moreover, with the change of the parameters in E4, some special cases can be obtained:

If t = 1, then

$$
E_4(\alpha)=\frac{1}{l_\alpha\left(2^{1-s}-1\right)}\sum_{i=1}^{l_\alpha}\left(\left(\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{s}+\left(1-\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{s}-1\right)\tag{6}
$$

If t = 1/s, then

$$
E_4(\alpha)=\frac{1}{l_\alpha\left(2^{(1-s)/s}-1\right)}\sum_{i=1}^{l_\alpha}\left(\left(\left(\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{s}+\left(1-\frac{\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{s}\right)^{1/s}-1\right)\tag{7}
$$

Xu and Xia43 defined the hesitant fuzzy similarity measure as follows:

DEFINITION 3.43 For two HFEs α and β, the similarity measure between α and β, denoted S(α, β), should satisfy the following properties:

(1) S(α, β) = 0 if and only if α = 0, β = 1 or α = 1, β = 0;
(2) S(α, β) = 1 if and only if ασ(i) = βσ(i), i = 1, 2, . . . , l;
(3) S(α, γ) ≤ S(α, β) and S(α, γ) ≤ S(β, γ) if ασ(i) ≤ βσ(i) ≤ γσ(i) or ασ(i) ≥ βσ(i) ≥ γσ(i), i = 1, 2, . . . , l;
(4) S(α, β) = S(β, α).

Based on Definition 3, some hesitant fuzzy similarity measures can be constructed as43:

$$
S_1(\alpha,\beta)=1-\frac{1}{l}\sum_{i=1}^{l}\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|\tag{8}
$$

$$
S_2(\alpha,\beta)=1-\sqrt{\frac{1}{l}\sum_{i=1}^{l}\left(\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right)^{2}}\tag{9}
$$

$$
S_3(\alpha,\beta)=1-\sqrt[p]{\frac{1}{l}\sum_{i=1}^{l}\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|^{p}}\tag{10}
$$

$$
S_4(\alpha,\beta)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|\right\}\tag{11}
$$

$$
S_5(\alpha,\beta)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|^{2}\right\}\tag{12}
$$

$$
S_6(\alpha,\beta)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|^{p}\right\}\tag{13}
$$

$$
S_7(\alpha,\beta)=1-\frac{1}{2}\left(\frac{1}{l}\sum_{i=1}^{l}\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|+\max_{i}\left\{\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|\right\}\right)\tag{14}
$$

$$
S_8(\alpha,\beta)=1-\frac{1}{2}\left(\sqrt{\frac{1}{l}\sum_{i=1}^{l}\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|^{2}}+\max_{i}\left\{\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|^{2}\right\}\right)\tag{15}
$$

$$
S_9(\alpha,\beta)=1-\frac{1}{2}\left(\sqrt[p]{\frac{1}{l}\sum_{i=1}^{l}\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|^{p}}+\max_{i}\left\{\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|^{p}\right\}\right)\tag{16}
$$

By analyzing these similarity measures, we can see that S1 and S2 are based on the Hamming distance and the Euclidean distance, respectively; S4 and S5 apply the Hausdorff metric to S1 and S2; S7 combines S1 with S4; S8 combines S2 with S5; and S3, S6, and S9 are further generalizations of S1 and S2, of S4 and S5, and of S7 and S8, respectively. When p = 1, S3 becomes S1, S6 becomes S4, and S9 becomes S7; when p = 2, S3 reduces to S2, S6 reduces to S5, and S9 reduces to S8.
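A few of these similarity measures can be sketched as follows (illustrative names; the HFEs are assumed to be pre-extended to a common length l, as required in this section). The special cases at p = 1 are checked numerically:

```python
# Sketch of S1, S3, S6, and S9; the remaining measures follow the same pattern.

def _diffs(a, b):
    """Componentwise |alpha_{sigma(i)} - beta_{sigma(i)}| after sorting."""
    return [abs(x - y) for x, y in zip(sorted(a), sorted(b))]

def S1(a, b):  # Equation (8), Hamming-distance based
    d = _diffs(a, b)
    return 1 - sum(d) / len(d)

def S3(a, b, p):  # Equation (10); reduces to S1 at p = 1 and to S2 at p = 2
    d = _diffs(a, b)
    return 1 - (sum(x ** p for x in d) / len(d)) ** (1 / p)

def S6(a, b, p):  # Equation (13), Hausdorff-metric based
    return 1 - max(x ** p for x in _diffs(a, b))

def S9(a, b, p):  # Equation (16), combining S3 and S6
    return 1 - 0.5 * ((1 - S3(a, b, p)) + (1 - S6(a, b, p)))

a, b = (0.3, 0.5, 0.6), (0.4, 0.6, 0.9)
assert S1(a, a) == 1                            # property (2) of Definition 3
assert abs(S3(a, b, 1) - S1(a, b)) < 1e-12      # special case p = 1
```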

Many authors have investigated the relationships between similarity measures and entropy formulas under different environments, such as interval-valued fuzzy sets6,16 and interval-valued intuitionistic fuzzy sets.17,38 In what follows, we study the relationships between hesitant fuzzy similarity measures and hesitant fuzzy entropy formulas.

THEOREM 1. Let α be an HFE; then S(α, αc) is an entropy of α.

Proof.

(1) S(α, αc) = 0 ⇔ (α = 0 and αc = 1) or (α = 1 and αc = 0);
(2) S(α, αc) = 1 ⇔ α = αc ⇔ ασ(i) + ασ(lα−i+1) = 1 for i = 1, 2, . . . , lα;
(3) Suppose that ασ(i) ≤ βσ(i) for βσ(i) + βσ(lα−i+1) ≤ 1, i = 1, 2, . . . , lα; then ασ(i) ≤ βσ(i) ≤ 1 − βσ(lα−i+1) ≤ 1 − ασ(lα−i+1). Therefore, by the definition of the similarity measure of HFEs, we have S(α, αc) ≤ S(β, αc) ≤ S(β, βc). By the same reasoning, the result holds when ασ(i) ≥ βσ(i) for βσ(i) + βσ(lα−i+1) ≥ 1, i = 1, 2, . . . , lα;
(4) S(α, αc) = S(αc, α).

Example 1. For an HFE α, we can construct the following entropy formulas based on the similarity measures S1, S2, S3, S4, S5, S6, S7, S8, and S9:

$$
S_1(\alpha,\alpha^c)=1-\frac{1}{l_\alpha}\sum_{i=1}^{l_\alpha}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|\tag{17}
$$

$$
S_2(\alpha,\alpha^c)=1-\sqrt{\frac{1}{l_\alpha}\sum_{i=1}^{l_\alpha}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}}\tag{18}
$$

$$
S_3(\alpha,\alpha^c)=1-\sqrt[p]{\frac{1}{l_\alpha}\sum_{i=1}^{l_\alpha}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}}\tag{19}
$$

$$
S_4(\alpha,\alpha^c)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|\right\}\tag{20}
$$

$$
S_5(\alpha,\alpha^c)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}\right\}\tag{21}
$$

$$
S_6(\alpha,\alpha^c)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}\right\}\tag{22}
$$

$$
S_7(\alpha,\alpha^c)=1-\frac{1}{2}\left(\frac{1}{l_\alpha}\sum_{i=1}^{l_\alpha}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|+\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|\right\}\right)\tag{23}
$$

$$
S_8(\alpha,\alpha^c)=1-\frac{1}{2}\left(\sqrt{\frac{1}{l_\alpha}\sum_{i=1}^{l_\alpha}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}}+\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}\right\}\right)\tag{24}
$$

$$
S_9(\alpha,\alpha^c)=1-\frac{1}{2}\left(\sqrt[p]{\frac{1}{l_\alpha}\sum_{i=1}^{l_\alpha}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}}+\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}\right\}\right)\tag{25}
$$

In this paper, we let ⌊l/2⌋ denote the largest integer not exceeding l/2, and ⌈l/2⌉ denote the smallest integer not less than l/2. Then we get the following theorem:

THEOREM 2. For an HFE α, let ηα = {ασ(1), ασ(2), . . . , ασ(⌈lα/2⌉)} and γα = {1 − ασ(lα), 1 − ασ(lα−1), . . . , 1 − ασ(⌊lα/2⌋+1)}; then S(ηα, γα) is an entropy of α.

Proof.

(1) S(ηα, γα) = 0 ⇔ (ηα = 0 and γα = 1) or (ηα = 1 and γα = 0) ⇔ α = 0 or α = 1;
(2) S(ηα, γα) = 1 ⇔ ηα = γα ⇔ ασ(i) + ασ(lα−i+1) = 1 for i = 1, 2, . . . , lα;
(3) Assume ασ(i) ≤ βσ(i) for βσ(i) + βσ(l−i+1) ≤ 1, i = 1, 2, . . . , l; then ασ(i) ≤ βσ(i) ≤ 1 − βσ(l−i+1) ≤ 1 − ασ(l−i+1). Therefore, from the definition of the similarity measure of HFEs, we have S(ηα, γα) ≤ S(ηβ, γα) ≤ S(ηβ, γβ). Similarly, the result also holds when ασ(i) ≥ βσ(i) for βσ(i) + βσ(l−i+1) ≥ 1, i = 1, 2, . . . , l;
(4) S(ηα, γα) = S(γα, ηα).

Example 2. For an HFE α, we can construct the following entropy formulas based on the similarity measures S1, S2, S3, S4, S5, S6, S7, S8, and S9:

$$
S_1(\eta_\alpha,\gamma_\alpha)=1-\frac{1}{\lceil l_\alpha/2\rceil}\sum_{i=1}^{\lceil l_\alpha/2\rceil}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|\tag{26}
$$

$$
S_2(\eta_\alpha,\gamma_\alpha)=1-\sqrt{\frac{1}{\lceil l_\alpha/2\rceil}\sum_{i=1}^{\lceil l_\alpha/2\rceil}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}}\tag{27}
$$

$$
S_3(\eta_\alpha,\gamma_\alpha)=1-\sqrt[p]{\frac{1}{\lceil l_\alpha/2\rceil}\sum_{i=1}^{\lceil l_\alpha/2\rceil}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}}\tag{28}
$$

$$
S_4(\eta_\alpha,\gamma_\alpha)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|\right\}\tag{29}
$$

$$
S_5(\eta_\alpha,\gamma_\alpha)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}\right\}\tag{30}
$$

$$
S_6(\eta_\alpha,\gamma_\alpha)=1-\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}\right\}\tag{31}
$$

$$
S_7(\eta_\alpha,\gamma_\alpha)=1-\frac{1}{2}\left(\frac{1}{\lceil l_\alpha/2\rceil}\sum_{i=1}^{\lceil l_\alpha/2\rceil}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|+\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|\right\}\right)\tag{32}
$$

$$
S_8(\eta_\alpha,\gamma_\alpha)=1-\frac{1}{2}\left(\sqrt{\frac{1}{\lceil l_\alpha/2\rceil}\sum_{i=1}^{\lceil l_\alpha/2\rceil}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}}+\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{2}\right\}\right)\tag{33}
$$

$$
S_9(\eta_\alpha,\gamma_\alpha)=1-\frac{1}{2}\left(\sqrt[p]{\frac{1}{\lceil l_\alpha/2\rceil}\sum_{i=1}^{\lceil l_\alpha/2\rceil}\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}}+\max_{i}\left\{\left|\alpha_{\sigma(i)}+\alpha_{\sigma(l_\alpha-i+1)}-1\right|^{p}\right\}\right)\tag{34}
$$

THEOREM 3. For two HFEs α and β, suppose that |ασ(i) − βσ(i)| < |ασ(i+1) − βσ(i+1)|, i = 1, 2, . . . , l − 1, and

$$
f(\alpha,\beta)=\left(\frac{\left|\alpha_{\sigma(1)}-\beta_{\sigma(1)}\right|+1}{2},\frac{\left|\alpha_{\sigma(2)}-\beta_{\sigma(2)}\right|+1}{2},\ldots,\frac{\left|\alpha_{\sigma(l)}-\beta_{\sigma(l)}\right|+1}{2}\right)\tag{35}
$$

Then E(f(α, β)) is a similarity measure of α and β.

Proof.

(1) E(f(α, β)) = 0 ⇔ f(α, β) = 1 or f(α, β) = 0 ⇔ α = 0, β = 1 or α = 1, β = 0;
(2) E(f(α, β)) = 1 ⇔ (|ασ(i) − βσ(i)| + 1)/2 + (|ασ(l−i+1) − βσ(l−i+1)| + 1)/2 = 1 for i = 1, 2, . . . , l ⇔ α = β;
(3) Since ασ(i) ≤ βσ(i) ≤ γσ(i), i = 1, 2, . . . , l, we obtain (|ασ(i) − γσ(i)| + 1)/2 ≥ (|ασ(i) − βσ(i)| + 1)/2, i = 1, 2, . . . , l, implying f(α, γ) ≥ f(α, β). From the definition of f(α, β), we know that f(α, β)σ(i) + f(α, β)σ(l−i+1) ≥ 1, i = 1, 2, . . . , l; thus E(f(α, γ)) ≤ E(f(α, β)). By the same reasoning, the result also holds for ασ(i) ≥ βσ(i) ≥ γσ(i), i = 1, 2, . . . , l;
(4) E(f(α, β)) = E(f(β, α)).

Example 3. For two HFEs α and β, we have

$$
E_1(f(\alpha,\beta))=\frac{1}{l\left(\sqrt{2}-1\right)}\sum_{i=1}^{l}\left(\sin\frac{\pi\left(2+\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|+\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|\right)}{8}+\sin\frac{\pi\left(2-\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|-\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|\right)}{8}-1\right)\tag{36}
$$

$$
E_2(f(\alpha,\beta))=\frac{1}{l\left(\sqrt{2}-1\right)}\sum_{i=1}^{l}\left(\cos\frac{\pi\left(2+\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|+\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|\right)}{8}+\cos\frac{\pi\left(2-\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|-\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|\right)}{8}-1\right)\tag{37}
$$

$$
E_3(f(\alpha,\beta))=-\frac{1}{l\ln 2}\sum_{i=1}^{l}\left(\frac{2+\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|+\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|}{4}\ln\frac{2+\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|+\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|}{4}+\frac{2-\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|-\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|}{4}\ln\frac{2-\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|-\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|}{4}\right)\tag{38}
$$

$$
E_4(f(\alpha,\beta))=\frac{1}{l\left(2^{(1-s)t}-1\right)}\sum_{i=1}^{l}\left(\left(\left(\frac{2+\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|+\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|}{4}\right)^{s}+\left(\frac{2-\left|\alpha_{\sigma(i)}-\beta_{\sigma(i)}\right|-\left|\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right|}{4}\right)^{s}\right)^{t}-1\right),\quad t\neq 0,\ s\neq 1,\ s>0\tag{39}
$$
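The direction of Theorem 3, where an entropy applied to f(α, β) yields a similarity measure, can also be spot-checked; the sketch below re-implements E1 from Equation (2) (names are mine):

```python
import math

def E1(alpha):  # Equation (2)
    a = sorted(alpha)
    l = len(a)
    return sum(math.sin(math.pi * (a[i] + a[l - 1 - i]) / 4)
               + math.sin(math.pi * (2 - a[i] - a[l - 1 - i]) / 4) - 1
               for i in range(l)) / (l * (math.sqrt(2) - 1))

def f_ab(a, b):  # Equation (35)
    return tuple(sorted((abs(x - y) + 1) / 2
                        for x, y in zip(sorted(a), sorted(b))))

def similarity_from_E1(a, b):
    """Theorem 3: S(alpha, beta) = E(f(alpha, beta))."""
    return E1(f_ab(a, b))

a = (0.3, 0.5, 0.6)
assert abs(similarity_from_E1(a, a) - 1) < 1e-9        # identical HFEs
assert abs(similarity_from_E1((0.0,), (1.0,))) < 1e-9  # maximally different
```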

COROLLARY 1. Let α and β be two HFEs, and let E be an entropy of HFEs; then E((f(α, β))c) is a similarity measure of the HFEs α and β.

COROLLARY 2. Let α and β be two HFEs with |ασ(i) − βσ(i)| < |ασ(i+1) − βσ(i+1)|, i = 1, 2, . . . , l − 1, and define

$$
g(\alpha,\beta)=\left(\frac{\left|\alpha_{\sigma(1)}-\beta_{\sigma(1)}\right|^{p}+1}{2},\frac{\left|\alpha_{\sigma(2)}-\beta_{\sigma(2)}\right|^{p}+1}{2},\ldots,\frac{\left|\alpha_{\sigma(l)}-\beta_{\sigma(l)}\right|^{p}+1}{2}\right),\quad p>0\tag{40}
$$

Then E(g(α, β)) is a similarity measure of the HFEs α and β.

THEOREM 4. For an HFE α, suppose that |ασ(i) + ασ(lα−i+1) − 1| < |ασ(i+1) + ασ(lα−i) − 1|, i = 1, 2, . . . , ⌊lα/2⌋; we define two HFEs m(α) and n(α) as

$$
m(\alpha)=\left(\frac{1+\left|\alpha_{\sigma(1)}+\alpha_{\sigma(l_\alpha)}-1\right|}{2},\frac{1+\left|\alpha_{\sigma(2)}+\alpha_{\sigma(l_\alpha-1)}-1\right|}{2},\ldots,\frac{1+\left|\alpha_{\sigma(\lceil l_\alpha/2\rceil)}+\alpha_{\sigma(l_\alpha-\lceil l_\alpha/2\rceil+1)}-1\right|}{2}\right)\tag{41}
$$

$$
n(\alpha)=\left(\frac{1-\left|\alpha_{\sigma(1)}+\alpha_{\sigma(l_\alpha)}-1\right|}{2},\frac{1-\left|\alpha_{\sigma(2)}+\alpha_{\sigma(l_\alpha-1)}-1\right|}{2},\ldots,\frac{1-\left|\alpha_{\sigma(\lceil l_\alpha/2\rceil)}+\alpha_{\sigma(l_\alpha-\lceil l_\alpha/2\rceil+1)}-1\right|}{2}\right)\tag{42}
$$

respectively; then S(m(α), n(α)) is an entropy of α.

Proof.

(1) S(m(α), n(α)) = 0 ⇔ (m(α) = 1 and n(α) = 0) or (m(α) = 0 and n(α) = 1) ⇔ α = 1 or α = 0;
(2) S(m(α), n(α)) = 1 ⇔ (1 + |ασ(i) + ασ(lα−i+1) − 1|)/2 = (1 − |ασ(i) + ασ(lα−i+1) − 1|)/2 ⇔ ασ(i) + ασ(lα−i+1) = 1 for i = 1, 2, . . . , ⌈lα/2⌉ ⇔ ασ(i) + ασ(lα−i+1) = 1 for i = 1, 2, . . . , lα;
(3) Since ασ(i) ≤ βσ(i) for βσ(i) + βσ(l−i+1) ≤ 1, i = 1, 2, . . . , l, which implies ασ(i) ≤ βσ(i) ≤ 1 − βσ(l−i+1) ≤ 1 − ασ(l−i+1), we have |ασ(i) + ασ(l−i+1) − 1| ≥ |βσ(i) + βσ(l−i+1) − 1|, i = 1, 2, . . . , l, which means that n(α)σ(i) ≤ n(β)σ(i) ≤ m(β)σ(i) ≤ m(α)σ(i), i = 1, 2, . . . , l. Therefore, from the definition of the similarity measure of HFEs, we have S(m(α), n(α)) ≤ S(m(β), n(α)) ≤ S(m(β), n(β)). By the same reasoning, when ασ(i) ≥ βσ(i) and βσ(i) + βσ(l−i+1) ≥ 1, i = 1, 2, . . . , l, we can also prove S(m(α), n(α)) ≤ S(m(β), n(β));
(4) S(m(α), n(α)) = S(m(αc), n(αc)).

COROLLARY 3. Suppose that S is a similarity measure for HFEs; then S((m(α))c, (n(α))c) is an entropy of the HFE α.

Example 4. Let α be an HFE; then

$$
S_k(m(\alpha),n(\alpha))=S_k(\eta_\alpha,\gamma_\alpha),\quad k=1,2,\ldots,9
$$
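The identities of Example 4 can be verified numerically for a concrete HFE; the sketch below (assumed names) compares S1(m(α), n(α)) with S1(ηα, γα):

```python
import math

def m_n(alpha):  # Equations (41) and (42)
    a = sorted(alpha)
    l, k = len(a), math.ceil(len(a) / 2)
    d = [abs(a[i] + a[l - 1 - i] - 1) for i in range(k)]
    return (tuple(sorted((1 + x) / 2 for x in d)),
            tuple(sorted((1 - x) / 2 for x in d)))

def eta_gamma(alpha):  # the HFEs of Theorem 2
    a = sorted(alpha)
    l, k = len(a), math.ceil(len(a) / 2)
    return tuple(a[:k]), tuple(sorted(1 - a[l - 1 - i] for i in range(k)))

def S1(a, b):
    d = [abs(x - y) for x, y in zip(sorted(a), sorted(b))]
    return 1 - sum(d) / len(d)

alpha = (0.1, 0.4, 0.8)
m, n = m_n(alpha)
eta, gamma = eta_gamma(alpha)
assert abs(S1(m, n) - S1(eta, gamma)) < 1e-9  # both equal 0.85 here
```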

3. CROSS-ENTROPY MEASURES FOR HFEs

In this section, we shall propose an axiomatic definition of the cross-entropy measure for HFEs, motivated by Bhandari and Pal,6 Shang and Jiang,11 Vlachos and Sergiadis,34 and Hung and Yang,35 from which we can also obtain some entropy measures for HFEs.

According to Shannon's inequality,51 we first give the following definition:

DEFINITION 4. Let α and β be two HFEs; then the cross-entropy C(α, β) of α and β should satisfy the following conditions:

(1) C(α, β) ≥ 0;
(2) C(α, β) = 0 if and only if ασ(i) = βσ(i), i = 1, 2, . . . , l.

Based on Definition 4, we can give a cross-entropy formula of α and β defined as

$$
\begin{aligned}
C_A(\alpha,\beta)=\frac{1}{lT}\sum_{i=1}^{l}&\left(\frac{\left(1+q\alpha_{\sigma(i)}\right)\ln\left(1+q\alpha_{\sigma(i)}\right)+\left(1+q\beta_{\sigma(i)}\right)\ln\left(1+q\beta_{\sigma(i)}\right)}{2}-\frac{2+q\alpha_{\sigma(i)}+q\beta_{\sigma(i)}}{2}\ln\frac{2+q\alpha_{\sigma(i)}+q\beta_{\sigma(i)}}{2}\right.\\
&+\frac{\left(1+q\left(1-\alpha_{\sigma(l-i+1)}\right)\right)\ln\left(1+q\left(1-\alpha_{\sigma(l-i+1)}\right)\right)+\left(1+q\left(1-\beta_{\sigma(l-i+1)}\right)\right)\ln\left(1+q\left(1-\beta_{\sigma(l-i+1)}\right)\right)}{2}\\
&\left.-\frac{2+q\left(2-\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right)}{2}\ln\frac{2+q\left(2-\alpha_{\sigma(l-i+1)}-\beta_{\sigma(l-i+1)}\right)}{2}\right),\quad q>0
\end{aligned}\tag{43}
$$

where T = (1 + q) ln(1 + q) − (2 + q)(ln(2 + q) − ln 2), q > 0.

Since T′q = 1 + ln(1 + q) − 1 − (ln(2 + q) − ln 2) = ln((2 + 2q)/(2 + q)) > 0, T is an increasing function of q that attains its minimum value 0 at q = 0; thus T > 0. In addition, for f(x) = (1 + qx) ln(1 + qx), 0 ≤ x ≤ 1, we have f′(x) = q ln(1 + qx) + q ≥ 0 and f″(x) = q2/(1 + qx) > 0; thus f(x) is a concave-up (convex) function of x. Therefore, CA(α, β) ≥ 0, and CA(α, β) = 0 if and only if ασ(i) = βσ(i), i = 1, 2, . . . , l. Moreover, CA(α, β) degenerates to its fuzzy counterpart when α and β are fuzzy sets. According to Definition 4, CA(α, β) is a cross-entropy of α and β.
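A sketch of CA from Equation (43), with the convex function (1 + qx) ln(1 + qx) factored out as a helper; names are mine, and the checks mirror the two conditions of Definition 4:

```python
import math

def CA(a, b, q=1.0):  # Equation (43)
    def phi(x):
        # (1 + qx) ln(1 + qx), the convex function used in the text
        return (1 + q * x) * math.log(1 + q * x)

    a, b = sorted(a), sorted(b)
    l = len(a)
    T = (1 + q) * math.log(1 + q) - (2 + q) * (math.log(2 + q) - math.log(2))
    total = 0.0
    for i in range(l):
        x, y = a[i], b[i]                        # alpha_{sigma(i)}, beta_{sigma(i)}
        xr, yr = 1 - a[l - 1 - i], 1 - b[l - 1 - i]
        total += (phi(x) + phi(y)) / 2 - phi((x + y) / 2)
        total += (phi(xr) + phi(yr)) / 2 - phi((xr + yr) / 2)
    return total / (l * T)

a, b = (0.3, 0.5, 0.6), (0.4, 0.6, 0.9)
assert abs(CA(a, a)) < 1e-12   # condition (2): zero iff the HFEs coincide
assert CA(a, b) > 0            # condition (1): nonnegativity
```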

THEOREM 5. Let α be an HFE; then EA(α) = 1 − CA(α, αc) is an entropy formula for α.

Proof.

$$
\begin{aligned}
E_A(\alpha)=1-C_A(\alpha,\alpha^c)=1-\frac{2}{l_\alpha T}\sum_{i=1}^{l_\alpha}&\left(\frac{\left(1+q\alpha_{\sigma(i)}\right)\ln\left(1+q\alpha_{\sigma(i)}\right)+\left(1+q\left(1-\alpha_{\sigma(l_\alpha-i+1)}\right)\right)\ln\left(1+q\left(1-\alpha_{\sigma(l_\alpha-i+1)}\right)\right)}{2}\right.\\
&\left.-\frac{2+q\alpha_{\sigma(i)}+q\left(1-\alpha_{\sigma(l_\alpha-i+1)}\right)}{2}\ln\frac{2+q\alpha_{\sigma(i)}+q\left(1-\alpha_{\sigma(l_\alpha-i+1)}\right)}{2}\right),\quad q>0
\end{aligned}\tag{44}
$$

where T = (1 + q) ln(1 + q) − (2 + q)(ln(2 + q) − ln 2), q > 0.

If ασ(i) ≤ βσ(i) for βσ(i) + βσ(l−i+1) ≤ 1, i = 1, 2, . . . , l, then we have ασ(i) ≤ βσ(i) ≤ 1 − βσ(l−i+1) ≤ 1 − ασ(l−i+1), which means |ασ(i) + ασ(l−i+1) − 1| ≥ |βσ(i) + βσ(l−i+1) − 1|. Let 0 ≤ x, y ≤ 1 and t = |x − y|; then

$$
f(x,y)=\frac{(1+qx)\ln(1+qx)+(1+qy)\ln(1+qy)}{2}-\frac{2+qx+qy}{2}\ln\frac{2+qx+qy}{2},\quad q>0\tag{45}
$$

If x ≥ y, then x = t + y, and

$$
f(t,y)=\frac{(1+q(t+y))\ln(1+q(t+y))+(1+qy)\ln(1+qy)}{2}-\frac{2+q(t+y)+qy}{2}\ln\frac{2+q(t+y)+qy}{2},\quad q>0\tag{46}
$$

thus

$$
f(t,y)'_t=\frac{q+q\ln(1+q(t+y))}{2}-\frac{q}{2}-\frac{q}{2}\ln\frac{2+q(t+y)+qy}{2}=\frac{q}{2}\ln\frac{2(1+q(t+y))}{2+q(t+y)+qy}\geq 0,\quad q>0\tag{47}
$$

Therefore, f(x, y) is a nondecreasing function of |x − y| for x ≥ y. By the same reasoning, this is also true for x ≤ y. As a result, EA(α) ≤ EA(β); moreover, EA(α) attains its maximum value 1 at α = αc and its minimum value 0 at α = 1 or α = 0.

Another cross-entropy formula of α and β can be defined as

$$
\begin{aligned}
C_B(\alpha,\beta)=\frac{1}{\left(1-2^{1-p}\right)l}\sum_{i=1}^{l}&\left(\frac{\alpha_{\sigma(i)}^{p}+\beta_{\sigma(i)}^{p}}{2}+\frac{\left(1-\alpha_{\sigma(l-i+1)}\right)^{p}+\left(1-\beta_{\sigma(l-i+1)}\right)^{p}}{2}\right.\\
&\left.-\left(\frac{\alpha_{\sigma(i)}+\beta_{\sigma(i)}}{2}\right)^{p}-\left(\frac{1-\alpha_{\sigma(l-i+1)}+1-\beta_{\sigma(l-i+1)}}{2}\right)^{p}\right),\quad p>1
\end{aligned}\tag{48}
$$

Since g(x) = xp, 0 ≤ x ≤ 1, p > 1, satisfies g′(x) = pxp−1 and g″(x) = p(p − 1)xp−2 > 0, g(x) is a concave-up (convex) function of x, and then CB(α, β) ≥ 0, with CB(α, β) = 0 if and only if α = β. Moreover, CB(α, β) degenerates to its fuzzy counterpart when α and β are fuzzy sets. According to Definition 4, CB(α, β) is a cross-entropy of α and β.

THEOREM 6. Let α be an HFE; then EB(α) = 1 − CB(α, αc) is an entropy formula for α.

Proof.

$$
\begin{aligned}
E_B(\alpha)=1-C_B(\alpha,\alpha^c)&=1-\frac{1}{\left(1-2^{1-p}\right)l_\alpha}\sum_{i=1}^{l_\alpha}\left(\frac{\alpha_{\sigma(i)}^{p}+\left(1-\alpha_{\sigma(l_\alpha-i+1)}\right)^{p}}{2}+\frac{\left(1-\alpha_{\sigma(l_\alpha-i+1)}\right)^{p}+\alpha_{\sigma(i)}^{p}}{2}\right.\\
&\qquad\left.-\left(\frac{\alpha_{\sigma(i)}+1-\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{p}-\left(\frac{1-\alpha_{\sigma(l_\alpha-i+1)}+\alpha_{\sigma(i)}}{2}\right)^{p}\right)\\
&=1-\frac{2}{\left(1-2^{1-p}\right)l_\alpha}\sum_{i=1}^{l_\alpha}\left(\frac{\alpha_{\sigma(i)}^{p}+\left(1-\alpha_{\sigma(l_\alpha-i+1)}\right)^{p}}{2}-\left(\frac{\alpha_{\sigma(i)}+1-\alpha_{\sigma(l_\alpha-i+1)}}{2}\right)^{p}\right),\quad p>1
\end{aligned}\tag{49}
$$

If ασ(i) ≤ βσ(i) for βσ(i) + βσ(l−i+1) ≤ 1, i = 1, 2, . . . , l, then we have ασ(i) ≤ βσ(i) ≤ 1 − βσ(l−i+1) ≤ 1 − ασ(l−i+1), which implies |ασ(i) + ασ(l−i+1) − 1| ≥ |βσ(i) + βσ(l−i+1) − 1|. Let t = |x − y| and

$$
g(x,y)=\frac{x^{p}+y^{p}}{2}-\left(\frac{x+y}{2}\right)^{p},\quad 0\leq x,y\leq 1,\ p>1\tag{50}
$$

If x ≥ y, then x = y + t, and

$$
g(t,y)=\frac{(y+t)^{p}+y^{p}}{2}-\left(\frac{2y+t}{2}\right)^{p},\quad p>1\tag{51}
$$

$$
g(t,y)'_t=\frac{p}{2}\left((y+t)^{p-1}-\left(\frac{2y+t}{2}\right)^{p-1}\right)\geq 0,\quad p>1\tag{52}
$$

thus g(x, y) does not decrease as |x − y| increases. By the same reasoning, this is also true for x ≤ y. Therefore, EB(α) ≤ EB(β); moreover, EB(α) attains its maximum value 1 at α = αc and its minimum value 0 at α = 1 or α = 0.
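The entropy of Theorem 6 is straightforward to evaluate from the closed form of Equation (49); a sketch with assumed names:

```python
def EB(alpha, p=2.0):
    """Equation (49): E_B(alpha) = 1 - C_B(alpha, alpha^c)."""
    a = sorted(alpha)
    l = len(a)
    total = 0.0
    for i in range(l):
        x = a[i]                  # alpha_{sigma(i)}
        y = 1 - a[l - 1 - i]      # 1 - alpha_{sigma(l-i+1)}
        total += (x ** p + y ** p) / 2 - ((x + y) / 2) ** p
    return 1 - 2 * total / ((1 - 2 ** (1 - p)) * l)

assert abs(EB((0.2, 0.5, 0.8)) - 1) < 1e-9  # balanced HFE: maximal entropy
assert abs(EB((0.0,))) < 1e-9               # crisp value: zero entropy
assert 0 <= EB((0.3, 0.5, 0.6)) <= 1
```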

4. A METHOD FOR MULTIATTRIBUTE DECISION-MAKING WITH HESITANT FUZZY INFORMATION AND INFORMATION MEASURES

Up to now, multiattribute decision-making problems have been investigated under different environments. For example, Fan and Liu52 proposed an approach to solving group decision-making problems with ordinal interval numbers. Xu and Chen53 focused on multiattribute group decision-making with different formats of preference information on attributes. Xu and Yager54 developed some intuitionistic fuzzy Bonferroni means and applied them to multiattribute decision-making.

In the decision-making process, the information about attribute weights55,56 is sometimes completely unknown because of time pressure, lack of knowledge or data, and the expert's limited expertise in the problem domain. Some classical weight-determining methods have been developed over the last decades, including the TOPSIS method57 and the entropy method,50 but they are not suitable for the situation in which the degrees to which an alternative satisfies an attribute are represented by several possible values, which can be considered as an HFE. In this section, we shall extend the entropy method to the hesitant fuzzy environment and obtain the final optimal alternative by comparing the cross-entropy measures with the ideal solutions.

Suppose that there are m alternatives Yi (i = 1, 2, . . . , m) and n attributes Gj (j = 1, 2, . . . , n) with the attribute weight vector w = (w1, w2, . . . , wn)T such that wj ∈ [0, 1], j = 1, 2, . . . , n, and Σnj=1 wj = 1. Suppose that a decision organization is authorized to provide all the possible degrees to which the alternative Yi satisfies the attribute Gj, denoted by an HFE αij.

Based on the above analysis, we give the following decision-making methods:

Method I

Step 1. The decision-maker provides all the possible evaluations of the alternative Yi under the attribute Gj, denoted by the HFEs αij (i = 1, 2, . . . , m; j = 1, 2, . . . , n).

Step 2. If the information about the weight wj of the attribute Gj is completely unknown, then we establish an exact model of entropy weights for determining the attribute weights:

$$ w_j = \frac{1 - E_j}{n - \sum_{j=1}^{n} E_j}, \qquad j = 1, 2, \ldots, n \qquad (53) $$

where $E_j = \frac{1}{m}\sum_{i=1}^{m} E(\alpha_{ij})$, j = 1, 2, . . . , n, and each E(αij) can be calculated by Equation 44 or 49.

Step 3. Let J1 and J2 be the sets of benefit attributes and cost attributes, respectively. Suppose that the hesitant fuzzy ideal solution is α+ = (α1+, α2+, . . . , αn+) and the hesitant fuzzy negative-ideal solution is α− = (α1−, α2−, . . . , αn−), where αj+ = 1, αj− = 0 for j ∈ J1, and αj+ = 0, αj− = 1 for j ∈ J2. Then we calculate the cross-entropy between the alternative Yi and the positive-ideal solution or the negative-ideal solution:

$$ C^{+}(Y_i) = \sum_{j=1}^{n} w_j\, C(\alpha_{ij}, \alpha_j^{+}), \qquad i = 1, 2, \ldots, m \qquad (54) $$

$$ C^{-}(Y_i) = \sum_{j=1}^{n} w_j\, C(\alpha_{ij}, \alpha_j^{-}), \qquad i = 1, 2, \ldots, m \qquad (55) $$

Step 4. Calculate the closeness degree of the alternative Yi to the ideal solution by using

$$ C(Y_i) = \frac{C^{+}(Y_i)}{C^{+}(Y_i) + C^{-}(Y_i)}, \qquad i = 1, 2, \ldots, m \qquad (56) $$

Step 5. Rank the alternatives Yi (i = 1, 2, . . . , m) according to the values of C(Yi) (i = 1, 2, . . . , m) in ascending order; the smaller the value of C(Yi), the better the alternative Yi.
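The five steps of Method I can be sketched as follows. Because the entropy and cross-entropy formulas (Equations 43, 44, 48, and 49) are defined earlier in the paper and not repeated here, they are taken as user-supplied callables; the `toy_entropy` and `toy_cross_entropy` functions below are hypothetical stand-ins for demonstration only, not the measures of those equations. The singleton ideal HFEs are treated as the crisp values 1 and 0 of Step 3.

```python
def method_1(matrix, benefit, entropy, cross_entropy):
    """Rank alternatives by Method I (entropy weights + cross-entropy closeness).

    matrix[i][j] is the HFE (list of membership values) of alternative i on
    attribute j; benefit[j] is True for benefit attributes, False for cost
    attributes; entropy(h) and cross_entropy(h, t) are user-supplied measures.
    """
    m, n = len(matrix), len(matrix[0])
    # Step 2: entropy weights (Equation 53), with E_j the column-mean entropy
    E = [sum(entropy(matrix[i][j]) for i in range(m)) / m for j in range(n)]
    denom = n - sum(E)
    w = [(1 - Ej) / denom for Ej in E]
    # Step 3: weighted cross-entropies to the ideal solutions (Equations 54, 55)
    plus = [1.0 if b else 0.0 for b in benefit]   # positive-ideal values
    minus = [0.0 if b else 1.0 for b in benefit]  # negative-ideal values
    C_plus = [sum(w[j] * cross_entropy(matrix[i][j], plus[j])
                  for j in range(n)) for i in range(m)]
    C_minus = [sum(w[j] * cross_entropy(matrix[i][j], minus[j])
                   for j in range(n)) for i in range(m)]
    # Step 4: closeness degrees (Equation 56); Step 5: smaller C is better
    C = [cp / (cp + cm) for cp, cm in zip(C_plus, C_minus)]
    return sorted(range(m), key=lambda i: C[i]), C

# Hypothetical stand-in measures, for demonstration only:
def toy_entropy(h):           # fuzzier (mean nearer 0.5) -> larger entropy
    return 1 - abs(2 * sum(h) / len(h) - 1)

def toy_cross_entropy(h, t):  # mean absolute deviation from the ideal value t
    return sum(abs(v - t) for v in h) / len(h)
```

Passing the paper's measures of Equations 44 and 43 (or 49 and 48) in place of the toys turns this sketch into Method I proper.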

If we use the maximizing deviation method57 to derive the weight vector of the attributes in Step 2, and use the TOPSIS method58 to compare the alternatives in Steps 3 and 4, then the following method can be obtained:

Method II

Step 1. See Method I.

Step 2. Calculate the attribute weight wj of the attribute Gj by the maximizing deviation method:

$$ w_j = \frac{\sum_{i=1}^{m} \sum_{k=1}^{m} d(\alpha_{ij}, \alpha_{kj})}{\sum_{j=1}^{n} \sum_{i=1}^{m} \sum_{k=1}^{m} d(\alpha_{ij}, \alpha_{kj})}, \qquad j = 1, 2, \ldots, n \qquad (57) $$


where d(αij, αkj) is the distance between αij and αkj defined by Xu and Xia,42,43 such that for two HFEs α and β of the same length l (extended if necessary, as in the cited work), the distance between α and β, denoted as d(α, β), is given as

$$ d(\alpha, \beta) = \frac{1}{l} \sum_{i=1}^{l} \left| \alpha_{\sigma(i)} - \beta_{\sigma(i)} \right| \qquad (58) $$

where ασ(i) and βσ(i) are the ith largest values of α and β, respectively.
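Equation 58 admits a direct transcription, assuming (as holds within each column of the example below) that the two HFEs already contain the same number of values:

```python
def hfe_distance(alpha, beta):
    """Hesitant normalized Manhattan distance of Equation 58.

    alpha and beta are lists of membership values of equal length l;
    values are compared after sorting, matching the sigma(i) ordering.
    """
    if len(alpha) != len(beta):
        raise ValueError("extend the shorter HFE first (see refs 42, 43)")
    a, b = sorted(alpha), sorted(beta)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)
```

For example, for α = {0.2, 0.4, 0.7} and β = {0.4, 0.6, 0.7} (the G1 evaluations of Y1 and Y2 in Table I), d(α, β) = 0.4/3 ≈ 0.1333.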

Step 3. Calculate the distance between the alternative Yi and the positive-ideal solution α+ = (α1+, α2+, . . . , αn+) or the negative-ideal solution α− = (α1−, α2−, . . . , αn−):

$$ d^{+}(Y_i) = \sum_{j=1}^{n} w_j\, d(\alpha_{ij}, \alpha_j^{+}), \qquad i = 1, 2, \ldots, m \qquad (59) $$

$$ d^{-}(Y_i) = \sum_{j=1}^{n} w_j\, d(\alpha_{ij}, \alpha_j^{-}), \qquad i = 1, 2, \ldots, m \qquad (60) $$

Step 4. Calculate the closeness degree of the alternative Yi to the ideal solution α+ by using

$$ D(Y_i) = \frac{d^{-}(Y_i)}{d^{-}(Y_i) + d^{+}(Y_i)}, \qquad i = 1, 2, \ldots, m \qquad (61) $$

Step 5. Rank the alternatives Yi (i = 1, 2, . . . , m) according to the values of D(Yi) (i = 1, 2, . . . , m) in descending order; the larger the value of D(Yi), the better the alternative Yi, since a large D(Yi) means that Yi is far from the negative-ideal solution and close to the positive-ideal solution.
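Method II can likewise be sketched compactly. One modeling choice below is ours, not stated explicitly in the text: when evaluating Equation 58 against a singleton ideal solution, the ideal value is replicated to the length of the HFE being compared.

```python
def method_2(matrix, benefit):
    """Rank alternatives by Method II (max-deviation weights + TOPSIS)."""
    def dist(a, b):  # Equation 58; assumes equal-length HFEs, sorted pairing
        a, b = sorted(a), sorted(b)
        return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

    m, n = len(matrix), len(matrix[0])
    # Step 2: maximizing deviation weights (Equation 57)
    dev = [sum(dist(matrix[i][j], matrix[k][j])
               for i in range(m) for k in range(m)) for j in range(n)]
    w = [dj / sum(dev) for dj in dev]
    # Step 3: weighted distances to the ideal solutions (Equations 59, 60);
    # the singleton ideal is replicated to each HFE's length (our assumption)
    d_plus, d_minus = [], []
    for i in range(m):
        dp = dm = 0.0
        for j in range(n):
            l = len(matrix[i][j])
            ideal_p = [1.0] * l if benefit[j] else [0.0] * l
            ideal_m = [0.0] * l if benefit[j] else [1.0] * l
            dp += w[j] * dist(matrix[i][j], ideal_p)
            dm += w[j] * dist(matrix[i][j], ideal_m)
        d_plus.append(dp)
        d_minus.append(dm)
    # Step 4: closeness degrees (Equation 61); Step 5: larger D is better
    D = [dm / (dm + dp) for dp, dm in zip(d_plus, d_minus)]
    return sorted(range(m), key=lambda i: D[i], reverse=True), D
```

Applied to the decision matrix of Example 5 below, this sketch reproduces the reported weight vector wb and the ranking Y2 ≻ Y4 ≻ Y1 ≻ Y3 up to rounding of the printed four-digit values.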

Example 5 (adapted from Boran et al.59). An automotive company desires to select the most appropriate supplier for one of the key elements in its manufacturing process. After pre-evaluation, four suppliers remain as alternatives for further evaluation. To evaluate the alternative suppliers, four attributes are considered:

G1: Product quality;
G2: Relationship closeness;
G3: Delivery performance;
G4: Price,

where G1, G2, and G3 are the benefit attributes, and G4 is the cost attribute. To get the optimal alternative, the following steps are given if Method I is used:

Step 1. The decision organization provides all the possible assessments of the alternative Yi on the attribute Gj, which can be considered as an HFE αij, constructing the hesitant fuzzy decision matrix D = (αij)4×4 (see Table I). For example, to evaluate the degrees to which the alternative Y1 satisfies the attribute G1, some experts in the decision organization provide 0.2, some provide 0.4, and the others provide 0.7, and these three parts cannot persuade each other; therefore, the


Table I. Hesitant fuzzy decision matrix.

      G1               G2                   G3                        G4
Y1    {0.2,0.4,0.7}    {0.1,0.2,0.5,0.7}    {0.2,0.3,0.5,0.7,0.8}     {0.1,0.4,0.6}
Y2    {0.4,0.6,0.7}    {0.1,0.2,0.4,0.6}    {0.3,0.4,0.6,0.8,0.9}     {0.1,0.2,0.4}
Y3    {0.2,0.3,0.6}    {0.3,0.4,0.5,0.9}    {0.2,0.4,0.6,0.7,0.8}     {0.3,0.4,0.8}
Y4    {0.2,0.3,0.5}    {0.2,0.3,0.5,0.7}    {0.4,0.6,0.7,0.8,0.9}     {0.1,0.2,0.7}

Table II. Entropy matrix.

      G1        G2        G3        G4
Y1    0.9800    0.9350    0.9920    0.9267
Y2    0.9800    0.8750    0.9600    0.7133
Y3    0.9200    0.9750    0.9880    0.9800
Y4    0.8867    0.9750    0.8680    0.8533

degrees to which the alternative Y1 satisfies the attribute G1 can be considered as an HFE {0.2, 0.4, 0.7}.

Step 2. Suppose that the information about the attribute weight wj of the attribute Gj is completely unknown; then we can calculate the entropy matrix by Equation 44 (q = 2) (see Table II).

By Equation 53, we can obtain the weight vector:

wa = (0.1957, 0.2013, 0.1611, 0.4419)T
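The passage from Table II to wa can be checked numerically. The snippet below only implements Equation 53 on the tabulated entropies; it does not recompute Equation 44 itself:

```python
# Entropy matrix of Table II: rows Y1..Y4, columns G1..G4
E_matrix = [
    [0.9800, 0.9350, 0.9920, 0.9267],
    [0.9800, 0.8750, 0.9600, 0.7133],
    [0.9200, 0.9750, 0.9880, 0.9800],
    [0.8867, 0.9750, 0.8680, 0.8533],
]
m, n = len(E_matrix), len(E_matrix[0])
# E_j is the column mean of the entropy matrix over the m alternatives
E = [sum(row[j] for row in E_matrix) / m for j in range(n)]
# Equation 53: w_j = (1 - E_j) / (n - sum of all E_j)
w = [(1 - Ej) / (n - sum(E)) for Ej in E]
print([round(wj, 4) for wj in w])  # → [0.1957, 0.2013, 0.1611, 0.4419]
```

The result matches the reported wa exactly to four decimals.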

Step 3. Utilize Equation 43 (let q = 2) to calculate the cross-entropy between the alternative Yi and the positive-ideal solution or the negative-ideal solution:

C+(Y1) = 0.2850, C+(Y2) = 0.2040, C+(Y3) = 0.3127, C+(Y4) = 0.2648
C−(Y1) = 0.3329, C−(Y2) = 0.4275, C−(Y3) = 0.2834, C−(Y4) = 0.3747

Step 4. Calculate the closeness degree of the alternative Yi to the ideal solution by Equation 56:

C(Y1) = 0.4613, C(Y2) = 0.3230, C(Y3) = 0.5245, C(Y4) = 0.4141

Step 5. Rank the alternatives Yi (i = 1, 2, 3, 4) according to the values of C(Yi) (i = 1, 2, 3, 4) in ascending order:

Y2 ≻ Y4 ≻ Y1 ≻ Y3
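Steps 4 and 5 follow from the reported cross-entropies alone: Equation 56 and an ascending sort. Differences in the fourth decimal place against the printed closeness values are rounding artifacts of the four-digit C± inputs:

```python
C_plus  = [0.2850, 0.2040, 0.3127, 0.2648]   # C+(Y1)..C+(Y4) from Step 3
C_minus = [0.3329, 0.4275, 0.2834, 0.3747]   # C-(Y1)..C-(Y4) from Step 3
# Equation 56: closeness of each alternative to the ideal solution
C = [cp / (cp + cm) for cp, cm in zip(C_plus, C_minus)]
# Step 5: sort ascending -- the smaller C(Yi), the better Yi
ranking = sorted(range(4), key=lambda i: C[i])
print(["Y%d" % (i + 1) for i in ranking])  # → ['Y2', 'Y4', 'Y1', 'Y3']
```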

If we use Equations 48 and 49 (let p = 3) in the proposed method, then the following steps are given:

Step 1. See the above.


Step 2. Utilize Equations 49 and 53 to calculate the attribute weight vector:

w = (0.1801, 0.2123, 0.1615, 0.4460)T

Step 3. Utilize Equation 48 to calculate the cross-entropy between the alternative Yi and the positive-ideal solution or the negative-ideal solution:

C+(Y1) = 0.2916, C+(Y2) = 0.2144, C+(Y3) = 0.3169, C+(Y4) = 0.2676
C−(Y1) = 0.3393, C−(Y2) = 0.4297, C−(Y3) = 0.2929, C−(Y4) = 0.3817

Step 4. Calculate the closeness degree of the alternative Yi to the ideal solution by Equation 56:

C(Y1) = 0.4622, C(Y2) = 0.3329, C(Y3) = 0.5197, C(Y4) = 0.4121

Step 5. According to the values of C(Yi) (i = 1, 2, 3, 4), we get the same ranking of Yi (i = 1, 2, 3, 4): Y2 ≻ Y4 ≻ Y1 ≻ Y3.

If Method II is used, then the following steps are given:

Step 1. See Method I.

Step 2. Calculate the attribute weight wj of the attribute Gj by Equation 57:

wb = (0.2629, 0.2229, 0.2057, 0.3086)T
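The weight vector wb follows from Table I and Equations 57 and 58 alone. A short reproduction (the HFEs within each column of Table I happen to have equal lengths, so no extension step is needed):

```python
# Hesitant fuzzy decision matrix of Table I: rows Y1..Y4, columns G1..G4
H = [
    [[0.2, 0.4, 0.7], [0.1, 0.2, 0.5, 0.7], [0.2, 0.3, 0.5, 0.7, 0.8], [0.1, 0.4, 0.6]],
    [[0.4, 0.6, 0.7], [0.1, 0.2, 0.4, 0.6], [0.3, 0.4, 0.6, 0.8, 0.9], [0.1, 0.2, 0.4]],
    [[0.2, 0.3, 0.6], [0.3, 0.4, 0.5, 0.9], [0.2, 0.4, 0.6, 0.7, 0.8], [0.3, 0.4, 0.8]],
    [[0.2, 0.3, 0.5], [0.2, 0.3, 0.5, 0.7], [0.4, 0.6, 0.7, 0.8, 0.9], [0.1, 0.2, 0.7]],
]

def d(a, b):  # Equation 58 on equal-length HFEs
    a, b = sorted(a), sorted(b)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

m, n = len(H), len(H[0])
# Numerators of Equation 57: total pairwise deviation within each column
dev = [sum(d(H[i][j], H[k][j]) for i in range(m) for k in range(m))
       for j in range(n)]
wb = [dj / sum(dev) for dj in dev]
print([round(x, 4) for x in wb])  # → [0.2629, 0.2229, 0.2057, 0.3086]
```

The result matches the reported wb exactly to four decimals.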

Step 3. Calculate the distance between the alternative Yi and the positive-ideal solution or the negative-ideal solution:

d−(Y1) = 0.4958, d−(Y2) = 0.5815, d−(Y3) = 0.4788, d−(Y4) = 0.5280
d+(Y1) = 0.5042, d+(Y2) = 0.4185, d+(Y3) = 0.5213, d+(Y4) = 0.4720

Step 4. Calculate the closeness degree of the alternative Yi to the ideal solution by Equation 61:

D(Y1) = 0.4958, D(Y2) = 0.5815, D(Y3) = 0.4788, D(Y4) = 0.5280

Step 5. Rank the alternatives Yi (i = 1, 2, 3, 4) according to the values of D(Yi) (i = 1, 2, 3, 4) in descending order: Y2 ≻ Y4 ≻ Y1 ≻ Y3, which is the same ranking as that in Method I. Note that here d+(Yi) + d−(Yi) = 1 (up to rounding) for each i, so D(Yi) coincides with d−(Yi).

If we use the hesitant fuzzy weighted averaging (HFWA) operator46 to aggregate the hesitant fuzzy information for each alternative, then we can utilize the score function46 to obtain the score of each alternative, S(Yi) (i = 1, 2, 3, 4). Suppose


that the weight vector of the attributes, wa = (0.1957, 0.2013, 0.1611, 0.4419)T, is obtained from Method I; then we have

S(Y1) = 0.4503, S(Y2) = 0.4683, S(Y3) = 0.5268, S(Y4) = 0.4793

Ranking the alternatives Yi (i = 1, 2, 3, 4) according to the values of S(Yi) (i = 1, 2, 3, 4), we obtain the same ranking result: Y2 ≻ Y4 ≻ Y1 ≻ Y3.

From the above analysis, we find that although these three methods produce the same ranking of the alternatives, the first focuses on entropy and cross-entropy measures, the second on distance measures, and both are suitable for situations in which the weight vector of the attributes is unknown; the last is suitable only for situations in which the weight vector of the attributes is known. The first two methods are much simpler than the last, because the aggregation operators in the last method require a lot of computation, which is avoided in the first two. In addition, the weight vector obtained by the first method is based on the entropy method, which focuses on the fuzziness of the provided information, while the weight vector obtained by the second is based on the maximizing deviation method, which focuses on the deviations among the decision information. Therefore, we should choose the proper method according to the practical problem.

5. CONCLUDING REMARKS

In this paper, we have developed some information measures for the hesitant fuzzy environment, including entropy, cross-entropy, and similarity measures. Axiomatic definitions of these three information measures have been given for HFEs. We have proven several theorems showing that the entropy, cross-entropy, and similarity measures of HFEs can be transformed into each other on the basis of their axiomatic definitions. Two hesitant fuzzy multiattribute decision-making methods have been developed that permit the decision-maker to provide several possible values for an alternative under a given attribute, which is consistent with humans' hesitant thinking. The proposed methods avoid complex computations and are more applicable than the existing ones.

Acknowledgment

This work was supported by the National Natural Science Foundation of China (No. 71071161).

References

1. Zadeh LA. Probability measures of fuzzy events. J Math Anal Appl 1968;23:421–427.
2. De Luca A, Termini S. A definition of nonprobabilistic entropy in the setting of fuzzy sets theory. Inf Control 1972;20:301–312.
3. Zadeh LA. Fuzzy sets and systems. In: Proc Symp Systems Theory, Polytechnic Institute of Brooklyn, New York; 1965. pp 29–37.
4. Kaufmann A. Introduction to the theory of fuzzy sets: fundamental theoretical elements. Vol 1. New York: Academic Press; 1975.
5. Yager RR. On the measure of fuzziness and negation. Part 1: Membership in the unit interval. Int J Gen Syst 1979;5:221–229.
6. Bhandari D, Pal NR. Some new information measures for fuzzy sets. Inf Sci 1993;67:209–228.
7. Fan JL. Some new fuzzy entropy formulas. Fuzzy Sets Syst 2002;128:277–284.
8. Kosko B. Addition as fuzzy mutual entropy. Inf Sci 1993;73:273–284.
9. Liu XC. Entropy, distance measure and similarity measure of fuzzy sets and their relations. Fuzzy Sets Syst 1992;52:305–318.
10. Parkash O, Sharma PK, Mahajan R. New measures of weighted fuzzy entropy and their applications for the study of maximum weighted fuzzy entropy principle. Inf Sci 2008;178:2389–2395.
11. Shang XG, Jiang WS. A note on fuzzy information measures. Pattern Recognit Lett 1997;18:425–432.
12. Zadeh LA. The concept of a linguistic variable and its application to approximate reasoning—I. Inf Sci 1975;8:199–249.
13. Atanassov K. Intuitionistic fuzzy sets. Fuzzy Sets Syst 1986;20:87–96.
14. Pawlak Z. Rough sets: Theoretical aspects of reasoning about data. Dordrecht, The Netherlands: Kluwer; 1991.
15. Burillo P, Bustince H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst 1996;78:305–316.
16. Zeng WY, Li HX. Relationship between similarity measure and entropy of interval valued fuzzy sets. Fuzzy Sets Syst 2006;157:1477–1484.
17. Zhang QS, Jiang SY, Jia BG, Luo SH. Some information measures for interval-valued intuitionistic fuzzy sets. Inf Sci 2010;180:5130–5145.
18. Szmidt E, Kacprzyk J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst 2001;118:467–477.
19. Sen D, Pal SK. Generalized rough sets, entropy, and image ambiguity measures. IEEE Trans Syst Man Cybern B 2009;39:117–128.
20. Busoniu L, Ernst D, Schutter BD, Babuska R. Cross-entropy optimization of control policies with adaptive basis functions. IEEE Trans Syst Man Cybern B 2011;41:196–209.
21. Grzegorzewski P. Distances between intuitionistic fuzzy sets and/or interval-valued fuzzy sets based on the Hausdorff metric. Fuzzy Sets Syst 2004;148:319–328.
22. Hung WL, Yang MS. Similarity measures of intuitionistic fuzzy sets based on Hausdorff distance. Pattern Recognit Lett 2004;25:1603–1611.
23. Hung WL, Yang MS. Similarity measures of intuitionistic fuzzy sets based on Lp metric. Int J Approx Reason 2007;46:120–136.
24. Hung WL, Yang MS. On similarity measures between intuitionistic fuzzy sets. Int J Intell Syst 2008;23:364–383.
25. Li DF, Cheng CT. New similarity measures of intuitionistic fuzzy sets and application to pattern recognitions. Pattern Recognit Lett 2002;23:221–225.
26. Li DF, Wang YC, Liu S, Shan F. Fractional programming methodology for multi-attribute group decision-making using IFS. Appl Soft Comput 2009;9:219–225.
27. Li YH, Olson DL, Qin Z. Similarity measures between intuitionistic fuzzy (vague) sets: A comparative analysis. Pattern Recognit Lett 2007;28:278–285.
28. Liang ZZ, Shi PF. Similarity measures on intuitionistic fuzzy sets. Pattern Recognit Lett 2003;24:2687–2693.
29. Mitchell HB. On the Dengfeng–Chuntian similarity measure and its application to pattern recognition. Pattern Recognit Lett 2003;24:3101–3104.
30. Szmidt E, Kacprzyk J. Distances between intuitionistic fuzzy sets. Fuzzy Sets Syst 2000;114:505–518.
31. Xu ZS, Chen J. An overview of distance and similarity measures of intuitionistic fuzzy sets. Int J Uncertain Fuzziness Knowl-Based Syst 2008;16:529–555.
32. Xu ZS, Yager RR. Intuitionistic and interval-valued intuitionistic fuzzy preference relations and their measures of similarity for the evaluation of agreement within a group. Fuzzy Optim Decis Mak 2009;8:123–139.
33. Xia MM, Xu ZS. Some new similarity measures for intuitionistic fuzzy values and their application in group decision making. J Syst Sci Syst Eng 2010;19:430–452.
34. Vlachos IK, Sergiadis GD. Intuitionistic fuzzy information—applications to pattern recognition. Pattern Recognit Lett 2007;28:197–206.
35. Hung WL, Yang MS. On the J-divergence of intuitionistic fuzzy sets with its application to pattern recognition. Inf Sci 2008;178:1641–1650.
36. Xia MM, Xu ZS. Entropy/cross entropy-based group decision making under intuitionistic fuzzy environment. Inf Fusion 2012;13:31–47.
37. Zeng WY, Guo P. Normalized distance, similarity measure, inclusion measure and entropy of interval-valued fuzzy sets and their relationship. Inf Sci 2008;178:1334–1342.
38. Zhang QS, Jiang SY. Relationships between entropy and similarity measure of interval-valued intuitionistic fuzzy sets. Int J Intell Syst 2010;25:1121–1140.
39. Atanassov K, Gargov G. Interval valued intuitionistic fuzzy sets. Fuzzy Sets Syst 1989;31:343–349.
40. Torra V. Hesitant fuzzy sets. Int J Intell Syst 2010;25:529–539.
41. Torra V, Narukawa Y. On hesitant fuzzy sets and decision. In: The 18th IEEE Int Conf Fuzzy Systems, Jeju Island, Korea; 2009. pp 1378–1382.
42. Xu ZS, Xia MM. On distance and correlation measures of hesitant fuzzy information. Int J Intell Syst 2011;26:410–425.
43. Xu ZS, Xia MM. Distance and similarity measures for hesitant fuzzy sets. Inf Sci 2011;181:2128–2138.
44. Dubois D, Prade H. Fuzzy sets and systems: Theory and applications. New York: Academic Press; 1980.
45. Miyamoto S. Remarks on basics of fuzzy sets and fuzzy multisets. Fuzzy Sets Syst 2005;156:427–431.
46. Xia MM, Xu ZS. Hesitant fuzzy information aggregation in decision making. Int J Approx Reason 2011;52:395–407.
47. Yang MS, Lin DC. On similarity and inclusion measures between type-2 fuzzy sets with an application to clustering. Comput Math Appl 2009;57:896–907.
48. Wang TJ, Lu ZD, Li F. Bidirectional approximate reasoning based on weighted similarity measures of vague sets. J Comput Eng Sci 2002;24:96–100.
49. Pal SK, King RA. Image enhancement using smoothing with fuzzy sets. IEEE Trans Syst Man Cybern 1981;11:495–501.
50. Ye J. Fuzzy decision-making method based on the weighted correlation coefficient under intuitionistic fuzzy environment. Eur J Oper Res 2010;205:202–204.
51. Lin J. Divergence measures based on Shannon entropy. IEEE Trans Inf Theory 1991;37:145–151.
52. Fan ZP, Liu Y. An approach to solve group decision making problems with ordinal interval numbers. IEEE Trans Syst Man Cybern B 2010;40:1413–1423.
53. Xu ZS, Chen J. MAGDM linear-programming models with distinct uncertain preference structures. IEEE Trans Syst Man Cybern B 2008;38:1356–1370.
54. Xu ZS, Yager RR. Intuitionistic fuzzy Bonferroni means. IEEE Trans Syst Man Cybern B 2011;41:169–177.
55. Chou SY, Chang YH, Shen CY. A fuzzy simple additive weighting system under group decision-making for facility location selection with objective/subjective attributes. Eur J Oper Res 2008;189:132–145.
56. Yeh CH, Chang YH. Modeling subjective evaluation for fuzzy group multicriteria decision making. Eur J Oper Res 2009;194:464–473.
57. Wang YM. Using the method of maximizing deviations to make decision for multi-indices. J Syst Eng Electron 1998;7:24–26, 31.
58. Hwang CL, Yoon K. Multiple attribute decision making methods and applications. Berlin: Springer; 1981.
59. Boran FE, Genc S, Kurt M, Akay D. A multi-criteria intuitionistic fuzzy group decision making for supplier selection with TOPSIS method. Expert Syst Appl 2009;36:11363–11368.