
Computers in Industry 65 (2014) 1–23

Review

Interoperability evaluation models: A systematic review

Reza Rezaei a,b,*, Thiam Kian Chiew a, Sai Peck Lee a, Zeinab Shams Aliee a

a Department of Software Engineering, Faculty of Computer Science & Information Technology, University of Malaya, 50603 Kuala Lumpur, Malaysia
b Department of Computer Engineering, Saveh Branch, Islamic Azad University, Saveh, Iran

Contents

1. Introduction
2. Interoperability issues
3. Interoperability evaluation models
   3.1. Spectrum of Interoperability Model
   3.2. Quantification of interoperability methodology
   3.3. Military communications and information systems interoperability
   3.4. Levels of information systems interoperability
        3.4.1. The LISI assessment products
        3.4.2. Potential interoperability matrix
   3.5. Organizational interoperability maturity model for C2
   3.6. Interoperability assessment methodology
        3.6.1. Interoperability components
        3.6.2. Assessment process
   3.7. Stoplight
   3.8. Enterprise interoperability maturity model
   3.9. The layered interoperability score
   3.10. Government interoperability maturity matrix
4. Comparative analysis of the interoperability evaluation models
   4.1. Data interoperability
   4.2. Process interoperability
   4.3. Rules interoperability
   4.4. Objects interoperability
   4.5. Software systems interoperability
   4.6. Cultural interoperability
   4.7. Knowledge interoperability
   4.8. Services interoperability
   4.9. Social networks interoperability
   4.10. Electronic identity interoperability
   4.11. Cloud interoperability
   4.12. Ecosystems interoperability
5. Discussion
6. Conclusions
Acknowledgements
References

ARTICLE INFO

Article history:
Received 14 January 2013
Received in revised form 22 July 2013
Accepted 6 September 2013
Available online 3 October 2013

Keywords:
Interoperability
Evaluation
Measurement
Assessment

ABSTRACT

Interoperability is defined as the ability of two (or more) systems or components to exchange information and to use the information that has been exchanged. There is increasing demand for interoperability between individual software systems. Developing an interoperability evaluation model for software and information systems is difficult, and has become an important challenge. An interoperability evaluation model makes it possible to determine the degree of interoperability achieved and leads to the improvement of interoperability. This paper describes the existing interoperability evaluation models and performs a comparative analysis of their findings to determine the similarities and differences in their philosophy and implementation. This analysis yields a set of recommendations for any party that is open to the idea of creating or improving an interoperability evaluation model.

© 2013 Elsevier B.V. All rights reserved.


* Corresponding author. Tel.: +60 176834318.

E-mail addresses: [email protected] (R. Rezaei), [email protected] (T.K. Chiew), [email protected] (S.P. Lee), [email protected] (Z. Shams Aliee).

0166-3615/$ – see front matter © 2013 Elsevier B.V. All rights reserved.

http://dx.doi.org/10.1016/j.compind.2013.09.001


1. Introduction

As a multidimensional concept, interoperability can be viewed from numerous perspectives and approached from various directions [1–3]. Interoperability is a broad and complex subject, and numerous definitions have been given for it. For instance, the following four definitions of interoperability have been given by IEEE [4,5]: (1) "The ability of two or more systems or elements to exchange information and to use the information that have been exchanged"; (2) "The capability for units of equipment to work efficiently together to provide useful functions"; (3) "The capability – promoted but not guaranteed – achieved through joint conformance with a given set of standards, that enables heterogeneous equipments, generally built by various vendors, to work together in a network environment"; (4) "The ability of two or more systems or components to exchange and use the exchanged information in a heterogeneous network" [6].

The US Department of Defense has also introduced multiple definitions of interoperability, some of which incorporate the IEEE definitions: (1) "The ability of systems, units, or forces to provide services to and accept services from other systems, units, or forces, and to use the services so exchanged to enable them to operate effectively together" [7]; (2) "The condition achieved among communications-electronics systems or items of communications-electronics systems equipment when information or services can be exchanged directly and satisfactorily between them and/or their users. The degree of interoperability should be defined when referring to specific cases" [8]; (3) (a) "Ability of information systems to communicate with each other and exchange information. (b) Conditions, achieved in varying levels, when information systems and/or their components can exchange information directly and satisfactorily between them. (c) The ability to operate software and exchange information in a heterogeneous network (i.e., one large network comprised of several different local area networks). (d) Systems or programs capable of exchanging information and operating together effectively" [9].

Attaining interoperability requires resolution at several distinct levels. According to Refs. [10–19], there are four levels of interoperability: technical, syntactic, semantic, and organizational interoperability.

(1) Technical interoperability is achieved among communications-electronics systems or items of communications-electronics equipment when services or information can be exchanged directly and satisfactorily between them and their users [20,21]. In referring to specific cases, the degree of interoperability must be defined [22–24]. Technical interoperability is typically associated with hardware/software components, systems, and platforms that enable machine-to-machine communication. This type of interoperability often focuses on communication protocols and the infrastructure required for those protocols to function [25–27].

(2) Syntactic interoperability is defined as the ability to exchange data. Syntactic interoperability is generally associated with data formats. The messages transferred by communication protocols should possess a well-defined syntax and encoding, even if only in the form of bit-tables [25,28].

(3) Semantic interoperability is defined as the ability to operate on that data according to agreed-upon semantics [29]. Semantic interoperability is normally related to the definition of content, and deals with the human rather than machine interpretation of this content. Thus, interoperability at this level denotes that a common understanding exists between people regarding the definition of the content (information) being exchanged [25,30–32].

(4) Organizational interoperability pertains to the capability of organizations to effectively communicate and transfer meaningful data (information) despite the use of a variety of information systems over significantly different types of infrastructure, possibly across various geographic regions and cultures [33]. Organizational interoperability relies on the successful interoperability of the technical, syntactic, and semantic aspects [25,34,35].

For example, in Hospital Information Systems (HIS), interoperability is the ability of medical informatics systems to provide services to, or to access services from, other medical informatics systems, and to use those services to operate effectively together. If interoperability is evaluated in the healthcare informatics field, knowing the strengths and weaknesses makes it possible to render the system more interoperable or to anticipate the potential for interoperability between two or more systems [36]. Another example of interoperability evaluation would be where flight information is passed between the (separate) booking systems of two airlines. Interoperability evaluation would test whether the information reached the target system and still meant the same thing to the target system as to the sending system.

Developing an interoperability evaluation model for software and information systems is therefore difficult, and has become an important challenge [37–43]; for the same reason, implementing such a model is extremely problematic [44–48].

To this end, this paper presents the existing interoperability evaluation models and provides an overview of their main concepts and recommendations. Additionally, this paper performs a comparative analysis of the existing interoperability evaluation models to determine the similarities and differences in their philosophy and implementation. This analysis yields a set of recommendations for any party that is open to the idea of creating or improving an interoperability evaluation model.

The structure of the paper is as follows: Section 2 outlines the interoperability issues. An introduction to the available interoperability evaluation models is presented in Section 3. Section 4 compares the interoperability evaluation models under study on the basis of the interoperability issues proposed in Section 2. A discussion of the findings is conducted in Section 5, leading to conclusions in Section 6.

2. Interoperability issues

This section describes a set of interoperability issues observed in the results of the FP7 ENSEMBLE project [49]. The interoperability issues are categorized into four different granularity levels. The interoperability issues that belong to a higher granularity level are regarded as supersets of the interoperability issues that belong to a lower level [49]. According to the four granularity levels, Fig. 1 illustrates an overview of the identified interoperability issues. Each interoperability issue is further detailed in the following sections.

In accordance with the scope of this paper, and in alignment with [49] (Fig. 1), the comparative analysis of the existing interoperability evaluation models is performed over, and additionally extended across, the following granularity levels.

The first granularity level of interoperability issues consists of data interoperability, process interoperability, rules interoperability, objects interoperability, software systems interoperability, and cultural interoperability.

The second granularity level of interoperability issues focuses on:

• Knowledge interoperability, which consists of elements coming out of data interoperability, process interoperability, rules interoperability, and cultural interoperability;

• Services interoperability, which incorporates facts from process interoperability, data interoperability, rules interoperability, and software systems interoperability;

• Social networks interoperability, consisting of elements coming out of cultural interoperability and data interoperability; and

• Electronic identity interoperability, which is strongly related to objects interoperability, software systems interoperability, and rules interoperability.

The third granularity level of interoperability issues includes cloud interoperability, which takes elements from services interoperability, knowledge interoperability, and identity interoperability and tries to infuse them with cloud characteristics.

Lastly, the fourth granularity level of interoperability issues involves ecosystems interoperability, which deals with virtual and digital enterprises and is related to cloud interoperability and social networks interoperability.

Fig. 1. Interoperability issues.

In this paper, the methodological approach to the analysis of the interoperability evaluation models is thus based on the following steps:

1. The contents of the interoperability evaluation models are extracted and analyzed (Section 3).

2. A comparison of the interoperability evaluation models is detailed based on the following interoperability issues: data, process, rules, objects, software systems, cultural, knowledge, services, social networks, electronic identity, cloud, and ecosystems interoperability (Section 4).

3. A comparison and contrast of the various interoperability evaluation models and a discussion of the lessons learned follow the presentation of the comparison matrices in Section 5.

3. Interoperability evaluation models

Extensive research has been conducted on interoperability evaluation models [50]. This section provides a review of all of the existing evaluation models for interoperability produced since 1980. The interoperability evaluation models were identified through a search of relevant articles published between 1980 and 2012 available in the Web of Science database. Google Scholar was also adopted as a tool to complement the search. Considering their importance, some of the interoperability evaluation models are described in more detail.

3.1. Spectrum of Interoperability Model

LaVean [51] stated, in the Institute of Electrical and Electronics Engineers (IEEE) Transactions on Communications, that the interoperability among systems was weak because of a "lack of a measure of interoperability by which to state goals for specific systems." To overcome this deficiency, he developed a spectrum of interoperability model. He developed two critical measures of assigned interoperability levels, namely technical possibility and management/control possibility, stating that by "combining these two measures, it is possible to derive a spectrum of interoperability that permits cost-versus-benefits tradeoffs". LaVean's [51] recognition of possible differences among the interoperability levels of each specific service that two systems provide for each other led him to devise a visualization method (an interoperability matrix) that lists the services on the matrix rows and the interoperability levels on the columns. Furthermore, for the purpose of showing the evolution of the interoperability of the systems across time, he introduced a current view and a future view of the interoperability matrix [52]. The intention behind developing the "Spectrum of Interoperability Model" was to provide a convenient tool for system managers to evaluate their systems' current status, define interoperability goals for the future, and visually track the present status in relation to the future [51].

3.2. Quantification of interoperability methodology

Mensh et al. [53] introduced a method called the Quantification of Interoperability Methodology, which forms the foundation for the Levels of Information Systems Interoperability model. Mensh et al.'s approach to interoperability measurement is unique because they associated interoperability with measures of effectiveness. Their goal was to assess interoperability issues for three mission areas: wide area surveillance, over-the-horizon targeting, and electronic warfare [53]. They stated that "interoperability of systems, units, or forces can be factored into a set of components that can quantify interoperability" and identified the seven necessary components as languages, standards, environment, procedures, requirements, human factors, and media. To each component they allocated a measure-of-effectiveness logic function and used it to create a truth table that was filled through discrete-event simulation.

Mensh et al. [53] stated that their "methodology for quantifying interoperability is being pursued"; however, they emphasized that "additional exercises will be required and are currently in the planning stages."
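Mensh et al.'s paper does not reproduce the logic functions themselves, so the following sketch is only a hypothetical illustration of the approach: it assumes each of the seven components reduces to a pass/fail measure of effectiveness, with one truth-table row per exercise and overall interoperability taken as the conjunction of all components. The threshold and scores are invented.

```python
# Hypothetical sketch of Mensh et al.'s component-based quantification:
# each of the seven components is scored by a measure-of-effectiveness
# (MoE) predicate, and a truth-table row records which components pass.
# The component names come from the paper; the predicates, threshold,
# and scores are illustrative assumptions.

COMPONENTS = ["languages", "standards", "environment", "procedures",
              "requirements", "human factors", "media"]

def truth_table_row(moe_scores: dict, threshold: float = 0.5) -> dict:
    """Map each component's MoE score (0..1) to a pass/fail entry."""
    return {c: moe_scores.get(c, 0.0) >= threshold for c in COMPONENTS}

def interoperable(row: dict) -> bool:
    """Overall interoperability holds only if every component passes."""
    return all(row.values())

# Example: one simulated exercise outcome.
scores = {"languages": 0.9, "standards": 0.8, "environment": 0.7,
          "procedures": 0.6, "requirements": 0.9, "human factors": 0.4,
          "media": 0.8}
row = truth_table_row(scores)
print(row, "->", "interoperable" if interoperable(row) else "not interoperable")
```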

3.3. Military communications and information systems interoperability

Amanowicz and Gajewski [54] introduced a model for measuring interoperability called Military Communications and Information Systems Interoperability (MCISI) to mathematically model the interoperability of Communications and Information Systems (CIS) [54]. Given that interoperability modeling incorporates operational requirements, standards, CIS data, interfaces, and modeling facilities, they used a colored cube to visualize the MCISI model. One axis of the cube represented the command level, the second indicated the CIS services, and the third represented the transmission medium [54]. The intersections had their own colors: red represented no interoperability, yellow partial interoperability, and green full interoperability of a specific service via a specific medium at a specified level of command. Furthermore, Amanowicz and Gajewski [54] explained that a set of systems is represented by a number of points within a multi-dimensional environment, where the features of the systems constitute the coordinates of the points. They then defined a normalized "distance" between each two points as d(A,B), and stated that where d(A,B) = 0 the systems A and B have acquired full interoperability, while where d(A,B) > 1 the two systems' interoperability is reduced. By considering dendrite (broken line connecting all points in a set) arrangements of systems, they cover a set of systems, maintaining that the most suitable arrangement is the one in which the dendrite has the shortest length [54].
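Ref. [54] gives only the idea of a normalized distance over system feature coordinates, not a formula; the sketch below is one plausible reading, assuming each feature is pre-scaled to [0, 1] and the distance is a normalized Euclidean metric (the specific formula is an assumption, not taken from the paper).

```python
import math

def normalized_distance(a: list[float], b: list[float]) -> float:
    """Normalized Euclidean distance between two systems' feature
    vectors, each coordinate assumed pre-scaled to [0, 1]; dividing
    the squared sum by n keeps the result in [0, 1]."""
    if len(a) != len(b):
        raise ValueError("systems must be described by the same features")
    n = len(a)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / n)

# Two systems described by three normalized features
# (e.g. command level, service coverage, medium support).
sys_a = [1.0, 0.8, 0.6]
sys_b = [1.0, 0.8, 0.6]
print(normalized_distance(sys_a, sys_b))  # 0.0 -> full interoperability
```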

3.4. Levels of information systems interoperability

The Levels of Information Systems Interoperability (LISI) model was developed in 1998 by the US Department of Defense C4ISR Working Group [55]. LISI is a reference model that provides a standard process for assessing the interoperability of information systems [56]. In other words, it is a procedure for defining, measuring, assessing, and certifying the degree of interoperability required or achieved by and between organizations or systems [57]. LISI evaluates the level of interoperability attained between systems. A representation of the levels of the LISI model is given in Fig. 2 [55].

The LISI model focuses on enhancing interoperability levels of complexity within the systems [55,58]. The five interoperability levels (0–4) are Isolated, Connected, Functional, Domain, and Enterprise, and each interoperability level exists in a specific environment.

Level 0 – Isolated interoperability in a manual environment: the isolated interoperability level includes a wide range of standalone, or isolated, systems. No direct connection is allowed between these systems, and their interface is manual. This interoperability level involves manual data extraction and integration across multiple systems.

Level 1 – Connected interoperability in a peer-to-peer environment: interoperability at the connected level depends on an electronic connection between systems with some form of simple electronic data exchange. At this level, shared data types are homogeneous, such as simple text email, graphics, and voice, and the capacity for fusing information is limited for decision makers.

Level 2 – Functional interoperability in a distributed environment: at the functional interoperability level, systems are located on local area networks that permit data transfer from one system to another. Increasingly sophisticated media exchanges are provided at this level, and systems share logical data models with each other. At the functional interoperability level, heterogeneous data contained in a simple information format is combined, and the fused information is shared between functions and systems.

Level 3 – Domain-based interoperability in an integrated environment: at the domain-based interoperability level, the connection between systems is via wide area networks (WANs) that permit several users to access data. At this level, independent applications exchange information with each other using agreed-upon domain data models. Systems at the domain-based interoperability level support group collaboration in information combination, and are permitted to implement business rules and processes to facilitate direct database-to-database interactions.

Level 4 – Enterprise-based interoperability in a universal environment: at the enterprise-based interoperability level, systems are allowed to use a distributed global information space across multiple domains. At this level, complex data can be accessed by multiple users simultaneously; applications and data are fully shared and can be distributed to support fused information. In addition, advanced forms of collaboration are possible at this level. A common data interpretation is applied across the entire enterprise regardless of format.

The LISI Reference Model is the foundation of the LISI process. The five LISI interoperability levels are illustrated in rows, and four columns demonstrate that the attributes of the LISI Reference Model comprise Procedures, Applications, Infrastructure, and Data (PAID). The broad classification of level/attribute intersections facilitates addressing the required specific capabilities. Consequently, in LISI, interoperability aspects are categorized into four unified attributes:

Fig. 2. The LISI model (the interoperability levels 0–4 set against the Procedures, Applications, Infrastructure, and Data attributes).

Procedure attributes include numerous forms of operational controls and documented guidance that influence all aspects of system integration, development, and operational functionality. The procedure attributes address the architecture guidance and standards, policies and procedures, and doctrine that enable information exchanges between systems.

Application attributes include the system mission, which is the fundamental purpose of system building, and the functional requirements of the system. These attributes indicate applications that permit processing, exchange, and manipulation.

Infrastructure attributes support the establishment and use of a connection among applications or systems. These attributes include the environments enabling the interaction, such as system services, networks, hardware, et cetera.

Data attributes focus on the information processes of the system, and cover both the data format (syntax) and its content or meaning (semantics). These data attributes of interoperability include the protocols and formats enabling information and data interchanges.

Fig. 3. The interoperability metrics of the LISI model (metric types: G = Generic, E = Expected, S = Specific; levels: 4 = Enterprise, 3 = Domain, 2 = Functional, 1 = Connected, 0 = Isolated; sub-levels vary by level and are written a through z; a LISI level in short form reads, e.g., G2, and with its sub-level, G2b).

The most important aspect of utilizing the LISI model for the assessment of interoperability lies in its valuable feature of expressing the results in the form of an interoperability metric. The LISI metric quantitatively represents the "interoperability degree" obtained by the systems. The tools that help in determining the degree of interoperability are an Interoperability Questionnaire, acting as the data accumulation source, and the LISI Capabilities Model, which operates as a template for measurements [55].

LISI measurements aim to capture the possibility of interactions amongst the systems. In other words, such a measurement clearly determines the outcome of a comparison between systems with regard to the interoperability capability incorporated within each individual system [54].

Various styles of metric exist for LISI model interoperability based on the nature, goal, and strategy used for performing the comparison and displaying the results. Fig. 3 shows a typical configuration of the various options available for the explanation of LISI metrics.

The LISI metric provides a shorthand definition of the particularform of interoperability as expressed in the LISI model.

As described in Fig. 3, there are three types of LISI metrics based on three kinds of relationships being measured. The main distinction between these types is between the comparison of a single system against the LISI model (generic) and the two different cases where two or more systems are compared to each other (expected and specific). The three metric types are addressed below:

Interoperability generic level: the generic level is calculated for a single system and is expressed as a mathematically calculated value obtained by comparing the single system against the LISI Capabilities Model [59]. The generic level represents, in practice, the overall set of capabilities across PAID implemented by a system. The generic interoperability level of any given system is determined by the highest level in the LISI Capabilities Model at which all the PAID capabilities are implemented (without dependency on any particular implementation alternatives). This entails that a system must have implementations for each individual capability across the PAID attributes.

Expected level of interoperability: the expected level of interoperability is assessed for a pair of systems and is the level anticipated using the LISI model as a reference, but without performing an implementation-by-implementation comparison between the two systems [60]. The expected interoperability level between one system and another is defined as the lower of the two systems' generic levels, that is, the level at which the two systems are expected to interoperate with each other [61]. This expected level rests on the premise that any two systems must be capable of interoperating at a certain level if each of them possesses the set of generic capabilities necessary for achieving the information exchange types of that level.

Specific level of interoperability: the specific level of interoperability is defined as the metric value calculated between two systems by comparing the implementation alternatives each of the systems has used for the registered PAID abilities. The specific level is the highest level at which the two systems have documented interoperable implementations throughout all the PAID aspects. It may differ from the expected level owing to items added to the LISI Options Tables and/or different considerations of technical implementation criteria.
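As a concrete reading of these definitions, the sketch below derives a system's generic level from a boolean PAID capability profile (the highest level at which all four attributes are implemented) and a pair's expected level as the minimum of the two generic levels. The profile encoding, the contiguity assumption, and the data are assumptions for illustration, not part of the LISI specification.

```python
# Minimal sketch of LISI generic and expected levels, assuming a
# capability profile maps each level (0-4) to the set of PAID attributes
# (Procedures, Applications, Infrastructure, Data) implemented there.
PAID = {"P", "A", "I", "D"}

def generic_level(profile: dict[int, set[str]]) -> int:
    """Highest level at which every PAID attribute is implemented,
    assuming all lower levels must be satisfied as well."""
    level = -1
    for lvl in sorted(profile):
        if profile[lvl] >= PAID:   # all four attributes present
            level = lvl
        else:
            break                  # a gap caps the generic level
    return max(level, 0)

def expected_level(profile_a, profile_b) -> int:
    """Expected level of a pair: the lower of the two generic levels."""
    return min(generic_level(profile_a), generic_level(profile_b))

sys_a = {0: PAID, 1: PAID, 2: PAID, 3: {"P", "A"}}
sys_b = {0: PAID, 1: PAID, 2: {"P", "A", "I"}}
print(generic_level(sys_a), generic_level(sys_b), expected_level(sys_a, sys_b))
# -> 2 1 1
```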

3.4.1. The LISI assessment products

In this section, the most important LISI products are addressed in detail. These are the products that are used, constructed, or analyzed directly in the interoperability assessment of systems [44].

The interoperability data collection tool: to gather the relevant information necessary for assessing the interoperability of information systems, the LISI model makes use of an Interoperability Questionnaire.

Interoperability profiles: data collected by the LISI Questionnaire are mapped to the template of the LISI Capabilities Model using the Interoperability Profiles [17].

The implementation choices are therefore captured by the profiles for each of the PAID capabilities existing in the system(s) under assessment, in a format that facilitates the required comparison at system-to-level as well as system-to-system scales [62]. System metrics are derived from the profiles. The Interoperability Profile of a system is depicted in Fig. 4 as a notional example. In this example, the system's generic interoperability level is 2c, the highest level at which a capability is implemented for each of the PAID attributes.

3.4.2. Potential interoperability matrix

A Potential Interoperability Matrix can be generated for a group of systems on the basis of the generic interoperability level of each individual system and the specific interoperability level of each system pair within the group.

In this example (see Fig. 5), systems are represented as S1, S2, S3, et cetera. The first (gray) shaded row and column next to the system name contain the generic interoperability level for each system. In Fig. 5, S1 (System 1) has a generic interoperability level of "2" while S3 (System 3) has a generic interoperability level of "3". The intersections throughout the matrix contain the specific interoperability level between each pair of systems identified on the two axes. For example, the specific interoperability level between systems S1 and S2 is shown as "2" and the specific interoperability level between systems S2 and S4 is "1". Color is used to highlight whether the specific interoperability level is less than, equal to, or greater than the expected interoperability level.

Green shows the equivalence of the expected and specific levels; in this case, the system pairs have consistent implementation options for the group of capabilities that determine the interoperability level they have achieved. Red denotes a specific level lower than the expected level, meaning that one of the two systems is using at least one implementation of some important capability that differs from the other, making it impossible for the two systems to maintain interoperability at the expected level [63]. Blue shows a specific level higher than the expected level, meaning that both systems possibly use dedicated interfaces or some other shared implementations making them capable of interacting at a level above the expected one [64].
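A minimal sketch of this color coding, assuming the generic and specific levels are already known; the example values are taken from the Fig. 5 discussion above.

```python
def classify(specific: int, generic_a: int, generic_b: int) -> str:
    """Compare a pair's specific level with its expected level
    (the minimum of the two generic levels) and return the Fig. 5
    color code."""
    expected = min(generic_a, generic_b)
    if specific < expected:
        return "red"    # specific below expected
    if specific == expected:
        return "green"  # specific equals expected
    return "blue"       # specific exceeds expected

# From Fig. 5: S1 and S2 both have generic level 2, specific level 2.
print(classify(2, 2, 2))  # green
# S2 (generic 2) and S4 (generic 2) have specific level 1.
print(classify(1, 2, 2))  # red
```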

Fig. 5. Potential interoperability matrix. The shaded (gray) row and column hold each system's generic level; the interior cells hold the specific level for each pair:

           Generic | S1  S2  S3  S4  S5  S6  S7
System 1      2    |
System 2      2    |  2
System 3      3    |  2   2
System 4      2    |  3   1   2
System 5      1    |  1   0   0   2
System 6      3    |  2   2   4   2   1
System 7      3    |  2   2   3   2   1   3
System 8      1    |  1   0   1   1   1   1   1

(Blue) Specific exceeds Expected. (Green) Specific equals Expected. (Red) Specific less than Expected.

Fig. 4. A sample system's interoperability profile.

3.5. Organizational interoperability maturity model for C2

In 1998, the organizational interoperability maturity model was developed [61]. This model extends the LISI model into the more abstract layers of command and control support. Describing the ability to interoperate, five levels of organizational maturity are defined in this model. Fig. 6 illustrates the organizational interoperability maturity model in detail [61].

Fig. 6. Organizational interoperability maturity model [61]. The attribute values at each level are:

Level 4 (Unified): preparedness - complete, normal day-to-day working; understanding - shared; command style - homogeneous; ethos - uniform.
Level 3 (Combined): preparedness - detailed doctrine and experience in using it; understanding - shared comms and shared knowledge; command style - one chain of command and interaction with home org; ethos - shared ethos but with influence from home org.
Level 2 (Collaborative): preparedness - general doctrine in place and some experience; understanding - shared comms and shared knowledge about specific topics; command style - separate reporting lines of responsibility overlaid with a single command chain; ethos - shared purpose, with goals and value system significantly influenced by home org.
Level 1 (Ad hoc): preparedness - general guidelines; understanding - electronic comms and shared information; command style - separate reporting lines of responsibility; ethos - shared purpose.
Level 0 (Independent): preparedness - no preparedness; understanding - communication via phone etc.; command style - no interaction; ethos - limited shared purpose.

Level 0 – independent: the interaction between independent organizations is described at this level of interoperability. This level includes organizations that normally work without any interaction except that provided by personal contact. It also covers organizations that do not share common purposes or goals, but that might need to interoperate in unprecedented scenarios. Unanticipated and unplanned arrangements occur at this interoperability level. Although no formal frameworks are in place at this level, communication is possible through personal contact in meetings and by fax and telephone.

Level 1 – ad hoc: only a few organizational frameworks are in place that could support ad hoc arrangements at this interoperability level. A set of guidelines is provided to explain how interoperability will take place, but specific arrangements remain essentially unplanned. At this organizational interoperability level, there are some overarching shared goals; however, individual organizations' aspirations take precedence, and the organizations remain totally distinct.

Level 2 – collaborative: at this organizational level of interoperability, recognized frameworks are in place to support interoperability, and shared goals are identified. In addition, roles and responsibilities are allocated as part of on-going responsibilities, though the organizations are still distinct. Knowledge sharing and training occur at the collaborative organizational interoperability level; however, home organizations' frameworks still exert a significant influence.

Level 3 – integrated: this interoperability level includes shared goals and value systems, a common understanding, and a preparedness to interoperate, along with a detailed doctrine in whose use there is significant experience. The frameworks at the integrated level of organizational interoperability are in place and practiced, although residual attachments to a home organization remain.

Level 4 – unified: this interoperability level shares organizational goals, value systems, command structure/style, and knowledge bases across the system. At the unified level of organizational interoperability, the organization interoperates on a continuing basis. This level is considered the ideal level, since there is no impediment in the organizational frameworks to complete and full interoperation. The unified level of organizational interoperability is likely to occur only in very homogeneous organizations.

Preparedness, Understanding, Command style, and Ethos are the four enabling attributes that have been identified for organizational interoperability; they are defined as follows [61]:

Preparedness: this attribute describes the preparedness of the organization to interoperate. It is made up of training, experience, and doctrine.

Understanding: the understanding attribute measures the amount of communication and sharing of information and knowledge in the organization, and the way information is used.

Command style: the command style attribute describes the management and command style of the organization, the way decisions are made, and the way roles and responsibilities are assigned.

Ethos: the goals and aspirations of the organization, the culture and value systems of the organization, and the level of trust in the organization are related to the ethos attribute.

Several researchers have reviewed the organizational interoperability maturity model since its initial introduction [10,45,50,65,66].

3.6. Interoperability assessment methodology

The interoperability assessment methodology was published initially in the Proceedings of the 66th Military Operations Research Society (MORS) Symposium, three months after the publication of the LISI model; further revisions of the model were published in 1999 and 2003. It is not known whether the author knew about the LISI effort, but in his paper he made reference to Mensh et al.'s quantification of interoperability. Like the quantification of interoperability, the interoperability assessment methodology is based on the idea of "measurement and quantification of a set of interoperability system components" [67]. The interoperability assessment methodology introduced nine components (contrary to the quantification of interoperability, which used seven): requirements, node connectivity, data elements, protocols, information flow, information utilization, interpretation, latency, and standards. Each of the nine components involved either a "yes/no" response or a mathematical equation. Leite [67] further defined "degrees of interconnection", which included availability, connectivity, understanding, interpretation, utility, feedback, and execution. He described the interoperability assessment methodology using a flowchart and applied the process to the Navy's Tactical Ballistic Missile Defense Program.

3.6.1. Interoperability components

The interoperability assessment methodology components identified by Leite [67] are described as follows:

1. Requirements: any system, or components thereof, for which interoperability is considered must have requirements in common. Without such requirements, system developers and acquisition managers have no obligation to deliver interoperable systems.

2. Standards: the interoperability standards define the transmitting and receiving nodes, the message content, and the media used for carrying the data (data link) between the nodes [68]. The systems must achieve a common implementation of the standards to be interoperable.

3. Data elements: at this point the standards and requirements have been thoroughly examined. If the assessments in those areas are positive, we can claim that the flow of information among the nodes is established via a common format with suitable data rates. However, interoperability is not yet assured; the next important step is assessing the content of the data stream.

4. Node connectivity: because node connectivity is a time-dependent variable, over both discrete and continuous time intervals, it is among the more troublesome elements for measuring interoperability. In simple terms, connectivity is the ability to send and receive data at any time. This implies that the transmitter and receiver are both up and that the link is available. For any interoperable system the operator has control of the medium and equipment; the environment represents those items which are outside the operator's direct control.

For a communication system, an index called the connectivity index is calculated, showing the relationship between the number of nodes in the system and the paths available between them. The following equation defines the connectivity index:

\[ C_i = \frac{k}{n(n-1)} \tag{1} \]

where $C_i$ is the connectivity index, $k$ the number of connections (paths between nodes), and $n$ the number of nodes (participating units).

Measuring connectivity is performed directly by counting the total number of messages initially launched by all the units participating in communication and the number of messages received over the network or data communication link. As long as the link is working, the sampled connectivity represents the network connectivity. Should the network operate intermittently, the sample needs to be carefully selected and tested, so that the necessary confidence level is assured. The following general relationship is used for measuring connectivity:

\[ C = \frac{1}{n_r} \cdot \frac{\sum_{y=1}^{n_r} (M_r)_y}{\sum_{x=1}^{n_t} (M_t)_x} \tag{2} \]

where $C$ is the node connectivity (during the measurement period), $n_r$ the number of receiving nodes, $n_t$ the number of transmitting nodes, $M_t$ the messages transmitted by a node, and $M_r$ the messages received by a node.
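Eqs. (1) and (2) translate directly into code; the message counts below are invented for illustration.

```python
def connectivity_index(k: int, n: int) -> float:
    """Eq. (1): ratio of available paths k to the n(n-1) possible
    directed paths between n participating nodes."""
    return k / (n * (n - 1))

def node_connectivity(received: list[int], transmitted: list[int]) -> float:
    """Eq. (2): ratio of received to transmitted messages, averaged
    over the n_r receiving nodes."""
    n_r = len(received)
    return (1 / n_r) * (sum(received) / sum(transmitted))

# Four nodes, ten of the twelve possible directed paths available.
print(connectivity_index(10, 4))                          # ~0.833
# Three receivers logged 90, 85, 80 receipts of 3 x 100 transmissions.
print(node_connectivity([90, 85, 80], [100, 100, 100]))   # ~0.283
```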

5. Protocols: protocols facilitate access to the data stream. On the transmitting side, the protocol initiates the polling sequences, the time allocations of transmittance, and the data that can be transmitted. The protocols on the receiving side determine the data filtered out or sent to the user. The foundation on which the protocols are based is the data exchange requirements, the data volume, and the available capacity of the data link/processor.

6. Information flow: data volume is normally a function of the operations' tempo and the area of interest (AOI), which is determined by the operational commander. The operations' tempo is event-driven, but estimations can still be performed on historical and exercise results.

Capacity is determined by the number of data links available. In practice, multiple links or paths are available. For combat systems and war instruments, primary and backup paths are required. Data flow redundancy restricts the total capacity to a value lower than the sum of each individual system.

In connection with system performance there are several items to be measured or calculated, including capacity, data latency, and system overload. The following relationships are used for these measurements:

Capacity: a system's capacity is the rate of data passage over time. The maximum data rate, given the involved operating parameters, is calculated for a system or a group of systems using the following relationship:

\[ Q_{eff} = (Q_{max} - Q_{oh}) \times (t_f - t_p) \tag{3} \]

where $Q_{eff}$ is the effective system capacity (data rate), $Q_{max}$ the maximum data rate, $Q_{oh}$ the system overhead data rate, $t_f$ the time slot duration (unit transmission), and $t_p$ the unit propagation time.

System overload: a system overload occurs when more data must be exchanged than the system is able to transmit. Typically, the overload is placed in a queue and is then transmitted when capacity is available. Therefore, the measure of system overload is the sum of the messages remaining in queues after their assigned transmission period, over all system nodes:

\[ M_{OL} = \sum_{y=1}^{n_t} (M_q)_y \tag{4} \]

where $M_{OL}$ is the system message overload, $n_t$ the number of transmitting nodes, and $M_q$ the messages in queue to be transmitted by a node.

Underutilization: when the data rate/message load of the system is lower than full capacity while messages are waiting in queues for transmission, we are faced with an underutilization situation. In other words, this occurs if the transmission allocation of a number of selected nodes is lower than what is required to empty their queues by the end of the transmission period, while other nodes cannot use their allocated time spans:

\[ Q_{uu} = M_{OL} \quad \text{for } M_{OL} \le (Q_{eff} - Q) \tag{5} \]

\[ Q_{uu} = Q_{eff} - Q \quad \text{for } M_{OL} > (Q_{eff} - Q) \tag{6} \]

where $Q_{uu}$ is the system underutilization (data rate) and $Q$ the measured/observed data rate (other terms as previously defined).

Undercapacity: Undercapacity occurs when messages are waiting in queues while the system's data rate is at its maximum.

Q_{uc} = (Q + M_{OL}) - Q_{eff}    (7)

which must be > 0, where Q_{uc}: system undercapacity (data rate) (other terms as previously defined).
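The capacity-related measures above compose naturally. Below is a hedged sketch of Eqs. (3)-(7) under the readings reconstructed above (in particular, the product form of Eq. (3) is an assumption); variable names mirror the text, and the inputs are assumed to be observed values.

```python
# Illustrative sketch of Eqs. (3)-(7). The product form of Eq. (3) is an
# assumed reading of the source; all inputs are assumed observed values.

def effective_capacity(q_max: float, q_oh: float, t_f: float, t_p: float) -> float:
    """Eq. (3): Q_eff = (Q_max - Q_oh) * (t_f - t_p)."""
    return (q_max - q_oh) * (t_f - t_p)

def message_overload(queued_per_node: list[int]) -> int:
    """Eq. (4): M_OL = sum of messages left queued at each transmitting node."""
    return sum(queued_per_node)

def underutilization(m_ol: float, q_eff: float, q: float) -> float:
    """Eqs. (5)-(6): Q_uu = M_OL while M_OL <= (Q_eff - Q), else Q_eff - Q."""
    headroom = q_eff - q
    return m_ol if m_ol <= headroom else headroom

def undercapacity(q: float, m_ol: float, q_eff: float) -> float:
    """Eq. (7): Q_uc = (Q + M_OL) - Q_eff, meaningful only when > 0."""
    return (q + m_ol) - q_eff
```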

7. Data latency: the time span from the event time to the time the message is received by the user, that is, the tactical data processor. The latency is usually divided into smaller segments for analytical objectives. Some of the common time periods are:

• The event time to time of observation.
• Time of observation to processing completion time.
• Time of completion of processing to the receipt time of the tactical data processor.

This division can be useful in situations where a remote sensor and intermediary processing to reduce the data to an exploitable form (track message) occur before passing them to the user. The relevant relationships are as follows:

\Delta t = t_r - t_e    (8)

\Delta t_o = t_o - t_e    (9)

\Delta t_m = t_m - t_o    (10)

\Delta t_r = t_r - t_m    (11)

and Eq. (8) may be rewritten as:

\Delta t = \Delta t_o + \Delta t_m + \Delta t_r    (12)

where \Delta t: time latency; \Delta t_o: latency of observation; \Delta t_m: latency of measurement/processing; \Delta t_r: latency of transmission/receipt; t_e: time of event; t_o: time of observation; t_m: time of completion of processing; t_r: time of receipt.
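The decomposition in Eqs. (8)-(12) is straightforward to compute from timestamps; a minimal sketch, assuming all timestamps are on a common clock, is shown below (names are illustrative).

```python
# Sketch of the latency decomposition in Eqs. (8)-(12); timestamps are
# assumed to share a common clock (e.g., seconds since an epoch).

def latency_breakdown(t_e: float, t_o: float, t_m: float, t_r: float) -> dict:
    d_to = t_o - t_e   # Eq. (9): event -> observation
    d_tm = t_m - t_o   # Eq. (10): observation -> processing complete
    d_tr = t_r - t_m   # Eq. (11): processing complete -> receipt
    d_t = t_r - t_e    # Eq. (8): end-to-end latency
    # Eq. (12) holds by construction: the segments sum to the total.
    assert abs(d_t - (d_to + d_tm + d_tr)) < 1e-9
    return {"total": d_t, "observation": d_to,
            "processing": d_tm, "transmission": d_tr}

print(latency_breakdown(t_e=0.0, t_o=1.5, t_m=3.0, t_r=3.2))
```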

8. Interpretation: After the consistency of the transmitted data set is ensured, the interpretation of the data by each individual processor must be investigated.

9. Information utilization: Having passed the data and correctly interpreted it, the next step would be to verify that the proper action is taken. Verification of the action taken involves a review of the logic associated with every possible option in response to a message or operator action. These deal with questions of interoperability and not with the difficult, higher-level topic of measuring mission effectiveness. These data would be qualitative in nature, perhaps binary (i.e., successful vs. failed). Some suggested measures in this area include:

• Percentage of initial transmission messages received correctly by shooters.
• Percentage of consistency/disparity of redundant data sources.
• Number of attempts needed to establish connections.
• Delay in sending critical command messages and time to receive and acknowledge messages.

3.6.2. Assessment process

Now that every part of the interoperability puzzle has been reviewed, it is possible to develop an assessment process for objectively assessing system interoperability. Many steps of the assessment process must be answered with a "yes/no" or "go/no-go"; a negative answer represents no interoperability. For the elements that require calculations, results lower than the optimum values show the possibility of interoperability, but with degraded quality. Degraded interoperability can be considered as systems functioning with an imperfect data set. Most of the time, it is the outcome of reduced connectivity or, sometimes, system overload.
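As a hedged sketch of this go/no-go logic (the component names and optimum values below are invented for illustration, not taken from the methodology itself): binary checks gate interoperability outright, while computed measures below their optimum indicate degraded but present interoperability.

```python
# Illustrative sketch of the go/no-go assessment logic described above.
# Component names and optimum values are invented for illustration.

def assess(binary_checks: dict[str, bool], measures: dict[str, float],
           optimum: dict[str, float]) -> str:
    if not all(binary_checks.values()):
        return "not interoperable"           # any "no-go" ends the assessment
    degraded = [k for k, v in measures.items() if v < optimum[k]]
    return f"degraded ({', '.join(degraded)})" if degraded else "interoperable"

print(assess({"protocols match": True, "data elements consistent": True},
             {"connectivity": 0.8, "effective capacity": 950.0},
             {"connectivity": 1.0, "effective capacity": 1000.0}))
# -> degraded (connectivity, effective capacity)
```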

The interoperability assessment process is shown in Fig. 7. Testing must be considered an integral component of the definition of the requirements, as well as of system development. Thus, testing must be essentially continuous, and "stability" is a state that is never reached in any meaningful sense. Without continuous feedback, initial execution of the processes and systems may result in satisfactory interoperation at first, but not at a later time.

The results of this study show that interoperability must be considered for systems at the design stage, especially when the system requirements are defined. Developers can assign a number of components or characteristics that all together provide an objective interoperability assessment. Analysis of such characteristics is then transformed into a process or applicable flow chart by the analyst for determining the system interoperability [67].

Fig. 7. Interoperability assessment process.

Fig. 8. Enterprise interoperability maturity model [70].


3.7. Stoplight

Hamilton Jr et al. [69] introduced a very simple model for interoperability measurement called the Stoplight model. They maintained that "interoperability is notoriously difficult to measure"; still, they introduced a model for measuring it. This model is aimed at helping decision makers determine whether or not the legacy system they use can meet the operational and acquisition interoperability requirements. It is designed as a matrix in which "meets operational requirements (yes/no)" is shown in the rows and "meets acquisition requirements (yes/no)" appears in the columns. The matrix intersections are colored red, yellow, orange, and green; this hierarchical set of colors is assigned depending on how well each specific requirement is fulfilled. Hamilton Jr et al. also introduced an example of color coding that can be adapted on a timeline to represent future improvements of the planned interoperability, though it was not institutionalized in the Department of Defense. Together, the two methods yield the four-colored system of interoperability assessment shown in Tables 1 and 2 [69].
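Since the Stoplight matrix of Table 1 reduces to a two-bit lookup, it can be captured in a few lines; a minimal sketch for illustration:

```python
# The Stoplight matrix of Table 1 as a two-bit lookup.

def stoplight(meets_operational: bool, meets_acquisition: bool) -> str:
    colors = {(True, True): "green", (True, False): "yellow",
              (False, True): "orange", (False, False): "red"}
    return colors[(meets_operational, meets_acquisition)]

print(stoplight(meets_operational=True, meets_acquisition=False))  # yellow
```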

3.8. Enterprise interoperability maturity model

The European Commission elaborated the enterprise interoperability maturity model through ATHENA IP (Advanced Technologies for interoperability of Heterogeneous Enterprise Networks and their Applications Integrated Project) [70].

In the enterprise interoperability maturity model, a set of maturity levels and a set of areas of concern are defined (see Fig. 8). Each area of concern is defined by a set of objectives and goals relevant to interoperability and collaboration issues. Depending on the absence or presence of the maturity indicators, an interoperability and collaboration maturity level is defined for each area of concern. To achieve a certain maturity level, each indicator is required to fulfill the threshold states or values specified for the relevant maturity level. At the same time, if the company aims to attain the next maturity level, the To-Be status needs to be considered.
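A hedged sketch of this threshold rule follows: an area of concern sits at the highest maturity level whose indicator thresholds are all met, with levels treated as cumulative. The indicator names and threshold values are invented for illustration; the model itself defines its own indicators per area of concern.

```python
# Hedged sketch of the threshold rule: an area of concern achieves the
# highest level whose indicator thresholds are all met. Indicator names
# and values are invented for illustration.

LEVELS = ["Performed", "Modeled", "Integrated", "Interoperable", "Optimizing"]

def area_maturity(indicators: dict[str, float],
                  thresholds: dict[str, dict[str, float]]) -> str:
    achieved = "Performed"  # assumed baseline level
    for level in LEVELS[1:]:
        required = thresholds.get(level, {})
        if all(indicators.get(name, 0.0) >= value
               for name, value in required.items()):
            achieved = level
        else:
            break  # levels are cumulative: stop at the first unmet level
    return achieved

print(area_maturity({"process documentation": 0.9, "model reuse": 0.4},
                    {"Modeled": {"process documentation": 0.5},
                     "Integrated": {"model reuse": 0.6}}))  # Modeled
```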

The six areas of concern in the enterprise interoperability maturity model are defined as follows [70]:

Table 1. Stoplight model.

                                      Meets acquisition requirements?
                                      Yes        No
Meets operational requirements?  Yes  Green      Yellow
                                 No   Orange     Red

Table 2. Stoplight color definition and implications.

Green: The system meets its interoperability requirement set and has no known interoperability problems. Fielded system without known issues that meets all documented requirements.

Yellow: The system does not meet its interoperability requirement set, but has no known interoperability problems. Documented requirements do not reflect operational use of the system.

Orange: The system meets its interoperability requirement set, but has known interoperability problems. Revisit requirements and determine if requirements are adequate.

Red: The system does not meet its interoperability requirement set, and has known interoperability problems. Improvement, migration and/or action plans need to be put in place.

Business Strategy and Processes: The Business Strategy and Processes area of concern covers the alignment, improvement, execution, specification, and identification of business strategy and processes. Achieving interoperability in this area of concern requires improving the collaborative processes, both for units inside the organization and with external entities.

Organization and Competences: The Organization and Competences area of concern covers the improvement, enactment, specification, and identification of the organizational structure, including the skills and knowledge of the identified players. Achieving interoperability in this area of concern requires identifying the external entities to collaborate with, specifying the networked organization topology, and its improvement and deployment.

Products and Services: The Products and Services area of concern covers the design, specification, and identification of the organization's products and services, their lifecycle strategy, and their quality characteristics. Achieving interoperability in this area of concern requires identifying new opportunities, and specifying the same aspects for new products and services that make use of networked technologies for their delivery: e-Products and e-Services.

Systems and Technology: The Systems and Technology area of concern covers the improvement, maintenance, operation, acquisition/construction, design, specification, and identification of enterprise systems. This includes the establishment of links and traceability to enterprise models, at best self-controlled. Achieving interoperability in this area of concern requires research and evolution of enterprise systems to apply innovative technologies that foster interoperability.

Legal Environment, Security and Trust: The Legal Environment, Security and Trust area of concern covers the identification of legal, trust, and security requirements that arise from collaborating with external entities, and the provision of solutions to manage these aspects, which are key to interoperability.

Enterprise Modeling: All of the areas of concern identified previously are directly affected by an all-embracing sixth area of concern. The Enterprise Modeling area of concern covers the improvement, application, construction, and specification of the enterprise models, and contains supporting activities for enterprise modeling, such as identifying the appropriate languages and meta-models, infrastructure, methodologies, organization (skills and people), and so on. Moreover, it deals with the interoperability within enterprise models.

The five maturity levels of the enterprise interoperability maturity model are identified as follows:

Performed: At the Performed maturity level, enterprise modeling and collaboration are done, but in an ad-hoc approach. Collaborations occur between the organization and external entities (customers, administration, suppliers), although the relationships are not planned thoughtfully. At this level, collaborative processes and tasks usually exceed the schedule and budget. Besides, their past success cannot be repeated, and the potential of the technology is not used properly.

Modeled: At the Modeled maturity level, the enterprise modeling and collaboration is done similarly each time, and the technique has been found applicable. At this level, the defined approaches and meta-models are applied. Additionally, responsibilities are defined, the enterprise model is understood by people, and they are familiar with its execution. Besides, in order to collaborate, network technologies are used.

Integrated: At the Integrated maturity level, the enterprise modeling process has been formally documented and communicated, and is used consistently. At this level, organizations use a defined infrastructure and methodology for enterprise modeling, the different dimensions are integrated among themselves, the model is traceable to the enterprise systems, and a knowledge base is used for improving the models. Besides, business collaboration is facilitated via interoperability technologies, using standards, and externalization of part of the enterprise models.

Interoperable: At the Interoperable maturity level, dynamic interoperability, adaptation to changes, and the evolution of external entities are supported by enterprise models. At this level, the people's workplace is seamlessly adapted to the enterprise model. Results (for the involved people and organizations) and the process metrics are defined as a basis for continuous improvement.

Optimizing: At the Optimizing maturity level, the enterprise models permit the organization to adapt and react to changes in the business environment in a responsive, flexible, and agile manner. At this level, enterprise systems are systematically traced to enterprise models, and innovative technologies are continuously researched and applied to improve interoperability. At the Optimizing maturity level, enterprise modeling can contribute to achieving the overall goals of the involved organization, people, or unit.

3.9. The layered interoperability score

As a method of interoperability measurement for all kinds of systems, the Layered Interoperability Score (i-Score) is used in an operational process context [71,72]. This method uses current architecture data and can involve more than one interoperability type. What distinguishes the i-Score method is the mechanism it uses for determining an empirical upper limit of interoperability for the systems that support the operational process. The i-Score method can accept custom layers, allowing the analyst to offset the i-Score measurement for an unlimited number of interoperability-related performance factors, including bandwidth, mission capability rate, protocols, atmospheric effects, or probability of connection, amongst others. Other possible layers are reliability, cost, schedule, and performance, which are used for measuring the impact of diverse programmatic changes on the interoperability process. The method can also be used to make non-traditional interoperability measurements, such as organizational or policy interoperability measurements. The i-Score method has not been institutionalized within the Department of Defense.

3.10. Government interoperability maturity matrix

Fig. 9 illustrates the government interoperability maturity matrix (GIMM) that was presented by Sarantis et al. [73] in 2008. The model offers administrations a simple, self-evaluation method that can be used to assess their current status concerning e-Government interoperability, and the steps required to improve their positioning with respect to system implementation and the provision of services to citizens and businesses. This maturity model expands the three interoperability types considered in the European interoperability framework (EIF), aiming to identify several Interoperability Attributes that need to be taken into account when evaluating each organization's positioning in e-Government interoperability.

Fig. 9. Government interoperability maturity levels [73].

Fig. 9 shows that the GIMM consists of a set of levels in which each level corresponds to a different interoperability level, for a set of interoperability attributes (IA). Five levels of maturity are defined in the GIMM as follows [73]:

Level 1 – Independent: At the Independent level, the interaction between independent organizations is described.

Level 2 – Ad hoc: At the Ad hoc interoperability level, only very limited organizational frameworks are in place, where ad hoc arrangements could be supported.

Level 3 – Collaborative: At this level of interoperability, recognized frameworks are in place to support interoperability. Shared goals are identified, along with allocated roles and responsibilities as part of on-going responsibilities; though, organizations are still distinct.

Level 4 – Integrated: At this interoperability level, there are shared goals and shared value systems, a common understanding, and a preparedness to interoperate with other organizations.

Level 5 – Unified: At this level of interoperability, value systems, organizational goals, command structure/style, and knowledge bases are shared among organizations.

Three main interoperability dimensions are defined in the GIMM as follows [73]:

Organizational interoperability: The organizational interoperability dimension involves defining business goals, modeling business processes, and bringing about collaboration among administrations that wish to exchange information and might have different internal structures and processes. Furthermore, organizational interoperability aims to address user community requirements by making services user-oriented, accessible, easily identifiable, and available.

Semantic interoperability: The semantic interoperability dimension involves ensuring that the precise meaning of exchanged information is understandable by any other application that was not initially developed for this purpose. This dimension enables systems to combine the received information with other information resources and to process it meaningfully. Therefore, the semantic dimension of interoperability is a prerequisite for the front-end multilingual delivery of services to users.

Table 3. The interoperability evaluation models and data interoperability matrix.

Spectrum of interoperability model: H
Quantification of interoperability methodology: H
Military communications and information systems interoperability: H
Levels of information systems interoperability: H
Organizational interoperability maturity model for C2: H
Interoperability assessment methodology: H
Stoplight: H
Enterprise interoperability maturity model: H
The layered interoperability score: H
Government interoperability maturity matrix: H


Technical interoperability: The technical interoperability dimension includes the technical issues of linking computer systems and services. This dimension contains key aspects such as security services, accessibility, data presentation and exchange, middleware and data integration, interconnection services, and open interfaces.

A remarkable number of Interoperability Attributes are defined in the GIMM, classified into the three main interoperability dimensions in a parametric score matrix [73]. A vector containing the values of the GIMM Interoperability Attributes defines the interoperability state clearly. Finally, the mathematical formulation and patterns to use in the context of an organization's interoperability state transition are presented by the Government Interoperability Maturity Matrix [73].
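To make the vector representation concrete, the following is a hedged sketch, assuming the state is a vector of attribute levels grouped by the three dimensions; the attribute names and the per-dimension summary are illustrative assumptions, not the GIMM's own formulation.

```python
# Hedged sketch of the GIMM state as an attribute-level vector grouped by
# dimension. Attribute names and the mean-based summary are assumptions.

from statistics import mean

state = {
    "organizational": {"business goals defined": 3, "services user-oriented": 2},
    "semantic":       {"shared data meaning": 4},
    "technical":      {"open interfaces": 3, "security services": 2},
}

# One simple way to summarize the vector per dimension (an assumption,
# not the GIMM's own transition formulation):
summary = {dim: mean(attrs.values()) for dim, attrs in state.items()}
print(summary)  # {'organizational': 2.5, 'semantic': 4, 'technical': 2.5}
```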

4. Comparative analysis of the interoperability evaluation models

In this section, the indications alongside each interoperability issue in each table denote the corresponding coverage in each particular interoperability evaluation model. The notation is explained below.

• H indicates that the interoperability evaluation model has adopted an approach for this criterion, without judging whether this approach provides full or partial coverage for the issue.
• X refers to the lack of a tangible approach to this issue.

4.1. Data interoperability

Data interoperability describes the ability of data (including documents, multimedia content, and digital resources) to be universally accessible, reusable, and comprehensible by all transaction parties (on a human-to-machine and machine-to-machine basis), by addressing the lack of common understanding caused by the use of different representations, different purposes, different contexts, and different syntax-dependent approaches [49,74,75]. Table 3 illustrates the mapping of data interoperability to the interoperability evaluation models.

4.2. Process interoperability

Process interoperability is defined as the ability to align processes of different entities (enterprises), in order for them to exchange data and to conduct business in a seamless way [49]. Table 4 presents the mapping of process interoperability to the interoperability evaluation models.

4.3. Rules interoperability

Rules interoperability is the ability of entities to align and match their business and legal rules for conducting legitimate automated transactions that are also compatible with each other's internal business operation rules [49,76]. Table 5 shows the mapping of rules interoperability to the interoperability evaluation models.

4.4. Objects interoperability

Objects interoperability refers to the networked interconnection and cooperation of everyday objects [77]. These objects can embrace aspects besides and beyond software components, consistent with the concept of the Internet of Things [78]. Objects can be seen as orthogonal concepts, each one having its own specific and distinguishing features. In this context, devices or hardware components interoperability can be seen as a particular case of the object interoperability domain [49,79]. The interoperability evaluation models and objects interoperability matrix are presented in Table 6.

Table 5. The interoperability evaluation models and rules interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: X
Levels of information systems interoperability: H
Organizational interoperability maturity model for C2: H
Interoperability assessment methodology: X
Stoplight: X
Enterprise interoperability maturity model: H
The layered interoperability score: H
Government interoperability maturity matrix: X

Table 4. The interoperability evaluation models and process interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: X
Levels of information systems interoperability: H
Organizational interoperability maturity model for C2: H
Interoperability assessment methodology: X
Stoplight: H
Enterprise interoperability maturity model: H
The layered interoperability score: H
Government interoperability maturity matrix: H


Table 6. The interoperability evaluation models and objects interoperability matrix.

Recovered row, models in the order listed in Table 3: X, X, H, H, H, H, X, H, H; one of the ten marks is illegible in the source.



4.5. Software systems interoperability

Software systems interoperability refers to the ability of an enterprise system or a product to work with other enterprise systems or products without special effort from the stakeholders [80]. This can be achieved with a large number of alternative IT architectures [81] and solutions [82], including custom, in-house development of APIs, message-oriented middleware and message brokers, service-oriented architecture implementations, or comprehensive stand-alone B2B software gateways [49]. Table 7 illustrates the mapping of software systems interoperability to the interoperability evaluation models.

4.6. Cultural interoperability

Cultural interoperability is the degree to which knowledge and information is anchored to a unified model of meaning across cultures. Enterprise systems that take cultural interoperability aspects into consideration can be used by transnational groups in different languages and cultures with the same domain of interest in a cost-effective and efficient manner [49]. Table 8 presents the mapping of cultural interoperability to the interoperability evaluation models.

4.7. Knowledge interoperability

Knowledge interoperability is the ability of two or more different entities to share their intellectual assets, take immediate advantage of the mutual knowledge and utilize it, and further extend it through cooperation [49]. Table 9 shows the mapping of knowledge interoperability to the interoperability evaluation models.

4.8. Services interoperability

Services interoperability can be defined as the ability of an enterprise to dynamically register, aggregate, and consume composite services of an external source, such as a business partner or an internet-based service provider, in a seamless manner [49,83,84]. The interoperability evaluation models and service interoperability matrix are presented in Table 10.

4.9. Social networks interoperability

Social networks interoperability refers to the ability of enterprises to seamlessly interconnect and utilize social networks for collaboration purposes, by aligning their internal structure to the fundamental aspects of the social networks [49,85]. Table 11 illustrates the mapping of social networks interoperability to the interoperability evaluation models.

4.10. Electronic identity interoperability

Electronic identity interoperability refers to the ability of different electronic identity systems within or across the boundaries of an enterprise to collaborate in order to automatically authenticate and authorize entities, and to pass on security roles and permissions to the corresponding electronic identity holders, regardless of the system from which they originate [49,86]. Table 12 presents the mapping of electronic identity interoperability to the interoperability evaluation models.

Table 7. The interoperability evaluation models and software systems interoperability matrix.

Spectrum of interoperability model: H
Quantification of interoperability methodology: H
Military communications and information systems interoperability: H
Levels of information systems interoperability: H
Organizational interoperability maturity model for C2: H
Interoperability assessment methodology: H
Stoplight: H
Enterprise interoperability maturity model: H
The layered interoperability score: H
Government interoperability maturity matrix: H

Table 8. The interoperability evaluation models and cultural interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: H
Military communications and information systems interoperability: X
Levels of information systems interoperability: X
Organizational interoperability maturity model for C2: H
Interoperability assessment methodology: X
Stoplight: X
Enterprise interoperability maturity model: H
The layered interoperability score: X
Government interoperability maturity matrix: X


Table 9. The interoperability evaluation models and knowledge interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: X
Levels of information systems interoperability: H
Organizational interoperability maturity model for C2: H
Interoperability assessment methodology: X
Stoplight: X
Enterprise interoperability maturity model: H
The layered interoperability score: X
Government interoperability maturity matrix: H

Table 10. The interoperability evaluation models and service interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: H
Levels of information systems interoperability: H
Organizational interoperability maturity model for C2: X
Interoperability assessment methodology: X
Stoplight: H
Enterprise interoperability maturity model: H
The layered interoperability score: X
Government interoperability maturity matrix: H


Table 11. The interoperability evaluation models and social networks interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: X
Levels of information systems interoperability: X
Organizational interoperability maturity model for C2: X
Interoperability assessment methodology: X
Stoplight: X
Enterprise interoperability maturity model: X
The layered interoperability score: X
Government interoperability maturity matrix: X

Table 12. The interoperability evaluation models and electronic identity interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: X
Levels of information systems interoperability: X
Organizational interoperability maturity model for C2: X
Interoperability assessment methodology: X
Stoplight: X
Enterprise interoperability maturity model: X
The layered interoperability score: X
Government interoperability maturity matrix: X


Table 13. The interoperability evaluation models and cloud interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: X
Levels of information systems interoperability: X
Organizational interoperability maturity model for C2: X
Interoperability assessment methodology: X
Stoplight: X
Enterprise interoperability maturity model: X
The layered interoperability score: X
Government interoperability maturity matrix: X

Table 14. The interoperability evaluation models and ecosystems interoperability matrix.

Spectrum of interoperability model: X
Quantification of interoperability methodology: X
Military communications and information systems interoperability: X
Levels of information systems interoperability: X
Organizational interoperability maturity model for C2: X
Interoperability assessment methodology: X
Stoplight: X
Enterprise interoperability maturity model: X
The layered interoperability score: X
Government interoperability maturity matrix: X



4.11. Cloud interoperability

Cloud interoperability is the ability of cloud services to work together with different cloud services and providers, as well as with other applications or platforms that are not cloud dependent [49]. The scope of interoperability covers both the links amongst different clouds and the connection between a cloud and an organization's local systems [87], in order to realize seamless, fluid data exchange across clouds and between cloud and local applications [49,88,89]. Table 13 shows the mapping of cloud interoperability to the interoperability evaluation models.

4.12. Ecosystems interoperability

Ecosystems interoperability can be defined as the ability of instant and seamless collaboration between different ecosystems and independent entities, and between entities within the ecosystems, as well as the ability of different independent entities to formulate virtual structures for specific purposes [49]. The interoperability evaluation models and ecosystems interoperability matrix are presented in Table 14.

5. Discussion

The state-of-the-art analysis of the interoperability evaluation models reveals that the existing models have made considerable efforts to support the interoperability issues in the first and second granularity levels; however, the interoperability issues in the third granularity level (cloud interoperability) and the fourth granularity level (ecosystems interoperability) are not supported in the existing interoperability evaluation models. Furthermore, the analysis of the interoperability evaluation models indicates that efforts to build an interoperability evaluation model have usually been devoted to the production of guidelines and standards addressing the following four levels of interoperability: technical, syntactic, semantic, and organizational.

Although the spectrum of interoperability model was revolutionary and, at the time, the newest method for measuring interoperability, no further mention of this model has been found since it was first published, and it is not clear whether or not program managers have used it to improve interoperability among systems [90]. For the quantification of interoperability methodology, apart from a citation made by Leite [67], no other mention or usage of the methodology can be found [90]. The military communications and information systems interoperability model was not immediately accepted for institutionalization after publication [90]. The LISI model is intriguing because it provides a detailed interoperability model with mapping between the model and implementation technologies. In addition, the LISI model intends to measure interoperability: it is required to categorize systems and indicate if they are capable of being interoperated [13,55]. The LISI model consists of processes and a maturity model to determine the interoperability requirements, and it evaluates the ability of information systems to meet those requirements. The LISI model creates a common structure and language for interoperability between organizational information systems. This creates a transition plan and practical solutions to achieve higher interoperability levels [55]. In the LISI model, there are two major concerns. Firstly, the LISI model reflects a set of standards and interoperability expectations aligned with the US Department of Defense at the time of its creation. The LISI model thus risks becoming out-dated, and the interoperability options tables need to be updated to reflect new technologies and approaches; moreover, the LISI model includes certain technological assumptions that reflect a specific technology bias [13,55]. The second concern, which is more significant, is that two systems with a high shared interoperability profile will not necessarily be highly interoperable. This occurs because differences in service qualities, the intention of using the systems, how data is used, or other factors might render two systems non-interoperable, even if the systems' technical underpinnings are identical [13]. The LISI model focuses on technical interoperability and the complexity of interoperation between systems. This model does not address the organizational and environmental issues that contribute to the construction and maintenance of interoperable systems [13]. In the organizational interoperability maturity model, organizations interoperate, but have different interoperability attributes in comparison with technical systems. It is unknown whether the Organizational Interoperability Maturity model has been institutionalized by the Australian Department of Defence. Although the interoperability assessment methodology is not institutionalized, it is referred to by Kasunic and Anderson [10], who stated that the interoperability assessment methodology attributes could be utilized to extend the LISI model at the mission slice level [10,90]. The stoplight model's purpose is to help decision makers understand whether or not their legacy systems meet operational and acquisition interoperability requirements; it is designed as a two-dimensional matrix in which "meets operational requirements (yes/no)" appears on the rows of the matrix and "meets acquisition requirements (yes/no)" appears on the columns. A set of areas of concern and a set of maturity levels are defined in the enterprise interoperability maturity model, providing the means to determine the current ability of an enterprise to collaborate with external entities and to specify the path to improve this ability. The enterprise interoperability maturity model offers an organizational context for more specific and technical improvements. Additionally, this model takes into account the targeted organizational units for which a maturity level needs to be assessed, or which need to be improved, in order to achieve a certain maturity level. The enterprise interoperability maturity model only covers organizational interoperability. The layered interoperability score model has not been institutionalized in the Department of Defense [90].

As a result, the interoperability evaluation models vary significantly in the way they evaluate interoperability issues; this is especially evident in the enterprise interoperability maturity model, the organizational interoperability maturity model for C2, and the LISI model.

With regard to the strengths of the existing interoperability evaluation models, all of the existing models cover data interoperability and software systems interoperability in the first granularity level of interoperability issues. Most of the existing models include process interoperability and objects interoperability in the first granularity level, and knowledge interoperability and services interoperability in the second granularity level. As for the weaknesses of the existing interoperability evaluation models, none of the existing interoperability frameworks contain social networks interoperability and electronic identity interoperability in the second granularity level, cloud interoperability in the third granularity level, or ecosystems interoperability in the fourth granularity level.

From a researcher's perspective, the strengths of the existing interoperability evaluation models can generally be classified as follows:

• All of the existing interoperability evaluation models cover the technical interoperability level.


• Few of the existing interoperability evaluation models contain the organizational interoperability level.

Also, in general, the weaknesses of the existing interoperability evaluation models can be described as follows:

• Of the existing interoperability evaluation models, only the enterprise interoperability maturity model and the government interoperability maturity matrix support the syntactic and semantic interoperability levels.
• In each of the existing interoperability evaluation models, different sets of interoperability attributes have been defined. There is no unique set of interoperability attributes defined across the existing interoperability evaluation models.

In this direction, the insight gained from the study of the aforementioned interoperability evaluation models is reviewed in this paper for others who intend to create an interoperability evaluation model. These realizations are summarized as follows:

An interoperability evaluation model cannot be realized by addressing only the technical interoperability level. A bottom-up approach beginning with technical interoperability is not sufficient to enable interoperability; rather, the starting point is situated at the top, with organizational interoperability. In this context, syntactic and semantic interoperability issues deserve greater priority and effort than the technical interoperability level. From the perspective of interoperability issues, interoperability evaluation models cannot be realized by addressing only data and software systems interoperability issues. Interoperability issues should be further supported by gaining a more concrete understanding of process, rules, objects, software systems, cultural, knowledge, service, social networks, electronic identity, cloud, and ecosystems interoperability issues.

Regarding the completeness of the frameworks examined, the enterprise interoperability maturity model appears to have a more complete set of interoperability issues. The enterprise interoperability maturity model is well received in interoperability evaluation and is often cited as one of the foundational documents when interoperability evaluation is discussed.

6. Conclusions

This paper presents an overview of the development of interoperability evaluation models. Several attempts have been made to develop interoperability evaluation models. Interoperability evaluation allows a company to know its strengths and weaknesses in interoperating, and to prioritize actions to improve interoperability.

Based on the findings of this research and the directions provided by the relevant literature, future perspectives on interoperability evaluation models could cover the following two areas:

(1) Scientific research that focuses on all interoperability issues, such as social networks interoperability, cloud interoperability, and ecosystems interoperability. Thus, an interoperability evaluation model should address all interoperability issues. An interoperability evaluation model should be simple and easy to understand for the convenience of developers. Structuring the interoperability evaluation model is necessary to ensure consistency and avoid redundancy.

(2) Practical research that concerns all aspects of interoperability. Therefore, an interoperability evaluation model should address all aspects of interoperability, such as interoperability levels and interoperability attributes. The interoperability evaluation model must consider the existing interoperability standards. Moreover, properties and metrics could be identified to evaluate the development of interoperability between systems. An interoperability evaluation model must be defined based on the definitions and standard concepts of interoperability and interoperability issues. Furthermore, given that more emphasis is placed today on semantic interoperability for creating a general understanding between systems, the interoperability evaluation model must focus on semantic interoperability.

Acknowledgement

This research is funded by the University of Malaya Research Grant (UMRG) RG055-11ICT.

References

[1] C.E. Kuziemsky, J.H. Weber-Jahnke, An eBusiness-based framework for eHealth interoperability, Journal of Emerging Technologies in Web Intelligence 1 (2) (2009) 129–136.

[2] E. Kajan, Electronic business interoperability: concepts, opportunities and challenges, Business Science Reference (2011).

[3] M. Javanbakht, R. Rezaie, F. Shams, M. Seyyedi, A new method for decision making and planning in enterprises, in: Information and Communication Technologies: From Theory to Applications, IEEE, Damascus, (2008), pp. 1–5.

[4] K. Breitfelder, D. Messina, IEEE 100: the authoritative dictionary of IEEE standards terms, Standards Information Network IEEE Press 879 (2000).

[5] J. Radatz, A. Geraci, F. Katki, IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990, 1990.

[6] A. Geraci, F. Katki, L. McMonegal, B. Meyer, J. Lane, P. Wilson, J. Radatz, M. Yee, H. Porteous, F. Springsteel, IEEE Standard Computer Dictionary: Compilation of IEEE Standard Computer Glossaries, IEEE Press, Piscataway, 1991.

[7] J. Pub, Joint Publication 1-02, DoD Dictionary of Military and Associated Terms, 12, 2001.

[8] D. Defense, Chairman of the Joint Chiefs of Staff Instruction, Policy (CJCSI 2800.01C), 2001.

[9] U.J.F. Command, Capstone Requirements Document: Global Information Grid (GIG), JROCM, 2001, pp. 1–99.

[10] M. Kasunic, W. Anderson, Measuring Systems Interoperability: Challenges and Opportunities, Technical Note, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, 2004.

[11] C.E. de Normalisation, E.K. fur Normung, CEN/ISO 11354-1, Part 1: Framework for Enterprise Interoperability, 2011.

[12] D. Chen, Enterprise interoperability framework, in: Proceedings of Enterprise Modelling and Ontologies for Interoperability, EMOI-Interop, 2006.

[13] E.J. Morris, L. Levine, B.C. Meyers, D. Plakosh, System of Systems Interoperability (SOSI): Final Report, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, 2004.

[14] D. Carney, P. Oberndorf, Integration and interoperability models for systems of systems, in: Proceedings of the System and Software Technology Conference, 2004, pp. 1–35.

[15] S. Munk, An analysis of basic interoperability related terms, system of interoperability types, Academic and Applied Research in Military Sciences 1 (2002) 117–131.

[16] S. Heiler, Semantic interoperability, ACM Computing Surveys 27 (2) (1995) 271–273.

[17] L. Levine, B.C. Meyers, E.J. Morris, in: Proceedings of the System of Systems Interoperability Workshop, Software Engineering Institute, Carnegie Mellon University (CMU/SEI-2003-TN-016), 2003, pp. 1–37.

[18] Y. Naudet, T. Latour, W. Guedria, D. Chen, Towards a systemic formalisation of interoperability, Computers in Industry 61 (2) (2010) 176–185.

[19] A. Zutshi, A. Grilo, R. Jardim-Goncalves, The business interoperability quotient measurement model, Computers in Industry 63 (5) (2012) 389–404.

[20] M. Novakouski, G.A. Lewis, Interoperability in the e-Government Context, SEI, Carnegie Mellon University, Pittsburgh, 2012, pp. 1–35.

[21] Y. Charalabidis, S. Pantelopoulos, Enhancing application interoperability and enabling B2B transactions over the Internet for small and medium enterprises: the PRAXIS project, CAiSE Workshops (3), 2004, 257–260.

[22] T. Kinder, Mrs. Miller moves house: the interoperability of local public services in Europe, Journal of European Social Policy 13 (2) (2003) 141–157.

[23] K. Kosanke, ISO standards for interoperability: a comparison, in: Proceedings of INTEROP-ESA'05, Geneva, Switzerland, February 21–25, 2005, Springer-Verlag, (2006), pp. 55–64.

[24] Y. Charalabidis, D. Chen, Achieving enterprise application interoperability: design patterns and directives, Workshop on Ontology and Enterprise Modelling: Ingredients for Interoperability, 2004.


[25] H. Van der Veer, A. Wiles, Achieving Technical Interoperability – the ETSIApproach, ETSI White Paper 3 (2008).

[26] F. Lampathaki, S. Koussouris, C. Agostinho, R. Jardim-Goncalves, Y. Charalabidis, J.Psarras, Infusing scientific foundations into enterprise interoperability, Compu-ters in Industry (2012).

[27] Y. Charalabidis, F. Lampathaki, A. Kavalaki, D. Askounis, A review of electronicgovernment interoperability frameworks: patterns and challenges, InternationalJournal of Electronic Governance 3 (2) (2010) 189–221.

[28] Y. Charalabidis, S. Pantelopoulos, Y. Koussos, Enabling interoperability of trans-actional enterprise applications, in: Workshop on Interoperability of EnterpriseSystems, 18th European Conference on Object-Oriented Programming (ECOOP),Oslo, (2004), pp. 14–18.

[29] G.A. Lewis, L. Wrage, Model Problems in Technologies for Interoperability: WebServices (CMU/SEI-2006-TN-021), 2006, 1–37.

[30] L. Guijarro, Semantic interoperability in eGovernment initiatives, ComputerStandards & Interfaces 31 (1) (2009) 174–180.

[31] J. Hall, S. Koukoulas, Semantic Interoperability for E-Business in the ISP ServiceDomain, ICE-B, 2008, 26–29.

[32] Y. Charalabidis, R.J. Goncalves, K. Popplewell, Towards a scientific foundation forinteroperability, in: Interoperability in Digital Public Services and Administra-tion: Bridging E-Government and E-Business, 2010, 355–373.

[33] Y. Charalabidis, G. Gionis, K.M. Hermann, C. Martinez, Enterprise InteroperabilityResearch Roadmap, European Commission, 2008.

[34] G. Gionis, Y. Charalabidis, T. Janner, C. Schroth, S. Koussouris, D. Askounis,Enabling cross-organizational interoperability: a hybrid e-business architecture,in: Enterprise Interoperability II, Springer, London, 2007, pp. 525–528.

[35] Y. Charalabidis, H. Panetto, E. Loukis, K. Mertins, Interoperability approaches forenterprises and administrations worldwide, The Electronic Journal for E-Com-merce Tools and Applications (eJeta) 2 (3) (2008) 1–10.

[36] M. Vida, L. Stoicu-Tivadar, Measuring medical informatics systems interoperabil-ity using the LISI model, in: IEEE 8th International Symposium on IntelligentSystems and Informatics, IEEE, Subotica, Serbia, (2010), pp. 33–36.

[37] D. Chen, POP* Meta-Model For Enterprise Model Interoperability, InformationControl Problems in Manufacturing, V.A. Trapeznikov Institute of ControlSciences, Russia, 2009, pp. 175–180.

[38] C. Legner, K. Wende, Towards an excellence framework for business interopera-bility, in: Proceedings of 19th Bled eConference eValues, 2006, pp. 5–7.

[39] E. Commission, European interoperability framework for pan-european egovern-ment services, in: IDA Working Document, Version 2, 2004.

[40] F. Lillehagen, H. Solheim, ATHENA A1 deliverables, Deliverable DA1. 5.1: MPCESpecification, Computas, Oslo, 2004.

[41] A. Zutshi, Framework for a Business Interoperability Quotient MeasurementModel, Departamento de Engenharia Mecanica e Industrial, Faculdade de Cien-cias e Tecnologia da Universidade Nova de Lisboa, Portugal, 2010.

[42] J. Staff, Joint Publication 1-02, DOD Dictionary of Military and Associated Terms,US Govt Printing Office, Washington, DC, 2008, pp. 1–495.

[43] J. Staff, Joint Publication 1-02, Department of Defense Dictionary of Military andAssociated Terms, Washington, DC, 2004.

[44] D.D. Directive, Standardization and Interoperability of Weapons Systems andEquipment within the North Atlantic Treaty Organization, DoD, Washington,1980.

[45] E. Morris, L. Levine, C. Meyers, P. Place, D. Plakosh, System of Systems Interopera-bility (SOSI): Final Report, Citeseer (ESC-TR-2004-004), 2004.

[46] K. Stewart, Non-Technical Interoperability: The Challenge of Command Leader-ship in Multinational Operations, QinetiQ Ltd./Centre for Human Sciences, Farn-borough, United Kingdom, 2004 (ADA465749).

[47] E.P. Presson, Software Metrics and Interoperability, in: Proceedings of AIAAComputers in Aerospace IV Conference, Hartford, 1983.

[48] D. Defense, Realizing the Potential of C4I, National Academy Press, Washington,D.C., 1999.

[49] S. Koussouris, F. Lampathaki, S. Mouzakitis, Y. Charalabidis, J. Psarras, Digging intothe real-life enterprise interoperability areas definition and overview of the mainresearch areas, in: Proceedings of CENT, 2011, pp. 19–22.

[50] T. Clark, T. Moon, Interoperability for joint and coalition operations, AustralianDefence Force Journal (2001).

[51] G.E. Lavean, Interoperability in defense communications, IEEE Transactions onCommunications 28 (9) (1980) 1445–1455.

[52] M.J. DiMario, System of Systems Interoperability Types and Characteristics InJoint Command and Control, Software Engineering Institute, Carnegie MellonUniversity, 2006, pp. 1–6.

[53] D. Mensh, R. Kite, P. Darby, A methodology for quantifying interoperability, NavalEngineers Journal (1998).

[54] M. Amanowicz, P. Gajewski, Military communications and information systemsinteroperability 1 (1996) 280–283.

[55] DOD, Levels of Information Systems Interoperability (LISI), Department ofDefense, United States, 1998, pp. 1–139.

[56] D. Chen, G. Doumeingts, F. Vernadat, Architectures for enterprise integration andinteroperability: Past, present and future, Computers in Industry 59 (7) (2008)647–659.

[57] M. Kasunic, Measuring Systems Interoperability: Challenges and Opportunities,DTIC Document (074-0188), 2001, 1–39.

[58] M. Kasunic, Measuring Systems Interoperability, Software Engineering Institute,Carnegie Mellon University, Pittsburgh, 2003, pp. 1–18.

[59] K. Stewart, H. Clarke, P. Goillau, N. Verrall, M. Widdowson, Non-TechnicalInteroperability in Multinational Forces, QinetiQ Ltd./Centre for Human Sciences,Farnborough, United Kingdom, 2004.

[60] G. Kingston, An Organisational Interoperability Agility Model, Defence Scienceand Technology Organisation Canberra, Australia, 2005, pp. 1–31.

[61] T. Cl ark, R. Jones, Organisational interoperability maturity model for C2, in,Proceedings of the Conference Namej, Conference Locationj, Conference Datej,Publisherj, Year of Conferencej, pp. Pagesj.

[62] U. Schade, Towards the edge and beyond: the role of interoperability, in: 10thInternational Command and Control Research and Technology Symposium,2005.

[63] Review, Review of Operational Level Interoperability Between the MilitaryForces of Australia and the United States of America, United States andAustralia, 2004.

[64] B.C. Meyers, J.D. Smith, Programmatic Interoperability, Software EngineeringInstitute, Carnegie Mellon University, Pittsburgh, 2007.

[65] L. Brownsword, D.J. Carney, D. Fisher, G. Lewis, C. Meyers, Current Perspectives onInteroperability, Software Engineering Institute, Carnegie Mellon University,Pittsburgh, 2004.

[66] S. Fewell, T. Clark, Organisational interoperability: evaluation and further devel-opment of the OIM model, Defence Science and Technology Organisation,Edinburgh, Australia, 2003.

[67] M.J. Leite, Interoperability assessment, in: Proceedings of the 66th MilitaryOperations Research Society Symposium, Monterey, CA, 1998.

[68] B.C. Meyers, I. Monarch, L. Levine, J.I.I. Smith, Including Interoperability in theAcquisition Process, Software Engineering Institute, Carnegie Mellon University,Pittsburgh, 2005, pp. 1–74.

[69] J.A. Hamilton Jr., J.D. Rosen, P.A. Summers, An Interoperability Road Map for C4ISRLegacy Systems, DTIC Document, 2002.

[70] ATHENA, D.A1.4.1: Framework for the Establishment and Management Method-ology, Version 1.0, ATHENA IP, Interoperability Research for Networked Enter-prises Applications and Software, 2005.

[71] T. Ford, J. Colombi, S. Graham, D. Jacques, The interoperability score, in: Proceed-ings of the Fifth Annual Conference on Systems Engineering Research, 2007, pp.1–10.

[72] T. Ford, J. Colombi, S. Graham, D. Jacques, Measuring System Interoperability (Ani-Score Improvement), Air Force Institute of Technology, CSER (2008).

[73] D. Sarantis, Y. Charalabidis, J. Psarras, Towards standardising interoperabilitylevels for information systems of public administrations, The Electronic Journalfor E-commerce Tools & Applications (eJETA) Special Issue on Interoperability forEnterprises and Administrations Worldwide 2 (2008).

[74] J. Mykkanen, M. Tuomainen, An evaluation and selection framework for inter-operability standards, Information and Software Technology 50 (3) (2008) 176–197.

[75] F. Lampathaki, S. Mouzakitis, G. Gionis, Y. Charalabidis, D. Askounis, Business tobusiness interoperability: a current review of XML data integration standards,Computer Standards & Interfaces 31 (6) (2009) 1045–1055.

[76] G. Gionis, Y. Charalabidis, K. Sourouni, D. Askounis, Enabling cross-borderinteroperability: modelling legal rules for electronic transactions in the Euro-pean Union, in: Enterprise Interoperability II, Springer, London, 2007 , pp. 361–364.

[77] A. Vallecillo, J. Troya, J. Hernandez, Object interoperability, in: A. Moreira (Ed.),Object-Oriented Technology ECOOP’99 Workshop Reader, Springer, Berlin Hei-delberg, 1999, pp. 1–21.

[78] N. Gershenfeld, R. Krikorian, D. Cohen, The Internet of things, Scientific American291 (4) (2004) 76–81.

[79] E. Commission, Early Challenges regarding the ‘‘Internet of Things‘‘, Accompa-nying document to the Communication on future networks and the internet, 6,European Commission, Brussels, 2008 (Retrieved 6.10.2008).

[80] E. Lusk, N. Desai, R. Bradshaw, A. Lusk, R. Butler, An interoperability approach tosystem software, tools, and libraries for clusters, International Journal of HighPerformance Computing Applications 20 (3) (2006) 401–408.

[81] C. Britton, P. Bye, IT Architectures and Middleware: Strategies for Building Large, Integrated Systems, Pearson Education, 2004.

[82] T. Al-Naeem, F.A. Rabhi, B. Benatallah, P.K. Ray, Systematic approaches for designing B2B applications, International Journal of Electronic Commerce 9 (2) (2005) 41–70.

[83] M. Papazoglou, Web Services: Principles and Technology, Pearson Education, 2008.

[84] Y. Charalabidis, R.J. Goncalves, K. Popplewell, Developing a science base for enterprise interoperability, in: Enterprise Interoperability IV, Springer, London, 2010, pp. 245–254.

[85] F. Abel, N. Henze, D. Krause, Social semantic web at work: annotating and grouping social media content, in: J. Cordeiro, S. Hammoudi, J. Filipe (Eds.), Web Information Systems and Technologies, Springer, Berlin Heidelberg, 2009, pp. 199–213.

[86] J. Palfrey, U. Gasser, Digital Identity Interoperability and eInnovation, Berkman Center Research Publication, Harvard University, 2007.

[87] T. Dillon, C. Wu, E. Chang, Cloud computing: issues and challenges, in: 24th IEEE International Conference on Advanced Information Networking and Applications (AINA), IEEE, Perth, WA, 2010, pp. 27–33.

[88] M. Janssen, Y. Charalabidis, H. Krcmar, Introduction to cloud infrastructures and interoperability minitrack, in: 46th Hawaii International Conference on System Sciences (HICSS), IEEE, Wailea, HI, USA, 2013, p. 1641.

[89] Y. Charalabidis, M. Janssen, O. Glassey, Introduction to cloud infrastructures and interoperability minitrack, in: Hawaii International Conference on System Sciences, 2012, p. 2177.

[90] T.C. Ford, J.M. Colombi, S.R. Graham, D.R. Jacques, Survey on Interoperability Measurement, DTIC Document, 2007.


Reza Rezaei is a PhD student in the Department of Software Engineering, Faculty of Computer Science and Information Technology, University of Malaya. His research interests are interoperability, cloud computing, and service-oriented architecture. He is a lecturer at the Department of Computer Engineering, Faculty of Technical and Engineering Sciences, Islamic Azad University, Saveh Branch, Iran.

Thiam Kian Chiew obtained both his bachelor's and master's degrees in computer science from the University of Malaya in 1998 and 2000, respectively. He received his PhD degree in computing science from the University of Glasgow in 2009. He is now a senior lecturer at the Faculty of Computer Science and Information Technology, University of Malaya, Malaysia. His research interests include web engineering, software architecture, and human-computer interaction.

Sai Peck Lee is a professor at the Faculty of Computer Science and Information Technology, University of Malaya. She obtained her Master of Computer Science from the University of Malaya, her Diplôme d'Études Approfondies (D.E.A.) in Computer Science from Université Pierre et Marie Curie (Paris VI), and her PhD degree in Computer Science from Université Panthéon-Sorbonne (Paris I). Her current research interests in Software Engineering include Object-Oriented Techniques and CASE Tools, Software Reuse, Requirements Engineering, Application and Persistence Frameworks, Information Systems and Database Engineering. She has published an academic book and a few book chapters, as well as more than 100 papers in various local and international conferences and journals. She has been an active member in the reviewer committees and program committees of several local and international conferences. She is currently in several expert referee panels, both locally and internationally.

Zeinab Shams Aliee received her Master's degree in Software Engineering from the Faculty of Computer Science and Information Technology, University of Malaya, in 2012. She obtained her Bachelor's degree in Software Engineering from the Azad University of Tehran, North Branch, Iran.