

    Master’s Thesis

    Electrical Engineering

    June 2012

    Performance, Maintainability and

    Implementation Cost for Different Software

    Platforms in a Network Management System

    Muhammad Nadeem

    Mohammed Azharuddin

    School of Computing

    Blekinge Institute of Technology

    SE – 371 79 Karlskrona

    Sweden


    Contact Information:

    Authors:

    Muhammad Nadeem

    E-mail: [email protected]

    Mohammed Azharuddin

    E-mail: [email protected]

    External advisor:

    Anders Grahn Email: [email protected]

    Company/Organization: Smart Grid Networks AB

    Address: Rombvägen 4 SE-371 65 Lyckeby Sweden

    University advisor:

    Professor Lars Lundberg Email: [email protected]

    School of Computing

    Blekinge Institute of Technology

    School of Computing

    Blekinge Institute of Technology

    SE – 371 79 Karlskrona

    Sweden

    Internet : www.bth.se/com

    Phone : +46 455 38 50 00

    Fax : +46 455 38 50 57

    This thesis is submitted to the School of Computing at Blekinge Institute of Technology in

    partial fulfillment of the requirements for the degree of Master of Science in Electrical

    Engineering. The thesis is equivalent to 20 weeks of full time studies.

    University Examiner:

    Professor Patrik Arlos Email: [email protected]

    School of Computing

    Blekinge Institute of Technology



    ABSTRACT

Context: Software architecture is an emerging and increasingly popular field in software engineering, and it has become an essential part of the development of software systems. Prototyping is possibly one of the most commonly used learning paradigms in software architecture. It is therefore reasonable to express some of the requirements as specific quality attributes when developing and comparatively analyzing prototypes. In this thesis we deal with a software architecture realized as different prototypes, where the different platforms share a common structure within the software architecture. Analyzing the prototypes against the required quality attributes also offers good potential for performance improvement.

Objectives: In this study, we investigate quality attributes, namely the performance, maintainability and implementation cost, of different software platforms. The main focus is on the integration of prototypes in a software architecture. We specifically investigate the maintainability challenges organizations face when prototyping a network management system on different software platforms.

Methods: In this study, both theoretical and empirical research methods have been applied. To accomplish the goal of this thesis, a literature review was performed by studying articles from several sources, complemented by the snowball sampling method to decrease the chance of missing any relevant article. During the literature review we analyzed the structure and workflow of prototypes and then incorporated the quality attributes through theoretical analysis. In the experimental part, three prototypes were built on different software platforms: PHP, JSP and Perl. Each of these prototypes was evaluated with respect to maintainability, using a survey of twenty-five industrial experts, implementation cost, measured in hours, and performance, measured as response time.

Results: We identified different challenges in software architecture and in building software prototypes on different software platforms, and we analyzed the performance, maintainability and implementation cost of those platforms. A survey was conducted to identify challenges and practices in the maintainability of the prototypes. We show that it is possible to achieve better quality attributes for a given system.

Conclusions: There is a trade-off: the best implementation alternative depends on how important the different quality attributes are in a given situation.

Keywords: Implementation Cost, Maintainability, Network Management System, Performance, Software Architecture, Software Platform.


    ACKNOWLEDGEMENT

    First and foremost, all praise and thanks be to Allah, the Lord of all the Worlds, the most

    Gracious and most Merciful, who enabled us to complete this thesis successfully.

We are heartily thankful to our academic supervisor Dr. Lars Lundberg for giving us the chance to work under his supervision and for providing us with his esteemed guidance, encouragement and suggestions throughout this thesis. We honor his vast knowledge and skill in many areas, which helped us improve our research.

We are also thankful for the technical guidance, support and encouragement given by Anders Grahn (Chief Technical Officer) and Peter Engström (Design Engineer) from Smart Grid Networks AB, Sweden. We would also like to thank all the industry and academia survey participants who contributed to the survey part of this research.

    Next, we express our gratitude to our beloved friends for helping us by suggesting

    improvements in the report and providing ideas for the analysis. Surely, we owe our deepest

    gratitude to our families for their prayers, moral and unconditional support throughout our

    study.

    Finally, we wish to express our sincere appreciation to all of those who pray for us and give

    us moral support and encouragement.

LIST OF FIGURES

Figure 1 Structural Diagram of Literature Review .......... 15
Figure 2 Structural Design of Experimental Process .......... 17
Figure 3 Structure of Software Architecture .......... 18
Figure 4 Basic Prototype Development Process .......... 19
Figure 5 Maintainability Assessment of Software Prototype .......... 20
Figure 6 Implementation Cost for Prototypes .......... 21
Figure 7 Measuring Response Time for Prototypes .......... 22
Figure 8 Response Time of Prototype 1 .......... 27
Figure 9 Implementation Cost of Prototype 1 .......... 28
Figure 10 Maintainability of Prototype 1 .......... 29
Figure 11 Response Time of Prototype 2 .......... 29
Figure 12 Implementation Cost of Prototype 2 .......... 30
Figure 13 Maintainability of Prototype 2 .......... 31
Figure 14 Response Time of Prototype 3 .......... 32
Figure 15 Implementation Cost of Prototype 3 .......... 32
Figure 16 Maintainability of Prototype 3 .......... 33
Figure 17 Comparison of Maintainability .......... 34
Figure 18 Comparison of Implementation Cost .......... 34
Figure 19 Comparison of Response Time .......... 35
Figure 20 Prototype 1 of Software Architecture .......... 50
Figure 21 Prototype 2 of Software Architecture .......... 52
Figure 22 Prototype 3 of Software Architecture .......... 54
Figure 23 Maintainability Efforts Validation .......... 69
Figure 24 Web Server Survey by Netcraft .......... 71

LIST OF TABLES

Table 1 Search String for Literature Review .......... 16
Table 2 Description of Each Scenario .......... 21
Table 3 Databases Description .......... 23
Table 4 Search String Results .......... 23
Table 5 Refinement of Article from Databases .......... 24
Table 6 Snowball Sampling Selected Articles .......... 24
Table 7 List of Selected Articles .......... 24
Table 8 Response Time of Prototype 1 .......... 27
Table 9 Implementation Cost of Prototype 1 .......... 28
Table 10 Maintainability of Prototype 1 .......... 28
Table 11 Response Time of Prototype 2 .......... 29
Table 12 Implementation Cost of Prototype 2 .......... 30
Table 13 Maintainability of Prototype 2 .......... 30
Table 14 Response Time of Prototype 3 .......... 31
Table 15 Implementation Cost of Prototype 3 .......... 32
Table 16 Maintainability of Prototype 3 .......... 33
Table 17 Maintainability of all Three Prototypes .......... 33
Table 18 Implementation Cost of all Three Prototypes .......... 34
Table 19 Response Time of all Three Prototypes .......... 35
Table 20 Overall Comparison of Quality Attributes .......... 35
Table 21 List of Search Strings .......... 49
Table 22 Maintainability Average for Prototype 1 .......... 50
Table 23 Prototype 1 Response Time .......... 51
Table 24 Maintainability Average for Prototype 2 .......... 52
Table 25 Prototype 2 Response Time .......... 53
Table 26 Maintainability Average for Prototype 3 .......... 54
Table 27 Prototype 3 Response Time .......... 55
Table 28 Prototype Efforts Rating .......... 57
Table 29 Probability Ratings of Selected Scenario .......... 57
Table 30 Example Table for Filling Survey Form .......... 57
Table 31 Prototype and Probability Scenario Rating .......... 57
Table 32 List of Industries for Survey as Target Audience .......... 58
Table 33 Organization Using Different Platforms .......... 71
Table 34 Websites Based on Different Platforms .......... 71


    ACRONYMS

    ADD Attribute Driven Design

    ATAM Architectural Trade-off Analysis Method

    ANOVA Analysis of Variance

    AOP Aspect-Oriented Programming

    CBAM Cost Benefit Analysis Method

    CSS Cascading Style Sheets

    CGI Common Gateway Interface

    CORBA Common Object Request Broker Architecture

    CPU Central Processing Unit

    GUI Graphical User Interface

    FDM Functionality-based Design Method

    HTML Hyper Text Markup Language

    HTTP Hyper Text Transfer Protocol

    IEEE Institute of Electrical and Electronics Engineers

JDBC Java Database Connectivity

    JDK Java Development Kit

    JPL Jet Propulsion Laboratory

    JSP Java Server Pages

    J2EE Java 2 Platform, Enterprise Edition

    LAMP Linux, Apache, MySQL and PHP

    LOC Line of Code

MDA Model-Driven Architecture

NASA National Aeronautics and Space Administration

NMAAP New Millennium Autonomy Architecture Prototype

NMRA New Millennium Remote Agent

ODBC Open Database Connectivity

    PERL Practical Extraction and Report Language

    PHP Hypertext Preprocessor

    PRO3D Programming for Future 3D Architecture with Many Cores

    RCS Remote Control System

    SA Software Architecture

    SAAM Scenario-based Architecture Analysis Method

    SPL Software Product Line

SPSS Statistical Package for the Social Sciences

SQL Structured Query Language

    TCP Transmission Control Protocol

    WWW World Wide Web


TABLE OF CONTENTS

ABSTRACT .......................................................... 3

    ACKNOWLEDGEMENT ................................................................................................................... 4

    LIST OF FIGURES .............................................................................................................................. 5

    LIST OF TABLES ................................................................................................................................ 6

    ACRONYMS ......................................................................................................................................... 7

    1 INTRODUCTION ..................................................................................................................... 10

1.1 OVERVIEW .......................................................... 10
1.2 AIMS AND OBJECTIVES .......................................................... 11
1.3 RESEARCH QUESTION .......................................................... 11
1.4 THESIS OUTLINE .......................................................... 11

    2 BACKGROUND ........................................................................................................................ 12

2.1 SOFTWARE ARCHITECTURE AND QUALITY ATTRIBUTES .......................................................... 12
2.2 SOFTWARE PLATFORM AND PROTOTYPE .......................................................... 12
2.3 NETWORK MANAGEMENT SYSTEM .......................................................... 13
2.4 RELATED WORK .......................................................... 13

    3 RESEARCH METHODOLOGY ............................................................................................. 15

3.1 LITERATURE REVIEW .......................................................... 15
3.1.1 Search String and Databases .......................................................... 15
3.1.2 Article Selection Criteria .......................................................... 16
3.1.3 Snowball Sampling Method .......................................................... 16

3.2 EXPERIMENT .......................................................... 17
3.2.1 Software Architecture Design Method for NMS .......................................................... 17
3.2.2 Development of Prototype .......................................................... 18
3.2.3 Maintainability .......................................................... 19
3.2.4 Implementation cost .......................................................... 21
3.2.5 Performance .......................................................... 21

    4 RESULT & ANALYSIS ............................................................................................................ 23

4.1 RESULT FROM LITERATURE REVIEW .......................................................... 23
4.1.1 Search String Results from Databases .......................................................... 23
4.1.2 Conducting the Literature Review .......................................................... 23
4.1.3 Snowball Sampling .......................................................... 24

4.2 PROTOTYPING .......................................................... 27
4.2.1 Prototype 1 .......................................................... 27
4.2.2 Prototype 2 .......................................................... 29
4.2.3 Prototype 3 .......................................................... 31

4.3 COMPARISON AND SUMMING UP .......................................................... 33
4.3.1 Maintainability .......................................................... 33
4.3.2 Implementation Cost .......................................................... 34
4.3.3 Performance .......................................................... 35
4.3.4 Quality Attributes Summing Up .......................................................... 35

    5 DISCUSSION ............................................................................................................................. 37

5.1 VERIFICATION, VALIDATION AND DISCUSSION .......................................................... 37
5.2 VALIDITY THREATS .......................................................... 39
5.2.1 Internal Validity Threats .......................................................... 39
5.2.2 External Validity Threats .......................................................... 39
5.2.3 Construct Validity Threats .......................................................... 40
5.2.4 Conclusion Validity Threats .......................................................... 40

    6 CONCLUSION .......................................................................................................................... 41


    7 FUTURE WORK ....................................................................................................................... 43

    REFERENCE ..................................................................................................................................... 44

    APPENDIX ......................................................................................................................................... 49

APPENDIX-A: SEARCH STRING .......................................................... 49
APPENDIX-B: PROTOTYPES .......................................................... 50
APPENDIX-C: SURVEY FORM .......................................................... 56
APPENDIX-D: TARGET AUDIENCE .......................................................... 58
APPENDIX-E: SURVEY DATA .......................................................... 59
APPENDIX-F: SURVEY DATA ANALYSIS .......................................................... 68
APPENDIX-G: SERVERS & PLATFORMS .......................................................... 71
APPENDIX-H: SOURCE CODE .......................................................... 73


    1 INTRODUCTION

    1.1 Overview

In many real-world situations, it is important to make accurate predictions based on the available information. Software architecture constrains the achievement of quality attributes such as performance, usability and maintainability of a system [1][2]. It has become an accepted concept in academia as well as in industry. Software architecture is a structured solution that addresses all the technical and operational requirements while taking the quality attributes of the software system into account [3]. It requires a series of decisions based on a wide range of factors, and each of these decisions can have a considerable impact on performance, maintainability and the overall success of the application [4].

Why do we need software architecture and architecture design methods for different software platforms? The inspiration behind this research is that global technologies depend entirely on software. Software needs to be developed on different software platforms (e.g. PHP, JSP and Perl), and then maintained and updated, in order to keep pace with changes in technology [5].

Every piece of software depends on its architectural design and software platform, so there is a need for more in-depth experiments investigating the design of prototypes on different software platforms. Industry wants to go further with other software platforms using the modern technology available today. It is in search of the best solution: one that is as quick as possible to implement, easy to maintain, easy to extend, and provides the functionality needed in any NMS software. One part of this work is to investigate the programming platforms available today and determine their quality attributes. It is therefore very important to build the right prototype, based on the requirements, before developing the software.

J. Bosch et al. [1] described an approach to designing and evaluating software architecture using logical and dynamic views along with different styles and scenarios. P. O. Bengtsson et al. [6] proposed a technique for analyzing the maintainability of a software architecture based on specific scenarios for a particular system. This technique has also been illustrated and evaluated using industrial cases [7].

Nowadays, software architecture design is the foundation of any software system, and these systems depend entirely upon their design method [1][5]. In both industry and academia, software architectures use several generally recognized components, which has led to better control over the design [4]. A software platform development method that allows a suitable description of the requirements and characteristics of the software is crucial [8]. There is therefore a need for scientific research that identifies the different software platforms and attains all the required quality attributes (performance, maintainability and implementation cost) with minimum utilization of resources.

The software platform is a concrete model of the architecture that allows the software to be executed with detailed hardware-software interaction, affecting performance and maintainability [9]. Software platform evaluation has a great impact on the quality attributes: the evaluation is carried out several times until all the requirements and quality attributes are satisfied [10]. The evaluation process iterates over the platform design until it fulfils all the required quality attributes [11]. Change scenarios are an important part of practicing architecture, as they provide more information about the maintainability of the software system [12][13][14].


Network management is a field under constant development and refinement. Although network management technologies improve day by day, they still fail to meet the demands created by the rapid development of network management systems. Network management basically involves monitoring the devices connected in a network by gathering and analyzing data from those devices [15]. The purpose of any network management system is to monitor and control the behaviour of a network in order to achieve better operation of the network system [16].

    1.2 Aims and Objectives

The aim of this research is to investigate the maintainability, performance and implementation cost of different software platforms for a network management system. The objectives are to:

- Build three prototypes on different software platforms and assess them with respect to performance, maintainability and implementation cost.
- Investigate the implementation cost of a network management system.
- Investigate the performance of the prototypes built on different software platforms.
- Investigate the maintainability of the prototypes built on different software platforms.
- Investigate the significance (trade-offs and conflicts) of performance, maintainability and implementation cost for the different software platforms.
- Validate all three prototypes with industry (Smart Grid Networks AB).

    1.3 Research Question

    RQ1. What is the maintainability for different software platforms in a network management

    system?

    RQ2. What is the implementation cost for different software platforms in a network

    management system?

    RQ3. What is the performance for different software platforms in a network management

    system?

RQ4. What are the trade-offs and conflicts between performance, maintainability and implementation cost for different software platforms in a network management system?

    1.4 Thesis Outline

    The thesis report is organized as follows:

    Chapter 1 Introduction, motivation, research question, aims and objectives

    Chapter 2 Background and related work of software architecture, software platforms,

    performance, maintainability, implementation cost and network management

    system

Chapter 3 Research methodology, covering the strategies we used for the literature review and the experiment

    Chapter 4 Results from literature review and experiment

    Chapter 5 Discusses results and findings

    Chapter 6 Conclusion

    Chapter 7 Future work


    2 BACKGROUND

    2.1 Software Architecture and Quality Attributes

The term software architecture was first used in a scientific article in early 1981, when Erik Sandewall [17] described a concept for decomposing a system into modules. At the beginning of the 1990s, software architecture became widely used in the software engineering community and in industry. Today it is a widely accepted concept, reflected in the new roles appearing in software development organizations [18].

J. Bosch et al. [1] described a development procedure to design and assess the software architecture of a haemodialysis system using logical and dynamic views along with various styles and scenarios. The main focus of this architecture was the maintainability of the software architecture [19]. Two design approaches are used to build software architecture: top-down and bottom-up. The top-down approach is preferable to the bottom-up approach during architecture design and architecture reengineering [10][11][4]. A software architecture built from scratch has little chance of being reused. The product line architecture approach to software development addresses this by enabling the systematic development of components for the reuse of software architecture [20].

J. Bosch et al. [10] evaluated software quality attributes, specifically maintainability. For maintainability, the change scenario method is the most useful compared to alternatives such as simulation, mathematical modeling and experience-based assessment; the latter methods were used for high performance and fault tolerance in software architecture [13][6]. To remove deficiencies and improve the quality attributes of a software architecture, there are five techniques for transforming the architecture: impose an architectural style, apply an architectural pattern, use a design pattern, convert quality requirements to functionality, and distribute quality requirements [22][18][23][24]. After every transformation of the software architecture, a new version is obtained with the same functionality but different quality attributes [10][25]. The traditional object-oriented design method was used for designing the software architecture [5].

Software developers face many challenges in meeting customer needs and expectations within a specific time and cost [13]. Z. Li et al. [26] discussed a method for estimating the implementation cost of software architecture systems using a divide-and-conquer approach. This implementation cost and effort estimation can simplify and regulate the complicated development of software-based applications.

    2.2 Software Platform and Prototype

The way software development groups use the word "platform" has transformed dramatically [27]. Software developers commonly came to think of platforms as the particular operating systems on which their products run. For organizations that make software, however, taking Microsoft's operating systems as their own platform is not enough for a company to develop robust platform architectures efficiently and effectively [9][28]. Such an externally focused platform definition is not enough for a software company to fully manage its own architecture; management needs a comprehensive internal definition of its own software product platform [27][29][30].


The idea of prototyping in software development emerged in the 1970s [31][32] as a reaction against the traditional phase-oriented models (e.g. the life-cycle model, the waterfall model) of how to develop software systems [33]. J. C. Hallet et al. [34] described prototyping in 1975, presenting a method to overcome the well-documented problems of software quality and maintainability during prototype development. J. S. Dong et al. [8] developed different software platforms by reusing software architectures and source code, and proposed a technique for reusing software architecture and managing platform quality through defect removal and continuous maintenance.

H. Algestam et al. [35] developed a component-based prototype and compared the new architecture with the existing architecture to determine the performance and maintainability. The experiment showed that the maintainability cost was reduced by around 20%, while the performance of the component-based prototype was lower when using one CPU. When using a multiprocessor, the component-based prototype was very efficient at increasing maintainability while conserving the performance of the system [36]. M. Christensen et al. [37] developed the Dragon project as a series of evolutionary prototypes of a customer service system. The project progressed through two main stages, in which an iterative and incremental development strategy was used [38].

    2.3 Network Management System

Z. Yongjun et al. [39] described the fast development of network technology, which meets the network requirements of every organization and provides reliable services from network devices while managing the growing complexity of the network. A management information system can use a request/response mode with a web-based architecture to ensure the reliable and stable operation of a network management system [15]. In any system, quality attributes such as performance and maintainability are among the most versatile measures in software architecture [40]; for example, performance and maintainability typically require explicit attention during development to reach the required levels. Traditionally, software engineers have based their designs on assumed relations between design solutions and quality requirements [41]. CORBA is another technology used to build intelligent network management systems [42].

    2.4 Related Work

J. Bosch [20] presented the development of a software product line to reduce the cost of developing specific software components. This approach, using the functionality-based architecture design method, was applied to games (2D real-time strategy, 3D first-person shooter and online play) and to web-based applications (e-commerce pages, corporate information pages and database applications).

M. H. Meyer et al. [27] described the software platform as a collection of subsystems and interfaces forming a common structure for the development of software products. C. Bădică et al. [43] used the Java and C/C++ platforms to implement an agent-based system. J. S. Dong et al. [8] described the maintainability of software components and the side effects on components caused by changes to other software components. When a particular software component is changed, unplanned errors can occur in that component, and the developer has to spend a lot of effort on maintainability.

R. Kazman et al. [14] described experimental scenarios for obtaining an understanding of real-world systems in various domains. SAAM (Software Architecture Analysis Method) [11] was used to evaluate different scenarios for analyzing an architecture and achieving maintainability. To evaluate a software architecture, SAAM requires the key stakeholders to agree on a relative rating of the various capabilities of the system. In the SAAM evaluation process, each session is driven by customer requirements expressed in the form of scenarios. This method has also been used to evaluate a global information system, an air traffic control system and a remote control system [14].

Software architecture evaluation has a great impact on the quality attributes. The evaluation of a design transformation can be carried out several times until all the requirements and quality attributes are attained [10]. Change scenarios are an important part of practicing architecture, as they provide more information about the software system with respect to a set of defined quality attributes, which are evaluated using SAAM [11][13][14][19][40]. Quality requirements such as performance and maintainability are generally not specified clearly in industrial requirement specifications. In some industrial projects [44], the initial requirement specification contained statements such as "The maintainability of the system should be as good as possible" and "The performance should be satisfactory for an average user". Such subjective statements are well intended, but of limited use for the evaluation of software [6].

L. Dobrica et al. [45] described a set of software architecture evaluation methods and compared the performance of two or more competing software platforms [35][46][21]. Gulimanescu et al. [47] compared the structure, hardware functionality and software of a network management system based on the ATMEGA256 controller. The development of a software system consumes only about 20% to 40% of the whole project cost, while the remaining 60% to 80% is consumed by maintenance [15][48]. A system with poor maintainability is difficult to alter and update, so the maintainability cost of any project is a key parameter in developing a robust software system [49]. J. Bosch et al. [6] proposed a technique for analyzing the maintainability of software based on different scenarios.

H. Grahn et al. [24] presented an evaluation of the performance characteristics and other quality attributes of three architectural styles. S. Balsamo et al. [50] proposed performance models to characterize the quantitative behavior of a system's software architecture. Performance results such as response time under a given system workload can be interpreted directly at the software design level [51][52]. M. Marzolla et al. [53] addressed the performance evaluation and prediction of software at the software architecture level.

N. Huang et al. [54] described a method for estimating the development cost and size of a software system in order to plan development during the implementation phase of the software life cycle. M. E. Helander et al. [55] described modelling work on cost distribution in software within quantified system reliability.

J. Matton et al. [21] described the implementation of a service data point (SDP) prototype with an alternative architecture that reduces the code size in order to optimize the performance and availability of the prototype as much as possible [56]. D. Dvorak et al. [57] added a new dimension to the requirements for software reliability and described a case study between NASA and JPL (Jet Propulsion Laboratory) on the NMAAP (New Millennium Autonomy Architecture Prototype) project for improving flight software architecture in prototype form. Xiaosong Wang et al. [15] proposed an extensive system covering the demands of network management, including the system architecture, a prototype, and the deployment method. X. Lu [58] built a heterogeneous telecommunication network management system based on the J2EE architecture.


    3 RESEARCH METHODOLOGY

Research is defined by the HEFCE (Higher Education Funding Council for England) as "original investigation undertaken in order to gain knowledge and understanding" [59]. In this thesis, two methods are used: literature review and experiment.

    3.1 Literature review

According to Creswell [60], a literature review is defined as "a written summary of journal articles, books and other papers that describes the past and current state of information, organizes the literature into topics and documents a need for a proposed study".

The first phase of the research study is the literature review, which reviews previous work to establish the current state of knowledge and to prepare a foundation for the new research to build on. The literature review involves sorting, managing and digesting the already available research articles [59]. We conducted the literature review on software architecture, software platforms, performance, maintainability and implementation cost [59]. The literature review process is shown in Figure 1.

[Figure: flow from the research questions through keyword definition and search strings, searches in the IEEE Xplore, Inspec and Scopus databases, application of the inclusion/exclusion selection criteria, refinement of papers by reading title/abstract, and snowball sampling to update the list of articles, ending in the final outcome of the literature review.]

Figure 1 Structural Diagram of Literature Review

3.1.1 Search String and Databases

The sources for the literature were IEEE Xplore, Engineering Village (Inspec and Compendex), Scopus and the university library at BTH. The search strategy was formulated as search strings built from keywords.


    3.1.1.1 Defining Keywords

The following keywords were defined for searching the different databases for related articles and for deeper analysis of new knowledge:

"Implementation Cost, Maintainability, Network Management System, Performance, Software Architecture, Software Platform"

The search strings were formulated from these keywords and are shown below.

Table 1 Search String for Literature Review

((( (($Software $Platform) WN KY) AND (1969-2012 WN YR)) AND ( (($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ( (($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ( (($Performance) WN KY) AND (1969-2012 WN YR))))

( (($Software $Architecture) WN KY) AND (1969-2012 WN YR)) AND ( (($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ( (($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ( (($Performance) WN KY) AND (1969-2012 WN YR))

3.1.2 Article Selection Criteria

Kitchenham [61] described selection criteria for inclusion and exclusion. The articles were filtered on the basis of the following inclusion and exclusion criteria.

    3.1.2.1 Inclusion criteria

- Studies covering software architecture and software platforms
- Articles from the various databases on software architecture, software platforms, maintainability, performance, implementation cost and network management systems
- Studies covering software prototypes
- Studies focusing on different software platforms and evaluation methods
- Articles written in English

    3.1.2.2 Exclusion criteria

- Articles not related to software architecture and software platforms
- Duplicate articles
- Articles that are not peer reviewed
- Articles that are not published

3.1.3 Snowball Sampling Method

Snowball sampling is defined by K. D. Bailey [62] as "a non-probabilistic form of sampling in which persons initially chosen for the sample are used as informants to locate other persons having necessary characteristics making them eligible for sample". In this thesis, the snowball sampling method is used to follow the references among the articles found in the literature review.

The basic techniques for the literature review are searching for articles with keywords and also applying the snowball sampling method [63] to decrease the chance of missing any relevant article. Snowball sampling is a chain study in which articles are selected from the references of other articles [59]. D. Waldorf et al. referred to snowball sampling as "chain referral sampling" [64]. We used the snowball sampling method in our research to obtain good coverage and to avoid missing any important research articles related to our thesis.


    3.2 Experiment

In the second phase of the research study, experimental research is used to answer and support our research questions. Dawson [59] defined empirical research as "a research based on observed and measured phenomena". It reports research based on actual observations or experiments, as usually performed in development, evaluation and problem-solving projects [60].

The implementation of the three prototypes on different software platforms is a challenging task, grounded in the supporting literature review. Three prototypes were built by deploying different software platforms: PHP [65], JSP [66] and Perl [67]. The whole experiment was conducted as a step-by-step process, as shown in Figure 2. First, the software architecture was designed to check the possibilities for building the three prototypes. Each of these prototypes was then evaluated with respect to maintainability, implementation cost and performance (response time), as well as the trade-offs and conflicts between them.

[Figure: starting from the literature review, industry requirements, change scenarios and the design and evaluation methods, the process proceeds through software architecture design, prototype development, maintainability evaluation, implementation cost calculation and performance (response time) measurement, described in Sections 3.2.1 to 3.2.5, to the final research outcome (results).]

Figure 2 Structural Design of Experimental Process

    3.2.1 Software Architecture Design Method for NMS

3.2.1.1 Software Architecture Design

In general, software architecture can be defined as "a decomposition and structure of software components and how these system parts are interconnected to each other" [68][23][69].

According to J. Bosch [13], architecture design is the process of transforming a set of requirements into a software architecture that satisfies those requirements. The software design method is the activity of identifying the sub-systems of the software system. The functionality-based architecture design method is used to design the software architecture [6]. This design method has been shown to have a great impact on the assessment of different quality attributes.

    3.2.1.2 Software Architecture Description

The basic architecture of the prototypes, designed with the functionality-based architecture design method, is shown in Figure 3 [13].

    Figure 3 Structure of Software Architecture

The major components in Figure 3 are the database, web server, platforms, data acquisition, web browser and graphical user interface. We used the Apache web server, the most popular open source cross-platform web server, which runs all the modules bundled in XAMPP (Apache HTTP Server, Tomcat, PHP, Perl and MySQL). All three prototypes are browser independent (working in Chrome, Firefox, Safari and Internet Explorer) [20]. The major functionalities were selected for the development of the prototypes: for this research, the industrial partner decided which major functionalities to develop, and these are identical in all three prototypes. The selected functionalities in each prototype are assessed against the quality attributes performance, maintainability and implementation cost. Different operating systems, such as Windows and Linux, are considered so that the application is platform independent [27].

3.2.2 Development of Prototype

In software development, a prototype is a basic model/structure of the software system [70]. A prototype is used to narrow down and refine the user's requirements for the system [59].

The structural design of a software prototype involves the processes shown in Figure 4. All three prototypes have the same functionality, but the quality attributes, performance, maintainability and implementation cost, differ between the prototypes because of their different software platforms. The prototypes are web browser applications that let the user analyze the behavior of the network and any faults in it.


[Figure: iterative prototype development process: initially identify the requirements, then requirement prioritization and selection, design, implementation, integration and testing.]

Figure 4 Basic Prototype Development Process

Three prototypes were developed on different software platforms. The first prototype was developed using PHP (Hypertext Preprocessor) [65], which is especially suited for web development; the Apache web server (XAMPP) [71], which is used by more organizations than any other web server according to the survey conducted by Netcraft [72], shown in Figure 24 in Appendix G; and MySQL, the most popular open source database thanks to its performance and reliability [73]. The second prototype was developed using JSP (Java Server Pages) [66], which provides a fast and simple way to create dynamic web content; the Tomcat web server [74], an open source implementation of the Java Server Pages technology; and MySQL [73]. The third prototype was based on Perl [67], a feature-rich programming language, together with MySQL and the Apache web server. The computer used for all three prototypes had an Intel Core i3 processor at 2.4 GHz and 2 GB of RAM.
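To make the setup concrete, the following is a minimal sketch, not the thesis source code from Appendix H, of the kind of database-backed page each prototype implements, shown here for the PHP variant. The connection parameters and the devices table are hypothetical placeholders.

    <?php
    // Minimal sketch with hypothetical names: list device status from MySQL.
    $db = new mysqli('localhost', 'nms_user', 'secret', 'nms');
    if ($db->connect_error) {
        die('Database connection failed: ' . $db->connect_error);
    }
    $result = $db->query('SELECT name, status FROM devices');
    while ($row = $result->fetch_assoc()) {
        echo htmlspecialchars($row['name']) . ': '
           . htmlspecialchars($row['status']) . "<br>\n";
    }
    $db->close();

The JSP and Perl prototypes provide the same functionality, typically through JDBC and Perl's DBI module, respectively.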

3.2.3 Maintainability

Maintainability of software is defined by the IEEE [75][69] as "the ease with which a software system or component can be modified to correct faults, improve performance or other attributes, or adapt to a changed environment". The definition translates into three main classes of maintenance: corrective, perfective and adaptive. The prediction method concentrates only on perfective and adaptive maintenance; it does not predict the effort required for corrective maintenance [76][75][13].

The scenario-based analysis method (change scenarios) is used to evaluate the maintainability of the software prototypes. The evaluation is performed using SAAM, as shown in Figure 5, and to analyze the maintainability of the prototypes under the different scenarios, a survey of twenty-five industrial experts was conducted [19].

[Figure: from the software prototype design and the list of scenarios, through evaluation using change scenarios and assessment of maintainability, to the maintainability result.]

Figure 5 Maintainability Assessment of Software Prototype

Survey-based research is used to characterize the knowledge, attitudes and behaviour of a large group of people and to support effective research results. For the maintainability evaluation, the survey results were collected from twenty-five industrial experts in a highly efficient way [77][78]. Surveys are extensively used for solving specific problems and are an approach for collecting information about the problem under study, described in [79][80] as "a data-gathering and analysis approach in which respondents answer questions that are developed in advance".

The maintainability survey is based on the following steps [79][81]:

- Description of the prototypes and scenarios
- Identification of the target audience, from both industry and academia
- Design of the survey form
- Approval of the survey by the advisor
- Statistical analysis of the results

In every assessment, the change scenario method shown in Figure 5 is applied. The software prototype is described for the assessment of each scenario, and the probability of each change scenario is estimated; these values are used to compute the maintainability effort using the formula given in Equation 1 [6][82].

$$\text{Maintainability Effort}(P_n) = \sum_{i=1}^{K_s} \text{Probability}(i) \times \text{Effort}(P_n, i)$$

Equation 1 Maintainability Effort Formula

Where

Pn = Prototype number
Ks = Number of scenarios
i = Scenario number
Probability (i) = Probability of change scenario (i) occurrence
Effort (Pn, i) = Effort to implement change scenario (i) in prototype Pn
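As an illustration of Equation 1, the following sketch computes the probability-weighted effort sum for one prototype. The scenario probabilities and effort figures are hypothetical and are not the survey data used in this thesis.

    <?php
    // Equation 1 with hypothetical figures:
    // predicted effort = sum over i of Probability(i) * Effort(Pn, i)
    $scenarios = [
        // scenario => [probability of occurrence, effort in hours for prototype Pn]
        'S1 change of database'  => [0.2, 12.0],
        'S3 enhance GUI feature' => [0.5,  6.0],
        'S9 upgrade of database' => [0.3,  4.0],
    ];
    $effort = 0.0;
    foreach ($scenarios as $s) {
        $effort += $s[0] * $s[1];
    }
    echo "Predicted maintainability effort: $effort hours\n"; // prints 6.6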

    3.2.3.1 Scenario Description

Every scenario expresses the semantics of a software quality attribute and may be assigned an associated weight, i.e. the probability of the scenario occurring [6][19]. During the literature review we found many possible change scenarios, but for the development of the prototypes a few fundamental scenarios were selected with the help of the internal and external advisors; they are given in Table 2.


    Table 2 Description of Each Scenario

Scenario No. Scenario Description

S1 Change of database (Microsoft SQL Server, PostgreSQL, Oracle, MongoDB or another NoSQL database, or any other)

S2 Change of web server (Tomcat, Blazix, JBoss, LAMP or any other)

S3 Enhancing a GUI feature for any functionality in the main menu of the software

S4 Change of operating system (Linux, Microsoft Windows or Microsoft Windows Server edition)

S5 Changing the technique for probing the network and storing data in the database

S6 Enhancing functionality for keeping a backup of software data on cloud storage services or a dedicated storage server (redundancy)

S7 Additional search capabilities (in the GUI of networks)

S8 Web page upgrading (for example: multilingual support, meta-refresh or any other)

S9 Upgrade of the database

S10 Remote administration of the NMS

3.2.4 Implementation cost

Implementation cost captures how much time (in hours) is consumed in successfully developing a software prototype with all its functionality [20]. Figure 6 shows the three main phases of a prototype and how the implementation cost of each prototype is calculated: for each phase of the software prototype (design of the prototype, prototype development and implementation, and testing of the prototype), the time is recorded in hours and summed [13][20].

[Figure: the hours recorded for design of the prototype (modelling, estimated time), prototype development and implementation, and testing of the prototype are summed into a total time in hours.]

Figure 6 Implementation Cost for Prototypes

The same procedure is repeated to calculate the implementation cost of all three software prototypes. Throughout this process, the authors observed which phase consumed the most time and effort during development.
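For example (hypothetical figures, not measurements from this study): if designing a prototype took 15 hours, development and implementation 40 hours, and testing 10 hours, its implementation cost would be 15 + 40 + 10 = 65 hours.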

3.2.5 Performance

Performance is defined [75] as "the degree to which a system or component accomplishes its designated functions within given constraints, such as speed, accuracy or memory usage". Performance is about timing [23]: it starts with a request to the system, which processes the arriving request to generate a response. We measure performance in terms of response time, and the overall process is shown in Figure 7. In each prototype, the response time of a user request is measured in milliseconds.

[Figure 7 depicts the measurement flow: a request marks the start time (the procedure call time), the script/code executes to provide the service including database access, the end time is taken, and the response time is computed as end - start.]

Figure 7 Measuring Response Time for Prototypes
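
As a minimal sketch of this measurement procedure, the Python fragment below times a single user request; the URL is a placeholder, since the actual prototype interfaces are not reproduced here.

    import time
    import urllib.request

    def measure_response_time_ms(url):
        # start time (procedure call time)
        start = time.perf_counter()
        # script/code execution for providing the service, including
        # database access, happens while we wait for the full response
        with urllib.request.urlopen(url) as response:
            response.read()
        # end time; response time = end - start, converted to milliseconds
        end = time.perf_counter()
        return (end - start) * 1000.0

    # Example: thirty executions of one test case, as in the evaluation below.
    # samples = [measure_response_time_ms("http://localhost/view_all_networks")
    #            for _ in range(30)]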

To measure the performance we executed each prototype several times, but for the comparative analysis we used an equal number of executions (thirty) per prototype. The average and standard deviation [58] are used as references to compare the performance and maintainability of all three prototypes.

    3.2.5.1 Average:

    Mean or average is defined as the sum of all the given elements divided by the total

    number of elements [83]. The average formula is given below:

Formula: Mean = sum of elements / number of elements

\bar{X} = \frac{\sum X}{N}

Equation 2 Average Formula

Where
N is the number of observations and
\sum X is the sum of all the observations.

    3.2.5.2 Standard deviation:

The standard deviation measures the average distance of the observations from their mean. The equation for the standard deviation [83] is given below:

S = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2}

Equation 3 Standard Deviation Formula

Where
S = standard deviation, X_i = value i of the data set, \bar{X} = mean of the values, n = number of values.
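
A short sketch of both statistics, using thirty placeholder response-time observations (not the measurements reported in Chapter 4); Python's statistics.stdev applies the same sample formula with the n - 1 divisor as Equation 3.

    import statistics

    # Placeholder response times in milliseconds (n = 30), not thesis data.
    samples = [6.2, 7.1, 6.8, 5.9, 7.4, 6.6] * 5

    mean = statistics.mean(samples)    # Equation 2: sum of elements / number of elements
    stdev = statistics.stdev(samples)  # Equation 3: sqrt(sum((Xi - mean)^2) / (n - 1))

    print(f"average = {mean:.2f} ms, standard deviation = {stdev:.4f}")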


    4 RESULT & ANALYSIS

    4.1 Result from Literature Review

4.1.1 Search String Results from Databases

The internet is a valuable source for finding published articles through digital library resources [59]. The following digital libraries were selected for obtaining the results of the literature review.

Table 3 Databases Description

S. No.  Name of Database     Type of Database
01      Engineering Village  Digital
02      Scopus               Digital
03      IEEE Xplore          Digital

The search strings were formulated from the keywords; the results obtained from the various databases are shown in Table 4, and further strings with different strategies are included in appendix A.

Table 4 Search String Results

Database: Engineering Village (Inspec)
Search string: ((($software $architecture) WN KY) AND (1969-2012 WN YR)) AND ((($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ((($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ((($performance) WN KY) AND (1969-2012 WN YR))
Results: 76
Search string: ((($software $platform) WN KY) AND (1969-2012 WN YR)) AND ((($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ((($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ((($performance) WN KY) AND (1969-2012 WN YR))
Results: 31

Database: Scopus
Search string: (TITLE-ABS-KEY(software architecture) AND TITLE-ABS-KEY(performance) AND TITLE-ABS-KEY(maintainability) AND TITLE-ABS-KEY(implementation))
Results: 30
Search string: (TITLE-ABS-KEY(software platform) AND TITLE-ABS-KEY(performance) AND TITLE-ABS-KEY(maintainability) AND TITLE-ABS-KEY(implementation))
Results: 12

Database: IEEE Xplore
Search string: (((software architecture) AND maintainability) AND performance)
Results: 106
Search string: ((((software platform) AND maintainability) AND performance))
Results: 25

4.1.2 Conducting the Literature Review

Articles were selected from the various databases by using the search strings and filtered according to the inclusion/exclusion criteria. The articles most relevant to our research question were retained and are shown in Table 5.


Table 5 Refinement of Articles from Databases

S. No.  Name of Database            Total Articles Found  Filtered by Reading Title/Abstract  Screening Full Text
01      Engineering Village Inspec  107                   39                                  6
02      Scopus                      42                    17                                  2
03      IEEE Xplore                 131                   24                                  4
        Sum of all articles         280                   80                                  12

Total (A): 12
Duplicates (B): 2
Grand Total (A-B): 10

    4.1.3 Snowball Sampling

4.1.3.1 Selection of Articles

Snowball sampling is a repetitive process in which further articles are found by following the references of already selected articles [64]; it is used to broaden the scope of the research. Ten articles in total were selected from the different databases after filtering; we then performed snowball sampling on those articles and extracted five more relevant articles, given in Table 6.

Table 6 Snowball Sampling Selected Articles

S. No.  Description  Articles  Selected Articles
1       Databases    10        [6] [8] [10] [35] [15] [19] [24] [70] [21] [84]
2       Snowball     5         [1] [56] [12] [40] [26]
        Total        15

    4.1.3.1.1 Articles Selected for Study

During the literature review we selected 15 articles for the primary study; each selected article was assigned an article number (A1-A15), and every research article is summarized below.

Table 7 List of Selected Articles

Article no. [Reference no.]  Article / Paper

A1 [1]   P. O. Bengtsson and J. Bosch, "Haemo dialysis software architecture design experiences," in Proceedings of the 1999 International Conference on Software Engineering, 1999, pp. 516-525.

A2 [56]  J. E. Bardram, H. B. Christensen, and K. M. Hansen, "Architectural prototyping: an approach for grounding architectural design and learning," in Software Architecture, 2004. WICSA 2004. Proceedings. Fourth Working IEEE/IFIP Conference on, 2004, pp. 15-24.

A3 [6]   J. Bosch and P. Bengtsson, "Assessing optimal software architecture maintainability," in Fifth European Conference on Software Maintenance and Reengineering, 2001, pp. 168-175.

A4 [8]   J. S. Dong, K. Lee, K. H. Kim, S. T. Kim, J. M. Cho, and T. H. Kim, "Platform Maintenance Process for Software Quality Assurance in Product Line," in Computer Science and Software Engineering, 2008 International Conference on, 2008, vol. 2, pp. 325-331.

A5 [10]  P. Bengtsson and J. Bosch, "Scenario-based software architecture reengineering," in Fifth International Conference on Software Reuse, 1998. Proceedings, 1998, pp. 308-317.

A6 [12]  R. Kazman, L. Bass, G. Abowd, and M. Webb, "SAAM: a method for analyzing the properties of software architectures," in 16th International Conference on Software Engineering, 1994. Proceedings. ICSE-16, 1994, pp. 81-90.

A7 [35]  H. Algestam, M. Offesson, and L. Lundberg, "Using components to increase maintainability in a large telecommunication system," in Software Engineering Conference, 2002. Ninth Asia-Pacific, 2002, pp. 65-73.

A8 [15]  Xiaosong Wang, Li Wang, Benhai Yu, and Guixue Dong, "Studies on Network Management System framework of Campus Network," in 2010 2nd International Asia Conference on Informatics in Control, Automation and Robotics (CAR), 2010, vol. 2, pp. 285-289.

A9 [40]  L. Lundberg, J. Bosch, D. Haggander, and P. O. Bengtsson, "Quality attributes in software architecture design," in Proceedings of SEA'99: 3rd Annual IASTED International Conference on Software Engineering and Applications, 6-8 Oct. 1999, Anaheim, CA, USA, 1999, pp. 353-362.

A10 [26] Z. Li and J. Keung, "Software Cost Estimation Framework for Service-Oriented Architecture Systems Using Divide-and-Conquer Approach," in Service Oriented System Engineering (SOSE), 2010 Fifth IEEE International Symposium on, 2010, pp. 47-54.

A11 [19] P. Bengtsson and J. Bosch, "Architecture level prediction of software maintenance," in Proceedings of the Third European Conference on Software Maintenance and Reengineering, 1999, pp. 139-147.

A12 [24] H. Grahn and J. Bosch, "Some initial performance characteristics of three architectural styles," in Proceedings of the 1st International Workshop on Software and Performance, New York, NY, USA, 1998, pp. 197-198.

A13 [70] G. Ortiz, B. Bordbar, and J. Hernandez, "Evaluating the use of AOP and MDA in Web service development," in 2008 3rd International Conference on Internet and Web Applications and Services, 8-13 June 2008, Piscataway, NJ, USA, 2008, pp. 78-83.

A14 [21] D. Haggander, L. Lundberg, and J. Matton, "Quality attribute conflicts - experiences from a large telecommunication application," in Engineering of Complex Computer Systems, 2001. Proceedings. Seventh IEEE International Conference on, 2001, pp. 96-105.

A15 [84] K. Bennett, M. Munro, J. Xu, N. Gold, P. Layzell, N. Mehandjiev, D. Budgen, and P. Brereton, Prototype Implementations of an Architectural Model for Service-Based Flexible Software, 2002.

P. O. Bengtsson et al. [A1] redesigned an existing, hard-to-maintain system for dialysis machines. They implemented a prototype that fulfils the quality attributes and evolved the prototype into a complete system [1]. J. E. Bardram et al. [A2] discussed architectural prototypes and their qualities for experimenting with alternative architectural styles, patterns and features; among other cases, they developed a prototype of a globally distributed customer service system with demanding performance requirements [56].

J. Bosch et al. [A3] discussed maintenance effort and the impact of the scenario profile on the architecture in order to calculate a theoretical minimum of maintainability effort. These results were used for scenario-based maintainability assessment of software architectures [6]. J. S. Dong et al. [A4] developed different software platforms by reusing software architectures and source code. They proposed a technique for reusing software architecture and managing platform quality attributes through defect removal and continuous maintenance [8].


P. O. Bengtsson et al. [A5] addressed the quality attributes of software architecture while introducing a method for reengineering software architectures. The main focus of the research was maintainability and reusability for domain-specific software architectures [10]. R. Kazman et al. [A6] discussed the aspects that require attention during the evaluation of a software architecture. The SAAM method was used to describe and analyze software architectures with respect to maintainability [12]. H. Algestam et al. [A7] designed and implemented a prototype with a new component-based architecture in order to compare it with the existing architecture [35].

Xiaosong Wang et al. [A8] proposed an extensive network management system design based on web services that covers the demands of network management, including the system architecture, a prototype, and the deployment method. In this design, the major components are data acquisition, performance management, and configuration management. The architecture was an integrated network management system built on the LAMP platform [15].

L. Lundberg et al. [A9] reported experiences from five industrial applications. The experience gained from these five projects shows that performance and modifiability are affected by the choice of implementation techniques [40]. Z. Li and J. Keung [A10] described a framework, based on a divide-and-conquer approach, for cost estimation of software built on a service-oriented architecture. The framework helps organizations overcome the hurdles of dealing with separate parts of the architecture [26].

P. Bengtsson et al. [A11] used a method for maintainability prediction of software architectures based on the requirement specification and the architecture design. The method suits design processes that iterate frequently, with the architecture evaluated in each iteration [19].

H. Grahn and J. Bosch [A12] were concerned with quality attributes and software design styles; the architecture is described in terms of components and direct interconnections in order to obtain the best performance data [24]. G. Ortiz et al. [A13] presented a case study evaluating the use of MDA and AOP in web service development; using metrics, they showed that the results obtained with the generated techniques are traceable and modular [70].

D. Haggander et al. [A14] discussed applications that must provide high performance, availability and maintainability in order to reduce cost. Maximizing every quality attribute in one design is not always possible, so trade-offs are necessary, and a major challenge in software design is to find solutions that balance and optimize the quality attributes [21]. K. Bennett et al. [A15] discussed service architecture during the implementation of two prototypes based on different technologies, namely scripting and e-Speak. After the implementation they found that the functional and non-functional attributes of the services must be negotiated [84].


    4.2 Prototyping

4.2.1 Prototype 1

Prototype 1 was developed and implemented on the PHP platform, as shown in Figure 21 in appendix B. The prototype's performance in terms of response time, its implementation cost in hours, and its maintainability are shown in Tables 8, 9 and 10 respectively.

    4.2.1.1 Performance (Response Time)

Three main functionalities (test cases) were selected by the industrial partner for measuring the response time; the performance analysis of the whole prototype is based on these functionalities, shown in Table 8 with their average and standard deviation. The response time is measured per user request. The prototype was executed several times, but for the statistical analysis [83] thirty executions are considered; the recorded response times (in milliseconds) of each execution are shown in appendix B Table 23 and plotted in Figure 8.

Table 8 Response Time of Prototype 1

Functionality/Test case  Total no. of executions  Average Response Time (ms)  Standard Deviation
View All Network         30                       6.74                        1.1622
Status of Network        30                       6.86                        1.4755
Search Network           30                       7.63                        1.4370

    Figure 8 Response Time of Prototype 1

Figure 8 shows the response time of prototype 1 for the three test cases: view all networks, search networks, and status of networks. The x-axis shows the execution number and the y-axis the time in milliseconds. From the graph we observed that the measured time for each functionality differs from execution to execution.

    4.2.1.2 Implementation Cost

The total implementation cost in terms of time for completing this prototype is 78 hours, which includes the design, development, and testing of the prototype. Table 9


describes the implementation cost of this prototype at each stage, and the corresponding pie chart is shown in Figure 9.

Table 9 Implementation Cost of Prototype 1

Stages/Phases          Prototype 1 (PHP) Cost in Hours
Prototype design       10
Prototype development  60
Testing of prototype   8
Total time (hours)     78

    Figure 9 Implementation Cost of Prototype 1

    4.2.1.3 Maintainability

The change scenario efforts for this prototype were gathered through a survey in both industry and academia. Survey responses were collected from twenty-five experts for the maintainability evaluation, and the result of each survey is given in appendix E. For each completed survey, the statistical operations [83][6] were performed for every change scenario of the prototype, using the scenarios given in Chapter 3 Table 2. The efforts were then calculated with the maintainability effort formula (Equation 1) from Chapter 3, and the calculated maintainability efforts are recorded in Table 22 in appendix B. The average maintainability over the twenty-five surveys was then calculated from Table 22, and the averaged results are shown in Table 10.
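
To make this aggregation concrete, the sketch below (with made-up respondent data) reduces each respondent's scenario probabilities and effort estimates to one value via Equation 1 and then averages over the surveys; the real evaluation used twenty-five respondents and the ten scenarios S1-S10.

    import statistics

    def maintainability_effort(probabilities, efforts):
        # Equation 1: probability-weighted effort over the change scenarios
        return sum(p * e for p, e in zip(probabilities, efforts))

    # Hypothetical survey data: one (probabilities, efforts) pair per respondent;
    # the real study collected one such pair from each of 25 respondents.
    surveys = [
        ([0.1] * 10, [80.0] * 10),  # respondent 1
        ([0.1] * 10, [95.0] * 10),  # respondent 2
        ([0.1] * 10, [70.0] * 10),  # respondent 3
    ]

    per_survey = [maintainability_effort(p, e) for p, e in surveys]
    print("average maintainability effort:", statistics.mean(per_survey))
    print("standard deviation:", statistics.stdev(per_survey))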

Table 10 Maintainability of Prototype 1

Prototype          Total no. of Surveys  Average Maintainability Effort  Standard Deviation
Prototype 1 (PHP)  25                    80.920                          19.349677

Figure 10 shows the maintainability of prototype 1 based on the results in Table 22. The x-axis shows the survey number and the y-axis the effort required to implement the change scenarios in the software prototype.


    Figure 10 Maintainability of Prototype 1

4.2.2 Prototype 2

Prototype 2 was developed and implemented on the JSP platform, as shown in Figure 22 in appendix B. The prototype's performance in terms of response time, implementation cost and maintainability is given in Tables 11, 12 and 13 respectively.

    4.2.2.1 Performance (Response Time)

The response time is measured in milliseconds for the three selected functionalities per user request; the output of this prototype is shown in Table 11. The prototype was executed several times; for the statistical analysis, thirty executions are considered, the same as for prototype 1, to allow comparison among the prototypes. The recorded response time of each execution is shown in appendix B Table 24 and also in Figure 11.

Table 11 Response Time of Prototype 2

Functionality/Test case  Total no. of executions  Average Response Time (ms)  Standard Deviation
View All Network         30                       13.80                       2.8696
Status of Network        30                       10.83                       2.3057
Search Network           30                       14.03                       2.5527

    Figure 11 Response Time of Prototype 2


Figure 11 shows the response time of prototype 2 for view all networks, search networks and status of networks. The x-axis shows the execution number and the y-axis the time in milliseconds. From the graph we observed that the results vary across the executions of this prototype.

    4.2.2.2 Implementation cost

Prototype 2 took 80 hours for the whole development process, and the results are shown in Table 12 and Figure 12 respectively. The table below gives the estimated implementation cost in hours for this prototype at every stage.

Table 12 Implementation Cost of Prototype 2

Stages/Phases          Prototype 2 (JSP) Cost in Hours
Prototype design       8
Prototype development  64
Testing of prototype   8
Total time (hours)     80

    Figure 12 Implementation Cost of Prototype 2

    4.2.2.3 Maintainability

Maintainability is calculated using the effort formula given in Chapter 3. The individual results of the survey of twenty-five industrial experts are shown in Figure 13 and Table 24 in appendix B. The average maintainability effort of this prototype is shown in Table 13.

Table 13 Maintainability of Prototype 2

Prototype          Total no. of Surveys  Average Maintainability Effort  Standard Deviation
Prototype 2 (JSP)  25                    79.400                          16.258331

Figure 13 shows the maintainability effort of prototype 2 taken from the survey. The x-axis shows the survey number and the y-axis the effort required to implement the change scenarios in the prototype.


    Figure 13 Maintainability of Prototype 2

4.2.3 Prototype 3

Prototype 3 was developed and implemented on the Perl platform, as shown in Figure 23 in appendix B. The prototype's performance in terms of response time, implementation cost and maintainability is shown in Tables 14, 15 and 16 respectively.

    4.2.3.1 Performance (Response Time)

The response time is measured in milliseconds for the selected functionalities per user request; the results of prototype 3 are shown in Table 14. Prototype 3 was also executed several times, but for the statistical analysis thirty executions are considered, the same as for prototypes 1 and 2, to allow comparison among the prototypes. The response time recorded for each execution is shown in appendix B Table 27 and also in Figure 14.

Table 14 Response Time of Prototype 3

Functionality/Test case  Total no. of executions  Average Response Time (ms)  Standard Deviation
View All Network         30                       6.90                        1.3132
Status of Network        30                       9.04                        1.6877
Search Network           30                       8.24                        1.3778

Figure 14 shows the response time of prototype 3 for view all networks, search networks and status of networks. The x-axis shows the execution number and the y-axis the time in milliseconds. From the graph we observed that the results are irregular across the executions of this prototype; thirty executions are considered for the performance comparison.


    Figure 14 Response Time of Prototype 3

    4.2.3.2 Implementation Cost

Prototype 3 took 58 hours for the whole development process, and the results are shown in Table 15 and Figure 15 respectively. The table gives the estimated implementation cost in hours for prototype 3 at every stage.

Table 15 Implementation Cost of Prototype 3

Stages/Phases          Prototype 3 (Perl) Cost in Hours
Prototype design       6
Prototype development  45
Testing of prototype   7
Total time (hours)     58

    Figure 15 Implementation Cost of Prototype 3

    4.2.3.3 Maintainability

The same strategy used for prototypes 1 and 2 is applied to calculate the maintainability effort of prototype 3. The result for each survey respondent is shown in Figure 16 and Table 26 in appendix B. The average maintainability effort of this prototype is shown in Table 16.


Table 16 Maintainability of Prototype 3

Prototype           Total no. of Surveys  Average Maintainability Effort  Standard Deviation
Prototype 3 (Perl)  25                    81.400                          19.733643

Figure 16 shows the maintainability effort of this prototype taken from each survey. The x-axis shows the survey number and the y-axis the effort required to implement the change scenarios in the prototype.

    Figure 16 Maintainability of Prototype 3

    4.3 Comparison and Summing up

4.3.1 Maintainability

We assessed the maintainability effort of each prototype from the survey. The average and standard deviation of the maintainability effort were calculated for all three prototypes, and the comparison is provided in Table 17. The results in Figure 17 show that prototype 2 has the lowest maintainability effort of the three prototypes, while prototype 3 has the highest, slightly above prototype 1.

Table 17 Maintainability of all Three Prototypes

Maintainability          Prototype 1 (PHP)  Prototype 2 (JSP)  Prototype 3 (Perl)
Average Maintainability  80.920             79.400             81.400
Standard Deviation       19.349677          16.258331          19.733643


    Figure 17 Comparison of Maintainability

4.3.2 Implementation Cost

We recorded the implementation cost of each prototype in hours; the comparison among them is shown in Table 18 and in the graph in Figure 18. The assessment shows the total implementation time taken for each prototype to design, develop and test it. We observed that prototype 3 has the lowest implementation cost compared with prototypes 1 and 2, whereas the implementation costs of prototype 1 and prototype 2 are approximately the same in hours but differ in lines of code.

Table 18 Implementation Cost of all Three Prototypes

Stages/Phases          Prototype 1  Prototype 2  Prototype 3
Design of prototype    10           8            6
Prototype development  60           64           45
Testing of prototype   8            8            7
Total time (hours)     78           80           58
Lines of code          1.5 kloc     1.2 kloc     1.35 kloc

    Figure 18 Comparison of Implementation Cost


4.3.3 Performance

The response times used for the performance comparison of the prototypes are shown in Figure 19. Each prototype was executed thirty times (executions 1-30) to measure the average response time. Table 19 shows the performance results of all prototypes: prototype 1 on the PHP platform is the fastest of the three with respect to all three major functionalities/test cases considered for evaluating performance. The lowest performance is that of prototype 2 on the JSP platform. Prototype 3 on the Perl platform has intermediate performance: it is faster than prototype 2 but slower than prototype 1.

Table 19 Response Time of all Three Prototypes

Avg. response time in milliseconds
S. No.  Functionality/test case  Prototype 1 (PHP)  Prototype 2 (JSP)  Prototype 3 (Perl)
1       View Network             6.74               13.80              6.90
2       Search Network           6.86               10.83              9.04
3       Status of Network        7.63               14.03              8.24

    Figure 19 Comparison of Response Time

4.3.4 Quality Attributes Summing Up

The overall comparison of the quality attributes across the different software platforms is shown in Table 20. The experience from each individual prototype shows that maintainability, performance and implementation cost are important quality attributes in a software system.

Table 20 Overall Comparison of Quality Attributes

Prototype number  Maintainability  Performance (avg. response time, ms)  Implementation cost (hours)
1                 80.92            7.63                                  78
2                 79.40            14.03                                 80
3                 81.40            8.24                                  58


The first prototype ranks highest with respect to performance, ranks in the middle for maintainability, and has an implementation cost close to that of the second prototype. The second prototype ranks highest in maintainability compared with the first and third prototypes, while it has the lowest performance and an implementation cost approximately equal to that of the first prototype. In the same fashion, the third prototype ranks best in implementation cost, and its performance is better than the second prototype's, although not as good as the first prototype's. In the final analysis