Master’s Thesis

Electrical Engineering

June 2012

Performance, Maintainability and

Implementation Cost for Different Software

Platforms in a Network Management System

Muhammad Nadeem

Mohammed Azharuddin

School of Computing

Blekinge Institute of Technology

SE – 371 79 Karlskrona

Sweden


Contact Information:

Authors:

Muhammad Nadeem

E-mail: [email protected]

Mohammed Azharuddin

E-mail: [email protected]

External advisor:

Anders Grahn Email: [email protected]

Company/Organization: Smart Grid Networks AB

Address: Rombvägen 4 SE-371 65 Lyckeby Sweden

University advisor:

Professor Lars Lundberg Email: [email protected]

School of Computing

Blekinge Institute of Technology

School of Computing

Blekinge Institute of Technology

SE – 371 79 Karlskrona

Sweden

Internet : www.bth.se/com

Phone : +46 455 38 50 00

Fax : +46 455 38 50 57

This thesis is submitted to the School of Computing at Blekinge Institute of Technology in

partial fulfillment of the requirements for the degree of Master of Science in Electrical

Engineering. The thesis is equivalent to 20 weeks of full time studies.

University Examiner:

Professor Patrik Arlos Email: [email protected]

School of Computing

Blekinge Institute of Technology


ABSTRACT

Context: Software architecture is an emerging field that has become progressively more popular in software engineering and an essential part of the development of software systems. Prototyping is possibly one of the most commonly used learning paradigms in software architecture. It is therefore reasonable to express some of the requirements as specific quality attributes when developing and comparatively analyzing prototypes. In this thesis we deal with software architecture based on different prototypes, where the different platforms share a common (canonical) structure within the software architecture. Analyzing the prototypes against the required quality attributes also has good potential for performance improvement.

Objectives: In this study, we investigate the significance of quality attributes such as performance, maintainability and implementation cost for different software platforms. The main focus is on the integration of prototypes in software architecture. We specifically investigate several maintainability challenges faced by organizations, and address those challenges in a prototype of a network management system built on different software platforms.

Methods: In this study, both theoretical and empirical research methods have been applied. To accomplish the goal of this thesis, a literature review was performed by studying articles from several sources, complemented by the snowball sampling method to decrease the chance of missing any relevant article. During the literature review, we analyzed the learning structure and workflow of prototypes and then incorporated quality attributes through theoretical analysis. In the experiment, three prototypes were built on different software platforms: PHP, JSP and Perl. Each prototype was evaluated with respect to maintainability using twenty-five surveys of industrial experts, implementation cost in number of hours, and performance in terms of response time.

Results: We identified different challenges in software architecture and in practicing software prototypes on different software platforms, and analyzed the performance, maintainability and implementation cost for each platform. A survey was conducted to recognize challenges and practices in the maintainability of the prototypes. We show that it is possible to achieve better quality attributes for a given system.

Conclusions: There is a trade-off: the best implementation alternative depends on how important the different quality attributes are in a given situation.

Keywords: Implementation Cost, Maintainability, Network

Management System, Performance, Software Architecture

and Software Platform.


ACKNOWLEDGEMENT

First and foremost, all praise and thanks be to Allah, the Lord of all the Worlds, the most

Gracious and most Merciful, who enabled us to complete this thesis successfully.

We are heartily thankful to our academic supervisor Dr. Lars Lundberg for giving us the chance to work under his supervision and for providing his esteemed guidance, encouragement and suggestions throughout this thesis. We honor his deep knowledge and skill in many areas, which helped us improve our research.

We are also thankful for the technical guidance, support and encouragement given by Anders Grahn (Chief Technical Officer) and Peter Engström (Design Engineer) from Smart Grid Networks AB, Sweden. We would also like to thank all the industry and academia survey participants who contributed to the survey part of this research.

Next, we express our gratitude to our beloved friends for helping us by suggesting improvements to the report and providing ideas for the analysis. We surely owe our deepest gratitude to our families for their prayers and their moral and unconditional support throughout our studies.

Finally, we wish to express our sincere appreciation to all of those who pray for us and give

us moral support and encouragement.


LIST OF FIGURES

Figure 1 Structural Diagram of Literature Review .......... 15
Figure 2 Structural Design of Experimental Process .......... 17
Figure 3 Structure of Software Architecture .......... 18
Figure 4 Basic Prototype Development Process .......... 19
Figure 5 Maintainability Assessment of Software Prototype .......... 20
Figure 6 Implementation Cost for Prototypes .......... 21
Figure 7 Measuring Response Time for Prototypes .......... 22
Figure 8 Response Time of Prototype 1 .......... 27
Figure 9 Implementation Cost of Prototype 1 .......... 28
Figure 10 Maintainability of Prototype 1 .......... 29
Figure 11 Response Time of Prototype 2 .......... 29
Figure 12 Implementation Cost of Prototype 2 .......... 30
Figure 13 Maintainability of Prototype 2 .......... 31
Figure 14 Response Time of Prototype 3 .......... 32
Figure 15 Implementation Cost of Prototype 3 .......... 32
Figure 16 Maintainability of Prototype 3 .......... 33
Figure 17 Comparison of Maintainability .......... 34
Figure 18 Comparison of Implementation Cost .......... 34
Figure 19 Comparison of Response Time .......... 35
Figure 20 Prototype 1 of Software Architecture .......... 50
Figure 21 Prototype 2 of Software Architecture .......... 52
Figure 22 Prototype 3 of Software Architecture .......... 54
Figure 23 Maintainability Efforts Validation .......... 69
Figure 24 Web Server Survey by Netcraft .......... 71


LIST OF TABLES

Table 1 Search String for Literature Review .......... 16
Table 2 Description of Each Scenario .......... 21
Table 3 Databases Description .......... 23
Table 4 Search String Results .......... 23
Table 5 Refinement of Article from Databases .......... 24
Table 6 Snowball Sampling Selected Articles .......... 24
Table 7 List of Selected Articles .......... 24
Table 8 Response Time of Prototype 1 .......... 27
Table 9 Implementation Cost of Prototype 1 .......... 28
Table 10 Maintainability of Prototype 1 .......... 28
Table 11 Response Time of Prototype 2 .......... 29
Table 12 Implementation Cost of Prototype 2 .......... 30
Table 13 Maintainability of Prototype 2 .......... 30
Table 14 Response Time of Prototype 3 .......... 31
Table 15 Implementation Cost of Prototype 3 .......... 32
Table 16 Maintainability of Prototype 3 .......... 33
Table 17 Maintainability of all Three Prototypes .......... 33
Table 18 Implementation Cost of all Three Prototypes .......... 34
Table 19 Response Time of all Three Prototypes .......... 35
Table 20 Overall Comparison of Quality Attributes .......... 35
Table 21 List of Search Strings .......... 49
Table 22 Maintainability Average for Prototype 1 .......... 50
Table 23 Prototype 1 Response Time .......... 51
Table 24 Maintainability Average for Prototype 2 .......... 52
Table 25 Prototype 2 Response Time .......... 53
Table 26 Maintainability Average for Prototype 3 .......... 54
Table 27 Prototype 3 Response Time .......... 55
Table 28 Prototype Efforts Rating .......... 57
Table 29 Probability Ratings of Selected Scenario .......... 57
Table 30 Example Table for Filling Survey Form .......... 57
Table 31 Prototype and Probability Scenario Rating .......... 57
Table 32 List of Industries for Survey as Target Audience .......... 58
Table 33 Organization Using Different Platforms .......... 71
Table 34 Websites Based on Different Platforms .......... 71


ACRONYMS

ADD Attribute Driven Design

ATAM Architectural Trade-off Analysis Method

ANOVA Analysis of Variance

AOP Aspect-Oriented Programming

CBAM Cost Benefit Analysis Method

CSS Cascading Style Sheets

CGI Common Gateway Interface

CORBA Common Object Request Broker Architecture

CPU Central Processing Unit

GUI Graphical User Interface

FDM Functionality-based Design Method

HTML Hyper Text Markup Language

HTTP Hyper Text Transfer Protocol

IEEE Institute of Electrical and Electronics Engineers

JDBC Java Database Connectivity

JDK Java Development Kit

JPL Jet Propulsion Laboratory

JSP Java Server Pages

J2EE Java 2 Platform, Enterprise Edition

LAMP Linux, Apache, MySQL and PHP

LOC Lines of Code

MAAP New Millennium Autonomy Architecture Prototype

MDA Model-Driven Architecture

NASA National Aeronautics and Space Administration

NMRA New Millennium Remote Agent

ODBC Open Database Connectivity

PERL Practical Extraction and Report Language

PHP Hypertext Preprocessor

PRO3D Programming for Future 3D Architecture with Many Cores

RCS Remote Control System

SA Software Architecture

SAAM Scenario-based Architecture Analysis Method

SPL Software Product Line

SQL Structured Query Language

SPSS Statistical Package for the Social Sciences

TCP Transmission Control Protocol

WWW World Wide Web


TABLE OF CONTENTS

ABSTRACT .......... 3

ACKNOWLEDGEMENT ................................................................................................................... 4

LIST OF FIGURES .............................................................................................................................. 5

LIST OF TABLES ................................................................................................................................ 6

ACRONYMS ......................................................................................................................................... 7

1 INTRODUCTION ..................................................................................................................... 10

1.1 OVERVIEW .......... 10
1.2 AIMS AND OBJECTIVES .......... 11
1.3 RESEARCH QUESTION .......... 11
1.4 THESIS OUTLINE .......... 11

2 BACKGROUND ........................................................................................................................ 12

2.1 SOFTWARE ARCHITECTURE AND QUALITY ATTRIBUTES .......... 12
2.2 SOFTWARE PLATFORM AND PROTOTYPE .......... 12
2.3 NETWORK MANAGEMENT SYSTEM .......... 13
2.4 RELATED WORK .......... 13

3 RESEARCH METHODOLOGY ............................................................................................. 15

3.1 LITERATURE REVIEW .......... 15
3.1.1 Search String and Databases .......... 15
3.1.2 Article Selection Criteria .......... 16
3.1.3 Snowball Sampling Method .......... 16
3.2 EXPERIMENT .......... 17
3.2.1 Software Architecture Design Method for NMS .......... 17
3.2.2 Development of Prototype .......... 18
3.2.3 Maintainability .......... 19
3.2.4 Implementation Cost .......... 21
3.2.5 Performance .......... 21

4 RESULT & ANALYSIS ............................................................................................................ 23

4.1 RESULT FROM LITERATURE REVIEW .......... 23
4.1.1 Search String Results from Databases .......... 23
4.1.2 Conducting the Literature Review .......... 23
4.1.3 Snowball Sampling .......... 24
4.2 PROTOTYPING .......... 27
4.2.1 Prototype 1 .......... 27
4.2.2 Prototype 2 .......... 29
4.2.3 Prototype 3 .......... 31
4.3 COMPARISON AND SUMMING UP .......... 33
4.3.1 Maintainability .......... 33
4.3.2 Implementation Cost .......... 34
4.3.3 Performance .......... 35
4.3.4 Quality Attributes Summing Up .......... 35

5 DISCUSSION ............................................................................................................................. 37

5.1 VERIFICATION, VALIDATION AND DISCUSSION .......... 37
5.2 VALIDITY THREATS .......... 39
5.2.1 Internal Validity Threats .......... 39
5.2.2 External Validity Threats .......... 39
5.2.3 Construct Validity Threats .......... 40
5.2.4 Conclusion Validity Threats .......... 40

6 CONCLUSION .......................................................................................................................... 41


7 FUTURE WORK ....................................................................................................................... 43

REFERENCES .......... 44

APPENDIX ......................................................................................................................................... 49

APPENDIX-A: SEARCH STRING .......... 49
APPENDIX-B: PROTOTYPES .......... 50
APPENDIX-C: SURVEY FORM .......... 56
APPENDIX-D: TARGET AUDIENCE .......... 58
APPENDIX-E: SURVEY DATA .......... 59
APPENDIX-F: SURVEY DATA ANALYSIS .......... 68
APPENDIX-G: SERVERS & PLATFORMS .......... 71
APPENDIX-H: SOURCE CODE .......... 73


1 INTRODUCTION

1.1 Overview

In many real-world situations, it is important to make accurate predictions based on the available information. Software architecture constrains the achievement of quality attributes such as performance, usability and maintainability of a system [1][2], and it has become an accepted concept in academia as well as industry. Software architecture is a structured solution that addresses all the technical and operational requirements while balancing the quality attributes of the software system [3]. It requires a series of decisions based on a wide range of factors, and each of these decisions has a considerable impact on performance, maintainability and the overall success of the application [4].

Why do we need software architecture and its design methods for different software platforms? The inspiration behind this research is that global technologies depend entirely on software. Software needs to be developed on different software platforms (e.g. PHP, JSP and Perl), and maintained and updated, in order to remain contemporary with changes in technology [5].

Every software system depends on its architectural design and software platform, so there is a need for even more in-depth experiments investigating the design of prototypes on different software platforms. Industry wants to go further with other software platforms using the modern technologies available today. Companies are searching for the best solution: one that is as quick as possible to implement, easy to maintain, easy to extend, and provides the functionality needed in any NMS software. One part of that work is to investigate the programming platforms available today and determine their quality attributes. It is therefore very important to build the right prototype, based on the requirements, before the development of the software.
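Since the performance attribute is evaluated in this thesis in terms of response time, such a measurement can be sketched as timing complete HTTP round trips against a running prototype and averaging over several requests. The sketch below is a minimal illustration only; the URL is a hypothetical placeholder, not an address from the thesis setup:

```python
# Hedged sketch: measure the response time of a web prototype by timing
# full HTTP round trips and averaging over several samples.
import time
import urllib.request

def response_time(url, samples=5):
    """Return the average wall-clock time (seconds) per request."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # include transfer time, not just first byte
        total += time.perf_counter() - start
    return total / samples

# Hypothetical prototype URL, for illustration only:
# print(response_time("http://localhost/prototype1/"))
```

The same function could be pointed at each of the three prototypes in turn, so that all platforms are measured under identical conditions.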

J. Bosch et al. [1] described an approach to design and evaluate software architecture with logical and dynamic views, along with different styles and scenarios. P. O. Bengtsson et al. [6] presented a technique for analyzing the maintainability of a software architecture based on specific scenarios for a particular system; this technique has also been illustrated and evaluated using industrial cases [7].

Nowadays software architecture design is the foundation of any software system, and these systems depend entirely on their design method [1][5]. In both industry and academia, several generally recognized components are used in software architecture, which has led to better control over the design [4]. A software platform development method that allows a suitable description of the requirements and characteristics of the software is crucial [8]. There is thus a need for scientific research that identifies the different software platforms and obtains all the required quality attributes (performance, maintainability and implementation cost) with minimum use of resources.

The software platform is a concrete model of the architecture that allows executing the software with detailed hardware-software interaction based on performance and maintainability [9]. Software platform evaluation has a great impact on the quality attributes: the evaluation is carried out several times until all the requirements and quality attributes are gathered [10], and the evaluation process iterates the design of the platform until it fulfils all the required quality attributes [11]. Change scenarios are an important part of practicing architecture in order to gain more information about the maintainability of the software system [12][13][14].
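The change-scenario idea can be made concrete as a simple expected-effort calculation: each scenario is assigned a probability of occurring and an estimated modification effort, and the probability-weighted efforts are summed. A minimal sketch follows; the scenario names, probabilities and effort values are hypothetical placeholders, not the survey data collected in this thesis:

```python
# Scenario-based maintainability estimate: expected maintenance effort
# is the probability-weighted sum of per-scenario modification efforts.
# All scenario names and numbers below are hypothetical placeholders.

scenarios = [
    # (change scenario, probability of occurring, estimated effort in hours)
    ("add a new device type",       0.40, 16.0),
    ("change the database schema",  0.25, 24.0),
    ("modify the report layout",    0.35,  8.0),
]

def expected_maintenance_effort(scenarios):
    """Return the probability-weighted total effort in hours."""
    return sum(prob * effort for _, prob, effort in scenarios)

print(expected_maintenance_effort(scenarios))
```

A lower expected effort for one prototype than another then indicates better maintainability under the same set of change scenarios.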


A network management system is a technique for development and improvement. While network management technologies have been advancing day by day, they still fail to meet the demands of the rapid development of network management systems. Network management basically involves monitoring the devices connected in a network by gathering and analyzing data from those devices [15]. The purpose of any network management system is to monitor and control the behaviour of a network system in order to achieve better operation of that system [16].
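The monitoring just described can be sketched as a loop that periodically collects one status sample per managed device. The device identifiers and the reachability check below are hypothetical illustrations; a real NMS would query devices via SNMP, ICMP ping, or an HTTP status endpoint:

```python
# Minimal NMS polling sketch: gather one status sample per managed
# device for later analysis. Device IDs and the reachability check
# are hypothetical illustrations, not the thesis implementation.
import time

DEVICES = ["meter-01", "meter-02", "gateway-01"]  # hypothetical IDs

def check_device(device_id):
    """Placeholder reachability check; a real NMS would use SNMP,
    ICMP ping, or an HTTP status endpoint here."""
    return {"device": device_id, "reachable": True, "ts": time.time()}

def poll_once(devices):
    """Collect one status sample for every managed device."""
    return [check_device(d) for d in devices]

samples = poll_once(DEVICES)
print(len(samples))  # one sample per device
```

Repeating `poll_once` on a schedule and storing the samples gives the gathered data that the analysis side of the NMS then works on.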

1.2 Aims and Objectives

The aim of this research is to investigate the maintainability, performance and implementation cost of different software platforms for a network management system. The objectives are to:

- Build three prototypes with different software platforms and assess them with respect to performance, maintainability and implementation cost.
- Investigate the implementation cost of a network management system.
- Investigate the performance of the different prototypes based on their software platforms.
- Investigate the maintainability of the prototypes based on their software platforms.
- Investigate the significance (trade-offs and conflicts) of performance, maintainability and implementation cost for the different software platforms.
- Validate all three prototypes with industry (Smart Grid Networks AB).

1.3 Research Question

RQ1. What is the maintainability for different software platforms in a network management

system?

RQ2. What is the implementation cost for different software platforms in a network

management system?

RQ3. What is the performance for different software platforms in a network management

system?

RQ4. What are the trade-offs and conflicts between performance, maintainability and implementation cost for different software platforms in a network management system?

1.4 Thesis Outline

The thesis report is organized as follows:

Chapter 1 Introduction, motivation, research question, aims and objectives

Chapter 2 Background and related work of software architecture, software platforms,

performance, maintainability, implementation cost and network management

system

Chapter 3 Research methodology for this research. It covers strategies that we used for

the literature review and experiment

Chapter 4 Results from literature review and experiment

Chapter 5 Discusses results and findings

Chapter 6 Conclusion

Chapter 7 Future work


2 BACKGROUND

2.1 Software Architecture and Quality Attributes

The term software architecture was first used in a scientific article in 1981, when Erik Sandewall [17] described a concept for decomposing a system into modules. By the beginning of the 1990s, software architecture was widely used in the software engineering community and industry. Today it has become a widely accepted concept, reflected in the new roles appearing in software development organizations [18].

J. Bosch et al. [1] described a development procedure to design and assess the software architecture of a haemodialysis system with logical and dynamic views, along with various styles and scenarios; the main focus of that architecture was maintainability [19]. Two design approaches are used to build software architecture: top-down and bottom-up. It is better to choose the top-down approach over the bottom-up approach during architecture design and reengineering [10][11][4]. A software architecture built from scratch has little chance of being reused. For this purpose, the product line architecture approach to software development provides a solution that enables the systematic development of components for the reuse of software architecture [20].

J. Bosch et al. [10] evaluated software quality attributes, specifically maintainability. The most useful method for maintainability is the change scenario method, compared to other methods such as simulation, mathematical modeling and experience-based assessment; those methods were used for high performance and fault tolerance in software architecture [13][6]. To remove deficiencies and improve the quality attributes of a software architecture, there are five transformation techniques: impose an architectural style, apply an architectural pattern, use a design pattern, convert quality requirements to functionality, and distribute quality requirements [22][18][23][24]. After every transformation of the software architecture, a newer version is obtained with the same functionality but different quality attributes [10][25]. The traditional object-oriented design method was used for designing the software architecture [5].

Software developers face many challenges in meeting customer needs and expectations within a specific time and cost [13]. Z. Li et al. [26] discussed a method for estimating the implementation cost of software architecture systems using a divide-and-conquer approach. This cost and effort estimation can simplify and regulate the complicated development of software-based applications.

2.2 Software Platform and Prototype

The way software development groups use the word “platform” has transformed dramatically [27]. The influence of operating systems was so pervasive that software developers began to think of platforms as the particular operating systems their products run on. Taking Microsoft’s operating systems as one’s own platform, as some software organizations do, is not enough for a company to be efficient and effective in developing robust platform architectures [9][28]. Such an externally focused platform definition does not suffice for a software company to fully manage its own architecture. Management needs a comprehensive internal definition of its own software product platform [27][29][30].


The idea of prototyping in software development emerged around 1970 [31][32] as a reaction against the most traditional phase-oriented models (e.g. the life-cycle and waterfall models) of how to develop software systems [33]. J. C. Hallet et al. [34] described prototypes in 1975 and presented a method to overcome the well-documented problems of software and maintainability during prototype development. J. S. Dong et al. [8] developed different software platforms by reusing software architectures and source code, and proposed a technique for reusing software architecture and managing platform quality through defect removal and continuous maintenance.

H. Algestam et al. [35] developed a component-based prototype and compared the new architecture with the existing architecture in terms of performance and maintainability. The experiment showed that maintainability cost was reduced by around 20%, while the performance of the prototype was lower when using one CPU in the component-based architecture. On a multiprocessor, the component-based prototype was very effective at increasing maintainability while preserving system performance [36]. M. Christensen et al. [37] developed the Dragon project as a series of evolutionary prototypes of a customer service system. The project progressed through two main stages, using an iterative and incremental development strategy [38].

2.3 Network Management System

Z. Yongjun et al. [39] described how the fast development of network technology meets the network requirements of every organization and provides reliable services from network devices while managing the complexity of the network. A management information system can use a request/response mode to ensure reliable and stable operation of a network management system using a web-based architecture [15]. In any system, quality attributes such as performance and maintainability are among the most versatile measures of a software architecture [40]. For example, quality attributes like performance and maintainability typically require explicit attention during development to achieve the required levels. Traditionally, software engineers have based their designs on assumed relations between design solutions and quality requirements [41]. CORBA is another technology used to build network management systems for intelligent systems [42].

2.4 Related Work

J. Bosch [20] presented the development of software product lines to reduce the cost of developing specific software components. This approach was applied, using the functionality-based architecture design method, to games (2D real-time strategy and 3D first-person shooter, with online play) and to web-based applications (e-commerce pages, corporate information pages and database applications).

M. H. Meyer et al. [27] described a software platform as a collection of subsystems and interfaces forming a common structure for developing software products. C. Bădică et al. [43] used the Java and C/C++ platforms to implement an agent-based system. J. S. Dong et al. [8] described the maintainability of software components and the side effects on other components caused by a change in one component. When a particular software component is changed, unplanned errors can occur in it, and the developer has to spend a lot of effort on maintenance.

R. Kazman et al. [14] described experimental scenarios for obtaining an understanding of real-world systems in various domains. SAAM (Software Architecture Analysis Method) [11] was used to evaluate different scenarios in order to analyze architectures and achieve maintainability. To evaluate a software architecture, SAAM forces key stakeholders to produce a relative rating among the various potentials of the system. In each session of the SAAM evaluation process, customer requirements are considered in the form of scenarios. This method has also been used to evaluate a global information system, air traffic control and a remote control system [14].

Software architecture evaluation has a great impact on the quality attributes. Evaluation of design transformations can be carried out several times until all requirements and quality attributes are attained [10]. Change scenarios are an important part of practicing architecture in order to gain more information about the software system with respect to a set of defined quality attributes, which are evaluated using SAAM [11][13][14][19][40]. Quality requirements such as performance and maintainability are generally not specified clearly in industrial requirement specifications. In some industrial projects [44], the initial requirement specification contained statements such as “The maintainability of the system should be as good as possible” and “The performance should be satisfactory for an average user”. Such subjective statements are well intended but of little use for the evaluation of software [6].

L. Dobrica et al. [45] described a set of software architecture evaluation methods and compared the performance of two or more competing software platforms [35][46][21]. Gulimanescu et al. [47] compared the structure, hardware functionality and software of a network management system based on the ATMEGA256 controller. Development consumes only about 20% to 40% of the cost of a whole software project; the remaining 60% to 80% is consumed by maintenance [15][48]. A system with poor maintainability is difficult to alter and update, so maintainability cost is a key parameter in developing a robust software system [49]. J. Bosch et al. [6] proposed a technique for analyzing the optimal maintainability of software based on different scenarios.

H. Grahn et al. [24] presented an evaluation of the performance characteristics and other quality attributes of three architectural styles. S. Balsamo et al. [50] proposed performance models to characterize the quantitative behavior of a system’s software architecture. Performance results such as response time under a given system workload are useful for interpreting the software design directly [51][52]. M. Marzolla et al. [53] addressed performance evaluation and prediction of software at the software architecture level.

N. Huang et al. [54] described a method for estimating the development cost and size of a software system in order to guide development during the implementation phase of the software life cycle. M. E. Helander et al. [55] described modelling work on cost distribution among software components under quantified system reliability.

J. Matton et al. [21] described the implementation of a service data point (SDP) prototype with an alternative architecture that reduces code size in order to optimize the performance and availability of the prototype as much as possible [56]. D. Dvorak et al. [57] added a new dimension to the requirements for software reliability and described a case study between NASA and JPL (Jet Propulsion Laboratory) on the NMAAP project (New Millennium Autonomy Architecture Prototype) for improving flight software architecture in prototype form. Xiaosong Wang et al. [15] proposed an extensive system that covers the demands of network management, including the system architecture, a prototype and the deployment method. X. Lu [58] built a heterogeneous telecommunication network management system based on the J2EE architecture.


3 RESEARCH METHODOLOGY

Research is defined by HEFCE (Higher Education Funding Council for England) as “original investigation undertaken in order to gain knowledge and understanding” [59]. In this thesis, two methods are used: a literature review and an experiment.

3.1 Literature review

According to Creswell [60], a literature review is “a written summary of journal articles, books and other papers that describes the past and current state of information, organizes the literature into topics and documents a need for a proposed study”.

The first phase of the research study is the literature review, which examines previous work to establish the current state of knowledge and prepare a foundation for new research to build on. The literature review consists of sorting, managing and assimilating the already available research articles [59]. We conducted a literature review on software architecture, software platforms, performance, maintainability and implementation cost [59]. The process for the literature review is shown in Figure 1.

Figure 1 Structural Diagram of Literature Review (research question → defining keywords → literature review search string → databases (IEEE, Inspec, Scopus) → selection criteria (inclusion/exclusion) → refine papers by reading title/abstract → apply snowball sampling → update list of articles → final outcome of LR)

3.1.1 Search String and Databases

The sources for literature were IEEE Xplore, Engineering Village (Inspec), Scopus (Compendex) and the university library at BTH. The search strategy formulates search strings from keywords.


3.1.1.1 Defining Keywords

The following keywords were defined for searching related articles in the different databases for research and deeper analysis of new knowledge:

“Implementation Cost, Maintainability, Network Management System, Performance, Software Architecture, Software Platform”

The search strings were formulated from these keywords, as shown below.

Table 1 Search String for Literature Review

((($Software $Platform) WN KY) AND (1969-2012 WN YR)) AND ((($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ((($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ((($Performance) WN KY) AND (1969-2012 WN YR))

((($Software $Architecture) WN KY) AND (1969-2012 WN YR)) AND ((($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ((($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ((($Performance) WN KY) AND (1969-2012 WN YR))

3.1.2 Article Selection Criteria

Kitchenham [61] described selection criteria for inclusion and exclusion. The articles were filtered for the research on the basis of the inclusion and exclusion criteria.

3.1.2.1 Inclusion criteria

Studies covering software architecture and software platforms

Articles from the various databases on software architecture, software platforms, maintainability, performance, implementation cost and network management systems

Studies covering software prototypes

Studies focusing on different software platforms and evaluation methods

Articles written in English

3.1.2.2 Exclusion criteria

Articles not related to software architecture and software platforms

Duplicate articles

Articles that are not peer reviewed

Articles that are not published

3.1.3 Snowball Sampling Method

Snowball sampling is defined by K. D. Bailey [62] as “a non-probabilistic form of sampling in which persons initially chosen for the sample are used as informants to locate other persons having necessary characteristics making them eligible for sample”. In this thesis, the snowball sampling method is used to follow references among the articles found in the literature review.

The basic literature review technique is searching for articles with keywords; snowball sampling [63] is also performed to decrease the chance of missing any relevant article. Snowball sampling is a chained study of articles, selected from the references of one article [59]. D. Waldorf et al. referred to snowball sampling as “chain referral sampling” [64]. We used the snowball sampling method in our research to develop a good scope and to avoid missing any important research articles related to our thesis.


3.2 Experiment

In second phase of the research study, experimental research is used to answer and support

our research question. Dawson [59] defined empirical research as “a research based on

observed and measured phenomena”. It reports research based on actual observations or

experiments are usually performed in development, evaluation and problem-solving projects

[60].

Implementing the three prototypes on different software platforms is a challenging task grounded in the supporting literature review. Three prototypes were built on different software platforms: PHP [65], JSP [66] and Perl [67]. The whole experiment is conducted as a step-by-step process, as shown in Figure 2. First, the software architecture is designed to check the possibilities for building the three prototypes. Each prototype was then evaluated with respect to maintainability, implementation cost, performance (response time) and the trade-offs and conflicts between them.

Figure 2 Structural Design of Experimental Process (literature review → software architecture design (Section 3.2.1) → prototype development (Section 3.2.2) → maintainability evaluation (Section 3.2.3), implementation cost calculation (Section 3.2.4) and performance (response time) measurement (Section 3.2.5) → final research outcome (results); inputs: industry requirements, change scenarios, design & evaluation methods)

3.2.1 Software Architecture Design Method for NMS

3.2.1.1 Software Architecture design

In general, software architecture can be defined as “a decomposition and structure of software components and how these system parts are interconnected to each other” [68][23][69].

According to J. Bosch [13], architecture design is the process of transforming a set of requirements into a software architecture that satisfies those requirements. The software design method is the activity of identifying the sub-systems of the software system. To design the software architecture, the functionality-based architecture design method is used [6]. This design method has been demonstrated to have a great impact on the assessment of different quality attributes.

3.2.1.2 Software Architecture Description

The basic architecture of the prototype, designed using the functionality-based architecture design method, is described in Figure 3 [13].

Figure 3 Structure of Software Architecture

The major components in Figure 3 are the database, web server, platforms, data acquisition, web browser and graphical user interface. We used the Apache web server, the most popular open-source cross-platform server, which runs all modules present in XAMPP (Apache HTTP Server, Tomcat, PHP, Perl and MySQL). All three prototypes are browser independent (Chrome, Firefox, Safari and Internet Explorer) [20]. The major functionalities are considered for the development of the prototypes. For this research the industrial partner decided to develop major functionalities that are identical in all three prototypes. The selected functionalities in each prototype are assessed with respect to the quality attributes performance, maintainability and implementation cost. Different operating systems, such as Windows and Linux, are considered for a platform-independent application [27].

3.2.2 Development of Prototype

In software development, a prototype is a basic model/structure of the software system [70]. A prototype is used to narrow down and refine the user’s requirements for the system [59].

The structural design of the software prototype comprises the processes shown in Figure 4. All three prototypes have the same functionality, but the quality attributes performance, maintainability and implementation cost differ between the prototypes on their different software platforms. The prototypes are web browser applications that allow the user to analyze the behavior of the network and any faults in it.

(Figure 3 components: graphical user interface (GUI), web browser, security, database management system, operating system (Windows / Linux / Mac OS X), web server, platform (PHP/Java/HTML/Perl), database connector, main functionality of the software, and data acquisition (probing devices / injecting data into the database through scripts, TCP/IP or COM port).)


Figure 4 Basic Prototype Development Process (initially identified requirements → requirement prioritization & selection → design → implementation → integration → testing)

Three prototypes have been developed on different software platforms. The first prototype uses PHP (Hypertext Preprocessor) [65], a language designed especially for web development; the Apache web server (XAMPP) [71], which is used by more organizations than any other web server according to a survey by Netcraft [72], shown in Figure 24 of Appendix G; and MySQL, the most popular open-source database because of its performance and reliability [73]. The second prototype uses JSP (Java Server Pages) [66], which provides a fast and simple way to create dynamic web content; the Tomcat web server [74], an open-source implementation of Java Server Pages technology; and MySQL [73]. The third prototype is based on Perl [67], a feature-rich programming language, together with MySQL and the Apache web server. The computer used for all three prototypes had an Intel Core i3 processor at 2.4 GHz and 2 GB RAM.

3.2.3 Maintainability

Maintainability of software is defined by the IEEE [75][69] as “The ease with which a software or component can be modified to correct faults, improve performance or other attributes, or adapt to a changed environment”. The definition gives rise to three main classes of maintenance: corrective, perfective and adaptive. The prediction method concentrates only on perfective and adaptive maintenance; it does not predict the effort required for corrective maintenance [76][75][13].

The scenario-based analysis method (change scenarios) is used to evaluate the maintainability of the software prototypes. The evaluation is done using SAAM, as shown in Figure 5, and to analyze the maintainability of the prototypes under different scenarios a survey of twenty-five industrial experts was conducted [19].


Figure 5 Maintainability Assessment of Software Prototype (software prototype design → list of scenarios → evaluation using change scenarios → assessment of maintainability → result of maintainability)

Survey-based research is used to identify the knowledge, attributes and behaviour of a large group of people and to support effective research results. The survey results for the maintainability evaluation were collected from twenty-five industrial experts in a highly efficient way [77][78]. Surveys are extensively used for specific problem solving and are an approach for collecting information about the problem under study, described in [79][80] as “A survey is data-gathering and analysis approach in which respondents answer the question that are developed in advance”.

The maintainability survey is based on the following steps [79][81]:

Description of prototypes and scenarios

Identify the target audience, both from industry and academia

Design the survey form

Approval of the survey by the advisors

Statistical analysis of the results

In every assessment, the change scenario method shown in Figure 5 is applied. The software prototype is described for the assessment of each scenario in order to determine the probability of each change scenario, which is used to compute the maintainability effort using the formula given in Equation 1 [6][82].

Effort(Pn) = Σ (i = 1 to Ks) Probability(i) × Effort(Pn, i)

Equation 1 Maintainability Effort Formula

Where

Pn = Prototype number

Ks = Number of scenarios

i = Scenario number

Probability (i) = Probability of change scenario (i) occurrence

Effort (Pn, i) = Effort to implement change scenario (i) in prototype Pn
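As an illustration, the probability-weighted sum of Equation 1 can be sketched in a few lines of Python; the scenario probabilities and effort figures below are invented placeholders, not measurements from the thesis:

```python
# Sketch of Equation 1: the predicted maintenance effort for a prototype Pn is
# the sum over all Ks change scenarios of Probability(i) * Effort(Pn, i).
# All numbers below are hypothetical placeholders.

def maintenance_effort(probabilities, efforts):
    """probabilities[i] = Probability(i); efforts[i] = Effort(Pn, i) in hours."""
    if len(probabilities) != len(efforts):
        raise ValueError("need one probability per scenario")
    return sum(p * e for p, e in zip(probabilities, efforts))

# Example: three change scenarios for one prototype.
probabilities = [0.5, 0.3, 0.2]   # chance that each scenario occurs
efforts = [10.0, 40.0, 25.0]      # hours to implement each scenario

print(maintenance_effort(probabilities, efforts))  # → 22.0
```

A prototype with lower expected effort over the same scenario set is then judged more maintainable, which is how the three prototypes are compared later.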

3.2.3.1 Scenario Description

Every scenario demonstrates the semantics of a software quality attribute and may be assigned an associated weight or probability of occurrence [6][19]. During the literature review we found many possible change scenarios, but for better development of the prototypes a few fundamental scenarios were selected with the help of the internal and external advisors, given in Table 2.


Table 2 Description of Each Scenario

Scenario No. | Scenario Description
S1 | Change of database (Microsoft SQL, PostgreSQL, Oracle, MongoDB, NoSQL or any other)
S2 | Change of web server (Tomcat, Blazix, JBoss, LAMP or any other)
S3 | Enhancing a GUI feature for any functionality in the main menu of the software
S4 | Change of operating system (Linux, Microsoft Windows and Microsoft Windows Server editions)
S5 | Changing the technique for probing the network and storing data into the database
S6 | Enhancing functionality for keeping backups of software data on cloud storage services or a dedicated storage server (redundancy)
S7 | Additional search capabilities (in the GUI of networks)
S8 | Web page upgrading (for example: multilingual support, meta-refresh, or any other)
S9 | Upgrade of database
S10 | Remote administration of NMS

3.2.4 Implementation cost

The implementation cost indicates how much time (in hours) is consumed in developing a software prototype successfully with all its functionality [20]. Figure 6 shows the three main phases of a prototype and how the implementation cost of each prototype is calculated: for each phase (design of the prototype, prototype development & implementation, and testing of the prototype), the time is recorded in hours and summed [13][20].

Figure 6 Implementation Cost for Prototypes (design of prototype (modelling estimated time) + prototype development & implementation + testing of prototype = total time in hours)
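The bookkeeping behind Figure 6 amounts to summing the recorded hours of the three phases; a minimal sketch follows, with invented example figures rather than the thesis's actual measurements:

```python
# Implementation cost per prototype = design + development & implementation
# + testing, each recorded in hours (Figure 6). Numbers are placeholders.

def implementation_cost(hours_by_phase):
    """Total implementation cost in hours for one prototype."""
    return sum(hours_by_phase.values())

php_prototype = {
    "design": 12.0,                          # modelling estimated time
    "development_and_implementation": 40.0,
    "testing": 8.0,
}

print(implementation_cost(php_prototype))    # → 60.0
```

Repeating this for each prototype also shows which phase dominates the total, which is the observation reported in the text.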

The same procedure is repeated to calculate the implementation cost of all three software prototypes. Throughout this process the authors observed which phase consumed the most time and effort during development.

3.2.5 Performance

Performance is defined [75] as “the degree to which a system or component accomplishes its designated functions within given constraints, such as speed, accuracy or memory usage”. Performance is about timing [23]: a request arrives at the system, the system processes it and generates a response. We measure performance in terms of response time; the whole process is shown in Figure 7. In each prototype the response time of a user request is measured in milliseconds.

Figure 7 Measuring Response Time for Prototypes (start: request arrives and the start time is recorded at the procedure call; the script/code executes to provide the service, including database access; the end time is recorded; response time = end − start)
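The timing procedure of Figure 7 can be sketched generically in Python; the thesis timed PHP, JSP and Perl pages, so the request handler below is only a hypothetical stand-in:

```python
import time

def measure_response_time(service, *args):
    """Return (result, elapsed_ms); response time = end - start, as in Figure 7."""
    start = time.perf_counter()     # start time, recorded at the procedure call
    result = service(*args)         # script/code execution, incl. database access
    end = time.perf_counter()       # end time
    return result, (end - start) * 1000.0

# Hypothetical stand-in for a prototype's request handler (not thesis code).
def handle_request(query):
    time.sleep(0.01)                # simulate database access and page rendering
    return f"response for {query}"

result, elapsed_ms = measure_response_time(handle_request, "device status")
print(f"{result!r} took {elapsed_ms:.1f} ms")
```

Wrapping the handler rather than instrumenting it keeps the measurement identical across the three platforms being compared.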

To measure performance we executed each prototype several times. For the comparative analysis we used an equal number of executions (thirty) for each prototype. The average and standard deviation [58] are used as references to compare the performance and maintainability of the three prototypes.

3.2.5.1 Average:

The mean or average is defined as the sum of all the given elements divided by the total number of elements [83]:

X̄ = (Σ Xi) / N

Equation 2 Average Formula

Where N is the number of observations and Σ Xi is the sum of all the observations.

3.2.5.2 Standard deviation:

The standard deviation measures the average distance of the observations from their average. The equation for the standard deviation [83] is given below:

S = sqrt( (1 / (n − 1)) · Σ (i = 1 to n) (Xi − X̄)² )

Equation 3 Standard Deviation Formula

Where S = standard deviation, Xi = value in the data set, X̄ = mean of the values, and n = number of values.
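Equations 2 and 3 can be computed directly from the thirty recorded response times; the sample values below are invented for illustration, not thesis data:

```python
import math

def mean(values):
    # Equation 2: sum of the elements divided by the number of elements.
    return sum(values) / len(values)

def sample_std_dev(values):
    # Equation 3: square root of the sum of squared deviations from the
    # mean, divided by n - 1 (sample standard deviation).
    m = mean(values)
    return math.sqrt(sum((x - m) ** 2 for x in values) / (len(values) - 1))

# Hypothetical response times in milliseconds (placeholders only).
response_times = [120, 130, 125, 135, 140]
print(mean(response_times))                      # → 130.0
print(round(sample_std_dev(response_times), 2))  # → 7.91
```

The n − 1 divisor is the sample form, appropriate here because the thirty executions are a sample of the prototype's possible response times.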


4 RESULTS & ANALYSIS

4.1 Results from the Literature Review

4.1.1 Search String Results from Databases

The internet is a valuable source for finding published articles through digital library resources [59]. The following digital libraries were selected for obtaining the literature review results.

Table 3 Databases Description

S. No. Name of Database Type of Database

01 Engineering Village Digital

02 Scopus Digital

03 IEEE Xplore Digital

The search strings were formulated from the keywords; the results from the various databases are shown in Table 4, and other strings with different strategies are included in Appendix A.

Table 4 Search String Results

Database | Search String | Results
Engineering Village Inspec | ((($software $architecture) WN KY) AND (1969-2012 WN YR)) AND ((($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ((($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ((($performance) WN KY) AND (1969-2012 WN YR)) | 76
Engineering Village Inspec | ((($software $platform) WN KY) AND (1969-2012 WN YR)) AND ((($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ((($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ((($performance) WN KY) AND (1969-2012 WN YR)) | 31
Scopus | (TITLE-ABS-KEY(software architecture) AND TITLE-ABS-KEY(performance) AND TITLE-ABS-KEY(maintainability) AND TITLE-ABS-KEY(implementation)) | 30
Scopus | (TITLE-ABS-KEY(software platform) AND TITLE-ABS-KEY(performance) AND TITLE-ABS-KEY(maintainability) AND TITLE-ABS-KEY(implementation)) | 12
IEEE Xplore | (((software architecture) AND maintainability) AND performance) | 106
IEEE Xplore | ((((software platform) AND maintainability) AND performance)) | 25

4.1.2 Conducting the Literature Review

Articles were selected from the various databases using the search strings and filtered according to the inclusion/exclusion criteria. The articles most relevant to our research question were retained, as shown in Table 5.


Table 5 Refinement of Articles from Databases

S. No. | Name of Database | Total Articles Found | Filtered by Reading Title/Abstract | Screening Full Text
01 | Engineering Village Inspec | 107 | 39 | 6
02 | Scopus | 42 | 17 | 2
03 | IEEE Xplore | 131 | 24 | 4
Sum of all articles | | 280 | 80 | 12

Total (A): 12
Duplicates (B): 2
Grand Total (A − B): 10

4.1.3 Snowball Sampling

4.1.3.1 Selection of Article

The snowball sampling method is a repetitive process of selecting references from articles [64], used here to develop a good scope for the research. Ten articles in total were selected from the different databases after filtering; we then performed snowball sampling on those articles and extracted five more relevant articles, given in Table 6.

Table 6 Snowball Sampling Selected Articles

S. No. | Description | Articles | Selected Articles
1 | Databases | 10 | [6] [8] [10] [35] [15] [19] [24] [70] [21] [84]
2 | Snowball | 5 | [1] [56] [12] [40] [26]
Total | | 15 |

4.1.3.1.1 Articles Selected for Study

During the literature review, 15 articles were found in the primary study. Each selected article is assigned an article number (A1–A15), and every research article is then summarized.

Table 7 List of Selected Articles

Article No. | Article / Paper | Reference No.

A1 | P. O. Bengtsson and J. Bosch, “Haemo dialysis software architecture design experiences,” in Proceedings of the 1999 International Conference on Software Engineering, 1999, pp. 516–525. | [1]

A2 | J. E. Bardram, H. B. Christensen and K. M. Hansen, “Architectural prototyping: an approach for grounding architectural design and learning,” in Software Architecture, 2004. WICSA 2004. Proceedings. Fourth Working IEEE/IFIP Conference on, 2004, pp. 15–24. | [56]

A3 | J. Bosch and P. Bengtsson, “Assessing optimal software architecture maintainability,” in Fifth European Conference on Software Maintenance and Reengineering, 2001, pp. 168–175. | [6]

A4 | J. S. Dong, K. Lee, K. H. Kim, S. T. Kim, J. M. Cho, and T. H. Kim, “Platform Maintenance Process for Software Quality Assurance in Product Line,” in Computer Science and Software Engineering, 2008 International Conference on, 2008, vol. 2, pp. 325–331. | [8]

A5 | P. Bengtsson and J. Bosch, “Scenario-based software architecture reengineering,” in Fifth International Conference on Software Reuse, 1998. Proceedings, 1998, pp. 308–317. | [10]

A6 | R. Kazman, L. Bass, G. Abowd, and M. Webb, “SAAM: a method for analyzing the properties of software architectures,” in 16th International Conference on Software Engineering, 1994. Proceedings. ICSE-16, 1994, pp. 81–90. | [12]

A7 | H. Algestam, M. Offesson, and L. Lundberg, “Using components to increase maintainability in a large telecommunication system,” in Software Engineering Conference, 2002. Ninth Asia-Pacific, 2002, pp. 65–73. | [35]

A8 | Xiaosong Wang, Li Wang, Benhai Yu, and Guixue Dong, “Studies on Network Management System framework of Campus Network,” in 2010 2nd International Asia Conference on Informatics in Control, Automation and Robotics (CAR), 2010, vol. 2, pp. 285–289. | [15]

A9 | L. Lundberg, J. Bosch, D. Haggander, and P. O. Bengtsson, “Quality attributes in software architecture design,” in Proceedings of SEA’99: 3rd Annual IASTED International Conference on Software Engineering and Applications, 6–8 Oct. 1999, Anaheim, CA, USA, 1999, pp. 353–62. | [40]

A10 | Z. Li and J. Keung, “Software Cost Estimation Framework for Service-Oriented Architecture Systems Using Divide-and-Conquer Approach,” in Service Oriented System Engineering (SOSE), 2010 Fifth IEEE International Symposium on, 2010, pp. 47–54. | [26]

A11 | P. Bengtsson and J. Bosch, “Architecture level prediction of software maintenance,” in Proceedings of the Third European Conference on Software Maintenance and Reengineering, 1999, pp. 139–147. | [19]

A12

H. Grahn and J. Bosch, “Some initial performance characteristics of three

architectural styles,” in Proceedings of the 1st international workshop on

Software and performance, New York, NY, USA, 1998, pp. 197–198.

[24]

A13

G. Ortiz, B. Bordbar, and J. Hernandez, “Evaluating the use of AOP and

MDA in Web service development,” in 2008 3rd International Conference

on Internet and Web Applications and Services, 8-13 June 2008, Piscataway,

NJ, USA, 2008, pp. 78–83.

[70]

A14

D. Haggander, L. Lundberg, and J. Matton, “Quality attribute conflicts -

experiences from a large telecommunication application,” in Engineering of

Complex Computer Systems, 2001. Proceedings. Seventh IEEE International

Conference on, 2001, pp. 96 –105.

[21]

A15

K. Bennett, M. Munro, J. Xu, N. Gold, P. Layzell, N. Mehandjiev, D.

Budgen, and P. Brereton, Prototype Implementations of an Architectural

Model for Service-Based Flexible Software. 2002.

[84]

P. O. Bengtsson et al. [A1] redesigned an existing dialysis-machine system that was hard to maintain. They implemented a prototype that fulfils the quality attributes and evolved the prototype into a complete system [1]. J. E. Bardram et al. [A2] discussed architectural prototypes and their qualities as a way to experiment with alternative architectural styles, patterns and features. Furthermore, they demonstrated the approach by developing a prototype of a high-performance, globally distributed customer service system [56].

J. Bosch et al. [A3] discussed maintenance effort and the impact of scenario profiles on an architecture in order to calculate a theoretical minimum maintainability effort. These results were used for scenario-based maintainability assessment of software architectures [6]. J. S. Dong et al. [A4] developed different software platforms by reusing software architectures and source code. They proposed a technique for reusing software architecture and managing platform quality attributes through defect removal and continuous maintenance [8].


P. O. Bengtsson et al. [A5] addressed the quality attributes of software architecture while introducing a method for reengineering software architectures. The main focus of the research was maintainability and reusability for domain-specific software architectures [10]. R. Kazman et al. [A6] discussed the aspects that require attention when evaluating a software architecture. The SAAM method was used to describe and analyse software architectures with respect to maintainability [12]. H. Algestam et al. [A7] designed and implemented a prototype with a new component-based architecture in order to compare it with the existing architecture [35].

Xiaosong Wang et al. [A8] proposed a comprehensive web-service-based network management system design that covers the demands of network management, including the system architecture, a prototype and the deployment method. The major components in the design are data acquisition, performance management and configuration management. The architecture was an integrated network management system based on the LAMP platform [15].

L. Lundberg et al. [A9] reported experiences from five industrial applications. The experience gained from the five projects shows that performance and modifiability are affected by the implementation techniques used [40]. Z. Li et al. [A10] described a framework based on a divide-and-conquer approach for cost estimation of service-oriented-architecture-based software. The framework was able to overcome the hurdles organizations face when dealing with separate parts of an architecture [26].

P. Bengtsson et al. [A11] presented a method for maintainability prediction of software architectures using the requirements specification and the architecture design. The method is suitable for design processes that iterate frequently, with the architecture evaluated in each iteration [19].

H. Grahn et al. [A12] are concerned with some quality attributes and software design styles. The architecture is described with components and direct interconnections in order to obtain the best performance data [24]. G. Ortiz et al. [A13] presented a case study evaluating the use of MDA and AOP in the development of web services, using metric measurements to observe whether the results generated by these techniques are traceable and modular [70].

D. Haggander et al. [A14] discussed applications that must provide high performance, availability and maintainability in order to reduce cost. It is not always possible to maximize every quality in a design, so making trade-offs is necessary, and a major challenge in software design is to find solutions that balance and optimize the quality attributes [21]. K. Bennett et al. [A15] discussed service architecture during the implementation of two prototypes with different technologies, scripting and e-Speak. After the implementation they found that functional and non-functional attributes of the services must be negotiated [84].


4.2 Prototyping

4.2.1 Prototype 1

Prototype 1 was developed and implemented using the PHP platform, as shown in Figure 21 in appendix B. The prototype's performance in terms of response time, implementation cost in hours and maintainability is shown in Tables 8, 9 and 10 respectively.

4.2.1.1 Performance (Response Time)

Three main functionalities (test cases) were selected by the industrial partner for measuring the response time, and the analysis of the whole prototype's performance is based on these functionalities/test cases, shown in Table 8 with the average and standard deviation. The response time was measured per user request. The prototype was executed many times, but for the statistical analysis [83] thirty executions were considered; the recorded response time (in milliseconds) of each execution is given in appendix B, Table 23, and plotted in Figure 8.
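As a rough sketch of how a per-request timing harness of this kind can be set up (the function names and the stub request below are our own illustrative placeholders, not part of the thesis tooling), the thirty-execution average and standard deviation can be computed like this:

```python
import statistics
import time

def measure_response_times(request_fn, executions=30):
    """Time request_fn once per execution and return the times in milliseconds."""
    times_ms = []
    for _ in range(executions):
        start = time.perf_counter()
        request_fn()  # one user request, e.g. an HTTP call to the prototype
        times_ms.append((time.perf_counter() - start) * 1000.0)
    return times_ms

def view_all_networks():
    # Placeholder standing in for the real "View All Network" request.
    time.sleep(0.001)

samples = measure_response_times(view_all_networks)
avg = statistics.mean(samples)
sd = statistics.stdev(samples)  # sample standard deviation (n - 1)
print(f"executions={len(samples)}, average={avg:.2f} ms, std dev={sd:.4f}")
```

Note that `statistics.stdev` uses the sample (n - 1) formula, which is the usual choice when a standard deviation is reported over a set of observed executions.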

Table 8 Response Time of Prototype 1

Functionalities/test cases   Total no. of executions   Average response time (ms)   Standard deviation
View All Network             30                        6.74                         1.1622
Status of Network            30                        6.86                         1.4755
Search Network               30                        7.63                         1.4370

Figure 8 Response Time of Prototype 1

Figure 8 shows the response time of prototype 1 for the three test cases: view all networks, search networks and status of networks. The x-axis shows the execution number and the y-axis the time in milliseconds. From the graph we observe that the result of each functionality differs between executions of the prototype.

4.2.1.2 Implementation Cost

The total implementation cost in terms of time for completing this prototype is 78 hours, which includes the design, development and testing of the prototype. Table 9 describes the implementation cost of this prototype at each stage, and a pie chart is shown in Figure 9.

Table 9 Implementation Cost of Prototype 1

Stages/Phases           Prototype 1 (PHP) cost in hours
Prototype design        10
Prototype development   60
Testing of prototype    8
Total time (hours)      78

Figure 9 Implementation Cost of Prototype 1

4.2.1.3 Maintainability

The change-scenario effort for this prototype was obtained from a survey conducted both in industry and academia. Survey results were collected from twenty-five industrial experts for the maintainability evaluation, and the result of each survey is given in appendix E. After receiving each respondent's survey, the statistical operations [83][6] were performed for each scenario of the prototype, using the scenarios given in Chapter 3, Table 2. Efforts were then calculated with the maintainability effort formula (equation 1) in Chapter 3, and the calculated maintainability effort is recorded in Table 22 in appendix B. The average maintainability over the twenty-five surveys was then calculated from Table 22; the averages are shown in Table 10.
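The averaging step over the respondents can be sketched as follows; the twenty-five effort values below are made-up illustrations, not the actual survey data from appendix E:

```python
import statistics

# Hypothetical per-respondent maintainability efforts (one value per survey,
# assumed already computed with the effort formula of Chapter 3); 25 respondents.
survey_efforts = [62, 95, 71, 88, 104, 55, 79, 83, 90, 67,
                  76, 99, 58, 85, 72, 81, 93, 64, 110, 70,
                  87, 60, 96, 78, 74]

avg_effort = statistics.mean(survey_efforts)
std_effort = statistics.stdev(survey_efforts)
print(f"surveys={len(survey_efforts)}, average={avg_effort:.3f}, "
      f"std dev={std_effort:.6f}")
```

The same mean/standard-deviation pair is what Tables 10, 13 and 16 report for the three prototypes.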

Table 10 Maintainability of Prototype 1

Prototype           Total no. of surveys   Average maintainability effort   Standard deviation
Prototype 1 (PHP)   25                     80.920                           19.349677

Figure 10 shows the maintainability of prototype 1 based on the results in Table 22. The x-axis shows the survey number and the y-axis the effort required to implement the change scenarios for the software prototype.


Figure 10 Maintainability of Prototype 1

4.2.2 Prototype 2

Prototype 2 was developed and implemented using the JSP platform, as shown in Figure 22 in appendix B. The prototype's performance in terms of response time, implementation cost and maintainability is given in Tables 11, 12 and 13 respectively.

4.2.2.1 Performance (Response Time)

The response time was measured in milliseconds for the three selected functionalities per user request; the output of this prototype is shown in Table 11. The prototype was executed many times; for the statistical analysis thirty executions were considered, the same as for prototype 1, to allow comparison among the prototypes. The recorded response time of each execution is shown in appendix B, Table 24, and in Figure 11.

Table 11 Response Time of Prototype 2

Functionalities/test cases   Total no. of executions   Average response time (ms)   Standard deviation
View All Network             30                        13.80                        2.8696
Status of Network            30                        10.83                        2.3057
Search Network               30                        14.03                        2.5527

Figure 11 Response Time of Prototype 2


Figure 11 shows the response time of prototype 2 for view all networks, search networks and status of networks. The x-axis shows the execution number and the y-axis the time in milliseconds. From the graph we observe that the results vary between executions of this prototype.

4.2.2.2 Implementation Cost

Prototype 2 took 80 hours for the whole development process; the results are shown in Table 12 and Figure 12. The table describes the estimated implementation cost in hours of this prototype at every stage.

Table 12 Implementation Cost of Prototype 2

Stages/Phases           Prototype 2 (JSP) cost in hours
Prototype design        8
Prototype development   64
Testing of prototype    8
Total time (hours)      80

Figure 12 Implementation Cost of Prototype 2

4.2.2.3 Maintainability

Maintainability is calculated using the effort formula given in Chapter 3. The survey results from twenty-five industrial experts are shown in Figure 13 and in Table 24 in appendix B. The average maintainability effort of this prototype is shown in Table 13.

Table 13 Maintainability of Prototype 2

Prototype           Total no. of surveys   Average maintainability effort   Standard deviation
Prototype 2 (JSP)   25                     79.400                           16.258331

Figure 13 shows the maintainability effort of prototype 2 obtained from the surveys. The x-axis shows the survey number and the y-axis the effort required to implement the change scenarios for the prototype.


Figure 13 Maintainability of Prototype 2

4.2.3 Prototype 3

Prototype 3 was developed and implemented using the Perl platform, as shown in Figure 23 in appendix B. The prototype's performance in terms of response time, implementation cost and maintainability is shown in Tables 14, 15 and 16 respectively.

4.2.3.1 Performance (Response Time)

The response time was measured in milliseconds for the selected functionalities per user request; the results for prototype 3 are shown in Table 14. Prototype 3 was also executed many times; for the statistical analysis thirty executions were considered, the same as for prototypes 1 and 2, to allow comparison among the prototypes. The response time recorded for each execution is shown in appendix B, Table 27, and in Figure 14.

Table 14 Response Time of Prototype 3

Functionalities/test cases   Total no. of executions   Average response time (ms)   Standard deviation
View All Network             30                        6.90                         1.3132
Status of Network            30                        9.04                         1.6877
Search Network               30                        8.24                         1.3778

Figure 14 shows the response time of prototype 3 for view all networks, search networks and status of networks. The x-axis shows the execution number and the y-axis the time in milliseconds. From the graph we observe that the results vary between executions; thirty executions were considered for the performance comparison.


Figure 14 Response Time of Prototype 3

4.2.3.2 Implementation Cost

Prototype 3 took 58 hours for the whole development process; the results are shown in Table 15 and Figure 15. The table describes the estimated implementation cost in hours of prototype 3 at every stage.

Table 15 Implementation Cost of Prototype 3

Stages/Phases           Prototype 3 cost in hours
Prototype design        6
Prototype development   45
Testing of prototype    7
Total time (hours)      58

Figure 15 Implementation Cost of Prototype 3

4.2.3.3 Maintainability

The same strategy used for prototypes 1 and 2 was applied to calculate the maintainability effort of prototype 3. The result of each survey respondent is shown in Figure 16 and in Table 26 in appendix B. The average maintainability effort of this prototype is shown in Table 16.


Table 16 Maintainability of Prototype 3

Prototype            Total no. of surveys   Average maintainability effort   Standard deviation
Prototype 3 (Perl)   25                     81.400                           19.733643

Figure 16 shows the maintainability effort of this prototype obtained from each survey. The x-axis shows the survey number and the y-axis the effort required to implement the change scenarios for the prototype.

Figure 16 Maintainability of Prototype 3

4.3 Comparison and Summing up

4.3.1 Maintainability

We assessed the maintainability effort of each prototype from the surveys. The average and standard deviation of the maintainability effort were calculated for all three prototypes, and the comparison is given in Table 17. The results in Figure 17 show that prototype 2 has the lowest maintainability effort of the three prototypes, prototype 3 has the highest, and prototype 1 lies in between.

Table 17 Maintainability of all Three Prototypes

Maintainability           Prototype 1 PHP   Prototype 2 JSP   Prototype 3 Perl
Average maintainability   80.920            79.400            81.400
Standard deviation        19.349677         16.258331         19.733643


Figure 17 Comparison of Maintainability

4.3.2 Implementation Cost

We recorded the implementation cost of each prototype in hours; the comparison is shown in Table 18 and graphed in Figure 18. The assessment shows the total implementation time each prototype took for design, development and testing. We observe that prototype 3 has the lowest implementation cost compared with prototypes 1 and 2, whereas the implementation costs of prototypes 1 and 2 are approximately the same in hours but differ in lines of code.

Table 18 Implementation Cost of all Three Prototypes

Stages/Phases           Prototype 1   Prototype 2   Prototype 3
Design of prototype     10            8             6
Prototype development   60            64            45
Testing of prototype    8             8             7
Total time (hours)      78            80            58
Lines of code           1.5 kloc      1.2 kloc      1.35 kloc
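One back-of-the-envelope way to relate the two measures (this hours-per-kloc ratio is our own illustration, not a metric used in the thesis) is to divide the total hours by the code size from Table 18:

```python
# Total hours and code size (kloc) per prototype, taken from Table 18.
prototypes = {
    "Prototype 1 (PHP)":  (78, 1.5),
    "Prototype 2 (JSP)":  (80, 1.2),
    "Prototype 3 (Perl)": (58, 1.35),
}
for name, (hours, kloc) in prototypes.items():
    print(f"{name}: {hours / kloc:.1f} hours/kloc")
```

On this view prototype 2's slightly higher total cost is partly explained by the fact that it also has the least code.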

Figure 18 Comparison of Implementation Cost


4.3.3 Performance

The response times used for the performance comparison of the prototypes are shown in Figure 19. Each prototype was executed thirty times to measure the average response time. Table 19 shows the performance results of all prototypes. The results show that prototype 1, on the PHP platform, is the fastest of the three with respect to all three major functionalities/test cases considered for evaluating performance. The lowest performance is that of prototype 2, on the JSP platform. Prototype 3, on the Perl platform, has intermediate performance: it is faster than prototype 2 but slower than prototype 1.

Table 19 Response Time of all Three Prototypes

Avg. response time in milliseconds
S. No.   Functionalities/test cases   Prototype 1 PHP   Prototype 2 JSP   Prototype 3 Perl
1        View All Network             6.74              13.80             6.90
2        Status of Network            6.86              10.83             9.04
3        Search Network               7.63              14.03             8.24

Figure 19 Comparison of Response Time

4.3.4 Quality Attributes Summing Up

The overall comparison of quality attributes across the different software platforms is shown in Table 20. The experience from each individual prototype shows that maintainability, performance and implementation cost are important quality attributes in a software system.

Table 20 Overall Comparison of Quality Attributes

Prototype number   Maintainability   Performance (avg. response time, ms)   Implementation cost (hours)
1                  80.92             7.63                                   78
2                  79.40             14.03                                  80
3                  81.40             8.24                                   58


The first prototype has the highest ranking with respect to performance, an intermediate ranking in maintainability, and an implementation cost close to that of the second prototype. The second prototype has the highest ranking in maintainability compared with the first and third prototypes, while it has the lowest performance and an implementation cost approximately equal to that of the first prototype. In the same fashion, the third prototype has the highest ranking in implementation cost and intermediate performance relative to the other two prototypes. In the final analysis, there is a trade-off that should be considered during deployment of any software in industry.


5 DISCUSSION

5.1 Verification, Validation and Discussion

Verification and validation (V&V) refer to two distinct sets of activities, although the two terms are frequently used interchangeably [85]. According to the IEEE standard [75], verification is "the process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase", and validation is "the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements" [85]. Verification and validation include evaluating the system when it is finally completed, while testing refers to exercising the program itself to see whether it works correctly or contains bugs/errors [22].

The main results of this study can be summarized as follows. The first prototype (PHP) has the highest ranking with respect to performance but an intermediate ranking in maintainability; its implementation cost is close to that of the second prototype (JSP), although both rank lowest in cost compared with the third prototype (Perl). The second prototype has the highest ranking in maintainability compared with the first and third prototypes, while it has the lowest ranking in performance and an intermediate ranking in implementation cost. In the same way, the third prototype has the highest ranking in implementation cost but the lowest in maintainability, and an intermediate ranking in performance compared with the other two prototypes. It is not always possible to maximize every quality attribute in a software system, hence making trade-offs is necessary. Similar findings were reported by D. Haggander et al. [21] during the implementation of a service data point prototype with an alternative architecture: there is a negotiation that needs to be considered during deployment of a software system. Under those circumstances, a major challenge in software design is to find solutions that balance and optimize the quality attributes.

To verify and validate all functionalities/test cases, we executed each prototype several times and checked the output of each execution to verify that all functionalities work correctly. In addition, to compare the performance of the three prototypes, an equal number (thirty) of observations was recorded for each prototype, and the average value and standard deviation were computed to characterize the performance (response time) of every prototype; these results were also validated by our industry partner. Our finding does not replicate the findings of Shenzhi Zhang et al. [51], who measured performance through distributed workload and response-time management for web applications; however, the set of experiments the authors conducted focused mainly on managing data from different shared resources.

At first glance our maintainability results do not appear to replicate the findings of J. Bosch et al. [19], who assessed the maintainability of the software architecture of a dialysis machine by synthesizing different scenarios, assigning a weight to each scenario, and estimating the size of all elements in terms of the volume of each component. Another researcher, J. S. Dong et al. [8], presented a detailed procedure for platform maintainability in terms of component modifications and their side effects on other components. In the light of these studies [8][19], we calculated the maintainability effort of each scenario on the different software platforms through survey research with twenty-five industrial experts, after which the statistical operations were performed on each survey using the maintainability effort formula mentioned in Chapter 3.


The average and standard deviation were calculated for the maintainability analysis of each prototype from the respondents' survey forms. We selected the one-way analysis of variance (ANOVA) [82][86] test for verification and validation of the maintainability results, using Microsoft Excel and SPSS [87] for Windows respectively. One-way ANOVA tests for a statistically significant difference between group means (averages) based on the group variances and sample sizes [82][86]. The verification and validation of the results from the one-way ANOVA test are included in appendix E. We calculated and compared the mean and standard deviation of all three prototypes, which verifies the results discussed in Chapter 4.
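The one-way ANOVA computation can be sketched in a few lines (a pure-Python F statistic; the effort values below are illustrative stand-ins, not the actual survey responses):

```python
import statistics

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over the given groups."""
    all_values = [v for g in groups for v in g]
    grand_mean = statistics.mean(all_values)
    k = len(groups)
    n = len(all_values)
    # Between-group sum of squares: spread of group means around the grand mean.
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of values around their own group mean.
    ss_within = sum(sum((v - statistics.mean(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical survey efforts for the three prototypes (five respondents each
# here for brevity; the thesis used twenty-five per prototype).
php_efforts  = [78, 85, 92, 70, 80]
jsp_efforts  = [76, 81, 88, 69, 83]
perl_efforts = [84, 79, 95, 72, 77]

f_stat = one_way_anova_f([php_efforts, jsp_efforts, perl_efforts])
print(f"F = {f_stat:.3f}")
```

When the group means coincide, the between-group sum of squares is zero and F is zero; larger F values indicate a greater difference between the prototype means relative to the within-group variance, which is then compared against the F distribution's critical value to decide significance.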

At first it was difficult to design the software prototypes because many procedures had to be learned before development. At the initial stage, implementation of the first prototype took a lot of time compared with the other prototypes; while developing it, we gained knowledge of and practice with different techniques, which lengthened the time to complete the prototype. This was an advantage when developing the remaining two prototypes and can be considered part of their evolution. Thus, there are many factors that can change the quality of a prototype.

Implementation cost is important to assess before developing or deploying any software system in industry. We developed three different prototypes, each with a different implementation cost. The implementation cost (in hours) recorded during each development phase of every prototype was verified against our schedule and records. The current study is consistent with the work of J. Bosch [20], who calculated the development cost of three products in numbers of weeks. Furthermore, Z. Li et al. [26] discussed the implementation cost of software architecture systems using a divide-and-conquer approach.

The dissimilarities among the three prototypes are due to a learning effect. We learned during the development of the first prototype, which took 78 hours, while the second prototype took 80 hours. The third prototype took only 58 hours, less than the first and second prototypes, because the authors had acquired adequate knowledge during the development of the other two.

The main results for the quality attributes (performance, maintainability and implementation cost) expected from the different prototypes are discussed here, and all quality attributes were achieved. Furthermore, for verification and validation of all functionalities and quality attributes, we deployed and demonstrated all three prototypes in industry (to Anders Grahn and Peter Engström of Smart Grid Networks AB, Karlskrona, Sweden). They verified and validated all required functionalities of each prototype with respect to the quality attributes of a network management system. After analysing each prototype, the industrial partner preferred the prototype that requires the least effort to update or modify.

Different organizations generally have different preferences when selecting the quality attributes of a network management system. Some corporations demand the best alternative in terms of performance, maintainability and implementation cost; other firms have very high expectations regarding all quality attributes in a single network management software system. Hence, the evaluated results can be used to analyse software prototypes on the basis of the performance, maintainability and implementation cost of different software platforms, which may help industry and academia to choose the best one according to their preferences. Consequently, there is a negotiation between different quality requirements in a software system based on a network management system.

A limitation of this study is that it is based on prototypes rather than full-fledged operational software, because the study addresses high-level rather than low-level design; the present study also does not deal with the hardware part of the industrial system (Smart Grid Networks AB). Conducting research on the software platforms of prototypes is of great advantage; research from the perspective of customers' usage of software architecture practices could be even more beneficial.

During this research, we studied large industrial applications in which performance, maintainability and implementation cost are important quality attributes. The conflicts between performance, maintainability and implementation cost are shown in this research; some of these conflicts are based on myths and misconceptions, while others are inherent.

5.2 Validity Threats

Validity assessment is a significant element of any research and should be considered in the design phase of a study. Wohlin et al. [88] classify four kinds of validity threats: internal validity, external validity, construct validity and conclusion validity. The validity threats considered during the current research work are described below:

5.2.1 Internal Validity Threats

Internal validity concerns difficulties that arise while conducting a research study. The factors that may affect the internal validity of the software prototypes are discussed below [88][89].

Initially, the authors formed different search queries from keywords to find research articles in different databases. The first iterations of the search strings did not retrieve all relevant articles, so the strings were refined over several iterations; this took extra time, but we learned a great deal in the process. We also applied the snowball sampling method to reduce this internal threat effectively.

To maintain the validity of this study, we first built one prototype and then built the other two after pre-testing the first. We learned much while developing the first prototype in industry and then implemented the other two prototypes with the same software architecture.

In order to calculate the maintainability effort of each prototype, we conducted a survey in both industry and academia. The core problem was selecting a target audience relevant to our study: it is a threat to validity if some of the selected respondents lack a complete picture of software prototypes on different platforms. To overcome this threat, respondents were selected on the basis of their work experience in software development. Moreover, our industrial partner helped us select participants from different organizations so as to obtain different perspectives on prototype maintainability.

5.2.2 External Validity Threats

External validity concerns whether our research can be generalized to other groups of companies or participants at different times and places [88]. If the results carry over to another study in an organization, and replicating the study yields the same results, generalizability is regarded as high. The results of the literature review might change if the search were repeated using the snowball sampling method.

The generality of the findings depends to some extent on the category of research that has been done. Results based on software prototypes are most likely to generalize to other software prototypes, since prototyping is a routine step in development at companies.


5.2.3 Construct Validity Threats

Construct validity concerns the relationship between theory and the observation of results [88]. In the current study, this type of threat applies to some aspects of the work: there is a chance that the search string did not reveal all related literature. To minimize this effect, our supervisor checked the search string and refined it.

5.2.4 Conclusion Validity Threats

Conclusion validity is associated with the conclusions drawn from the results of a research study. To eliminate this type of threat, the results obtained from all three prototypes were critically analyzed before any conclusion was suggested. To obtain appropriate maintainability results through the survey, the scenarios were designed on the basis of the literature review. Before the survey was conducted, feedback was taken from both the academic and the industrial supervisors, and the survey was improved repeatedly.


6 CONCLUSION

This thesis presents three software prototypes on different software platforms (PHP, JSP and Perl) to extract, represent and explore software quality attributes such as performance, maintainability and implementation cost. We have identified different challenges in software architecture and software prototyping when using different software platforms. Prototypes have been identified as an important aspect of software development in any organisation. The experiment was conducted using different software platforms in order to draw meaningful results in terms of quality attributes; the choice of software platform is therefore important when evaluating the quality attributes of software prototypes. We found significant differences between the quality attributes of the prototypes on the different software platforms.

RQ1. What is the maintainability for different software platforms in a network

management system?

Answer: The maintainability efforts of the three software platforms PHP, JSP and Perl are 80.92, 79.40 and 81.40 respectively. Each average value describes the effort required to maintain the software on that platform; a lower value indicates less maintenance effort. A survey of twenty industrial experts was conducted to calculate the effort for each platform.
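As a consistency check, the reported PHP figure can be reproduced from the individual respondent values listed in Appendix-B (Table 22). The short sketch below is our own illustration in Python, not part of the thesis tooling:

```python
import statistics

# Respondent maintainability-effort values for the PHP prototype (Table 22).
php_efforts = [59, 99, 72, 94, 72, 49, 108, 72, 66, 56, 64, 76, 87,
               60, 118, 105, 114, 99, 76, 78, 77, 100, 72, 90, 60]

avg = statistics.mean(php_efforts)   # average effort, reported as 80.920
std = statistics.stdev(php_efforts)  # sample standard deviation, reported as 19.349677

print(round(avg, 2), round(std, 2))  # 80.92 19.35
```

The same calculation, applied to the respondent values collected for the JSP and Perl prototypes, gives their average efforts.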

RQ2. What is the implementation cost for different software platforms in a network

management system?

Answer: The implementation costs of the three software platforms PHP, JSP and Perl are 78, 80 and 58 hours respectively. In the experimental phase, the implementation cost (number of hours) was recorded for the development of each prototype in three stages, and the total time taken to build each prototype on its software platform was then summed.

RQ3. What is the performance for different software platforms in a network management

system?

Answer: The PHP platform is the fastest, the JSP platform is the slowest, and Perl lies between the two. Performance was measured as response time for all three software platforms. Each prototype was executed a number of times; for the analysis, thirty executions of each prototype were considered and the average response time was calculated.
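The measurement procedure can be sketched as follows. This is an illustrative outline, not the thesis's measurement harness: `invoke_page` is a stand-in for requesting one of the prototype pages (for example, "View All Network"):

```python
import statistics
import time

def measure_response_time(invoke_page, runs=30):
    """Call a page-request function `runs` times; return (mean, stdev) of response times."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        invoke_page()  # in the experiment this would be an HTTP request to the prototype
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical stand-in for a real page request.
avg, std = measure_response_time(lambda: time.sleep(0.001))
print(f"average response time: {avg:.4f} s (stdev {std:.4f} s)")
```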

RQ4. What are the trade-offs and conflicts between performance, maintainability and implementation cost for different software platforms in a network management system?

Answer: The three prototypes exhibit the quality attributes at different significance levels; for each prototype, every attribute was classified as high, average or low. The first prototype (PHP) shows high performance, average maintainability and average implementation cost. The second prototype (JSP) has the same quality attributes at different significance levels: low performance, high maintainability and average implementation cost. The implementation costs of the first and second prototypes are approximately the same, both at the average significance level. Finally, the third prototype (Perl) shows average performance, low maintainability and low implementation cost. It is not always possible to achieve every quality attribute in a single software system; consequently, there is a trade-off between the quality attributes of the prototypes, and the choice of software platform is vital.


The main concern is the relationship between the quality attributes and the software platforms. Using these significance levels, we demonstrated how the quality attributes differ between platforms. If maintainability is the main concern, industry should select the second prototype; if performance is the main concern, the first prototype is the best alternative; and the main advantage of the third prototype is its low implementation cost. Which argument applies depends on the selection criteria an organization uses, that is, on which quality attributes and significance levels matter when developing or designing prototypes in various fields.
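The selection logic described above can be summarized in a small sketch of our own, built from the measured values reported in the answers to RQ1-RQ3 (response-time ranking, survey maintainability effort, and implementation hours):

```python
# Measured results per platform: performance rank (1 = fastest),
# maintainability effort from the survey, implementation cost in hours.
results = {
    "PHP":  {"performance_rank": 1, "maintainability_effort": 80.92, "hours": 78},
    "JSP":  {"performance_rank": 3, "maintainability_effort": 79.40, "hours": 80},
    "Perl": {"performance_rank": 2, "maintainability_effort": 81.40, "hours": 58},
}

def best_platform(concern):
    """Pick the platform that optimizes the given quality attribute."""
    keys = {
        "performance": lambda p: results[p]["performance_rank"],            # lower rank = faster
        "maintainability": lambda p: results[p]["maintainability_effort"],  # lower effort = easier to maintain
        "implementation cost": lambda p: results[p]["hours"],               # fewer hours = cheaper
    }
    return min(results, key=keys[concern])

print(best_platform("performance"))          # PHP  (first prototype)
print(best_platform("maintainability"))      # JSP  (second prototype)
print(best_platform("implementation cost"))  # Perl (third prototype)
```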


7 FUTURE WORK

In our research, we studied the notion of designing for quality attributes. Further research is required to understand how to select suitable quality attributes for a given problem. This research has great potential and can be extended in future work by using the same software platforms to measure other quality attributes such as utilization, throughput, availability, portability, reusability and marketability. Furthermore, a similar study is proposed as future work for other application fields, such as banking and telecommunications, with the same quality attributes that we have used.

In this section, some suggestions are provided for extending this research more deeply into the field of software architecture, specifically software architecture design and architectural styles. Different style methods can be used to attain quality attributes in software architecture; for example, which style method should be chosen to improve the maintainability, stability, implementation cost and performance of a given system? Expert assessment methods leave room for human bias, whereas structural assessment depends entirely on the software platform of the software system. We therefore see a need for a style and design method that exploits the benefits of software architecture. Although style and design methods already exist, how expert judgment and structural assessment should best be combined remains to be answered.


REFERENCES

[1] P. O. Bengtsson and J. Bosch, “Haemo dialysis software architecture design

experiences,” in Proceedings of the 1999 International Conference on Software

Engineering, 1999, pp. 516–525.

[2] D. Andrzej, “Visual modeling for scientific software architecture design. A practical

approach,” Computer Physics Communications, vol. 183, no. 2, pp. 213–230, Feb. 2012.

[3] C. López, V. Codocedo, H. Astudillo, and L. M. Cysneiros, “Bridging the gap between

software architecture rationale formalisms and actual architecture documents: An

ontology-driven approach,” Science of Computer Programming, vol. 77, no. 1, pp. 66–

80, Jan. 2012.

[4] J. Bosch, “Software Architecture: The Next Step,” in Software Architecture, vol. 3047,

F. Oquendo, B. C. Warboys, and R. Morrison, Eds. Berlin, Heidelberg: Springer Berlin

Heidelberg, 2004, pp. 194–199.

[5] P. B. Kruchten, “The 4+1 View Model of architecture,” IEEE Software, vol. 12, no. 6,

pp. 42–50, Nov. 1995.

[6] J. Bosch and P. Bengtsson, “Assessing optimal software architecture maintainability,”

in Fifth European Conference on Software Maintenance and Reengineering, 2001, pp. 168–175.

[7] “Smart Grid Networks.” [Online]. Available: http://smartgridnetworks.net/. [Accessed:

23-Jan-2012].

[8] J. S. Dong, K. Lee, K. H. Kim, S. T. Kim, J. M. Cho, and T. H. Kim, “Platform

Maintenance Process for Software Quality Assurance in Product Line,” in Computer

Science and Software Engineering, 2008 International Conference on, 2008, vol. 2, pp.

325 –331.

[9] K. Popovici, X. Guerin, F. Rousseau, P. S. Paolucci, and A. Jerraya, “Efficient Software

Development Platforms for Multimedia Applications at Different Abstraction Levels,”

in Rapid System Prototyping, 2007. RSP 2007. 18th IEEE/IFIP International Workshop

on, 2007, pp. 113 –122.

[10] P. Bengtsson and J. Bosch, “Scenario-based software architecture reengineering,” in

Fifth International Conference on Software Reuse, 1998. Proceedings, 1998, pp. 308–

317.

[11] Len Bass, “Library | Recommended Best Industrial Practice for Software Architecture

Evaluation.” [Online]. Available:

http://www.sei.cmu.edu/library/abstracts/reports/96tr025.cfm?DCSext.abstractsource=S

earchResults. [Accessed: 13-May-2012].

[12] R. Kazman, L. Bass, G. Abowd, and M. Webb, “SAAM: a method for analyzing the

properties of software architectures,” in 16th International Conference on Software

Engineering, 1994. Proceedings. ICSE-16, 1994, pp. 81–90.

[13] J. Bosch, Design and use of software architectures: adopting and evolving a product-

line approach. Pearson Education, 2000.

[14] R. Kazman, G. Abowd, L. Bass, and P. Clements, “Scenario-based analysis of software

architecture,” IEEE Software, vol. 13, no. 6, pp. 47–55, Nov. 1996.

[15] Xiaosong Wang, Li Wang, Benhai Yu, and Guixue Dong, “Studies on Network

Management System framework of Campus Network,” in 2010 2nd International Asia

Conference on Informatics in Control, Automation and Robotics (CAR), 2010, vol. 2, pp.

285–289.

[16] M. Kona and C.-Z. Xu, “A Framework for Network Management Using Mobile

Agents,” in Proceedings of the 16th International Parallel and Distributed Processing

Symposium, Washington, DC, USA, 2002.

[17] E. Sandewall, C. Strömberg, and H. Sörensen, “Software architecture based on

communicating residential environments,” in Proceedings of the 5th international

conference on Software engineering, Piscataway, NJ, USA, 1981, pp. 144–152.


[18] D. Garlan and M. Shaw, “An introduction to software architecture,” in Advances in

Software Engineering and Knowledge Engineering, 1993, pp. 1–39.

[19] P. Bengtsson and J. Bosch, “Architecture level prediction of software maintenance,” in

Proceedings of the Third European Conference on Software Maintenance and

Reengineering, 1999, pp. 139–147.

[20] J. Bosch, “Software Product Lines - Three Examples - Electronic Research Archive @

Blekinge Institute of Technology (BTH).” [Online]. Available:

http://www.bth.se/fou/forskinfo.nsf/alfs/800f8fcdd0d40c6ac12568d300415fe3.

[Accessed: 13-Mar-2012].

[21] D. Haggander, L. Lundberg, and J. Matton, “Quality attribute conflicts - experiences

from a large telecommunication application,” in Engineering of Complex Computer

Systems, 2001. Proceedings. Seventh IEEE International Conference on, 2001, pp. 96 –

105.

[22] C. Chassot, K. Guennoun, K. Drira, F. Armando, E. Exposito, and A. Lozes,

“Architecture transformation and refinement for model-driven adaptability management:

application to QoS provisioning in group communication,” in Software Architecture.

Third European Workshop, EWSA 2006. Revised Selected Papers, 4-5 Sept. 2006,

Berlin, Germany, 2006.

[23] L. Bass, P. Clements, and R. Kazman, Software Architecture in Practice, 2nd ed.

Boston, MA, USA: Addison-Wesley Longman Publishing Co., Inc., 2003.

[24] H. Grahn and J. Bosch, “Some initial performance characteristics of three architectural

styles,” in Proceedings of the 1st international workshop on Software and performance,

New York, NY, USA, 1998, pp. 197–198.

[25] D. Häggander, P. Bengtsson, J. Bosch, and L. Lundberg, “Maintainability Myth Causes

Performance Problems in SMP Application,” in Proceedings of the Sixth Asia Pacific

Software Engineering Conference, Washington, DC, USA, 1999.

[26] Z. Li and J. Keung, “Software Cost Estimation Framework for Service-Oriented

Architecture Systems Using Divide-and-Conquer Approach,” in Service Oriented

System Engineering (SOSE), 2010 Fifth IEEE International Symposium on, 2010, pp. 47

–54.

[27] M. H. Meyer and R. Seliger, “Product Platforms in Software Development,” Sloan

Management Review, vol. 40, no. 1, pp. 61–74, Fall 1998.

[28] M. Kolland, T. Mehner, and K.-J. Kuhn, “Software platforms-more than a buzzword,”

Wirtschaftsinformatik, vol. 35, no. 1, pp. 23–31, Feb. 1993.

[29] P. Clements and L. Northrop, Software Product Lines: Practices and Patterns.

Addison-Wesley, 2002.

[30] G. Blair, L. Blair, V. Issarny, P. Tuma, and A. Zarras, “The Role of Software

Architecture in Constraining Adaptation in Component-Based Middleware Platforms,”

in Middleware 2000, vol. 1795, J. Sventek and G. Coulson, Eds. Springer Berlin /

Heidelberg, 2000, pp. 164–184.

[31] J. D. Naumann and A. M. Jenkins, “Prototyping: The New Paradigm for Systems

Development,” MIS Quarterly, vol. 6, no. 3, pp. 29–44, 1982.

[32] L. Bally, J. Brittan, and K. H. Wagner, “A prototype approach to information system

design and development,” Information & Management, vol. 1, no. 1, pp. 21–26, 1977.

[33] P. Mogensen, “Towards a Provotyping Approach in Systems Development,”

Scandinavian Journal of Information Systems, vol. 4, pp. 31–53, 1992.

[34] J. C. Hall and D. L. Hemmel, “Digital Flight Control: An Approach to Efficient

Design,” IEEE Transactions on Aerospace and Electronic Systems, vol. AES-11, no. 5,

pp. 816–828, Sep. 1975.

[35] H. Algestam, M. Offesson, and L. Lundberg, “Using components to increase

maintainability in a large telecommunication system,” in Software Engineering

Conference, 2002. Ninth Asia-Pacific, 2002, pp. 65 – 73.

[36] H. Hermansson, M. Johansson, and L. Lundberg, “A distributed component architecture

for a large telecommunication application,” in Software Engineering Conference, 2000.

APSEC 2000. Proceedings. Seventh Asia-Pacific, 2000, pp. 188 –195.


[37] M. Christensen, A. Crabtree, C. Damm, K. Hansen, O. Madsen, P. Marqvardsen, P.

Mogensen, E. Sandvad, L. Sloth, and M. Thomsen, “The M.A.D. experience:

Multiperspective application development in evolutionary prototyping,” in ECOOP’98

— Object-Oriented Programming, vol. 1445, E. Jul, Ed. Springer Berlin / Heidelberg,

1998, pp. 13–40.

[38] J. E. Bardram, H. B. Christensen, and K. M. Hansen, “Architectural prototyping: an

approach for grounding architectural design and learning,” in Software Architecture,

2004. WICSA 2004. Proceedings. Fourth Working IEEE/IFIP Conference on, 2004, pp.

15 – 24.

[39] Z. Yongjun and J. Dingfu, “Web-Based Network Management System Revolving about

Database,” in Business and Information Management, 2008. ISBIM ’08. International

Seminar on, 2008, vol. 2, pp. 263 –266.

[40] L. Lundberg, J. Bosch, D. Haggander, and P.-O. Bengtsson, “Quality attributes in

software architecture design,” in Proceedings of SEA’99: 3rd Annual IASTED

International Conference on Software Engineering and Applications, 6-8 Oct. 1999,

Anaheim, CA, USA, 1999, pp. 353–362.

[41] Y. Tonoshima, T. Kanai, and M. Azuma, “Common platform for network operation

systems,” Fujitsu, vol. 50, no. 4, pp. 281–290, 1999.

[42] Li-wei Tian and Chao-wan Yin, “Intelligent network management system based on

CORBA: architecture and implementation,” Mini-Micro Systems, vol. 23, no. 7, pp.

810–813, Jul. 2002.

[43] C. Bădică, Z. Budimac, H.-D. Burkhard, and M. Ivanovic, “Software agents:

Languages, tools, platforms,” Computer Science and Information Systems, vol. 8, no. 2,

pp. 255–298, 2011.

[44] A. Alebrahim, D. Hatebur, and M. Heisel, “Towards Systematic Integration of Quality

Requirements into Software Architecture,” in Software Architecture, vol. 6903, I.

Crnkovic, V. Gruhn, and M. Book, Eds. Berlin, Heidelberg: Springer Berlin Heidelberg,

2011, pp. 17–25.

[45] L. Dobrica and E. Niemela, “A survey on software architecture analysis methods,”

IEEE Transactions on Software Engineering, vol. 28, no. 7, pp. 638– 653, Jul. 2002.

[46] S. Balsamo, P. Inverardi, and C. Mangano, “An approach to performance evaluation of

software architectures,” in Proceedings of the 1st international workshop on Software

and performance, New York, NY, USA, 1998, pp. 178–190.

[47] G. Valentin, C. Victor, and B. Pompiliu, “ATMEGA256-based data network

management software architecture,” in International Symposium on Signals, Circuits

and Systems, 2009. ISSCS 2009, 2009, pp. 1–4.

[48] K. I. M. Jae-Young, J. U. Hong-Taek, H. J. Won-Ki, K. I. M. Seong-Beom, and H.

Chan-Kyou, “Towards TMN-Based Integrated Network Management Using CORBA

and Java Technologies (Special Issue on New Paradigms in Network Management),”

IEICE transactions on communications, vol. 82, no. 11, pp. 1729–1741, Nov. 1999.

[49] I. A. El-Maddah, W. Abdelmoez, and H. H. Ammar, “Towards Maintainable

Architecture for Process Control Applications,” in IEEE/ACS International Conference

on Computer Systems and Applications, 2007. AICCSA ’07, 2007, pp. 59–66.

[50] S. Balsamo, M. Marzolla, A. Di Marco, and P. Inverardi, “Experimenting different

software architectures performance techniques: a case study,” SIGSOFT Softw. Eng.

Notes, vol. 29, no. 1, pp. 115–119, Jan. 2004.

[51] S. Zhang, H. Wu, W. Wang, B. Yang, P. Liu, and A. V. Vasilakos, “Distributed

workload and response time management for web applications,” in Network and Service

Management (CNSM), 2011 7th International Conference on, 2011, pp. 1 –9.

[52] S. Balsamo, R. Mamprin, and M. Marzolla, “Performance evaluation of software

architectures with queuing network models,” in Modelling and Simulation 2004. The

European Simulation and Modelling Conference 2004. ESM’2004, 25-27 Oct. 2004,

Ghent, Belgium, 2004, pp. 206–213.


[53] S. Balsamo and M. Marzolla, “Performance evaluation of UML software architectures

with multiclass Queueing Network models,” in Proceedings of the 5th international

workshop on Software and performance, New York, NY, USA, 2005, pp. 37–42.

[54] Hui Guan, Wei-Ru Chen, Ning Huang, and Hong-Ji Yang, “Estimation of Reliability

and Cost Relationship for Architecture-based Software,” International Journal of

Automation and Computing, vol. 7, no. 4, pp. 603–610, Nov. 2010.

[55] M. E. Helander, M. Zhao, and N. Ohlsson, “Planning models for software reliability

and cost,” Software Engineering, IEEE Transactions on, vol. 24, no. 6, pp. 420 –434,

Jun. 1998.

[56] J. E. Bardram, H. B. Christensen, and K. M. Hansen, “Architectural prototyping: an

approach for grounding architectural design and learning,” in Software Architecture,

2004. WICSA 2004. Proceedings. Fourth Working IEEE/IFIP Conference on, 2004, pp.

15 – 24.

[57] D. Dvorak, R. Rasmussen, G. Reeves, and A. Sacks, “Software architecture themes in

JPL’s Mission Data System,” in 2000 IEEE Aerospace Conference Proceedings, 2000,

vol. 7, pp. 259–268.

[58] X. Lu, “Construct heterogeneous telecommunication network management system

based on J2EE,” in Power Electronics and Intelligent Transportation System (PEITS),

2009 2nd International Conference on, 2009, vol. 1, pp. 278 –281.

[59] C. W. Dawson, Projects on computing and information systems: a student’s guide.

Pearson Education, 2005.

[60] J. W. Creswell, Qualitative inquiry and research design: choosing among five

traditions. Sage Publications, 1998.

[61] B. Kitchenham, O. Pearl Brereton, D. Budgen, M. Turner, J. Bailey, and S. Linkman,

“Systematic literature reviews in software engineering – A systematic literature review,”

Information and Software Technology, vol. 51, no. 1, pp. 7–15, Jan. 2009.

[62] K. D. Bailey, Methods of Social Research. Simon and Schuster, 1994.

[63] L. A. Goodman, “Snowball Sampling,” The Annals of Mathematical Statistics, vol. 32,

no. 1, pp. 148–170, Mar. 1961.

[64] P. Biernacki and D. Waldorf, “Snowball Sampling: Problems and Techniques of Chain

Referral Sampling,” Sociological Methods & Research, vol. 10, no. 2, pp. 141 –163,

Nov. 1981.

[65] “PHP: Hypertext Preprocessor.” [Online]. Available: http://www.php.net/. [Accessed:

17-Apr-2012].

[66] “JavaServer Pages Technology.” [Online]. Available:

http://www.oracle.com/technetwork/java/javaee/jsp/index.html. [Accessed: 03-May-

2012].

[67] “The Perl Programming Language - www.perl.org.” [Online]. Available:

http://www.perl.org/. [Accessed: 27-Mar-2012].

[68] D. Garlan and D. Perry, “Software architecture: practice, potential, and pitfalls,” in

Software Engineering, 1994. Proceedings. ICSE-16., 16th International Conference on,

1994, pp. 363 –364.

[69] P. Bourque and R. Dupuis, “Guide to the Software Engineering Body of Knowledge

2004 Version,” Guide to the Software Engineering Body of Knowledge, 2004. SWEBOK,

2004.

[70] G. Ortiz, B. Bordbar, and J. Hernandez, “Evaluating the use of AOP and MDA in Web

service development,” in 2008 3rd International Conference on Internet and Web

Applications and Services, 8-13 June 2008, Piscataway, NJ, USA, 2008, pp. 78–83.

[71] “apache friends - xampp.” [Online]. Available:

http://www.apachefriends.org/en/xampp.html. [Accessed: 01-Apr-2012].

[72] “Web Server Survey | Netcraft.” [Online]. Available:

http://news.netcraft.com/archives/category/web-server-survey/. [Accessed: 10-May-

2012].

[73] “MySQL :: Why MySQL?” [Online]. Available: http://www.mysql.com/why-mysql/.

[Accessed: 17-Apr-2012].


[74] Apache Software Foundation, “Apache Tomcat - Welcome!” [Online]. Available:

http://tomcat.apache.org/. [Accessed: 07-Apr-2012].

[75] “IEEE Standard Glossary of Software Engineering Terminology,” IEEE Std 610.12-

1990, p. 1, 1990.

[76] PerOlof Bengtsson, “Design and Evaluation of Software Architecture Electronic

Research Archive @ Blekinge Institute of Technology (BTH).” [Online]. Available:

http://www.bth.se/fou/forskinfo.nsf/all/3a18ded3a4a0876dc12568a3002cabfc.

[Accessed: 14-Mar-2012].

[77] D. C. Dawson, Projects in Computing and Information Systems: A Student’s Guide, 2nd

ed. Addison Wesley, 2009.

[78] Saunders, Research Methods for Business Students, 4th ed. Prentice Hall, Inc., 2007.

[79] Mark Kasunic, “Library | Designing an Effective Survey.” [Online]. Available:

http://www.sei.cmu.edu/library/abstracts/reports/05hb004.cfm?DCSext.abstractsource=

SearchResults. [Accessed: 25-Mar-2012].

[80] H. K. Baker, J. C. Singleton, and E. T. Veit, Survey Research in Corporate Finance:

Bridging the Gap Between Theory and Practice. Oxford University Press, 2011.

[81] F. J. Fowler, Improving Survey Questions: Design and Evaluation. SAGE, 1995.

[82] J. Demšar, “Statistical Comparisons of Classifiers over Multiple Data Sets,” J. Mach.

Learn. Res., vol. 7, pp. 1–30, Dec. 2006.

[83] A. Papoulis and S. U. Pillai, Probability, Random Variables, and Stochastic Processes.

McGraw-Hill, 2002.

[84] K. Bennett, M. Munro, J. Xu, N. Gold, P. Layzell, N. Mehandjiev, D. Budgen, and P.

Brereton, Prototype Implementations of an Architectural Model for Service-Based

Flexible Software. 2002.

[85] S. R. Rakitin, Software Verification and Validation for Practitioners and Managers.

Artech House, 2001.

[86] S. R. A. Fisher, Statistical methods and scientific inference. Oliver and Boyd, 1959.

[87] N. H. Nie, D. H. Bent, K. Steinbrenner, J. G. Jenkins, and C. H. Hull, SPSS; Statistical

Package for the Social Sciences. Second Edition. McGraw-Hill Book Company, New

York, N.Y., 1975.

[88] C. Wohlin, P. Runeson, and M. Höst, Experimentation in Software Engineering: An

Introduction, 1st ed. Springer, 1999.

[89] R. E. Millsap and A. Maydeu-Olivares, Eds., The SAGE Handbook of Quantitative

Methods in Psychology. Sage Publications Ltd, 2009.

[90] M. J. Kabir, Apache Server 2 Bible. John Wiley & Sons, 2002.

[91] “PHP vs ASP.net Performance Comparison.” [Online]. Available:

http://www.comentum.com/php-vs-asp.net-comparison.html. [Accessed: 25-Apr-2012].

[92] “Top 40 Website Programming Languages.” [Online]. Available:

http://rogchap.com/2011/09/06/top-40-website-programming-languages/. [Accessed: 27-

Apr-2012].


APPENDIX

Appendix-A: Search Strings

Table 21 List of Search Strings

IEEE:
((((software platform) AND maintainability) AND performance))

IEEE:
(((software architecture) AND maintainability) AND performance)

IEEE:
((((software architecture) AND maintainability) AND performance) AND cost)

Scopus:
(TITLE-ABS-KEY(software platform) AND TITLE-ABS-KEY(performance) AND TITLE-ABS-KEY(maintainability) AND TITLE-ABS-KEY(implementation))

Scopus:
(TITLE-ABS-KEY(software architecture) AND TITLE-ABS-KEY(performance) AND TITLE-ABS-KEY(implementation cost) AND TITLE-ABS-KEY(maintainability))

Scopus:
(TITLE-ABS-KEY(software architecture) AND TITLE-ABS-KEY(maintainability) AND TITLE-ABS-KEY(PERFORMANCE) OR TITLE-ABS-KEY(performance evaluation) OR TITLE-ABS-KEY(RESPONSE TIME))

Inspec:
((( (($software $platform) WN KY) AND (1969-2012 WN YR)) AND ( (($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ( (($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ( (($performance) WN KY) AND (1969-2012 WN YR))) AND )))

Inspec:
( (($software $architecture) WN KY) AND (1969-2012 WN YR)) AND ( (($Maintainability) WN KY) AND (1969-2012 WN YR)) AND ( (($Implementation $cost) WN KY) AND (1969-2012 WN YR)) AND ( (($performance) WN KY) AND (1969-2012 WN YR))

Inspec:
( (((((((($software $architecture) WN KY) AND (1969-2012 WN YR)) OR ((($software $architecture $design) WN KY) AND (1969-2012 WN YR))) AND (((($performance $evaluation) WN KY) AND (1969-2012 WN YR)) OR ((($performance) WN ALL) AND (1969-2012 WN YR))) AND (((($maintainability) WN KY) AND (1969-2012 WN YR)) OR ((($ARCHITECTURE $MAINTAINABILITY) WN KY) AND (1969-2012 WN YR))) AND ((((COST* $ESTIMATION) WN ALL) AND (1969-2012 WN YR)) OR ((($IMPLEMENTATION $TIME) WN ALL) AND (1969-2012 WN YR))) AND (((($Network $management $system) WN KY) AND (1969-2012 WN YR)) OR ((($Network $management* $system $software $architecture) WN KY) AND (1969-2012 WN YR))))) AND ({english} WN


Appendix-B: Prototypes

Prototype 1 (PHP):

Figure 20 Prototype 1 of Software Architecture: a user or client, through a web browser, accesses an Apache web server (XAMPP 1.7.4) running PHP 5.3.4; the server connects to a MySQL server (version 5.5.8), data acquisition is performed through a script, and the output is rendered in HTML and PHP.

Maintainability of Prototype 1

The formula for calculating maintainability is described in Chapter 3 (research methodology); the resulting value for each respondent is placed in Table 22.

Table 22 Maintainability Average for Prototype 1

Respondent  Maintainability value (Equation 4 effort formula)
1   59
2   99
3   72
4   94
5   72
6   49
7   108
8   72
9   66
10  56
11  64
12  76
13  87
14  60
15  118
16  105
17  114
18  99
19  76
20  78
21  77
22  100
23  72
24  90
25  60
Average value: 80.920    Standard deviation: 19.349677

Performance (Response Time) results of Prototype 1

Table 23 Prototype 1 Response Time

Execution number  View All Network  Search Network  Status of Network
1   8.1   9.6   7.5
2   6.5   6.3   6.1
3   6.1   6.8   5.7
4   7     8.4   9.3
5   6.6   10.8  5.6
6   5     6.7   8.1
7   5.7   7.4   9.6
8   6.5   8.2   6.3
9   5.5   9.8   6.4
10  6.5   8.7   8.5
11  5.8   7.6   8.3
12  5.8   9     9.6
13  7.6   8.2   8.6
14  6.1   9.6   5.9
15  5.9   7.4   9.2
16  9.3   6.6   7.1
17  6.5   8.3   8.8
18  6.7   9.9   5.8
19  8.9   8.9   6.6
20  5.4   8     4.8
21  9.1   7.4   6
22  8.2   6.1   5.9
23  6.8   6.8   5.3
24  6.1   7     5.7
25  6.3   6.2   5.9
26  6     5.7   6.2
27  5.8   5.6   5.2
28  7.2   6     6.5
29  8.8   5.7   5.4
30  6.4   6.4   5.9
AVG  6.74      7.63666667  6.86
STD  1.162221  1.43706678  1.475454
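The AVG and STD rows of Table 23 can be re-derived from the 30 executions; a quick check in Python on the "View All Network" column (that STD is the sample standard deviation is an assumption of this sketch):

```python
from statistics import mean, stdev

# "View All Network" column of Table 23 (30 executions, in seconds)
view_all = [8.1, 6.5, 6.1, 7, 6.6, 5, 5.7, 6.5, 5.5, 6.5,
            5.8, 5.8, 7.6, 6.1, 5.9, 9.3, 6.5, 6.7, 8.9, 5.4,
            9.1, 8.2, 6.8, 6.1, 6.3, 6, 5.8, 7.2, 8.8, 6.4]

avg = mean(view_all)   # average response time over the 30 executions
std = stdev(view_all)  # sample standard deviation
print(round(avg, 2), round(std, 3))
```

The same computation over the other two columns reproduces the remaining summary values.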


Prototype 2 (JSP):

Figure 21 Prototype 2 of Software Architecture (diagram: the user or client, through a web browser, accesses the Tomcat 7.0 web server running JSP (JavaServer Pages) on JDK 6; data acquisition from MySQL server version 5.5.8 is done through a script, and the output is returned in JSP)

Maintainability of Prototype 2

Table 24 Maintainability Average for Prototype 2

Respondent  Maintainability value (Equation 5 effort formula)
1   72
2   92
3   86
4   78
5   162
6   79
7   36
8   94
9   79
10  91
11  65
12  62
13  94
14  87
15  54
16  110
17  89
18  99
19  81
20  69
21  79
22  95
23  81
24  82
25  54
Average value: 79.400    Standard deviation: 16.2583312


Performance (Response Time) results of Prototype 2

Table 25 Prototype 2 Response Time

Execution number  View All Network  Search Network  Status of Network
1   19  14  10
2   15  10  10
3   20  13  9
4   19  15  11
5   18  11  9
6   12  18  8
7   14  17  12
8   12  19  11
9   10  11  9
10  13  15  10
11  13  11  11
12  15  15  8
13  11  14  9
14  16  17  11
15  11  12  10
16  18  11  14
17  13  12  13
18  12  16  11
19  13  11  13
20  16  14  7
21  12  16  18
22  14  12  10
23  10  13  9
24  14  13  13
25  11  14  14
26  12  17  13
27  16  19  9
28  11  16  9
29  10  12  11
30  14  13  13
AVG  13.8      14.03333  10.83333
STD  2.869579  2.552664  2.305665


Prototype 3 (Perl):

Figure 22 Prototype 3 of Software Architecture (diagram: the user or client, through a web browser, accesses the Apache web server (XAMPP 1.7.4) running Perl (ActivePerl 5.14.2); data acquisition from MySQL server version 5.5.8 is done through a script, and the output is returned via CGI)

Maintainability of Prototype 3

Table 26 Maintainability Average for Prototype 3

Respondent  Maintainability value (Equation 6 effort formula)
1   52
2   61
3   91
4   106
5   167
6   74
7   56
8   90
9   74
10  69
11  72
12  66
13  78
14  87
15  59
16  118
17  128
18  99
19  76
20  71
21  77
22  105
23  75
24  84
25  62
Average value: 81.400    Standard deviation: 19.73364302


Performance (Response Time) results of Prototype 3

Table 27 Prototype 3 Response Time

Execution number  View All Network  Search Network  Status of Network
1   6.82  8.4    6.39
2   6     9      7.01
3   6.5   6.79   9.9
4   4.36  8.25   8.28
5   6.2   9.63   6.25
6   7.73  6.16   9.55
7   8.09  7.4    9.7
8   7.48  10.72  9.35
9   4.26  7.76   6.83
10  6.45  6.02   11.84
11  5.96  6.1    9.01
12  4.39  7.02   6.19
13  8.43  11     6.92
14  7.99  7.72   8.29
15  7.43  7.85   9.42
16  6.31  6.73   10.31
17  6.4   9.24   11.27
18  7.37  8.68   11.68
19  6.17  7.28   10.72
20  5.39  8.47   8.05
21  8.13  7.72   9.85
22  5.98  10.2   8.36
23  6.57  10.7   6.32
24  7.74  7.46   10.74
25  6.13  7.13   10.85
26  8.43  8.07   9.02
27  8.69  8.98   10.6
28  8.29  9.88   9.68
29  8.49  8.74   9.12
30  8.92  8.22   9.83
AVG  6.903333  8.244     9.044333
STD  1.313183  1.377848  1.687681


Appendix-C: Survey Form

Maintainability Survey

Personal Attributes Form

Name/Organisation/Industry: _________________________________________
Age: _________________________________________
Education Qualification: _________________________________________
Software Development Experience: _________________________________________
Maintenance Experience: _________________________________________
Present domain work Experience: _________________________________________

Knowledge in:

C □ None □ Novice □ Skilled □ Expert

C++ □ None □ Novice □ Skilled □ Expert

Java(JSP) □ None □ Novice □ Skilled □ Expert

Perl □ None □ Novice □ Skilled □ Expert

PHP □ None □ Novice □ Skilled □ Expert

HTML □ None □ Novice □ Skilled □ Expert

.Net □ None □ Novice □ Skilled □ Expert

Visual Basic □ None □ Novice □ Skilled □ Expert

Any other (please specify): ___________________________________________________

Architecture Design Method Experience: □ Use case □ ADD □ FDM □ None

Use case - Use case/Scenario-driven design
ADD - Attribute-Driven Design
FDM - Functionality-based Design Method

Architecture Analysis Method Experience (choose any one method you have experience with):

□ SAAM □ ATAM □ 4+1 View □ None

SAAM - Scenario-based Architecture Analysis Method
ATAM - Architecture Trade-off Analysis Method
4+1 View Method

The ratings for effort and for probability of change scenario occurrence, which you need to assign for each prototype, are given in Table 28 and Table 29 respectively.


Table 28 Prototype Efforts Rating

Very little effort  Little effort  Average effort  High effort  Very high effort
1                   2              3               4            5

Table 29 Probability Ratings of Selected Scenario

Very low probability  Low probability  Average probability  High probability  Very high probability
1                     2                3                    4                 5

Table 30 Example Table for Filling Survey Form

Scenario No.  Prototype (P1)  Prototype (P2)  Prototype (P3)  Probability of change scenario occurrence
S0            3               4               3               3

Respondents have to fill in Table 31, considering the scenario descriptions in Table 3 and the opinion rating scales in Table 28 and Table 29.

Table 31 Prototype and Probability Scenario Rating

Scenario No.  Prototype (P1): PHP + MySQL + Apache web server  Prototype (P2): JSP + MySQL + Tomcat  Prototype (P3): Perl + MySQL + Apache web server  Probability of change scenario occurrence
S1
S2
S3
S4
S5
S6
S7
S8
S9
S10

Note: We hereby inform all participants that participation is voluntary. The answers are helpful for ensuring the validity of the study. We will not share any personal information with any authority; the answers will be used only to analyze the results of the study.


Appendix-D: Target Audience

The list of participants selected for the survey is given below; each responder's individual results are given in Appendix-E.

Table 32 List of Industries for Survey as Target Audience

S no.  Industry/Academia  Website  Address  Telephone no.
1   Smart Grid Network AB  http://smartgridnetworks.net/  Rombv. 4, 371 65 Lyckeby  0455 - 60 01 00
2   Contribe AB  http://www.contribe.se/  Ronnebygatan 41, 371 33 Karlskrona (near Centrum, opposite Haris)  0708-225377
3   Ericsson AB  http://www.ericsson.com/se/  Ölandsg. 1, 371 33 Karlskrona  010 - 719 00 00
4   Telenor AB  http://www.telenor.se/privat/index.html  Campus Gräsvik 12, 371 75 Karlskrona  0455 331422
5   Fujitsu AB  http://www.fujitsu.com/se/  Stortorget 10, 371 34 Karlskrona (near bank at Centrum)  0455 - 61 65 00
6   Sigma AB  http://www.sigma.se/  Östra Vittusgatan 36, 371 33 Karlskrona  020 - 55 05 50
7   Capgemini AB  http://www.se.capgemini.com/  Campus Gräsvik 2, 371 75 Karlskrona  08 - 536 850 00
8   Modondo AB  http://www.modondo.com/  Minervavägen 4, 371 75 Karlskrona  0455 - 34 10 30
9   IT Solution AB  http://www.ilt.se/  Östra Vittusgatan 36, 371 33 Karlskrona  070 - 838 12 98
10  WIP AB  www.wip.se  Campus Gräsvik 5, 371 75 Karlskrona  0455 - 33 98 00
11  Oracle AB  www.oracle.com  Campus Gräsvik 1, 371 75 Karlskrona  08 - 477 33 00
12  Softhouse AB  http://www.softhouse.se/  Campus Gräsvik 3 A, 371 75 Karlskrona  0455 616460
13  alatest AB  www.alatest.se  Gävleg. 16, 113 30 Stockholm  08 - 30 12 75
14  Real Soft  http://realsoftonline.com/  Karlskrona  46760609664


Appendix-E: Survey Data

The survey statistical data results, as filled in by each responder on the form above, are given below.

Responder 1

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 1 3 2 2 2 6 4
S2 1 1 1 3 3 3 3
S3 1 1 1 5 5 5 5
S4 1 1 1 1 1 1 1
S5 4 5 4 3 12 15 12
S6 2 3 2 2 4 6 4
S7 4 5 4 4 16 20 16
S8 4 4 1 3 12 12 3
S9 1 1 1 1 1 1 1
S10 1 1 1 3 3 3 3
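In each responder table, the three maintainability columns are the effort rating for the prototype multiplied by the probability of change scenario occurrence, and the per-prototype sums give the respondent's values in Tables 22, 24 and 26 (for Responder 1: 59, 72 and 52). A small check in Python (the tuple layout is an assumption of this sketch):

```python
# Responder 1: (P1 effort, P2 effort, P3 effort, probability) per scenario S1..S10
ratings = [
    (1, 3, 2, 2), (1, 1, 1, 3), (1, 1, 1, 5), (1, 1, 1, 1), (4, 5, 4, 3),
    (2, 3, 2, 2), (4, 5, 4, 4), (4, 4, 1, 3), (1, 1, 1, 1), (1, 1, 1, 3),
]

# Maintainability per prototype = sum over scenarios of effort * probability
totals = [sum(row[i] * row[3] for row in ratings) for i in range(3)]
print(totals)  # [59, 72, 52] — matches respondent 1 in Tables 22, 24 and 26
```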

Responder 2

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 2 1 4 8 8 4
S2 1 1 1 2 2 2 2
S3 1 1 1 5 5 5 5
S4 4 4 2 3 12 12 6
S5 3 3 5 3 9 9 15
S6 3 4 2 3 9 12 6
S7 5 3 2 5 25 15 10
S8 1 1 1 5 5 5 5
S9 3 3 1 5 15 15 5
S10 3 3 1 3 9 9 3

Responder 3

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 4 4 2 8 8 8
S2 1 5 3 4 4 20 12
S3 2 2 2 5 10 10 10
S4 1 1 1 1 1 1 1
S5 3 3 3 2 6 6 6
S6 4 4 4 3 12 12 12
S7 2 2 2 4 8 8 8
S8 2 3 4 3 6 9 12
S9 2 2 2 4 8 8 8
S10 3 3 3 3 9 9 9

Responder 4

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 5 4 5 1 5 4 5
S2 2 1 3 1 2 1 3
S3 3 2 4 5 15 10 20
S4 1 1 2 4 4 4 8
S5 3 3 4 3 9 9 12
S6 5 4 4 5 25 20 20
S7 4 3 2 4 16 12 8
S8 3 2 5 4 12 8 20
S9 2 4 4 1 2 4 4
S10 2 3 3 2 4 6 6

Responder 5

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 3 2 2 4 6 4
S2 1 1 2 2 2 2 4
S3 3 4 3 3 9 12 9
S4 3 3 3 3 9 9 9
S5 2 2 2 3 6 6 6
S6 3 3 3 4 12 12 12
S7 2 2 2 3 6 6 6
S8 2 3 2 2 4 6 4
S9 2 2 2 4 8 8 8
S10 4 4 4 3 12 12 12

Responder 6

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 3 3 4 1 3 3 4
S2 2 2 2 1 2 2 2
S3 3 1 4 3 9 3 12
S4 2 2 2 1 2 2 2
S5 4 4 4 1 4 4 4
S6 3 3 3 1 3 3 3
S7 4 2 4 2 8 4 8
S8 4 3 4 2 8 6 8
S9 3 2 3 3 9 6 9
S10 4 3 4 1 1 3 4

Responder 7

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 3 3 3 12 9 9
S2 3 2 2 4 12 8 8
S3 4 4 3 4 16 16 12
S4 5 4 4 2 10 8 8
S5 3 2 2 2 6 4 4
S6 4 3 3 4 16 12 12
S7 3 4 4 4 12 16 16
S8 2 2 2 4 8 8 8
S9 4 3 3 3 12 9 9
S10 4 4 4 1 4 4 4

Responder 8

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 3 2 2 4 6 4
S2 1 1 2 2 2 2 4
S3 3 4 3 3 9 12 9
S4 3 3 3 3 9 9 9
S5 2 2 2 3 6 6 6
S6 3 3 3 4 12 12 12
S7 2 2 2 3 6 6 6
S8 2 3 2 2 4 6 4
S9 2 2 2 4 8 8 8
S10 4 4 4 3 12 12 12


Responder 9

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 1 1 1 1 1 1 1
S2 4 4 4 2 8 8 8
S3 2 5 2 4 8 20 8
S4 4 2 4 2 8 4 8
S5 1 4 2 3 3 12 6
S6 4 4 4 3 12 12 12
S7 1 2 1 2 2 4 2
S8 2 4 2 3 6 12 6
S9 1 1 1 2 2 2 2
S10 4 4 4 4 16 16 16

Responder 10

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 2 3 2 4 4 6
S2 3 3 4 1 3 3 4
S3 3 4 4 4 12 16 16
S4 2 3 4 1 2 3 4
S5 3 3 4 3 9 9 12
S6 2 2 2 3 6 6 6
S7 2 3 3 2 4 6 6
S8 3 4 4 2 6 8 8
S9 2 2 2 2 4 4 4
S10 2 2 2 3 6 6 6

Responder 11

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 3 3 3 2 6 6 6
S2 3 2 4 2 6 4 8
S3 2 2 2 5 10 10 10
S4 1 1 1 5 5 5 5
S5 3 3 3 3 9 9 9
S6 2 2 2 4 8 8 8
S7 2 2 2 4 8 8 8
S8 1 1 1 3 3 3 3
S9 1 1 1 4 4 4 4
S10 1 1 1 5 5 5 5

Responder 12

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 1 2 1 1 1 2 1
S2 1 1 1 1 1 1 1
S3 3 4 3 4 12 16 12
S4 1 2 2 2 2 4 4
S5 3 4 4 4 12 16 16
S6 4 5 3 4 16 20 12
S7 2 3 2 3 6 9 6
S8 4 4 4 3 12 12 12
S9 2 2 2 1 2 2 2
S10 4 4 4 3 12 12 12

Responder 13

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 5 5 5 1 5 5 5
S2 4 3 4 2 8 6 8
S3 3 3 3 5 15 15 15
S4 4 2 4 3 12 6 12
S5 3 3 3 3 9 9 9
S6 4 4 4 4 16 16 16
S7 4 4 4 4 16 16 16
S8 4 4 4 4 16 16 16
S9 3 3 3 3 9 9 9
S10 4 4 4 3 12 12 12

Responder 14

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 3 4 2 8 6 8
S2 4 3 4 4 16 12 16
S3 3 2 3 5 15 10 15
S4 2 2 2 2 4 4 4
S5 3 3 3 2 6 6 6
S6 4 2 4 2 8 4 8
S7 3 2 3 5 15 10 15
S8 3 2 3 5 15 10 15
S9 3 3 3 3 9 9 9
S10 3 2 3 3 9 6 9

Responder 15

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 4 4 2 8 8 8
S2 3 3 3 2 6 6 6
S3 2 2 2 5 10 10 10
S4 4 4 4 2 8 8 8
S5 3 3 3 3 9 9 9
S6 3 3 3 4 12 12 12
S7 2 2 2 5 10 10 10
S8 2 2 2 5 10 10 10
S9 2 2 2 4 8 8 8
S10 2 2 2 4 8 8 8

Responder 16

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 2 3 2 4 4 6
S2 3 2 2 2 6 4 4
S3 2 2 3 4 8 8 12
S4 2 2 2 2 4 4 4
S5 3 3 3 2 6 6 6
S6 3 3 3 2 6 6 6
S7 4 3 2 2 8 6 4
S8 4 3 2 2 8 6 4
S9 2 2 3 3 6 6 9
S10 2 2 2 2 4 4 4

Responder 17

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 3 3 3 4 12 12 12
S2 3 2 4 2 6 4 8
S3 3 3 4 5 15 15 20
S4 5 5 5 1 5 5 5
S5 3 3 4 2 6 6 8
S6 4 2 4 4 16 8 16
S7 3 2 4 5 15 10 20
S8 3 2 3 4 12 8 12
S9 3 3 3 5 15 15 15
S10 4 2 4 3 12 6 12

Responder 18

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 4 4 3 12 12 12
S2 4 4 4 3 12 12 12
S3 1 1 1 5 5 5 5
S4 4 4 4 3 12 12 12
S5 2 2 2 3 6 6 6
S6 3 3 3 4 12 12 12
S7 2 2 2 4 8 8 8
S8 2 2 2 4 8 8 8
S9 3 3 3 4 12 12 12
S10 3 3 3 4 12 12 12

Responder 19

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 4 4 2 8 8 8
S2 3 3 3 2 6 6 6
S3 2 2 2 4 8 8 8
S4 3 3 3 3 9 9 9
S5 3 4 3 2 6 8 6
S6 3 3 3 3 9 9 9
S7 2 3 2 3 6 9 6
S8 3 3 3 4 12 12 12
S9 2 2 2 3 6 6 6
S10 2 2 2 3 6 6 6

Responder 20

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 3 3 3 2 6 6 6
S2 2 2 2 2 4 4 4
S3 3 2 3 3 9 6 9
S4 4 4 2 3 12 12 6
S5 2 2 2 2 4 4 4
S6 4 4 4 3 12 12 12
S7 2 2 3 2 4 4 6
S8 4 3 4 3 12 9 12
S9 3 3 3 2 6 6 6
S10 3 2 2 3 9 6 6

Responder 21

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 4 4 2 8 8 8
S2 2 3 3 2 4 6 6
S3 3 2 2 3 9 6 6
S4 2 3 3 4 8 12 12
S5 3 3 3 3 9 9 9
S6 3 4 2 2 6 8 4
S7 2 2 3 3 6 6 9
S8 3 3 2 4 12 12 8
S9 3 2 3 3 9 6 9
S10 2 2 2 3 6 6 6

Responder 22

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 4 3 4 4 16 12 16
S2 2 2 2 4 8 8 8
S3 2 3 2 3 6 9 6
S4 5 5 5 2 10 10 10
S5 4 3 4 4 16 12 16
S6 4 4 4 3 12 12 12
S7 2 2 3 3 6 6 9
S8 4 4 5 2 8 8 10
S9 3 3 3 4 12 12 12
S10 2 2 2 3 6 6 6

Responder 23

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 3 1 3 6 9 3
S2 4 3 2 2 8 6 4
S3 3 3 1 1 3 3 1
S4 1 4 3 2 2 8 6
S5 5 3 2 1 5 3 2
S6 3 1 4 2 6 2 8
S7 2 4 4 4 8 16 16
S8 2 2 3 1 2 2 3
S9 4 5 3 4 16 20 12
S10 4 3 5 4 16 12 20

Responder 24

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 2 2 4 8 8 8
S2 3 2 1 3 9 6 3
S3 4 3 4 5 20 15 20
S4 3 3 3 3 9 9 9
S5 2 2 2 1 2 2 2
S6 3 3 3 2 6 6 6
S7 3 3 3 4 12 12 12
S8 3 3 3 5 15 15 15
S9 1 1 1 5 5 5 5
S10 4 4 4 1 4 4 4

Responder 25

Scenario  P1  P2  P3  Probability of change  P1 maintainability  P2 maintainability  P3 maintainability
S1 2 2 3 2 4 4 6
S2 3 2 2 2 6 4 4
S3 2 2 3 4 8 8 12
S4 2 2 2 2 4 4 4
S5 3 3 3 2 6 6 6
S6 3 3 3 2 6 6 6
S7 4 3 4 2 8 6 8
S8 4 3 3 2 8 6 6
S9 2 2 2 3 6 6 6
S10 2 2 2 2 4 4 4


Appendix-F: Survey Data Analysis

Validation test results were obtained using one-way ANOVA on the maintainability effort of each prototype, with the following SPSS command:

ONEWAY effort BY prototypes
  /STATISTICS DESCRIPTIVES HOMOGENEITY
  /PLOT MEANS
  /POSTHOC=LSD ALPHA(0.05).

Oneway ANOVA

[DataSet1]

Descriptives: effort

                  N   Mean   Std. Deviation  Std. Error  95% CI Lower  95% CI Upper  Minimum  Maximum
Prototype 1 PHP   25  80.92  19.350          3.870       72.93         88.91         49       118
Prototype 2 JSP   25  79.40  16.258          3.252       72.69         86.11         36       110
Prototype 3 PERL  25  81.40  19.734          3.947       73.25         89.55         52       128
Total             75  80.57  18.281          2.111       76.37         84.78         36       128

Test of Homogeneity of Variances: effort

Levene Statistic  df1  df2  Sig.
1.351             2    72   .265

ANOVA: effort

                Sum of Squares  df  Mean Square  F     Sig.
Between Groups  54.507          2   27.253       .080  .924
Within Groups   24675.840       72  342.720
Total           24730.347       74
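The one-way ANOVA above can be reproduced from the respondent effort values in Tables 22, 24 and 26. A self-contained sketch of the computation (between- and within-group sums of squares and the F ratio), using the values as printed; with these inputs the F statistic stays far below the 5% critical value F(2, 72) ≈ 3.12, consistent with the reported non-significant result (Sig. = .924):

```python
# Maintainability effort per respondent, from Tables 22, 24 and 26
p1 = [59, 99, 72, 94, 72, 49, 108, 72, 66, 56, 64, 76, 87,
      60, 118, 105, 114, 99, 76, 78, 77, 100, 72, 90, 60]
p2 = [72, 92, 86, 78, 162, 79, 36, 94, 79, 91, 65, 62, 94,
      87, 54, 110, 89, 99, 81, 69, 79, 95, 81, 82, 54]
p3 = [52, 61, 91, 106, 167, 74, 56, 90, 74, 69, 72, 66, 78,
      87, 59, 118, 128, 99, 76, 71, 77, 105, 75, 84, 62]

groups = [p1, p2, p3]
n = sum(len(g) for g in groups)           # 75 observations in total
grand = sum(sum(g) for g in groups) / n   # grand mean over all groups

# Between-groups and within-groups sums of squares
ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

df_between, df_within = len(groups) - 1, n - len(groups)   # 2 and 72
f_stat = (ss_between / df_between) / (ss_within / df_within)
print(round(f_stat, 3))
```

The group mean for Prototype 1 comes out to 80.92, matching the Descriptives table.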

Means Plots

Figure 23 Maintainability Efforts Validation

A post hoc test was performed to gain additional insight into the results.

Post Hoc Tests

Multiple Comparisons
Dependent Variable: effort
LSD

(I) prototypes    (J) prototypes    Mean Difference (I-J)  Std. Error  Sig.  95% CI Lower  95% CI Upper
Prototype 1 PHP   Prototype 2 JSP   1.520                  5.236       .772  -8.92         11.96
Prototype 1 PHP   Prototype 3 PERL  -.480                  5.236       .927  -10.92        9.96
Prototype 2 JSP   Prototype 1 PHP   -1.520                 5.236       .772  -11.96        8.92
Prototype 2 JSP   Prototype 3 PERL  -2.000                 5.236       .704  -12.44        8.44
Prototype 3 PERL  Prototype 1 PHP   .480                   5.236       .927  -9.96         10.92
Prototype 3 PERL  Prototype 2 JSP   2.000                  5.236       .704  -8.44         12.44


Appendix-G: Servers & Platforms

The Apache web server is the most widely used web server in the world, according to the survey conducted by Netcraft [90][72].

Figure 24 Web Server Survey by Netcraft

The platforms used by a number of well-known organizations are shown in Table 33 [91].

Table 33 Organization Using Different Platforms

Organization          Up since        Platforms
Google                November 1998   PHP & MySQL
Facebook              February 2004   PHP, MySQL and C++
Wikipedia             January 2001    PHP & MySQL
WordPress             November 2005   PHP & MySQL
YouTube               February 2005   C, Java and MySQL
Yahoo                 August 1995     C++, C, Java, PHP & MySQL
Amazon                October 1995    C++, Java, J2EE
Microsoft (MSN)       August 1995     ASP.NET
Microsoft (Live.com)  August 2008     ASP.NET
Craigslist            1995            Perl, MySQL, and jQuery

Well-reputed websites based on different platforms are listed in Table 34 [92].

Table 34 Websites Based on Different Platforms

S no. Websites Platforms

1 http://www.facebook.com/ PHP

2 http://www.wikipedia.org/ PHP

3 http://www.answers.com/ PHP

4 http://wordpress.com/ PHP

5 http://www.flickr.com/ PHP

6 http://www.aol.com/ JSP

7 http://www.servletsuite.com/index.html JSP

8 http://twitter.com/#!/t411 JSP

9 http://www.ibm.com/us/en/ JSP

Data underlying Figure 24 (Top Servers Across all Domains, August 1995 - April 2012):

Server     Mar-12      Percent  Apr-12      Percent  Change
Apache     42,03,37,1  65.24%   44,31,02,5  65.46%   0.22
Microsoft  8,89,71,97  13.81%   9,24,88,75  13.66%   -0.15
nginx      6,53,69,14  10.15%   6,98,69,91  10.32%   0.18
Google     2,11,50,93  3.28%    2,20,39,90  3.26%    -0.03


10 http://stockholm.sv.craigslist.se/ Perl

11 http://www.ebay.com/ Perl

12 https://www.paypal.com/se Perl

13 http://www.bbc.co.uk/ Perl

14 www.pool.ntp.org/en Perl

15 http://www.imdb.com/ Perl

16 http://www.microsoft.com/en-us/default.aspx ASP.NET

17 https://login.live.com/ ASP.NET

18 http://se.msn.com/ ASP.NET

19 http://edition.cnn.com/ ASP.NET


Appendix-H: Source code

Steps to deploy and run all three prototypes:

1. To view the source code of the three prototypes on the different platforms, open the link below:

https://docs.google.com/open?id=0Byn6b0tZNJJodHd5Q0ZZdWdOWUk

2. Unzip the project folder and follow the guidelines given in the “readme.txt” file.