
1

Software metrics in general

Seminar on Software Engineering

Sanna Martikainen, 8.4.2005

2

What is measurement?

• measurement is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules

• some measures are unlikely to be accurate, either because the measurement itself is imprecise or because it depends on the judgment of the person doing the measuring

3

Measurement for understanding, control and improvement

1. helps us to understand what is happening during development and maintenance

2. allows us to control what is happening in our projects

3. encourages us to improve our processes and products

• predictions have only limited accuracy

• the margin of error should be stated

4

Neglect of measurement in software engineering

• measurement has been considered a luxury in software engineering

1. we fail to set measurable targets for our software products

– projects without clear goals will not achieve their goals clearly

2. we fail to understand and quantify the component costs of software projects

3. we don’t quantify or predict the quality of the products we produce

4. we allow anecdotal evidence to convince us to try yet another revolutionary new development technology, without doing a carefully controlled study to determine if the technology is efficient and effective

• when measurements are made, they are often done infrequently, inconsistently and incompletely

5

Objectives for software measurement

• how can you tell if your project is healthy if you have no measures of its health?

• we must control our projects, not just run them

• document trends, the magnitude of corrective action, and the resulting changes

• every measurement action must be motivated by a particular goal or need that is clearly defined and easily understandable

• understand and control a software development project

6

A goal-based framework for software measurement

1. classifying the entities to be examined

2. determining relevant measurement goals

3. identifying the level of maturity that your organization has reached

7

1. Classifying the entities to be examined

• identifying the entities and attributes we wish to measure

• processes are collections of software-related activities, usually associated with some timescale

• products are any artifacts, deliverables or documents that result from a process activity

• resources are entities required by a process activity

• internal attributes: measured purely in terms of the product, process or resource itself (size, complexity)

• external attributes: measured only with respect to how the product, process or resource relates to its environment (the number of failures experienced by the user)
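To make the classification concrete, here is a minimal sketch in Python; the example entities and attributes are chosen purely for illustration and are not taken from the slides.

    # Illustrative classification of measurable entities and their attributes
    # (the concrete examples are invented, not taken from the slides).
    entities = {
        "process":  {"example": "code review",      "attributes": ["duration", "effort", "faults found"]},
        "product":  {"example": "source code",      "attributes": ["size (internal)", "reliability (external)"]},
        "resource": {"example": "development team", "attributes": ["team size", "experience"]},
    }

    for entity_class, info in entities.items():
        print(entity_class, "-", info["example"], "->", ", ".join(info["attributes"]))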

8

2. Determining relevant measurement goals

• a particular measurement is useful only if it helps you to understand the underlying process or one of its resultant products

• the Goal-Question-Metric approach (GQM); a small sketch follows this list

– list the major goals of the development or maintenance project

– derive from each goal the questions that must be answered to determine if the goals are being met

– decide what must be measured in order to be able to answer the questions adequately

– typical goals are expressed in terms of productivity, quality, risk, customer satisfaction and coupled with verbs expressing the need to assess, evaluate, improve or understand

– subgoals
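The GQM idea can be read as a small tree running from a goal, through questions, to metrics. The following sketch is purely illustrative: the goal, questions and metric names are invented examples, not content from the slides.

    # Illustrative GQM tree (goal, questions and metrics are invented examples).
    gqm = {
        "goal": "improve the reliability of the delivered product",
        "questions": {
            "How many defects are found after release?": [
                "post-release defect count",
                "defect density (defects / KLOC)",
            ],
            "How quickly are reported faults fixed?": [
                "mean time to repair a reported fault",
            ],
        },
    }

    # Walk the tree: every metric stays traceable back to a question and a goal.
    for question, metrics in gqm["questions"].items():
        for metric in metrics:
            print(f'{gqm["goal"]} <- {question} <- {metric}')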

9

3. Identifying the level of maturity that your organization has reached

• The Software Engineering Institute (SEI): five levels of process maturity

1. ad hoc

• the transition from inputs to outputs is undefined and uncontrolled

• similar projects may vary widely in their productivity and quality characteristics because of lack of adequate structure and control

2. repeatable

• proper inputs produce proper outputs, but there is no visibility into how the outputs are produced

10

...

3. defined

• intermediate activities are defined, and their inputs and outputs are known and understood

4. managed

• feedback from early project activities can be used to set priorities for current activities and later project activities

5. optimizing

• measures from activities are used to improve the process, possibly by removing and adding process activities, and changing the process structure dynamically in response to measurement feedback

• like the spiral model, the maturity levels are meant to be nested

11

The scope of software metrics

• cost and effort estimation

• productivity measures and models

• data collection

• quality models and measures

• reliability models

• performance evaluation and models

• structural and complexity metrics

• capability-maturity assessment

• management by metrics

• evaluation of methods and tools

12

Cost and effort estimation

• predictions of the likely amount of effort, time, and staffing levels required to build a software system

• needed throughout the life cycle

• problems:

– we are solving a problem that has never been solved before

– political problems

– technical problems

• four techniques:

1. expert opinion: takes advantage of a mature developer’s personal experience

2. analogy: the estimators compare the proposed project with one or more past projects (similarities and differences)

13

...

3. decomposition

• software is described in terms of its smallest components and then estimates are made for the effort required to produce each component

• activities are decomposed into low-level tasks and effort estimates are made for each of these

4. models of effort are techniques that identify key contributors to effort, generating mathematical formulas that relate these items to effort (usually based on past experience)

• cost models provide direct estimates of effort or duration (examples: regression-based models, COCOMO)

• constraint models demonstrate the relationship over time between two or more parameters of effort, duration, or staffing level (example: SLIM)
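As an illustration of a cost model that estimates effort directly from size, the sketch below uses the Basic COCOMO equation effort = a * KLOC^b; the coefficients are the commonly published organic-mode values and the project size is invented, so treat the numbers as an example rather than as material from the slides.

    # Basic COCOMO effort model (illustrative; organic-mode coefficients
    # a = 2.4, b = 1.05 are the commonly published values, not from the slides).
    def cocomo_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
        """Estimated effort in person-months for a program of `kloc` thousand lines."""
        return a * kloc ** b

    # Invented example: a 32 KLOC organic-mode project.
    print(f"{cocomo_effort(32):.1f} person-months")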

14

Productivity measures and models

• there have been numerous attempts to define measures and models for assessing staff productivity during different software processes and in different environments

• the notion of productivity involves the contrast between what goes into a process and what comes out – example: a productivity equation

productivity = size / effort

• because productivity is a ratio scale measure, we can perform all reasonable statistical analyses on it
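A minimal sketch of the productivity equation above, assuming size is measured in lines of code and effort in person-months; the figures are invented.

    # productivity = size / effort, e.g. lines of code per person-month.
    def productivity(size_loc: int, effort_person_months: float) -> float:
        return size_loc / effort_person_months

    # Invented example: 12,000 LOC delivered with 8 person-months of effort.
    print(f"{productivity(12_000, 8):.0f} LOC per person-month")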

15

Data collection

• must be planned and executed in a careful and sensitive manner

• software measurement is only as good as the data that are collected and analyzed

• what is good data?

– correctness, accuracy, precision, consistency, time-stamping

• how to define the data

– raw data, refined data

– deciding what to measure

• how to collect data

– requires human observation and reporting

– manual recording, automatic data capture

– which products to measure (based on GQM analysis)

– making sure that the product is under configuration control (version)

16

...

• establish procedures for handling the forms, analyzing the data, and reporting the results

• takes place during many phases of development

– at the beginning of the project to establish initial values

– then again as the initial values change to reflect activities and resources being studied

• raw software-engineering data should be stored in a database

• data collection should be simple and non-obtrusive, so that developers and maintainers can concentrate on their primary tasks, with data collection playing a supporting role
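One possible reading of “good data” and configuration control is a record that carries its unit, product version and timestamp with it. The sketch below is a hypothetical record layout invented for illustration, not a form prescribed by the slides.

    from dataclasses import dataclass
    from datetime import datetime

    # Illustrative raw-data record; the fields (entity, attribute, value, unit,
    # product version, timestamp) are invented to reflect the properties of
    # good data listed above.
    @dataclass
    class MeasurementRecord:
        entity: str            # e.g. "module parser.c" or "code-review meeting"
        attribute: str         # e.g. "size", "effort", "defects found"
        value: float
        unit: str              # e.g. "LOC", "person-hours"
        product_version: str   # ties the data point to a configuration-controlled version
        recorded_at: datetime

    record = MeasurementRecord("module parser.c", "size", 1842, "LOC", "v1.3.0", datetime.now())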

17

Analyzing data

• the nature of the data

• purpose of the experiment

– to confirm a theory

– to explore a relationship

• design considerations

• statistical techniques (sampling, population etc.)

18

Quality models and measures

• software quality: fitness for purpose, conformance to specification, degree of excellence, timeliness

• usually constructed in a tree-like fashion

• the tree describes the pertinent relationships between factors and their dependent criteria, so we can measure the factors in terms of the dependent criteria measures

• the notion of quality is usually captured in a model that depicts the composite characteristics and their relationships

• models are useful in articulating what people think is important, and in identifying the commonalities of view

• ISO 9126 standard quality model
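To make the tree-like structure concrete, the sketch below arranges the six top-level ISO 9126 characteristics as factors with a selection of criteria beneath them; the sub-characteristics are quoted from memory and should be checked against the standard.

    # A quality model as a factor -> criteria tree (ISO 9126 top-level
    # characteristics; sub-characteristics listed from memory, for illustration).
    quality_model = {
        "functionality":   ["suitability", "accuracy", "interoperability", "security"],
        "reliability":     ["maturity", "fault tolerance", "recoverability"],
        "usability":       ["understandability", "learnability", "operability"],
        "efficiency":      ["time behaviour", "resource utilisation"],
        "maintainability": ["analysability", "changeability", "stability", "testability"],
        "portability":     ["adaptability", "installability", "replaceability"],
    }

    # Each factor is measured through metrics attached to its criteria.
    for factor, criteria in quality_model.items():
        print(factor, "->", ", ".join(criteria))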

19

...

• Defects-based quality measures (a defect-density sketch follows this list)

– quality is considered only to be a lack of defects

• Usability measures

– user-friendliness

– the probability that the operator of a system will not experience a user interface problem during a given period of operation under a given operational profile

– well-structured manuals, good use of menus and graphics, informative error messages, help functions, consistent interfaces

• Maintainability measures

– software should be easy to understand

– corrective maintenance: finding and fixing faults

– adaptive maintenance (system changes)

– preventive maintenance: fixing problems before the user sees them

– perfective maintenance: additions
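A common defects-based measure is defect density, the number of known defects divided by product size. The sketch below is a minimal illustration with invented figures.

    # Defect density = known defects / size (commonly reported per KLOC).
    def defect_density(defects: int, size_kloc: float) -> float:
        return defects / size_kloc

    # Invented example: 46 defects found in a 23 KLOC product.
    print(f"{defect_density(46, 23):.1f} defects per KLOC")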

20

Reliability models

• reliability: the likelihood of successful operation during a given period of time

• software-reliability growth problem: estimating and predicting the reliability of a program as faults are identified and attempts made to fix them

• parametric reliability growth models

– outputs are classified as acceptable or not acceptable (a failure occurs)

– uncertainty about the operational environment

– uncertainty about the effect of fault removal

– good reliability models must address both types of uncertainty

• accuracy is likely to vary from one data set to another, and from one type of prediction to another
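As a very simple illustration of reliability as the likelihood of successful operation over a period, the sketch below uses the exponential reliability function R(t) = exp(-t / MTTF). This particular function is chosen only for illustration; it is not one of the growth models discussed on the slide.

    import math

    # Exponential reliability model: R(t) = exp(-t / MTTF), the probability of
    # failure-free operation for a period t given a mean time to failure MTTF.
    # Chosen purely to illustrate "likelihood of successful operation".
    def reliability(t_hours: float, mttf_hours: float) -> float:
        return math.exp(-t_hours / mttf_hours)

    # Invented example: probability of surviving an 8-hour shift with MTTF = 200 h.
    print(f"{reliability(8, 200):.3f}")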

21

Performance evaluation and models

• includes externally observable system performance characteristics, such as response times and completion rates

• the internal workings of a system: the efficiency of algorithms

• efficiency can be predicted quite accurately by considering any representation of the source code

22

Structural and complexity metrics

• structural attributes of representations of the software which are available in advance of execution

• try to establish empirically predictive theories to support quality assurance, quality control and quality prediction

• three parts of structure:– the control-flow addresses the sequence in which instructions are

executed in a program

– data-flow follows the trail of a data item as it is created or handled by a program

– data structure is the organization of the data itself, independent of the program
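A widely used control-flow metric, added here purely as an illustration since the slide does not name one, is McCabe’s cyclomatic complexity, V(G) = E - N + 2P for a flow graph with E edges, N nodes and P connected components.

    # McCabe's cyclomatic complexity of a control-flow graph:
    # V(G) = E - N + 2P (E edges, N nodes, P connected components).
    # Used here only as an example of a control-flow structure metric.
    def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
        return edges - nodes + 2 * components

    # Invented example: a flow graph with 9 edges and 8 nodes -> V(G) = 3.
    print(cyclomatic_complexity(9, 8))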

23

...

• the complexity of a problem: the amount of resources required for an optimal solution to the problem

• the complexity of a solution can be regarded in terms of the resources needed to implement a particular solution

• we can measure efficiency in terms of the number of primitive arithmetic operations required for a given input

– big-O notation
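As a small worked example of counting primitive operations and relating the count to big-O notation, the sketch below counts the additions needed to sum an n-element list; the count grows linearly with n, i.e. O(n). The example is invented for illustration.

    # Count the primitive additions needed to sum n numbers: n - 1 additions,
    # i.e. the cost grows linearly with the input size, O(n).
    def summation_cost(values):
        """Return (sum, number of additions performed)."""
        total, additions = values[0], 0
        for v in values[1:]:
            total += v
            additions += 1
        return total, additions

    # Invented example: summing 1000 values takes 999 additions.
    print(summation_cost(list(range(1000)))[1])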

24

Capability-maturity assessment

• to measure a contractor’s ability to develop quality software

• process maturity suggests both what can be measured and where in the process it can be captured

• in a process maturity assessment, a contractor answers over 100 questions designed to determine the contractor’s actual practices (many problems)

• SEI: capability-maturity model (CMM)

– five-level scale

– based on key practices that every good contractor should be using

– potential customers: to identify the strengths and weaknesses of their suppliers

– software developers: to assess their capabilities and set a path toward improvement

• SPICE

• ISO 9001

25

Management by metrics

• measurement is becoming an important part of software project management

• a standard set of measurements and reporting methods makes comparison and contrast possible

• presented in a way that tells both customer and developer how the project is doing (targets needed)

• the measure helps in understanding a process, its resources, or one of its resultant products

26

Evaluation of methods and tools

• may make an organization or project more productive and its products better and cheaper

• careful, controlled measurement and analysis

• evaluation’s success depends on good experimental design, proper identification of the factors likely to affect the outcome, and appropriate measurement of factor attributes

• tried first on a small project