
6. Capability Maturity Model Integration (CMMI)

2007

High-Maturity Organizations, 2003

• As of 2003 there were 197 high-maturity organizations: 100 at maturity level 4 and 97 at maturity level 5.
• Outside the USA, there were 98 high-maturity organizations in India (compared with only 69 in 2001 and just 24 in 2000!).
• Motorola Global Software Group, Krakow, was at level 5.

High-Maturity Organizations, 2003

• In 2003 only 4 new level-4 and 4 new level-5 organizations were added – the CMMI era had begun.

CMMI High Level Organizations

Feb 2004

CMMI Published Level 5 Organizations

• Blue Star Infotech Limited, Mumbai, India
• GE Capital International Services - Software, Hyderabad & Gurgaon Development Centers, India
• Infosys Technologies Limited Development Centre in Chennai
• Larsen & Toubro Limited, EmSyS, Mysore, India
• Lockheed Martin Management & Data Systems (M&DS), King of Prussia, Pennsylvania
• SSI Technologies (SSIT), a division of SSI Limited, Chennai, India
• Wipro Technologies, Doddakannelli, Sarjapur Road, Bangalore
  – Wipro Technologies is the world's first CMMI Level 5 (version 1.1) organization
  – http://www.prdomain.com/companies/w/wipro/news_releases/200205may/pr_wipro_20020530.htm

CMMI Published Level 4 Organizations

• IBM Japan, Ltd., IBM Global Services Japan, Public Sector Services, 19-21 Nihonbashi Hakozaki-cho, Chuo-ku, Tokyo 103-8510, Japan
• Digital Design, Software Development Department, St. Petersburg, Russia
  – Digital Design sets the standard by becoming the first Russian software company to successfully undergo a formal CMMI assessment.
• Hitachi Software Engineering Co., Ltd., Financial Systems Group, 4-12-6 Higashishinagawa, Shinagawa-ku, Tokyo 140-0002, Japan
  – (1) Hitachi Software achieved CMMI Level 3 in all the business groups. (2) Achieved CMMI Level 3 at the whole-company level.
• Hitachi Software Engineering Co., Ltd., Government & Public Corporation Systems Division, 4-12-6 Higashishinagawa, Shinagawa-ku, Tokyo 140-0002, Japan
• Hitachi Software Engineering Co., Ltd., Software Development Division, Tokyo, Japan
• Nihon Unisys, Ltd., Financial, Koto-ku, Tokyo, Japan
• Northrop Grumman Mission Systems, Command, Control and Intelligence Division, Omaha Site
• Northrop Grumman Mission Systems, Tactical Systems Division
• Rockwell Collins Government Systems
• Soza & Company, Ltd.

Problems with the CMM

• Because CMM levels 4 and 5 are (almost) unattainable, most organizations aim for maturity level 3.
  – An important instrument for achieving a maturity level is the regulated appraisal process, the Software Capability Evaluation (SCE), which determines whether the organization "says what it does and does what it says".
  – Put slightly differently: whether the organization's operation is defined by policies, directives, and the like, and whether those definitions are actually adhered to.

• The key process areas (KPAs) are oriented toward the activities of a process that follows the traditional waterfall model and toward the artifacts that reflect those activities:
  – requirements specifications,
  – documented plans,
  – quality audit reports,
  – documented processes and procedures.

Problems with the CMM

• The key process areas do not address the artifacts that define and ensure the real goal of the software process – a working system:
  – the system architecture,
  – design models,
  – source code,
  – system deployment, and the like.

• The CMM overemphasizes various traditional quality assurance measures (reviews of specifications, designs, code, etc.), which are not sufficient for the success of the software process.

Problems with the CMM

• When applying the CMM, an organization starts creating (producing?) ever more documentation, drawing up plans, establishing checkpoints, and so on – the thicker the documentation and the longer the meetings, the better.
  – At the same time, in an ever harsher competitive climate, software has to be built faster and more cheaply, which requires minimizing the documents and other auxiliary and intermediate artifacts produced by "non-machines" (that is, by people).

Problems with the CMM

• Measuring the maturity of an organization is also a problem.
  – The CMM uses activity-based maturity measurement: if you carry out the prescribed activities within a project, you are at level 2.
  – If you carry out the prescribed activities across the whole organization, you are at level 3. And so on.
  – Nothing shows, characterizes, or measures how well (with what quality) you perform those activities.
  – In addition, result-based measurement of organizational maturity would be needed: if you can repeat the software process with predictable cost, quality, and schedule, you are, say, at level 2.
  – If, in subsequent projects, you can improve one of those dimensions (cost, quality, schedule), you are at level 3, and so on.
  – Combining activity-based and result-based maturity measurement would let the level of an organization's software process be determined more adequately (one possible combination is sketched below).
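The slides do not define a concrete metric for combining the two views. The following Python sketch is one illustrative interpretation, not part of the CMM or CMMI: the Project record, the 15% tolerance, and all names are assumptions for the example. A process is treated as "predictable" when recent projects stayed within the tolerance of their cost, schedule, and quality estimates, and that result-based evidence is used to confirm or discount the activity-based level.

from dataclasses import dataclass

@dataclass
class Project:
    est_cost: float      # estimated cost
    act_cost: float      # actual cost
    est_days: int        # estimated schedule length
    act_days: int        # actual schedule length
    est_defects: int     # predicted post-release defects
    act_defects: int     # actual post-release defects

def predictable(projects, tolerance=0.15):
    """Result-based check: every project stayed within `tolerance`
    (here 15%) of its cost, schedule, and quality estimates."""
    def within(est, act):
        return est > 0 and abs(act - est) / est <= tolerance
    return all(
        within(p.est_cost, p.act_cost)
        and within(p.est_days, p.act_days)
        and within(p.est_defects, p.act_defects)
        for p in projects
    )

def combined_level(activity_level, projects):
    """Combine the activity-based level (from an SCE-style appraisal)
    with the result-based evidence above."""
    if activity_level >= 2 and predictable(projects):
        return activity_level        # the results back up the appraisal
    return min(activity_level, 1)    # prescribed activities alone are not enough

print(combined_level(3, [Project(100, 108, 90, 95, 20, 22)]))   # 3
print(combined_level(3, [Project(100, 160, 90, 140, 20, 60)]))  # 1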

Problems with the CMM

• Requirements-based assessment of software quality, grounded in the waterfall model, makes conformance to the specification more important than satisfying the customer's actual needs (it is not unusual that the customer cannot specify the requirements correctly at first, and the customer's needs may also change over time).

• The various CMM standards (CMM for Software (SW-CMM), People CMM (P-CMM), SA-CMM (CMM for Software Acquisition), Systems Engineering Capability Model (SECM), and others) partly overlap (for example in project management methodology, requirements specification, and process definition) and are in places even contradictory.

Roots of CMMI


Intro

• U.S. Department of Defense—specifically, the Deputy Under Secretary of Defense (Science and Technology)—teamed up with the National Defense Industrial Association (NDIA) to jointly sponsor the development of Capability Maturity Model Integration (CMMI).

• Working with the Software Engineering Institute (SEI) at Carnegie Mellon University, this effort produced the first integrated CMMI models in 2000, together with associated appraisal and training materials; 2002 saw the release of CMMI version 1.1.


Intro

• CMMI provides guidance for your managerial processes.

• CMMI guidance on technical matters includes ways to develop, elaborate, and manage requirements, and to develop technical solutions that meet those requirements.

Process Improvement

• The improvement information in CMMI models includes the creation of a viable, improvable process infrastructure.

• Processes need to be planned just like projects, and it helps if the organization has given them weight and validity through policy.
  – You need to make sure that resources are available for trained, empowered people to perform the process.

• Processes become more capable when they are standardized across the organization and their performance is monitored against historical data.

Produce quality products or services

• The process-improvement concept in CMMI models evolved out of the Deming, Juran, and Crosby quality paradigm:
  – Quality products are a result of quality processes.

• CMMI has a strong focus on quality-related activities including requirements management, quality assurance, verification, and validation.


Create value for the stockholders

• Mature organizations are more likely to make better cost and revenue estimates than those with less maturity, and then perform in line with those estimates.

• CMMI supports quality products, predictable schedules, and effective measurement to support management in making accurate and defensible forecasts.

• This process maturity can guard against project performance problems that could weaken the value of the organization in the eyes of investors.


Be an employer of choice

• Watts Humphrey has said, "Quality work is not done by accident; it is done only by skilled and motivated people." [1]

• CMMI emphasizes training, both in disciplines and in process.

• Experience has shown that organizations with mature processes have far less turnover than immature organizations.

[1] Humphrey, W. Winning with Software, Boston: Addison-Wesley, 2002


Implement cost savings and best practices

• Processes that are documented, measured, and continuously improved are perfect candidates for becoming best practices, resulting in cost savings for the organization.

• CMMI encourages measurement as a managerial tool.

• By using the historical data collected to support schedule estimation, an organization can identify and widely deploy practices that work, and eliminate those that don't.

Gain industry-wide recognition for excellence

• The best way to develop a reputation for excellence is to consistently perform well on projects, delivering quality products and services within cost and schedule parameters.

• Having processes that conform to CMMI requirements can enhance that reputation.

CMMI Milestones

• 1997 - CMMI initiated by U.S. Department of Defense and NDIA
• 1998 - First team meeting held
• 1999 - First pilot completed
• 2000:
  – CMMI-SE/SW version 1.0 released for initial use
  – CMMI-SE/SW/IPPD version 1.0 released for initial use
  – CMMI-SE/SW/IPPD/SS version 1.0 released for piloting
• 2002:
  – CMMI-SE/SW version 1.1 released
  – CMMI-SE/SW/IPPD version 1.1 released
  – CMMI-SE/SW/IPPD/SS version 1.1 released
  – CMMI-SW version 1.1 released

The CMMI Product Line Approach

(Diagram.) A "team of teams" of modeling and discipline experts from industry, government, and the SEI used a collaborative process to build the CMMI Product Suite – the integrated models (CMMI-SE/SW, CMMI-SE/SW/IPPD, CMMI-SE/SW/IPPD with acquisition, and so on) together with assessment and training materials – from the following source models:
• Software: SW-CMM, draft version 2(c)
• Systems Engineering: EIA/IS 731
• Integrated Product and Process Development: IPD-CMM, version 0.98
• Supplier Sourcing (SS)

Content of CMMI

Buyer/Supplier Mismatch

(Diagram: acquirer management capability level versus developer capability level, both increasing.)
• Matched team (both mature): match of skills and maturity, team risk approach, execution to plan, measurable performance, quantitative management – the highest probability of success.
• Mismatch (mature acquirer, immature developer): the mature buyer must mentor the low-maturity developer; the outcome is not predictable.
• Mismatch (immature acquirer, mature developer): "Customer is always right" hurts; the customer encourages "short cuts."
• Disaster (both immature): constant crises, no requirements management, no risk management, no discipline, no process... no product.

One Model, Two Representations

(Diagram contrasting the two documents.) Both CMMI-SE/SW Staged and CMMI-SE/SW Continuous contain an overview (Introduction, Structure of the Model, Model Terminology, Understanding the Model, Using the Model), the process areas with their goals and practices, and appendixes. The staged overview also covers Maturity Levels, Common Features, and Generic Practices; the continuous overview covers Capability Levels and Generic Model Components.
• Staged representation – process areas grouped by maturity level:
  – Maturity Level 2: REQM, PP, PMC, SAM, MA, PPQA, CM
  – Maturity Level 3: REQD, TS, PI, VER, VAL, OPF, OPD, OT, IPM, RSKM, DAR
  – Maturity Level 4: OPP, QPM
  – Maturity Level 5: OID, CAR
• Continuous representation – process areas grouped by category:
  – Process Management: OPF, OPD, OT, OPP, OID
  – Project Management: PP, PMC, SAM, IPM, RSKM, QPM
  – Engineering: REQM, REQD, TS, PI, VER, VAL
  – Support: CM, PPQA, MA, CAR, DAR

CMMI-SE/SW/IPPD/A - Continuous

(Diagram: process areas of the CMMI by category.)
• Process Management: Organizational Process Focus, Organizational Process Definition, Organizational Training, Organizational Process Performance, Organizational Innovation and Deployment
• Project Management: Project Planning, Project Monitoring and Control, Supplier Agreement Mgmt., Integrated Project Mgmt., Risk Management, Quantitative Project Mgmt.
• Engineering: Requirements Management, Requirements Development, Technical Solution, Product Integration, Verification, Validation
• Support: Configuration Mgmt., Process and Product Quality Assurance, Measurement & Analysis, Decision Analysis and Resolution, Causal Analysis and Resolution
• IPPD: Organizational Environment for Integration, Integrated Team
• Acquisition: Supplier Selection and Monitoring, Integrated Supplier Management, Quantitative Supplier Management

CMMI-SE/SW/IPPD/A - Staged

(Diagram: maturity levels and their process-area focus.)
• Initial (1): ad hoc, chaotic processes.
• Managed (2) – Basic Project Management (7 PAs): Requirements Management (REQM), Project Planning (PP), Project Monitoring and Control (PMC), Supplier Agreement Management (SAM), Measurement and Analysis (M&A), Process and Product Quality Assurance (PPQA), Configuration Management (CM).
• Defined (3) – Process Standardization (11 PAs): Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), Validation (VAL), Organizational Process Focus (OPF), Organizational Process Definition (OPD), Organizational Training (OT), Integrated Project Management (IPM)*, Risk Management (RSKM), Decision Analysis and Resolution (DAR).
• Quantitatively Managed (4) – Quantitative Management (2 PAs): Organizational Process Performance (OPP), Quantitative Project Management (QPM).
• Optimizing (5) – Continuous Process Improvement (2 PAs): Organizational Innovation and Deployment (OID), Causal Analysis and Resolution (CAR).
• IPPD additions: Organizational Environment for Integration (OEI), Integrated Team (IT). (* Additional PA goals and activities added for IPPD.)
• Acquisition additions: Supplier Selection and Monitoring (SSM), Integrated Supplier Management (ISM), Quantitative Supplier Management (QSM).

Process Areas

• CMMI selects only the most important topics for process improvement and then groups those topics into "areas."

• This classification results in CMMI version 1.1 models with a relatively small set of process areas:
  – 22 in CMMI-SE/SW and CMMI-SW,
  – 24 in CMMI-SE/SW/IPPD,
  – and 25 in CMMI-SE/SW/IPPD/SS.

CMMI Process Area Contents

(Diagram: process area components classified as required, expected, or informative.)
• Required: Goals – Specific and Generic
• Expected: Specific Practices, Generic Practices
• Informative: Purpose, Introductory Notes, Notes, Work Products, Subpractices, Amplifications, Elaborations


Content Classification

• Any process-improvement model must, of necessity, include a scale relating to the importance and role of the materials contained in the model.

• In the CMMI models, a distinction is drawn between the terms "required," "expected," and "informative."


Required Materials

• The sole required component of the CMMI models is the "goal."

• A goal represents a desirable end state, the achievement of which indicates that a certain degree of project and process control has been achieved.

• When a goal is unique to a single process area, it is called a "specific goal" (SG).

• In contrast, when a goal may apply across all of the process areas, it is called a "generic goal" (GG).


Specific Goals - Requirements Management

• REQM SG 1: Requirements are managed and inconsistencies with project plans and work products are identified.


Specific Goals - Project Monitoring and Control

• PMC SG 2: Corrective actions are managed to closure when the project's performance or results deviate significantly from the plan.


Specific Goals - Organizational Process Performance

• OPP SG 1: Baselines and models that characterize the expected process performance of the organization's set of standard processes are established and maintained.


Specific Goals - Causal Analysis and Resolution

• CAR SG 2: Root causes of defects and other problems are systematically addressed to prevent their future occurrence.


Expected Materials

• The only expected component of the CMMI models is the statement of a "practice."

• A practice represents the "expected" means of achieving a goal.

• Every practice in the CMMI models is mapped to exactly one goal.

• A practice is not a required component, however; a specific organization may possess demonstrated means of achieving a goal that do not rely on the performance of all the practices that are mapped to that goal.
  – That is, "alternative" practices may provide an equally useful way of reaching the goal.


Expected Materials

• When a practice is unique to a single process area, it is called a "specific practice" (SP).

• When a practice may apply across all of the process areas, it is called a "generic practice" (GP).
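To make these relationships concrete, here is a minimal Python data model; the class and field names are illustrative assumptions, not the SEI's schema. Goals are the required components, practices the expected ones, and each practice object is attached to exactly one goal, using the REQM goal and practice quoted on the following slides as data.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Practice:
    id: str            # e.g. "REQM SP 1.1-1" (specific) or "GP 2.2" (generic)
    statement: str
    generic: bool = False

@dataclass
class Goal:
    id: str            # e.g. "REQM SG 1" or "GG 2"
    statement: str
    generic: bool = False
    practices: List[Practice] = field(default_factory=list)   # expected means of achieving the goal

@dataclass
class ProcessArea:
    name: str
    goals: List[Goal] = field(default_factory=list)           # required components

reqm = ProcessArea("Requirements Management", goals=[
    Goal("REQM SG 1",
         "Requirements are managed and inconsistencies with project plans "
         "and work products are identified.",
         practices=[Practice("REQM SP 1.1-1",
                             "Develop an understanding with the requirements "
                             "providers on the meaning of the requirements.")]),
])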

Specific Practices Associated with Specific Goals

• Specific Goal – REQM SG 1: Requirements are managed and inconsistencies with project plans and work products are identified.

• Specific Practice – REQM SP 1.1-1: Develop an understanding with the requirements providers on the meaning of the requirements.

Specific Practices Associated with Specific Goals

• Specific Goal – PMC SG 2: Corrective actions are managed to closure when the project's performance or results deviate significantly from the plan.

• Specific Practice – PMC SP 2.1-1: Collect and analyze the issues and determine the corrective actions necessary to address the issues.

Specific Practices Associated with Specific Goals

• Specific Goal – OPP SG 1: Baselines and models that characterize the expected process performance of the organization's set of standard processes are established and maintained.

• Specific Practice – OPP SP 1.2-1: Establish and maintain definitions of the measures that are to be included in the organization's process performance analyses.

Specific Practices Associated with Specific Goals

• Specific Goal – CAR SG 2: Root causes of defects and other problems are systematically addressed to prevent their future occurrence.

• Specific Practice – CAR SP 2.2-1: Evaluate the effect of changes on process performance.

Informative Materials

1. Purpose
2. Introductory Notes
3. References
4. Names
5. Practice-to-Goal Relationship Table
6. Notes
7. Typical Work Products
8. Subpractices
9. Discipline Amplifications
10. Generic Practice Elaborations

Document Map

(Diagram – not reproduced.)

CMMI Representations

CMMI Model Representations

(Diagram.) In the continuous representation, capability levels 0 through 5 are applied to each process area individually, so the rating dimension is the process. In the staged representation, maturity levels ML1 through ML5 are applied to the organization as a whole.

CMMI Model Structure

(Diagram.) In both representations, each process area contains specific goals and generic goals (the required components), which are achieved through specific practices and generic practices (the expected components). In the staged representation the generic practices are organized into common features – Commitment to Perform, Ability to Perform, Directing Implementation, and Verifying Implementation – and the process areas are grouped into maturity levels. In the continuous representation the generic practices are grouped into capability levels.

Staged Models

• The term "staged" comes from the way that the model describes this road map as a series of "stages" that are called "maturity levels."

• Each maturity level has a set of process areas that indicate where an organization should focus to improve its organizational process.

• Each process area is described in terms of the practices that contribute to satisfying its goals.

• The practices describe the infrastructure and activities that contribute most to the effective implementation and institutionalization of the process areas.

• Progress occurs by satisfying the goals of all process areas in a particular maturity level


Staged Models

• The CMM for Software is the primary example of a staged model


Continuous Models

• Continuous models provide less specific guidance on the order in which improvement should be accomplished.

• They are called continuous because no discrete stages are associated with organizational maturity.

• Like the staged models, continuous models have process areas that contain practices.

• Unlike in staged models, however, the practices of a process area in a continuous model are organized in a manner that supports individual process area growth and improvement.

• Most of the practices associated with process improvement are generic; they are external to the individual process areas and apply to all process areas

Continuous Models

• The generic practices are grouped into capability levels (CLs), each of which has a definition that is roughly equivalent to the definition of the maturity levels in a staged model.

• Process areas are improved and institutionalized by implementing the generic practices in those process areas.

• In a continuous model goals are not specifically stated, which puts even more emphasis on practices.

• The collective capability levels of all process areas determine organizational improvement, and an organization can tailor a continuous model and target only certain process areas for improvement.

• In other words, they create their own "staging" of process areas.


CMMI Capability Levels

Improved Performance Through Managed Processes


Capability Profile

• In a continuous appraisal, each process area is rated at its own capability level.

• An organization will most likely have different process areas rated at different capability levels.

• The results can be reported as a capability profile.
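As a concrete, assumed illustration, a capability profile can be represented as a simple mapping from process-area acronym to capability level and rendered as a text report; the process areas and ratings below are made up for the example, not appraisal data.

from typing import Dict

# A capability profile: process area acronym -> capability level (0-5).
profile: Dict[str, int] = {
    "REQM": 3, "PP": 3, "PMC": 2, "SAM": 2,
    "MA": 3, "PPQA": 2, "CM": 3, "RSKM": 1,
}

def print_profile(profile: Dict[str, int]) -> None:
    """Render the profile as a simple text bar chart, one bar per process area."""
    for pa, level in sorted(profile.items()):
        print(f"{pa:>5} | {'#' * level}{'.' * (5 - level)} CL{level}")

print_profile(profile)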


CMMI Model Representations

• The CMMI Team was given an unprecedented challenge: Create a single model that could be viewed from two distinct perspectives, continuous and staged.

• The 25 process areas of CMMI-SE/SW/IPPD/SS are divided into four maturity levels in the staged representation and four process categories in the continuous representation.

Capability Levels, Generic Goals, and Generic Practices

• Capability Level 0 (Incomplete): no generic goal, no generic practices.
• Capability Level 1 (Performed) – Generic Goal: Achieve Specific Goals. Generic Practices: Identify work scope; Perform base practices.
• Capability Level 2 (Managed) – Generic Goal: Institutionalize a Managed Process. Generic Practices: Establish org. policy; Plan the process; Provide resources; Assign responsibility; Train people; Perform managed process; Manage configurations; Identify & involve relevant stakeholders; Monitor and control the process; Objectively verify adherence; Review status with mgmt.
• Capability Level 3 (Defined) – Generic Goal: Institutionalize a Defined Process. Generic Practices: Establish a defined process; Collect improvement information.
• Capability Level 4 (Quantitatively Managed) – Generic Goal: Institutionalize a Quantitatively Managed Process. Generic Practices: Establish quality objectives; Stabilize subprocess performance.
• Capability Level 5 (Optimizing) – Generic Goal: Institutionalize an Optimizing Process. Generic Practices: Ensure continuous process improvement; Correct common cause of problems.

Staged Groupings - Maturity Level 2 (Acronym – Process Area)

REQM Requirements Management

PP Project Planning

PMC Project Monitoring and Control

SAM Supplier Agreement Management

MA Measurement and Analysis

PPQA Process and Product Quality Assurance

CM Configuration Management

Staged Groupings - Maturity Level 4 (Acronym – Process Area)

OPP Organizational Process Performance

QPM Quantitative Project Management

Continuous Grouping – Process Management (Acronym – Process Area)

OPF Organizational Process Focus

OPD Organizational Process Definition

OT Organizational Training

OPP Organizational Process Performance

OID Organizational Innovation and Deployment


Equivalent Staging

• To reinforce the concept of one model with two views or representations, CMMI provides a mapping to move from the continuous to the staged perspective.

• For maturity levels 2 and 3, the concept is straightforward and easy to understand.

• If an organization using the continuous representation has achieved capability level 2 in the seven process areas that make up maturity level 2 (in the staged representation), then it can be said to have achieved maturity level 2
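The maturity level 2 rule just described can be expressed as a small check over a capability profile. The sketch below is an assumed Python illustration (the function name and the example profile are hypothetical); the seven process areas come from the staged grouping for maturity level 2 shown earlier.

# Process areas that make up maturity level 2 in the staged representation
# (REQM, PP, PMC, SAM, MA, PPQA, CM -- see the staged groupings above).
ML2_PROCESS_AREAS = {"REQM", "PP", "PMC", "SAM", "MA", "PPQA", "CM"}

def achieves_ml2(profile):
    """Equivalent staging check for maturity level 2: every ML2 process
    area must be rated at capability level 2 or higher in the profile."""
    return all(profile.get(pa, 0) >= 2 for pa in ML2_PROCESS_AREAS)

# Illustrative capability profile (process area -> capability level).
profile = {"REQM": 3, "PP": 2, "PMC": 2, "SAM": 2, "MA": 2, "PPQA": 2, "CM": 3}
print(achieves_ml2(profile))  # True: equivalent to maturity level 2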

Target profile 2 (diagram – not reproduced)

Target profile 5 (diagram – not reproduced)

CMMI as an Enterprise Process-Improvement Framework

Strategy: Use Process Framework Standards to Aid in Life Cycle Definition

• Systems Life Cycle
  – ISO/IEC 15288, Systems engineering — System life cycle processes
• Software Life Cycle
  – ISO/IEC 12207, Standard for Information Technology — Software life cycle processes
  – IEEE/EIA 12207.0, Industry Implementation of International Standard ISO/IEC 12207:1995 — (ISO/IEC 12207) Standard for Information Technology — Software life cycle processes
  – IEEE/EIA 12207.1, Industry Implementation of International Standard ISO/IEC 12207:1995 — (ISO/IEC 12207) Standard for Information Technology — Software life cycle processes – Life Cycle Data
  – IEEE/EIA 12207.2, Industry Implementation of International Standard ISO/IEC 12207:1995 — (ISO/IEC 12207) Standard for Information Technology — Software life cycle processes – Implementation considerations

The ISO/IEC 15288 Systems Life Cycle Process Framework

(Diagram: the system life cycle process groups.)
• Enterprise processes: Enterprise Environment Management, Investment Management, System Life Cycle Management, Resource Management, Quality Management
• Agreement processes: Acquisition, Supply
• Project processes: Project Planning, Project Assessment, Project Control, Decision Making, Risk Management, Configuration Management, Information Management
• Technical processes: Stakeholder Requirements Definition, Requirements Analysis, Architectural Design, Implementation, Integration, Verification, Transition, Validation, Operation, Maintenance, Disposal

The IEEE/EIA 12207 Software Life Cycle Process Framework

(Diagram: the software life cycle process groups.)
• Primary processes: Acquisition, Supply, Development, Operation, Maintenance
• Supporting processes: Documentation, Configuration Management, Quality Assurance, Verification, Validation, Joint Review, Audit, Problem Resolution
• Organizational processes: Management, Infrastructure, Improvement, Training
• Tailoring

Relationship between ISO/IEC 15288 and ISO/IEC 12207

(Diagram.) ISO/IEC 15288 groups its processes into enterprise processes (Enterprise Environment Management, Investment Management, System Life Cycle Processes Management, Resource Management, Quality Management), agreement processes (Acquisition, Supply), project processes (Project Planning, Project Assessment, Project Control, Configuration Management, Risk Management, Decision Making, Information Management), and technical processes (Stakeholder Requirements Definition, Requirements Analysis, Architectural Design, Implementation, Integration, Verification, Transition, Validation, Operation, Maintenance, Disposal). Within Implementation, hardware implementation, human task implementation, and software implementation are distinguished; for software implementation the diagram refers to ISO/IEC 12207.

Strategy: Use Supporting Standards as Best Practice Support for Your Defined System Life Cycle

(Diagram A1: ISO/IEC 15288 and other engineering standards.) The ISO/IEC 15288 life cycle model stages (Concept, Development, Production, Utilization, Support, Retirement) and the ISO/IEC 15288 system life cycle processes sit at the top of the hierarchy. With increasing level of detail along the life cycle progression, activity-level detail is supplied by a second standard (for example, ANSI/EIA 632) and task-level detail by a third standard (for example, IEEE 1220); further standards can supply additional activity- and task-level detail.

Strategy: Use Supporting Standards as Best Practice Support for Your Defined Software Life Cycle

CMMI SE/SW/IPPD/SS v1.1 Standards Mapping – Process Management

(Diagram: CMMI process areas/specific practices mapped to framework standards and supporting standards.)
• Framework standards:
  – ISO/IEC 15288, System Life Cycle Processes
  – IEEE 12207.0, Software Life Cycle Processes
  – IEEE 12207.1, Guide to Software Life Cycle Processes—Life Cycle Data
  – IEEE 12207.2, Guide to Software Life Cycle Processes—Implementation Considerations
• Supporting standards:
  – EIA 632, Processes for Engineering a System
  – IEEE 1220, Application and Management of the Systems Engineering Process
  – IEEE 1074, Developing Software Life Cycle Processes
  – IEEE 1517-1999, Reuse Processes

CMMI SE/SW/IPPD/SS v1.1 Standards Mapping – Project Management

(Diagram: CMMI process areas/specific practices mapped to framework standards and supporting standards.)
• Framework standards:
  – ISO/IEC 15288, System Life Cycle Processes
  – IEEE 12207.0, Software Life Cycle Processes
  – IEEE 12207.1, Guide to Software Life Cycle Processes—Life Cycle Data
  – IEEE 12207.2, Guide to Software Life Cycle Processes—Implementation Considerations
• Supporting standards:
  – IEEE 1220, Application and Management of the Systems Engineering Process
  – IEEE 1058, Software Project Management Plans
  – IEEE 1490, A Guide to the Program Management Body of Knowledge
  – IEEE 1062, Recommended Practice for Software Acquisition
  – IEEE 1540, Risk Management
  – IEEE 1028, Software Reviews

CMMI SE/SW/IPPD/SS v1.1 Standards Mapping – Engineering

(Diagram: CMMI process areas/specific practices mapped to framework standards and supporting standards.)
• Framework standards:
  – ISO/IEC 15288, System Life Cycle Processes
  – IEEE 12207.0, Software Life Cycle Processes
  – IEEE 12207.1, Guide to Software Life Cycle Processes—Life Cycle Data
  – IEEE 12207.2, Guide to Software Life Cycle Processes—Implementation Considerations
• Supporting standards:
  – IEEE 1233, Guide for Developing System Requirements Specifications
  – IEEE 1362, Guide for Concept of Operations Document
  – IEEE 1471, Architectural Description of Software Intensive Systems
  – IEEE 830, Software Requirements Specifications
  – IEEE 1016, Software Design Descriptions
  – IEEE 1012, Software Verification and Validation
  – IEEE 1008, Software Unit Testing
  – IEEE 1228, Software Safety Plans
  – IEEE 1063, Software User Documentation
  – IEEE 1219, Software Maintenance
  – IEEE 1320.1, .2, IDEF0, IDEF1X97
  – IEEE 1420.1, Data Model for Reuse Library Interoperability

CMMI SE/SW/IPPD/SS v1.1 Standards Mapping – Support

(Diagram: CMMI process areas/specific practices mapped to framework standards and supporting standards.)
• Framework standards:
  – ISO/IEC 15288, System Life Cycle Processes
  – IEEE 12207.0, Software Life Cycle Processes
  – IEEE 12207.1, Guide to Software Life Cycle Processes—Life Cycle Data
  – IEEE 12207.2, Guide to Software Life Cycle Processes—Implementation Considerations
• Supporting standards:
  – IEEE 828, Software Configuration Management Plans
  – IEEE 730, Software Quality Assurance Plans
  – IEEE 982.1, Dictionary of Measures to Produce Reliable Software
  – IEEE 1045, Software Productivity Metrics
  – IEEE 1061, Software Quality Metrics Methodology
  – IEEE 1219, Software Maintenance
  – IEEE 1465 (ISO/IEC 12119), Software Packages – Quality Requirements and Testing
  – IEEE 14143.1 (ISO/IEC 14143-1), Functional Size Measurement – Part 1: Definition of Concepts


An Example - Requirements Development

SP 2.1-1 Establish Product and Product Component Requirements

– Establish and maintain, from the customer requirements, product and product component requirements essential to product and product component effectiveness and affordability

• ISO/IEC 15288, System Life Cycle Processes

– Clause 5.5.3 - Requirements Analysis Process

• IEEE/EIA 12207.0, Software Life Cycle Processes

– Clause 5.3.2 - System Requirements Analysis

– Clause 5.3.4 - Software requirements analysis

• IEEE 1233, Guide for Developing System Requirements Specifications

• IEEE 830, Software Requirements Specifications

ISO/IEC 15288, Clause 5.5.3 – Requirements Analysis Process (excerpt):

5.5.3.1 Purpose of the Requirements Analysis Process
The purpose of the Requirements Analysis Process is to transform the stakeholder, requirement-driven view of desired services into a technical view of a required product that could deliver those services. This process builds a representation of a future system that will meet stakeholder requirements and that, as far as constraints permit, does not imply any specific implementation. It results in measurable system requirements that specify, from the developer's perspective, what characteristics it is to possess and with what magnitude in order to satisfy stakeholder requirements.

5.5.3.2 Requirements Analysis Process Outcomes
As a result of the successful implementation of the Requirements Analysis Process:
a) The required characteristics, attributes, and functional and performance requirements for a product solution are specified.
b) Constraints that will affect the architectural design of a system and the means to realize it are specified.
c) The integrity and traceability of system requirements to stakeholder requirements is achieved.
. . .

IEEE/EIA 12207.0, Clauses 5.3.2 and 5.3.4 (excerpt):

5.3.2.1 The specific intended use of the system to be developed shall be analyzed to specify system requirements. The system requirements specification shall describe: functions and capabilities of the system; business, organizational and user requirements; safety, security, human-factors engineering (ergonomics), interface, operations, and maintenance requirements; design constraints and qualification requirements. The system requirements specification shall be documented.

5.3.4.1 The developer shall establish and document software requirements, including the quality characteristics specifications, described below. . . .
a) Functional and capability specifications, including performance, physical characteristics, and environmental conditions under which the software item is to perform;
b) Interfaces external to the software item;
c) Qualification requirements;
d) Safety specifications, including those related to methods of operation and maintenance, environmental influences, and personnel injury;
e) Security specifications, including those related to compromise of sensitive information . . .

IEEE 1233, Clause 7.2 – Build a well-formed requirement (excerpt):

The analysts carry out this subphase by doing the following:
a) Ensuring that each requirement is a necessary, short, definitive statement of need (capability, constraints);
b) Defining the appropriate conditions (quantitative or qualitative measures) for each requirement and avoiding adjectives such as “resistant” or “industry wide”;
c) Avoiding requirements pitfalls (see 6.4);
d) Ensuring the readability of requirements, which entails the following:
  1) Simple words/phrases/concepts;
  2) Uniform arrangement and relationship;
  3) Definition of unique words, symbols, and notations;
  4) The use of grammatically correct language and symbology.
e) Ensuring testability.

Example:
  Capability: Move people between Los Angeles and New York
  Condition: Cruising speed of 200 km/hr
  Constraint: Maximum speed of 300 km/hr
  Well-formed requirement: This system should move people between Los Angeles and New York at an optimal cruising speed of 200 km/hr with a maximum speed of 300 km/hr.


IEEE 830, Clauses 5.3.2 and 5.3.3 (excerpt):

5.3.2 Functions
Functional requirements should define the fundamental actions that must take place in the software in accepting and processing the inputs and in processing and generating the outputs. These are generally listed as “shall” statements starting with “The system shall...”. These include:
a) Validity checks on the inputs
b) Exact sequence of operations
c) Responses to abnormal situations, including:
  1) Overflow
  2) Communication facilities
  3) Error handling and recovery
d) Effect of parameters
e) Relationship of outputs to inputs
. . .
It may be appropriate to partition the functional requirements into subfunctions or subprocesses. This does not imply that the software design will also be partitioned that way.

5.3.3 Performance requirements
This subsection should specify both the static and the dynamic numerical requirements placed on the software or on human interaction with the software as a whole. Static numerical requirements may include the following:
a) The number of terminals to be supported;
b) The number of simultaneous users to be supported;
c) Amount and type of information to be handled.


Strategy: Base Your Business Activities on Your Defined Processes

• Institutionalize your processes
  – Train your workforce in your defined processes
  – Incentivize both management and staff
• Check process performance
  – Goal–Process–Results (GPR) (diagram: a cycle of Goal, Process, Results)
    • A business-focused Plan-Do-Check-Act cycle
    • For a given set of results, either goals or processes may be modified

Strategy: Establish an Active Measurement Program

• For both processes and products:
  – Identify measurement objectives that are aligned with your business needs
  – Choose specific measures, data collection and storage mechanisms, analysis techniques, and reporting and feedback mechanisms that support your measurement objectives
  – Implement your measurement program
  – Provide decision-makers with objective results that can be used in making informed decisions
  – Periodically assess your measurement program to ensure that it is meeting current business needs
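As a minimal sketch of the alignment between objectives and measures (assumed, not prescribed by CMMI; the objective names, measure names, and data values are illustrative), the reporting and feedback step might look like this in Python:

from statistics import mean

# Measurement objectives (business needs) mapped to the measures chosen to support them.
objectives = {
    "Improve schedule predictability": ["schedule_deviation_pct"],
    "Improve delivered quality":       ["post_release_defects_per_kloc"],
}

# Collected data points per measure (for example, one value per completed project).
data = {
    "schedule_deviation_pct":        [12.0, 8.5, 15.0, 6.0],
    "post_release_defects_per_kloc": [0.9, 0.7, 1.1],
}

def report(objectives, data):
    """Feed objective-aligned averages back to decision-makers."""
    for objective, measures in objectives.items():
        print(objective)
        for m in measures:
            values = data.get(m, [])
            avg = mean(values) if values else float("nan")
            print(f"  {m}: {len(values)} data points, mean = {avg:.2f}")

report(objectives, data)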

Commercial

• Product development in the commercial environment drives revenue for the corporation:
  – Strategic planning for the corporation is driven by market pressures, with the ultimate goal of improving revenue and profitability.
  – Process discipline will enhance productivity by reducing the time to market.
  – Customer satisfaction will increase with measurable improvements in reliability and quality.
  – Process improvement activities compete for corporate resources:
    • Return on investment must be equal to or better than other opportunities.
• Examples for software process improvement: [1]
  – Development time: reduced 73%
  – Rework costs: reduced 96%
  – Average schedule length: reduced 37%
  – Post-release defects: reduced 80%
  – Return on investment: 21:1


IEEE 1220

• IEEE 1220 is a systems engineering life cycle standard.

• Assumptions in IEEE 1220 include:
  – A quality management system a la ISO 9001:1994 exists.
  – IEEE 1220 conforms to the requirements of ISO 12207.


ISO 10006

• Quality Management: Guidelines to Quality in Project Management, 1997

• This document is a supplement to ISO 9004-1, and is itself not a "normative" standard (one against which an ISO audit would be held).

• Rather, it provides interpretation of ISO 9004-1 in the context of project management practices.

• One of its primary sources was the Project Management Institute's Guide to the Project Management Body of Knowledge, 1996 version.


ISO 9001:2000

• This document is focused on organizational level infrastructure to support implementation of an overall quality management system, but does have some project and engineering-related requirements.

CMMI transition project: Schedule

• Consider the following guidelines:
  – Making the transition to CMMI is like implementing other cultural changes in an organization: the further the starting point on the journey, the longer the trip will take.
  – Based on SEI measurements on implementing the CMM for Software, organizations with little existing infrastructure or experience with process improvement projects typically require 18 to 24 months to reach maturity level 2.
    • Reaching maturity level 3 typically requires an additional 12 months.
  – Organizations at higher maturity or capability levels in software and systems engineering will probably require one year, depending upon their current measurement programs and the number of new processes that need to be developed and rolled out.
    • Typically, new processes take at least six months to be rolled out in an organization.
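A rough, assumed interpretation of these planning figures as a small calculator (the function name and the way roll-out time is added are illustrative, not SEI guidance):

def transition_estimate_months(current_ml, target_ml, new_processes=0):
    """Rough planning figures from the guidelines above: roughly 18-24 months
    to reach ML2 from a low starting point, about 12 more months to ML3,
    plus at least 6 months of roll-out when new processes are introduced."""
    low, high = 0, 0
    if current_ml < 2 <= target_ml:
        low, high = low + 18, high + 24
    if current_ml < 3 <= target_ml:
        low, high = low + 12, high + 12
    rollout = 6 if new_processes > 0 else 0
    return low + rollout, high + rollout

print(transition_estimate_months(1, 3, new_processes=5))  # (36, 42)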