DevOps Maturity Model
IBM Rational
03/10/2014
© 2013 IBM Corporation
What Are Our Goals?

• The ultimate goal is to continuously improve our tools and processes to deliver a quality product to our customers faster.
• Measuring that continuous improvement is important to show we are focusing on the areas that provide the best ROI.
• How do we plan to measure?
– DevOps Maturity Model Self Assessment
– DevOps Outcome Based Self Measurement
• How often do we plan on doing the self assessments?
– The current plan is once per quarter
• What will be done with the measurements?
– Identify tools and processes that can help improve DevOps maturity
– Measure continuous improvement through the metrics collected
– Share results with the executive teams
DevOps Principles and Values

• Develop and test against a production-like system
• Iterative and frequent deployments using repeatable and reliable processes
• Continuously monitor and validate operational quality characteristics
• Amplify feedback loops

[Slide diagrams: the practice-based maturity matrix (practices marked Fully Achieved or Partially Achieved), the "Plan & Measure" self-assessment questions, the outcome-based metrics questionnaire, and the STG DevOps Proof of Concept architecture; each appears in full on a later slide.]
Where do you start? DevOps improvement adoption

Assess and define outcomes and supporting practices to drive strategy and roll-out.

Step 1 (Business Goal Determination): What am I trying to achieve?
• Think through the business-level drivers for improvement
• Define measurable goals for your organizational investment
• Look across silos and include key Dev and Ops stakeholders

Step 2 (Current Practice Assessment): Where am I currently?
• What do you measure and currently achieve?
• What don't you measure, but should in order to improve?
• Which practices are difficult, incubating, or well scaled?
• Do your team members agree with these findings?

Step 3 (Objective & Prioritized Capabilities): What are my priorities?
• Start from where you are today and work toward your improvement goals
• Consider changes to people, practices, and technology
• Prioritize change using goals, complexities, and dependencies

Step 4 (Roadmap): How should my practices improve?
• Understand your appetite for cross-functional change
• Target improvements that give the best bang for the buck
• Build and agree on an actionable roadmap
• Use measurable milestones that include early wins
[Diagram: practice-based DevOps maturity matrix, with each practice marked Fully Achieved or Partially Achieved against the goals.]
Outcome Based Metrics

1) Which of the following best describes the state of your code at the end of each iteration?
* Ready for testing
* Partially tested; ready for additional integration, performance, security, and/or other testing
* Fully tested, documented, and ready for production delivery or GA release (modulo translation work or legal approvals)

2) How quickly can your team pivot (complete a feature in progress and start working on a newly arrived, high-priority feature)?
* 3 months or longer
* less than 3 months
* less than one month
* less than one week

3) How quickly are you able to change a line of code and deliver it to customers as part of a fully tested, non-fixpack release?
* 12 months or longer
* less than 6 months
* less than 3 months
* less than one month
* less than one week
* less than one day

4) What is the cost (in person-hours) of executing a full functional regression test?
<enter value>

5) How long does it take for developers to find out that they have committed a source code change that breaks a critical function?
* One week or longer
* 1 to 7 days
* 12 to 24 hours
* 3 to 12 hours
* 1 to 3 hours
* less than one hour

6) Which of the following best describes the state of your deployment automation for the environments used for testing?
* We have no deployment automation; setting up our test environments is entirely manual.
* We have some deployment automation, but manual intervention is typically required (for example, to provision machines, set up dependencies, or to complete the process).
* We create fully configured test environments from scratch and reliably deploy into those environments without manual intervention.
* We create fully configured production-congruent environments from scratch and reliably deploy into those environments without manual intervention.
Outcome Based Metrics (continued)

7) (If your product is SaaS) Which of the following best describes the state of your deployment automation for your staging and production environments?
* We have no deployment automation; setting up our staging and production environments is entirely manual.
* We have some deployment automation, but manual intervention is typically required (for example, to provision machines, set up dependencies, or to complete the process).
* We create fully configured staging and production environments from scratch and reliably deploy into those environments without manual intervention.

8) (If your product is SaaS) Are you able to make business decisions based on data provided by infrastructure, application, and customer experience monitoring?
* yes
* no

9) (If your product is SaaS) How much downtime is generally required to deploy a new version into production?
* 4 hours or longer
* 1-4 hours
* less than 1 hour
* No downtime is needed

10) (If your product is SaaS) How often do problems occur when deploying a new version into production?
* Problems always occur
* Problems occur about 50% of the time
* Problems occur about 25% of the time
* Problems occur about 10% of the time
* Problems are rare
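These answers are categorical, but tracking quarterly movement is easier if each option is mapped to a position on an ordinal scale. A minimal sketch of that idea in Python; the question keys, the 0-to-1 normalization, and the averaging are illustrative assumptions, not part of the assessment itself:

```python
# Minimal sketch: score the outcome-based self-assessment so quarterly
# results can be compared. Question keys, option orderings, and the
# 0..1 normalization are illustrative assumptions, not IBM's scoring.

# Options listed worst-to-best; an answer's index is its raw score.
OPTIONS = {
    "q2_pivot_time": [
        "3 months or longer", "less than 3 months",
        "less than one month", "less than one week",
    ],
    "q5_break_detection": [
        "One week or longer", "1 to 7 days", "12 to 24 hours",
        "3 to 12 hours", "1 to 3 hours", "less than one hour",
    ],
}

def normalized_score(question, answer):
    """Return the answer's position on a 0..1 scale (1 = most mature)."""
    options = OPTIONS[question]
    return options.index(answer) / (len(options) - 1)

def assessment_score(answers):
    """Average normalized score across all answered questions."""
    scores = [normalized_score(q, a) for q, a in answers.items()]
    return sum(scores) / len(scores)

q1_answers = {"q2_pivot_time": "less than one month",
              "q5_break_detection": "12 to 24 hours"}
print(round(assessment_score(q1_answers), 2))  # 0.53
```

Storing one such score per project per quarter would give a simple trend line to share with the executive teams alongside the raw answers.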
Initial Rollout

| Project | Contacts | DevOps Overview & Maturity Model Self Assessment | Analyze Results | Roadmap for Goals |
| Ice Castle | Marla Berg, John Beveridge, Lafonda Richburg | 01/30/2014 | 03/07/2014 | 03/21/2014 |
| Platform Resource Scheduler | Anumpa, Jason Adelman, Shen Wu | 01/30/2014 | 03/06/2014 | 03/21/2014 |
| IBM Cloud OpenStack Platform | Christine Grev, Gary Palmersheim | 01/30/2014 | 03/06/2014 | 03/21/2014 |
| GPFS | Bonnie Pulver, Lyle Gayne, Steve Duersch, Yuri L Volobuev | 02/19/2014 | 02/21/2014 | 03/07/2014 |
| Pinehurst | Sue Townsend | 01/30/2014 | | |
| Power FW | Atit | 02/19/2014 | | |
| AIX | Atit | 01/30/2014 | | |
| PowerVC | (Thomas) | | | |
| Platform LSF | (Akthar) | | | |
| Platform Symphony | (Akthar) | | | |
| MCP | (Frye) | | | |
| PowerKVM | (Frye) | | | |
| Power Linux? | (Frye) | | | |
Sample Results: DevOps Maturity Model Self Assessment Results and Outcome Based Metrics
Sample Details
Plan & Measure: Reliable
Sample Details
Develop & Test: Practiced
Sample Details
Release & Deploy: Practiced
Sample Details
Monitor & Optimize: Practiced
DevOps Outcome Metrics

| DevOps Outcome Metric | Sample1 | Sample2 |
| 1. State of code at end of iteration | Partially tested | Partially tested |
| 2. How quickly the team can pivot to a new high priority | Less than 1 month | Less than 3 months |
| 3. How long from line-of-code change to tested release | Less than 1 month | Less than 3 months |
| 4. Person-hours for a full functional regression | 280 hours (7 PW) | If 1-2 weeks, what PW? |
| 5. Developer time to knowing a critical function is broken | 1-3 hours | 12-24 hours |
| 6. State of deployment automation | Fully configured test environments created from scratch; reliable deployment without manual intervention | Some deployment automation, but manual intervention typically required |
| 7. If SaaS, deployment automation for staging | N/A | N/A |
| 8. If SaaS, decisions based on monitoring? | N/A | N/A |
| 9. If SaaS, downtime required to deploy to production | N/A | N/A |
| 10. If SaaS, problems on production deployments | N/A | N/A |
Sample Maturity Model Assessment

[Matrix: the practice-based maturity grid, levels Practiced, Repeatable, Reliable, and Scaled across the adoption paths Plan / Measure, Development / Test, Release / Deploy, and Monitor / Optimize, with each practice marked Fully Achieved or Partially Achieved against the goal. This sample also assesses two additional practices: automated test environment deployment, and running unattended test automation / regression.]
Sample Maturity Model Assessment

[Matrix: the same practice-based maturity grid as the previous slide, annotated with "focus across" and "focus up" arrows. GOALS: Where is the best result?]
Goal Discussion: Planning for Initiative #1

| Pain Point | Improvement Value | Effort Required | Priority | Next Step |
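One lightweight way to fill in the Priority column is a value-to-effort ratio over the collected pain points. A sketch of that calculation; the pain points, the 1-5 scales, and the ratio itself are illustrative assumptions a team may well weight differently:

```python
# Sketch: rank candidate improvements from the goal-discussion table by
# a value-to-effort ratio. Pain points and 1-5 scales are invented.

def priority(improvement_value, effort_required):
    """Higher value and lower effort -> higher priority (both on 1-5)."""
    return improvement_value / effort_required

candidates = [
    # (pain point, improvement value 1-5, effort required 1-5)
    ("Manual test environment setup", 5, 3),
    ("No automated regression suite", 4, 4),
    ("Slow break detection", 4, 2),
]

ranked = sorted(candidates, key=lambda c: priority(c[1], c[2]), reverse=True)
for pain, value, effort in ranked:
    print(f"{pain}: priority {priority(value, effort):.2f}")
```

Dependencies and goal alignment (Step 4 of the adoption approach) would still be applied on top of the raw ranking.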
DevOps Proof of Concept Investigation

[Architecture diagram. Lifecycle flow: Customer Interaction (RFE / SCE, Service Management Connect content, downloads, feedback) feeding Agile Development, then Continuous Integration, Continuous Build, Continuous Test, Continuous Deployment, and Continuous Feedback. Tooling: Rational Team Concert (Task and Change Record work items, Jazz SCM, Jazz Build Engine), Rational Focal Point, Rational UrbanCode (UrbanCode Deploy?), and AppScan for security. Hosted environment: a Development VM (web browser, RTC Web and Eclipse clients, Focal Point client), a Builder VM (RTC Build Engine client and agent, compilers, compile pool resource, AppScan, build resources), driver VMs built from driver images in an image catalog, and a test environment with test resources and a debug environment.]
Introduction to Practice Based Maturity Model

[Matrix: practices plotted by maturity level, from Practiced up through Repeatable, Reliable, and Scaled, across the four adoption paths Plan / Measure, Development / Test, Release / Deploy, and Monitor / Optimize. Practices range from "document objectives locally", "schedule SCM integrations and automated builds", "plan and manage releases", and "monitor resources consistently" at the Practiced level up to "define release with business objectives", "test continuously", "automate patterns-based provision and deploy", and "optimize to customer KPIs continuously" at the Scaled level.]
Maturity Levels Defined

Specific maturity levels are defined by how well an organization can perform practices. The levels look at consistency, standardization, usage models, defined practices, mentor teams or a center of excellence, automation, continuous improvement, and organizational or technical change management.

Practiced: Some teams exercise activities associated with the practice, inconsistently. No enterprise standards are defined. Automation may be in place, but without consistent usage models.

Repeatable (Consistent): Enterprise standards for the practice are defined. Some teams exercise activities associated with the practice and follow the standards. No core team or COE exists to assist with practice adoption. Automation, if used, follows enterprise standards.

Reliable: Mechanisms exist to assist adoption and ensure that standards are being followed. A core team of mentors is available to assist in adoption.

Scaled: Institutionalized adoption across the enterprise. The COE is a mature and integral part of continuous improvement and enablement. Practices are mainstreamed across the enterprise. A feedback process is in place to improve the standards.
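These level definitions suggest a simple representation for self-assessment data: practices grouped by level, with a derived "level reached". The sketch below assumes, for simplicity, that a level counts as reached only when every practice at it (and below) is fully achieved; the deck itself also tracks partial achievement separately:

```python
# Sketch: represent the four maturity levels and compute the level an
# adoption path has reached. "Reached" here means all practices at that
# level and every lower level are fully achieved -- a simplifying
# assumption, not the deck's scoring rule.

LEVELS = ["Practiced", "Repeatable", "Reliable", "Scaled"]

def achieved_level(area):
    """area maps each level name to {practice: fully_achieved?}."""
    reached = None
    for level in LEVELS:  # levels build on one another
        practices = area.get(level, {})
        if practices and all(practices.values()):
            reached = level
        else:
            break
    return reached

plan_measure = {
    "Practiced": {"Document objectives locally": True,
                  "Manage department resources": True},
    "Repeatable": {"Link objectives to releases": True,
                   "Centralize Requirements Management": True,
                   "Measure to project metrics": False},  # partial
}
print(achieved_level(plan_measure))  # Practiced
```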
Plan/Measure

At the practiced level, organizations capture business cases or goals in documents for each project to define scope within the strategy, but resourcing for projects is managed at the department level. Once projects are executed, change decisions and scope are managed within the context of the project or program to achieve goals within budget and time. As organizations mature, business needs are documented within the context of the enterprise and measured against customer value metrics. Those needs are then prioritized, aligned to releases, and linked to program or project requirements. Project change decisions and scope are managed at the portfolio level.
Development/Test

At the practiced level, project and program teams produce multiple software development lifecycle artifacts in the form of documents and spreadsheets to explain their requirements, designs, and test plans. Code changes and application-level builds are performed on a formal, periodic schedule to ensure sufficient resources are available to overcome challenges. Testing, except at the unit level, is performed following a formal delivery of the application build to the QA team, after most if not all construction is completed. As organizations mature, software development lifecycle information is linked at the object level to improve collaboration within the context of specific tasks and information. This provides the basis for development intelligence used to continuously assess the impact of process or technology improvements. A centralized testing organization and service provides support across applications and projects, continuously running regression and higher-level automated tests, provided infrastructure and application deployment can support them. Software delivery, integration, and builds with code scans and unit testing are performed routinely and continuously for individual developers, teams, applications, and products.
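The continuous delivery-and-build loop described above can be sketched as a gate that runs on every delivered change; the step names and change representation here are illustrative, not the deck's tooling:

```python
# Sketch of a continuous-integration gate: every delivered change
# triggers a build, unit tests, and a code scan, so the team learns of
# a break immediately instead of after a scheduled integration. A real
# pipeline (e.g. the Jazz Build Engine mentioned in this deck) would
# wire these steps to SCM delivery events.

def run_pipeline(change, steps):
    """Run each gate in order; stop at the first failure."""
    for name, check in steps:
        if not check(change):
            return f"FAILED at {name}"
    return "PASSED"

STEPS = [
    ("build", lambda c: c["compiles"]),
    ("unit tests", lambda c: c["tests_pass"]),
    ("code scan", lambda c: c["scan_clean"]),
]

good = {"compiles": True, "tests_pass": True, "scan_clean": True}
bad = {"compiles": True, "tests_pass": False, "scan_clean": True}
print(run_pipeline(good, STEPS))  # PASSED
print(run_pipeline(bad, STEPS))   # FAILED at unit tests
```

Failing fast at the first gate keeps the feedback window inside the "less than one hour" band the outcome metrics aim for.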
Release/Deploy

At the practiced level, releases are planned annually for new features and maintenance. Critical repairs and off-cycle releases emerge as needed. All are managed in a spreadsheet updated through face-to-face meetings. Impact analysis of changes is performed manually as events occur. Application deployments and middleware configurations are performed consistently across departments using manual, or manually staged and initiated, scripts. Infrastructure and middleware are provisioned similarly. As organizations mature, releases are managed centrally in a collaborative environment that leverages automation to maintain the status of individual applications. Deployments and middleware configurations are automated and then move to self-service, providing individual developers, teams, testers, and deployment managers with the capability to build, provision, deploy, test, and promote continuously. Infrastructure and middleware provisioning evolves to an automated and then self-service capability similar to application deployment. Operations engineers move to changing automation code and re-deploying, rather than making manual or scripted changes to existing environments.
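The shift the last sentence describes, changing automation code and re-deploying instead of hand-editing environments, can be sketched as a convergence step over a desired-state definition. The config keys and values here are invented for illustration:

```python
# Sketch of "change the automation, not the environment": the desired
# middleware configuration lives in code, and deployment converges the
# environment to it. Keys and values are illustrative assumptions.

DESIRED_CONFIG = {
    "app_version": "2.1.0",
    "heap_size_mb": 2048,
    "log_level": "INFO",
}

def converge(current):
    """Return the settings that must change to match the desired state.

    A real tool (e.g. UrbanCode Deploy, as the deck suggests) would
    apply these changes; here we only compute the diff, so repeated
    runs on a converged environment change nothing (idempotence).
    """
    return {k: v for k, v in DESIRED_CONFIG.items() if current.get(k) != v}

env = {"app_version": "2.0.0", "heap_size_mb": 2048}
changes = converge(env)
env.update(changes)
print(changes)        # only the drifted settings
print(converge(env))  # second run is a no-op
```

Because re-running the automation is a no-op on a converged environment, the same code path serves initial provisioning, drift repair, and promotion between environments.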
Monitor/Optimize

At the practiced level, deployed resources are monitored, and events or issues are addressed as they occur without the context of the affected business application. Dev and Ops coordination is usually informal and event driven. Feedback on user experience with business applications is gathered through formalized defect programs. As organizations mature, monitoring is performed within the context of business applications, and optimization begins in QA environments to improve stability, availability, and overall performance. Customer experience is monitored to optimize experiences within business applications. Optimization to customer KPIs is part of the continuous improvement program.
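Monitoring "within the context of business applications" amounts to enriching raw resource events with application metadata before incident routing. A sketch, with invented hosts and an assumed escalation rule for customer-facing applications:

```python
# Sketch: attach business-application context to a resource event so
# centralized incident handling can route and prioritize it. The host
# names, metadata, and severity rule are illustrative assumptions.

APP_CONTEXT = {
    "db01": {"application": "Order Processing", "customer_facing": True},
    "build7": {"application": "CI Build Farm", "customer_facing": False},
}

def enrich(event):
    """Add application context; customer-facing apps escalate severity."""
    ctx = APP_CONTEXT.get(event["host"], {})
    enriched = {**event, **ctx}
    if ctx.get("customer_facing") and event["severity"] == "warning":
        enriched["severity"] = "critical"
    return enriched

alert = enrich({"host": "db01", "metric": "disk_free_pct",
                "value": 4, "severity": "warning"})
print(alert["application"], alert["severity"])
```

The same enrichment is what lets a centralized event-notification service distinguish a disk warning on a build machine from one under a customer-facing application.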
Sample: Practice-Based Maturity Model: Maturity Goals for an Initiative

[Matrix: the practice-based maturity grid (Practiced, Repeatable, Reliable, and Scaled across Plan / Measure, Development / Test, Release / Deploy, and Monitor / Optimize), with each practice marked Fully Achieved or Partially Achieved against the maturity goals set for the initiative.]