Page 1

PM Challenge 2012

Evolve and Excel

Focus on the Objective

Right Sizing Your Project/Program

Design Review Plan

PM Challenge 2012

Charles Armstrong
National Aeronautics and Space Administration
Orion VIO SE&I Plans & Processes Manager
[email protected] | 281-244-6315

Tony Byers
Lockheed Martin Space Systems Company
Software Systems & Data Sr. Manager
[email protected] | 303-977-4389

Page 2

Agenda

• Introduction

• Program/Project Design Review Approach
  • Planning
  • Technical & Peer Reviews
  • Component Design Reviews
  • Subsystem Design Reviews
  • System Level Reviews
  • Boards & Panels

• Summary
  • Affordability
  • Time-line Example
  • Conclusion

• Closing Remarks

Page 3

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Introduction

Page 4

Introduction

• Presentation Focus:
  • Historically, NASA major milestone reviews have been conducted by allowing participation by a large number of agency personnel, who:
    • Review a wide range of materials
    • Write a large number of candidate issues in the form of RIDs
  • The historical approach is very expensive:
    • Significant infrastructure and logistics
    • Large effort to review, disposition, and close

• Using the lessons learned from previous Programs/Projects, a streamlined, integrated approach to conducting major design reviews is proposed. The proposed approach:
  • Recognizes ongoing insight/oversight as part of the design review plan
  • Expands the use of lower level design reviews (design products / documentation Tech/Peer Reviews, Component Reviews, and Subsystem Reviews)
  • Uses highly qualified personnel – limits participation to individuals with an identified ability to make a value-added contribution
  • Changes the way issues are identified and vetted

If successful, this methodology will reduce the cost of conducting major milestone reviews and set a new standard for such reviews within the agency.

Page 5

Management Challenges

• Planning Challenges
  – Develop and execute a program/project technical life-cycle review (SRR, SDR, PDR, CDR) process that ensures NASA's next generation products can meet all mission objectives and satisfy requirements with acceptable risk
  – Develop a multi-tiered approach that ensures the integrated system (components, subsystems, system) is reviewed and that the design approach is at a sufficient level of maturity to proceed to the next life-cycle phase
  – Maintain an eye on affordability
  – Balance thoroughness and schedule. The desire to review every aspect of the design down to the lowest possible level must be balanced against the available time to execute the review, so as not to delay development activities unnecessarily.
  – Balance inclusion and efficiency. Develop a process that ensures all stakeholders have adequate time to participate and provide input, helping to shape the design, ensure mission success, and ultimately provide a safe crew exploration vehicle.
Page 6

Management Challenges

• Execution Challenges
  – Communicate expectations to an often diverse organization consisting of NASA, prime contractor, and many support contractor personnel
  – Balance the desire to improve the design with the need to efficiently transition to the next life-cycle phase
  – From the time a baseline is struck, a lot of potential energy develops to revisit the requirements (did we miss something?) and improve the design (could it be better?)
    • This energy must be managed to ensure that identified issues and design refinements are necessary for mission success (including crew safety for human spaceflight) and not simply to resolve an "I would have done it differently" comment
  – Review products must align as closely as possible to the baseline
  – Monitor and control process execution throughout the duration of the effort
Page 7

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Program/Project Design Review Approach

• Planning

• Technical & Peer Reviews

• Component Design Reviews

• Subsystem Design Reviews

• System Level Reviews

• Boards & Panels

Page 8

Design Review Planning

• Form a review integration team
  – Identify and assign a manager to lead the major design review (NASA & Prime Contractor)
  – Identify a lead from each IPT
  – Identify key participants from other groups (Planning, Configuration & Data Management, Subcontracts, etc.)
  – Ensure joint NASA/Prime Contractor support in each area, where possible

• Define review
  – Review NASA / Industry review guidance
  – Review program documentation
    • Statement of Objectives, Statement of Work, Systems Engineering Management Plan, etc.
  – Define review scope
    • Baselines (Technical, Schedule, & Cost)
    • Level of penetration (Component, Subsystem, System)
  – Define review objectives
  – Define Entry and Success Criteria

Be cognizant of the applicable program / project phase

Page 9

Design Review Planning

• Develop the detailed review process
  – Include how each review is supposed to be executed
  – Include how issues are recorded, communicated, tracked, and closed

• Increase formality as you increase integration
  – Technical / Peer reviews – lower formality
  – Subsystem & System level reviews – increased formality (RIDs, etc.)

• Ensure open issues can move from review to review to minimize duplication (see the sketch below)
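A minimal sketch of the kind of issue record that supports this carry-forward (hypothetical Python — not part of any NASA/Orion toolset; the names and fields are illustrative only):

```python
from dataclasses import dataclass, field

@dataclass
class ReviewIssue:
    """One issue record that travels across review tiers instead of
    being rewritten as a new RID at each level."""
    issue_id: str
    title: str
    originating_review: str              # e.g. a Tech/Peer review
    current_review: str                  # tier where it is being worked
    status: str = "Open"                 # Open / Dispositioned / Closed
    history: list = field(default_factory=list)

    def escalate(self, next_review: str) -> None:
        """Carry the still-open issue forward to the next review tier."""
        self.history.append((self.current_review, self.status))
        self.current_review = next_review

# Hypothetical usage: an issue found at a Tech Review is carried into
# the Subsystem PDR rather than duplicated there.
issue = ReviewIssue("TR-0042", "Undefined thermal interface",
                    originating_review="Thermal DDB Tech Review",
                    current_review="Thermal DDB Tech Review")
issue.escalate("Thermal Subsystem PDR")
```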

• Define Review Participants
  – Determine Chair(s)
  – Define roles and responsibilities (Chairs, Reviewers, Program/Project Team, etc.)
  – Include Program/Project leadership
  – Include external stakeholders in component level reviews (other interfacing components, subsystems, etc.)
  – Include external stakeholders in Subsystem and System level reviews (Engineering, Safety, Crew Office, Mission Operations, other NASA Centers, etc.)
  – Include independent reviewers (especially important as you progress to subsystem and system level reviews, where SRB participation is expected)

Page 10

Design Review Planning

• Develop a detailed schedule
  – Include all lower level reviews that are determined to be part of the overall major milestone design review process
    • Example: Orion included technical / peer reviews for PDR products, Subsystem Design Reviews, and System Level Reviews
  – Include all products that are determined to make up the review data package
    • Example: Requirements, Plans, Drawings, CAD Models, Design Data Books, etc.
  – Incorporate dependencies where possible (see the sketch after this list)
    • Example: Component PDRs required to be completed, and products needing to be developed, prior to a subsystem design review
  – Ensure owners of all products and reviews are identified (NASA & Contractor)
  – Develop a process to ensure the schedule is updated / managed
  – Develop the list of metrics / reports that will be utilized to track performance
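As a sketch of the dependency idea (hypothetical Python; the review and product names are illustrative, not Orion's actual plan), each schedule entry can simply list its prerequisites, and a readiness check flags anything incomplete:

```python
# Each subsystem review lists the component reviews and data products
# that must be complete before it can convene.
prerequisites = {
    "Propulsion Subsystem PDR": ["Thruster Component PDR",
                                 "Tank Component PDR",
                                 "Propulsion Design Data Book"],
}

completed = {"Thruster Component PDR", "Propulsion Design Data Book"}

for review, deps in prerequisites.items():
    missing = [d for d in deps if d not in completed]
    if missing:
        print(f"{review} is blocked by: {', '.join(missing)}")
    else:
        print(f"{review} is ready to convene")
```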

Page 11

Design Review Planning

• Document the plan
  – Formalize the overall review process
  – All significant stakeholder organizations participate in a review of the document prior to release
    • Ensures leadership has agreed to the plan
  – Document the requirements and associated documentation baseline for review
  – Incorporate NPR 7123.1A criteria into the plan (including any tailoring)
  – Serves as the primary method of documenting all agreements regarding the method to meet the review criteria
  – Place under configuration control
  – Use as a reference throughout execution to ensure the process is adhered to

[Figure: example review plan document]

Page 12

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Program/Project Design Review Approach

• Planning

• Technical / Peer Reviews

• Component Design Reviews

• Subsystem Design Reviews

• System Level Reviews

• Boards & Panels

Page 13

Technical / Peer Reviews

• A significant amount of oversight occurs throughout the life-cycle of the project
  – This interaction serves to familiarize the reviewers with the lower level details and provides an opportunity to identify and resolve issues early in the review process

• Technical / Peer reviews provide an opportunity to perform technical assessment and to focus content for the next tier of reviews (Component, Subsystem, & System PDRs/CDRs)

• Technical / Peer Review Purpose / Objectives
  – NASA and Contractor Subject Matter Experts (SMEs) assess the technical progress and quality of design related products (could be a DRD or parts of a DRD)
  – Work to identify and solve design, document, and process issues
  – Ensure the product accurately represents the appropriate baseline configuration / architecture
  – Ensure the product fulfills the DRD description provided by NASA
  – Provides a joint NASA/Contractor forum with intra-subsystem and cross-subsystem participation
  – Serves as a foundational review supporting follow-on reviews that culminate in a successful design life-cycle milestone review
  – Iterative when required

Tech Reviews are "roll up the sleeves" reviews of the products that support or represent the design

Page 14

Technical / Peer Reviews

• Key Lessons Learned
  – Enhance Technical Reviews of the design work products and ensure correct participation
    • Consistency of the technical review process is important – tracking of issues / conduct of review
  – Identify the tech reviews of most importance – e.g., Design Data Books provide an opportunity to review the design at the subsystem level prior to a Subsystem PDR/CDR
  – Provide TR participants the data product(s) 5 to 10 days prior to the review
  – Ensure review dates are documented in an easily accessible integrated schedule
    • Work to deconflict schedules where possible to ensure appropriate participation
  – Identify the minimum required stakeholders for a successful TR
    • TRs must be given high priority by each appropriate organization
      – Be willing to reschedule the review if the right personnel cannot attend
    • Appropriate SMEs are expected to support DRD TRs
  – For all contractor developed DRDs, assign a NASA POC to ensure stakeholder selection is correct and the work product receives adequate review
    • Official delivery to NASA occurs after the TR/PR

Orion conducted 316 Technical / Peer Reviews for 224 PDR Data Package Products. Reviews were successful in identifying and resolving many issues. However, open review findings were not rolled up to higher level reviews, creating significant duplication of actions / RIDs in Subsystem and System level reviews.

Page 15

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Program/Project Design Review Approach

• Planning

• Technical & Peer Reviews

• Component Design Reviews

• Subsystem Design Reviews

• System Level Reviews

• Boards & Panels

Page 16

Component Design Reviews

• Component Design Reviews provide stakeholders the opportunity to review component designs and associated interfaces and to ensure the component is on track to transition to product implementation

• Component Design Review Purpose / Objectives
  – Ensures products provide objective evidence that the component design meets all requirements with acceptable risk and within cost and schedule constraints
  – Serves as a face-to-face open forum with NASA/Contractor SMEs, Subsystem Leads, and other stakeholders to raise issues/concerns, answer questions, or clarify design details
  – Review open liens and forward plans
  – Review analysis updates, based on continuous refinement of element designs
  – *Review design against functional and performance specifications
  – *Evaluate technical adequacy / maturity of design
  – Ensure interfaces have been identified and defined
  – *Review the verification and test approach
  – Assess the results of producibility analyses conducted on system hardware
  – Identify and address high risk elements of the design
  – *Review the readiness to proceed to fabrication, assembly, integration, and test

* Maturity expectations based on what is needed to proceed to the next life-cycle phase

Page 17

Component Design Reviews

• Key Lessons Learned
  – Ensure component level reviews are incorporated into your larger design review plan
    • Identify what component level reviews must be complete prior to the next level of assembly design review (i.e., Subsystem) – ~90% is ideal
    • Include data products in the plan/schedule
  – Develop a consistent approach and expectations for conducting each component level review
    • This includes minimum common entrance and success criteria
      – Leads can then add to the criteria to further define expectations
    • Identify a common set of participants (typically systems engineering) that can participate in similar component reviews to ensure consistency
  – Conduct the review where the work will be completed

Properly executed component level reviews can significantly reduce the effort required to address identified issues. If issues are not found until later reviews (Subsystem or System), impacts may reverse lower level review decisions and cause rework.

Page 18

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Program/Project Design Review Approach

• Planning

• Technical & Peer Reviews

• Component Design Reviews

• Subsystem Design Reviews

• System Level Reviews

• Boards & Panels

Page 19

Subsystem Design Reviews

• Subsystem Design Reviews provide stakeholders the opportunity to review subsystem and associated component designs, including GFE and ADP as appropriate

• Subsystem Design Review Purpose / Objectives
  – Ensures products provide objective evidence that the subsystem design meets all requirements with acceptable risk and within cost and schedule constraints
  – Review and drill into the detailed design that is documented in the technical design products for each subsystem/component
  – Serves as a face-to-face open forum with NASA/Contractor SMEs, Subsystem Leads, and other stakeholders to raise issues/concerns, answer questions, or clarify design details
  – Review open liens and forward plans for each Subsystem
  – Provides a forum for subsystems to horizontally integrate
  – *Review design against functional and performance specifications
  – *Evaluate technical adequacy / maturity of design
    • Ensure interfaces have been identified and defined
  – *Review the verification and test approach
  – *Review the readiness to proceed to fabrication, assembly, integration, and test
  – *Review the operational limits and constraints on the system
  – Review associated risks and risk management strategies

Using the entrance and success criteria, the Subsystem Design Review (PDR/CDR) establishes the basis for proceeding to the system level review (PDR/CDR)

* Maturity expectations based on what is needed to proceed to the next life-cycle phase

Page 20

Subsystem Design Reviews

• Key Lessons Learned
  – Improve the efficiency and effectiveness of the reviews
    • Ensure adequate time is given during each Subsystem Design Review to display/discuss the design
      – Implement Interactive Reading Rooms (IRRs) – drawings and design documentation available in electronic and select hard copy form
        » Ensure SMEs are available in the IRRs to answer/discuss design questions
      – Minimize "all PowerPoint" reviews – review the design material / documentation
  – General improvements to process
    • Allocate time to caucus after the review
      » Review identified issues and ensure POCs / preliminary closure plans are established for all actions
    • Stack / overlap reviews where possible
      – Be cognizant of resources that need to support more than one review – deconflict where possible
      – No more than 3-4 Subsystem Design Reviews in parallel

Orion conducted 18 PDR Subsystem Design Reviews over 5 weeks, with 3 to 4 SSDRs conducted in parallel. Overall this approach was highly successful and minimized the time span of the review. Reviews were held in a common location, and review participants could move between reviews if necessary. Daily caucuses were held at the end of each day, often resulting in 12 to 14 hour days. The future approach will shorten the review day to allow caucuses to occur within an 8 or 9 hour day. Post review days will be allocated to caucus completion to ensure actions are properly dispositioned.
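The parallelism constraint is easy to sanity-check when laying out the schedule. A toy sketch (hypothetical Python; Orion's actual scheduling was resource-driven, not a simple chunking) shows why 18 SSDRs at no more than 4 in parallel need about 5 weekly waves:

```python
MAX_PARALLEL = 4                                   # per the lesson learned
ssdrs = [f"SSDR-{i:02d}" for i in range(1, 19)]    # 18 subsystem reviews

# Chunk the reviews into waves of at most MAX_PARALLEL each.
waves = [ssdrs[i:i + MAX_PARALLEL]
         for i in range(0, len(ssdrs), MAX_PARALLEL)]

for week, wave in enumerate(waves, start=1):       # 5 waves -> ~5 weeks
    print(f"Week {week}: {', '.join(wave)}")
```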

Page 21

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Program/Project Design Review Approach

• Planning

• Technical & Peer Reviews

• Component Design Reviews

• Subsystem Design Reviews

• System Level Reviews

• Boards & Panels

Page 22

System Level Design Reviews

• System level reviews provide an opportunity to look across the system while emphasizing areas needing additional focus
  – Reviews may be conducted separately, similar to the subsystem reviews, or as one larger integrated review

• *Orion's "DRAFT" CDR plan conducts three focused system level reviews (1 to 2 days each) prior to the subsystem level reviews, focusing on:
  – System Analysis Review – reviews the integrated analysis products that apply to multiple subsystems (Aerosciences Environments, Loads and Dynamics, Thermal Analysis, MMOD, Radiation, EMI/EMC, Mass, etc.)
  – Test & Verification Approach / Plan
  – Safety & Mission Assurance

• *Orion will then follow the subsystem level design reviews with a 5 day System level Integrated Vehicle Design Review (IVDR)
  – The IVDR is treated similar to the SSDRs
    » All design DRDs that did not trace to an SSDR will trace to the IVDR
  – Topics include Ground/Flight Operations, Integrated Vehicle Performance and Margins (operational mitigations to hazards), Integrated Abort Performance and Margins, Critical Events as a function of mission, External Interfaces, Logistics, Module Interfaces, Integrated Module and System Test and Verification Summary, Module and System Assembly, Integration, and Production, and Vehicle level hazards and reliability
  – Results of the review(s) satisfy the traditional major milestone entry and success criteria

* Orion's current planning effort captures this approach. However, lessons learned from the flight test program may allow for further streamlining.

Page 23

System Level Design Reviews

• System Level Design Review Purpose / Objectives
  – Provide a focused review of the integrated system
  – Review open liens and forward plans, including a roll up from all lower level reviews
  – *Ensure that all system requirements have been allocated, the requirements are complete, and flow down is adequate to verify system performance
  – *Confirm verification methods have been defined and are consistent with the vehicle design
  – *Ensure the test approach and plans are at the appropriate level of maturity
  – *Demonstrate the system design meets requirements at the system, subsystem, and component levels (within margins) with acceptable risk and within cost and schedule constraints (including interface requirements)
  – *Ensure ICDs are appropriately matured to proceed with fabrication, assembly, integration, and test
  – *Review the system level design products, including engineering drawings, CAD models, MELs, specifications, ICDs, schematics, analysis plots, trade study results, etc.
  – Present system configuration by mission phase
    • Example: Vehicle integration through launch site delivery, on-pad operations, launch, ascent, on-orbit phases, re-entry, landing, and recovery
  – *Demonstrate that the system configuration closes within all specified margins
  – Ensure that technical and programmatic risks are identified and listed in order of criticality, and that risk mitigation plans are drafted
  – Present a full baseline of cost and schedules (often handled in a splinter session)

The System Level Design Review establishes the basis and a viable path for proceeding to the next life-cycle phase, including interim milestones and entrance/success criteria.

* Maturity expectations based on what is needed to proceed to the next life-cycle phase

Page 24

System Level Design Reviews

• Key Lessons Learned
  – Avoid PowerPoint-only Design Reviews – allow time to review the actual design products
  – An auditorium type of review does not facilitate adequate interaction between the reviewers and the design team – control review team size
    • Remote reviewers can feed in-room participants
  – Ensure key review participants are participating on site
    • Reviewers attending by WebEx and telecon were less effective
  – Improve how contractual/cost issues are addressed during reviews
  – Ensure open issues can move from review to review to minimize duplication
    • Ensure actions from lower level reviews are not rewritten in each higher level review
      – Provide feedback to initiators
    • Minimize duplication of pre-declared actions/RIDs, SSDR actions, and errata
    • Minimize RIDs/issues being discussed several times during the review process – when an issue arises, gather the SMEs and make an effective decision

A key streamlining goal is to improve the overall quality of RIDs while reducing the number of RIDs against known, out of scope, or duplicate issues.

Page 25

System Level Design Reviews

Key Lessons Learned – Orion Example

• 170 DRDs Delivered & Reviewed
  – 100,000+ pages of design, process, and plan documentation
  – Requirements, Interface Definition, Drawings, Verification Details, General Plans, etc.

• 1470 RID Initiators (DRD Reviewers)
  – 520 unique initiators wrote a RID (35.4%)
  – 399 unique initiators had an approved RID (76.7% of those who wrote one)

• 3625 Sponsored RIDs
  – 2049 were approved (56.5%)

[Chart: Orion PDR Sponsored RIDs by Discipline]

  Av & SW: 605 | CM: 452 | Other: 272 | LAS: 267 | Con Ops: 204 | T&V: 204
  Vehicle: 142 | SM: 126 | ECLSS: 124 | ICCS: 120 | Mech: 115 | ICDs: 107
  EPS: 103 | Pyrotechnics: 88 | Crew Systems: 88 | Propulsion: 76 | GN&C: 65
  LRS: 57 | Hazard Analysis: 45 | EGSE: 43 | Wiring: 40 | Av: C&DH: 30 | Av: Comm & Track: 22

A key CDR streamlining goal is to improve the overall quality of RIDs while reducing the number of RIDs against known, out of scope, or duplicate issues.

Page 26

What is a Good RID?

• What is a good RID?
  – A "new" design related issue – one that has not previously been identified
  – Accurately describes the design related issue, consistent with the RID criteria
  – Provides the details of where the issue was found
    • For DRD related RIDs (section, page, etc.)
    • For Architecture Element RIDs with no DRD referenced (chart, conversation, etc.)
  – Provides a sufficiently detailed recommended resolution
  – Stays within the scope of PDR expectations
  – For requirements, provides correct "from" language and recommends "to" language

• What is a bad RID?
  – Incomplete issue identification
  – No recommended resolution
  – Recommended resolution of "fix this"
  – Editorial comment (spelling, grammar, etc.)
  – RID that expands on errata or duplicates a Pre-Declared RID
  – Post-PDR level of maturity / unrealistic expectations
  – RID written against a baselined or reference document (Examples: SRD, S/C Spec)
  – RID entered by someone other than the original initiator without documenting the original initiator
  – RID written against a previously made project decision (VICB, CPCB, etc.) without new data or evidence to challenge the decision

Don't tell us something we already know!!!
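These criteria lend themselves to an automated completeness screen before a RID ever reaches the Screening Team. A hypothetical sketch in Python (field names are illustrative; no such tool is described in the presentation):

```python
NON_ACTIONABLE = {"", "fix this"}

def screen_rid(rid: dict) -> list:
    """Return the reasons a RID fails the good-RID criteria above."""
    problems = []
    if not rid.get("issue_description"):
        problems.append("incomplete issue identification")
    if not rid.get("location"):
        problems.append("no section/page or chart reference given")
    if rid.get("recommended_resolution", "").strip().lower() in NON_ACTIONABLE:
        problems.append("missing or non-actionable recommended resolution")
    if rid.get("is_requirement") and not (rid.get("from_text") and rid.get("to_text")):
        problems.append('requirement RID lacks "from"/"to" language')
    return problems

# Example: a RID with a clear issue but a useless resolution is bounced.
print(screen_rid({"issue_description": "Interface load case missing",
                  "location": "DDB section 4.2",
                  "recommended_resolution": "fix this"}))
```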

Page 27

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Program/Project Design Review Approach

• Planning

• Technical & Peer Reviews

• Component Design Reviews

• Subsystem Design Reviews

• System Level Reviews

• Boards & Panels

Page 28

Boards & Panels

• There are a number of boards and panels that can be implemented during a review process, depending on the size and complexity of the review
  – Caucus Teams
    • Made up of the lower level review chair(s) and the key leads of products being reviewed
      – May be implemented for Subsystem design reviews and system level reviews
      – Not recommended for Tech/Peer reviews
    • Focus is to resolve issues (disposition actions) that arise in the review quickly, allowing the team to not repeat the action later in the review and facilitating the design team's ability to rapidly start resolving the issue
  – Management Review Team
    • Made up of the Program Manager and key direct reports
      – May be implemented for Subsystem design reviews and system level reviews
      – Not recommended for Tech/Peer reviews
    • Focus is to resolve program level issues that extend beyond a Subsystem or IPT lead's purview
  – Screening Team
    • Often led by systems engineering with key subject matter expert support
    • Focus is on quickly reviewing and bucketing RIDs into categories (subsystems, orgs, etc.) for later disposition (see the sketch below)
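The bucketing pass itself is mechanically simple; a minimal sketch (hypothetical Python, with illustrative RID numbers and disciplines):

```python
from collections import defaultdict

# Incoming sponsored RIDs, each tagged with a discipline by the Screening Team.
rids = [("RID-0101", "ECLSS"), ("RID-0102", "EPS"),
        ("RID-0103", "ECLSS"), ("RID-0104", "GN&C")]

buckets = defaultdict(list)
for rid_id, discipline in rids:
    buckets[discipline].append(rid_id)          # queue for later disposition

for discipline, queue in sorted(buckets.items()):
    print(f"{discipline}: {queue}")
```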

Page 29

Boards & Panels

• There are a number of boards and panels that can be implemented during a review process, depending on the size and complexity of the review
  – Disposition Teams
    • Often led by IPT Leads with key subject matter expert support
    • Focus is on ensuring all open issues (RIDs/actions) are dispositioned and an effective path to closure is established
  – Pre-Board
    • Serves as a preparatory board for the later review board
    • Often chaired by the Deputy Program/Project Manager, Chief Systems Engineer, or Chief Engineer
    • Focus is on:
      – Resolving and dispositioning elevated or disputed RIDs
      – Elevating unresolved issues to the Board
      – Identifying issues that need to go forward to the Board (due to significant cost, schedule, or technical impact/risk)
      – Providing the initial evaluation of the project's success in meeting the entrance/success criteria

Page 30

Boards & Panels

• There are a number of boards and panels that can be implemented during a review process, depending on the size and complexity of the review
  – Board
    • Chaired by the Program/Project Manager
    • Focus is on:
      – Resolving and dispositioning elevated or disputed RIDs
      – Evaluating the project's success in meeting the PDR success criteria
      – Approving the basis of, and viable path for, proceeding to the next life-cycle phase

Page 31

Boards and Panels

• Key Lessons Learned
  – Caucuses
    • Resist the desire to include everyone – keep the team small and bring in key SMEs as needed
      – Helps ensure the process does not take longer than necessary
    • Allow for additional time post review to disposition any remaining issues/actions
  – Screening and Disposition Teams
    • Pay particular attention in selecting Team leadership and identifying SMEs to support
      – Significant time can be lost tracking down the correct person who understands the issue and can recommend a corrective action
  – Pre-Board and Board
    • Avoid the desire to start the Board the day after the System Level Reviews (be flexible)
      – A few extra days to resolve conflicts between the design team and RID initiators can ensure the appropriate action is taken to resolve issues

Ensure sufficient time to review RIDs/Actions. Allow enough time to process all of the actions and roll up the action closure status from the lower level reviews.

Page 32

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Summary

• Time-line Example

• Affordability

• Conclusion

Page 33

Time-line Example

[Figure: notional 12-month timeline for a Large Complex Program/Project*, showing overlapping bars for: Planning; Work Product (Plans, Drawings, Rqmts, etc.) development and Tech / Peer Reviews; Component Level Reviews; Subsystem Level Reviews; System Level Review(s) (Environments, etc.); System Level Review (Vehicle); Pre-Board; and Board. Caucuses / Management Review Team and Screening & Disposition Teams span the review period, reviewing all open issues to date and ensuring appropriate dispositions are developed. Annotations: Work Product Delivery (NLT 2 weeks prior to applicable SS/Sys Reviews); Work Product Review (outside of reviews), with results (actions / RIDs) flowed into the applicable review for disposition.]

* Notional time-line – duration and number of design review elements depend on many factors (scope, complexity, volume of products to be reviewed, number of participants required, available time to conduct the review, etc.)

Page 34

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Summary

• Time-line Example

• Affordability

• Conclusion

Page 35

Affordability

• When planning and executing your design review, affordability should remain a constant theme woven throughout, ensuring optimal solutions are chosen

• Key areas to focus on:
  – Balance thoroughness and schedule
    • The desire to review every aspect of the design down to the lowest possible level must be balanced against the available time to execute the review, so as not to delay development activities unnecessarily
  – Balance inclusion and efficiency
    • Develop a process that ensures all stakeholders have adequate time to participate and provide input, helping to shape the design, ensure mission success, and ultimately provide a safe crew exploration vehicle
  – Include stakeholders in the planning process
  – Review documents for duplication, and remove or minimize it early in the program / project life cycle
  – Utilize ongoing lower level reviews to provide relevant insight as part of the larger major milestone review process (technical and peer reviews, component level design reviews, etc.)

Page 36

Affordability

• Key areas to focus on:
  – Communicate expectations early and often to minimize lost time due to out of scope comments, actions, etc.
  – Ensure government/contractor joint development of the process and associated expectations, including entry and success criteria
  – Consider work location, amount of travel needed, etc. when determining the plan and choosing review locations
  – Overlay the review schedule with the program/project schedule and work to minimize disruptions to ongoing program/project efforts
  – Track issues/actions throughout the entire process to minimize duplication
    • Ensure adequate dispositions are developed the first time
  – Look to other programs/projects during the planning phase
    • Take advantage of lessons learned

• "Affordability remains the most significant issue facing the Agency and in particular human spaceflight." – HEFT Steering Committee

• "Future NASA space programs must be affordable, sustainable and realistic to survive political and funding dangers…" – Charles Bolden, AIAA Conference, Jan 2011

Page 37

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Summary

• Time-line Example

• Affordability

• Conclusion

Page 38

Keys to Review Success (Operational Excellence)

• Teamwork
  – Create a joint NASA/Contractor Program Integration Team (PIT) to ensure common expectations are communicated across the NASA, Prime Contractor, and support contractor teams
    • Key personnel from each IPT should be included in the team
    • Incorporate planning and C&DM personnel into the team

• Communication
  – Communicate early and often
  – Conduct a series of TIMs and Training Sessions with each of the leads (Subsystem/System) to develop common expectations and identify issues needing resolution prior to the Review
  – Institute multiple status reports and forums targeting all stakeholders (memos, status meetings, etc.)

• Commitment
  – Assign a manager responsible for ensuring review success (NASA & Prime Contractor)
  – Routinely stress the importance of attaining the appropriate level of design maturity with the goal of "Get it right the 1st time"
  – Routinely communicate the review expectations and monitor status weekly

• Accountability
  – Ensure Leads take ownership of their respective lower level review preparation and execution
  – Develop a detailed schedule to manage the reviews, including data package review and delivery
  – Develop detailed metrics for tracking all deliverables by Component, Subsystem, and System

Page 39

Conclusion

• Develop and execute a program/project technical life-cycle review (SRR, SDR, PDR, CDR) process that ensures NASA's next generation products can meet all mission objectives and satisfy requirements with acceptable risk

• Use lessons learned from previous Programs/Projects to develop a multi-tiered approach that ensures the integrated system (components, subsystems, system) is reviewed and that the design approach is at a sufficient level of maturity to proceed to the next life-cycle phase
  – Expanded use of lower level design reviews (design products / documentation Tech/Peer Reviews, Component Reviews, and Subsystem Reviews) helps to ensure a thorough review while better integrating with ongoing program activities

• When planning and executing your design review, keep an eye on affordability

Page 40

PM Challenge 2012

Evolve and Excel

Right Sizing Your Design Review Plan

Focus on the Objective

Questions ???