NASA OSMA SAS’04 [2] of 15
Research Heaven, West Virginia
What is SAS?
• SAS is a zoo? – Don’t feed the funny animals, er, researchers
What else is SAS?
• A spectator blood sport? – Come see the dueling paradigms?
“Process improvement or death!”
“Let none deny us our formal methods!”
SAS is a progress report
• # scored projects:
  – 2003: 34
  – 2004: 48
[Chart: percent of projects (0%–40%) by penetration factor (1–9), 2003 vs. 2004]
Penetration factor scale:
10 There is no ten
9  results actually used by project
8  data passed back to project
7  data used by researcher
6  data passed
5  project agrees to passing data
4  positive response to contact
3  project contacted
2  project targeted
1  no project targeted
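The penetration-factor scale is a simple per-project rubric; a minimal sketch of tallying scored projects into the “percent of projects” histogram the chart shows. The scores below are invented for illustration; the real 2003/2004 data is not in this deck.

```python
# Sketch: turn hypothetical penetration-factor scores into the charted
# percentages. Scores are invented, not the actual SAS'04 project data.
from collections import Counter

scores = [9, 8, 5, 3, 2, 9, 6, 4, 1, 5]   # hypothetical per-project scores
hist = Counter(scores)

for factor in range(1, 10):               # the scale runs 1..9 ("no ten")
    pct = 100.0 * hist.get(factor, 0) / len(scores)
    print(f"factor {factor}: {pct:.0f}% of projects")
```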
The research infusion team
• Defects found during the initial training session!
[Chart: percent of projects (0%–40%) by grade (C, B-, B, B+, A-, A, A+), 2003 vs. 2004]
SAS recognizes good research
A = Exemplar
B = Coming along fine
C = “C” for “cull”?
ARC=1: Transitioning from Software Requirements Models to Design Models (CI03)
GRC=1: Injecting Faults for Software Error Evaluation (CI03)
IVV=4: Automated Testing & Quantitative Evaluation of Real-Time Sys Source Code (CI03)
       Empirical Assurance of Embedded Software using Realistic Simulated Failure Modes (CI04)
       Robust Requirements Tracing Via Internet Tech: Improving an IV&V Technique (CI03)
       Tandem Experiments in Finding Faults During Model Based Development (CI03)
JPL=1: Reducing Software Security Risk Through an Integrated Approach (CI03)
JSC=1: The Use of a Virtual System Simulator & Executable Specifications (CI03)
WVU=2: Lyapunov Stability Analysis and On-Line Monitoring (UI03)
       See more! Learn more! Tell more! (UI03)

Grade = “A”; penetration factor = 9
Candidates for “best project” award
Anything else happen at SAS?
• Where OSMA looks for new answers
  – So many tools… How do they compare? When won’t they work?
[Figure: cost vs. benefit of method 1, method 2, method 3; external validity across project 1, project 2, … project N]
Q: What are our best answers? Examples of proven “best practices”?
• Welcome to “knowledge elicitation by irritation”
  – Here are some example “best practices” (Tim’s views only)
  – Your homework (for Day 3): what would you add?
• E.g. #1: Attach research to commonly used platforms
  – e.g. MDP (Chapman, Galaxy Global; Menzies/Cukic, WVU)
  – e.g. VxWorks (Beims, SAIC)
  – e.g. CASPER (Smith, JPL; Offutt, Interface & Control Systems Inc)
VxWorks@NASA:
• Stardust
• PROBA (ESA)
• Mars Odyssey
• X-38 (space station “lifeboat”)
• RHESSI (Reuven Ramaty High-Energy Solar Spectroscopic Imager)
• Swift (Gamma Ray Observatory; RHESSI heritage)
• GLAST (Gamma-ray Large Area Space Telescope; Swift heritage)
• Mars Pathfinder Rover
• Mars Exploration Rovers
• etc.
CASPER@JPL:
• Autonomous Spacecraft - 3C3
• Autonomous Spacecraft - TS-21
• Rover Sequence Generation
• Distributed Rovers
• CLEaR (Closed Loop Execution and Recovery)
• etc.
http://mpd.ivv.nasa.gov
Other example “best practices”?
• E.g. #2: Process maturity reduces the amount of avoidable rework
  – Says: [Shull02]
  – If so: then demand higher levels of process maturity
• E.g. #3: Peer reviews catch >= 50% of defects
  – Says:
    • [Shull02]
    • [SEI03]: SEI workshop on software risk at NASA, Pittsburgh, 2003
    • [McConnell00]: “The Best Influences on Software Engineering”, Boasson, Billinger, Card, Cochran, Ebert, Glass, Ishida, Mead, Mello, Moitra, Strigel, Wiegers, IEEE Software, Jan 2000
  – If so: then demand peer reviews on software artifacts
    (already in NPR 7150.x: SWE-097, SWE-098)

Fuel for thought: are any of our great tools more cost-effective than “mere” manual peer reviews?
Yet more examples of “best practices”?
• Low-cost defect detection methods
  – E.g. #4: Thrashing
    • Just crank it up and let it rip
    • Berens, GRC; Powell, JPL
  – E.g. #5: V&V of SQA via static defect measures
    • Menzies, WVU
    • Detectors stable across multiple NASA projects
    • Sampling policy to check where else to place your effort
  – E.g. #6: Temporal queries over control/data-flows in C programs
    • Beims, SAIC (tool = CodeSurfer)
    • Why low cost? No need to abstract code to a formal model.
• Other important ideas:
  – E.g. #7: Bidirectional traceability matrix between requirements, test cases, code modules
    • Hayes, UK
  – E.g. #8: Automated test suites
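The bidirectional traceability matrix of E.g. #7 can be sketched as a small data structure queryable in both directions. This is an illustrative toy with invented requirement and test IDs, not the IR-based recovery tool the slide refers to.

```python
# Sketch of a bidirectional requirements/test traceability matrix:
# a set of (requirement, test) links, queryable forward and backward.
class TraceMatrix:
    def __init__(self):
        self.links = set()          # (requirement_id, test_id) pairs

    def link(self, req, test):
        self.links.add((req, test))

    def tests_for(self, req):       # forward trace: requirement -> tests
        return sorted(t for r, t in self.links if r == req)

    def reqs_for(self, test):       # backward trace: test -> requirements
        return sorted(r for r, t in self.links if t == test)

    def untested(self, all_reqs):   # coverage gap: requirements with no test
        covered = {r for r, _ in self.links}
        return sorted(set(all_reqs) - covered)

m = TraceMatrix()
m.link("REQ-1", "TC-10")            # hypothetical IDs
m.link("REQ-1", "TC-11")
m.link("REQ-2", "TC-11")
print(m.tests_for("REQ-1"))         # ['TC-10', 'TC-11']
print(m.reqs_for("TC-11"))          # ['REQ-1', 'REQ-2']
print(m.untested(["REQ-1", "REQ-2", "REQ-3"]))  # ['REQ-3']
```

The same structure supports code modules as a third link type; a set of pairs keeps both query directions symmetric and cheap.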
Hence, SAS ’04
• Days 1, 2:
  – Morning: 1 track; executive summaries (short)
  – Afternoon: 5 parallel tracks; all the gory details (longer briefings)
• Day 3:
  – Morning: 1 track; report back from tracks / discussions
  – Lunch: your table lists its 3 best “best practices”
  – Afternoon: 1 track; build “the” list of best practices; leave early!

A good summary attracts an audience to the afternoon sessions.

Afternoon sessions: it is vital they start and end at the advertised times (so folks can jump between them). So, if you run over time, STOP! If you run under time, WAIT!
Day 3 report-back slides
• Slides = 15 to 20 minutes
  – Good slides generate lively discussion
• After slides, 20 minutes discussion
  – Topic: “In the future, what to do more? What to do less?”
• Important note:
  – The report-back material can cover MORE than just the SAS-presented work
  – The SAS work are examples from a field
  – The report back should try to sketch that field

Report-back slides; any or all of:
• Brief notes on the field (e.g., potential benefits to NASA; the high-water mark in this area)
• Brief notes on the track presentations
• Readiness and guidelines (e.g., technology readiness levels for various tools (see next slide) and methodological guidelines)
• Opportunities, potentials, and road-blocks (e.g., any factors inhibiting this work?)
• Standard traps, costs and benefits, limits to the technologies (e.g., hot topics, open issues)
• Key players (e.g., NASA groups, university research groups, or commercial companies that can support this kind of thing)
• War stories (e.g., projects that have used this stuff successfully)
• Where to find more information (e.g., tutorials, manuals, landmark papers; supply URLs if possible)
“This idea isn’t even false” -- Niels Bohr

So what is true about engineering quality software?

A plea for more empiricism

From NPR 7150.x: “The requirements in this NPR are easily traceable to … proven NASA experience in software engineering.”
More pleas for more empiricism

Delphi statement: “I think, they think.” Empirical statement: “I saw, they saw.”

Claudius Galenus of Pergamum (131-201 AD) wrote his anatomy text in the 2nd century. Galen’s authority dominated medicine all the way to the 16th century: his disciples did not bother to experiment, and studies of physiology and anatomy stopped, since Galen had already written about everything. (Note: Galen mainly dissected animals, not humans.)

Galen’s views prevailed until Andreas Vesalius (1514-1564) had the gall (pun intended) to descend into the dissection pit and perform his own anatomical experiments. And there he found that much of Galen’s writings were actual wrongings.

1300 years!
• Process conclusions
  – Q: When does cost estimation get accurate?
  – A: After about a dozen projects
• Demonstrably adequate
• Repeatable
  – Data sets on the web
• Refutable:
  – A better cost-estimation method would tune faster and decrease variance faster

(Experiment repeated 30 times, shuffling the order of 60 NASA projects)
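The refutability experiment above (30 repeats over a shuffled ordering of 60 projects) can be sketched as a learning-curve check: train on the first k projects, measure how much the fitted model varies across shuffles, and watch the variance shrink as k grows. The one-parameter linear effort model and the synthetic data below are assumptions for illustration, not the original NASA data sets or cost model.

```python
# Sketch of the shuffle-and-retrain stability experiment: 30 repeats,
# shuffled project order, train on the first k, track estimator variance.
# Synthetic data and a toy effort ~ slope * size model stand in for the
# real 60 NASA projects and the real cost-estimation method.
import random
import statistics

random.seed(1)
# Synthetic stand-in for 60 project records: (size_KLOC, actual_effort)
projects = [(s, 3.0 * s + random.gauss(0, 5))
            for s in (random.uniform(5, 100) for _ in range(60))]

def fit_slope(data):
    """Least-squares slope through the origin: effort ~ slope * size."""
    num = sum(s * e for s, e in data)
    den = sum(s * s for s, e in data)
    return num / den

for k in (6, 12, 24, 48):                # training-set sizes
    slopes = []
    for _ in range(30):                  # 30 repeats, shuffled order
        random.shuffle(projects)
        slopes.append(fit_slope(projects[:k]))
    # expect the spread of fitted slopes to shrink as k grows
    print(k, round(statistics.stdev(slopes), 4))
```

A better estimation method, in the slide’s sense, is one whose curve of spread-vs-k drops faster.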
Lyapunov Stability Analysis and On-Line Monitoring
(Cukic, WVU + an ISR collaboration: “A Methodology for V&V of Neural Networks”)
Failure: loss of left stabilizer, 50% missing surface

See more! Learn more! Tell more! (Menzies, WVU)
Learning software cost estimation models; learning an aircraft controller
[Plots: max/min confidence interval in a learner; estimation variance]