Engineering as
Social Experimentation
Engineering vs. Experimentation

Engineering: the objective is to solve problems, which often involves:
- unknowns and uncertain outcomes
- monitoring and learning from past experience
- human subjects/participants who are often unaware or uninformed
- variables that often go unrecognized
- a "natural" experiment

Experimentation: the objective is to find new knowledge or answers, which also involves:
- unknowns; uncertain outcomes; testing a hypothesis
- drawing conclusions or verifying a hypothesis based on experience/evidence
- the "informed consent" of subjects
- an attempt to control all variables
- a controlled experiment
Responsible Experimenters:
• are conscientious: open eyes, ears, and mind
• protect the safety of human subjects, provide a safe exit whenever possible, and respect their right of informed, valid consent (knowledge & voluntariness)
• use imaginative forecasting of possible side effects, and make reasonable efforts to monitor them
• have autonomous, personal involvement in all aspects of a project
• accept accountability for the results
• display technical competence and other attributes of responsible professionals
(Martin & Schinzinger, Introduction to Engineering Ethics, 2nd Ed., 77-94)
“Informed Consent”
• Knowledge
  • All information needed to make a reasonable decision
• Voluntariness
  • No force, fraud, or deception
• Valid Consent
  • Given voluntarily
  • Based on info… in understandable form
  • Competent consenter
• Where subjects are not individually identifiable:
  • Information widely disseminated
  • Consent by proxy
Milgram’s Experiment
• An unwitting subject was ordered to shock another “subject” (an actor)
• People generally defer to authority
• They don’t feel responsible
• They have a tendency to conform
• (26 of 40 administered the highest shock, 450 V; none refused before 300 V!)
Engineers and Safety & Risk
We’ll look at…
• What is Risk?
• Factors Affecting Risk Perception
• What is Safety?
• Risk vs. Benefit
• Uncertainties in Design
• Safe Design
• Misconceptions about Safety
• Lessons
It is an engineer’s paramount duty to protect the safety and well-being of the public
(re: code of ethics)
What does “safety” mean?
• Safety is freedom from damage, injury, or risk
• Risk is the possibility of suffering harm or loss (American Heritage Dictionary)
• A risk is the potential that something unwanted and harmful may occur
Active:
In examining safety, we must acknowledge that the public can be “active consumers” (informed, voluntary); that is, they actively participate in the use of a product, by their own choice, even though they know there are risks (e.g., hang gliding, bungee jumping).
Passive:
But the public can also be “passive consumers” (uninformed, involuntary): they need or choose to use a product but are not directly involved in the decisions that affect their personal risk (e.g., electrical power, water, airline passengers).
Bystander:
The public can also be “bystanders”: not directly involved in a product’s use, but affected by it (e.g., second-hand smoke!).
Safe (Lowrance, modified)
Martin, Mike W. and Roland Schinzinger, "Ethics in Engineering", Third Edition, McGraw-Hill Companies, Inc. 1996.
• A thing is considered “safe” if, were its risks fully known, those risks would be judged acceptable in light of “settled value principles*”.
• * rational, enduring, long-term principles, not affected by transient emotion or temporary conditions
Factors associated with risk (page 64, Fleddermann)
• Voluntary vs. involuntary risk
  • We will take higher risks if the risk is voluntary
  • E.g., going to a bar where smoking is allowed while being aware of the effects of second-hand smoke
• Short-term vs. long-term consequences
  • We accept higher risks if the effect occurs in the distant future
  • E.g., cancer from smoking versus injuries from a motorcycle accident
  • We accept higher risks if the harm is reversible (e.g., a broken leg vs. losing an arm)
• Proximity: we are more sensitive to losses that are “closer to home”
Factors associated with risk (cont.)
• Expected probability
  • We will discount a risk if its probability is extremely low
  • E.g., we avoid jellyfish stings but accept the chance of a shark attack
• Threshold level of risk
  • We will accept low levels of exposure
  • We ignore “nuisance” risks
  • E.g., if the probability or consequences are low enough, we won’t be bothered (crossing the street…)
Effect of Information
(Martin, Mike W. and Roland Schinzinger, "Ethics in Engineering", Third Edition, McGraw-Hill Companies, Inc. 1996, p. 134)
• How information is presented can affect how it is interpreted. For example:
• Consider the outbreak of an unusual disease expected to kill 600 people. Two alternative programs are proposed. Assume the exact scientific estimates of the outcomes are as follows:
One group was asked to select either:
• Program A
  • 200 people will be saved
• Program B
  • P = 1/3 that 600 will be saved
  • P = 2/3 that nobody will be saved
• Result: 72% chose A, 28% chose B.
A second group was asked to select either:
• Program C
  • 400 people will die
• Program D
  • P = 1/3 that nobody will die
  • P = 2/3 that 600 will die
• Results: 22% chose C (equivalent to A), 78% chose D (equivalent to B).
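The framing effect above can be checked arithmetically: a minimal sketch showing that all four programs have the same expected outcome, so only the wording differs.

```python
TOTAL = 600  # people expected to die without intervention

# Expected number of people saved under each program, as stated on the slides.
expected_saved = {
    "A": 200,                      # "200 people will be saved"
    "B": (1/3) * 600 + (2/3) * 0,  # 1/3 chance all saved, 2/3 chance none
    "C": TOTAL - 400,              # "400 people will die" -> 200 saved
    "D": (1/3) * 600 + (2/3) * 0,  # 1/3 chance nobody dies
}

for program, saved in expected_saved.items():
    print(f"Program {program}: expected saved = {saved:.0f}")  # 200 each
```

Despite identical expected values, the "saved" framing drew people to the sure option (A) while the "die" framing drew them to the gamble (D).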
Bottom line…
• Decisions are always based on our perception of the risk vs. the benefit, or the potential loss vs. the potential gain.
• …and that perception may be distorted by biases, inaccurate information, emotion, AND presentation.
• For engineers… we have to figure out what we need to consider to fulfill our role.
• That may evolve as society changes.
Back to Safety vs. Risk
• The total production cost of a product depends on both product costs and safety costs.
• As accepted risk increases, the cost to produce decreases, but the safety costs increase (litigation, insurance, etc.).
[Figure: product life-cycle cost vs. risk (least to most). Primary (production) costs fall as accepted risk rises; secondary (safety) costs rise; total life-cycle cost is lowest at an intermediate level of risk.]
Martin, Mike W. and Roland Schinzinger, "Ethics in Engineering", Third Edition, McGraw-Hill Companies, Inc. 1996, p. 141.
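The trade-off between primary and secondary costs can be sketched numerically. The cost curves below are invented purely for illustration; the point is only that a falling primary cost plus a rising secondary cost yields a minimum total cost at an intermediate risk level.

```python
def primary_cost(risk: float) -> float:
    """Production cost: cheaper to build when more risk is accepted (illustrative curve)."""
    return 100.0 / (risk + 0.5)

def secondary_cost(risk: float) -> float:
    """Safety costs (litigation, insurance, recalls): grow with risk (illustrative curve)."""
    return 40.0 * risk

# Scan risk levels and find where total life-cycle cost bottoms out.
risks = [i / 100 for i in range(1, 301)]
best = min(risks, key=lambda r: primary_cost(r) + secondary_cost(r))
print(f"total cost is lowest near risk = {best:.2f}")
```

With these made-up curves the minimum falls at neither extreme, which is the figure's point: neither maximal risk-taking nor the pursuit of absolute safety minimizes total cost.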
Pinto Case Assumptions… (Harris, Pritchard, Rabins)
• Liabilities:
  • 180 burn deaths, 180 burn injuries, 2,100 burned vehicles
  • $200K per death, $67K per injury, $700 per vehicle
  • Total: ~$49.5M
• Costs:
  • $11 per car, $11 per truck; 11M cars, 1.5M trucks
  • Total: ~$137M
• There are benefits and risks on both sides here…
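The arithmetic behind those two totals can be reproduced directly from the figures given above (a sketch of the calculation, not a judgment on it):

```python
# Projected liabilities: deaths + injuries + burned vehicles, at the stated unit values.
liabilities = 180 * 200_000 + 180 * 67_000 + 2_100 * 700

# Cost of the $11-per-unit fix across 11M cars and 1.5M trucks.
costs = (11_000_000 + 1_500_000) * 11

print(f"Projected liabilities: ${liabilities / 1e6:.2f}M")  # ~$49.5M
print(f"Cost of the fix:       ${costs / 1e6:.1f}M")        # ~$137.5M, rounded to $137M above
```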
Risk-Benefit Analysis
• Usually applied to large projects
• Risks and benefits are assigned a dollar value
• Problems (and questions):
  • Comparing apples and oranges
  • Time-shifting of risks/benefits
  • Who benefits, and who takes the risks?
Some terms:
• Probable gain
  • Probability of success × value
  • E.g., a 50% chance of success on a $1M project = $0.5M
• Probable loss
  • Probability of failure × value
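The two definitions above reduce to one-line expected-value calculations; a minimal sketch:

```python
def probable_gain(p_success: float, value: float) -> float:
    """Expected gain: probability of success times the value at stake."""
    return p_success * value

def probable_loss(p_failure: float, value: float) -> float:
    """Expected loss: probability of failure times the value at stake."""
    return p_failure * value

# The slide's example: a 50% chance of success on a $1M project.
print(probable_gain(0.5, 1_000_000))  # 500000.0
```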
Uncertainties in Design…
• Absolute safety is neither practically attainable nor affordable.
• Expected performance vs. expected duty
• Safety margin
  • The “gap” between capability and duty
• Safety factor
  • The ratio of average capability to average duty
[Figure: distributions of duty (A) and capability (B) plotted against stress; the safety margin is the gap between the two distributions, and the safety factor = B/A.]
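Those two definitions can be sketched in code. The duty and capability figures are invented for illustration; only the formulas come from the slide.

```python
def safety_margin(capability: float, duty: float) -> float:
    """The 'gap' between what the design can withstand and what it must bear."""
    return capability - duty

def safety_factor(capability: float, duty: float) -> float:
    """Ratio of average capability (B) to average duty (A)."""
    return capability / duty

duty_a = 400.0        # expected load/stress (A); illustrative value
capability_b = 600.0  # expected strength (B); illustrative value

print(safety_margin(capability_b, duty_a))  # 200.0
print(safety_factor(capability_b, duty_a))  # 1.5
```

Because both capability and duty are distributions rather than fixed numbers, a safety factor above 1 on the averages does not by itself rule out failure in the tails.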
Safe Designs
• The design must comply with all applicable laws (these are readily available)
• The design must meet “acceptable engineering practice” (the term can be vague, but continuous upgrading of skills through short courses and extensive literature searches helps here)
Safe Designs (cont.)
• Safety considerations must be included in the design from the start
• Alternate designs that are potentially safer must be considered
  • Compare what you have to other approaches that are deemed “safe”
• Foresee potential misuses of the product and design to avoid them
  • Courts are sympathetic to the stupid user
  • Warning labels ARE NOT sufficient
Safe Designs (cont.)
• Products must be extensively tested
• Failure analysis techniques:
  • Checklists
  • Hazard and Operability studies (HAZOP)
  • Failure Modes, Effects, and Criticality Analysis (FMECA)
  • Fault tree or event tree analysis
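As a taste of the last technique, here is a toy fault-tree calculation. The component, structure, and probabilities are invented for illustration, and the events are assumed independent:

```python
def or_gate(*probs: float) -> float:
    """Top event occurs if ANY input event occurs (complement of 'none occur')."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(*probs: float) -> float:
    """Top event occurs only if ALL input events occur."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical tree: the pump fails if the motor fails OR both redundant seals fail.
p_motor = 0.01
p_seal = 0.05
p_pump_failure = or_gate(p_motor, and_gate(p_seal, p_seal))
print(p_pump_failure)  # roughly 0.0125
```

Working the tree from basic events up lets a designer see which branch dominates the top-event probability, here the motor rather than the redundant seals.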
Some misconceptions about risk and safety
• Assumption: Operator error and negligence are the principal causes of all accidents.
• Reality: Accidents are caused by dangerous conditions that can be corrected by design. E.g., automatic couplers for rail cars greatly reduced accidents.
• Assumption: Making a product safe invariably increases costs.
• Reality: Initial costs need not be higher if safety is built into a product from the beginning, and life-cycle costs are lower. Design corrections made later are very costly.
• Assumption: We learn about risks only after a product has been introduced.
• Reality: Using imagination, forecasting, looking at similar products and experiences, simulation, etc. can prevent many risks (and costs) before introduction.
• Assumption: Warnings about hazards are adequate; insurance coverage is cheaper than planning for safety.
• Reality: Depending on how well the warnings are displayed, this could cost big bucks in litigation. Warnings indicate that a hazard may exist; they do not provide protection!!
Liability
• You are liable for everything you do and design; simple fact!
• You cannot rely on “codes and standards” alone. This is called minimal compliance; it does not guarantee a safe product, nor does it provide a valid excuse if a product fails or someone is hurt.
• Engineers are often compromised by their employer’s timelines and do not spend the time necessary to examine safety in depth.
• You are not protected from liability just because you are part of a large company!!!
Lessons from Three Mile Island…
• There is both wisdom and fallacy in public perceptions
• Safety should be an integral part of design; anticipate possible failures
• Share information when it comes to safety
• People are overly optimistic about things that haven’t hurt them… yet
Lessons for Engineers:
• Perceptions change slowly (like OURS!)
• Watch for ‘filters’; don’t be blind to new information
• Expert opinion has limited value; don’t be so sure…
• Look hard for the wisdom…; don’t discount public opinion too quickly!