
Risk and Regulation Summer 2010


Issue 19 of the CARR Risk and Regulation magazine


Contents


Risk&Regulation: CARR Review No 19, Summer 2010

Editor: John Downer  Assistant Editor: Anna Phillips

Enquiries: Centre Administrator, ESRC Centre for Analysis of Risk and Regulation, The London School of Economics and Political Science, Houghton Street, London WC2A 2AE, United Kingdom

Tel: +44 (0)20 7955 6577  Fax: +44 (0)20 7955 6578  Website: www.lse.ac.uk/collections/CARR/  Email: [email protected]

Published by: ESRC Centre for Analysis of Risk and Regulation, The London School of Economics and Political Science, Houghton Street, London WC2A 2AE

Copyright in editorial matter and this collection as a whole: London School of Economics © 2010. Copyright in individual articles: p 6 © Bridget Hutter 2010; p 8 © Tom Horlick-Jones 2010; p 9 © Thiago Neto 2010; p 10 © Julien Etienne 2010; p 12 © Lee Clarke/Harvey Molotch 2010; p 14 © Sally Lloyd-Bostock 2010.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of the publisher, nor be issued to the public or circulated in any form of binding or cover other than that in which it is published.

The School seeks to ensure that people are treated equitably, regardless of age, disability, race, nationality, ethnic or national origin, gender, religion, sexual orientation or personal circumstances.

The information in this magazine can be made available in alternative formats. Please contact the Web and Publications Administrator. Tel: +44 (0)20 7849 4635  Email: [email protected]

Design and Art Direction: LSE Design Unit  Printed by: AquatintBSC  Cover Image: Forest ablaze at night © Petejw, Dreamstime.com  Photography: Reuters Images (p6), PA Photos (p3, p7, p17), Dreamstime.com (p11 – composite), iStockphoto (p4, p9, p11 – composite, p15, p16), sxc.hu (p13)

ISSN 1473-6004  Online ISSN: 1473-6012

3 CARReditorial: In catastrophe’s shadow. CARR Director Bridget Hutter discusses risk and disaster.

4 Meet the Regulator: How do the Federal Aviation Administration regulate the US airways? We talk to Kathy Abbott, Chief Scientific and Technical Advisor for the FAA, about the agency’s responsibilities.

5 CARRnews: The latest news from CARR.

6 CARRresearch: Risk regulation and the anticipation of natural disasters. Bridget Hutter explores the complex relationship between risk regulation and societal preparedness for natural catastrophes.

8 GuestResearch: Public debates about technology: averting disaster, transforming society or simply making the best of things? Tom Horlick-Jones reflects on the enthusiasm for public engagement in issues relating to potentially risky technologies.

9 CARRstudent: Good intentions: are voluntary arrangements the way to regulate social and environmental risk? Thiago Neto examines the usefulness of voluntary agreements for managing environmental risks.

10 CARRresearch: Harnessing motives for disaster prevention. Julien Etienne discusses the role of human factors in precipitating financial and other organizational disasters.

12 GuestResearch: Scientists as disaster warning systems. Lee Clarke and Harvey Molotch discuss the difficult role of experts in disaster forecasting systems.

14 CARRresearch: Gathering data for medical regulation. Sally Lloyd-Bostock explores the General Medical Council’s risk data.

16 CARRevents: Risk, technology and disaster management. Chris Lawless recounts a recent ESRC event hosted by CARR, which focused on post-disaster management.

18 CARR in print

19 CARR people

Risk&Regulation is also published on CARR’s website. www.lse.ac.uk/collections/caRR/publications/magazine.htm

www.lse.ac.uk/collections/caRR


CARReditorial: In catastrophe’s shadow


A theme running through this edition of Risk&Regulation is that of ‘disasters’. Social theorists distinguish between risks and disasters, stressing that risk is a concern about what might go wrong: dangers as yet unrealized which we work to prevent and avoid. Disasters are the realization of those risks. The threat of disasters is always the shadow under which risk regulation takes place. The hope is that it remains in the shadows, but sadly disasters are part of our regular global news reporting. As this edition of Risk&Regulation was being prepared, earthquakes in Haiti and Chile brought home the devastation that may be caused.

A number of articles in this edition focus on what can be done to avert disasters or to try to mitigate their full effects should they occur. Our ‘Meet the Regulator’ item underlines the very real efforts which are put into anticipating risks in the aviation industry, a topic which has attracted a great deal of CARR interest in its analyses of ‘close calls’ and ‘near misses’.

Etienne focuses on the complexity of endeavours to prevent man-made disasters. He argues that we need to focus on human agency and its role in disasters; not from perspectives that assume rationality or pure self-interest, but rather through approaches which harness many motives to cultivate individuals’ capacity to act appropriately.

Very often there are unrealistic expectations of what can be done to avert disasters, and this can be a real problem for policy-makers. For example, there may be too great a faith in the ability of early warning systems to alert us to impending earthquakes, hurricanes and flooding. But very often the information we have is partial, imprecise and subject to error, and the technology we dream of is underdeveloped. Where there is a relatively high level of predictability, it may be possible to regulate to mitigate the full effects of a disaster. The very real difference this can make is best illustrated by the dramatic improvements that properly developed and enforced building codes can make in earthquake zones. This is one of the reasons why the 8.8 magnitude earthquake in Chile in February 2010 resulted in considerably fewer deaths (at present estimated at approximately 1,000) than the estimated 220,000 killed by the 7.0 magnitude earthquake in Haiti.

The role of science and technology in disasters is a recurring theme in risk regulation debates. Horlick-Jones focuses on public debates about technology, most especially the fear of long term damage associated with some technological developments. He argues for conversations involving citizens, with the objective of achieving better understandings of technical innovations. Clarke and Molotch broaden the discussion of disaster warning systems beyond the technological with a reminder that science warns us of ‘slow burn’ disasters such as health warnings about smoking and AIDS. They focus on how scientific warnings of all types need to be better co-ordinated and acted upon without being alarmist.

The role of science and technology in the post-disaster period was the theme of a lively public debate during the ESRC’s Social Science Week. The subject was the role of forensic science in victim identification in natural and man-made disaster scenarios.


The debate emphasized the difficult working circumstances of such activities and how important identification is for survivors. Another dimension of the work on a global scale is the need for co-ordination of resources and standards across teams from different countries, something which can be very difficult to work out in the heat of a disaster.

In the longer term, post-disaster scenarios often lead to organizational and regulatory changes. Lloyd-Bostock discusses such changes in the wake of the Shipman case, namely the endeavour to seek out doctors who are risks to their patients. Assessing risk demands data and, as Lloyd-Bostock explains, data collected for one purpose are not always helpful for another. This points to a common problem for regulators urged to adopt risk-based regulation, namely whether they have adequate data with which to assess the risks they are regulating. This case also raises very difficult questions about how much we should be chasing the last disaster rather than anticipating the next one. Learning is clearly important, but part of learning is appreciating how to differentiate the actionable points from those which are not worth pursuing. This is, of course, a recurrent dilemma of risk regulation.

Bridget Hutter, CARR Director

© Brynjar Gauti/PA Photos


Meet the Regulator: How do the Federal Aviation Administration regulate the US airways?


We talk to Kathy Abbott about the agency’s responsibilities.

What do the FAA do?
The most important job of the Federal Aviation Administration (FAA) is to protect the safety of the travelling public. The Federal Aviation Act of 1958 gave rise to the Federal Aviation Agency. We adopted our present name in 1967, when we became a part of the Department of Transportation. The legal origin of the FAA’s regulatory activities is founded in the US Constitution and is generally considered to have begun with the Air Commerce Act, enacted in 1926. This act commissioned the Secretary of the Department of Commerce to be responsible for fostering air commerce, issuing and enforcing air traffic rules, certifying pilots and aircraft, and operating and maintaining air navigation aids.

Basically, the FAA’s fundamental responsibilities are to: (1) Regulate civil aviation to promote safety; (2) Encourage and develop civil aeronautics, including new aviation technology; (3) Develop and operate a system of air traffic control and navigation for both civil and military aircraft; (4) Research and develop the National Airspace System and civil aeronautics; (5) Develop and carry out programs to control aircraft noise and other environmental effects of civil aviation; and (6) Regulate US commercial space transportation.

How do the FAA achieve their goals?
We pursue these ends through many means. Among many other duties, we issue and enforce regulations and minimum standards covering the manufacture, operation, and maintenance of civil aircraft. We also certify the airmen and airports that serve air carriers, and operate the air traffic control infrastructure: developing air traffic rules and assigning the use of airspace.

In part, these functions are accomplished via some type of ‘certification.’ Here, certification means the approval and authorization for aircraft; personnel (e.g. pilots); operations; procedures; facilities; and equipment.

The FAA encourages industry to go beyond the minimum required. One way we do this is by working together with industry to foster voluntary safety enhancements. For example, the Commercial Aviation Safety Team (CAST) was founded in 1997 as a joint government-industry team with a goal to reduce the commercial aviation fatality rate in the US by 80 percent by 2007. By 2007, CAST reported that, through the implementation of the most promising safety enhancements, the fatality rate of commercial air travel in the United States had been reduced by 83 percent. Now they are working to maintain a continuous reduction in fatality risk in US and international commercial aviation beyond 2007.

What has the FAA achieved so far?
Civil aviation, in the US and beyond, has an amazingly good safety record. So much so that other domains, such as medicine, look to us for lessons on how to improve their own safety. But we cannot rest on our laurels, and we understand that we must continue to improve. To this end, we are constantly addressing the safety issues highlighted by recent commercial airline accidents. Currently, for instance, pilot fatigue is getting a great deal of attention, and the FAA is in the process of updating the regulations for flight and duty time. In addition, pilot training and qualification requirements are being updated to reflect new aircraft, new operations, and safety experience. Another current priority for the FAA is the modernization of our air traffic system to the next generation of airspace operations, called NextGen. It is a major undertaking, but one which is necessary.

Another consideration for the future airspace is the introduction of unmanned aircraft, which were primarily developed and flown by the military services. This area is rapidly changing as these types of aircraft become more widespread, so we are concerned with safely integrating them into the civil airspace system while ensuring that they do not harm other airspace users. One major challenge in this area arises from the fact that our regulations are based on the assumption that pilots will be onboard the aircraft to mitigate various risks (for instance, to ‘see and avoid’ other aircraft and obstacles). To address this, we are developing new regulations for these aircraft and their operations. We must also find methods to support the integration of these aircraft into the airspace without causing delays, reducing capacity, or placing current airspace users at increased risk.

Our role as a regulator is a significant responsibility. The actions of the FAA have far-reaching effects on aviation, of which we must be mindful. We must foster a knowledgeable workforce, work together with the aviation community, and look at the potential effects and consequences of regulatory changes (or lack of change!). We do this in several ways. For instance, we have encouraged the development of voluntary reporting programs for safety data, and we use this data to identify safety issues and trends.

What is the most common myth about your organization?
Many people don’t realize that regulation can be viewed as a form of risk management. Regulators must gauge the varying appetites for risk of the society they serve. In the US, for example, private aviation is considered to be more of a private concern than a public concern. Commercial aviation is much more a public concern, and is expected to have the highest level of safety. Therefore, legislation requires that government oversight and standards are more stringent for commercial aviation than for private aviation.

Kathy Abbott is Chief Scientific and Technical Advisor for the FAA.



CARRnews


CARR VISITORS

Rune Premfors visited CARR in November. A Professor of Political Science and Research Director at SCORE, Rune has a longstanding interest in the tensions between political democracy and public administration. He is currently researching the Swedish Government Offices as part of a major national research programme. He is also co-writing a book analyzing the role of the state in creating and supervising markets in Sweden.

Renita Thedvall visited CARR in March. Renita has a PhD in Social Anthropology and is a researcher at SCORE. Her research is focused on policy processes and organization studies, and she has a particular interest in the construction of markets and in social and working-life issues. She is currently involved in a project on fair trade and the organizing of fair markets.

Karen Kastenhofer visited CARR between April and July. Karen works for the Institute of Technology Assessment at the Austrian Academy of Sciences. From 1999 to 2001, she took part in the research project ‘Science as Culture’ at the Institute of Interdisciplinary Research and Education (IFF) in Vienna, and finished her PhD thesis on the enculturation of biology students in 2005. As a post-doctoral research fellow, she contributed to the research project ‘Cultures of Non-Knowledge’ at the University of Augsburg.

STAFF NEWS

Matthias Benzer has joined CARR as a Peacock Fellow. His research interests are sociological theory, sociological methodology, socio-scientific approaches to suffering and death, quality and quantity of life debates, and risk in healthcare regulation. He is working on a project that aims for a sociological analysis of the approaches of UK health regulatory organizations to defining, assessing, and managing risks to quality of life.

Katherine Taylor has joined CARR as Office Administrator.

Have you moved or changed jobs recently? Please keep us informed of any changes in your contact details so you can continue receiving Risk&Regulation. Email: [email protected] or tel: +44 (0)20 7955 6577

ACADEMICS ABROAD

Martin Lodge presented a joint paper entitled ‘Public Service Bargains Under Pressure’ at a workshop on ‘Public Governance After the Crisis’ at Suffolk University Law School in Boston, on 13 November.

Jeanette Hofmann gave a presentation entitled ‘Before the Sky Falls Down: a “Constitutional Dialogue” over the Depletion of Internet Addresses’ at the Fourth Annual Symposium of GigaNet in Sharm El Sheikh, Egypt, on 14 November, and co-moderated a plenary meeting on ‘Managing Critical Internet Resources’ at the 4th Internet Governance Forum, convened by the United Nations Secretary-General, in Sharm El Sheikh, Egypt, on 16 November.

Sharon Gilad gave a series of talks in Israel on 28 December. These were entitled: ‘Looking from the inside out – organizational identity and consumer protection regulation in the UK’ (to the Federman School of Public Policy, the Hebrew University, Israel); ‘The UK Financial Services Authority’s Treating Customers Fairly Initiative’ (to the Bank of Israel); and ‘The UK Financial Services Authority’s Approach to Regulating Retail Finance’ (to the Israeli Ministry of Finance).

Martin Lodge spoke about ‘Managerialism and Prisons’ at a conference on prison reform at the University of Grenoble on 21 January.

Julien Etienne gave a talk entitled ‘The participation of third parties in the regulation of industrial hazard: what are the stakes? What are the effects?’ at a workshop called ‘The territorial government of the environment: ongoing transformations and status report’, hosted by CURAPP-CNRS, Université de Picardie Jules Verne, in Amiens, on 29 January.

Martin Lodge gave a series of presentations on civil service and public sector reform in the UK to the National Personnel Authority in Japan, from 22-26 March.

Jeanette Hofmann gave a talk on ‘Wikipedia between emancipation and self-regulation’ at a conference entitled ‘Critical Point of View’, organized by the Institute of Interactive Media at Amsterdam University on 26-27 March.

David Demortain presented a paper entitled ‘Regulatory Toxicology in Controversy: the Contentious Application of the 90-Day Rat Feeding Study to GM Safety Assessment’ at the IRIST international conference in Strasbourg on 29-31 March.

CARR IMPACT

Sally Lloyd-Bostock submitted an End of Award Report to the ESRC on her project ‘An Analysis of Data on Registration and Fitness to Practice Cases Held by the General Medical Council in the Context of Risk-Based Approaches to Medical Regulation’ in November.

Peter Miller co-organized a workshop on ‘Accounting, Management and Government’ at the University of Edinburgh, from 12-13 November.

Bridget Hutter took part in the Summit on the Global Agenda 2009 in Dubai on 20-22 November, after being invited to become a member of the World Economic Forum’s Global Agenda Council on Catastrophic Risks 2009.

Mike Power was part of a panel discussion on ‘Managing Risk and Behaviour in Financial Markets’, organized by the European Forum for Philosophy, at LSE on 25 November.

Frank Vibert was part of a session entitled ‘Can We Regulate our Way to Better Public Services in the 2010s?’ at a conference, ‘Public Services in the 2010s: Prosperity, Austerity and Recovery’, held in London on 11 December.

Bridget Hutter was a panellist at the Harvard Kennedy School Global Series Regional Meeting on the subject of ‘Managing Widespread Global Risk’, in London on 25 January.

Mike Power gave a plenary address on ‘Risk Governance’ to the Corporate Governance Network of Institutional Investors, in London on 23 February.

Mike Power presented a paper entitled ‘Building the Audit Reporting Pyramid: a Discussion Document’ to the Audit Practices Board of the Financial Reporting Council on 25 February.

Bridget Hutter chaired an expert review of the Environment Agency’s Better Regulation Science Programme in Bristol on 26 February.

Bridget Hutter was conferred the Award of Academician of the Academy of Social Sciences in March.

Mike Power took part in a round-table discussion entitled ‘The future role of bank auditors – a helping hand for the regulator?’ at the Centre for the Study of Financial Innovation, in London on 1 March.

Jeanette Hofmann attended a discussion in Geneva about the fifth annual Internet Governance Forum meeting as part of her membership of the Multistakeholder Advisory Group (MAG). The advisory group was established by the Secretary-General of the UN and comprises 50 members from governments, the private sector and civil society, including representatives from the academic and technical communities.



© Faisal Mahmood/Reuters

CARRresearch

Risk regulation and the anticipation of natural disasters

Bridget Hutter explores the complex relationship between risk regulation and societal preparedness for natural catastrophes.

Modern social theorists regard anticipation as central to the concept of risk, notably the anticipation of danger and catastrophe. Private and public sector organizations devote a great deal of effort to anticipating risks and organizing risk regulation for their mitigation. Much attention has been given to the anticipation and management of manufactured risks, especially those associated with scientific and technological developments. But increasingly attention is turning to the anticipation and management of natural risks. The expectation is that their occurrence may be anticipated, and how to react to them determined, through emergency planning and mitigation strategies.

The risk of natural disasters is increasing across the world, partly as a result of climate change, which is thought to be increasing the incidence and changing the patterns of natural disasters, and so exposing larger populations to the possibility of their effects. And the impact of these disasters is exacerbated by environmental degradation and poor urban planning, especially the co-location of large conurbations and high technology sites. Hurricane Katrina, which caused so much damage in New Orleans, Louisiana, in 2005, is a case in point. China is another area very prone to natural disasters, and has particular problems as it rapidly urbanizes and witnesses increasing urban concentrations of people and industrial sites in close proximity to each other. Here the ‘lines’ between the natural and the social are blurred, as human activities contribute to climate change and further exacerbate the consequences of these disasters through concentrations of resources and power.

The costs of natural disasters are multiple and devastating. 280,000 people were killed in the 2004 tsunami in South East Asia; 1,300 were killed in 2005 through the devastation caused by Hurricane Katrina in the US. The financial costs are also substantial and growing. Developing mitigation strategies for risk reduction and risk reaction is therefore a pressing problem of global proportions.

Regulation and data
There are various generic forms of mitigation which can prove useful in reducing the damage which may be caused by natural disasters, and these include risk regulation regimes. Risk regulation is inherently about the anticipation of risk and preventing its realization: it is forward-looking, trying to be preventive rather than reactive in its outlook. In relation to natural disasters, risk regulation is primarily useful in pre-event mitigation. This is an area which has arguably received less attention than ex-post recovery actions such as disaster relief. Indeed, the Stanford economist Roger Noll has argued that in disaster situations policy makers have a tendency to under-invest in prevention and over-invest in response. Of course, the balance very much depends on particular circumstances, and an important consideration is the significance of balancing anticipation and resilience – putting in place firmer emergency plans and capacities where one can realistically anticipate and act constructively, and focusing on resilience in less certain areas. Mitigation strategies can effect significant risk reduction. For example, avoiding land use in known hazardous areas and establishing and enforcing building codes to prevent the collapse of poorly designed and constructed buildings can help us avoid the considerable costs which may be associated with disaster recovery, and in turn save lives and prevent injury. Understanding these benefits can also encourage longer term thinking and investment in mitigating disasters which may otherwise be perceived as uncertain and even improbable.

Risk regulation policy decisions depend on information about the likelihood of a natural disaster occurring, and also on the extent of probable damage should a disaster occur. The knowledge base upon which policy is formulated is crucial, a point underlined in a recent UN report which called for countries to consider in much more local detail how climate change will affect their towns and cities, and also by the Lancet medical journal in the UK, which urged public health services to give much more pressing and detailed consideration to the ways that climate change might affect health, and to develop policies to cope with this. But as risk regulation research repeatedly cautions, we also need to be aware of the limitations of the data we have. The past is not always a good predictor of the future; we need to be realistic about the reliability of the available information. This is particularly the case with climate change, as it may be altering the incidence and patterns of natural disasters. Moreover, the accuracy of information varies according to the type of disaster – while there may be some confidence with respect to volcanoes and flooding, it may be very difficult to locate with much certainty the precise location of earthquakes, hurricanes and wild fires. This is well illustrated by Hurricane Gustav in 2008. The trajectory and force of the hurricane proved very difficult to predict, and nearly 2 million people fled the Louisiana coast in anticipation of a category 3-4 hurricane, but by the time it reached Louisiana it had been downgraded to category 2. This does, of course, raise the question of how seriously the next hurricane warnings in Louisiana will be taken: there is a danger that they might be ignored. Such occurrences may shake public and policy-making confidence in science and the scientific community. They can also waste valuable resources.

Just as the available data may not always be as comprehensive and certain as we would like, so too are early warning systems often unreliable. It is important to have up-to-date information about the availability and accuracy of these systems and the amount of warning they give before deciding whether or not early warning systems are indeed worth mandating, should this be a financial or political option. And this is a transnational problem. Not only is climate change affecting the globe and global weather patterns: there are great inequalities in knowledge, resources and vulnerability across the world, and a responsibility to collate and share knowledge and resources where possible.

Risk strategies
There are various risk strategies in pre-event disaster risk management: risk avoidance, risk minimization and risk sharing. Risk avoidance strategies focus primarily on land use policies, and most particularly the delineation of areas where settlements are regulated. These range from hazard zones, where no urban development or planning is permitted, through to development laws. The best known risk reduction strategies are building codes, especially in earthquake and flooding zones where damage is primarily to buildings and infrastructure. Where building codes do not exist or are not enforced, the loss of life may be considerable. Part of pre-event planning includes precautionary decision-making about how to deal with a natural disaster when it happens and in its aftermath. This anticipatory planning may cover short-term contingency planning and longer term repair planning, and may involve the state and the market sharing the burden at local and national levels. Risk sharing includes pre-disaster risk transfer through insurance or other hedging instruments such as catastrophe bonds or weather derivatives, or collective loss sharing post-disaster, which may take place nationally or globally through charitable donations. Globally, this entails richer countries assuming some responsibility for losses in less developed countries.

In each case, careful consideration needs to be given to whether information regarding the location of hazards is sufficiently reliable to inform land-use planning; how feasible it is to implement building codes with respect to new builds and also retroactively; and how one might encourage public-private partnerships to share the burdens of risk events and, where possible, encourage transnational co-operation. This would mean that impoverished areas of the world are helped to cope with disaster planning and recovery by those with more resources.

There are limitations to risk avoidance, risk minimization and risk sharing strategies. Many of these are encountered in all societies, but they are undoubtedly exacerbated in poorer developing countries where the prerequisites of stable government and good governance systems may not be met. Indeed, these societies are especially, but not exclusively, vulnerable to concerns about the solvency and sustainability of such schemes, given the increasing incidence and severity of natural disasters caused by climate change.

There are also a number of general factors which can influence the efficacy of risk regulation strategies. An important cultural influence is how anticipatory or fatalistic a culture is. Local communities may view natural hazards as ‘facts of life’ which cannot be avoided, and this may be a view shared by planners who do not believe risk minimization efforts will be successful. Central-local government relations are crucial to the success of risk regulation, as so often these measures emanate from central government but the implementation of risk regulation measures – such as planning laws, building controls and hazard zones – is local. This can easily lead to difficulties and inconsistencies because of intergovernmental tensions and the existence of obstacles to local hazard risk regulation. Corruption is another major obstacle to the effective implementation of regulation, and also to the distribution of large aid packages. It is cited as a major factor in the very high loss of life in the Turkish earthquake of 1999, where it meant that planning and construction codes were not enforced.

Using risk regulation in the mitigation of natural and other disasters can contribute significantly to reducing the risks and costs associated with these disasters. But these measures need to be used strategically. Working locally to enhance sustainability and resilience is important in all areas vulnerable to extreme events. Where levels of certainty are high, more detailed risk regulation measures and planning are also possible. Where risk avoidance strategies are not possible, for either informational or socio-political reasons, risk mitigation strategies may be optimal. To be successful, these rely on enforcement mechanisms and a readiness to use sanctions. Retrofit measures may also require financial incentives. But these need to be handled carefully so as to avoid the danger of offering false assurance that they mitigate all eventualities. In reality, these measures may still be insufficient to protect against extreme events, so there has to be awareness of this and of alternative strategies. This underlines the need for good governance systems and the prerequisites of stable governance structures.

A regulatory mix

Recovery strategies are therefore advisable in all areas of high risk. It is common in risk regulation discussions to advocate a regulatory mix in which the state harnesses sources of regulation beyond the state so as to empower different participants in the regulatory process. This includes national and local governments, businesses and local communities. In the private sector, insurance companies are especially important in helping to secure and reinforce risk regulation mitigation strategies. It is essential to acknowledge that there are likely to be local variations influencing the effectiveness, or even the possibility, of using some forms of mitigation. This needs to be dealt with by providing regulations which can be tailored to local circumstances. Typically such regulations are principles- rather than rules-based, allowing for flexibility and discretion in implementing them in varying situations.

Bridget Hutter is Professor of Risk Regulation at LSE and Director of CARR.

This article is adapted from Hutter, Bridget M (2009) ‘The role of regulation in mitigating the risks of natural disasters’. In Kunreuther, Howard and Useem, Michael (eds) Learning from Catastrophes: Strategies for Reaction and Response. Wharton School Publishing, Upper Saddle River, USA, pp. 121-138. The book is a product of the World Economic Forum Council on the Mitigation of Natural Disasters 2008.



As I pen this article, a number of government agencies in Europe, North America and elsewhere are gearing themselves up to conduct yet more exercises in public engagement about technology. As readers will have noticed, the use of various forms of extended consultation, participation and deliberative involvement with members of the lay public has grown significantly in recent years as a tool in the development of public policy. The figure of impending disaster tends to loom large in discussions about the technologies in question. Whether it’s about the moral dilemmas of smart drugs, the uncertain science of nanotechnologies, or the acceptability of incorporating human genes into new living forms, engagement with citizens and stakeholders is regarded by many policy makers as a way of dealing with various problems associated with technological innovation. Of course, the enthusiasm doesn’t stop with policy makers. It also includes a host of academics and think-tankers, many advocating more radical, ‘upstream’, versions of engagement, together with quite a sizeable little industry of consultants who now specialize in this line of work.

In Europe, genetically modified, or ‘GM’, crops and food have proved especially controversial. GM is perhaps second only to nuclear technologies in its capacity to mobilize environmental activists, flurries of letters to newspapers, and direct action. In Britain, the Food Standards Agency is about to run the third UK government exercise in GM-related public engagement to have been staged in the space of a decade, of which the large-scale GM Nation? debate, held some seven years ago, is perhaps the best known. Unfortunately, the casual internet surfer is unable to consult the material posted to support that earlier debate, as the GM Nation? website has now been closed down. The independent but accredited evaluation of that debate, perhaps the most comprehensive to have been carried out for any process of public engagement, seems to have prompted little interest outside academic circles; whether from government, industry or pressure groups. It is not at all clear how the findings of the debate, or the lessons of having staged the process, have found their way into policy-making. One is tempted to ask: what was the GM Nation? debate all about?

At the time my colleagues and I were looking very closely at the functioning of the GM Nation? debate, we noted that it was never entirely clear what the government was trying to achieve with this initiative. Was it an exercise in communication, designed to inform, and possibly persuade; or some sort of consultation exercise, perception study or information-gathering exercise; or a genuine debate with pro- and anti- arguments fighting it out; or an opportunity for citizens to learn and come to their own conclusions? In fact there was ambiguity about what was going on, and, significantly, this is still the case.

Consider an intervention by the then science minister Ian Pearson around two years ago. In classic ‘deficit’ style, as critical public understanding of science scholars would put it, Pearson was arguing that if only the public could understand the facts about GM, then they would behave sensibly and support the technology. In making this statement, he alluded to the GM Nation? debate, describing it as being ‘badly handled’; the implication being that it failed in its job to communicate those facts. In response, the environmental pressure group Friends of the Earth pointed out that ‘it seems that the government has forgotten what came out of its own debate (which) overwhelmingly showed the public was not ready for GM’.

Arguably, this exchange amounted to rather crude politicking, as neither minister nor environmentalists appeared particularly interested in the substance of the debate and what it had to say. The minister seems to have been more interested in selling the technology, and the environmentalists in recruiting a version of the debate as a political weapon. It should be noted that the idea of GM Nation? as some sort of national referendum on GM, which is implicit in the latter’s position, was the predominant, and unhelpful, representation offered by the British media. The debate was certainly never conceived as such. The ‘vote’ in question, collected in the form of questionnaires, was based upon a self-selecting sample. Arguably, the only nuanced, deliberative engagement generated by the debate was provided by a small component of the overall process, in which a number of groups of citizens were taken on a learning and discussion exercise.

It seems to me that citizen engagement about technological innovation provides a potentially valuable opportunity to reconcile competing perspectives, and to incorporate such perspectives into decision-making, subject to the constraints and opportunities afforded by the character of the technology in question. Engagement is certainly not a panacea for the various troubles encountered by innovation. These processes are always politically charged to some degree, but for some they are regarded as opportunities for political gain, or as vehicles for social change, perhaps towards a ‘deliberative democracy’. I have a rather more pragmatic notion of engagement in mind, one concerned primarily with finding practical ways of making the best of things; and with achieving in practice the least bad world, rather than always dreaming about the best of all worlds.

In many ways, the word ‘debate’ points in the wrong direction. It is the direction of the sort of sterile exchange between the minister and the environmentalists. The word ‘conversation’ is more appropriate. There is, I suggest, a need to develop an understanding of how to engender certain sorts of conversations that can accomplish helpful and productive forms of engagement. Those conversations need to be grounded in the sensibilities of mundane everyday experience; yet they also need to be informed by scientific and other forms of expertise. As far as I’m concerned, the real research challenge lies there.

Tom Horlick-Jones is a professor of sociology at Cardiff University’s School of Social Sciences. He led the evaluation of the UK government’s GM Nation? public debate. He is the lead author of The GM Debate: Risk, Politics and Public Engagement, which was recently published in paperback by Routledge. This article represents the author’s own personal views.

GUEST RESEARCH

Public debates about technology
Averting disaster, transforming society or simply making the best of things?

Tom Horlick-Jones reflects on the enthusiasm for public engagement in issues relating to potentially risky technologies.

CARR STUDENT

Good intentions
Are voluntary arrangements the way to regulate social and environmental risk?

Thiago Neto examines the usefulness of voluntary agreements for managing environmental risks.

In April 2008, one of the largest Brazilian meat-processing companies was found to be using deforested land in the Amazon for cattle ranching. A little more than a year after that, a Greenpeace report showed that the problems were even bigger. Named ‘Slaughtering the Amazon’, and based on three years of research, the document showed that the practice was not limited to that firm, but was shared by two of the other four largest beef exporters in the country. The report also demonstrated how their goods, which include other bovine products such as leather, were sold to famous international companies in the clothing, car and furniture industries, and to supermarkets, including in the UK. Another character was strongly accused in the story: the main funder of those firms’ activities, a state-owned development bank.

The Brazilian environmental authority issued a fine of approximately £1 million to the first firm, and seized more than 3,000 of its cattle. The other firms in the supply chain had no legal liability for the goods that they were buying, but they came under considerable public pressure to change their procurement practices.

The banking sector, in turn, was approached for a voluntary agreement. The ‘Protocol of Intentions for Social and Environmental Responsibility’, or ‘Green Protocol’, as it is commonly called, was signed between the five banks owned by Brazil’s federal government (either solely or as the main shareholder) and the Ministry of Environment in August 2008. Private banks, by means of the Brazilian Federation of Banks, subscribed to a slightly different version of the Protocol in April 2009. In both cases, the Protocol calls, among other things, for the provision of funding under sustainability directives, and for the inclusion of social and environmental criteria in the risk analysis of clients and in investment appraisal. It also requests that banks advise credit-takers to adopt sustainable practices in production and consumption. Levels of adherence to the Protocol are to be decided by each particular bank, depending on how it puts the directives into operation.

But this agreement was not entirely novel. In 1995, the five federal state-owned banks subscribed to a previous version of the Green Protocol. In line with the premises of the 1987 Brundtland Report and the 1992 UN Conference on Environment and Development, this first Protocol already recognized the role of banks in inducing economic development that would not imperil the needs of future generations.

So what makes the situation different now? Are there reasons to believe that a voluntary arrangement for dealing with social and environmental risk will work this time?

One reason is the different economic situation. The second half of the 1990s was a transformational time for Brazilian banks. After a long period of high inflation, prices were once again stabilized, which strongly affected the way the financial sector could reap profits. Many banks were shut down, others were taken over, and competition from foreign banks increased. The country also suffered the consequences of various international financial crises, and a maxi-devaluation of its currency in 1999. Now that such turbulence has passed, competition between banks is set at a different level – and this includes competition for a good reputation. In their current situation, gaining respect and acceptance from clients, be they individual account holders or large corporations, is essential.

A second reason may sound trivial but actually has strong potential. The new Green Protocol establishes the need to publish the results of the initiative (on a yearly basis for state-owned banks). It also demands that the agreement be reassessed every two years, in order to discuss its progress. This allows for increased stakeholder engagement, and is linked to the quest for reputation. Besides, the fact that banks were invited to work first on the draft of the Protocol, and then on its continuous improvement, has led to more consideration of the views of banks and to the prioritization of guidance and motivation over surveillance and punishment.

A third reason to believe that the new Green Protocol can be effective has to do with the recent implementation of similar arrangements at the international level. Despite a tradition of silence from international financial frameworks, schemes such as the UNEP Finance Initiative and the Equator Principles have brought social and environmental issues into the debate. The UNEP Finance Initiative has been in place since 2003, and more than 180 financial institutions are now signatories, two of them from Brazil. By subscribing to the Initiative, they recognize their role in promoting sustainability and commit themselves to considering environmental variables in their operations. Another arrangement, the Equator Principles, is more specific still. Created in 2003 and revised in 2006, with support from the International Finance Corporation, it is a set of directives for managing social and environmental impacts in project financing. As of March 2010, 67 financial institutions take part in the Principles, including two private and two state-owned Brazilian banks. The support of international organizations and the example of other companies around the world can help set voluntary arrangements in motion.

Even though the financial industry is no stranger to dealing with risk, the category of social and environmental risk is a new development, as are the approaches to dealing with it. Voluntary arrangements for regulating it via third parties could become a popular form of risk regulation. Ensuring sustainability was one of the six thematic pillars of the most recent World Economic Forum in Davos, accompanied by a call to redesign institutions, policies and regulations. Even though it is still too early to know whether voluntary initiatives can properly tackle the issue, the Brazilian example can provide a test and suggest a blueprint for the future.

Thiago Neto is an LSE Department of Accounting PhD student affiliated with CARR.


As we experience the aftershocks of the financial crisis, the scientific and regulatory debate is raging on. Contributions from multiple disciplines are shedding light on the causes of the crisis, and discussion is focused on ways of counteracting these causes. Like any other ‘man-made’ disaster, the current crisis is being scrutinized for its lot of wrongdoings and errors, at both an individual and a collective level. The ‘human factor’ proves again to be an important issue for the explanation and prevention of disasters, one that has been increasingly scrutinized in post-accident studies. The same general questions are being asked: can we track the link from disasters all the way back to the decisions of company managers, employees, or regulators? And if so, is there a way to avert disasters by influencing all those actors so that they behave ‘safely’?

The answers to these questions are never simple. The current debate provides evidence of the complexity of the issue. Like other large-scale accidents, the financial crisis is a tale of wrongdoings ranging from the petty to the scandalous, and of small or big errors committed by a great variety of people: including employees and managers in the banking and financial sectors, households, and regulators. As in almost every tale of a major accident, all sorts of human errors can be found, either as contributing causes or as missed opportunities to avert the crisis. These errors can sometimes date back to long before the disaster itself. The current crisis is no exception, with claims that some of its roots lie in regulatory changes decided some 30 years ago. This implies that the relation between human error and a possible disaster is generally unclear in advance. Rather, the causal role of individual or collective choices is frequently revealed only with the benefit of hindsight.

Thus, the exploration of human failures and of their disastrous consequences provides multifarious vignettes of human agency. It is great material for theory and policy building. This fact has not escaped many commentators, both inside and outside academia. Taking stock of the financial crisis, a number of recent publications, principally authored by representatives of the ‘behavioural’ stream in economics (Akerlof, Shiller, Ariely, Sunstein, Thaler), are doing just that. They provide a fundamental critique of the behavioural assumptions supporting mainstream economics, and call for a new set of policies that are not founded on the dominant view of rational action in economics. This is fresh air for the debate on human agency and its role in generating or averting disasters.

Conflicting goals

These commentators have been keen to emphasize the role played by biases such as short-sightedness, and to pinpoint the incentives that have contributed to the pervasiveness of these biases in the financial sector and the economy more generally. Social environments cultivating the pursuit of short-term self-interest have been said to enhance excessive risk-taking in multiple instances at the expense of safety, and thus to have contributed to the systemic disaster that ensued. This is reminiscent of another pervasive argument in risk research. According to a number of post-accident studies, errors that pave the way for disastrous outcomes may be the result of conflicting goals. In particular, risk researchers frequently lament that productivity, or the search for profit in the short term, may conflict with safety, at the expense of the latter.

Undoubtedly, pressure to improve short-term performance can have harmful consequences when the related tasks involve risk-taking. The idea that goal conflict contributes to disasters is not limited to the goals of productivity or profit-making. Other motives may also be incompatible with safety. For example, hierarchy and the drive or pressure to abide by its rules can be a reason for isolated but critical opinions to remain unspoken or unheard. There again, powerful anecdotes in the literature on man-made disasters suggest that one’s courage to speak up and others’ aptitude to hear could, in some cases, have helped avert catastrophes.

Yet it has become banal for people and organizations in advanced economies to pursue conflicting goals. On the one hand, the goals of productivity and profit are pervasive. They have become the dominant motives in multiple domains and social activities. On the other hand, organizations and individuals are faced with pressing demands for the acknowledgment of collective interests – safety, health, the protection of the environment etc. It is unlikely that incompatibilities between goals could be eliminated without a radical reorganization that would shield hazardous sectors from environmental pressures and from perverse incentives (if that were at all possible).

Incentives

In the ongoing debate on the financial crisis and how it should be dealt with, propositions have been made to address this issue of conflicting goals and its adverse, sometimes disastrous effects. Some argue for putting in place better incentives for risk-takers. For example, the Financial Services Authority and some representatives of the financial industry have advocated remuneration systems rewarding long-term rather than short-term performance. This appears to imply that incompatible goals could and should be translated into the same commensurable terms of costs and benefits. In other words, if safety could be priced and if that price could be inserted into everyone’s utility function with incentives, goal conflict could be resolved.

Even if it takes into account the problem of short-sightedness, this is a very partial take on incompatible goals and risk-taking. It relies on a single type of motive: the maximization of self-interest. As such, it builds on a remarkably incomplete view of what makes people tick, although sometimes a highly efficient one. Social norms, in particular, are completely ignored. There is also the question of how effective incentives would be if the circumstances of risk-takers were to change, for example if so-called ‘golden opportunities’ for rapid and large profits emerged. Arguably, self-interest in itself is not a good guarantee of behavioural consistency in the long run. As the French sociologist Émile Durkheim wrote, ‘Self-interest is, in fact, the least constant thing in the world. Today it is useful for me to unite with you; tomorrow the same reason will make me your enemy.’ Take away a genuine concern for safety on the part of the regulated, and incentives might work only as long as the returns of bold risk-taking do not dwarf the benefits promised to a more conservative course of action.

CARR RESEARCH

Harnessing motives for disaster prevention

Julien Etienne discusses the role of human factors in precipitating financial and other organizational disasters.

Harnessing motives

As an alternative, I would argue that incompatible goals are not necessarily a problem. In fact, they are not a specificity of hazardous organizations. On the contrary, it is very common for individuals to pursue incompatible goals on a daily basis. To do so, they do not aggregate payoffs, nor consciously build compromises between goals. Rather, motives influence cognition and drive attention towards certain choice criteria and away from others. In that process, secondary goals, such as safety, play a role by partly diverting one’s attention away from one’s main objective, for example productivity. As a result, individuals tend to choose second-best options, fulfilling main goals in ways that do not fully undermine secondary ones. Thus, individuals pursuing productivity and safety might arrive at better choices precisely because they are holding and trying to achieve different goals simultaneously.

And it works. Empirical studies on risk management in various industrial sectors have shown that, when safety is a strong enough concern for managers, even if it is only a secondary concern, it may consistently bend individual and organizational decisions in favour of safer, so-called ‘second order’ options. Besides, in degraded conditions where the potential for failure and accident is greater, safety can temporarily take precedence over whatever goal dominates in more settled times, as has been observed in so-called High Reliability Organizations (HROs). When trouble was in sight, individuals in HROs stopped interacting according to the rules of hierarchy and instead deferred to expertise. For example, a less senior member of the organization would instruct a more senior one on what to do. Yet hierarchy remained in the background as an important goal, and was actually reinstated as soon as things improved.

All in all, incompatible goals are a problem only when one of the goals is strong enough to cancel out any other consideration: when productivity is to blame for a disaster, it is because there was no place for safety in the minds of decision-makers. Otherwise, for individuals and organizations to simultaneously pursue potentially conflicting goals such as productivity/profit and safety is a situation that could lead to a positive outcome.

Risk awareness

Obviously the desire for productivity and profit does not need to be strengthened by regulators. But many risk-taking individuals and organizations could be made more attentive to disaster prevention and safety. It might not seem like much, and it certainly is only one strategy among several that could be implemented to prevent future errors. Nevertheless, risk awareness helps to develop a concern for safety. As a result, safety might then be pursued for its own sake, not as another piece of a remuneration package.

In the past, organizations have answered such pleas for safety by setting up risk management departments. That sometimes meant that the rest of the organization did not have to pursue safety and could focus instead on other goals. Safety then became the hostage of internal politics, and so trade-offs remained likely. But the propensity to trade safety for productivity would be reduced if individuals shared a genuine concern for both motives across the whole organization, especially at management level.

To conclude, harnessing motives can be a powerful way of regulating risk-taking. This does not mean changing the options open to individuals in a given situation: for example by setting up incentives to act in a certain way, by proscribing certain behaviours and threatening to sanction non-compliers, or by ‘nudging’ individuals into making specific choices. Rather, it means influencing the criteria used for choosing a course of action, regardless of what situation the individual is in. Conversely, incentives can only be a cure for settled times. To address the behavioural sources of disasters, regulators should make use of the many motives influencing individuals, rather than focusing only on self-interest in its various forms. This means cultivating the not-yet-moribund capacity of individuals to act according to what they think is appropriate, even if this is not the most profitable course of action in the short run.

Julien Etienne is an ESRC Postdoctoral Fellow at CARR.


Warning systems against impending disasters are often thought of as arrangements of technology, with airport security screening being an obvious example, however contested, of making an environment secure. Other examples include the US National Oceanic and Atmospheric Administration’s modelling and measuring apparatus for tsunami forecasting, and FEMA’s ‘Integrated Public Alert and Warning System’. In household and office settings, smoke detectors emit alarms or automatically trigger flows of water to drown out flames.

Less apparent than such technologies are experts and scientists whose work, while often primarily devoted to more basic issues, contains mechanisms of warning. It was scientists, for instance, who first told us to stop smoking and to eat vegetables, to wear seat belts and condoms. Scientists were the first to warn that the consequences of nuclear war would be severely different from even the largest conventional war. It is scientists who now tell us that we must attend to climate change.

To some degree, this relationship is obvious enough, but it is often under-considered in relation to overall strategies of safeguarding populations. One reason for this lack of appreciation is that we’re used to thinking of warnings happening quickly, precisely and in the time immediately before a crisis. But knowledge of a problem, including a catastrophic one, often grows only gradually, without the familiar emergency narrative. There is no single and distinctive moment in which to announce an impending disaster. Scientists who study seismic hazards in the Caribbean, for instance, were not surprised that a large earthquake happened on Hispaniola, and indeed had been predicting it for some time. Nor were they surprised that the death toll was appallingly high, because poor construction practices had created high vulnerabilities. But they couldn’t predict exactly when the earthquake would happen or precisely where. Instead, they understood the processes building up, the inevitability of mass destruction and, following on from this, the measures that could have been taken to reduce risk.

Science can sometimes offer clear and timely warnings. As Hurricane Katrina developed in the Gulf, the director of the US National Hurricane Center, meteorologist Max Mayfield, issued precise and accurate warnings of what was to come. Only months before, under the aegis of a virtual ‘Hurricane Pam’, the entire scenario had been remarkably well simulated as a table-top exercise, in the presence not only of local experts but also important bureaucrats and political leaders. Many people heeded the warnings of this exercise, combined with the science-based evidence gathered over prior decades. Some ignored the warnings, however. Indeed, one of the remarkable features of Katrina was the contrast between the accuracy of the advance analyses and the evident lack of preparation on the ground.

We have interviewed many dozens of scientists, engineers, and other experts who were working in Louisiana before Katrina. Katrina’s experts provide a good basis for learning how science can ‘warn’, at least in some operational settings. Our goal has been to determine what exactly experts did warn about, and how they did it. We wanted to discover the degree to which ‘warners’ were organized in advance of events – the ways they resembled an ‘Integrated Public Alert and Warning System’ versus an ad hoc and perhaps even silent scatter of knowledgeable individuals. We wanted to discover the potential for greater effectiveness in these regards.

Warners’ interactions

A finding of our work is that the Katrina experts resemble the ad hoc model more closely than the integrated system model. Integration problems among security experts at the bureaucratic level are legendary in the post-9/11 world. But the terror-fighting world has always been privileged – even before 2001 – in benefiting from the formal organizations within which the intelligence agencies operate. There is a basis for inter-organizational linkage, and personnel whose job it is to help information move from one place to another and to issue warnings. Formal organizations can use standard operating procedures and take advantage of relatively centralized decision-making; including, in the case of anti-terrorism, access to authority at the highest level.

From our interviews in the Louisiana Gulf region, by contrast, we found that the relevant experts lack mechanisms to co-ordinate or aggregate disparate voices. It is a stove-piped system, with many closed ends and, with some exceptions, only weak couplings. Experts live within their specific disciplines, university departments, mission-driven government agencies and profit-targeting private corporations. With each type of position (and there are variations within type) there are alternative modes of communicating with lay publics, political leaders, and other experts. These variations give rise to different conceptions of ‘the cause’ of the Katrina disaster and, by implication, of who should be held responsible for the damage and what could have been done to mitigate it.

Scientists and experts in the Gulf region meet only periodically and irregularly. Most of their contact involves exchanging research results on topics defined by their respective disciplines and subfields. These exchanges can happen at professional association meetings or at workshops oriented toward more narrowly drawn academic specialties. At times, they meet as advisors to agencies, foundations, or other funding units.

In coming to their conclusions they rely in part on evaluations of one another. These judgments rest on the usual academic yardsticks, but also on personal trust born of direct contact with colleagues and their data. A lot turns on instances where detailed inspection of a colleague’s past work, perhaps their calculations or models on a specific problem, showed it to have been done with care and honesty. These experiences then become the basis for a willingness to trust subsequent work without repeating the same degree of inspection. This becomes important, for example, when a colleague makes a public pronouncement on some controversial matter. Not privy to the original data, nor with the time to replicate or even inspect the relevant materials, experts often trust that, other things being equal, what is being said can be treated as valid. Any impediments to such first-hand knowledge, such as physical distance or disciplinary boundaries, impede coordination among experts: their ability to reach common understandings and communicate with a unified voice is compromised. Boundaries among scientists thus work against effective warning systems.

going publicthe willingness of experts to speak out varies along a number of dimensions. experts in universities often express the view that their employers appreciate the publicity they generate, and so feel free to speak out. some faculties who want their results to enter the public sphere rely almost entirely on the public information offices of their campuses to publicize their work. the campus publicity office may pick up their articles in scholarly journals, and then issue a press release. In other cases, faculties are more pro-active; they forward their study results

lee clarke and harvey molotch discuss the difficult role of experts in disaster forecasting systems.

scientists as Disaster Warning Systems

guEStReseARCH


to a public information official or a journalist with whom they have previously dealt.

The ‘middle-man’ role of the campus public relations office gives the university’s administration the power to shape what is promoted off campus and how it is framed. A university’s support and enthusiasm might thus be tempered by the perceived risk to future funding from sources which may take offence at particular scientific findings, and especially at evident efforts to circulate them widely. The prospect of such offence may therefore also affect the willingness of researchers to pursue media outlets.

Respondents in government agencies reported clear limits to their media interactions, and often had to receive dispensation from those in positions of authority before making contact. Some of the concern is about litigation, the extent of which, in the case of Katrina, has become considerable. Some issues are ‘radioactive’ enough that government-employed experts refrain from bringing them up in public settings, or bring them up only in circuitous ways.

Scientists in private industry are usually the least free to voice their concerns. In the Louisiana Gulf region, a significant proportion of such experts are in the employ of the energy industry, which has major economic stakes in assessments of the sources of danger. Such assessments influence not only the conduct of energy exploration and development, but also the industry’s potential to develop its extensive real estate holdings: land officially deemed ‘safe’ from floods can be subdivided and marketed.

At the other end of the continuum are experts in the employ of environmentally oriented non-governmental organizations. They consider their employers to be solidly in support of outreach activity and do all they can to generate media coverage. They warn and warn; it is precisely their mandate to find and persuade larger audiences.

The stress of vocation

The classic issue for science when dealing with disaster can loosely be summarized as whether to remain ‘pure’ or to present a case for imperative action. The problem of ‘warning’ brings this issue into sharp focus, because the efficiency and speed of translation from laboratory to social and political implementation becomes radically important. To warn someone is to urge them to change what they are doing, and that, in the mainstream interpretation, is not the proper business of science.

There are, of course, middling variations on the continuum between pure research and strident activism, but these fall between two extremes. At one extreme is the expert who lives behind a wall of monitors or in a lab, writing technical papers, going to conferences of the like-minded, and so forth. Such individuals may have little conception that their knowledge is relevant to larger publics, or may believe, as a matter of principle, that it is not their role to participate directly in the public realm. At the other extreme are experts who are willing to speak to reporters and to argue for controversial policy positions. They may appear before legislative committees or seek out influential politicians. There are experts who write op-eds and provocative books. Several among our informants even switched career paths, moving to positions at, for instance, advocacy NGOs. Driven by regional commitments and emotional frustrations, they make unusual efforts to step out of the science-as-usual behavioural mould.

And here is a basic problem for scientists: it is important to avoid being seen as ‘alarmist’, but, as a matter of influencing policies that might mitigate disasters, fear and anxiety are useful tools. The pronouncements of the Intergovernmental Panel on Climate Change reflect this tension: the experts involved want people to worry and take action, but don’t want to sound alarmist in the process. So there is a continuous tension between the direct expression of emergency concern and a stance of dispassionate weighing of alternative explanations and courses of action. The challenge for the relevant institutions is to negotiate this tension and issue credible warnings that work.

Lee Clarke is Professor of Sociology at Rutgers University. He was recently named a Fellow of the American Association for the Advancement of Science.

Harvey Molotch is Professor of Sociology and Metropolitan Studies at New York University.


CARR RESEARCH

Gathering data for medical regulation

Sally Lloyd-Bostock explores the General Medical Council’s risk data.

The UK General Medical Council (GMC) came under intense scrutiny in the early 2000s following cases of ‘rogue’ doctors – notably Harold Shipman – and is in the midst of implementing the most radical programme of reform in its history. Against this background, I have been studying the information gathered by the GMC, to clarify the nature and potential uses of its data.

Gathering and recording information is central to the GMC’s core function of maintaining and controlling a register of doctors qualified to practise in the UK. The GMC records data about every doctor registered, and information about cases where a doctor’s fitness to practise has been questioned. There are currently approximately 230,000 doctors with live registrations; in 2008, 5,195 doctors were the subject of complaints or referrals to the GMC, 204 fitness to practise panel hearings were held, and 42 doctors were erased from the register. In 2004, the GMC implemented new fitness to practise procedures, and in 2006 it went live with a new electronic database (known within the GMC as ‘Siebel’) that now contains both the register and the fitness to practise data.

Can the data in the new database be used for wider regulatory purposes, such as identifying risks and targeting regulatory activity? Databases such as the GMC’s can appear to be goldmines of information, but there are strong reasons for caution. Data gathered for one purpose are often ill-suited to another. From the study I conclude that, on their own, the data cannot reliably tell us much about wider patterns of risks to patient safety from poorly performing doctors; but the database does tell us quite a bit about the GMC as an organization.

The underlying premise of the research is that information in a database is always a function of the context in which it is created, and that this affects the uses it can validly serve. Three broad kinds of influence on the GMC’s data were distinguished: source factors, GMC factors, and database factors. Most previous work on patient-safety related data has concentrated on the first of these: the sources of information. The GMC relies on concerns about doctors being brought

to its attention. This, in turn, depends on the identification of a ‘problem’ to which the GMC is seen as the appropriate body to respond, and also on a willingness to enter a possibly unpleasant social process that can damage relationships.

The fitness to practise data were clearly shaped and limited by the sources of complaints or referrals, and by the inherently antagonistic nature of making them. The substance of complaints and referrals reflected the knowledge, experience and obligations of those making them, whether members of the public or organizations. Referral to the GMC was seen as a last resort by medical directors in NHS trusts as well as by members of the public. The fitness to practise data are thus unlikely to be representative of concerns about doctors, let alone of risks to patients from poorly performing doctors.

The second type of influence – GMC factors – recognizes that the scope and quality of data recorded by an organization are determined by its organizational purposes. The registration data directly reflected the GMC’s function of maintaining a register of doctors qualified to practise. The GMC must establish certain facts in order to grant registration, including details of qualifications and where they were obtained. This information, and the identity of the doctor, is carefully checked before registration is granted. Beyond that, little is recorded about the doctors registered. Register data cannot be narrowed down to doctors currently practising, let alone to those currently practising in a particular field. Date of first registration is recorded, but date of birth is not known with any accuracy. Specialization is recorded either at the time of registration or when the doctor notifies the GMC, but as doctors often move area of training the information is known to be unreliable. The register does not indicate where the doctor is now employed, nor indeed whether s/he is practising at all. The nature of the register data clearly limits its value for identifying risk factors. Analysis comparing register data and fitness to practise data is possible, but there is a danger that information available on the register will loom larger than it would if other information were also available, and explanatory factors will be missing.

Similarly, the fitness to practise data were directly structured by the GMC’s legally defined functions and powers. The tight focus of its concerns was the fitness of individual doctors to continue to practise, not the identification of risks to patients. Its fitness to practise procedures, governed by rules with legal force, are designed to identify, direct and develop cases that might lead to action on a doctor’s registration. The allegation codes used in Siebel are designed not to capture what is alleged, but rather to define a potential case within the GMC’s powers. The codes are directly related to matters covered in the GMC’s guidance for doctors (Good Medical Practice), which sets out what is expected of a doctor and what behaviour might lead to a doctor’s fitness to practise being questioned; and what is coded is shaped by what is deemed appropriate for investigation and potentially provable.

The GMC must also manage risks to itself as an organization. These include risks associated with its high public profile and its vulnerability to criticism and blaming strategies: maintaining trust in the medical profession is an explicit part of its remit, and this entails maintaining trust in itself as a regulatory body. Its legal remit is continually changing and sometimes fuzzily defined, and it relies extensively on other bodies to provide reliable information. Furthermore, demonstrable compliance with legal requirements and procedures is essential to the organization. Managing these various risks forms an integral part of the organizational context, and the fitness to practise database serves as a risk management tool in itself.

The third type of influence considered was the database itself – its configuration and its use in practice. Once in place, a database structures and filters information, acting as the ‘scaffolding of knowledge’, and, as electronic databases proliferate, a literature is growing on their effects on an organization’s information. Any database obscures complexity in systematic ways. Siebel copes not only with a complex world, but also with complex and detailed procedures and legal requirements. The legal framework governing the work of the GMC resulted in a correspondingly complex and specialist database configuration.


The configuration and use of the GMC’s database reflected organization-level concerns with legal compliance, defensibility, organizational legitimacy, and day-to-day case management, producing data of limited wider value for identifying or assessing risks to patients from poorly performing doctors.

The value of sweeping overviews of risk factors in this context is in any case questionable. The recent period of GMC reform has coincided with strong government support for risk-based regulation across regulatory fields, and it is not surprising to find the rhetoric of risk-based approaches in debates surrounding reform of the GMC. The GMC’s need for reliable information about risks could be seen as an example of a much-rehearsed problem raised by any attempt to adopt risk-based approaches: they rely heavily on good quality information about risks and can only be as good as the available data. However, the relevance and applicability of risk-based approaches to medical regulation has been fundamentally questioned by the GMC itself, the Chief Medical Officer and others. While the GMC’s data cannot be used in any straightforward way for purposes of risk-based regulation, special studies and modifications to the database and the types of data collected would make it possible to produce much more valuable information about risks to patients from poorly performing doctors, and about where such risks are most likely to arise.

Disasters

The research also cast an unusual light on the role of disasters and high-profile incidents. The potential for high-profile events to command attention is often seen as double-edged: they can lead to productive analysis, but they can also command excessive attention and resources at the expense of less visible risks. The GMC is undergoing radical changes, often seen as resulting from highly publicized scandals. In particular, extensive criticism following the Shipman case has had a lasting impact on the GMC. The name ‘Shipman’ is likely to come up quite quickly in any discussion of the GMC, and a line can indeed be traced from that case to several current GMC reforms. In the Fifth Report of the Shipman Inquiry, published in 2004, Dame Janet Smith was highly critical of the GMC. The Chief Medical Officer, Sir Liam Donaldson, responded immediately with a detailed agenda for reform of medical regulation, which in turn led to a White Paper in 2007 and subsequent legislation to implement reforms.

However, Dame Janet’s criticisms and the current reforms have little directly to do with remedying GMC failings that might have contributed to Shipman. In fact, Dame Janet concluded that the GMC could not be held responsible for Shipman remaining free to practise even while he was murdering patients. The impact of the case on the GMC owes as much to the terms of reference of the various early-to-mid-2000s ‘rogue doctor’ inquiries as it does to the Shipman case in particular. Indeed, it could be argued that other cases represented more containable risks, clearer grounds for criticism of the GMC and more opportunities for change to help prevent similar cases in the future. By the time Shipman’s crimes came to light, changes already in place to GMC procedures for dealing with doctors with health problems had made it unlikely that a similar case could escape closer scrutiny. However, consideration of GMC procedures did not fall within the remit of inquiries such as those into the cases of Ayling and Neale.

Although the remit of the Shipman Inquiry was defined with reference to the Shipman case, Dame Janet interpreted the terms of reference widely, on the grounds that they included making recommendations for the better protection of patients in the future. On this basis, much of the Fifth Report of the Inquiry was devoted to criticism of the GMC’s ethos and procedures, and to recommendations for reform. The recommendations were aimed not so much at preventing ‘another Shipman’ as at giving impetus and shape to reforms of the GMC already proposed or in progress but moving slowly. Similarly, within the GMC it was the Shipman Inquiry, rather than the case itself, that had an impact. The indirect effects of the case and its aftermath were evident, for example, in the way the GMC’s new fitness to practise procedures were designed and implemented, including the use of the database itself as a defensive tool. The question is whether wider societal changes, and a reform process already under way, created the environment for cases to become highly publicized scandals, and for the Shipman case to become a vehicle of reform.

Sally Lloyd-Bostock is a Professorial Research Fellow at CARR.

I am very grateful to the GMC for extensive help and co-operation with the research, which was supported by ESRC award number RES-153-25-0087.


CARR EVENTS

Risk, Technology and Disaster Management

Chris Lawless recounts a recent ESRC event hosted by CARR, which focused on post-disaster management.

In March, CARR organized a public seminar on ‘Risk, Technology and Disaster Management’ as part of the ESRC’s Social Science Week.

The seminar showcased some of the country’s leading social science research and illustrated its impact on our social, economic and political lives. It was part of a series of over 130 events which took place between 12 and 21 March all over the UK. This particular event focused on the ways in which advances in science and technology are being used in disaster response, and on the technical and social risks associated with the use of such technologies. It also provided a forum for practitioners, policymakers and publics to debate the technical, social and ethical aspects of disaster management and relief, and the risks associated with these undertakings.

The recent severe earthquakes in Haiti and Chile prompted a worldwide response, yet also provided a stark reminder of the practical and logistical problems associated with efforts to deal with the aftermath of mass disasters. While there will always be a need to bring rapid relief to survivors of disasters, those involved in managing that relief also face a considerable task in identifying the remains of the dead. Identification in death is a fundamental human right, specified by the terms of the Geneva Convention, and is necessary for many reasons. As well as bringing much-needed psychological closure to families and loved ones, identification may be needed to facilitate the settlement of estates, or to allow the pursuit of insurance claims. In other cases, identification of the deceased may be instrumental in pursuing criminal prosecutions, such as in cases of suspected genocide.

Hence, while modern science and technology play an important role in all areas of disaster management, the skills and technologies of forensic scientists are finding increasing application in such scenarios. Sophisticated DNA profiling techniques now form an integral part of modern Disaster Victim Identification (DVI) strategies, and are beginning to play an instrumental role in mass fatality incidents. These newer techniques have served to enhance and complement longer-established forensic disciplines. For example, anatomical and anthropological methods continue to play an important role in identifying disaster victims, and can also provide vital evidence in man-made catastrophes, including acts of terrorism or politically motivated killings, such as the ‘ethnic cleansing’ that occurred during the Balkan civil wars of the 1990s.

Although disaster response teams have keenly embraced cutting-edge science and technology, the use of such sophisticated methods comes with risks that demonstrate how the deployment of technology is inextricably bound up with political, social and ethical considerations. Such interdependencies present numerous challenges, and the seminar sought to bring these to wider attention.

CARR welcomed a number of distinguished speakers to the seminar, which was chaired by Professor Bridget Hutter. It began with a presentation by Professor Sue Black OBE, who is based in the Department of Anatomy at Dundee University. Sue specializes in forensic anatomy and anthropology, and has extensive experience of applying her DVI expertise in areas including Kosovo, Iraq, and Thailand in the aftermath of the 2004 tsunami. Sue gave an account of a number of mass fatality events, including those in which she had personally been involved, and highlighted a number of issues faced by DVI workers. These included threats to their own personal safety when working in conflict zones such as Kosovo, and the pressures brought about by the often intense scrutiny of both media and local populations.

Sue’s presentation also emphasized the highly variable conditions under which DVI workers may have to operate. In Sierra Leone, for example, workers had to deal with facilities that lacked basics such as running water. She illustrated how environmental conditions affect the DVI process. Bodies may decompose rapidly in hot conditions, for instance, being reduced to skeletons in as little as six days. Such conditions, she explained, shorten the time-window for convenient identification and strongly influence the kinds of techniques that DVI workers employ. Identification by DNA analysis is greatly complicated by badly decomposed bodies, and hence the skills of forensic anthropologists in identifying skeletal remains may be required instead. Various forms of expertise may therefore be needed to bring about identification. A number of broad parameters shape the technicalities of DVI work, including the magnitude of casualties, the type of disaster (man-made accidents, acts of violence or natural disasters), and the need to provide prosecutorial evidence. Professor Black closed her presentation by summarizing these similarities and differences, and explaining their ramifications for DVI workers in mass fatality situations.

The next presentation was given by Commander Nick Bracken OBE of the Metropolitan Police Service (MPS), currently Area Commander of East London. During his career at the MPS, Nick has led numerous high-profile investigations. He was the Senior Investigating Officer for the Winsford, Paddington (Ladbroke Grove) and Selby train crashes and the reviewing officer for the Tebay rail accident. He was also appointed the Senior Identification Manager following the Yarl’s Wood immigration centre fire in Bedfordshire, and, 16 years after the 1987 King’s Cross underground fire, he led the successful effort to identify its last unidentified victim. In 2004, following the Asian tsunami, he was appointed the Senior Identification Manager for the UK’s response and subsequently became the International DVI Commander.

In his presentation Nick gave the police officer’s perspective on this difficult process. He described his role as that of a conductor, orchestrating the various forms of expertise involved in DVI, and he emphasized the highly contingent nature of disaster response. He described the need to make quick and effective decisions about which forms of expertise may be needed in specific situations.

While acknowledging that each disaster presents its own unique set of challenges, Nick regarded misidentification as the biggest risk in all DVI work. He explained that the identification process can be complicated in all manner of ways, and he particularly emphasized the significance of ante-mortem (AM) data in reliable identification. This kind of information may take various forms: in transport-related incidents such as plane crashes, passenger lists are a basic but vital tool, while bioinformation databases, such as those holding dental records or DNA, represent more sophisticated forms of AM data. Nick stressed the national differences that exist with regard to



such bioinformation. Scandinavian countries, for instance, possess well-developed dental databases, which have facilitated DVI in several instances, whereas in other jurisdictions, such as France, other forms of data, such as fingerprints, are more readily available. Effectively managing and utilizing this diversity of information represents a major challenge in incidents such as the 2004 Asian tsunami, where casualties were drawn from a wide cross-section of nationalities.

A response to Sue and Nick’s presentations was given by Roger Baldwin. A member of the senior management team of the MPS Directorate of Forensic Services at New Scotland Yard, Roger has the strategic lead for Disaster Victim Identification and mass fatality forensic systems. Roger built upon the themes of the previous talks by emphasizing the political and cultural pressures that accompany DVI work. He pointed out that cultural norms regarding identification may differ somewhat in particular locales; for example, certain religious requirements may not always demand full identification despite strong political pressures to provide it, which may complicate the wider project of mass identification. The ever-growing influence of the media may also hinder the work of investigators, creating pressure to generate rapid results that may not reflect the practicalities of the work involved.

Roger also raised the issue of the ongoing controversies around the scientific status of certain forensic technologies. A recent report published by the US National Academy of Sciences has echoed similar concerns about the reliability of such technologies, and has raised issues about the way in which scientific evidence is perceived and understood. Roger, however, pointed out that while no scientific technique can ever fully identify an individual, scientific evidence can contribute to the reliable exclusion of possible identifications. Eliminating possibilities therefore plays an important role in the identification process. On the other hand, while no single forensic technique can be regarded as producing full certainty, a multi-modal approach involving a repertoire of techniques can still contribute to the identification of individuals beyond a reasonable doubt.

Following Roger’s response there was time for a short but enthusiastic discussion of the issues raised by the presenters. Among the items discussed further was the historical tradition of personal identification. While this can be seen to have its roots in ancient history, the work of the nineteenth-century police scientist Alphonse Bertillon was cited by panel members as a key influence on the tradition of forensic science. His innovations in the collection and archiving of various standard body measurements, through which individuals could be characterized, are widely seen as the first systematic attempt to produce a consciously scientific and quantitative form of personal identification. Even modern methods such as DNA profiling owe a certain methodological debt to this kind of system.

Much discussion also focused on the challenges facing efforts to co-ordinate the various national responses to mass disasters. At a broad level, there is the problem of co-ordinating the disaster response among the various actors involved, who may include representatives of several nations as well as a series of non-governmental institutions. Effective co-ordination may, however, be hampered by political posturing: the occurrence of disasters in specific locations may bring certain political aspirations into play on the part of national governments. Regional powers, for example, may feel a responsibility to provide a particularly comprehensive response to disasters occurring close to home, which may reflect political rather than purely altruistic motives, and may not be entirely conducive to effective and efficient international co-operation. The supply of advanced technological aid depends on the attendant presence of adequate expertise, but also requires those in power to have a sufficiently well-developed sense of precisely how this expertise should be used and where it should be deployed. Clear channels of communication and well-developed understandings within chains of command are instrumental in ensuring that technology, along with the requisite skills and know-how, is used to best effect.

Panel members discussed how the various disaster response teams involved in the Asian tsunami came together to implement quality assurance methods, which represented a particularly sophisticated and effective form of international co-ordination. The extent to which jurisprudential arrangements for DVI might be effectively harmonized on a global scale was also discussed. The decision-making process regarding the identity of an individual (and, indeed, the nature of ‘death’) spans an extensive network of actors and may be greatly complicated by jurisdictional differences. The panel concluded that this presented a significant barrier to future international co-operation efforts. While Interpol has played a prominent role in this area, it was felt that more could be achieved, possibly through other international organizations. In closing, however, the panel stressed the significance of learning from the experience of each specific incident.

Chris Lawless is an ESRC Postdoctoral Fellow at CARR.


CARR IN PRINT / CARR EVENTS


PUBLICATIONS

The Role of Risk Regulation in Mitigating Natural Disasters
Bridget M Hutter, in Learning from Catastrophes: Strategies for Reaction and Response by Howard Kunreuther and Michael Useem, Wharton School Publishing, 2009.

Legitimation by Standards: Transnational Experts, the European Commission and Regulation of Novel Foods
David Demortain, Sociologie du Travail, 51(2): 104-116, 2009.

Trust and Technology: The Social Foundations of Aviation Regulation
John Downer, British Journal of Sociology, 61(1): 87-110, 2010.

Why the ‘Haves’ Do Not Necessarily Come Out Ahead in Informal Dispute Resolution
Sharon Gilad, forthcoming in Law & Policy.

Managing Epistemic Risk in Forensic Science: Sociological Aspects and Issues
C J Lawless, forthcoming in Sociology Compass.

RECENT CARR DISCUSSION PAPERS

www.lse.ac.uk/collections/CARR/publications/discussionPapers.htm

DP 63: A Curious Reconstruction? The Shaping of ‘Marketized’ Forensic Science
Chris Lawless, May 2010

DP 62: The Libertarian Origins of Cybercrime: Unintended Side-effects of a Political Utopia
Jeanette Hofmann, April 2010

DP 61: Anatomy of a Disaster: Why Some Accidents are Unavoidable
John Downer, March 2010

DP 60: Silence of Evidence in the Case of Pandemic Influenza Risk Assessment
Erika Mansnerus, February 2010

DP 59: The Impact of Regulatory Policy on Individual Behaviour: A Goal Framing Theory Approach
Julien Etienne, January 2010

DP 58: The Many Meanings of ‘Standard’: The Politics of the International Standard for Food Risk Analysis
David Demortain, January 2010

DP 57: Rights as Risk: Managing Human Rights and Risk in the UK Prison Sector
Noel Whitty, January 2010

DP 56: Modelled Encounters with Public Health Risks: How Do We Predict the ‘Unpredictable’?
Erika Mansnerus, November 2009

Risk and Performance Management in Major UK Public and Private Sector Organizations: A Tale of Contrasting Cultures
Professor Margaret Woods, Nottingham University Business School
Seminar: Tuesday 1 December

‘Risk Assessment Policy’ – A Critical Innovation for Both Scientific and Democratic Legitimacy
Professor Erik Millstone, University of Sussex
Seminar: Tuesday 9 February

Protecting Children from Maltreatment and Protecting Agencies from Blame: Can They Be Compatible?
Professor Eileen Munro, London School of Economics and Political Science
Seminar: Tuesday 9 March

The Risk University: Organizational Risk Management in English Higher Education
Professor Michael Huber, University of Bielefeld
Seminar: Tuesday 4 May

Technoscience and Risk Regulation: A Non-trivial Relationship
Dr Karen Kastenhofer, Austrian Academy of Sciences
Seminar: Tuesday 18 May

Away Day

CARR staff took part in an away day at the Churchill Museum & Cabinet War Rooms in January. The day provided an opportunity to discuss intellectual strategy and developments in the field of risk and regulation; sessions included a discussion of 'Values in Risk-based Management and Regulation'.

CARRPEOPLE


CARR Research Staff

Bridget Hutter
CARR Director, Professor of Risk Regulation
Sociology of risk regulation; risk management; regulation of economic life; the impact of state and non-state regulation; risk regulation in Asia.

Matthias Benzer
Peacock Fellow
Socio-scientific approaches to suffering, dying and death; the quality of life debate; healthcare regulation; sociological methodology; sociological theory.

David Demortain
ESRC Research Officer
Sociology of regulation and risk management; sociology of expertise and scientific advice.

John Downer
ESRC Research Officer
Sociology of knowledge; epistemology of technological risk assessment; regulation of complex and dangerous technologies.

Julien Etienne
ESRC Postdoctoral Fellow
Compliance theory; major accident hazard regulation; incident reporting; regulator-regulatee relations.

Sharon Gilad
ESRC Research Officer
Corporate responses to regulation; citizen-consumer complaints and complaint handling; retail financial services regulation.

Jeanette Hofmann
ESRC Research Officer
Internet regulation and the development of intellectual property rights.

Christopher Lawless
ESRC Postdoctoral Fellow
Sociology of forensic sciences; sociological and philosophical issues concerning the use of evidence and probability theory.

Sally Lloyd-Bostock
Professorial Research Fellow
Medical regulation by the GMC; the psychology of routine decision making, blaming and accountability; the construction and use of information about risk; regulation and compensation culture.

Martin Lodge
CARR Research Theme Director: Reputation, Security and Trust; Reader in Political Science and Public Policy
Comparative regulation and public administration; government and politics of the EU and of Germany.

Peter Miller
Deputy Director; CARR Research Theme Director: Performance, Accountability and Information; Professor of Management Accounting
Accounting and advanced manufacturing systems; investment appraisal and capital budgeting; accounting and the public sector; social and institutional aspects of accounting.

Michael Power
CARR Research Theme Director: Knowledge, Technology and Expertise; Professor of Accounting
Internal and external auditing; risk management and corporate governance; financial accounting and auditing regulation.

www.lse.ac.uk/collections/CARR

CARR Visiting Professors

Keith Hawkins, Emeritus Professor of Law and Society, University of Oxford

Frank Vibert, Founder Director, European Policy Forum

CARR Administrative Team

Christine Sweed, Centre Administrator

Anna Phillips, Web and Publications Administrator

Attila Szanto, Office Administrator

Katherine Taylor, Office Administrator

CARR Research Associates

Michael Barzelay, Professor of Public Management, LSE

Ulrich Beck, Professor, Institute for Sociology, Munich

Gwyn Bevan, Professor of Management Science, LSE

Julia Black, Professor of Law, LSE

Damian Chalmers, Professor in European Union Law, LSE

Simon Deakin, Professor of Corporate Governance, University of Cambridge

Anneliese Dodds, Lecturer in Public Policy, King's College London

George Gaskell, Professor of Social Psychology, LSE

Maitreesh Ghatak, Professor of Economics, LSE

Andrew Gouldson, Director, Sustainability Research Institute, University of Leeds

Terence Gourvish, Director, Business History Unit, LSE

Carol Harlow, Professor Emeritus of Public Law, LSE

Christopher Hood, Professor of Government and Fellow, All Souls College, University of Oxford

Michael Huber, Professor for Higher Education Research, Institute for Science and Technology Studies, Bielefeld University

Will Jennings, Hallsworth Research Fellow in Political Economy, University of Manchester

Roger King, Visiting Professor, Centre for Higher Education Research and Information, Open University

Liisa Kurunmäki, Reader in Accounting, LSE

Javier Lezaun, Lecturer in Science and Technology Governance, James Martin Institute, Saïd Business School, University of Oxford

Donald MacKenzie, Professor of Sociology, University of Edinburgh

Andrea Mennicken, Lecturer in Accounting, LSE

Yuval Millo, Lecturer in Accounting, LSE

Edward Page, Professor of Public Policy, LSE

Nick Pidgeon, Professor of Applied Psychology, Cardiff University

Tony Prosser, Professor of Public Law, Bristol University

Judith Rees, Professor of Environmental and Resources Management, LSE

Henry Rothstein, Lecturer, Centre for Risk Management, King's College London

Colin Scott, Professor of EU Regulation and Governance, University College Dublin

Susan Scott, Senior Lecturer, Information Systems, LSE

Jon Stern, Honorary Senior Visiting Fellow, City University

Lindsay Stirton, Lecturer in Medical Law and Ethics, University of Manchester

Peter Taylor-Gooby, Professor of Social Policy, University of Kent at Canterbury

Mark Thatcher, Professor of Public Administration and Public Policy, LSE

Kai Wegrich, Professor of Public Management, Hertie School of Governance, Berlin

Paul Willman, Professor in Employment Relations and Organizational Behaviour, LSE

Brian Wynne, Professor of Science Studies, Lancaster University

ESRC Centre for Analysis of Risk and Regulation, The London School of Economics and Political Science, Houghton Street, London WC2A 2AE, United Kingdom

Tel: +44 (0)20 7955 6577

Fax: +44 (0)20 7955 6578

Website: www.lse.ac.uk/collections/CARR/

Email: [email protected]