BBC Trust Impartiality Review: Making Sense of Statistics

August 2016

Contents

Trust Conclusions
Making Sense of Statistics – Report by impartiality review Panel
Foreword
Executive Summary
Introduction
The Panel
Statistics and the BBC – why it matters
Statistical literacy
The role of the BBC
Accuracy and impartiality – their application to the use of statistics
BBC journalists and the resources they can access
Statistical advice within the BBC
External expertise
Training and skills
How well does the BBC use statistics?
How appropriate is the BBC’s selection of the statistics it reports?
How accurately does the BBC report statistics?
Does the BBC explain and interpret statistics appropriately?
Context
Interpretation
Statistics in debate
Risk
Presentation
"Big numbers"
Transparency and correction
Some thoughts on the future
The rise of data journalism
List of contributors
BBC
External
Appendices
Executive Response

Trust Conclusions

Introduction

The BBC, as the UK’s main public service broadcaster, has a particularly important role to play in bringing statistics to public attention and helping audiences to digest, understand and apply them to their daily lives. Accuracy and impartiality have a specific meaning when applied to statistics. Reporting accurately and impartially on critical and sometimes controversial topics requires understanding the data that informs them and accurate and impartial presentation of that data.

The past decade has seen a huge increase in the public availability of facts and figures, an increase resulting from the near universal availability of the internet and the growing acceptance by public authorities of the need for transparency. This has been accompanied by a growing market for data journalism and an ever-greater reliance in public discourse on figures and trends, some of which can be taken at face value but many of which require analysis if their significance is to be understood. On top of that, facts and figures are often used – and abused – to create stories or to justify assertions that they simply cannot support.

Against this background, the BBC Trust decided to focus its final impartiality review on the way the BBC handles statistics and on what lessons can be learned in offering its audiences the best chance of understanding and evaluating what they mean. We therefore asked the former UK National Statistician, Dame Jil Matheson – supported by Paul Johnson of the Institute for Fiscal Studies, Sir Peter Stothard, former Editor of The Times, and Sir David Spiegelhalter, the Winton Professor of the Public Understanding of Risk at the University of Cambridge, and with detailed audience research and independent content analysis – to consider the reporting of statistics in news and current affairs output. The review set out to consider the following:

to assess whether BBC UK public services news and current affairs output achieves due impartiality and accuracy on controversial subjects when it includes statistical analysis;

to establish more generally whether the BBC provides duly accurate and impartial reporting of statistics in its UK public services news and current affairs output;

to assess whether BBC output provides accessible, critical analysis when reporting statistics to enable audiences to understand the issues; and

to assess whether BBC content demonstrates due impartiality and accuracy through appropriate challenge and scrutiny of context in its reporting of statistics cited by third parties.

This review consists of the following:

a report to the Trust, Making Sense of Statistics, by the independent impartiality review Panel, led by Dame Jil Matheson;


audience research conducted by Oxygen Brand Consulting;

content analysis undertaken by the Cardiff School of Journalism, Media and Cultural Studies at Cardiff University; and

a response from the BBC Executive, outlining the actions they intend to take in response to the review’s findings.

All these are published below and a brief summary of the main points can be found after the full Trust conclusions.


Conclusions of the BBC Trust

Statistics are used increasingly in debate about public policy and they therefore feature frequently in news and current affairs coverage. That trend seems likely to continue following Britain's decision to leave the EU. Trustees consider that, as the UK’s leading public service broadcaster, the BBC has a crucial role to play in helping audiences to evaluate the figures and the various policy arguments that are often based upon them.

The Trust is grateful to the independent Panel, led by Dame Jil Matheson, which undertook this review, carefully considering evidence and drawing on additional content analysis by Cardiff University and audience research by Oxygen Brand Consulting. The expert Panel's report and the research that accompanies it paint a striking picture of the extent to which audiences trust the BBC to give them an accurate and impartial understanding of what statistics can tell us. That places an immense responsibility on the BBC and all its journalists to be ever more sophisticated in understanding the figures they help put before the public and in recognising the risks of taking statistical claims at face value.

The report concludes that there are many outstanding examples of people and programmes at the BBC setting the highest standards. But it also shows that performance is far from uniform and there are many instances where reporting falls short of what audiences have a right to expect. The Trust is clear that a fundamental duty of BBC journalism over the next decade is to justify the public's confidence that, when they read or hear statistics from the BBC, the figures are given no more and no less weight than the evidence supports. We expect the BBC to build on its existing successes and boost its efforts to help audiences understand the statistics they are faced with every day.

The Trust welcomes the conclusion that the BBC has great strengths in this area and that people at the BBC place value on using statistics responsibly. In particular, Trustees note that:

the Panel’s observation that the BBC is often held in higher regard than other areas of the media in terms of its reporting of statistics is supported by the content analysis and audience research – the BBC appears to be a model of good practice in some areas;

BBC journalists appear to deal with statistics cautiously and the BBC is perceived to handle them in an accurate and impartial way;

there are many examples of BBC journalists providing relevant information and analysis which enhances audience understanding;

some current BBC journalists can be seen as benchmarks for the use of statistics; and

this success appears to be strongly rooted in an appreciation by staff of the BBC’s values.

These accomplishments are testament to the effort which the BBC clearly makes to take into account the impact statistics can have on audiences.


Despite these achievements, the Trust also shares the Panel's frustration that the BBC sometimes falls short in its reporting of statistics. The BBC needs to do consistently better in some areas and the Trustees support the introduction of initiatives which will build the capacity to achieve this. Key areas of concern which have been highlighted by this review are:

a lack of confidence among some journalists in the handling of statistics. It is crucial that practitioners feel confident to work with statistics in their daily reporting. The Trust shares the Panel’s desire to see a culture change to give more journalists confidence with statistics in their day-to-day work, supported by managers (recognising that it will not be practical for every journalist to become an expert in every use of a statistic);

a perceived reluctance sometimes to provide fuller interpretation of statistics for audiences. We support the recommendation for the BBC to be “better and braver” in this area – there can be legitimate constraints, but it is the BBC’s job to provide helpful explanations for audiences as far as possible;

an occasional failure to recognise that reporting on a devolved UK could be enhanced by greater use of statistics. Trustees would like to see them used more often to illuminate stories where there are measurable differences between different nations and regions of the UK;

a lack of appropriate challenge in some output when statistics are used by external contributors. Trustees acknowledge that the most experienced interviewers and best resourced programmes will challenge contributors who use statistics in a partisan manner, but the BBC should aim to achieve this more broadly throughout daily journalism; and

the BBC Editorial Guidelines do not currently include specific guidance on the reporting of statistics (although there are some relevant sections on related issues such as opinion polls, surveys and votes). The Trust agrees with the Panel’s recommendation that this should be addressed.

The Trust welcomes the recommendations made by the Panel which are intended to assist the BBC in making improvements to its statistical reporting. In particular, we support the conclusion that it will be vital for the BBC to improve its data journalism capabilities in order to develop new forms of investigation in the future.

Turning to the Executive’s response, we are pleased to note that the Executive has plans in place to develop data journalism as a lynchpin of its future strategy for reporting and analysis. The three areas the Executive have set out for immediate action – the creation of a hub for data journalism (in partnership with the News Media Association), the formation of a centre for data journalism linking to newsrooms in the four UK Nations, and the development of previous partnerships with centres of excellence in this area – could transform the BBC’s capabilities. There is an opportunity for the BBC to become the market leader in this type of journalism. We support the Executive in their ambitions in this area and in their commitment to partner centres of excellence.

We welcome the Executive’s decision to devise further online training for journalists in the use of statistics and to promote external guidance and develop the BBC’s own guidance. The lack of confidence in this area amongst some practitioners is a key finding of the Panel’s report and we would support and encourage all journalists to develop their skills and knowledge in this important area. We suggest that the training should incorporate information on how audiences understand statistics, given the evidence on the differing levels of audience understanding and the extent to which they rely on the BBC for accurate, impartial information.

We are particularly pleased to note the Executive’s agreement to make Reality Check a permanent fixture and we welcome the intention that it should go some way towards better enabling presenters to challenge contributors’ use of statistics where necessary. Finally, the Trust notes and endorses the Executive’s commitment that the News Editor and Head of Statistics will review biannually the progress of programmes and presenters in holding people in public office to account. Trustees have asked the Executive to give a report to the Trust’s Editorial Standards Committee in four months’ time (ie in December 2016) to update on progress on taking forward these actions. That report will subsequently be published and should include information about:

the BBC’s progress in its plans to develop its data journalism functions and partner centres of excellence;

the implementation of Reality Check as a permanent fixture in BBC news;

the development of internal guidance and the promotion of external guidance;

the rollout of online training in the use of statistics across the BBC’s journalists; and

an update on the progress of the three nominated programmes (the Today programme, Breakfast and 5 live Drive) in holding people in public office to account.

The Panel has written a well-evidenced, accessible and insightful report and we are keen that the Executive should take full advantage of their thoughts, suggestions and recommendations. We hope that full engagement with the issues outlined in this review will build on the BBC’s considerable existing successes to ensure that it remains the foremost provider of impartial and accurate reporting on statistics in the UK throughout the period of the next Royal Charter.

Richard Ayre BBC Trustee and Chair of Editorial Standards Committee


Making sense of statistics – report by impartiality review Panel

Foreword by Panel Chair

During my career I loved it when I met taxi drivers who said ‘that’s interesting’ when I told them what I did. Most statisticians I know want their work known and understood. That’s because statistics matter. They are used by politicians and policy makers, businesses, lobby groups, academics and charities to inform decisions that affect all of our lives. They can help us understand the world we live in and the circumstances of those beyond our experience. And they can be used to hold governments and others to account for their decisions.

But not all ‘statistics’ are equal. Uncertainty and caveats are inherent to statistical (and indeed scientific) work and few of us have the time or the skills to delve into the detail to help us make sense of all the data around us. We rely on others to interpret the numbers and tell us what they mean. The media and, in particular, the BBC, have an important role in presenting and interpreting the right numbers in the right way.

The BBC has a mission to educate, inform and entertain its audience. It’s where most people go for trustworthy, impartial news. Given the proliferation of statistics in news and current affairs, I was delighted that the BBC Trust decided to commission this review. How is the BBC fulfilling its remit with respect to statistics? Which statistics does it choose to report, and which not, and why? Can we trust that due care is paid in selecting and interpreting statistics and in challenging appropriately the selective use of statistics by politicians and others? These questions are at the heart of impartiality. The test that the Panel carrying out this review has chosen to apply is: are audiences being well served by the decisions the BBC makes when it comes to reporting and using statistics?

Good statistical reporting, and understanding, is not about maths. You won’t find any maths in this report. It’s about asking some basic questions of a set of statistics, the first of which is often “does it make any sense?”.

I want to thank my fellow Panellists, Paul Johnson, David Spiegelhalter and Peter Stothard, for helping me make sense of the evidence we accumulated during the course of this review. We drew on the content analysis carried out by Cardiff University and the audience research carried out by Oxygen Brand Consulting. We also spoke to around 100 people, inside and outside the BBC, and received written submissions from 27 organisations and individuals. Thanks go to them for their time, their interest and their ideas.


And particular thanks go to Natalie Rose and her colleagues in the BBC Trust Unit. The conclusions are the Panel’s, but we couldn’t have done this review without them. As well as administrative support, Natalie helped us find our way around the BBC, including advising on who we should talk to and how the different parts of the BBC fit together – a challenging task. I am grateful to her for her words, her advice and her organisation.

Dame Jil Matheson
Chair, independent Panel for the BBC Trust


Executive Summary

The BBC, as the UK’s main public service broadcaster, has a particularly important role to play in bringing statistics to public attention and helping audiences to digest, understand and apply them to their daily lives. Accuracy and impartiality have a specific meaning when applied to statistics. Reporting accurately and impartially on critical and sometimes controversial topics requires understanding the data that informs them and accurate and impartial presentation of that data.

Overall, the BBC is to be commended on its approach to the use of statistics. People at the BBC place great value on using statistics responsibly. Journalists often go to some lengths to verify the statistics they receive. They exercise judgement when deciding which statistics to cover and the BBC has a strong record in selecting and presenting statistics effectively. Journalists and programme makers often make attempts to challenge conventional wisdom and provide independent assessments of stories reported elsewhere. Many areas of the BBC give careful thought to the way in which statistics are presented for audiences and the BBC has prioritised responsiveness to mistakes in recent years.

Informed by the evidence supporting this report, including Cardiff University’s content analysis and Oxygen Brand Consulting’s audience research study, we have nevertheless identified some areas for improvement. These include the following:

Contextualising statistics

Numbers are sometimes used by the BBC in ways which make it difficult for audiences to understand whether they are really big or small, worrying or not. Audiences have difficulty in particular in interpreting “big numbers”. And a number on its own, without trends or comparisons, rarely means much. We recommend that much more is done to ensure that statistics are always contextualised in such a way that audiences can understand their significance (a worked example of this kind of framing follows below).

Interpreting, evaluating and “refereeing” statistics

The BBC has a responsibility to help audiences make sense of the statistical evidence in an impartial way. That involves being willing, more than at present, to weigh, interpret and explain the statistical evidence and, when appropriate, challenge and correct when it is misused. The BBC frequently presents different sets of statistics put forward by those on either side of an argument. Audience research, and our own discussions, showed considerable frustration with this way of presenting statistics and effectively leaving the audience to make up its own mind. The BBC needs to get better and braver in interpreting and explaining rival statistics and guiding the audience.
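To make the contextualising recommendation concrete, the arithmetic involved is simple. The sketch below, in Python, is ours rather than the review's: the spending figure and population are invented, illustrative assumptions, used only to show how a headline "big number" can be re-expressed in terms audiences can weigh.

```python
# Illustrative only: re-expressing a headline figure in per-person terms.
# The spend figure and population are invented/approximate assumptions.

headline_spend_gbp = 10_000_000_000   # a hypothetical "£10 billion a year"
uk_population = 65_000_000            # approximate mid-2010s UK population
weeks_per_year = 52

per_person_per_year = headline_spend_gbp / uk_population
per_person_per_week = per_person_per_year / weeks_per_year

print(f"£{headline_spend_gbp:,.0f} a year is about "
      f"£{per_person_per_year:,.0f} per person per year, "
      f"or roughly £{per_person_per_week:.2f} per person per week.")
# -> £10,000,000,000 a year is about £154 per person per year,
#    or roughly £2.96 per person per week.
```

Framing of this kind – per person, per week, or against a familiar comparator – is what turns a raw total into something an audience can judge.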

Going beyond the headlines

There is also a need for more regular, deeper investigation of the figures underlying sources such as press releases. This is especially pertinent as the Government is the predominant source of statistics on the BBC. We cannot expect, and do not suggest it is necessary for, all journalists to have access to and a full understanding of every single statistic which is in the public domain. But there is a need to look beyond the headlines to ask how the figures were obtained and whether they seem sensible. Failure to dig deeper into the data also represents lost opportunities to provide new and broader insights on topical issues. For example, reporting GDP per head of population might give a different perspective on the economy than GDP alone, and we would like to see such analyses covered by the BBC more often. Geographic breakdowns could enhance reporting on the devolved UK.
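As a hypothetical illustration of the GDP-per-head point (our sketch, not analysis from the review; both growth rates are invented): when population is growing, headline GDP growth overstates the growth in output per person.

```python
# Illustrative only: headline GDP growth versus growth in GDP per head.
# Both rates below are invented for the example.

gdp_growth = 0.020          # 2.0% headline GDP growth over a year
population_growth = 0.008   # 0.8% population growth over the same year

# GDP per head = GDP / population, so its growth factor is the ratio of
# the two growth factors (close to simple subtraction for small rates).
gdp_per_head_growth = (1 + gdp_growth) / (1 + population_growth) - 1

print(f"Headline GDP growth: {gdp_growth:.1%}")           # 2.0%
print(f"GDP per head growth: {gdp_per_head_growth:.1%}")  # 1.2%
```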

We recommend that “Reality Check” becomes a permanent feature of the BBC’s activities, with a prominent online presence, reinforcing the BBC’s commitment to providing well-informed, accurate information on topical and important issues.

Challenging statistics

We came across many examples where statistics were used erroneously or in misleading ways by guests on programmes and were not challenged by presenters. We recommend that more is done to ensure that presenters are in a position to challenge such assertions appropriately and confidently.

Presenting risk

While the BBC is often better than other media at explaining risk, for example in stories about health, it by no means does so consistently well. Clearer guidelines and a more consistent approach to reporting risk are needed. The BBC should make sure that it always includes baseline risk information: in other words, never report simply that eating X doubles your risk of cancer; say that it doubles the risk from, for example, 1 in 1,000 to 2 in 1,000.
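The 1-in-1,000 example above can be written out as a short calculation. This sketch simply makes the recommended translation from a relative-risk headline to absolute risks explicit; the baseline risk is the assumed figure from the example.

```python
# Translating a relative-risk claim ("doubles your risk") into absolute
# risks, using the baseline assumed in the example above (1 in 1,000).

baseline_risk = 1 / 1_000   # risk without the exposure (assumed)
relative_risk = 2.0         # "doubles your risk"

exposed_risk = baseline_risk * relative_risk
extra_per_1000 = (exposed_risk - baseline_risk) * 1_000

print(f"Risk rises from {baseline_risk * 1_000:.0f} in 1,000 "
      f"to {exposed_risk * 1_000:.0f} in 1,000: "
      f"{extra_per_1000:.0f} extra case per 1,000 people.")
# -> Risk rises from 1 in 1,000 to 2 in 1,000: 1 extra case per 1,000 people.
```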

Being clear about significance

There is a tendency in the BBC and other media to focus on changes in unemployment, inflation, GDP and the like. Frequently the headlines imply that changes from month to month or quarter to quarter are important and (statistically) significant. We came across various examples of such changes being overplayed, and several contributors expressed concern about this. It is important to be clear when things have in fact not changed significantly (in either a statistical or an economic sense). The fact that unemployment, inflation or GDP growth is broadly steady is itself important to understand.
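The "has anything really changed?" question can be checked against the sampling variability that survey-based estimates carry. A minimal sketch follows; the reported change and standard error are invented figures for illustration, though statistical producers such as the ONS do publish sampling variability for estimates like unemployment.

```python
# Illustrative check of whether a reported change is statistically
# distinguishable from no change. Both inputs are invented figures.

reported_change = 20_000   # e.g. "unemployment fell by 20,000 this quarter"
standard_error = 40_000    # assumed standard error of the change estimate

# Approximate 95% confidence interval for the true change.
low = reported_change - 1.96 * standard_error
high = reported_change + 1.96 * standard_error

print(f"95% interval for the change: {low:,.0f} to {high:,.0f}")
if low < 0 < high:
    print("The interval spans zero: consistent with no real change.")
# -> 95% interval for the change: -58,400 to 98,400
#    The interval spans zero: consistent with no real change.
```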

Increasing statistical capacity

Sources of advice and expertise within the BBC are dependent on individuals and on who knows whom. Much responsibility will always rest with individual journalists to stand by the content they produce. We believe more could be done to help them exercise this responsibility. The BBC should consider how better to identify, use and develop the expertise it has and how to champion and incentivise the excellent use of statistics in programme making. It is important that non-specialist programme makers have access to in-house expertise as a matter of course, as well as drawing on external sources. The BBC needs to have the internal capacity to question press releases, relate them to other data sources and, if necessary, do some additional calculations – for example translating relative risk into absolute risk. There remains a need for basic training on, for example, percentages and percentage change, and nominal and real financial numbers.
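The basic training topics just listed lend themselves to worked examples. The sketch below, with invented figures of our own, illustrates two of them: deflating a nominal (cash) change into a real-terms change, and the difference between percentage points and percentages.

```python
# Illustrative only: nominal vs real changes, and percentage points
# vs percentages. All figures are invented for the example.

# 1) Nominal vs real: a 10% cash rise during 8% inflation.
spend_then, spend_now = 100.0, 110.0   # hypothetical £bn
inflation = 0.08                       # assumed price rise over the period

real_spend_now = spend_now / (1 + inflation)
real_change = real_spend_now / spend_then - 1
print(f"A 10% cash rise is only a {real_change:.1%} rise in real terms.")
# -> A 10% cash rise is only a 1.9% rise in real terms.

# 2) Percentage points vs percentages: a rate moving from 20% to 22%.
old_rate, new_rate = 0.20, 0.22
points = 100 * (new_rate - old_rate)
relative = new_rate / old_rate - 1
print(f"Up {points:.0f} percentage points, a {relative:.0%} rise in the rate.")
# -> Up 2 percentage points, a 10% rise in the rate.
```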


In supporting programme makers it is vital that it is clear where responsibility lies for setting statistical standards, and that they know where to go for advice on their application. Such a source of advice should be sufficiently visible, and sufficiently senior, to help boost editorial judgement and confidence in dealing with statistics.

Consistently applying standards

The excellent practice we saw in some broadcasts and online pieces was not consistently applied. We were struck by the lack of guidance available. We recommend that the BBC develops its guidance, including adopting or modifying the advice commonly found in guidelines from, for example, the Royal Statistical Society and others, and makes its availability widely known across the BBC.

The audience research confirmed the value of well-produced visualisations in broadcasts and online, and that radio can present numbers extremely well – Radio 1’s Newsbeat and Radio 4’s More or Less were consistently commended. But there were cases too where the way in which numbers were presented only confused. We believe that more can be done to share knowledge of how audiences understand the statistics presented to them, and how this can be improved.

“Data journalism” is part of the future. We sound a note of caution about big data¹, which can lead to big claims and big errors. Now, and increasingly, the BBC needs the capability to understand and interpret figures and a statistical capacity to work alongside journalists and developers. It should develop plans for how to build such capability.

1 We use the Oxford Dictionaries definition of “big data” – available at http://www.oxforddictionaries.com/definition/english/big-data – which is “extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions”.


Introduction

Data and statistics are everywhere. They are in the images that we see and hear, the messages we process and they inform the judgements we make. We rely on data to a greater extent than ever before. But how do we make sense of all this data? Who produces it and why? Who shapes it? How is it used? And how does that affect our understanding of the world we live in?

Having a trusted, authoritative, accurate and impartial broadcaster helping audiences make sense of numbers and navigate their way through sometimes competing statistical claims is critical. This is why in 2015 the BBC Trust commissioned a Panel to undertake an impartiality review of the BBC’s use of statistics. The BBC’s Editorial Guidelines say impartiality “lies at the core of the BBC’s commitment to its audiences”. This review set out to look at the accuracy and impartiality of the BBC’s reporting of statistics in its news and current affairs coverage across network radio, television, online and selected output in the four nations of the UK. We were particularly asked to:

assess whether BBC UK public services news and current affairs output achieves due impartiality and accuracy on controversial subjects when it includes statistical analysis;

establish more generally whether the BBC provides duly accurate and impartial reporting of statistics in its UK public services news and current affairs output;

assess whether BBC output provides accessible, critical analysis when reporting statistics to enable audiences to understand the issues; and

assess whether BBC content demonstrates due impartiality and accuracy through appropriate challenge and scrutiny of context in its reporting of statistics cited by third parties.

A key aspect of our review was to consider the coverage of topics within health, welfare/benefits, economics and migration. That does not mean we have ignored other stories using statistics during the period of our review, but we have particularly focussed on these areas in order to look in depth at how the BBC uses statistics when covering controversial subjects. We have received evidence from individuals in England, Northern Ireland, Scotland and Wales during our review. Our findings are not necessarily nationally based, although we do reflect differences in reporting in the four nations of the UK where relevant.

The report considers what, in practice, impartiality means when applied to statistics, and examines how the BBC attempts to achieve this and how well it succeeds. We look at which statistics are (and aren’t) reported and how accurately; how well they are explained and interpreted; and at some of the things that sometimes go wrong. We discuss the skills and other resources at the BBC’s disposal and offer some thoughts on the future.


We hope our report reassures audiences that the BBC is working hard to be impartial and accurate in its coverage of statistics with some notable results. It is also our intention that it will help practitioners in the BBC and elsewhere by pointing out where there are shortcomings and by suggesting ways in which improvements can be made.

The Panel

The Panel consists of Paul Johnson, Director of the Institute for Fiscal Studies, Sir Peter Stothard, former Editor of The Times and The Times Literary Supplement, and Sir David Spiegelhalter, the Winton Professor of the Public Understanding of Risk at the University of Cambridge, who has acted as an Adviser. The Panel has been chaired by Dame Jil Matheson, former UK National Statistician, Head of the Government Statistical Service and Chief Executive of the UK Statistics Authority from 2009 to 2014.


Statistics and the BBC – why it matters

Data are the lifeblood of decision-making. Without data, we cannot know how many people are born and at what age they die; how many men, women and children still live in poverty; how many children need educating; how many doctors to train or schools to build; how public money is being spent and to what effect; whether gas emissions are increasing or the fish stocks in the ocean are dangerously low; how many people are in what kinds of work, what companies are trading and whether economic activity is expanding.

A World that Counts: UN Secretary General’s Expert Advisory Group on Data Revolution, 2014²

The BBC, through its news and current affairs, reports on these issues and many other matters every day. These reports often contain data and statistics. It is impossible to watch television, listen to the radio, or read a newspaper or news online without seeing references to statistics. Polls, surveys, trends, totals. Percentages, averages, distributions, samples. Rates, ratios and estimates. These words are used regularly in the media.

As the volume of data and statistics grows, the ability to understand data and where it comes from, to interpret it appropriately (taking into account its strengths and limitations) and to explain numbers clearly to a wide audience becomes ever more important. Why? Because data and statistics shape our understanding of the world we live in. They inform public policy and individual choice. They cover every aspect of our lives. This is why they feature so prominently in news and current affairs broadcasting. Statistics are rarely, themselves, the subject of programming (with the exception of a few programmes such as Radio 4’s More or Less). But rather, they permeate the news presented to us.

Statistical literacy

Statistics can sometimes be challenging for audiences to understand, not least because there is a variety of evidence which suggests that “statistical literacy” in the UK – that is, the ability to understand and interpret basic statistical information – is relatively low. A survey by the Organisation for Economic Cooperation and Development in 2015³, for example, found that the average numeracy proficiency of young people in the UK (aged 16–29) was in the bottom 25% of the 22 countries they surveyed. And it seems that little importance or value is put on understanding numbers – a survey of adults aged 16–75 by Ipsos MORI in 2013 suggested that people are four times more likely to say they would be proud of their children if they excelled in reading and writing (55%) than if they were very good at numbers (13%). Only 6% would be embarrassed to admit poor numeracy skills while 15% would be embarrassed to admit to poor reading and writing skills.⁴

2 Available at http://www.undatarevolution.org/wp-content/uploads/2014/12/A-World-That-Counts2.pdf

3 Available at http://www.keepeek.com/Digital-Asset-Management/oecd/education/oecd-skills-outlook-2015_9789264234178-en#page1

The role of the BBC

Against this background, the media have a key role to play in reporting and explaining the statistics we are presented with every day, bringing them to public attention and helping audiences to digest, understand and apply them to their daily lives. They make important, and sometimes difficult, decisions on which statistics to report and which to ignore. And which statistical claims made by politicians and others to challenge and which to accept.

The BBC, whose overall mission is “to enrich people's lives with programmes and services that inform, educate and entertain”, has a particularly important role to play. As the UK’s principal public service broadcaster, the BBC is trusted to deliver independent, impartial and honest reporting on issues that affect audiences. Because of its unique public funding mechanism, audiences are its stakeholders and have high expectations of BBC content. They look to the BBC to produce news and current affairs output which is reliable, up-to-date, of a high quality and easily accessible. A recent consultation by the BBC Trust on the Government’s Charter Review proposals⁵ showed this is felt to be central to the organisation’s role. And research conducted by Ipsos MORI in June 2015⁶ demonstrated that the BBC remains the most trusted source of news in the UK (at 57%), is the news source people turn to most for impartial news coverage (53%) and is the source they are most likely to turn to for accurate news coverage (59%).

Accuracy and impartiality – their application to the use of statistics

But what do accuracy and impartiality mean in practice in relation to statistics? Our investigation, and particularly the audience research which accompanies this review, found that these concepts have a specific resonance when applied to this area. Reporting accurately and impartially on critical and sometimes controversial topics requires understanding the data that informs them and accurate and impartial presentation of that data. Politicians, lobby and interest groups, businesses and trade unions will attempt to persuade us of their positions, often by quoting statistics that support them. Audiences expect that numbers are accurate, factual and verified, that claims that are wrong or misleading are challenged, and that a range of evidence is used to put statistical claims into context. In other words, the BBC has to ensure that the public is informed accurately and impartially on the important issues of the day by helping audiences navigate through the statistical evidence and make sense of the numbers.

4 https://www.ipsos-mori.com/researchpublications/researcharchive/3175/In-an-age-of-big-data-and-focus-on-economic-issues-trust-in-the-use-of-statistics-remains-low.aspx

5 http://www.bbc.co.uk/bbctrust/governance/charter_review/consultation

6 http://downloads.bbc.co.uk/aboutthebbc/insidethebbc/howwework/reports/pdf/bbc_report_trust_and_impartiality_jun_2015.pdf


Regarding accuracy, audiences bring a presumption of veracity – if a story contains a number, it must be true. Certainly, the audience research found that “adding statistics does increase the impression of accuracy”:

There is an assumption by the audience that figures quoted by the BBC will be accurate, factual and well verified and that the BBC sets out to be impartial in its use of statistics.

Audience research report, Oxygen Brand Consulting

If statistics shape our view of the world then we need to be confident that the ones which are reported to us are correct. Statistics which suggest a drug prescribed by health professionals is dangerous, for example, might lead to people making particular decisions about their lives or the lives of those in their family which they could later regret if it emerges that the original report was inaccurate.

We equally need to be confident that the statistics which are reported to us are impartial, in terms of which statistics are reported to us and how they are explained. Numbers do not speak for themselves; they need to be framed in order for audiences to understand them. The concept of “impartiality” regarding statistics therefore encompasses putting them into a context or “frame” so that their magnitude and importance can be properly judged. Today, the BBC no longer sees impartiality as a binary philosophy, pitching two “sides” against each other. Instead, it encompasses a range of beliefs and ideas relating to a topic.

Due impartiality is often more than a simple matter of “balance” between opposing viewpoints.

BBC Editorial Guidelines on Impartiality

This is also relevant when considering statistics. There is a risk when reporting statistics that opportunities are missed to report occasional or one-off releases and to interrogate data in different ways. Some statistics get routinely reported, such as those for Gross Domestic Product (GDP) and employment figures, because they are familiar and are seen as economically and politically significant. But this can lead to a deceptively narrow range of insights, with some facets of the “story” missing. Reporting GDP per head of population might give a different perspective on the economy than GDP alone and we would like to see such analyses covered by the BBC more often. Similarly, too much emphasis can be placed on a narrative of “change”, in the context of producing an interesting story, when the statistical changes being reported on (such as changes in unemployment figures) may not be statistically important. This again can lead to key aspects of the picture (ie the “no change” narrative) being ignored. We explore both of these concepts in more detail later in this report.

There is also a danger of simply polarising a topic. Professor Steve Jones, in his report for the Trust in 2011, talked of “false balance”, which he defined as “presenting the views of tiny and unqualified minorities as if they have the same weight as the scientific consensus”.⁷ The use of statistics can exacerbate this effect, as those on differing sides use numbers to back up and try to authenticate their arguments. Statistical underpinnings need to be evaluated. In a world where data is now so much easier to generate – from our digital transactions, social media, and as by-products of administrative systems, as well as from statistical research and surveys – ever greater caution is needed in accepting, uncritically, the conclusions that are claimed.

7 http://downloads.bbc.co.uk/bbctrust/assets/files/pdf/our_work/science_impartiality/science_impartiality.pdf

Our review, then, considers a range of examples and evidence in light of these considerations. Thanks to the content analysis conducted by Cardiff University, we have an idea of just how much of the BBC’s coverage currently includes statistics. In the 6,916 news items which they examined in their initial four-week monitoring period between 12 October and 8 November 2015, they identified 4,285 references to statistics (there were multiple references in some news items).⁸ This is a substantial number of references by any measure.

And these references are made in a variety of different contexts. Much of the BBC’s coverage includes reports of statistics produced by other organisations, both official (such as the Office for National Statistics (ONS)) and non-official (including academics and think tanks, businesses, charities and other non-governmental organisations). They are cited regularly by politicians. They are also used to provide a sense of scale and context in narrative stories and to throw light on important and topical issues. Finally, the BBC generates its own statistics by commissioning surveys or carrying out data collection initiatives (sometimes as Freedom of Information requests) or scientific experiments and reporting on these. Different considerations apply to each of these contexts and we will consider them in greater detail in this report.

There may be several reasons for the proliferation of statistics in BBC coverage. Perhaps audiences are, rightly, assumed to be looking for evidence, rather than anecdote. Perhaps statistical releases are being used to create long-term familiarity (like the monthly reporting of GDP figures). Perhaps lobby groups think that including numbers in press releases will be a shortcut to securing a punchy headline. Whatever the reasons, if the role of the BBC is to help people to understand the world around them, and if that world is more data-focussed, then it follows that news and current affairs coverage will be too. Perhaps it’s simply the case that, as one senior BBC editor told us, “you can’t tell the story without the numbers”. And most of us need help in understanding what the statistics mean – a trusted guide through the numbers. As one audience member said during the audience research:

I think sometimes you’re left with more questions than have been actually answered.

Audience research report, Oxygen Brand Consulting

Our experience of leading this review has shown the Panel that people at the BBC place great value on using statistics responsibly. This is reflected to some extent in the BBC’s Editorial Guidelines, and it is incorporated into their training programmes, considered in editorial meetings and contemplated as part of their complaints process. In our view, there are lots of examples of good practice, which we will elaborate on in the report. But the good practice isn’t consistently applied. Our intention in this review is to highlight how the BBC can consistently improve its reporting of an area of such importance in a changing, data-heavy world.

8 Table 2.5, p16


BBC journalists and the resources they can access We heard evidence that the very journalists who are required to lead audiences through statistics can sometimes be as subject as the general population to having trouble with understanding their meaning. So, whilst we met BBC journalists who are at home with numbers and can swiftly assess what they mean and explain it, there are others who told us that they lacked confidence if asked to report a statistical story. This may be because journalism has historically been seen as a “literary profession”, as we were told by one senior BBC editor, and recruitment to this career has traditionally favoured arts graduates. Of course we do not mean to suggest that these individuals are without an analytical mindset (quite the opposite in fact), but the nature of arts degrees has traditionally meant that such graduates may be less familiar with analysing figures than colleagues with a science background. This was indeed also a theme covered by Professor Steve Jones in his report in 20119.

One frequent comment was that the BBC – News most of all – was an arts‐ and humanities‐based organisation and that science suffered as a result. Although some of the best science journalists, within and without the BBC, do not have degrees in the subject it is hard not to notice the dearth of that skill in News and Current Affairs compared to its abilities in finance, politics, sport or the arts.

Professor Steve Jones, BBC Trust review of impartiality and accuracy of the BBC’s coverage of science, 2011

As part of considering how effective the BBC’s use of statistics is, we therefore wanted to understand the processes by which statistics come to be used in broadcast and online coverage. Of course we cannot expect, and we certainly would not suggest it is necessary for, all journalists to have access to and a full understanding of every single statistic which is in the public domain. Nor would we suggest it is necessary for all journalists to have the skills of professional statisticians, although we do believe they can be trained to be competent with statistics. We looked in particular at the infrastructure within the BBC for seeking advice on specific statistics to help inform news and current affairs coverage, so that audiences can be left with a clear understanding from the stories in which they appear. What resources are available for journalists to make sure statistics are used appropriately? And how accessible are these resources to news journalists across the BBC?

Statistical advice within the BBC

There are no dedicated statistical specialists within the BBC. However, there is undoubtedly expertise, for example in subject-specific hubs (such as the Business Unit or the Health Team, covered in more detail below) whose staff are expected to be familiar with the main statistics in their area, and their strengths and weaknesses. Many of them fulfil this requirement commendably.

9 http://downloads.bbc.co.uk/bbctrust/assets/files/pdf/our_work/science_impartiality/science_impartiality.pdf


For around 18 months until summer 2015 there was a post of Head of Statistics but this was a temporary post and has since been discontinued. The BBC has also previously employed secondees from the ONS on a short-term basis, but there are none in position currently. Nor is there, as far as we are aware, any network or forum within the BBC for those with statistical knowledge to share their expertise or experience or develop their professional contribution.

Every use of statistics in the BBC’s coverage is governed in a broad sense by the BBC’s Editorial Guidelines, which are the key resource shaping all of the BBC’s content and which every person who is involved in making any kind of content for the BBC is expected to use. But there is no specific section in the Guidelines on using statistics. There is a section on Politics, Public Policy and Polls, which includes guidelines on Opinion Polls, Surveys and Votes.¹⁰ And there is a specific Editorial Guidance note on Reporting Risk,¹¹ which is designed to “help ensure the context of statistics is clear and avoid distortion of the risk”. Otherwise, the use of statistics is covered only by the general Guidelines on Accuracy and Impartiality.

We heard that the BBC’s Editorial Policy team (who are responsible for offering advice to BBC content producers on how to interpret and work within the Editorial Guidelines) do not receive many requests for advice on the use of statistics. There is a case for expanding the BBC’s Editorial Guidelines and Guidance (which we understand are due to be revised shortly in any case) to cover some of the issues we identify in this report. As the Royal Statistical Society noted in its evidence for this review:

For a major news outlet the lack of comprehensive statistical guidance is a cause for concern.

We were surprised that an organisation the size of the BBC, with a high (and increasing) volume of statistics in its outputs, does not itself have an explicit expert statistical capability. If guidelines were issued advising journalists to seek statistical advice, virtually no one we talked to could complete the sentence “For advice, contact ….”. Some of the specialist teams told us that they get calls “from those who know them”, as does the former Head of Statistics. They help when they can but they are not staffed to provide advice or reassurance about statistics to staff across the BBC. And nor do they appear to know definitively where to redirect any such requests. We identified the following existing areas of expertise:

specialists, such as the BBC’s Health team led by Health Editor Hugh Pym, who provide coverage and analysis of stories relating to their specific subject area. Sometimes they originate stories, at other times they provide advice or contribute to other items or programmes, both publicly (ie appearing on-air as part of content packages) and privately (during the planning phases). They are often best placed to provide detailed analysis and advice relating to subject-specific statistics and are contacted regularly by content producers. We heard of cases where, if a journalist from Newsnight, for example, was putting together a piece on health, they would draw on the BBC’s health team for guidance. Similarly, if the health team became aware of information which they thought others should be aware of, they would attempt to disseminate this to the relevant areas. Specialists also operate in the Nations in order to provide specific nationally-based expertise – for example the Economics and Business Editor (BBC Northern Ireland), the Business and Economy Editor (BBC Scotland) and the Economics Correspondent (Wales). We heard evidence that such topic specialists are able to speak with expert contacts and fully investigate and understand an issue because they know their subject area “inside out”. We were told of many instances where such specialist expertise had benefitted other colleagues, some examples of which we include in this report.

10 http://www.bbc.co.uk/editorialguidelines/guidelines/politics/opinion-polls

11 The BBC’s Editorial Guidance notes supplement the Editorial Guidelines. The Guidance note on Risk can be found at http://www.bbc.co.uk/editorialguidelines/guidance/reporting-risk

the BBC’s analysis and research team, who provide background briefings for BBC colleagues on all main news topics (published on an internal website) and a twice-daily bulletin to around 3,000 individuals in BBC News. These briefings can help disseminate advice on particular statistical issues and queries, although the individual researchers are not trained statisticians but rather subject-specific specialists. From the evidence we have received, our sense is that this team do not have as high a profile within the BBC as they might – we certainly saw examples of useful information they disseminate, but some we spoke to did not understand their role or how they could be used.

the team behind the BBC’s More or Less programme is also sometimes used as a source of internal advice regarding specific statistics which have either been covered previously by More or Less or which the programme team are considered likely to have knowledge or experience of. An example is included in Cardiff University’s content analysis.¹² The More or Less series is presented on Radio 4 by Tim Harford who “explains – and sometimes debunks – the numbers and statistics used in political debate, the news and everyday life”.¹³ Some of its most recent topics include “junk” surveys, whether e-cigarettes can really help people quit smoking and weekend stroke deaths. We heard that if a programme such as the Today programme was planning to cover a topic which they knew had been recently investigated by the More or Less team, they might call the relevant people involved and seek their opinion on the subject, and particularly the statistics, in question. The More or Less team, though, is small (around four people), is not in place throughout the year (only during its seasonal broadcast runs) and has limited capacity to advise others.

the BBC’s Visual Journalism team, a collection of graphic designers, developers and journalists working across platforms with TV graphic designers and picture editors to create maps, charts, graphics and more complex interactive digital content. Their expertise is used by journalists who require graphics to illustrate their stories and they work closely with the specialist teams, but they also produce bespoke work where visual journalism is the driving force behind the content. They have contact with several universities (eg King’s, Oxford, Loughborough and Birmingham City) and other organisations like the ONS and Dr Foster (a provider of healthcare information in the UK).

specialist BBC Editors, for example Mark Easton, have blogs which give their own expert take on the meaning behind topical stories and the statistics that accompany them. The blogs are audience-focused but are also designed for other BBC journalists to read and are another way of disseminating information on statistics.

12 p102

13 http://www.bbc.co.uk/programmes/b006qshd


handover reports within specific programme areas, advising colleagues of concerns about the robustness of particular statistics and cautioning their use within programmes. These can help to ensure that where research on specific statistics has been undertaken, the conclusions are not wasted or forgotten. We heard of various examples of this taking place, particularly in nightly handover reports, and this system appears to work well.

Other resources exist in the BBC which may be useful for disseminating advice on statistics in stories or statistical interpretation but which do not seem currently to be used for this purpose as much as they might be. The Journalist Portal (“J-Portal”), for example, stores news agendas and planning diaries and is accessible by all BBC News journalists. We heard that it could potentially be used to disseminate advice on the use of particular statistics, but our understanding is that this is not done regularly and the J-Portal is not used consistently by journalists in a way which would enable any such advice to be usefully acted upon.

A system which is used more commonly is the Electronic News Production System (ENPS), the news management system used by BBC journalists to put and keep BBC news on air. This system carries newsfeeds and alerts to the desktop, sends scripts to the prompter, “speaks” to the BBC’s audio and video systems and schedules the playout of programmes and reports. ENPS is used by production teams every day and we understand it is sometimes used by journalists to disseminate advice on particular statistics, but it was emphasised to us that advice on statistics will always need to be of and in the moment, so being able to contact individuals as and when necessary will always be preferable.

Many of our contributors commented on the former post of Head of Statistics. This role was created to advise individuals and teams on statistical issues, produce relevant content to help the public understand the numbers at play, conduct statistical training for journalists and build relationships with the external statistical community. We were told that it was specifically initiated in the context of preparations for the 2015 General Election and ceased once the Election had concluded, with the budget being reassigned elsewhere. We received a considerable amount of evidence on the usefulness and effectiveness of this role. One parliamentarian told us that the scrapping of the post was “a retrograde step”. An academic institution said:

. . . we feel that it was a mistake to remove the position . . . The need for easily-obtained good internal advice on statistical matters has certainly not gone away.

We heard examples of statistical stories which would not have got to air without this role, such as an online article in 2014 on EU migration policy14 which explored a claim by the Secretary of State for Work and Pensions that 121,000 people could be affected by a change to the way in which EU migrants in the UK are entitled to claim Jobseeker’s Allowance (the actual figure was shown to be unlikely to be above 10,000). We also heard of some “facts” which might have been reported by the BBC had the Head of Statistics not intervened, from the trivial – that the value of luxury handbags had risen twice as fast as FTSE 100 shares – to the more serious – that average earnings had fallen for women in one in three constituencies between 2010 and 2013, when the figures in fact showed they had fallen in only one constituency.

Our understanding is that since the post ceased, the individual who previously held it, and who remains employed by the BBC, continues to be used by programme makers as an unofficial source of advice on statistical issues and is still, for some, the first port of call within the BBC in a statistical storm.

We should say that not everyone who submitted evidence for this review supported a centralised role to disseminate statistical information and advice. We also heard that regardless of whether this post existed or not, much responsibility would always rest with individual journalists to stand by the content they produce. We understand and agree with this. But we also believe that more could be done to help them exercise this responsibility.

14 http://www.bbc.co.uk/news/uk-politics-28544715

External expertise

Of course, BBC journalists also have external expertise available to them. We heard evidence from some that they would not hesitate to contact organisations like Full Fact, the Royal Statistical Society or the Institute for Fiscal Studies in order to seek advice on specific statistics before including them in coverage. Such organisations are very willing to help. However, such external sources of advice are generally only available in business hours and we heard from external stakeholders that they cannot provide a 24-hour service appropriate to the needs of a busy news organisation; they see themselves as supplementary to stronger internal capability.

Training and skills

We heard repeatedly about a lack of confidence amongst journalists in dealing with statistics. The BBC Academy (the BBC’s training and development body) provides some training on the reporting and use of statistics to help to address this. We heard that this training is part of a thread that runs through both the Editorial Leadership Foundation and Journalism Foundation courses (mandatory courses for every new journalist joining the BBC and for anyone taking up an editorial leadership role for the first time). The Head of Journalism Foundation, Leadership and Personal Development told us that these courses include dedicated sessions on data journalism

[as] part of an integrated approach about applying the BBC’s journalism values to the interpretation of data, in particular when covering issues of high public interest. The aim is to give every new journalist and editorial leader the confidence to question the data and to challenge any default narratives in reporting elsewhere.

The BBC Academy was kind enough to invite the Chair of the Panel to attend one of these training sessions. She found it a useful opportunity to witness these efforts in action and was particularly struck by the key message imparted: that the BBC’s role in challenging conventional narratives and defaults can be enhanced by the appropriate use of data and statistics. The session was not about teaching maths – it aimed at giving journalists the skills and confidence to interrogate stories and claims using numbers in the same way as they would check other kinds of evidence and argument. It was a brilliant training session for which the BBC Academy should be congratulated.

Specific (non-mandatory) courses have also been run in the past on Making Sense of Statistics (a face to face course) and Accuracy: Reporting Statistics (an online module of a wider Editorial Standards course). We heard that these courses have trained more than 1,600 people in the past five years, although we understand that Making Sense of Statistics has not run since 2014 due to internal restructuring and relocation of the Academy. Smaller modules and films, such as an insight into the reporting of “big numbers” by the former Economics Editor and the Home Editor (see later), have also recently been incorporated onto the Academy website as a means of providing useful guides which can be accessed by staff on a regular basis. And standalone workshops on “reporting statistics” have sometimes been run.

We heard that an increasing focus on providing this training on a national (ie not just in England) basis would be beneficial for journalists working in Northern Ireland, Scotland and Wales. We also heard that there remains a need for basic training, or “refresher” training, on, for example, percentages and percentage change and the reporting of “real” (that is, after taking account of the effects of inflation) and “nominal” financial numbers. Any statistical training also needs to be linked to an awareness of how audiences understand statistics, so that content can be tailored appropriately.

We understand that new material is in development to strengthen the Academy’s work in the field of statistics, including more face to face courses, further bitesize online modules and films, and advanced support and consultation. This can of course be supplemented by external training and qualifications, including standalone courses on statistics run by the Royal Statistical Society15 and courses on data and journalism which are now incorporated into some degree programmes, such as those offered by Birmingham City University16, Cardiff University17, City University18 and the London School of Economics19.

There are also some good “rules of thumb” available – checklists which could, if appropriately incorporated into the BBC’s daily work, build on the BBC’s journalistic values and existing skills to extend the remit of “healthy scepticism” (a phrase we heard often) to the treatment of statistics. But as far as we understand it, these are not widely known or referred to, and they could usefully be given a higher profile. We observed during this review that the excellent practice we saw in some broadcasts and online pieces was not consistently applied. We recommend that the BBC develops its guidance, adopting or modifying the advice commonly found in guidelines from, for example, the Royal Statistical Society20 (we also note a useful checklist which is handed out to journalists in BBC training sessions and appended to this report, which could likewise be shared more widely), and publicises that guidance across the BBC.

As we will come to address throughout this report, we have seen during this investigation that there is a lack of standardised advice – or at the very least poor recognition of the resources which are already available – for those who are responsible for using statistics in content. And there is an absence of any kind of network or forum for those working with statistics across the BBC to share experiences and knowledge. We recognise that on many occasions the subject-specific structure is entirely appropriate as a means of ensuring the quality of coverage on particular statistical topics. However, our impression from the evidence we have received is that the process is haphazard – it relies on knowing who knows what – and depends on the availability of particular individuals. How a story is first broadcast is vital, so statistical advice at night and weekends should be available to those preparing stories then.

15 https://www.rss.org.uk/RSS/pro_dev/RSS_training_courses/RSS/pro_dev/RSS_training_courses_sub/Our_courses.aspx?hkey=20e85bcf-4c66-467f-afd4-28cf897ad7fe

16 http://bcu.ac.uk/courses/online-journalism

17 http://courses.cardiff.ac.uk/postgraduate/course/detail/p411.html

18 http://www.city.ac.uk/courses/undergraduate/journalism#course-detail=1

19 http://www.lse.ac.uk/media@lse/study/mscProgrammes/Data-and-Society.aspx

20 For example, their “number hygiene” rules for journalists, available at http://www.statslife.org.uk/images/pdf/rss-number-hygiene-list-2014.pdf and appended to this report, and the checklist found in the article “Score and ignore: a radio listener’s guide to ignoring health stories”, available at http://onlinelibrary.wiley.com/doi/10.1111/j.1740-9713.2012.00611.x/epdf

Anthony Reuben, a BBC journalist, quoted in ‘Stand Out and Be Counted’, produced by the British Academy, the Higher Education Funding Council for England and the Economic and Social Research Council21, said:

Being comfortable with numbers sets you apart from many other journalists. It helps you to develop the alarm bells that should go off when PR companies send you bogus research or governments announce figures that do not add up.

We would like to see more journalists “comfortable with numbers” and incentives developed for them to be so. The BBC should consider how better to identify, use and develop the expertise it has. It is important that non-specialist programme makers have access to in-house expertise as a matter of course, as well as drawing on external sources. In supporting programme makers it is vital that they know where to go for advice. This should be sufficiently visible, and sufficiently senior, to help boost editorial judgement and confidence in dealing with statistics.
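As a concrete illustration of the kind of “refresher” arithmetic mentioned above (the figures here are our own invented example, not drawn from BBC output), converting a nominal change into a real one means dividing out inflation rather than taking the headline number at face value:

real growth = (1 + nominal growth) ÷ (1 + inflation) − 1

So nominal earnings growth of 2% alongside inflation of 3% gives 1.02 ÷ 1.03 − 1 ≈ −0.97%: a real-terms fall of about 1%, not a 2% rise. The common shortcut of simply subtracting inflation (2% − 3% = −1%) is a close approximation only while both rates are small. A related point worth drilling is percentages versus percentage points: a rise from 2% to 3% is an increase of one percentage point, but of 50% in relative terms.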

21 http://www.britac.ac.uk/policy/Stand_Out_and_Be_Counted.cfm


How well does the BBC use statistics?

The BBC has a difficult job to do in deploying statistics appropriately to explain the world around us. So how successful is it in doing this? How appropriate is its selection of the statistics it uses in content? How accurate is its coverage of the statistics it uses? How effectively does it explain and interpret them? And to what extent do audiences understand them?

Any assessment in this area is by its nature limited, as we cannot know (as many pointed out to us) the extent of the stories which the BBC chooses not to run due to concerns about the statistics contained within them. Certainly, we heard this is one of the purposes of the main news meetings which take place each day – to pull stories which are not thought to “stack up”. And we have only looked in depth at certain examples of programmes and online material (as well as drawing on the content analysis and audience research) in order to make our assessment. Given the volume of output that the BBC produces, both in broadcast and online, our conclusions can only be based on a small sample of the BBC’s coverage and it is to be expected that both successes and errors will be found somewhere in the mix.

Nevertheless, the evidence we have seen and heard leads us towards some conclusions. In general, we have seen much for which the BBC should be praised. Many of the people we received evidence from stated their admiration for the BBC in general and its use of statistics in particular. One of the submissions we received said:

…overall the BBC is excellent at reporting statistics accurately. Moreover on the occasions when errors are made (as, inevitably, they are) they are generally corrected in a timely fashion…I would also say that as far as I can see the BBC provides accurate and impartial reporting on the statistics to the best of its ability.

Jonathan Portes, Principal Research Fellow, National Institute of Economic and Social Research

Some classed it as a frontrunner in the media industry in its coverage of numbers and figures – “better than most”, “generally commendable”, “well-balanced and unproblematic” and “definitely at the higher end, in comparison to most news and media organisations” were some of the comments we received. And indeed, the quality of the BBC’s journalism has been recognised by external bodies such as the Royal Statistical Society (RSS), whose annual awards for “statistical excellence in journalism”22 have been presented to the BBC more than to any other news organisation.

22 http://www.rss.org.uk/RSS/About/Recognising_Statistical_Excellence/Journalism_awards/RSS/About_the_RSS/Recognising_statistical_excellence_sub/Statistical_Excellence_in_Journalism_Awards.aspx?hkey=f10b8e1e-466a-4ef1-af2a-f8f53e04f338


In 2012 Fergus Walsh et al produced “Population – the world at 7 billion” for the BBC. The programme won an RSS award for statistical excellence in journalism (broadcast). The relevant citation from the judges said that they “were impressed at how the piece, at just under 3 minutes, had met the challenge of effective reporting within the time constraints of television news broadcasting. A strong presentation had been made of the nature and context of the issues involved in population measurement and prediction, including important explanations of uncertainty. The graphics had been used well both to engage the viewer and, particularly, to explain statistical issues such as higher and lower estimates”.23

However, although there are examples of good practice – and we saw many – there are also areas where the BBC could and more consistently should be doing better. As the Royal Statistical Society’s Statistics User Forum said in its evidence to the Panel:

…while there is much that is good about statistics and the BBC, we want to encourage the BBC to do even better in its reporting of statistics.

That is the Panel’s view too.

How appropriate is the BBC’s selection of the statistics it reports?

The appropriate selection of statistics can be tricky. Journalists always have a difficult balance to strike between the new and the important – a flatlining economy, for example, may be important but it may also be perceived as “boring”, with the temptation to lead on something that’s new or that’s changed. Such decisions need to be made within a context of limited time and resources. We understand that BBC News tends to have between two and five discussions per week around the reliability of different pieces of statistical information, so it is clear that attention is given to the appropriate selection of statistics.

So how do journalists and programme makers come to select which statistics to use, both to illustrate specific stories and to report statistics in their own right? What sorts of processes do they go through to determine which should appear in BBC content? And what implications does this have for impartiality and accuracy?

Statistics used as illustrations

Statistics are frequently deployed to illustrate broad stories, to provide detail and insight into particular issues and to serve as a focus for audience understanding. But their selection in this context can be complex – sometimes statistics highlight key points, and at other times they obfuscate the main message. As the audience research report notes:

The presence of large numbers of figures or statistics within a news or current affairs piece [therefore] had a polarising effect.

23 http://www.rss.org.uk/Images/PDF/about/rss-journalism-winners-2012.pdf


• For a few of the audience it actively engaged them with the story.

• For others, large amounts of numbers, or numbers presented at speed without graphics, lively verbalisation or visual reinforcement, disengaged them from the communication.

Audience research report, Oxygen Brand Consulting

The evidence we received during this review suggests that the BBC has a mixed record in successfully selecting appropriate statistics to illustrate coverage. We certainly noted examples of content in which the BBC has selected statistics very effectively. This was seen very clearly in some of the case studies undertaken in the audience research, such as that for migration. In such cases, the statistics gave clear information to audiences to supplement the other detail in the piece.

BBC One’s News at Six covered the issue of net migration reaching 330,000 on 27 August 2015. The opening part of the report involved the reporter explaining how many migrants entered the UK and presenting a variety of statistics to the audience. The latter part incorporated interviews with individuals, including a Polish business owner and a female patron of a pub. A pie chart explaining where migrants to the UK were coming from was used in the first half of the report and was shown by the audience research to be “the most praised use of statistics in all the prompted stimulus” due to its use of statistics to “clear things up” or enlighten the audience.

However, we also noted instances where statistics were presented in such a way that they merely confused the audience. As one participant in the audience research put it:

They’ve hit you with one set of figures, and then they hit you with another set of figures. Before you’ve digested that second set of figures they hit you with a third set of figures, to the point where your brain just can’t cope with it.

Participant, Audience research report, Oxygen Brand Consulting

On 25 November 2015 BBC One’s News at Six included a piece on the Government’s Spending Review explaining the implications for individuals. There were lots of numbers included in the piece, reinforced with graphics on the screen. The audience research report concluded that for the majority of the participants in the research groups, this report contained too many numbers and that they were presented too fast. They felt that perhaps the number of figures included was just a little too ambitious for the length of the report24.

24 p26, Impartiality Review: BBC Reporting of Statistics, Report on qualitative research with the BBC audience, Oxygen Brand Consulting, July 2016


We also saw examples of numbers not being used to illustrate stories when their inclusion might have been helpful. The content analysis, for example, highlighted that there were some topics covered within its sample period, such as crime, where statistics were available and where their use might have enhanced the story.

In short, if audiences are to understand the scale of an issue, whether it is levels of immigration or the risks posed by terrorism or climate change, it could be argued there are areas where statistical information might be used more often.

Content analysis, Cardiff University

A broader understanding across the BBC of the statistics that are available, and their effective use, would contribute to public understanding of important issues.

The reporting of statistics

We can reasonably assume (and indeed this was confirmed by many we spoke to) that BBC journalists and programme makers are inundated with statistics to include in coverage. As one organisation which submitted evidence to us noted, official statistics are released on almost every working day of the year and over 2,000 upcoming releases are currently listed on the Government’s publication hub25. And that’s just official statistics. Journalists therefore need to choose carefully which statistical releases to cover in content.

We noted that providers of statistics often employ journalists in their press offices to draft their press releases specifically so that they use a style and language which make them easy for news organisations to pick up. As one senior BBC editor put it:

It’s difficult when every pressure group provides statistics to back up their own story.

In some respects, the BBC makes these judgements well. We heard examples of the BBC choosing not to cover particular statistics which have either been sent to them in press releases or featured in other media coverage, due to concerns with the methodology behind them or the interpretations placed on them.

The British Retail Consortium publish regular “footfall” figures for the UK, documenting customer activity in town and city centre locations and in and out of town shopping locations throughout the UK26. However, the survey results are generally not reported on BBC Northern Ireland, due to a concern that they are based on very few sampling points of particular types of retailer in Belfast city only.

The audience research found that participants felt that “. . . the BBC was not over-dependent on any particular sources of statistics”, although certain statistics do get routinely reported because they are familiar (both to audiences and to journalists and programme makers). So the BBC will always, for example, report on the latest inflation figures, even if the precise story they tell has not changed very much since the previous figures were released. One organisation we heard from described a “premier league” of statistical releases, including GDP, unemployment, NHS waiting lists and migration figures.

We heard from several individuals (both BBC and non-BBC) that although in an ideal world journalists would be able to search for and check every single statistic, this is unlikely ever to be feasible. Instead, they have to put releases into a “credible” basket, such as National Statistics, which are independently produced and subject to assessment by the UK Statistics Authority, and use secondary factors to make decisions about including particular numbers, such as “have the figures from this source proved robust on previous occasions?” We think such an approach is entirely reasonable, but we could find no guidance on how such judgements should be made or applied.

25 https://www.gov.uk/government/statistics/announcements. Figure correct at time of writing (April 2016).

26 http://www.brc.org.uk/brc_footfall_and_vacancies_monitor.asp

Missed opportunities and going “beyond the headlines”

This regular reporting of specific sets of statistics may be entirely appropriate. But it can present missed opportunities to reveal new insights that are relevant to audiences and can be used to challenge prevailing narratives. Headline figures often concentrate on averages, or on the UK as a whole. How much more informative to dig beneath the average, or to put the experience of one group in the context of others. Looking beyond the headline numbers would give audiences a richer understanding of, for example, how different sectors of the economy were faring, or different parts of the country, or, linking changes in economic performance to population growth, how households may be experiencing changes in the economy. We were surprised at how rarely the BBC did this.

From the evidence we saw, there appears to be a lack of capacity to explore non-standard statistical releases or data sets, or indeed to analyse data which are included in regular standard releases but not usually reported on. Bringing new, and broader, perspectives on a more regular basis requires, for example, health or education correspondents to have an understanding of the macro and micro economic data and analyses relevant to their area (we heard from an economics correspondent – an expert in macroeconomic data – that he struggles when weighing evidence on migration, because migration statistics are beyond his normal area of expertise). Journalists also cited pressure of time, both to prepare stories and to report them, particularly in main news bulletins, but this risks allowing dominant narratives to prevail.

A previous impartiality review, on the breadth of opinion in the BBC’s output27, recommended that the BBC finds a more systematic way of reviewing coverage of big stories and of questioning assumptions about the prevailing concerns. More exploration of the statistics behind the headlines could help further this objective. Similarly, many statistics can be broken down to national, regional and, sometimes, local level and used to enhance reporting on the devolved UK.

We noted a tendency to report statistics straight from a press release, without necessarily going “beyond the headline”. A previous impartiality review, on the BBC’s coverage of science, was supported by content analysis by Imperial College London28. This showed that 55 research stories accounted for 171 of the broadcast news items in the sample. For 42 of the 55 stories, Imperial was able to find an associated press release. It concluded that 82% of the stories appeared to be derived from PR. A study by Cardiff School of Journalism in 2007, looking at the PR dependency of UK journalism in general, found that 34% of broadcast news came wholly or partly from pre-packaged sources and that PR plays an agenda-setting role in 52% of broadcast news29.

This reliance on press releases is concerning, particularly in light of a recent study in the British Medical Journal30 which found that around 40% of press releases about health stories contained exaggerations, and that these fed through into the majority of media coverage. In making its judgements on which statistics to report, an important criterion is whether the underlying source data are in the public domain and therefore open to scrutiny. If they are not, then alarm bells should ring.

27 http://downloads.bbc.co.uk/bbctrust/assets/files/pdf/our_work/breadth_opinion/breadth_opinion.pdf

28 Pages 67-69, http://downloads.bbc.co.uk/bbctrust/assets/files/pdf/our_work/science_impartiality/appendix_a.pdf

On 10 March 2015 BBC online published an article about the Government’s Troubled Families Programme, which pays councils to work directly with a family to deal with problems including worklessness and truancy, rather than lots of different agencies working with each aspect of the families’ troubles. The article, “Eric Pickles: 105,000 troubled families’ lives turned round”31, reported that Communities Secretary Eric Pickles had told the Westminster Parliament that 105,000 troubled families had had their lives “turned around” by the scheme. The analysis presented to Parliament was also contained in a government press release. The National Institute of Economic and Social Research showed that the financial figures in the press release were highly suspect, since no independent estimate of the scheme’s impact was yet available, and questioned the idea that every single problem family identified had in fact been turned around.

Having an understanding of the wider set of evidence relating to a particular topic can help, so that a new statistical release can be tested against what is already known. As one organisation we heard from pointed out:

The BBC is usually good at spotting when a spokesperson is indulging in the selective use of statistics and often it can rely on their political opponents to point to alternative measures, but the BBC needs to have the in-house expertise to know which statistics are relevant to which areas, so that a more impartial picture is presented.

There is some expertise within specialist hubs and via specialist roles in the nations. The Business Unit, for example, told us that when sales figures for cars were released in September 2015 they were at a record high despite the recent VW emissions scandal. The Unit’s specialists were able to put them into context, using their overall knowledge of the industry, to highlight that sales figures were based on orders in advance and did not necessarily reflect the impact of the scandal at that time32. And checking back with the producers of statistical and press releases is a basic step, although the evidence we have seen suggests it doesn’t happen as often as it should.

29 Justin Lewis, Andrew Williams, Bob Franklin, James Thomas and Nick Mosdell (2007), The Quality and Independence of British Journalism, Cardiff School of Journalism.

30 http://www.bmj.com/content/349/bmj.g7015

31 http://www.bbc.co.uk/news/uk-politics-31819172

An individual who works in an organisation which produces census statistics told us that she had recently been involved in the release of information on the Scotland census which was published in 2015. She had been surprised at how few questions her organisation had received from journalists on the statistics they had released.

However, our Panel strongly believes that it is not enough for journalists to rely on these specialist areas – they should also be encouraged to develop their own expertise and confidence in the selection and reporting of statistics, looking beyond the headlines to provide a fuller understanding for audiences. As one BBC staff member put it:

My observation is that I don’t think anyone sets out to misuse statistics or that any bias comes into play. I often think there is a lack of skill and understanding in interpreting statistics among non-specialist staff. For example… the labour market figures in our region are showing a much more complicated picture than the monthly unemployment rate indicates.

One organisation which submitted evidence to us specifically said:

After the breaking news has been observed, there should be scope for further analysis of the statistics, moving beyond what might have been designated as a ‘headline’ statistic – including by the government or the ONS – to show a broader picture (including to help the audience assess the validity of the chosen headlines).

Statistics Users Forum

32 See http://www.bbc.co.uk/news/business-34452495 for an example of this caveat in BBC coverage.


On 5 April 2016 a study by MyFamilyCare and the Women’s Business Council revealed that just 1% of men had taken up the opportunity of shared parental leave after having a new baby, 12 months after the policy was introduced33. The figure of 1% in fact referred to all male employees, not just those eligible for shared parental leave, but we understand that the original press release from one of the organisations which produced the report did not make this clear. Much of the reporting of the study in the media therefore contained inaccurate reporting of the figure. The BBC did however make some efforts to provide the correct context, albeit also misreporting it on some occasions. The Today programme on Radio 4, for example, initially reported in its news bulletin at the start of the programme that

Only 1% of new fathers have taken up the offer of shared parental leave, 12 months after the policy was introduced.

This was changed in a later bulletin in the programme to

Only 1% of fathers have taken up . . . (which was still incorrect). However, the programme went on to explain (accurately) that

just 1% of men have chosen to take up shared parental leave, but 63% of fathers say they are likely to in the future.
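The trap in this example is the choice of denominator, which a worked illustration makes concrete (our own invented round numbers, not figures from the study): suppose a firm employs 1,000 men, of whom 50 became fathers during the year and were therefore eligible for shared parental leave. If 10 of them take it up, that is 10 ÷ 1,000 = 1% of all male employees but 10 ÷ 50 = 20% of eligible fathers. The same count of men yields two very different headlines depending on the base chosen.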

We feel it is vital that the BBC should have the internal capacity to question press releases, relate them to other data sources and, if necessary, provide some additional calculations of its own.

A word for statistics producers

Producers of statistics can help improve reporting by making the statistics they release more accessible and easier for journalists to cover. We were given examples of cases where stories had been covered (or not been covered) purely because of the way the statistics had been set out in releases and the availability of statisticians within the provider organisation to explain the issues to journalists (although we also heard journalists don’t always take advantage of the opportunities that are there).

We are encouraging journalists to delve into press releases and ask questions about how the statistics to which they refer were produced, the context in which they were produced and why, and their robustness, and to check their interpretation of the numbers. We heard that sometimes journalists try to do this – particularly those from the BBC. But they can have difficulty getting past press offices to get technical answers to their questions. They value the ability to speak directly to experts. For National Statistics34, the name, contact number and email of the responsible statistician should be included in all releases and they should be available to answer technical (not policy) questions directly. This does not happen in some departments and should. Similarly, releases from academic and other organisations should contain contact details for journalists.

Statistics producers can also anticipate some of the questions and provide answers alongside the press notice, including a link from press releases to original sources. If statements are not backed up by access to the data or analysis, then journalists should be wary of reporting them at all.

33 https://www.myfamilycare.co.uk/

34 These are statistics which are compliant with the UK Statistics Authority’s Code of Practice for Official Statistics, available at https://www.statisticsauthority.gov.uk/wp-content/uploads/2015/12/images-codeofpracticeforofficialstatisticsjanuary2009_tcm97-25306.pdf. They include statistics from England, Northern Ireland, Scotland and Wales.

How accurately does the BBC report statistics?

The need to communicate to a broad public requires a simple and compelling narrative. But the underlying issues may be far from simple, and presenting them as such risks being misleading.35

Richard Sambrook, Professor of Journalism and Director of the Centre for Journalism at Cardiff University

35 Delivering Trust: Impartiality and Objectivity in the Digital Age, Reuters Institute for the Study of Journalism, University of Oxford, available at http://reutersinstitute.politics.ox.ac.uk/sites/default/files/Delivering%20Trust%20Impartiality%20and%20Objectivity%20in%20a%20Digital%20Age_0.pdf

When asking how journalists go about selecting statistics, we received evidence from some that this can be affected by a tension between “storytelling” and statistical accuracy. As one person put it, editors like clarity; statisticians like caveats. So how well does the BBC perform with respect to accuracy?

Part of our remit was to look at the accuracy of the BBC’s reporting of statistics – once selected, how often does it “get the numbers right”? If audiences are to be able to rely on the BBC to provide them with “the correct story” (as one participant in the audience research suggested), then clearly they need to be confident that the statistics they are given are reliable. They cannot fully understand the issue at stake if the figures at the heart of the discussion are mistaken. This has three aspects: how accurately the BBC reports numbers produced by external sources (such as the ONS or the Welsh Government); how accurately it generates its own statistics (for example, by conducting its own surveys); and, finally, what we can learn from audience research about the best way to get those figures across to the public. How the BBC deals with statistics quoted by others, for example in speeches or debates, is covered in our later discussion on statistics in debate.

We note that the BBC’s commitment as outlined in its Editorial Guidelines is not to strict accuracy but to “due accuracy”, which it defines as:


. . . accuracy [which] must be adequate and appropriate to the output, taking account of the subject and nature of the content, the likely audience expectation and any signposting that may influence that expectation.36

BBC Editorial Guidelines on Accuracy

In terms of the duly accurate reporting of external statistics, the evidence we received suggests that the BBC often does a good job. When journalists receive statistics as part of a discussion with individuals, through reading press releases or via other forms of research, they often go to some lengths to verify the statistics they receive (although this is sometimes difficult to do, as discussed above). If the accuracy of a statistic appears questionable, some effort seems to be made to communicate this between colleagues so that errors are not repeated. We detected a keenness amongst programming teams to check information which they had concerns over, often with internal colleagues who might have greater expertise in the area.

In many of our discussions (and in the content analysis and audience research) specific platforms and programmes have been praised for their good practice, including BBC online, Radio 1 Newsbeat, BBC Two’s Newsnight and Radio 4’s More or Less. Specific journalists were also consistently commended for their diligence in this area, particularly Mark Easton (Home Editor), along with Hugh Pym (Health Editor), Robert Peston (the former Economics Editor, still in post at the start of our review), Evan Davis (current lead presenter on Newsnight) and national specialists such as Douglas Fraser (Business and Economy Editor, BBC Scotland). The Audience Research report noted that

Accurate numbers and information were expected from the BBC and were felt to be delivered.

Audience research report, Oxygen Brand Consulting

Of course it has not been possible for us to assess the accuracy of every item of BBC content. But Cardiff University’s content analysis gives us some feel for the extent to which the BBC’s reporting of statistics is accurate. With reference to their eight specific case studies, they state that:

Broadly speaking, in stories where statistical information was central, we found BBC journalists dealt with statistics cautiously, explaining both their significance and, on some occasions, the methodology behind them.

Content analysis, Cardiff University

We have also received a general sense of the accuracy of the BBC’s reporting of statistics from several organisations and individuals who regularly contribute to BBC programmes. Many of them praised in general terms the BBC’s accuracy in using external statistics, and we were told that overall the BBC is “excellent at reporting statistics accurately” and that its output is “generally seen to be well balanced and unproblematic”. The audience research concludes in this area that

The study produced a consensus that the BBC is perceived to be accurate in its use of statistics.

Audience research report, Oxygen Brand Consulting

36 http://www.bbc.co.uk/editorialguidelines/guidelines/accuracy/

This perception is corroborated when we look at complaints. As few as 260 of the 259,886 complaints contacts received by the BBC in 2014/15 – roughly 0.1% – were about statistics, although it is hard to be precise, as audiences may quote statistics when making other points and the BBC cannot easily classify complaints in a way that would enable a definitive figure to be reached. But certainly, Jonathan Portes of the National Institute of Economic and Social Research told us that of all the times in the past five years when he has found cause to complain about the factual accuracy of press reports, he has never felt obliged to complain about the BBC.

When looking for specific examples of the BBC’s success in this area, our attention was drawn to a number of cases where the BBC had reported accurately on stories which other media had got wrong. This is illustrated in the content analysis too:

Overall, we found that BBC outlets tend to be more circumspect in their use of and interpretation of statistics than commercial television . . .

Content analysis, Cardiff University

There are examples where the BBC has covered an issue in broader terms, debunked the key figures or gone into further detail on the statistics in a way in which other media sources had not. This is encouraging, and it suggests there is good reason for the results from the Ipsos MORI poll we mentioned earlier about audience “trust” in the BBC for accurate coverage.

In April 2014 some media outlets reported on a drop in violent crime figures and attributed this to the rising cost of alcohol. However, it subsequently became clear that the figures did not directly indicate how much overall violent crime there is. The figures were the number of people treated for injuries related to violence at emergency departments, minor injury units and walk in centres across England and Wales. While other outlets had led with punchy headlines, BBC News was more circumspect – it reported that the research behind the figures said alcohol “might have played a role” and used a headline that referenced the fact that the story was based on A&E data (“Violent crime in England and Wales falls again, A&E data shows”37).

In terms of the internal workings of the BBC, we were told of occasions when specific journalistic teams investigating the accuracy of a statistic try to disseminate their findings to colleagues. So the editors of the Today programme might, for example, leave a note for their night team that a particular statistic should not be used when putting together material for the following day. This is clearly beneficial for the audience as it increases the likelihood of them receiving consistently accurate information, as well as saving much duplicated effort for the journalists involved.

37 http://www.bbc.co.uk/news/uk-27119689


Efforts were apparently made during the flooding crisis in December 2015 to verify the figure widely cited in the media at the time that flood defence spending had been cut by 14% in the year, a figure which the Department for Environment, Food and Rural Affairs denied. The figure was used in some BBC coverage, although BBC researchers attempted to advise programme makers internally as to how that 14% figure had been reached and that it was potentially misleading.

BBC-generated statistics

We also noted some examples of statistics which had been generated by the BBC itself. In the main this is done when the BBC commissions an opinion poll or makes a Freedom of Information request (for example, to local councils) to generate statistical results. This is a difficult area and the BBC needs to be scrupulous in ensuring that the findings are robust. The importance of this is reflected in the BBC’s Editorial Guidelines, which specifically refer to the use of polls and surveys (indeed, this is one of only two statistics-specific mentions in either the Guidelines or Guidance). The Guidelines state:

When we report the results of any research, especially when information is being summarised, the audience must be able to trust that the journalism behind what they see and hear is robust, the research is reliable and meaningful and the language used is both consistent and truthful. This accuracy, clarity and credibility is as important when we report on “polls” and “surveys” as it is in the rest of our journalism. When we commission “polls” or “surveys” ourselves and invest them with the BBC’s authority, we must take even more care to ensure we have made good judgements about their relative importance and the audience can trust what we are saying. . . . Any proposal to commission an opinion poll on matters of public policy, political or industrial controversy, or on “controversial subjects” in any other area, for any BBC service, must be referred to Chief Adviser Politics.38

BBC Editorial Guidelines on Politics, Public Policy and Polls

We understand that the BBC carries out somewhere in the region of half a dozen polls each month (BBC-wide) and one or two surveys.

38 http://www.bbc.co.uk/editorialguidelines/guidelines/politics/opinion-polls


On 6 March 2016 Radio 4’s The World This Weekend ran an item on the forthcoming EU referendum vote which included coverage of a BBC-commissioned survey of Conservative constituency chairmen. The BBC reported that all 330 such chairmen had been approached regarding how they intended to vote. 128 had responded and their intentions had been recorded. The BBC used these results to conclude in an online article that “Tory Chairmen [are] divided over EU referendum vote”39 but were careful to caveat it on the basis that most of those surveyed did not respond to the enquiry.

Some concern has been expressed to us about BBC-commissioned surveys and polls. While it is understandable that journalists would wish to generate their own data, the results can sometimes be patchy and any conclusions can only be tentative. But this is not always how they are reported. We heard from one BBC staff member, for example, that consideration was given to reporting a survey of 655 GP practices in Yorkshire which had been asked if they had experienced problems with recruitment, when only 25% had responded to the question.

The other way in which we heard the BBC generates its own statistics is in the field of scientific experiments for programmes such as Trust Me I’m a Doctor. This is designed, according to the BBC Two website, to go “behind the headlines to give the definitive answer to health questions”40. We heard how the programme runs its own experiments with small groups of volunteers, relying on relevant academics to help design the experiments and interrogate the statistics. They provide detailed information on their website to show their workings, including the raw data wherever possible. Although sometimes the subject of accuracy complaints, and despite the fact that the experiments are small and the results therefore by no means definitive, the team appears to go to great effort to make their methodology transparent and accessible for audiences. “Showing its own workings” should be a consistently applied standard for the BBC and its own statistics.

In 2015 BBC Two’s Trust Me I’m a Doctor team conducted The Big Olive Oil Experiment41. They linked with a team from the University of Glasgow to investigate the health benefits of various vegetable oils. Piggybacking off an initial study of 70 people done by the University comparing the effects of 12 weeks of taking daily extra virgin or refined olive oil on heart health (as measured using a new technique called urinary proteomics), the series funded a further study asking 75 more participants to take daily sunflower oil, rapeseed oil or nothing. Together these two experiments showed that whilst a daily dose of 20ml of either extra virgin or refined olive oil could improve heart health, neither rapeseed oil nor sunflower oil (both also marketed for their health benefits) did. The experiment was written up on the BBC Two website and a link was provided to the results which were formally published in The American Journal of Clinical Nutrition42 (with further publications pending).

39 http://www.bbc.co.uk/news/uk-politics-eu-referendum-35739314

40 http://www.bbc.co.uk/programmes/p01dgd9c

41 http://www.bbc.co.uk/programmes/articles/tWtLcz30LZm3YTk5VfZ307/is-olive-oil-really-good-for-me


But the BBC doesn’t always get it right, sometimes over-interpreting results based on small samples. For example, a popular recent BBC programme, “How to Stay Young”, featured an experiment in which 22 older people were allocated to walking or table tennis for 10 weeks and various responses were measured. The method of allocating people randomly to either walking or table tennis was appropriately emphasised. However, when the final average results were compared, small differences were over-interpreted by the presenters and not enough information was given on unavoidable variability – judging whether a difference could have happened by chance given the small size of the sample. Such programmes encourage the public to think that every observed difference, however small, has a reason: this is not the message we would expect journalists to promote.

There have also been cases, some of them documented in the content analysis, where the BBC has simply got the information wrong. Errors include not explaining accurately what is being measured (an edition of BBC Two’s Newsnight in April 2015 described how “numbers using foodbanks will hit a million this week”, when there were actually a million uses by a smaller number of people43). Or results relating to only part of the UK presented as though they applied to all. Or, sometimes, the figures have just been misinterpreted entirely.
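To spell out the variability point in the “How to Stay Young” example (a rough illustration of our own, assuming the 22 participants were split evenly into groups of 11; we have not seen the programme’s underlying data): if individual outcomes vary from person to person with standard deviation σ, the difference between two group averages has a standard error of

σ × √(2/11) ≈ 0.43σ

so the gap between groups would need to reach roughly twice that, around 0.85σ, before it begins to look like anything more than chance. Smaller differences are exactly what random variation alone would be expected to produce at this sample size.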

Radio 4’s Today programme covered the topic of forced marriages in June 2014. Following the introduction of a law making forced marriage a criminal offence, the programme said that "an estimated 8,000 people are forced into marriages against their will every year". However, this figure was taken from a 2009 report for the Department for Children, Schools and Families (now the Department for Education). This estimated that between 5,000 and 8,000 forced marriage cases were encountered by authorities in England, but this does not mean that 8,000 forced marriages took place. As the report states:

Of the FM cases reported to local organisations, almost two-thirds related to threats of marriage (62%) and just over one-third (38%) related to marriages that had taken place.

This of course can have a big impact on audience understanding. When the News at Ten told us that there had been only three successful race discrimination cases relating to employment in the last five years, this told us something about the way that race discrimination laws are operating in the UK. If we find out subsequently that this is actually the percentage success rate, that tells us something quite different44.

42 http://ajcn.nutrition.org/content/101/1/44.full.pdf+html

43 We note that Newsnight included a corrective statement at the end of the programme on this mistake, and a BBC online article was also written around the same time about the issue of how many people actually use foodbanks.

44 http://downloads.bbc.co.uk/bbctrust/assets/files/pdf/appeals/esc_bulletins/2015/nov.pdf


On 2 September 2013 the BBC website published an article, “Syria crisis: MPs ‘right to reject military action’ - BBC poll”45. The article reported that a BBC-commissioned poll had revealed that almost three-quarters of people believed MPs were right to reject UK military action in Syria in a House of Commons vote the previous week on supporting military action. The article went on to explain that ICM Research had spoken to 1,000 adults in England, Scotland and Wales to reach this conclusion. Individuals in Northern Ireland were not included in the survey, which was made clear in the article. Northern Ireland MPs took part in the vote and in some cases took a different view to the majority, but their constituents were not represented in the BBC poll.

Errors are avoidable. As noted earlier in our section on existing BBC resources, there are many checklists produced by different organisations on how to avoid making mistakes, as well as, as far as we understand, some rules of thumb operating in parts of the BBC. But we could find no common standards which apply across the organisation, nor much awareness of those guidelines that do exist. This is consistent with our recommendation that the BBC needs to strengthen its capability by making clear where responsibility lies for setting statistical standards, and where to go for advice on their application.

Does the BBC explain and interpret statistics appropriately?

But it’s not just about whether the numbers are accurate or not. Audience understanding also depends on the explanation of those numbers – whether the coverage includes appropriate context, whether it incorporates explanations of how the numbers have been generated and what caveats might be involved, whether it includes any kind of general evaluation to help the audience to understand the bigger picture. As one respondent in the audience research put it:

You just want the clear facts. A lot of this to me with all these figures, it just does your brain in. Well it does for me, I just want to listen to somebody and understand it.

Audience research report, Oxygen Brand Consulting

This is something the BBC recognises and it is already reflected in the Editorial Guidelines, in terms both of context and of opinion:

Accuracy is not simply a matter of getting facts right. If an issue is controversial, relevant opinions as well as facts may need to be considered. When necessary, all the relevant facts and information should also be weighed to get at the truth.46

BBC Editorial Guidelines on Accuracy

So to what extent does the BBC achieve this when it reports statistics? Are the numbers explained, put into context, weighed and interpreted, and presented clearly?

45 http://www.bbc.co.uk/news/uk-23931479

46 http://www.bbc.co.uk/editorialguidelines/guidelines/accuracy


Again, we witnessed substantial efforts in this area. Journalists and programme makers make many attempts to fulfil the BBC’s remit of “informing” audiences when they use statistics. We saw several examples of content which explained the background of a statistic in order to give a sense of its significance – to go “beyond the headline” and provide an assessment (whether direct or implied) of the big picture. This can involve deploying comparisons, analysis of trends, interpretation and challenge. This information helps audiences to understand what weight to give to the figures and how to evaluate their significance.

A story in September 2015 reported the possibility of people contracting Alzheimer's disease during certain medical procedures in the same way as the brain disease CJD. This came from a research paper published that month in the journal Nature. Following investigation by the BBC’s health team, all BBC teams were advised that the evidence was speculative, and it was henceforth referred to as a “theoretical” risk across all coverage (for example in its online article, “‘Transmittable Alzheimer’s’ concept raised”47, which also stressed that the findings “are inconclusive and do not mean Alzheimer’s is infectious”).

Certainly the content analysis found instances of particular BBC journalists challenging conventional wisdom and re-interpreting the wider meaning of numbers reported elsewhere.

In February 2015 Radio 4’s More or Less programme debunked a story that strenuous jogging is bad for you, which was receiving much press attention at the time48. The narrative was based on a report in the Journal of the American College of Cardiology of a study involving about 1,500 people in Denmark over a twelve-year period. More or Less read the report and noticed that the group classified as “strenuous joggers” was small, being made up of 36 people. Only two of those people had died by the end of the study, and it was not made clear in the research how they had died. The programme interviewed an expert from NHS Behind the Headlines, who confirmed that with such small numbers no firm conclusions can be drawn about the health effects of strenuous jogging. It then interviewed the lead author of the study, who said that his study had not proved its claim that strenuous joggers have the same average life expectancy as sedentary people.
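The width of the uncertainty around “2 deaths among 36 joggers” can be made concrete. A rough sketch, assuming scipy is available and using a Jeffreys-style interval (our choice of method, not one named by More or Less):

```python
# Why 2 deaths among 36 "strenuous joggers" supports no firm conclusion: an
# approximate 95% interval for the underlying death rate is extremely wide.
# Jeffreys interval: quantiles of Beta(x + 0.5, n - x + 0.5). Requires scipy.
from scipy.stats import beta

deaths, n = 2, 36
lo = beta.ppf(0.025, deaths + 0.5, n - deaths + 0.5)
hi = beta.ppf(0.975, deaths + 0.5, n - deaths + 0.5)
print(f"observed rate: {deaths/n:.1%}")          # about 5.6%
print(f"95% interval:  {lo:.1%} to {hi:.1%}")    # roughly 1% to 17%
```

An underlying rate anywhere between about 1% and 17% is consistent with the data, which is why the expert interviewed said no firm conclusions could be drawn.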

We also found many examples of journalists providing independent assessments of key stories as a means of breaking down the information for audiences. This was done across platforms, but we particularly noted it being achieved through the use of online blogs by senior editors, such as Robert Peston49 and Mark Easton50. In this way, audiences could have access to detailed analysis which they could digest in their own time to help them to a fuller understanding.

47 http://www.bbc.co.uk/news/health-34184470

48 http://www.bbc.co.uk/programmes/b0512ln0

49 http://www.bbc.co.uk/news/correspondents/robertpeston

On 19 October 2015 Robert Peston wrote an article on his BBC blog examining the rate of growth of China’s economy, which had been announced that day as having been 6.9% in July – September 201551. By deconstructing the figure and comparing it with trends and estimates from other sectors and countries, he was able to demonstrate that “we are still some way from a robust understanding of both the current condition of China's complex sprawling economy and its direction – with the implication that our prosperity will be determined both by the changing reality and the changing perception of what's going on there”.

Yet despite these specific achievements, we found that explanation and interpretation of the figures is not provided as consistently as we might have expected across BBC coverage in general. Some have pointed out to us that this is sometimes not due to any lack of effort on the BBC’s part. We understand, for example, that the BBC’s health team went to some lengths to deconstruct a statement which was made by the Secretary of State for Health on 16 July 2015 in the House of Commons that:

About 6,000 people lose their lives every year because we do not have a proper seven-day service in hospitals.

Jeremy Hunt, Secretary of State for Health52

However, they were hampered by the fact that the government had not, at the time this statement was made, revealed the details of its workings. And yet it was such a big story, with the statistic being used again and again, that the BBC could not have chosen not to cover it53. As noted in Cardiff’s content analysis, in its case study of the pay negotiation for junior doctors:

although it is difficult to analyse claims made by the government when little information is supplied, arguably their case could still have been more effectively scrutinised. In doing so, there could have been a more rigorous attempt to understand how the government had come up with the pay offer of 11%, why 1% of doctors would be worse off, why 75% would have a pay rise and why hospitals are currently forced to roster three times fewer staff at weekends.

Content analysis, Cardiff University

We have noted just how many examples we saw of the BBC providing additional information to help audiences understand the numbers. And we are also conscious that it will not be possible, practical or appropriate to include detailed explanations and evaluations every time a statistic is used in BBC coverage. Below we explore some of the issues which should be addressed if interpretation is to be strengthened. They are not universal across all platforms and are certainly not unique to the BBC (we received much evidence of the same potential problem areas throughout the media). But they occurred often enough for us to find them worthy of discussion.

50 http://www.bbc.co.uk/news/correspondents/markeaston

51 http://www.bbc.co.uk/news/business-34570813

52 Quoted in Hansard, available at http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm150716/debtext/150716-0002.htm

53 See http://www.bbc.co.uk/news/health-33542940 for an example of how the BBC covered this, making clear that the 6,000 figure came from research which had yet to be published.

Context

Context is crucial. Without it, statistics can rarely be fully understood by audiences. We might, for example, observe five fire engines attending a fire. But without the context, how do we know whether this is a lot of fire engines or a very small number of them? How can we judge whether their deployment is excessive, stretched too thin or just about right? A number on its own means nothing. As one person told us:

The power of statistics lies not simply in its ability to summarise numerical data, but from the ability to make inferences.

Academic, Scotland

A good example was provided by the Statistics Users Forum in its evidence to the Panel. There is clearly a need to be concise in news bulletins, but simply reporting that “German GDP is 0.3%”, as heard recently, can leave the listener wondering which time periods are being compared, how large German GDP is, and how the change and the level compare with those in other European countries.

As noted above, we saw many instances of journalists acknowledging this and making substantial efforts to provide additional relevant information. We have detailed some earlier in this section and they are also noted in the audience research, including the following example:

Radio 1 News was felt to be good at contextualising in an unpatronising way and respondents felt other channels could learn something from its approach, which was perceived to make fewer assumptions about existing audience knowledge.

Audience research report, Oxygen Brand Consulting

Cardiff’s content analysis gives us a sense of the level of responsibility in this area which is placed on journalists – in their sample period, they found that journalists were a little more likely than external sources such as politicians or spokesmen to provide context and a little less likely to make passing or vague references to statistics54.

54 Table 2.6, p17, of Impartiality Review of BBC Reporting of Statistics: A Content Analysis, Cardiff School of Journalism, Media and Cultural Studies, Cardiff University, January 2016


On 10 February 2016 the BBC covered the fact that nearly 3,000 operations had been cancelled when junior doctors in England took part in a 24-hour strike over pay and conditions. Early reporting by the BBC sometimes included scant context around this statistic. However, an online article later in the day, “Operations hit as doctors strike again”55, set out more of the latest developments in the dispute and explained the background to the walkout. Members of focus groups for the audience research pointed out how crucial context was to this story:

In order to assess detriment, respondents said they wanted to know not only how many operations had been cancelled but how many had still gone ahead.

The BBC included this information in the above article, stating that “92% of operations would go ahead as normal”.

We note that some earlier coverage did eventually include this statistic, such as Breakfast on BBC One. This programme reported on the strike several times, each time mentioning the 3,000 operations cancelled, but it was well over an hour into the programme – and on the fourth mention of the strike – before any reference to how many operations had gone ahead was included. Stories do of course evolve. But such additional information could, and should, have been included in broadcasts throughout the day.

The BBC was particularly commended by participants in the audience research for a report on the decline in the number of Motability cars, quoting not only how many cars had been lost but how many remained56. As the audience research notes,

The audience sometimes needed clearer context to maintain concentration on a story containing data. This firstly meant more context on the importance of the story (‘why I should care’), and secondly on the validity of the numbers (“should I be worried?”).

Audience research report, Oxygen Brand Consulting

55 http://www.bbc.co.uk/news/health-35535704

56 http://www.bbc.co.uk/news/uk-35476904


In January 2016 the online BBC News magazine covered the issue of Lotto rollovers57. In trying to explain the massive rollover of £60 million that month, the biggest since the National Lottery began in 1994, the BBC included explanations from organisations such as the Royal Statistical Society, for example: “I said when these changes came in that they'd increase the chance of rollovers, and that's happened”. This placed the rollover in the context of the changes which had previously been made to the Lotto, helping audiences to understand that winning the jackpot is now less likely than it was before and hence that the rollover was not unexpected.

As an example of the BBC including appropriate context, on 27 October 2015 the ONS released its Preliminary Estimate of GDP growth for Quarter 3 (July to September) 2015. The content analysis commissioned for our review investigated coverage of this issue as one of its case studies (“Reporting Economic Growth”)58. In particular, it looked at a report on BBC Radio 4’s Six O’Clock News that day which dwelt on what the figures told us about the government’s commitment to rebalance the economy but also put the figures in a longer-term perspective. The news report included the following statement:

Even though ministers have for years banged on about the need to rebalance the economy towards making stuff, the manufacturing sector has returned to recession and is still more than 6% smaller than it was before the crash of 2008, the service sector by contrast is 11% bigger.

The content analysis report noted that:

this was the only report to provide detailed figures by sector since 2008, which is rather more meaningful than the quarter-by-quarter comparisons referred to in most headlines.

Content analysis, Cardiff University

And yet, while we note examples where this sort of information has been included in BBC content, and the initiatives which sometimes assist, we also saw occasions when this context should (in our view) have been included but was either significantly lacking or misplaced. In several cases reported to us, statistics were quoted without additional information. Numbers were simply stated in a vacuum, without any specific reference to relevant trends or comparisons which might have enabled audiences to evaluate their significance to the story or to draw firm conclusions. Indeed, Cardiff’s content analysis showed that only just over one third of statistical references in news programmes contained some meaningful context59. This flaw is difficult to understand, given the wealth of evidence we have just seen of how often the BBC was able to include such information.

57 http://www.bbc.co.uk/news/magazine-35250885

58 pp63 - 67

59 Table 2.4, p15


On 20 April 2016 Radio 4’s Today programme included a report on depression in young people. It stated that 80,000 children and young people currently suffered from depression in the UK, a figure taken from the National Institute for Health and Care Excellence (NICE) in 201360. The programme covered the issue several times (as a follow-up to an edition the week before which had covered access for young people to mental health services). It included an interview with a young child who suffered from depression and his mother, excerpts of which were run more than once during the course of the episode.

None of the earlier references or excerpts included any context for the figure of 80,000, such as the proportion of young people this represented or how this related to prevalence among the adult population. However, around two thirds of the way through the programme an external expert (the head of child and adolescent psychiatry at a university) did provide some contextual background – that the proportion of young people experiencing significant depression was somewhere around 1 in 100 to 1 in 50 teenagers, and 1 in 200 younger children. Nevertheless, listeners who only tuned in to the first section of the programme would not have heard this additional information.

On other occasions, we saw examples of the mis-reporting of trends or comparisons which potentially distorted the significance of the figures. Contextual information such as trends should always be included where available. But in reporting trends, attention should be paid to selection of the period – would the overall picture be different if a different starting point had been selected? – and to how periods are reported to avoid misleading audiences. One external contributor told us:

A typical report goes “Britain’s economy grew by 1% in October.” This is without meaning unless further specified. “In the year to October” “Compared with the previous October” “In the quarter ending in October over the previous October.” In the search for crispness of expression, all meaning is sacrificed.

This issue is demonstrated in the audience research report which accompanies this review:

We also prompted with an example of planned challenge on flood defence funding from Radio 4’s Today programme involving Sarah Montague and Rory Stewart. In this, Sarah Montague challenged Rory Stewart on whether flood defence funding had fallen under the Conservative government. This challenge was not successful because Rory Stewart and Sarah Montague used aggregated spending figures in £millions spread over different “blocks” of three or more years. The spending “blocks” the Minister and the Today journalist were comparing covered different time periods, for example 2005-2010 versus 2007-2011, and 2010-2015 versus 2011-2015. As the time blocks didn’t coincide, year-by-year spending could not be clearly compared and a conclusion could not be reached on air.

Audience research report, Oxygen Brand Consulting
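The arithmetic failure in that exchange is instructive: totals over blocks of different lengths can only be compared once annualised. A minimal sketch, with invented figures rather than the actual flood defence budgets, shows the conversion a presenter would need to have ready:

```python
# Sketch: spending totals over "blocks" of different lengths are only
# comparable once annualised. Figures are invented for illustration and are
# not the real flood-defence numbers discussed on air.
blocks = {
    "2007-2011 (4 years)": (2_400, 4),   # (total in £m, years covered)
    "2011-2015 (5 years)": (2_700, 5),
}
for label, (total, years) in blocks.items():
    print(f"{label}: £{total}m total = £{total/years:.0f}m per year")
# Note: the bigger headline total (£2,700m) is the LOWER annual figure
# (£540m per year versus £600m per year).
```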

60 We understand this figure was used in a press release by NICE in relation to the publication of their quality standard for the care of depression in children and young people in September 2013. Information on this quality standard can be found at https://www.nice.org.uk/guidance/qs48.


On 12 April 2016 BBC online included an article, “UK inflation rate rises to 0.5% in March”. The article reported that UK inflation, as measured by the Consumer Price Index, had risen to 0.5% in March, from February’s rate of 0.3%. However, the article did not include key contextual information, such as the basis for comparison (eg it was not clear whether the increase was month-on-month, year-on-year or something else). This information would have been crucial, in our view, in helping audiences to evaluate the impact of the inflation rise.
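To illustrate why the basis of comparison matters, here is a minimal sketch using invented CPI index values (not the ONS series) showing how the same index yields different headline rates depending on the period compared:

```python
# Sketch: the same price-index numbers give different "inflation rates"
# depending on the comparison basis, which is why articles need to state it.
# Index values below are invented for illustration.
cpi = {
    "2015-03": 99.7,
    "2016-02": 100.0,
    "2016-03": 100.2,
}

month_on_month = (cpi["2016-03"] / cpi["2016-02"] - 1) * 100
year_on_year   = (cpi["2016-03"] / cpi["2015-03"] - 1) * 100

print(f"month-on-month: {month_on_month:.1f}%")  # 0.2% on these numbers
print(f"year-on-year:   {year_on_year:.1f}%")    # 0.5% on these numbers
```

Both figures are “inflation in March”; only the stated basis tells the audience which one is meant.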

Attention should also be directed towards the underlying message. For example, reports of changes in UK government debt are, we were told, sometimes reported in cash (nominal) terms. This can be misleading. One submission we received said:

…I have seen a well-known BBC financial journalist quote, without qualification, numbers for how much UK government debt is increasing every day, in a manner that is clearly designed to suggest that this is a very serious problem. But numbers like this are meaningless on their own. For example, we could do exactly the same to nominal GDP to give a positive impression about the health of the economy. To the layperson any number would seem large, but it would be a meaningless measure of economic health, particularly if all the growth has come from inflation.

Context always matters in explaining and interpreting numbers. Of course, we recognise that there are time and logistical limitations in the amount of context it is possible to provide in short pieces of content such as some news items. But it should be possible to provide context in even relatively short soundbites, with some careful consideration. We would like to see this happen more regularly than it does. Moreover, understanding context is important in preparing content – it sets the tone for how reports are framed. As one senior BBC journalist told us:

It’s not about how you choose the numbers, it’s how you choose the words.

The audience research suggested a list which might help audiences to understand the figures in news and current affairs (a short illustrative sketch of the “real terms” item follows the list). The list includes:

• what numbers mean in terms of what has been lost or gained from the total;

• the base size and its reliability;

• normal incidence (eg frequency of an event);

• what a number or sum equates to in “real terms”;

• adding normative context; and

• risk.
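On the “real terms” item above: converting a cash (nominal) figure into real terms is a one-line calculation once a price index is chosen. A minimal sketch, with invented figures:

```python
# Sketch of the "real terms" point: a cash (nominal) figure is converted into
# real terms using a price index. All values are invented for illustration.
spend_2010_cash  = 100.0   # £m, in 2010 prices
spend_2015_cash  = 110.0   # £m, in 2015 prices
price_index_2010 = 100.0
price_index_2015 = 112.0   # i.e. 12% cumulative inflation over the period

# Express the 2015 figure in 2010 prices before comparing.
spend_2015_real = spend_2015_cash * price_index_2010 / price_index_2015
print(f"2015 spending in 2010 prices: £{spend_2015_real:.1f}m")  # about £98.2m
# A 10% cash rise is a real-terms cut once 12% inflation is taken into account.
```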


In other words, the objective of reporting on statistics should be to help audiences understand what is significant and what is “noise”. These suggested elements are consistent with the advice we referred to earlier contained in external checklists. We have seen that numbers are sometimes used by the BBC in ways which make it difficult for audiences to understand whether they are big or small, worrying or not. We recommend that much more is done to ensure that statistics are always contextualised in such a way that audiences can understand their significance.

Interpretation

Providing context aids interpretation. But it is not always enough. It is important that, as well as communicating the statistics, journalists are also able to provide interpretations around the sometimes-complex subjects in which they appear (such as banking and health) in order to help audiences to understand the relevance of the figures. And these interpretations need to be based on a balanced assessment of the evidence in order to provide audiences with an impartial reading. The audience research which supplements this review confirms that audiences sometimes want an interpretation of the basic figures, since some individuals clearly require assistance in digesting the numbers they receive.

. . . due to differing levels of audience understanding and knowledge of maths, not all types of statistics are understandable for the audience. Sometimes the audience believe they would benefit from more explanation of what the figures mean. The audience judged this analysis can come either from a statistical expert or a journalist. If it is a statistical expert they need to have a skill in making meaning accessible. If it is a journalist they need to have (or be able to access) the requisite analytical expertise to interpret the significance and meaning of the figures.

Audience research report, Oxygen Brand Consulting

And the content analysis demonstrated that this need very often falls to journalists to fill.

. . . external sources tended to make more vague or passing references to statistics, leaving substantial explanations about data sets or comparisons about issues to journalists.

Content analysis, Cardiff University

The issue of interpretation can be broken down into two parts – firstly, whether any interpretation is provided at all by journalists and programme makers, and secondly, whether the right interpretation is provided, for example, whether the significance of the figures is over- or under-stated.

Providing interpretations

We noted many occasions in our review where attempts were made to provide detailed interpretations of complicated statistical information. Some of these are documented above (such as the More or Less programme on strenuous jogging), and also in Cardiff’s content analysis61.

On 13 November 2015 Kamal Ahmed, the BBC’s then Business Editor, wrote a blogpost, “Banking crisis – will we get our money back?”62. This attempted to explain a tweet published that morning by the Chancellor, George Osborne, saying that the sale of Northern Rock mortgages to US investment firm Cerberus was “another important step in clearing up the mess left by the financial crisis”. By separating the question into a number of parts and incorporating information such as valuations of the assets, loans outstanding and the wider economic costs of the recession, Mr Ahmed was able to provide the interpretation that “the government still has some way to go to ensuring the taxpayer ‘gets their money back’”.

And yet, we also noted times when this interpretation was lacking or inadequate, potentially leaving audiences confused as to the relevance of the statistics included in the piece. Again, some examples can be seen in the case studies covered in the content analysis.

On 26 October 2015 BBC One broadcast a Panorama programme on “Britain’s Mental Health Crisis”. This investigated the increasing demand for NHS mental health services in the face of funding cuts. While the content analysis found much in this episode to praise, it also revealed that parts of the programme might have benefitted from more detailed interpretations of the statistics.

At times, however, statistics could have been unpacked or explained in greater detail. At the end of “Hooked on Painkillers”, for example, the journalist stated:

“1.8 million people have admitted misusing prescription painkillers” [On screen: ‘Source: National Crime Survey’]. (‘Britain’s Mental Health Crisis’, Panorama, October 26)

The programme might, for example, have interpreted the proportion of the UK population this represents (it is a study relating to England and Wales); whether this represents an upward or downward trend; or how it compares with illegal drug use.

Content analysis, Cardiff University

Interpretation is important; even more important is correct interpretation.

61 pp 32 – 33

62 http://www.bbc.co.uk/news/business-34806376


Providing the right interpretations: statistical significance and uncertainty

There is a tendency in the BBC and other media outlets to focus on change in unemployment, inflation, GDP and so on. Frequently the headlines imply that changes from month to month or quarter to quarter are important and (statistically) significant (such as in the following example concerning ONS unemployment statistics). We came across several examples of overplaying this, with concern expressed by several contributors.

We note that statistics are often used to give an impression of change in the world around us. Unemployment figures can tell us how the labour force has changed, crime statistics can tell us about shifting levels of safety and security in our society, mortality figures can tell us about improvements in medical treatments, and so on. We note a clear tendency to use numbers to illustrate narratives of “up” versus “down”, “better” versus “worse”, “progression” versus “decline” and so on. Statistics can certainly throw light on these kinds of interpretations, if they are used in the right way. But sometimes the short-term change in the numbers is not as statistically important as we assume it to be, and an emphasis on longer-term trends might be more useful in helping audiences to interpret the figures and understand their implications. It is important to be clear when things have in fact not changed significantly (in either a statistical or an economic sense). The fact that unemployment, inflation and GDP growth are broadly steady is important to understand, for example.

It is clear from both the content analysis and the audience research that the BBC on some occasions does go to some lengths to caveat information and provide other necessary details so that audiences can make a judgement on the relevance of a statistic.

On 29 October 2015 the ONS published its national population projections, based on the UK population at 30 June 2014. The ONS’s press release made clear these population figures were projections and that care should be taken in interpreting them. The content analysis63 documents how this warning was not necessarily heeded by all broadcasters in all reporting of the issue, but it notes some examples of where the BBC was more cautious in interpreting the ONS’s estimates and the rising levels of migration than other media:

These predictions have to come with a health warning in that it is not an exact science, but these figures do give governments an idea of what they should be planning for in years to come. (BBC News Channel, October 29)

So there’s a conflict: young migrants may have an important role in the future to help pay for our elderly, but there are worries that the overall cost may be too high. (TV BBC News at Six and Radio 4’s Six O’Clock News, October 29)

63 pp93 - 104


However, some areas of the BBC’s coverage are sometimes deficient in this respect. John Pullinger, the current UK National Statistician, told us that:

There is . . . an important balance to be struck and in its desire to stimulate audiences the BBC needs to take care not to overplay the statistical significance of the figures. For example, a fall in the monthly rate of CPI inflation from 0% to minus 0.1% was reported as “Britain plunged back into deflation”.

John Pullinger, UK National Statistician

And the content analysis shows that in much of the BBC’s reporting of the ONS unemployment statistics released in October 2015, broadcast outlets (such as the News at Six) still reported the statistics without offering the relevant caveats or doubts about their significance which Anthony Reuben had raised in an online article on the subject64. This is an important issue, as it suggests that it would have been very difficult for some audiences, depending on which BBC content they had accessed on that day, to evaluate the gist of the story (ie the extent to which efforts to increase employment had succeeded). We therefore agree with the view expressed to us by one stakeholder that:

caveats are not optional.

Staff member, Welsh Government

As one person who submitted evidence put it:

One of the things that does strike me as in need of attention is the lack of conditionality in reporting unemployment and immigration numbers.
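The unemployment case turns on a comparison any bulletin could make before declaring a rise or fall: is the reported change larger than the survey’s margin of error? A minimal sketch, in which the ±80,000 margin is the figure cited in footnote 64 and the change figure is invented:

```python
# Sketch: a quarterly change in unemployment is only meaningful if it is
# large relative to the survey's margin of error (around +/- 80,000 for the
# quarterly estimates cited in footnote 64). The change below is invented.
observed_change = -40_000   # hypothetical reported quarterly fall
margin_of_error = 80_000    # approx. margin on the quarterly change

if abs(observed_change) < margin_of_error:
    print("Within the margin of error: consistent with no real change.")
else:
    print("Exceeds the margin of error: more likely a genuine movement.")
```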

We note that in 2015 there was broad concern about the failure of the opinion polls accurately to predict the outcome of the General Election65. The British Polling Council commissioned an independent enquiry into the performance of the polls at the election, led by Professor Patrick Sturgis of Southampton University, which reported in March 201666. Among its conclusions were two recommendations relevant to our review – that polling organisations should:

“provide confidence (or credible) intervals for each separately listed party in their headline share of the vote; [and] provide statistical significance tests for changes in vote shares for all listed parties compared to their last published poll”.

This is because they concluded that:

The inquiry was also tasked with assessing whether uncertainty in poll estimates is adequately communicated to stakeholders and the public. Our conclusion is that it is not . . . Commentators are prone to over-interpreting small changes in party shares between opinion polls, sometimes giving the impression that party fortunes are shifting when the evidence does not support the inference. A requirement to test for change since the last published poll does not preclude discussion and analysis of whether changes from a larger set of polls, taken together, might constitute evidence of change. Responsible media commentators would be much less inclined, however, to report a change in party support on the basis of one poll which shows no evidence of statistically significant change.

64 http://www.bbc.co.uk/news/business-34486717. The ONS’s margin of error for quarterly unemployment statistics is around +/- 80,000, which is often larger than the observed change in content which covers the statistics.

65 Opinion polls had suggested throughout the election campaign that the Conservatives and Labour were likely to secure equal shares of the vote (around 34% each). No individual poll put the Conservatives more than a point ahead. Yet in the event, the Conservatives won 38% of the vote in Great Britain, while Labour won 31%.

66 Report available at http://eprints.ncrm.ac.uk/3789/1/Report_final_revised.pdf

The BBC will doubtless be considering how to incorporate these findings into its own reporting of opinion polls. Similar considerations apply to other statistics too.
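The scale of the uncertainty the inquiry wants reported is easy to compute. A minimal sketch of the approximate 95% margin of error for a party’s share under simple random sampling (real polls have additional design effects, so this is if anything a lower bound):

```python
# Sketch: approximate 95% margin of error (in percentage points) for a
# party's share in a simple random sample of n respondents.
import math

def margin_of_error(share: float, n: int) -> float:
    """Approximate 95% margin of error, in percentage points."""
    return 1.96 * math.sqrt(share * (1 - share) / n) * 100

# A party on 34% in a poll of 1,000 adults:
print(f"+/- {margin_of_error(0.34, 1000):.1f} points")  # about +/- 2.9
```

On these assumptions, a one- or two-point movement between two polls of 1,000 is well within sampling noise – exactly the kind of change the inquiry found commentators over-interpreting.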

Statistics in debate

Statistics should be used to illuminate debate, not to confuse it. They can help audiences make appropriate evaluations and understand issues in clear and simple ways. This was demonstrated in the audience research report, which concluded that:

The public appear to put a premium on statistics and information which can illuminate in confusing situations and add some accurate knowledge.

Audience research report, Oxygen Brand Consulting

Sometimes the statistics which appear in coverage require challenge, and journalists and programme makers need to exercise careful judgement in how to achieve this. They also require evaluation to help audiences navigate through the arguments. We have every sympathy with the view, frequently expressed to us, that broadcasts which descend into trading apparently conflicting numbers can be confusing and a turn-off for audiences. But that does not mean that wrong or misleading statistics, particularly when they are central to an argument on an important issue, should go unchallenged. Sometimes this challenge can be provided by another contributor in the content (we discuss this in more detail below), but where it is not, the journalist has an important role to play. We have seen how most members of the audience are not experts with numbers; hence they rely on the BBC to identify when contributors are using figures which might be open to alternative interpretation. And yet, the evidence we have seen suggests this is rarely done.


On 2 September 2015 an actress was interviewed on BBC Two’s Newsnight about climate change (on which she had recently campaigned). During the interview she made inaccurate statements about climate threats, including the claim that

if they [oil companies] take out of the earth all the oil they want to take out, you look at the science – our temperature will rise 4 degrees Celsius by 2030, and that’s not sustainable.

Scientific research suggests that this temperature rise is in fact likely to be reached much later – the World Bank, for example, puts it at “by the end of the century”67. However, the statement, and others like it, went unchallenged in the programme by the presenter.

(It is worth noting, however, that the issue was revisited by Newsnight on 23 November. The Science Correspondent and the Executive Secretary of the UNFCCC addressed the scale and context of climate change in a piece on the run-up to the UN Conference on Climate Change, reflecting scientific understanding of changes in emissions in relation to a target to limit global temperature increases to 2 degrees by the end of the century. The piece took care to report in accordance with scientific understanding.)

The Joseph Rowntree Foundation pointed us to a report by Full Fact which it funded on “How did the Media and Politicians in the UK Discuss Poverty in 2015?”68. In their submission they told us:

One of the most striking findings was that journalists did not appear to explicitly scrutinise or challenge claims made by politicians or by pressure groups – including on use of statistics. This suggests a recommendation for the BBC to develop its role in supporting and encouraging BBC journalists to do this, thereby contributing to more accurate reporting and better understanding of poverty.

Joseph Rowntree Foundation

This conclusion could apply equally to other topics of public debate. Cardiff University noted in its content analysis that only 4% of the statistical references examined in the sample period were subject to further challenge. While it would be hoped that most statistics would not need to be challenged, they concluded that:

this often meant that sources were left to advance competing statistical positions, without any independent assessment offered to assist audiences in making a judgement about the veracity of the claims made.

Content analysis, Cardiff University

67 See their report, “Turn Down the Heat: Why a 4°C Warmer World Must Be Avoided”, available at https://openknowledge.worldbank.org/handle/10986/11860

68 Available at https://www.jrf.org.uk/report/talking-about-poverty-2015

This lack of challenge may on some occasions be caused by insufficient briefing of presenters and/or programme makers before they interview contributors. Cardiff University’s case study of the way in which the BBC reported cuts to the tax credits system, for example, documented how many BBC outlets recycled versions of the same statistic (regarding the government’s contention that those on low incomes would gain more (or as much) from increases to the minimum wage (and other changes) than they would lose in working tax credits). This was despite a wealth of statistical information available from independent sources which indicated the government’s claims were, at best, debatable. As Cardiff notes:

This indicates that the BBC could improve its consistency of performance by ensuring that reporters, editors and presenters familiarise themselves with the most detailed and readily accessible running accounts of statistical stories – likely to be available online.

Content analysis, Cardiff University

It is hard to disagree with this conclusion. We are conscious that difficult judgements are involved in deciding when, and how, to challenge particular claims. However, complex situations could arise if specific sources are continually left unchallenged. In particular, the content analysis demonstrates that there is an especially high number of political figures providing statistical information on the BBC (21% of all sources used in the period monitored by Cardiff)69. And Conservative politicians accounted for nearly three quarters (73%) of these statistical references, which Cardiff University concludes “shows a high dependence on the governing party”70. It is reasonable to expect the BBC to cover statements which the UK or devolved governments make. However, as Cardiff’s content analysis points out, this makes it vital that those statements are challenged where necessary, so that the impartiality of the BBC’s coverage of political affairs is not affected.

Following a statement in November 2015 by the Prime Minister that 43% of EU migrants claimed benefits in their first four years of being in the UK, Cardiff University tracked this story in its content analysis of BBC news coverage. Reservations had been expressed about the methodology behind this figure before the Prime Minister used it – in particular, that the official source data had not been released. However, Cardiff found many instances of it being quoted by government sources in BBC content without challenge on any fundamental level by journalists. This included a Radio 5 live studio interview on 10 November 2015 with the Cabinet Minister Michael Fallon, in which he stated that “. . . around 40% of those who come here from the rest of the Union are claiming benefits straight away and that isn’t right”. As the content analysis notes,

while the presenter made a point about the use of statistics, the veracity of the figure or the methodology behind it were not subject to a great deal of interrogation.

Content analysis, Cardiff University

69 Table 2.11, p24

70 Table 2.12, p26


The issue of challenge can have a national perspective. Northern Ireland, for example, has a cross-party government and there is only limited explicit oppositional “voice”. In such circumstances, the onus is placed even more squarely on the media and other commentators to introduce that element of challenge. As one senior statistician in Northern Ireland told us, in this situation:

. . . the BBC has to provide interpretation and context.

It was suggested to us by a number of individuals that there can also be a tendency not to challenge statistics produced by charities. The fact that they are produced in the context of philanthropic endeavour can potentially lead to them being assumed to have a greater accuracy than might actually be the case. And yet, charities usually have a clear agenda which they are trying to promote and should, of course, be subject to the same degree of scrutiny and scepticism as others.

On 17 August 2015 the BBC published an online article, “Diabetes cases soar by 60% in last decade”71. This included the latest measurement of the number of people in the UK living with some form of the condition – 3.3 million people, up from 2.1 million in 2005. However, this statistic was produced by Diabetes UK, the UK’s leading diabetes charity, as part of a call for the Government to take increased action to prevent new cases and improve treatment. It did not take into account factors such as people living longer and improving diagnosis, which could account for some of the increase in the measured number of people with diabetes.

We recommend that more be done to ensure that presenters are in a position to challenge such assertions, appropriately and confidently.

Another preferred method of resolving the conundrum between constraint and challenge appears to be to incorporate challenges into content from other, oppositional external sources, and this is one of the key areas over which we had concerns. We received evidence indicating that where contentious statistics are anticipated from external sources due to feature in content, the BBC’s tendency is to plan for the inclusion of an opposing external source to counterbalance these statements and provide the appropriate challenge (rather than journalists providing it themselves). By presenting audiences with both “sides” of the debate, they would therefore appear to be inviting the audience to make up its own mind as to where the balance of evidence lies. Hence, the content analysis:

. . . found many news programmes – particularly in the evening bulletins – allowing competing sources to make statistical references, with no journalists or independent experts making a judgement about the veracity or credibility of either claim.

Content analysis, Cardiff University

71 http://www.bbc.co.uk/news/health-33932930


Indeed, we were told by one or two regular contributors to BBC content that they were sometimes specifically asked by programme makers to provide an oppositional opinion in lieu of the presenter doing so.

In an interview on the Today programme on 26 October 2015 about the Government’s proposals to cut working tax credits, Matthew Hancock MP (a government spokesperson) claimed that “. . . by next year eight out of ten people will be better off as a result [of the Government’s proposal]. . .” This was based on an understanding that those on low incomes would gain more (or as much) from increases to the National Minimum Wage (and other changes) than they would lose in working tax credits. This claim had been strongly disputed by organisations such as the Institute for Fiscal Studies, which concluded that it was “arithmetically impossible” for losses from tax credit cuts to be mitigated by increases in the NMW. Rather than challenging this statistical claim, however, the presenter on the Today programme simply stated, “Well, we will hear the opposition view on that from Labour and the Liberal Democrats later in the programme.”

However, as pointed out by the content analysis, this approach is not generally helpful to audiences who are trying to understand the statistics. Neither broadcasters, nor we, would like to see news and current affairs debates regularly become arguments about numbers. But, as the content analysis notes, this state of affairs

. . . often meant that sources were left to advance competing statistical positions, without any independent assessment offered to assist audiences in making a judgement about the veracity of the claims made.

Content analysis, Cardiff University

We note that the debate leading up to the June 2016 EU Referendum (ongoing at the time of writing) has exemplified the public demand for “facts”.


An online article on 18 March 2016 (“Reality Check: Would Brexit cost every household £850?”72) attempted to analyse a report from the Centre for Economic Performance (CEP) which claimed that the UK leaving the European Union would knock £850 off the average UK household's income. The article set out some explanation of how the CEP had arrived at these figures and the assumptions that underpin their model. It included a statement from the Chief Executive of Vote Leave (the organisation campaigning for the UK to leave the EU) that “these ridiculous claims lack credibility”. However, it did not include any helpful independent analysis by the content producers to enable audiences to evaluate the two competing claims beyond the following, which in our view does not go far enough in weighing the available evidence:

Nobody knows what would happen if the UK were to leave the EU, so well-qualified people are trying to guess. There is little reason to believe they are guessing correctly. Predicting things is a thankless task – remember those opinion polls at last year's general election?

It is understandable that incorporating challenges from external sources, rather than journalists providing them directly, might be pursued as a means of adhering to the letter of the current Editorial Guidelines. But in our view, simply pitching one statistical claim against another without providing any kind of “refereeing” voice will rarely be sufficient to ensure audiences reach a full understanding of the relevance of the numbers. As one senior BBC editor put it:

If you count [impartiality] as giving 20 seconds to one side, 20 seconds to the other and the correspondent saying “it’s up to you to decide”, you are doing the audience a big disservice.

Senior Editor, BBC Scotland

We agree. The BBC has a responsibility to help audiences make sense of the statistical evidence in an impartial way. That involves being willing, more than at present, to weigh, interpret and explain the statistical evidence and, when appropriate, challenge or correct it when it is misused. We can see how journalists might feel constrained to avoid substantial challenge to proffered statistics. And yet, we have also seen how important these challenges and impartial evaluations are. Audience research, and our own discussions, showed considerable frustration with this way of presenting statistics and effectively leaving the audience to make up its own mind. The BBC needs to get better and braver at interpreting and explaining rival statistics and guiding the audience.

The BBC has implemented a specific initiative which furthers its ability to provide impartial analysis, namely the Reality Check service. This has been used by the BBC at various key moments to provide analysis and advice, including on statistics, and has recently resumed in the run-up to the EU referendum. Reality Check is a team of news journalists who act both as an internal resource for staff wanting to verify information and as a content-production service generating pieces to explain some of the key concepts currently being covered on the news. Their Twitter feed73 describes their aim as being to “cut through the spin and concentrate on the facts”. Examples of their output include pieces such as the article by Anthony Reuben in April 2015 looking at how many people have second homes74, written in the context of pre-election claims by political parties about Council Tax. We heard evidence that the team is also consulted by BBC journalists wanting to share information on the use of specific statistics. We note that Channel 4 runs a permanent Fact Check feature easily accessible from its news website home page75, as does the news website of the Australian Broadcasting Corporation76. The content featured in Reality Check can sometimes be susceptible to the challenges we raise in this report (such as the example above about Brexit), but the BBC’s online information is generally praised by audiences, along with Channel 4’s Fact Check. We recommend that Reality Check becomes a permanent feature of the BBC’s activities, with a prominent online presence, reinforcing the BBC’s commitment to providing well-informed, accurate information on topical and important issues.

72 http://www.bbc.co.uk/news/uk-politics-eu-referendum-35843953

Risk

Risk, though frequently a complex subject to communicate to audiences, was an area we found particularly wanting on occasion, and one where clear interpretation and presentation can be vital. The reporting of risk – for example health risks or the risk of becoming a victim of crime – is crucially important, as its mishandling can lead to scaremongering or panic. Indeed, it was in the context of the reporting of the MMR vaccine controversy in the late 1990s (when the media gave a distorted impression of the weight of evidence supporting one side of the debate77) that specific Editorial Guidance on risk78 was produced for the BBC. This includes a “Reporting Risks Checklist” and recognises that the BBC

. . . should report statistics and risks in context, taking care not to worry the audience unduly, especially about health or crime.

BBC Editorial Guidance on Risk

Yet despite this Guidance, all too often the BBC’s reporting of risk can still be confusing. Sometimes, for example, rhetorical statements on risk are not supplemented by any specific statistics to outline what the actual risk is. This raises the possibility of audiences being misinformed or even alarmed by a headline which does not provide them with any information to explain the actual risk.

73 https://twitter.com/BBCRealityCheck

74 http://beta.bbc.co.uk/news/election-2015-32393222

75 http://www.channel4.com/news/

76 http://www.abc.net.au/news/

77 http://www.bbc.co.uk/news/health-22173393

78 http://www.bbc.co.uk/editorialguidelines/guidance/reporting-risk


A BBC online article from August 2015, “Cancer risk ‘even from light drinking’”79, reported on work in the British Medical Journal assessing two large US studies, involving more than 100,000 adults, of links between drinking patterns and cancer risks. Although containing many statistics (largely relating to recommended levels of alcohol consumption), the article did not set out any useful figures to put its statements into context so that readers could interpret them. In particular, despite including statements such as, “For women, the researchers observed, the risk of alcohol-related cancers – mainly breast cancer – increased even after one alcoholic drink a day”, it did not provide any specific figures for the risk or for how much it would be increased by light to moderate drinking.

There should, if possible, be a meaningful explanation of what the risk would actually mean to an individual. This is not straightforward. One former BBC editor told us that it would be good practice to express a risk in a way which draws on numbers of people rather than percentages – for example, instead of saying, “the risk of developing bowel cancer is X%” one might say “X people have developed bowel cancer this year”. Presumably this is an attempt to personalise the statistic and make it more meaningful and we support this sentiment. And yet we note that psychological research has shown that expressing a risk as a fraction of a large population makes the risk look bigger80. So, for example, 1,000 out of 10,000 people sounds like a higher risk than 1 out of 10. When trying to communicate risk to individuals, it is therefore our view that it is in fact misleading (however well-intentioned) to talk only about the number of people in the country who are affected. We would prefer to see other methods of explaining risks explored in more detail on BBC content.

On 26 October 2015 the International Agency for Research on Cancer issued a press release which classified processed meat as "carcinogenic to humans", placing it in the same category (on their measurement scale) as many other substances. This included tobacco, one of the deadliest carcinogens (in terms of risk factors). At no point did the press release make a comparison with tobacco, but many headlines in subsequent reporting by the media did (as evidenced in the content analysis). This was misleading, as it suggested an equivalence between processed meat and tobacco which was not an accurate representation of the comparable risks. Some of the coverage earlier in the day by the BBC included this misleading comparison, such as 5 live Breakfast, which stated: "The World Health Organisation is going to put the cat amongst the pigeons today. Processed meats, bacon, sausages etc, are going to be declared carcinogenic. A WHO person told the Mail last week that these meat products are going to be in the same category as cigarettes and asbestos . . ."

79 http://www.bbc.co.uk/news/health-33975946

80 Reyna VF, Brainerd CJ. Numeracy, ratio bias, and denominator neglect in judgements of risk and probability. Learning and Individual Differences. 2008;18(1):89–107


A common manifestation of the issues with risk reporting seems to be that, while much attention is given to risk factors (and their increase or decrease), baseline risks are sometimes not stated. Again, this is considered very poor practice in risk communication – a "relative risk" of 50% sounds dramatic, but if the baseline is 1 in 10,000, a 50% increase takes the absolute risk only to 1.5 in 10,000, far smaller than the headline figure suggests. Omitting the baseline can lead to confusion and difficulty in accurately assessing the degree of the risk being reported on. As noted in the audience research report:

. . . participants found it essential to know the base risk on which an uplift in risk was being calculated (medical reports) . . .

Audience research report, Oxygen Brand Consulting

An example of the BBC's successful reporting of risk factors can be found in its reporting of the same news story as the previous example. The World Health Organisation reported that a daily 50g serving of processed meat increased the lifetime risk of bowel cancer by 18%. This relative risk can be converted to an absolute risk by noting that around 6% of people develop bowel cancer in their lifetime. The 18% increase would therefore mean that seven people, or one extra, would get bowel cancer among 100 people eating 50g of processed meat each day. We were pleased to see that this was explained fully by BBC Online in its article "Processed meats do cause cancer", while BBC One's News at Six and Ten said:

But that increase is still only equivalent to an extra one in every 100 people getting cancer.

However, the audience research showed that participants still failed to understand their risk from processed meat, and the online article did not bring them clarity despite the additional information. For them, the issue came down to too much information but not enough contextualisation of the risk.

A report for the King’s Fund in 200381 found that

News coverage of health issues is seriously out of proportion with actual risks to health and fails to reflect mortality risks shown in health data . . . 8,571 people died from smoking for each story about smoking on the BBC news programmes studied. By contrast, it took only 0.33 deaths from vCJD to merit a story on the BBC news.

That was 13 years ago. Our impression is that there has been some improvement over this time. And we recognise too that new and emerging risks such as vCJD at the time, or ebola more recently, will, and should, receive media attention. However, the conclusion of the King’s Fund that there needs to be ”more consistently robust handling of data analysis by news media – particularly by non-specialist journalists – in accessible terms, to help lay audiences put risks into perspective” remains valid.

81 "Health in the News: Risk, reporting and media influence", Roger Harrabin, Anna Coote, Jessica Allen, King's Fund, September 2003


While the BBC is often better than other outlets at explaining risk, it by no means does so consistently well. Clearer guidelines and a more consistent approach are needed. As noted above, the BBC should make sure that it always includes baseline risk information: never report simply that eating X doubles your risk of cancer; say that it doubles the risk from, for example, 1 in 1,000 to 2 in 1,000. It is also vital that BBC journalists have immediate access to such calculations, as the baseline risk is often not present in the press release.
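Since the conversion is mechanical, it lends itself to a simple newsroom aid. The sketch below is illustrative only – the function and its naming are ours, not an existing BBC tool. It turns a baseline risk and a relative increase into the natural-frequency form recommended above, using the WHO processed meat figures and the 1-in-1,000 doubling example as test cases:

```python
# Illustrative sketch: convert a baseline risk plus a relative increase
# into a "people out of N" statement, as recommended in this review.

def as_natural_frequency(baseline: float, relative_increase: float, denom: int = 100) -> str:
    """Express a relative risk increase as counts per `denom` people."""
    before = baseline * denom
    after = baseline * (1 + relative_increase) * denom
    return (f"about {before:.0f} in {denom} anyway, "
            f"rising to about {after:.0f} in {denom} with the exposure")

# WHO processed meat example: 6% baseline, 18% relative increase.
print(as_natural_frequency(0.06, 0.18))            # about 6 in 100 ... about 7 in 100
# A "doubling" of a small risk: 1 in 1,000 becomes 2 in 1,000.
print(as_natural_frequency(0.001, 1.0, denom=1_000))
```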

Presentation

When it comes to numbers, the way we present them is fundamental to their reception. Whether they are hurled at audiences in undiluted streams, whether they are supplemented with visual aids, whether any emphasis is made with voice, body language or the position of the statistic in an online article – these considerations all greatly affect the way in which audiences will interpret the numbers. The audience research which accompanies this review demonstrates this, as the variety of ways numbers were presented in the stimulus material considered by the focus groups greatly affected individuals’ understandings of the information.

[Regarding the online report considered as part of the spending review case study82] This presentation was liked. It was felt to be methodical and clear. People claimed figures were easier to follow when embedded in separate bullet points rather than unbroken online text.

[Regarding the Newsbeat item on Radio 1 at 12.45am on 25 November 2015 on the same subject] The piece had a lively delivery, plenty of context, sparing use of figures and attempted to draw out personal relevance to listeners.

[Regarding the BBC News 24 item on television at 4pm on 26 October 2015 considered as part of the processed meat case study] 70g was [also] misheard several times as 17g, 50g as 15g and so on. There was a lot of similar sounding data, for example the evidence was based on 800 studies, and 900 "agents" were evaluated (in itself unfamiliar vocabulary).

Audience research report, Oxygen Brand Consulting

We saw evidence that many areas of the BBC give careful thought to the way in which statistics are presented for audiences, particularly by drawing on the resources provided by the Visual Journalism team or visuals teams in the nations. We saw some examples of excellent visual presentations of statistics which had been produced with such teams' assistance, and there seems to be an appetite among audiences for more (simple) graphics.

82 http://www.bbc.co.uk/news/uk-politics-34908612


From late 2014 to early 2015 the BBC website carried an online "Weekly A & E Tracker"83 which allowed readers to track the NHS's performance on waiting times in Accident and Emergency departments, including against the four-hour target, in their local area. It presented the statistics visually in a clear, simple way and included analysis by the Health Correspondent.

83 http://www.bbc.co.uk/news/health-25055444


The BBC's online Care Calculator was launched in January 2015 as a guide to the care system for the over-65s84. Users can submit their postcode and find out how much each service costs wherever they live in the UK. The BBC used significant resources, both internal and external, to test the statistics rigorously before the calculator went live. It is an example of the BBC making efforts to provide audiences with helpful information to supplement news on the care system, and it recently won a Royal Television Society award for News Technology85.

However, we also saw instances where the BBC could have improved the way it presented statistics to audiences, some of which we have already documented in this report and which are particularly set out in the audience research. Indeed, the research notes that

Although the BBC’s presentation of data is generally perceived to be good or very good, audiences feel there is still room to improve and simplify to help people interpret and “hear” it better.

Audience research report, Oxygen Brand Consulting

The presentation of statistics in coverage, of course, varies depending on many factors. One of the most important of these is platform. The same statistic might be covered in a very different way depending on whether it features in television, radio or online coverage. Whether it’s simple and to-the-point, part of a broad explanatory piece, supplemented by additional information (visual or verbal) – these considerations vary depending on the medium through which it is communicated, and the audience for whom it is intended. Some of those from whom we received evidence told us radio was considered to be a less appropriate medium for including numbers than other platforms (since they can be difficult for audiences to absorb audibly). We understand the BBC has conducted audience research

84 http://www.bbc.co.uk/news/health-30990913

85 https://rts.org.uk/award/television-journalism-awards-2016


which shows, for example, that including substantial amounts of statistics in a piece often loses an audience. The audience research for this project suggested that this could be well understood by current radio practitioners, whom we saw often exercising good judgement in this area.

Radio is not expected to present and use as many numbers as TV. It is expected to be quite selective and is in fact felt to be quite successful at presenting what numbers it does contain in a managed and digestible manner.

Audience research report, Oxygen Brand Consulting

The audience research also suggests that television might have a tendency to over-use statistics (there being an assumption that visual aids will guarantee an understanding).

Some found the clips from the main 6pm news on BBC One too “number heavy”, and our analysis indicated some TV packages were perhaps too ambitious for some viewers, both in the volume of numerical information they contained, and the separate strands of thought and argument they tried to include within an individual story. Some feel TV news reports don’t contain enough graphical representation of data and are too reliant on written and spoken numbers. In particular, viewers say they struggle to “keep up” and “tune out” when volumes of numbers are read out and shown on screen at a fast pace and in succession.

Audience research report, Oxygen Brand Consulting

Certainly, both the content analysis and audience research provide evidence for one of the most frequent contentions we heard – that online is often relied upon for a fuller presentation of figures already covered more briefly elsewhere. Hence, Cardiff found that in their initial one-month analysis, BBC online news coverage incorporated more references to statistics (almost one in three news items) than either radio or television coverage.

In our analysis of news generally and consistently across the eight case studies, online news was clearly the distinctive BBC platform in its use of statistics, regularly supplying background/context in major stories. As a platform it has more space to do so than radio or TV – but demonstrates an appetite to inform using statistics that other platforms could emulate.

Content analysis, Cardiff University

And this finding is reinforced by the audience research.

Online is seen as a source of definitive numerical explanations and graphics. It allows examination and review of facts at an individual pace. BBC Online is particularly valued by the educational community.

Audience research report, Oxygen Brand Consulting

Of course a key feature of online is that it can be read at the reader's own pace and if necessary reread and studied. There are also important logistical considerations at play – an online piece has more space and time to include explanatory information, whereas much television and radio content must necessarily be short and precise (especially for bulletins). For this reason, we were pleased to note occasions when pieces on TV and radio refer to further information online and would like to encourage this to be done more often. This was


brought out clearly in the audience research, particularly regarding the piece on Newsbeat on Radio 1 on 26 October 2015 considered as part of the tax credits case study:

Finally, the invitation to visit the Newsbeat app and see “exactly what this meant to you” was thought to be helpful and engaging, because it was an active invitation to refer across to a parallel medium where statistics are thought to be particularly readable and digestible. The invitation to interact between a short term news report and a longer term online source was lacking in the other clips, and was strongly advocated by respondents.

Audience research report, Oxygen Brand Consulting

Another factor which can often be thought to influence the presentation of numbers, regardless of platform, is the type of programme they feature in. An in-depth analytical current affairs programme will perhaps be assumed to be able to incorporate statistics in a very different way to a very short bulletin in a news broadcast. We were specifically told this by one senior BBC Editor, who put it this way:

There’s a huge difference in time and space to explain a specific statistic in different news outlets, so it’s a big challenge. For example, you can put more explanations and caveats in an 800 word online piece than in a three-minute item on network TV news.

This is also borne out by the audience research, which found that

The level of coverage is expected to be tailored to the length of the news report. Short reports are expected to carry less data.

Audience research report, Oxygen Brand Consulting

However, the content analysis for this review does not appear to support the contention that the presentation of numbers is always shaped by the type of programme. Cardiff analysed the use of statistics in a small sample of current affairs programmes, for instance, and found that

Perhaps surprisingly . . . vague or passing references were, proportionately speaking, more common in current affairs programming than in news (55% of references compared with 24% in more conventional news bulletins). While over one third of statistical references in news programmes contained some meaningful context, we identified only one instance of this in the current affairs programmes . . .

Content analysis, Cardiff University

Without detracting from the success of the BBC's presentation in some areas (notably its online visual presentation), we have seen earlier in this report, and also throughout the content analysis and audience research, examples of ways in which numbers can be used successfully on any platform, in either short or long broadcasts. In our view, more can be done to share knowledge of how audiences understand the statistics presented to them, and how this can be improved. The BBC should build on its existing training in this area, including highlighting the role of existing internal resources which could be better drawn upon to improve the presentation of statistics, and incorporating good practice from external media organisations nationally and internationally.


“Big numbers”

The use of big numbers was a particular area where we perceived that the BBC sometimes stumbles. Millions, billions, trillions – it can be hard for audiences to understand, process and distinguish between these words (for example, in terms of the national debt), particularly if they are not contextualised, for example as pounds per person per year. And it can be very difficult to communicate clear information when the numbers involved are inherently "big": current departmental spending is due to be cut by £12 billion between 2015–16 and 2019–20, NHS spending was £113.3 billion in 2014/15, the UK population grew to an estimated 64.6 million in 2014, and so on. And yet the audience research suggests that:

"Most [participants] found it particularly hard to visualise 'billions' or very large distances."

Audience research report, Oxygen Brand Consulting

As one respondent to the audience research put it,

“To be honest, and I am embarrassed to admit this, I find it hard to envisage the difference in scale between £1million and £1billion. If this [programme] was for a different audience (ie Radio 1 not Radio 4) it would be helpful to remind people what these numbers mean. . .”

On 26 October 2015 the News at Six on BBC One ran an item on the Government’s tax credits proposals. The item tried to explain the figures involved and their implications for individuals and families. It included a visual analysis by the Business Correspondent which used many numbers and supplemented them with pictures of families, pound coins etc. According to the audience research report, the first part of the item appears to have succeeded in giving an idea of the scale of potential loss or gain. But the second part referred several times to numbers in billions (“4 billion”, “12 billion” and “1.4 billion” were all used within a few seconds of each other, two visually represented on the screen but one not). The Audience Research report describes this as follows:

At this point, when talking about the later introduction of the new wage the report also reverted to referencing large and less personally identifiable numbers in billions. A common theme in the research was that respondents found numbers in the billions hard to imagine.

Audience research report, Oxygen Brand Consulting

The audience research report suggested that some statistics can be interpreted as sensational simply because they are large. Oxygen reported that readers often do not go beyond online headlines and first paragraphs, which may contain a "shocking" figure, to see the caveats. On TV, the emotional interpretation of the numbers can dominate.


Our view is that it is possible to communicate clearly when using "big numbers" – it simply requires forethought to put the figures into an accessible format. For example, geographical areas which are hard to imagine can be put into perspective by comparing them to a better-known area (although "the size of Wales" is a phrase we hear far too often!). We were told that some successful reporting of big numbers is achieved at a national level – for example, a Scottish stakeholder suggested that Scottish news organisations covered the big statistics involved in the energy industry better than other parts of the UK, perhaps because so much of that industry's employment is based in Scotland. The BBC does recognise the need for vigilance in this area, and our attention was drawn to some helpful training pieces on the BBC Academy website about the use of "big numbers", such as an excellent article by Robert Peston and Mark Easton86 which considered some of the issues involved. For example:

Did you know that millions and billions are actually a very long way apart? A million seconds is about 12 days. A billion seconds is 32 years. And a trillion seconds is 32,000 years! Whenever possible it is good to get big figures in proportion.

This should be required reading for those tasked with presenting statistics.
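The arithmetic behind such comparisons is trivial to script. As a minimal sketch – the framing is ours, while the seconds comparison and the NHS spending and population figures are those quoted above:

```python
# Two ways to bring "big numbers" to human scale.
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY

for label, n in [("A million", 1e6), ("A billion", 1e9), ("A trillion", 1e12)]:
    years = n / SECONDS_PER_YEAR
    if years < 1:
        print(f"{label} seconds is about {n / SECONDS_PER_DAY:.0f} days")
    else:
        print(f"{label} seconds is about {years:,.0f} years")
# About 12 days, 32 years and 31,688 years respectively
# (the Peston/Easton article rounds the last to 32,000).

# Re-expressing a national total as pounds per person per year:
nhs_spend = 113.3e9   # NHS spending, 2014/15, as quoted above
population = 64.6e6   # estimated UK population, 2014, as quoted above
print(f"NHS spending is roughly £{nhs_spend / population:,.0f} per person per year")
```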

Transparency and correction

In order to evaluate the importance or validity of a statistic, audiences often need to understand, at least to some extent, where it came from, who is using it and how it has been generated – in other words, the provenance of a statistic needs to be transparent. Good practice suggests that these descriptors should be routinely presented, although not necessarily as a full suite on every occasion. In some cases, of course – such as a fleeting reference in an interview – it is not possible to give all this information. But where the story is the statistic, transparency is vital, as attribution can greatly affect the weight audiences give to particular figures. And yet there appear to be occasions where statistics in BBC content are not clearly attributed, or where a link to the direct source (in an online article) is not provided. For example, when the audience research groups were shown the clip from BBC One's News at Six on 26 October 2015 in the context of the tax credits case study:

Most didn’t grasp that the analysis in this report came from the IFS, not the BBC, so there were some challenges on BBC impartiality.

Audience research report, Oxygen Brand Consulting

This is of course not unique to the BBC. But the content analysis produced by Cardiff found that in their sample period, 43% of statistical references in news coverage were claims asserted by journalists themselves rather than attributed to an external source.

86 http://www.bbc.co.uk/academy/journalism/article/art20130702112133478


Across all media, in other words, a significant proportion of statistical references were made without identifying the source of those statistics.

Content analysis, Cardiff University

Although Cardiff also found that, when statistical references were sourced, this was in the overwhelming majority of cases done clearly, there were also statistical references without any attribution. Indeed, during the audience research it was notable that members of focus groups would attribute a statistic to the BBC itself unless it was clearly sourced – for example, the statistic that crime figures have been falling.

As well as attributing statistics, audiences also need to know how they have been generated. It is difficult to fully appreciate a statistic's significance with little or no information on the workings behind it. This includes issues such as sample size and representativeness, how data were collected, geographical relevance, time periods and so on.

In March 2012 BBC Online reported that the Mumsnet website had launched a rape awareness campaign following research that it said revealed the "hidden scale of rape and sexual assault" ("'Hidden scale of rape' highlighted by Mumsnet campaign"87). The figures came from "write-in" responses on the Mumsnet website from 1,609 women, and the methodology of the survey was later criticised by organisations such as Straight Statistics as unrepresentative of women's true experiences.88

Some of these issues are already recognised in the BBC's Editorial Guidelines, which, for example, say this in respect of reporting survey results:

We should exercise appropriate scepticism when reporting the results of surveys commissioned or carried out by other organisations and, where necessary, include a description of the methodology used. Care is required, particularly in news output, not to report such surveys in a way which leads our audience to believe they are more robust than is actually the case.

BBC Editorial Guidelines on Politics, Public Policy and Polls

87 http://www.bbc.co.uk/news/uk-17338031

88 http://straightstatistics.fullfact.org/article/mumsnet-rape-survey-take-two


In December 2014 the Visual Journalism team worked with King's College London to produce a piece for the BBC website, "Jihadism: tracking a month of deadly attacks"89. The article reported that Jihadist attacks killed more than 5,000 people in November 2014 and incorporated a variety of maps (interactive and static) and bar charts which set out the number of deaths in specific countries, as well as expressing them as a proportion of total deaths in those areas. The article specifically included a link to the methodology behind the research ("how the data was gathered"), which set out issues such as "how robust are the figures?" and "what is the raw data?"

Sometimes the BBC, or someone it cites, simply gets it wrong. How successful is the BBC at spotting mistakes, or receiving feedback on them, and issuing swift corrections? How does it let audiences know when corrections have taken place? It appears that the BBC has prioritised responsiveness to mistakes in recent years. We note that the BBC has a specific Corrections and Clarifications webpage90, introduced following a review of BBC governance conducted in 2011 by the BBC Trust91. This includes apologies, significant corrections, statements and responses to issues, and findings by the BBC's Editorial Complaints Unit and by the BBC Trust. The BBC also appointed a Chief Complaints Editor in 2012 (now the Chief Complaints Adviser) to oversee the BBC's complaints process (the BBC receives over 250,000 complaints contacts each year).

89 http://www.bbc.co.uk/news/world-30080914

90 http://www.bbc.co.uk/helpandfeedback/corrections_clarifications

91 http://downloads.bbc.co.uk/bbctrust/assets/files/pdf/about/how_we_govern/bbc_governance.pdf


These measures appear to have been helpful in increasing transparency. We note, for example, that at the time of writing 12 entries of "corrections and clarifications" were listed on the relevant webpage for February 2016 alone, some of which specifically involved stories which included statistics.

“3 February 2016: Correction - The Town That Took On The Taxman, BBC Two (broadcast 20 January 2016)92

In this programme it was stated that Amazon paid “just £11.9m in corporation tax last year (2014) on £5.3 bn of sales. That’s 0.0002%”. In fact this represents 0.22% by proportion.”
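The corrected figure is easy to verify, which underlines how readily such slips could be caught before broadcast; a one-line check:

```python
# Verifying the corrected figure: £11.9m of corporation tax on £5.3bn of sales.
tax, sales = 11.9e6, 5.3e9
print(f"{tax / sales:.2%}")   # 0.22% - roughly a thousand times the 0.0002% broadcast
```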

Our attention was also drawn to the existence of active and lively social media communities. If the BBC gets it wrong, viewers and listeners can more easily check and challenge via social media. And they do, with some suggesting that the BBC picks up on this quickly and revises its reporting accordingly. And yet we received evidence that there remains concern in some quarters over the speed with which the BBC issues corrections when it gets the numbers wrong, and the transparency with which it informs audiences that changes have been made (for example to online articles). Organisations such as Full Fact told us they frequently experience difficulty in accessing the BBC's complaints system and have sometimes received slow responses to the issues they raise. Indeed, we understand that the misreporting of the number of successful race discrimination cases relating to employment in the last five years, which we referred to earlier, took some time to be corrected. This appears to be contrary to the criterion set out in the BBC's Complaints Framework93 that "the complaints process should be quick and simple".

92 http://www.bbc.co.uk/corporate2/helpandfeedback/corrections_clarifications/corrections_feb_2016#

93 http://downloads.bbc.co.uk/bbctrust/assets/files/pdf/regulatory_framework/protocols/2014/e3_complaints_framework.pdf


On 16 April 2014 the ONS announced that earnings were rising at the same rate as prices. The BBC initially reported this online as wages rising faster than prices. However, it was too early to say that at that point, as the inflation and earnings figures covered different time periods and not all the information was available. The BBC amended its headline within hours but did not inform readers that a correction had been made.

As noted earlier, it is unsurprising that errors or missed opportunities will occur in BBC coverage, given the sheer quantity of the output it produces. But building on its transparency and responsiveness will help to maintain the BBC's accountability.


Some thoughts on the future

The rise of data journalism

We noted earlier that the BBC Academy (along with external training providers and academic institutions) has recently focused on giving journalists additional training in skills to interrogate, question and interpret data and statistics, and to be confident in deploying them. It will be vital to increase such capabilities to help develop new forms of investigation, spot new and different stories (including at a local level) and help debunk the claims of others. But there are risks too in "data journalism". As data become ever more available, so the risk increases of misleading claims and conclusions, whether deliberate or inadvertent.

. . . big data do not solve the problem that has obsessed statisticians and scientists for centuries: the problem of insight, of inferring what is going on, and figuring out how we might intervene to change a system for the better. . . “Big data” has arrived, but big insights have not. The challenge now is to solve new problems and gain new answers – without making the same old statistical mistakes on a grander scale than ever.94

“Big data: are we making a big mistake?” Tim Harford, economist, journalist and broadcaster

In this climate, the basic ability to:

• beware: check and challenge. Ask: who is saying this and why? What are they telling me – and what aren't they? Check the facts – a basic tenet of good journalism
• check what other sources are saying
• ask how the data are produced. Are they in the public domain? If not, why not?

is an essential part of every journalist's job. We were told many times during this review that journalists may well perceive the ability to critically analyse information as the basis of their profession when it comes to literary matters (arguments, ideas, contentions, statements). But when it comes to statistics, some can struggle to find the confidence to research or confront the numbers in front of them. We were surprised how many times we heard this as an issue, both from BBC staff members (current and former) and from external stakeholders. As the audience research noted:

Some [people] have a “psychological block” when dealing with figures. Those with low confidence sometimes switch off and make little effort to understand. They become panic stricken or “number blind” when asked to comment on numbers.

Audience research report, Oxygen Brand Consulting

The evidence we received suggests the same can be said of some journalists and programme makers.

94 http://timharford.com/2014/04/big-data-are-we-making-a-big-mistake/


Audience research participants were asked to look out for spontaneous challenges on these grounds in their pre- and post-research diaries. Their observation was that they had not found any spontaneous or "on the hoof" challenges by journalists to interviewees on such issues as sample size, sources, error margins, methodology and so on. On the other hand, they did observe many challenges on non-statistical matters.

Audience research report, Oxygen Brand Consulting

In our view, this is not sufficient. It is not enough to say "I don't do data" just because one does not happen to have a science or maths background. Publicly funded journalists should have the skills to work with the "language" of public service, which is, increasingly, data. And they should have the confidence to treat it with the same journalistic scepticism as any other source of information. We noted that the More or Less team, whom many cited as an expert resource within the BBC to which they might turn for advice, do not have substantial academic qualifications in statistics – they are simply journalists who are not afraid of using numbers. We understand from informal conversations with the BBC Executive that confidence and critical reasoning in data journalism is already recognised by the BBC as an important area for future focus. It is our view that development in this area is likely to address a large number of the issues we have highlighted in this report concerning how the BBC uses statistics to further its accuracy and impartiality. As Sir Tim Berners-Lee has said:

Journalists need to be data-savvy . . . [it’s] going to be about poring over data and equipping yourself with the tools to analyse it and picking out what’s interesting. And keeping it in perspective, helping people out by really seeing where it all fits together, and what’s going on in the country.95

Developing an appropriate capability in this area will not be easy – it will require a step change in the culture of journalism within the BBC and an effort to upskill and resource accordingly. Leadership, teamwork and collaboration between journalists, developers and statisticians are essential, with clear signals and incentives in place. We hope that the BBC's attention to this issue will give future journalists and programme makers the skills to interrogate numerical information more effectively and consistently when putting together content. We would like the BBC to develop plans for how to build such capability.

95 http://www.theguardian.com/media/2010/nov/22/data-analysis-tim-berners-lee


List of contributors

BBC

• Ade Adeluwoye – Mentee, BBC Mentoring Scheme
• Ruth Alexander – Senior Journalist, Radio Current Affairs
• Gavin Allen – Controller, Daily News Programmes
• Ric Bailey – Chief Adviser, Politics
• Peter Barnes – Senior Elections and Political Analyst
• Michael Buchanan – Senior Broadcast Journalist
• Jasmin Buttar – Editor, Business and Economics
• John Campbell – Economics and Business Editor, Northern Ireland
• Richard Clark – Editor, Radio Newsroom
• Owain Clarke – Health Correspondent, BBC Wales
• Chris Cook – Policy Editor
• Sarah Dickins – Economics Correspondent, BBC Wales
• Mark Easton – Home Editor
• Amanda Farnsworth – Head of Visual Journalism, Deputy Controller 24/7 News
• Douglas Fraser – Business and Economy Editor, BBC Scotland
• Alex Freeman – Series Producer, Trust me I'm a doctor (BBC Two)
• Simon Hamer – Assistant Editor, business online and Today programme
• James Harding – Director, BBC News
• Alison Holt – Health Correspondent
• Bella Hurrell – Assistant Editor, Visual Journalism
• David Jordan – Head of Editorial Policy and Standards
• Ben King – Senior Broadcast Journalist, Business Unit
• Tamara Kovacevic – Senior Broadcast Journalist, Analysis and Research
• Charlotte McDonald – Broadcast Journalist, Radio Current Affairs
• Brian Meechan – Business Correspondent, BBC Wales
• Ransome Mpini – Data Analyst and Developer
• Mark O'Callaghan – Head of News and Current Affairs, BBC Wales
• Julian O'Neill – Business Correspondent, Northern Ireland
• Piers Parry-Crooke – Deployment Editor, Business and Economics
• Hugh Pym – Health Editor
• Anthony Reuben – Broadcast Journalist, former Head of Statistics
• Garmon Rhys – Deputy Head of News & Current Affairs, BBC Wales
• Martin Rosenbaum – News producer
• Carol Rubra – Assignment Editor
• James Stephenson – News Editor
• Wesley Stephenson – Senior Broadcast Journalist, Radio Current Affairs
• Richard Vadon – Assistant Editor, Radio Current Affairs
• John Walton – Senior Broadcast Journalist, Visual Journalism
• Andrew Wilson – Head of Journalism Foundation, Leadership and Personal Development, BBC Academy
• Sam Woodhouse – Head of Research and Analysis


• Mark Wray – Head of Core Training, BBC Academy
• Jon Zilkha – Controller, News 24/7 and Digital
• 1 other individual who submitted written evidence


External

• Phoebe Arnold – Senior Communications Officer, Full Fact
• Sam Ashworth-Hayes – Researcher, Full Fact
• Mevan Babakar – Senior Communications Officer, Full Fact
• Michael Blastland – writer and broadcaster
• Professor Jonathan Baker – Professor of Journalism at Essex University, former BBC Head of News and Head of BBC College of Journalism
• Derek Bond – Member, Northern Ireland Statistics Advisory Committee
• Norman Caven – former Northern Ireland Registrar General and Head of Northern Ireland Statistics and Research Agency
• Emily Craig – former employee of Full Fact (currently Researcher, BBC)
• Diane Coyle – economist, former BBC Trustee
• Sir Andrew Dilnot – Chair, UK Statistics Authority
• Stephanie Flanders – former BBC Economics Editor
• Miles Fletcher – Head of Media and Public Relations, Office for National Statistics
• Amy Hawkins – Executive Assistant, Full Fact
• Scott Heald – Head of Profession for Statistics for NHS National Services Scotland
• Stephanie Howarth – member of Government Statistical Service
• Glyn Jones – Chief Statistician, Welsh Government
• Professor Steve Jones – author, BBC Trust review of impartiality of BBC's science reporting, 2011
• The Lord Lipsey – Co-Chair of the All Party Parliamentary Committee on Statistics
• Toby Mason – Head of Communications, Welsh Government
• Professor Kevin McConway – Professor of Applied Statistics, Open University
• Philip McDonagh – Chair, Northern Ireland Statistics Advisory Committee
• Conor James McKinney – Senior Researcher, Full Fact
• Chris Morris – Member, Northern Ireland Statistics Advisory Committee
• Will Moy – Director, Full Fact
• Laura O'Brien – Researcher, Full Fact
• Joseph O'Leary – Senior Researcher, Full Fact
• Robert Peston – former Economics Editor, BBC
• Jonathan Portes – Principal Research Fellow, National Institute of Economic and Social Research
• John Pullinger – UK National Statistician, Chief Executive, UK Statistics Authority and Permanent Secretary
• Professor Gareth Rees – Cardiff School of Social Sciences, Cardiff University
• Professor Ian Rees Jones – Professor of Sociological Research, Cardiff University
• Amy Sippitt – Senior Researcher, Full Fact
• Professor Karen Turner – Director of the Centre for Energy Policy at the University of Strathclyde International Public Policy Institute
• Professor Jonathan Wadsworth – Economics Department, Royal Holloway College, University of London
• Amy Wilson – Head of Census Statistics, National Records of Scotland


• Professor Simon Wren-Lewis – Blavatnik School of Government, Oxford University
• Academy of Social Sciences
• British Academy
• Economic and Social Research Council
• Joseph Rowntree Foundation
• Market Research Society
• Nuffield Foundation
• Royal Statistical Society
• Royal Statistical Society's Statistics User Forum
• Science Media Centre
• UNISON
• 9 other individuals


Appendix 1

The Royal Statistical Society’s “A dozen rules of thumb for journalists”

This document was prepared for the RSS Science Journalism Programme with assistance from Professor David Spiegelhalter and David Walker. It is not intended to be a prescriptive statement of what journalism students ought to know about statistics; rather, it is a guide to what those visiting media colleges might like to cover.

A dozen rules of thumb for journalists

Numbers are compelling, but treacherous. They can make a story, but they are also open to misunderstanding and manipulation. Caution and a preparedness to check and check again should be a journalist's watchwords.

1. You come across a number in a story or press release. Buyer beware. Before making it your own, ask who cooked it up. What are their credentials? What is their pitch? Do we have alternative evidence; what numbers are they not showing us; why this number, now? If the number comes from a study or research, has anyone reputable said the work is any good?

2. Sniff around. Do the numbers refer to a whole group of people or things, or just a sample of them? If it's a sample, are the people being questioned or the things being referred to fairly representative of the wider group? Say a company is claiming something applies to the population at large. If they mean it is a sample of the population, beware. A panel of internet users, say, that the company goes back to time and again may not be representative – not everyone uses the internet. Organisations use samples based on their own mailing lists, or on people who have received a free sample of their product, and the samples may be biased.

3. More probing. What questions were the sample asked? Wording can hugely influence the answers you get. In a jobs survey, our understanding of what it means to ‘be employed’ may differ; likewise in a crime survey our sense of what is ‘violent’. The public’s understanding may not correspond with the survey researcher’s. Might a pollster’s choice of words have led people into giving a particular and slanted response?

4. A single number is often used to sum up a group: the average. But different averages measure different things. Here are some definitions. The mean is extremely sensitive to highs and lows: Bill Gates coming to live in the UK would push up mean wealth.


The median tells us, for example, the income of a person at the midpoint – half the population get less, half more. Comparing earnings, the mode tells us the salary most people earn.

5. Editors like a sure thing, but numbers come with uncertainty. We need to be sure the number on offer is not just due to chance. With a sample, check the margin of error, usually plus or minus 3 per cent. A poll saying 52 per cent of people are in favour of something is not a definitive statement: it could be 49 per cent. Uncertainty is inevitable when you are using a sample, so reputable polling companies state the margin of error to reflect their confidence that the sample represents the wider population. Remember the margin of error tells us only about the sampling, not whether the right questions were asked appropriately. (The arithmetic is sketched after this list.)

6. Beware league tables, except in sports reports. Manchester United is higher than Chelsea for a simple and genuine reason: the side has collected more points. With hospitals or schools, a single score is unlikely to be a valid basis for comparing one with another. A teaching hospital may have a worse score, but only because sicker patients are referred to it. Comparisons between universities or police forces are unreliable if the scores fall within margins of error. Midshires scores 650 on the ranking and Wessex 669: they could be performing at the same level or their respective positions could be reversed.

7. The numbers show a big increase or sharp decrease. Yet a single change does not mean a trend. Blips happen often. Peaks and troughs go away, so we have to ask whether a change in the numbers is just a recovery or return to normal after a one-off rise or fall. This is what statisticians refer to as ‘regression to the mean’. The numbers may come from a survey, such as ONS figures for household spending or migration. Is the change being recorded bigger than the margin of error?

8. After a controlled experiment (such as a trial of a new drug, based on a randomly chosen group, some of whom don't know they are getting a placebo), researchers are more confident in saying that a causes b. The numbers may show an association between two things, say obesity and cancer. But a correlation is not the same as saying obesity causes cancer. The connection may be spurious and explicable by a third or background factor. If children's use of mobile phones is associated with later behavioural disorders, the connection could be the parents, and the way their behaviour affects both things. If the numbers do suggest an association, we have to try to assess whether it is plausible, on the back of other evidence. Finding a link may stimulate further study, but ought not itself to be the basis for action, let alone some new government policy. Recommendations for changing daily behaviour such as eating should not be based on speculative associations between particular food and medical conditions.

9. A question to pose of any number is ‘out of how many?’ Some events are rare – such as the death of a British child of junior school age. That’s why they are news, but that’s also why they have to be put in context. Noting scarcity value is part of good reporting, which tells us about an event’s significance. The meaning of an event for an individual or family has to be distinguished from its public importance.

10. Billions and millionths are hard to grasp. We take in figures better if they are human scale. One comparison is between a number and the whole UK. Another is to capture the effect of an event or behaviour on an individual. Colourful comparisons can make risk intelligible: the risk of dying while being operated on under a general anaesthetic is on average the same as the risk of being killed while travelling 60 miles on a motorbike.


11. Good reporting gives a balanced view of the size of the numbers being reported. Better to focus on the most likely number rather than the most extreme, for example in stories about the effects of a flu pandemic. 'Could be as high as' points to an extreme; better to say 'unlikely to be greater than'. Think about how your audience will perceive a number.

12. Risk is risky. ‘Eating bacon daily increases an individual’s lifetime risk of bowel cancer by 20 per cent.’ Another way of saying that is: out of 100 people eating a bacon sandwich every day one extra person will get bowel cancer. Using the first without noting the second tells a story that is both alarmist and inaccurate. If the information is available, express changes in risk in terms of the risks experienced by 100 or 100,000 people.
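As promised under rule 5, the margin of error it quotes can be computed directly. A minimal sketch (our notation, not part of the RSS document) using the standard formula for a proportion from a simple random sample:

```python
# Approximate 95% margin of error for a sampled proportion: 1.96 * sqrt(p(1-p)/n).
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(p=0.52, n=1000)
print(f"52% plus or minus {moe:.1%}")   # about 3.1 points: the 52% could indeed be 49%
```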

Endnote:

Guides and pointers on how to get the best out of stats and data abound. RSS getstats has resources. The switch from print to digital brings opportunities to present numbers more dynamically and imaginatively, for example in scatter plots. Graphics can show a trend. Stacked icons in graphs can show effects on 100 people. The same rules of thumb apply whatever the medium: is the graphic clear? Does it tell the story that is in the text? A pretty picture is not necessarily informative.


Appendix 2

Making Sense of Statistics – 10 Golden Rules (handed out at the end of BBC training sessions)

Look on statistics as your friends, providing you with facts and evidence on which to base your stories. But treat them with caution and respect.

1. Let the statistics drive the story and not the other way about. Taking a theory and trying to find statistics that fit it is a recipe for disaster, and one of the biggest causes of inaccuracy and misrepresentation. Make sure that whoever has provided the figures hasn’t fallen into that trap.

2. Too good to be true? If a story looks wrong, it probably is wrong. Don’t take things at face value, especially if you are looking not at the raw figures, but at how someone else has interpreted them or written them up.

3. Context. Look at the background, what is being measured and over what period of time. Could the chosen start and end date have an effect on the findings? Remember that many important social and other changes happen over long periods of time. Beware of something that looks like a dramatic shift.

4. Check your source. Is it likely to be someone with a vested interest in interpreting findings in a particular way?

5. Look at the methodology. All responsible producers of statistics will tell you how they have been produced, the size of the sample and the margins of error. Beware of people seeking publicity using poor surveys, self-selecting samples or partial selection from someone else’s data. For something which affects more than 20,000 people, you’ll need a survey sample of at least 1,000, ideally canvassed face to face.

6. Compare like with like - both over time and between different sources. Just because two sets of statistics look alike, it doesn’t always mean you can compare them – methods and samples can differ. Comparisons between different countries are especially difficult.

7. Correlation and causation. Just because two facts are sitting alongside each other and look as though they might be connected, do not assume that they are. There may be no connection between them at all, causal or otherwise.

8. Big numbers and little numbers. Seen in context, each can look very different. A risk going from 0.01 to 0.02 might be a ‘doubling’ but it’s still a very small risk. A billion pounds of health spending might sound like a lot, but looks less so if it’s expressed as less than 1% of the total budget. Make sure you look at both the percentage and the raw numbers.


9. Don’t exaggerate. To say the cost of something ‘could be as high as’ a large sum might be strictly true but could be misleading if it’s a worst case scenario. The central estimate is the most likely to be accurate.

10. Averages. The ‘mean’ is all the figures added together and divided by the number of figures. It is the most commonly used. The ‘median’ is the middle figure within a range. It often gives a fairer picture. Understand the difference and be clear which you are using.
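The difference rule 10 describes is easy to demonstrate. A quick illustration with invented salary figures (not real data):

```python
# One outlier drags the mean sharply upwards while the median barely moves.
import statistics

salaries = [18_000, 21_000, 24_000, 27_000, 30_000]
print(statistics.mean(salaries), statistics.median(salaries))   # 24000 24000

salaries.append(1_000_000)   # one very high earner joins the sample
print(statistics.mean(salaries), statistics.median(salaries))   # ~186667 25500.0
```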

Never be afraid to ask advice from a statistician about how to understand statistics.

With thanks to: Office for National Statistics, More or Less, Anthony Reuben


Executive Response to Trust Impartiality Review of Statistics

BBC News welcomes this constructive analysis of the use of statistics. The use of data is transforming the way we can report politics and public services, business and sport, the society we live in and the choices we make. This report on statistics comes, therefore, at a critical moment, when the BBC is considering how it uses the growing volume of data to inform its news coverage both in the UK and around the world.

The report rightly emphasises the importance of accuracy and context in the reporting of statistics. We are pleased that it recognises how much consideration and care the BBC puts into the use of statistics on air and online. We appreciate, too, the expectation by audiences that the BBC can be trusted to treat statistical information with scrutiny and circumspection. In a world in which the numbers are so regularly abused, the BBC knows that people come to us for reporting and analysis they can trust.

We're particularly pleased that the report highlights excellent programmes like 'More or Less'. We also think that Reality Check is exceptionally good in cutting through the clamour around facts and figures – and we accept the recommendation that it should continue beyond the EU Referendum. We will make it a permanent fixture in BBC News, reviewing the shape of its work and audience appreciation annually between now and the next General Election.

There are also individual editors and journalists across the BBC who have produced detailed storytelling through the use of data. They have proved themselves as important sources of reference for important continuing issues, alongside the valuable briefings supplied by Analysis and Research in the BBC's Political Unit.

In response to this report, though, the most important future development in News is data journalism. The BBC has a plan to develop data journalism as a cornerstone of its reporting and analysis in all areas of News coverage. We are confident it will help us meet the suggestions and recommendations of this review. More than that, we believe that we should be using data not simply to illustrate the BBC's reporting but to prompt investigations and to inform the focus of our coverage. If we can marry intelligent statistical analysis with powerful story-telling, we will be able to understand better the world we live in by reporting not just what people say but what the data tells us people actually do.

There are three areas we're working on. The BBC has already announced a partnership with the News Media Association to create a hub for data journalism for the benefit of local and regional news organisations across the UK, including the BBC's English Regions. In addition, the BBC Future of News strategy sets out plans to create a centre for data journalism linked to newsrooms in the four Nations under a UK Head of Statistics.

Together, the hub and the centre will act as hubs of expertise for journalists across the BBC. Their combined task would be to:

Gather, interrogate and clean data

Produce data-based stories

Provide support in assessing credibility of data-based stories offered to BBC News

Be a daily fact checker for output

Undertake investigations based on data sets

Curate long-term data sets (crime, education and health statistics)

Third, we will look to partner with centres of excellence such as universities and research institutes, building on previous collaboration with the Universities of Oxford, Loughborough and Westminster. We agree with the suggestion in this report to promote guidance from bodies such as the Royal Statistical Society and other centres of expertise in using and handling statistical information.

We appreciate that we need to ensure journalists across News have the skills to make the most of statistics and complicated data sets. The BBC Academy will devise online training in the use of statistics for journalism, for rollout across BBC News. This will significantly heighten awareness and knowledge of the way in which statistics should be interpreted and presented.

The audience research for this review supports the need for journalists to provide context in the presentation and use of statistical data, and to reflect this in reporting and discussion. We agree that analysis and explanation of data sets is critical to allowing viewers, listeners and readers to arrive at informed opinions. The BBC recognises the importance of its presenters and programmes testing and rebutting misleading uses of statistics, whether by politicians, public figures or business executives. The review urges the BBC to be better at this and to give more guidance to viewers, listeners and readers on the background to official releases. Similarly, we note that the review is critical of the presentation of competing sets of statistics, and the claims based on them, from either side of a political argument without interpretation or guidance. It is our role to give our audiences the best information so that they can form their own judgments. The News Editor, working in conjunction with the Head of Statistics, will seek to review every six months the progress of programmes and presenters in holding people in public office to account. In the first six months the focus will be on the Today programme, Breakfast and 5 live Drive.

The review notes the Editorial Policy Guidance on Reporting Risk but suggests that clearer Guidelines and a more consistent approach to reporting risk are needed. As the review indicates, the reference to reporting statistics and risk (Accuracy 3.4.21) will be reviewed in the next edition of the Editorial Guidelines.

Meanwhile, Guidance on the Use of Statistics will be added to the existing Guidance on Reporting Risk (http://www.bbc.co.uk/editorialguidelines/guidance/reporting-risk) and will complement the Guidelines and Guidance on the reporting of opinion polling and surveys (http://www.bbc.co.uk/editorialguidelines/guidelines/politics/opinion-polls). When these are distributed, there will be an opportunity to remind programme makers of the requirement to report statistics accurately.