Forecasts, projections or estimates?

Communicating uncertainty: how to better understand an estimate?

Tom van Vuren
11 April 2024

 

Sometimes you hear a speaker in a podcast and say to yourself: “Oh, yes! I know exactly what you mean!” It happened to me when listening to Statistically Speaking, the podcast of the Office for National Statistics (ONS), on the topic “Communicating Uncertainty: How to better understand an estimate”.

The interviewer was Miles Fletcher, Head of Media and Public Relations at the Office for National Statistics, and the speakers were Robert Chote, Chair of the UK Statistics Authority; Craig McLaren of the ONS; and Mairi Spowage, Director of the Fraser of Allander Institute. They did not discuss transport forecasts, but something not too dissimilar and probably just as controversial: GDP estimates, which of course feature at least indirectly in our work. These are their key messages.




All important statistics, including forecasts, are estimates

They may be based on huge datasets and calculated with the most robust methodologies we have available. But at the end of the day, they are statistical judgments, subject to a degree of uncertainty both in the input values and in the internal calculations.

What's in a name? A few years back, the Department for Transport renamed the National Road Traffic Forecasts as the National Road Traffic Projections. Merriam-Webster's definitions of the two terms aren't that different; to me as a non-native speaker, a projection feels more rooted in current trends than a forecast. Sadly, the term Road Traffic Estimates is reserved by the DfT for current-year values, rather than for the future.


Communicating uncertainty is key

It is not just in transport planning that there is a tension between being open about these uncertainties in statistics and worrying about how that openness may affect how the data are subsequently perceived and used.

But Chote is adamant: “It's important when you're presenting and publishing statistics that you help people engage. And the more that producers (of forecasts and statistics) can do to help people engage in an informed and intelligent way, the better.

"Estimates need to be near enough to be reliable, but at the same time, we need to know about the uncertainty. You want to try to guide people by saying that it is an estimate; there's no guarantee that this is going to exactly reflect the real world. And the more you can do to put some sort of numerical context around that, the more reliable a basis you have for people who are using those numbers, particularly if those statistics may be revised in future as you get more information."

Adds Spowage: "The more you can do to set expectations of users that this is a normal, core part of estimation, the better when these revisions inevitably happen."

And McLaren: "This is why it's important not to focus on just a single estimate. We at ONS don't formally produce what we call range estimates. What we do is look at our historical record of GDP revisions, and overall it's quite a considered picture. Then the phrasing becomes very important, of course, to reflect that these are estimates, our best estimate at the time, and they are expected to change."
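What might that look like in practice? Here is a minimal sketch, assuming a made-up record of past revisions to a first estimate. It is not the ONS's methodology, just an illustration of how a historical revision record can put an indicative numerical range around a new number.

```python
# A minimal, illustrative sketch (not ONS methodology): use a historical
# record of revisions to put numerical context around a new first estimate.
# All figures below are made up.
import statistics

# Hypothetical past revisions: final estimate minus first estimate,
# in percentage points of quarterly GDP growth.
past_revisions = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, 0.4, -0.3, 0.1, 0.0]

first_estimate = 0.6  # hypothetical new first estimate, % growth

bias = statistics.mean(past_revisions)     # typical direction of past revisions
spread = statistics.stdev(past_revisions)  # typical size of past revisions

# A crude empirical range: drop the single largest and smallest past revision.
low, high = sorted(past_revisions)[1], sorted(past_revisions)[-2]

print(f"First estimate: {first_estimate:.1f}%")
print(f"Past revisions averaged {bias:+.2f} pp, with a spread of about {spread:.2f} pp,")
print(f"so an indicative range is {first_estimate + low:.1f}% to {first_estimate + high:.1f}%.")
```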

Statistics change

U.S. Treasury Secretary William E. Simon (1927-2000) appears to have said in 1975: “I sometimes think that economists use decimal points in their forecasts to prove they have a sense of humor.” Perhaps we may borrow that phrase for transport modelling and forecasting as well – I wholeheartedly agree with Glenn Lyons in his crusade against false precision.

Forecasts are not just estimates; they inevitably change over time as new information becomes available and as assumptions are adjusted in response. This of course does not mean that forecasting is pointless. But throughout the use of the numbers that transport models produce (as with GDP forecasts), uncertainty, and the possibility of updating those numbers as better information arrives, must be recognised and communicated as a positive process rather than an excuse.

As Chote says in the interview: “That's not a bug, that's a feature of the system. It's a complex task, and inevitably the picture evolves.”

Look for other evidence

You don't have to rely on just the absolute numbers that the computer produces. With GDP estimates, perhaps more easily than with transport forecasts, you can learn from the direction and the size of the revisions that have been made to past numbers.

In transport we have access to similar evidence, for example the so-called hedgehog diagrams, ascribed to Phil Goodwin, which show how estimates (see what I did there?) of future growth in car traffic have evolved over the years. But also general trends, project evaluations (why are we so bad at these in transport?) and international comparisons.

I have called this triangulation in the past: use more than one source to give people a sense of how much confidence they should place in any given number produced at any given point in that cycle of evolution, as the numbers get firmer in the business case process.
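The hedgehog diagram is one familiar way of presenting that evolution. The sketch below, with entirely invented numbers, simply overlays successive forecast vintages on an outturn series; it illustrates the format only, and is not Goodwin's analysis or any published DfT series.

```python
# A purely illustrative "hedgehog diagram": successive forecast vintages
# overlaid on the eventual outturn. All numbers are invented.
import matplotlib.pyplot as plt

years = list(range(2000, 2021))
outturn = [100 + 1.0 * (y - 2000) for y in years]  # hypothetical observed traffic index

# Hypothetical forecast vintages: (base year, assumed annual growth in index points).
vintages = [(2000, 3.0), (2005, 2.5), (2010, 2.0), (2015, 1.5)]

plt.plot(years, outturn, color="black", linewidth=2, label="Outturn")
for base, growth in vintages:
    base_value = outturn[years.index(base)]
    horizon = list(range(base, 2021))
    forecast = [base_value + growth * (y - base) for y in horizon]
    plt.plot(horizon, forecast, linestyle="--", label=f"{base} forecast")

plt.xlabel("Year")
plt.ylabel("Traffic index (2000 = 100)")
plt.title("Illustrative hedgehog diagram of forecast vintages")
plt.legend()
plt.show()
```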

Different people will have different appetites for the technical detail around this. Testing with end users what they find helpful and what they don't is a valuable thing to do.

Numbers will be misused

Just like policymakers, forecasters may be under pressure to achieve certain outcomes, placing too much reliance on these estimates. To show progress against some policy objective? Or in support of a pet project?

The small print gets forgotten. Says Chote: “If you're trying to set policy in order to achieve a target for a particular statistic, at some point in the future, then having an understanding of the uncertainty, the nature of it, the potential size of it in that context, helps you avoid making promises that it's not really in your power to keep with the best will in the world, given those uncertainties”.

And that's an excellent point: we as modellers have no control over future developments, yet these become key model inputs and assumptions, and they are presented as immutable. Such uncertainties in the inputs generally outweigh the model's simplifications and the associated errors in the outturn statistics.

The danger is the way these statistics are then described in the media and elsewhere. Fletcher: “The numbers are invested by observers with more authority than they deserve, particularly, of course, if the numbers are going their way. Then obviously, you want people to believe they are 100% accurate.”

So what to do?

Uncertainty is topical, and it’s comforting to see that it’s not just transport modellers and our end users struggling with it. The podcast is a timely reminder that this uncertainty already enters our processes in some of the key input assumptions; and that economic forecasters, like us, face the challenge that important numbers change.

And let's not forget that society accepts this in weather forecasts: they get better as the day of the trip to the seaside, or the decision to hang out the washing, draws nearer. We should embrace that process: expect change; early predictions will evolve, not because the tools are wrong, but because the quality of the inputs improves. To paraphrase Chote: complaining about revisions in input data is like sailors complaining about waves in the sea. I'm afraid that is what you're dealing with.

The second point: communicate that uncertainty and create the narrative. Where do users look for the small print? Provide guidance on how reliable any number that you produce is, including references to other data points that support or even challenge it.

Much of the criticism of models, modellers and modelling focuses on the processes, and too rarely recognises and explores the underlying and critical uncertainty in input assumptions of processes that we can’t control.

Even scenario planning is at risk of relying on (admittedly multiple, but still) fixed sets of assumptions, be they prescribed (such as the Common Analytical Scenarios) or developed locally (such as Transport for the North's Future Travel Scenarios). Keeping such scenario assumptions fixed over time does not do justice to evolving insights, as discussed by the podcast participants.

And finally, we are not alone in facing the challenges that numbers change, even statistics that describe the situation today, and that inputs for the future should be updated over time. Are we confident enough to start calling our model outcomes what they really are: estimates?


Tom van Vuren is a Strategic Consulting Partner at Amey, Visiting Professor at the University of Leeds and Board Member at the Transport Planning Society
