Forecast Verification

Published on October 5, 2007

Author: Spencer

Source: authorstream.com

Content

Forecast Verification

Presenter: Neil Plummer, National Climate Centre
Lead Author: Scott Power, Bureau of Meteorology Research Centre
Acknowledgements: A. Watkins, D. Jones, P. Reid, NCC

Introduction

- Verification - what it is and why it is important
- Terminology
- Potential problems
- Comparing various measures
- Assisting users of climate information

What is verification?

- "check truth or correctness of"
- "process of determining the quality of forecasts"
- "objective analysis of the degree to which a series of forecasts compares and contrasts with the equivalent observations over a given period"

Why bother with verification?

- Scientific and administrative support: Is a new system better? Verification also assists with consensus forecasts.
- Application of forecasts: "How good are your forecasts?" "Should I use them?" Verification can be used to help estimate value.

Terminology can be confusing

Verification is made a little tricky by the fact that everyday words are used to describe quantities with a precise statistical meaning. Common examples: accuracy, skill, reliability, bias, value, hit rates, percent consistent, false alarm rate, ... All have special meanings in statistics.

Accuracy

- Average correspondence between forecasts and observations
- Measures: mean absolute error, root mean square error

Bias

- Correspondence between the average forecast and the average observation
- e.g. average forecast minus average observed value

Skill

- Accuracy of forecasts relative to the accuracy of forecasts made with a reference method (e.g. guessing, persistence, climatology, damped persistence, ...)
- Measures: numerous!

Reliability

- Degree of correspondence between the average observation, given a particular forecast, and that forecast, taken over all forecasts
- e.g. suppose forecasts of a "10%, or 30%, ..., or 70%, ... chance of rain tomorrow" are routinely issued for many years. If we go back through all of the forecasts issued, looking for occasions when a forecast probability of 70% was issued, we would expect to find rainfall on 70% of those occasions if the forecast system is "reliable". This is often not the case (see the sketch below).

Reliability Graph

[Figure: reliability graph]
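
A reliability check like the one described above can be computed by grouping forecasts by their issued probability and comparing each group's observed frequency with that probability. The sketch below is ours, not from the presentation, and the data are hypothetical, purely for illustration.

```python
import numpy as np

# Hypothetical verification sample: issued rain probabilities and
# whether rain was observed (1) or not (0).
forecast_probs = np.array([0.7, 0.3, 0.7, 0.1, 0.7, 0.3, 0.7, 0.1, 0.7, 0.3])
observed = np.array([1, 0, 1, 0, 0, 1, 1, 0, 0, 0])

# For each issued probability, compare the forecast probability with
# the observed relative frequency. A reliable system has the two
# roughly equal; plotting the pairs gives a reliability graph.
for p in np.unique(forecast_probs):
    in_group = forecast_probs == p
    obs_freq = observed[in_group].mean()
    print(f"forecast {p:.0%}: rain observed {obs_freq:.0%} of "
          f"{in_group.sum()} occasions")
```

With a real forecast archive each group would hold many forecasts; with only a handful per group, as here, sampling noise dominates.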

Value

- Impact that prudent use of a given forecast scheme has on the user's profits, COMPARED WITH the profits made using a reference strategy
- Measures: dollars, lives saved, disease spread reduced, ...

Contingency Table

                            OBSERVED
                        Yes             No
  FORECAST   Yes        Hits            False Alarms
             No         Misses          Correct Rejections

HIT RATE = Hits / (Hits + Misses)
FALSE ALARM RATE = False Alarms / (False Alarms + Correct Rejections)
PERCENT CONSISTENT = 100 * (Hits + Correct Rejections) / Total

Accuracy measures

- Hit rate: proportion of observed events correctly forecast
- False alarm rate: proportion of observed non-events forecast as events
- Percent correct: 100 x (proportion of all forecasts that are correct)

1. Forecast performance: 2x2 contingency table (tornado forecasts)

                            OBSERVED
                        Tornado     No tornado     Total
  FORECAST   Tornado         28             72       100
             No tornado      23           2680      2703
             Total           51           2752      2803

Is this a good scheme?

1. Original scheme: Percent correct = 100 x (28 + 2680)/2803 = 96.6%, so it is a very accurate scheme! Or is it?
2. Performance of a second (reference) forecast method: never predict a tornado - a "lazy" forecast scheme!

Performance measures

Percent correct:
1. Original scheme: 100 x (28 + 2680)/2803 = 96.6%
2. Reference "lazy" scheme: 100 x (0 + 2752)/2803 = 98.2%!!

Hit rates:
1. Original scheme: 28/51 ... so over half the tornadoes were predicted
2. Reference scheme: 0/51 ... no tornadoes predicted

Value

Suppose an unexpected (unpredicted) tornado causes $500 million damage and that an expected (predicted) tornado results in only $100 million damage. Forecast scheme (1) then saves 28 x $400 million = $11.2 billion compared with forecast scheme (2): a huge saving - highly valuable!! (The sketch below works through these numbers.)
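
As a check on the arithmetic above, here is a minimal sketch computing the three contingency-table measures and the value comparison for both schemes. The `verify` helper and its argument names are ours, not from the presentation; the counts come from the slides (the 72 false alarms and 23 misses follow from the quoted totals: 51 tornadoes observed, 28 forecast, 2680 of 2752 non-events correctly rejected).

```python
def verify(hits, false_alarms, misses, correct_rejections):
    """Contingency-table measures as defined on the slides above."""
    total = hits + false_alarms + misses + correct_rejections
    return {
        "hit rate": hits / (hits + misses),
        "false alarm rate": false_alarms / (false_alarms + correct_rejections),
        "percent correct": 100 * (hits + correct_rejections) / total,
    }

original = verify(hits=28, false_alarms=72, misses=23, correct_rejections=2680)
lazy = verify(hits=0, false_alarms=0, misses=51, correct_rejections=2752)

print(original)  # hit rate 0.55, false alarm rate 0.026, percent correct 96.6
print(lazy)      # hit rate 0.00, false alarm rate 0.000, percent correct 98.2

# Value: an unpredicted tornado costs $500M, a predicted one $100M,
# so each of the 28 hits saves $400M relative to the lazy scheme.
saving = 28 * (500 - 100)  # $ million
print(f"Scheme 1 saves ${saving} million (${saving / 1000:.1f} billion)")
```

The lazy scheme wins on percent correct yet has no hits and no value, which is exactly the point of the example.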

Categorical versus probabilistic

- Categorical: "The temperature will be 26°C tomorrow"
- Probabilistic: "There is a 30% chance of rain tomorrow"; "There is a 90% chance that wet season rainfall will be above median"

Artificial Skill

- Danger of too many inputs
- Danger of trying too many inputs
- Independent data; cross-validation
- Importance of supporting evidence: a simple, plausible hypothesis; climate models; process studies

How do users verify predictions?

No single answer, however:
- Some switch from probabilistic to categorical
- The media prefer categorical forecasts
- Assessments are made on a single season
- Extrapolation

How can we assist users in verification?

- Increase access to verification information
- Simplify information
- Build partnerships: media; users and user groups; other government departments
- Education (booklets, web, ...)

Summary

- Verification is crucial, but care is needed!
- Familiarise yourself with the terminology used: skill, accuracy, value, ...
- No single measure tells the whole story
- Using independent data in verification is important
- Keep it simple
- Communicating verification results is challenging
- Users sometimes do their own verification - sobering
- Most people like to think categorically - challenging
- Dialogue with end-users is very important