Laboratory and Field Evaluation of the NubiScope


TECO-2010, Helsinki | 30 August 2010

Laboratory and Field Evaluation of the NubiScope

Wiel Wauben*

Fred Bosveld, Henk Klein Baltink

KNMI; *R&D Information and Observation Technology / Regional Climate Department


Contents

• Introduction

• NubiScope

• Laboratory tests: contamination

• Field evaluation: cloudiness

• Conclusions and outlook


Automated cloud observations

• SYNOP cloud observations fully automated since 2003
• Hourly manual SYNOP versus LD40 with cloud algorithm
• 6 stations with 3 years of overlapping data

Total cloud cover (n in okta): AUTO (rows) versus SYNOP (columns)

AUTO\SYNOP    NA     0     1     2     3     4     5     6     7     8     9    Sum    <n>
NA             0     0     0     0     0     0     0     0     0     0     0      0
0             16   151    91    41    15     7     2     4     3     1     1    332   1.01
1             13   308   252    84    58    20    14    14    10     2     0    775   1.20
2              9    90   106    78    59    48    23    26    10     4     2    455   2.29
3              6    65    94    36    66    47    57    32    34    12     2    451   3.10
4              2    25    51    24    39    44    48    71    58    35     1    398   4.43
5             13    22    36    19    30    35    47    51    98    79     2    432   5.21
6             13    42    57    33    22    33    73    89   167   336     4    869   5.97
7             42    18    55    23    42    43    63   105   276  1713     2   2382   7.26
8             92     1    10    10    11    10    11    24    91  2278    42   2580   7.85
9             11     0     0     0     0     0     1     1     0     9    64     86   8.79
Sum          217   722   752   348   342   287   339   417   747  4469   120   8760
<n>                1.77  2.58  2.81  3.49  4.14  4.82  5.21  6.05  7.36  8.12

Band0 = 39.2%  Band1 = 75.5%  Band2 = 88.0%  <Δn> = 0.13  <|Δn|> = 1.10  Miss = 7.1%  False = 4.9%
(per station: 39±5%  75±3%  87±3%  -0.2±0.3  1.2±0.2  10±3%  4±2%)
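The summary statistics beneath the table (Band0 = exact agreement, Band1 = agreement within ±1 okta, Band2 within ±2, plus mean and mean absolute okta difference) can be computed directly from paired okta reports. A minimal sketch in Python; the function name is illustrative, not from the presentation:

```python
def band_stats(auto, synop):
    """Agreement statistics for paired total cloud cover reports in okta (0..8).

    bandK: fraction of pairs agreeing to within K okta;
    mean_dn / mean_abs_dn: mean (absolute) okta difference.
    """
    diffs = [a - s for a, s in zip(auto, synop)]
    n = len(diffs)
    return {
        "band0": sum(d == 0 for d in diffs) / n,
        "band1": sum(abs(d) <= 1 for d in diffs) / n,
        "band2": sum(abs(d) <= 2 for d in diffs) / n,
        "mean_dn": sum(diffs) / n,
        "mean_abs_dn": sum(abs(d) for d in diffs) / n,
    }

# Toy example: four paired reports, one differing by 2 okta
stats = band_stats([0, 4, 8, 6], [0, 4, 8, 4])
```

Applied to the full set of hourly AUTO/SYNOP pairs, this yields the Band0/Band1/Band2 percentages quoted above.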

Introduction


Experiences AUTO versus Observer

• Missing high clouds / sensitivity
• Moist layer reported as cloud / threshold
• “Gaps” in cloud deck during precipitation
• Missing information during shallow fog
• Faulty isolated hits / instrumental noise, precipitation, aircraft, ...
• Fewer cases with 1 and 7 okta compared to the observer
• Missing spatial representativeness / scanning

Introduction


NubiScope

• scanning pyrometer (passive, day & night)
• 8-14 µm thermal IR (contribution of moisture)
• FOV = 3° (mixed scenes)
• scan every 10 minutes
• azimuth step 10°
• zenith step 3°
• 1080 sky temperatures
• 2 surface temperatures
• T range -70 to +50 °C
• total cloud cover > 20°
• low, middle, high cloud
• cloud mask
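The 1080 sky temperatures per scan are consistent with the quoted steps: 36 azimuth positions (360°/10°) times 30 zenith positions of 3°. A sketch of the implied scan grid; the start angles of 0° are an assumption for illustration:

```python
# Scan grid implied by the quoted steps: 10 degrees in azimuth, 3 degrees in zenith.
azimuths = [10 * i for i in range(36)]   # 0, 10, ..., 350 degrees
zeniths = [3 * j for j in range(30)]     # 0, 3, ..., 87 degrees (assumed range)
grid = [(az, ze) for az in azimuths for ze in zeniths]
# 36 * 30 = 1080 pointing directions, matching the 1080 sky temperatures
```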

[Photo: test setup with the FD12P at the De Bilt test field]

Overview


Sky temperatures

• Sun visible in the scan
• Polynomial fit to clear sky eliminates moisture/contamination; deviations from the fit indicate clouds
• Horizon gives Tambient, the start of the standard T profile
• Boundary cloud processing: < 70° SZA
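The clear-sky fit step can be sketched as: fit a low-order polynomial of sky temperature against zenith angle, then flag pixels markedly warmer than the fit as cloud. The polynomial degree, threshold, and synthetic data below are illustrative assumptions, not the NubiScope's actual processing parameters:

```python
import numpy as np

def flag_clouds(zenith_deg, t_sky, degree=2, threshold=2.0):
    """Fit a polynomial 'clear sky' baseline T(zenith) and flag
    pixels warmer than the baseline by more than `threshold` K."""
    coeffs = np.polyfit(zenith_deg, t_sky, degree)
    baseline = np.polyval(coeffs, zenith_deg)
    return (t_sky - baseline) > threshold

# Synthetic scan column: clear sky warming toward the horizon,
# plus one pixel made much warmer, as a cloud would appear
z = np.arange(0.0, 90.0, 3.0)
t = -50.0 + 0.3 * z
t[10] += 15.0
mask = flag_clouds(z, t)
```

With these toy numbers only the warmed pixel deviates from the fitted baseline by more than the threshold.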

NubiScope


[Figure: derived cloud mask; note the discontinuity near North]

Cloud mask
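A cloud mask converts to total cloud cover in okta via the cloudy-pixel fraction. The sketch below uses the common reporting convention of reserving 0 and 8 okta for exactly clear and exactly overcast skies; this convention is an assumption here, not stated on the slide:

```python
def okta_from_mask(mask):
    """Total cloud cover in okta from a boolean cloud mask.
    0 and 8 okta are reserved for exactly clear / exactly overcast
    (a common reporting convention, assumed here)."""
    frac = sum(mask) / len(mask)
    if frac == 0.0:
        return 0
    if frac == 1.0:
        return 8
    return min(7, max(1, round(frac * 8)))

# 270 cloudy pixels out of a 1080-pixel scan -> 25% cover -> 2 okta
cover = okta_from_mask([True] * 270 + [False] * 810)
```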

NubiScope


Laboratory tests

Setup


Laboratory tests

[Figure: TPyrometer - TGalai - Tpolfit (°C) versus TGalai (-50 to +50 °C), with a linear fit per measurement date: 12/06/2008, 17/07/2008, 14/08/2008, 11/09/2008, 16/10/2008, 20/11/2008, 08/01/2009, 13/03/2009, 08/07/2009, 09/07/2009. Fitted slopes range from -0.0000 to -0.0278 and offsets from 0.00 to 0.62 °C.]

Effect of contamination of the pyrometer lens (linear fits)
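Each line in the figure is an ordinary least-squares fit of the temperature error against the reference blackbody temperature; tracking the fitted slope from one lab session to the next then serves as a contamination monitor. A sketch with synthetic data matching one of the reported fits:

```python
import numpy as np

def contamination_fit(t_ref, t_err):
    """Least-squares line t_err = slope * t_ref + offset.
    A growing |slope| between lab sessions signals increasing contamination."""
    slope, offset = np.polyfit(t_ref, t_err, 1)
    return slope, offset

# Synthetic session: reference blackbody swept from -50 to +50 degC,
# with an error matching the reported fit y = -0.0147x + 0.4221
t_ref = np.arange(-50.0, 51.0, 10.0)
t_err = -0.0147 * t_ref + 0.4221
slope, offset = contamination_fit(t_ref, t_err)
```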


Laboratory tests

Stability pyrometer (clean lens)

[Figure: TPyrometer - TGalai (°C) versus TGalai (-50 to +50 °C) for 12/06/2008, 09/07/2009, 26/10/2009 and 26/11/2009; stable over 2 years with a clean lens]

εGalai = 97.3 %


Manual evaluation total cloudiness

• NubiScope versus LD40 (10-minute window, as used in METAR)
• Using Total Sky Imager (TSI) images at start/end of scan
• Evaluation by observers at Rotterdam airport (25 km from Cabauw)
• Data and evaluation via a Web tool

Field evaluation Weather Dpt.


Overview manual evaluations

• Period: 1 June to 16 August 2009
• 243 evaluations (periods of 10 minutes or longer)
• Scores assigned to NubiScope and LD40: +1 (good) or -1 (false)
• Scores of NubiScope much better than LD40
• Discrepancies of LD40 mainly due to lack of spatial representativeness
• NubiScope generally better than LD40 at detecting middle and high clouds ... but sometimes not

        NubiScope   LD40
# +1         71        7
# -1         28      184

Field evaluation Weather Dpt.


Evaluation total cloud cover

• Evaluation of fractional cloudiness from LD40, NubiScope, LIDAR, cloud radar, and long-wave IR radiation

• Goal: optimal cloud product for Cabauw (see presentation 2(4))

• Example: total cloud cover, NubiScope versus LD40, for the period 14 May 2008 to 30 September 2009

Field evaluation Climate Dpt.


Evaluation total cloud cover contingency matrix

• Scores of LD40 versus NubiScope are in agreement with LD40 versus OBS, with better 0-1 and 7-8 okta behaviour

Field evaluation Climate Dpt.

NubiScope (rows) versus LD40 SYNOP, 30 min (30 × 1 min) (columns)

Nubi\LD40    NA      0      1      2      3      4      5      6      7      8     Sum
NA          155   1581    806    362    328    389    430    524   1408   4201   10184
0           156   6081   2146    486    229     90     30     17     20     37    9136
1           194   3972   3054   1052    654    376    195     80     75    111    9569
2            70    343    598    513    468    406    253    118     73     25    2797
3            70    126    255    307    378    405    365    279    178     43    2336
4            44     93    134    147    222    330    395    393    366     91    2171
5            48     74    103     97    125    216    337    427    671    190    2240
6            64     57     74     50     91    138    244    399    956    508    2517
7           483    228    302    216    322    393    623   1019   3512   7394   14009
8           502    115    190    139    122    134    168    276   1592  13394   16130
Sum        1786  11089   6856   3007   2611   2488   2610   3008   7443  21793   60905

Δn±0 = 46% Δn±1 = 81% Δn±2 = 90% Miss = 5% False = 5% <Δn> = 0.06 <|Δn|> = 0.95


Effect of scanning

Field evaluation Climate Dpt.

[Figure: percentage of cases per okta interval (log scale, 1-100%) versus zenith angle range (0, 9, 18, 27, 36, 45, 54, 63°; R, LM, LS), for total cloud cover cc = 0 to 8]

With a wider zenith-angle range:

n = 0 and 8 ▼

n = 1 and 7 ▲

n = 2 to 6 ▲


Conclusions

• NubiScope has added value for total cloudiness compared to the LD40, mainly due to scanning / better spatial representativeness
• Effect of contamination is “small” and can be monitored
• Climate Dept. decided to keep the NubiScope as part of the Cabauw/CESAR infrastructure
• Weather Dept. noted the added value of the NubiScope, but accurate height information is essential / required

Outlook

• Make the NubiScope operational (maintenance, ...)
• Improve timeliness by scanning continuously with “running” updates
• Investigate / improve NubiScope cloud base height (coincidences with LD40, T-profile, moisture correction, ...)

Conclusions & Outlook


Thank you for your attention!

See the conference paper for more information, and paper 2(4) for more details on the evaluation of cloudiness.

Acknowledgements

Rob van Krimpen, Cor van Oort, Mark Savenije, Siebren de Haan, and observers at Rotterdam Airport (all KNMI)

Hans Möller (IMK) and Theo Sattler (Sattler-SES)