Dustin,
Just a quick update - we are still seeing occasional missing grids.
As before, it is almost always entire forecast hours that are missing, as if
they were deleted or moved from the source before they were ingested into CONDUIT.
Here is a list of the last week or so of 1 deg GFS files along with the number
of grids in each. There should be 95796 grids per file, but note that the
00 UTC files for 4/3, 4/4, 4/5, and 4/9 are short. It seems to be mostly the
00 UTC runs. The half degree and quarter degree data are also missing complete
forecast hours.
I can get you more info about the specific forecast hours that we missed from
each run if that would help; I don't think they are the same for every run.
(A sketch of how counts like these can be reproduced follows the two listings
below.)
GFS_Global_onedeg_20210401_0000.grib2  95796 411019 5684816
GFS_Global_onedeg_20210401_0600.grib2  95796 411019 5685021
GFS_Global_onedeg_20210401_1200.grib2  95796 411019 5684822
GFS_Global_onedeg_20210401_1800.grib2  95796 411019 5684812
GFS_Global_onedeg_20210402_0000.grib2  95796 411019 5684851
GFS_Global_onedeg_20210402_0600.grib2  95796 411019 5684833
GFS_Global_onedeg_20210402_1200.grib2  95796 411019 5684808
GFS_Global_onedeg_20210402_1800.grib2  95796 411019 5684860
GFS_Global_onedeg_20210403_0000.grib2  94310 404621 5597285
GFS_Global_onedeg_20210403_0600.grib2  95796 411019 5684818
GFS_Global_onedeg_20210403_1200.grib2  95796 411019 5684807
GFS_Global_onedeg_20210403_1800.grib2  95796 411019 5684797
GFS_Global_onedeg_20210404_0000.grib2  94310 404621 5597200
GFS_Global_onedeg_20210404_0600.grib2  95796 411019 5684821
GFS_Global_onedeg_20210404_1200.grib2  95796 411019 5684798
GFS_Global_onedeg_20210404_1800.grib2  95796 411019 5684813
GFS_Global_onedeg_20210405_0000.grib2  90595 388626 5378190
GFS_Global_onedeg_20210405_0600.grib2  95796 411019 5684802
GFS_Global_onedeg_20210405_1200.grib2  95796 411019 5684774
GFS_Global_onedeg_20210405_1800.grib2  95796 411019 5684758
GFS_Global_onedeg_20210406_0000.grib2  95796 411019 5684813
GFS_Global_onedeg_20210406_0600.grib2  95796 411019 5684809
GFS_Global_onedeg_20210406_1200.grib2  95796 411019 5684776
GFS_Global_onedeg_20210406_1800.grib2  95796 411019 5684697
GFS_Global_onedeg_20210407_0000.grib2  95796 411019 5684791
GFS_Global_onedeg_20210407_0600.grib2  95796 411019 5684746
GFS_Global_onedeg_20210407_1200.grib2  95796 411019 5684691
GFS_Global_onedeg_20210407_1800.grib2  95796 411019 5684687
GFS_Global_onedeg_20210408_0000.grib2  95796 411019 5684782
GFS_Global_onedeg_20210408_0600.grib2  95796 411019 5684808
GFS_Global_onedeg_20210408_1200.grib2  95796 411019 5684828
GFS_Global_onedeg_20210408_1800.grib2  95796 411019 5684846
GFS_Global_onedeg_20210409_0000.grib2  92824 398223 5509733
GFS_Global_onedeg_20210409_0600.grib2  95796 411019 5684852
GFS_Global_onedeg_20210409_1200.grib2  95796 411019 5685000
Here's the same listing for the 0.25 degree files:
GFS_Global_0p25deg_20210405_0000.grib2  64590 276661 3884224
GFS_Global_0p25deg_20210405_0600.grib2  69048 295855 4151442
GFS_Global_0p25deg_20210405_1200.grib2  69048 295855 4151414
GFS_Global_0p25deg_20210405_1800.grib2  69048 295855 4151483
GFS_Global_0p25deg_20210406_0000.grib2  68305 292656 4106901
GFS_Global_0p25deg_20210406_0600.grib2  69048 295855 4151437
GFS_Global_0p25deg_20210406_1200.grib2  69048 295855 4151442
GFS_Global_0p25deg_20210406_1800.grib2  69048 295855 4151436
GFS_Global_0p25deg_20210407_0000.grib2  68305 292656 4106873
GFS_Global_0p25deg_20210407_0600.grib2  69048 295855 4151358
GFS_Global_0p25deg_20210407_1200.grib2  69048 295855 4151344
GFS_Global_0p25deg_20210407_1800.grib2  69048 295855 4151287
GFS_Global_0p25deg_20210408_0000.grib2  67562 289457 4062300
GFS_Global_0p25deg_20210408_0600.grib2  69048 295855 4151419
GFS_Global_0p25deg_20210408_1200.grib2  69048 295855 4151447
GFS_Global_0p25deg_20210408_1800.grib2  69048 295855 4151518
GFS_Global_0p25deg_20210409_0000.grib2  66819 286258 4017944
GFS_Global_0p25deg_20210409_0600.grib2  69048 295855 4151509
GFS_Global_0p25deg_20210409_1200.grib2  69048 295855 4151651
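In case it helps, here is a minimal Python sketch of how counts like these can
be produced (an illustration, not the exact tool we used). It only assumes
plain GRIB2 files on local disk: it counts messages by hopping between
Section 0 indicator headers, so its count can differ slightly from a wgrib2
field count if a file packs multiple fields into one message.

#!/usr/bin/env python3
"""Count GRIB2 messages in a file by hopping between indicator sections."""
import struct
import sys

def count_grib2_messages(path):
    """Walk Section 0 headers: b'GRIB' magic in octets 1-4 and the total
    message length as a big-endian unsigned integer in octets 9-16."""
    count = 0
    with open(path, "rb") as f:
        while True:
            sec0 = f.read(16)            # Section 0 is 16 octets in GRIB2
            if len(sec0) < 16:
                break                    # clean end of file
            if sec0[:4] != b"GRIB":
                raise ValueError(f"{path}: lost sync at byte {f.tell() - 16}")
            (msg_len,) = struct.unpack(">Q", sec0[8:16])
            f.seek(msg_len - 16, 1)      # skip the rest of this message
            count += 1
    return count

if __name__ == "__main__":
    for name in sys.argv[1:]:
        print(name, count_grib2_messages(name))

Run as, say, "python3 count_grib2.py GFS_Global_onedeg_*.grib2" (script name
hypothetical) and anything below the expected 95796 stands out immediately.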
Pete
Pete Pokrandt - Systems Programmer
UW-Madison Dept of Atmospheric and Oceanic Sciences
608-262-3086 - poker@xxxxxxxxxxxx
________________________________
From: Dustin Sheffler - NOAA Federal <dustin.sheffler@xxxxxxxx>
Sent: Monday, March 29, 2021 9:39 AM
To: Pete Pokrandt <poker@xxxxxxxxxxxx>
Cc: Anne Myckow - NOAA Federal <anne.myckow@xxxxxxxx>; Unidata CONDUIT Support
<support-conduit@xxxxxxxxxxxxxxxx>; mschmidt@xxxxxxxxxxxxxxxx
<mschmidt@xxxxxxxxxxxxxxxx>; Person, Arthur A. <aap1@xxxxxxx>;
_NCEP.List.pmb-dataflow <ncep.list.pmb-dataflow@xxxxxxxx>
Subject: Re: [Support #SZF-234138]: [conduit] GFSv16 Upgrade now planned for
MONDAY, March 22
Hello CONDUIT users,
Can you tell me whether you are still seeing issues getting all of the forecast
hours from each GFS cycle? I can confirm that all of the data is making it from
our supercomputer to the downstream systems, as those are the systems that host
the data for NOMADS/FTPPRD. From there the data makes one more hop into the
NCEP CONDUIT LDM queues. From what we've seen, I don't believe we've had any
issue with the data getting inserted into those queues. Please advise if you
are still having any trouble.
On Wed, Mar 24, 2021 at 12:03 AM Pete Pokrandt <poker@xxxxxxxxxxxx> wrote:
Anne/Dustin,
I found the emails from the last time this kind of thing happened (11/25/2019).
Here is the last email from Anne to me/CONDUIT about that issue, in case it
rings any bells:
Hi Pete,
Since Friday we've been having a problem with our connections that feed data
from the supercomputer to our College Park data center. This is causing various
random files to be missing or hours late, and we are working to get this fixed
today.
I apologize that I did not send out a notice to the CONDUIT list. We were
fighting this from Tuesday until late Friday, at which point it was decided we
would put it down for the weekend and come back fresh today, and with all the
other notifications I was doing I forgot to send an email to your list about
the expected impacts. I will modify our critical notification procedure to
include the CONDUIT email moving forward.
As I said, we are working on the problem today, but it is turning out to be a
thorny one that has been hard to pin down so far. We do have a mitigation we
could apply, but it would prevent us from troubleshooting further, so I can't
put it in place at this time.
I will let you know when we find anything or if it will remain this way
overnight. I apologize for the inconvenience.
Thanks,
Anne
________________________________
From: Anne Myckow - NOAA Federal <anne.myckow@xxxxxxxx>
Sent: Tuesday, March 23, 2021 4:11 PM
To: Dustin Sheffler - NOAA Federal <dustin.sheffler@xxxxxxxx>
Cc: Unidata CONDUIT Support <support-conduit@xxxxxxxxxxxxxxxx>;
mschmidt@xxxxxxxxxxxxxxxx; Pete Pokrandt <poker@xxxxxxxxxxxx>; Person, Arthur
A. <aap1@xxxxxxx>; _NCEP.List.pmb-dataflow <ncep.list.pmb-dataflow@xxxxxxxx>
Subject: Re: [Support #SZF-234138]: [conduit] GFSv16 Upgrade now planned for
MONDAY, March 22
Unidata/CONDUIT users,
I suggest that we drop at least one resolution of the GFS. I think the 0p25
may be too big for the GRIB insert process to handle.
Can you tell us exactly what you are missing? Certain variables, or certain
forecast hours?
Thanks,
Anne
On Tue, Mar 23, 2021 at 2:12 PM Dustin Sheffler - NOAA Federal
<dustin.sheffler@xxxxxxxx> wrote:
Hello Unidata,
We will investigate to see what the issue may be. Thanks for bringing this to
our attention.
On Tue, Mar 23, 2021 at 11:06 AM Unidata CONDUIT Support
<support-conduit@xxxxxxxxxxxxxxxx> wrote:
Hi Dustin and Anne,
re:
> GFS was successfully upgraded to version 16 today. Please let me know if
> you see any issues as soon as possible. Thanks
You most likely received, as a CC a bit earlier today, the email that Pete
Pokrandt (UWisconsin/AOS) sent to Kevin Tyle (UAlbany/Atmospheric Science).
I am writing to support Pete's and Kevin's comments about the GFS data in
the CONDUIT feed being spotty.
For completeness, here is the body of the email that Pete wrote earlier:
Kevin (cc conduit, support, Anne, Dustin)
Re: incomplete GFS runs on conduit
Yes, we are seeing the same. The forecast hours that come through are complete,
but not all forecast hours are coming through. For the 06 UTC run on the 0p50
grid last night, it looks complete up to F201, and then some forecast hours are
completely missing (e.g., F204, F207, F213, etc.).
I seem to recall a problem like this within the past year or two, but I don't
offhand recall what the problem was or how it got fixed. Something is up on the
ingest side, where not all of the forecast hours are making it into CONDUIT.
You can see the reduction in data volume on the IDD status graphs at Unidata,
both for us (idd-agg.aos.wisc.edu) and for Unidata (lead.unidata.ucar.edu); a
sketch for pulling those stats programmatically follows the links:
https://rtstats.unidata.ucar.edu/cgi-bin/rtstats/iddstats_vol_nc?CONDUIT+idd-agg.aos.wisc.edu
https://rtstats.unidata.ucar.edu/cgi-bin/rtstats/iddstats_vol_nc?CONDUIT+lead.unidata.ucar.edu
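If it's easier than eyeballing the graphs, the raw numbers behind those pages
can be fetched directly; a minimal Python sketch using only the two URLs above,
assuming the CGI is reachable without authentication:

#!/usr/bin/env python3
"""Fetch raw CONDUIT volume stats from the Unidata rtstats CGI."""
from urllib.request import urlopen

BASE = "https://rtstats.unidata.ucar.edu/cgi-bin/rtstats/iddstats_vol_nc"
HOSTS = ["idd-agg.aos.wisc.edu", "lead.unidata.ucar.edu"]

for host in HOSTS:
    with urlopen(f"{BASE}?CONDUIT+{host}", timeout=30) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    print(f"=== {host} ===")
    print("\n".join(text.splitlines()[:5]))  # first few lines as a spot check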
Here's what the 06 UTC 23 Mar 2021 0p50 GFS run looked like from our end (file
names are changed from what came over on CONDUIT); a short script for spotting
the gaps follows the listing:
-rw-r--r--. 1 ldm ldm 150475420 Mar 23 04:31 gblav2.21032306_F000
-rw-r--r--. 1 ldm ldm 161458306 Mar 23 04:34 gblav2.21032306_F003
-rw-r--r--. 1 ldm ldm 162855161 Mar 23 04:37 gblav2.21032306_F006
-rw-r--r--. 1 ldm ldm 162587162 Mar 23 04:34 gblav2.21032306_F009
-rw-r--r--. 1 ldm ldm 162917663 Mar 23 04:36 gblav2.21032306_F012
-rw-r--r--. 1 ldm ldm 162116581 Mar 23 04:39 gblav2.21032306_F015
-rw-r--r--. 1 ldm ldm 163642600 Mar 23 04:37 gblav2.21032306_F018
-rw-r--r--. 1 ldm ldm 163140953 Mar 23 04:38 gblav2.21032306_F021
-rw-r--r--. 1 ldm ldm 162532415 Mar 23 04:39 gblav2.21032306_F024
-rw-r--r--. 1 ldm ldm 162443371 Mar 23 04:40 gblav2.21032306_F027
-rw-r--r--. 1 ldm ldm 162900215 Mar 23 04:41 gblav2.21032306_F030
-rw-r--r--. 1 ldm ldm 162325126 Mar 23 04:41 gblav2.21032306_F033
-rw-r--r--. 1 ldm ldm 162115930 Mar 23 04:42 gblav2.21032306_F036
-rw-r--r--. 1 ldm ldm 161711022 Mar 23 04:42 gblav2.21032306_F039
-rw-r--r--. 1 ldm ldm 162205354 Mar 23 04:43 gblav2.21032306_F042
-rw-r--r--. 1 ldm ldm 162494846 Mar 23 04:44 gblav2.21032306_F045
-rw-r--r--. 1 ldm ldm 163184046 Mar 23 04:44 gblav2.21032306_F048
-rw-r--r--. 1 ldm ldm 161488854 Mar 23 04:45 gblav2.21032306_F051
-rw-r--r--. 1 ldm ldm 162339913 Mar 23 04:47 gblav2.21032306_F054
-rw-r--r--. 1 ldm ldm 162605480 Mar 23 04:47 gblav2.21032306_F057
-rw-r--r--. 1 ldm ldm 162067320 Mar 23 04:48 gblav2.21032306_F060
-rw-r--r--. 1 ldm ldm 162428155 Mar 23 04:48 gblav2.21032306_F063
-rw-r--r--. 1 ldm ldm 162421492 Mar 23 04:51 gblav2.21032306_F066
-rw-r--r--. 1 ldm ldm 162414972 Mar 23 04:52 gblav2.21032306_F069
-rw-r--r--. 1 ldm ldm 162541484 Mar 23 04:52 gblav2.21032306_F072
-rw-r--r--. 1 ldm ldm 160813325 Mar 23 04:54 gblav2.21032306_F075
-rw-r--r--. 1 ldm ldm 162061554 Mar 23 04:54 gblav2.21032306_F078
-rw-r--r--. 1 ldm ldm 161611841 Mar 23 04:54 gblav2.21032306_F081
-rw-r--r--. 1 ldm ldm 161876278 Mar 23 04:55 gblav2.21032306_F084
-rw-r--r--. 1 ldm ldm 161372460 Mar 23 04:56 gblav2.21032306_F087
-rw-r--r--. 1 ldm ldm 162169224 Mar 23 04:56 gblav2.21032306_F090
-rw-r--r--. 1 ldm ldm 161434969 Mar 23 04:57 gblav2.21032306_F093
-rw-r--r--. 1 ldm ldm 161021815 Mar 23 04:57 gblav2.21032306_F096
-rw-r--r--. 1 ldm ldm 159923756 Mar 23 04:58 gblav2.21032306_F099
-rw-r--r--. 1 ldm ldm 160659519 Mar 23 05:00 gblav2.21032306_F102
-rw-r--r--. 1 ldm ldm 159581680 Mar 23 05:00 gblav2.21032306_F105
-rw-r--r--. 1 ldm ldm 160292034 Mar 23 05:01 gblav2.21032306_F108
-rw-r--r--. 1 ldm ldm 160712053 Mar 23 05:01 gblav2.21032306_F111
-rw-r--r--. 1 ldm ldm 161058411 Mar 23 05:02 gblav2.21032306_F114
-rw-r--r--. 1 ldm ldm 161093163 Mar 23 05:02 gblav2.21032306_F117
-rw-r--r--. 1 ldm ldm 161976752 Mar 23 05:04 gblav2.21032306_F120
-rw-r--r--. 1 ldm ldm 160955140 Mar 23 05:04 gblav2.21032306_F123
-rw-r--r--. 1 ldm ldm 161864402 Mar 23 05:05 gblav2.21032306_F126
-rw-r--r--. 1 ldm ldm 161284116 Mar 23 05:05 gblav2.21032306_F129
-rw-r--r--. 1 ldm ldm 161914513 Mar 23 05:06 gblav2.21032306_F132
-rw-r--r--. 1 ldm ldm 161183283 Mar 23 05:06 gblav2.21032306_F135
-rw-r--r--. 1 ldm ldm 161934137 Mar 23 05:07 gblav2.21032306_F138
-rw-r--r--. 1 ldm ldm 161018288 Mar 23 05:08 gblav2.21032306_F141
-rw-r--r--. 1 ldm ldm 161258540 Mar 23 05:08 gblav2.21032306_F144
-rw-r--r--. 1 ldm ldm 161227819 Mar 23 05:09 gblav2.21032306_F147
-rw-r--r--. 1 ldm ldm 162484147 Mar 23 05:10 gblav2.21032306_F150
-rw-r--r--. 1 ldm ldm 162453652 Mar 23 05:10 gblav2.21032306_F153
-rw-r--r--. 1 ldm ldm 162169306 Mar 23 05:11 gblav2.21032306_F156
-rw-r--r--. 1 ldm ldm 161267470 Mar 23 05:12 gblav2.21032306_F159
-rw-r--r--. 1 ldm ldm 161089660 Mar 23 05:13 gblav2.21032306_F162
-rw-r--r--. 1 ldm ldm 162070875 Mar 23 05:14 gblav2.21032306_F165
-rw-r--r--. 1 ldm ldm 161681467 Mar 23 05:15 gblav2.21032306_F168
-rw-r--r--. 1 ldm ldm 161140254 Mar 23 05:15 gblav2.21032306_F171
-rw-r--r--. 1 ldm ldm 162243140 Mar 23 05:16 gblav2.21032306_F174
-rw-r--r--. 1 ldm ldm 162092750 Mar 23 05:17 gblav2.21032306_F177
-rw-r--r--. 1 ldm ldm 162156634 Mar 23 05:17 gblav2.21032306_F180
-rw-r--r--. 1 ldm ldm 161621678 Mar 23 05:19 gblav2.21032306_F183
-rw-r--r--. 1 ldm ldm 162393378 Mar 23 05:19 gblav2.21032306_F186
-rw-r--r--. 1 ldm ldm 161208777 Mar 23 05:20 gblav2.21032306_F189
-rw-r--r--. 1 ldm ldm 160910941 Mar 23 05:21 gblav2.21032306_F192
-rw-r--r--. 1 ldm ldm 161003293 Mar 23 05:22 gblav2.21032306_F195
-rw-r--r--. 1 ldm ldm 161413148 Mar 23 05:22 gblav2.21032306_F198
-rw-r--r--. 1 ldm ldm 160776464 Mar 23 05:23 gblav2.21032306_F201
-rw-r--r--. 1 ldm ldm 160265668 Mar 23 05:26 gblav2.21032306_F210
-rw-r--r--. 1 ldm ldm 159395465 Mar 23 05:27 gblav2.21032306_F216
-rw-r--r--. 1 ldm ldm 159484261 Mar 23 05:28 gblav2.21032306_F219
-rw-r--r--. 1 ldm ldm 160606423 Mar 23 05:30 gblav2.21032306_F222
-rw-r--r--. 1 ldm ldm 159827636 Mar 23 05:33 gblav2.21032306_F237
-rw-r--r--. 1 ldm ldm 161442268 Mar 23 05:47 gblav2.21032306_F294
-rw-r--r--. 1 ldm ldm 161139109 Mar 23 05:47 gblav2.21032306_F297
-rw-r--r--. 1 ldm ldm 160828989 Mar 23 05:49 gblav2.21032306_F300
-rw-r--r--. 1 ldm ldm 159837903 Mar 23 05:49 gblav2.21032306_F303
-rw-r--r--. 1 ldm ldm 160115585 Mar 23 05:50 gblav2.21032306_F306
-rw-r--r--. 1 ldm ldm 161293996 Mar 23 05:52 gblav2.21032306_F312
-rw-r--r--. 1 ldm ldm 161529924 Mar 23 05:52 gblav2.21032306_F315
-rw-r--r--. 1 ldm ldm 162262363 Mar 23 05:53 gblav2.21032306_F321
-rw-r--r--. 1 ldm ldm 162652140 Mar 23 05:55 gblav2.21032306_F327
-rw-r--r--. 1 ldm ldm 163059071 Mar 23 05:56 gblav2.21032306_F330
-rw-r--r--. 1 ldm ldm 163167459 Mar 23 05:57 gblav2.21032306_F333
-rw-r--r--. 1 ldm ldm 163317132 Mar 23 05:57 gblav2.21032306_F336
-rw-r--r--. 1 ldm ldm 161847992 Mar 23 06:01 gblav2.21032306_F351
-rw-r--r--. 1 ldm ldm 162842095 Mar 23 06:02 gblav2.21032306_F354
-rw-r--r--. 1 ldm ldm 162385970 Mar 23 06:03 gblav2.21032306_F357
-rw-r--r--. 1 ldm ldm 162980199 Mar 23 06:03 gblav2.21032306_F360
-rw-r--r--. 1 ldm ldm 162501030 Mar 23 06:05 gblav2.21032306_F366
-rw-r--r--. 1 ldm ldm 161922055 Mar 23 06:06 gblav2.21032306_F369
-rw-r--r--. 1 ldm ldm 162414317 Mar 23 06:08 gblav2.21032306_F378
-rw-r--r--. 1 ldm ldm 162022901 Mar 23 06:09 gblav2.21032306_F381
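Here is a short Python sketch for spotting gaps like those automatically. It
assumes our local gblav2.YYMMDDHH_Fxxx naming (not the CONDUIT product IDs) and
a 3-hourly cadence out to F384, which is what a complete 0p50 run looks like
from our end; adjust step/last for other grids.

#!/usr/bin/env python3
"""List the forecast hours missing from one GFS cycle on disk."""
import re
import sys
from pathlib import Path

def missing_hours(directory, cycle, step=3, last=384):
    """Return the sorted forecast hours expected but not present."""
    expected = set(range(0, last + 1, step))
    pat = re.compile(rf"gblav2\.{re.escape(cycle)}_F(\d{{3}})$")
    present = {int(m.group(1))
               for p in Path(directory).iterdir()
               if (m := pat.match(p.name))}
    return sorted(expected - present)

if __name__ == "__main__":
    # usage (hypothetical path): python3 missing_hours.py /data/grib 21032306
    gaps = missing_hours(sys.argv[1], sys.argv[2])
    print("missing:", " ".join(f"F{h:03d}" for h in gaps) or "none")

For the listing above, this would print F204, F207, F213, and the rest of the
absent hours in one line.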
Pete
Cheers,
Tom
--
****************************************************************************
Unidata User Support                                   UCAR Unidata Program
(303) 497-8642                                                P.O. Box 3000
support@xxxxxxxxxxxxxxxx                                  Boulder, CO 80307
----------------------------------------------------------------------------
Unidata HomePage http://www.unidata.ucar.edu
****************************************************************************
Ticket Details
===================
Ticket ID: SZF-234138
Department: Support CONDUIT
Priority: Normal
Status: Open
===================
NOTE: All email exchanges with Unidata User Support are recorded in the Unidata
inquiry tracking system and then made publicly available through the web. If
you do not want to have your interactions made available in this way, you must
let us know in each email you send to us.
--
Dustin Sheffler
NCEP Central Operations - Dataflow
5830 University Research Court, Rm 1030
College Park, Maryland 20740
Office: (301) 683-3827
--
Anne Myckow
Dataflow Team Lead
NWS/NCEP/NCO