Re: [ldm-users] Missing GLM lightning data

I might have to eat my words a little, 'twas not quite as straightforward
as I had expected...

I'm attaching a Python script that seems to be getting it most of the way
there.  I can confirm the following:


   - Level2 data looks good in both python/cartopy as well as McIDAS-X
   - Level3 data looks good in python/cartopy, but NOT McIDAS-X
   - Metadata (ncdump -hs) looks effectively the same in both files.
   - There are valid data and location values in both data sets
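On the metadata point: an `ncdump -hs` diff can bury small differences in ordering noise, so a dictionary-level comparison is sometimes easier to trust. A quick helper along these lines (the function name is mine; feed it the `.attrs` dicts that xarray or netCDF4 expose):

```python
def attr_diff(attrs_a, attrs_b):
    """Return {key: (a_value, b_value)} for attributes that differ
    between two attribute dicts; a missing key shows up as None."""
    keys = sorted(set(attrs_a) | set(attrs_b))
    return {
        k: (attrs_a.get(k), attrs_b.get(k))
        for k in keys
        if attrs_a.get(k) != attrs_b.get(k)
    }

# e.g. attr_diff(ds_l2["Flash_extent_density"].attrs,
#                ds_l3["Flash_extent_density"].attrs)
print(attr_diff({"scale_factor": 1, "units": "count"},
                {"scale_factor": 1, "units": "Count"}))
# → {'units': ('count', 'Count')}
```

Running that per-variable across the two files might surface exactly which attribute McIDAS-X is tripping on.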


I want to say this should get things most of the way there for most people,
and may even fix people's workflows completely if they don't involve
McIDAS-X.  But I'm having a hard time nailing down what difference it is
that McIDAS-X doesn't like.  I might have to leave this here for the
weekend, but I wanted to show what I've found thus far.  Hoping this helps!

-Mike

On Wed, Dec 10, 2025 at 11:43 AM Mike Zuranski <mike@xxxxxxxxxxxxxxxxxxx>
wrote:

> Will do, time to make some coffee...
>
> On Wed, Dec 10, 2025 at 11:42 AM Stonie Cooper <cooper@xxxxxxxx> wrote:
>
>> Tiffany is also looking at it from what the WFO AWIPS instance is doing.
>> I think the community would like this, so please let everyone know if you
>> come up with something.
>> -
>> Stonie Cooper, PhD
>> Software Engineer III
>> NSF Unidata
>> cooper@xxxxxxxx
>>
>>
>> On Wed, Dec 10, 2025 at 10:39 AM Mike Zuranski <mike@xxxxxxxxxxxxxxxxxxx>
>> wrote:
>>
>>> Hey Stonie,
>>>
>>> Thanks for that info.
>>>
>>> I might try to hack something together later, but level2 -> level3 should
>>> be fairly straightforward.  If I get it working, would you (or anyone else
>>> here) be interested in that solution?
>>>
>>> Best,
>>> -Mike
>>>
>>> On Wed, Dec 10, 2025 at 9:54 AM Stonie Cooper <cooper@xxxxxxxx> wrote:
>>>
>>>> It is no longer being produced by NOAA at a national level, but rather
>>>> at each WFO. Unidata staff are investigating whether it is feasible to
>>>> replicate the creation of L3 GLM data.
>>>>
>>>> -
>>>> Stonie Cooper, PhD
>>>> Software Engineer III
>>>> NSF Unidata
>>>> cooper@xxxxxxxx
>>>>
>>>>
>>>> On Tue, Dec 9, 2025 at 2:10 PM Mike Zuranski <mike@xxxxxxxxxxxxxxxxxxx>
>>>> wrote:
>>>>
>>>>> Hi Gilbert & Everyone,
>>>>>
>>>>> I see what you see (and don't see).  It looks like the level2
>>>>> GLMISatSS (gridded) GLM data is still flowing but not level3, see below.
>>>>>
>>>>> The difference between the level2 and level3 GLMISatSS data sets is
>>>>> temporal aggregation; level2 is one-minute while level3 is five-minute.
>>>>> Interesting that we lost one but not the other, especially since the
>>>>> five-minute set is the more widely used.  Did an IDD script die someplace?  I can't recall
>>>>> where/how this data is generated/fed...
>>>>>
>>>>> If it is just gone, you have a couple of options.  Since we still have
>>>>> the level2 data, you can aggregate that yourself.  You can also build
>>>>> this product yourself starting from raw GLM tiles; if you want to go
>>>>> that route, I'm including some references below to help you do that.
>>>>> TL;DR: you stitch raw tiles together into the level2 product using
>>>>> glm-restitch, and then each time you make a new one you aggregate it
>>>>> with the previous four to make a level3 set.  And see, that's why I
>>>>> wonder if a script just crashed someplace, because the tiled and level2
>>>>> data are still there and I thought this was Unidata's processing flow
>>>>> too... but it's been a minute and I'm probably wrong.
>>>>>
>>>>> Best,
>>>>> -Mike
>>>>>
>>>>>
>>>>> References:
>>>>>
>>>>> https://mailinglists.unidata.ucar.edu/archives/idvusers/2022/msg00007.html#gsc.tab=0
>>>>>
>>>>> https://www.weather.gov/media/notification/pdf_2023_24/scn22-112_goes-r_glm_gridded_data_products_aaa.pdf
>>>>> https://github.com/Unidata/ldm-alchemy/blob/main/glm-restitch.py
>>>>>
>>>>>
>>>>> $ notifyme -v -h idd.cod.edu -p GLMISatSS -o 300
>>>>> 20251209T194830.514624Z notifyme[3157216]
>>>>> notifyme.c:main:596                 NOTE  Starting Up: idd.cod.edu:
>>>>> 20251209194330.514495 TS_ENDT {{ANY, "GLMISatSS"}}
>>>>> 20251209T194830.515119Z notifyme[3157216]
>>>>> inetutil.c:addrbyhost:964           INFO  Resolving idd.cod.edu to
>>>>> 192.203.136.196 took 0.000435 seconds
>>>>> 20251209T194830.539576Z notifyme[3157216]
>>>>> notifyme.c:notifyme6:432            NOTE  OK
>>>>> 20251209T194831.599375Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      184781 20251209194416.270800
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES18/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G18_s20253431942000_e20253431943000_c20253431942300.nc
>>>>> 20251209T194831.626749Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      269354 20251209194425.008981
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES19/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G19_s20253431942000_e20253431943000_c20253431942300.nc
>>>>> 20251209T194831.678468Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      172906 20251209194529.559040
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES18/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G18_s20253431943000_e20253431944000_c20253431943300.nc
>>>>> 20251209T194831.678525Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      277577 20251209194542.851395
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES19/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G19_s20253431943000_e20253431944000_c20253431943300.nc
>>>>> 20251209T194831.678546Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      167196 20251209194615.440857
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES18/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G18_s20253431944000_e20253431945000_c20253431944300.nc
>>>>> 20251209T194831.678574Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      264742 20251209194622.883552
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES19/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G19_s20253431944000_e20253431945000_c20253431944300.nc
>>>>> 20251209T194831.692097Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      180089 20251209194715.952655
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES18/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G18_s20253431945000_e20253431946000_c20253431945300.nc
>>>>> 20251209T194831.692142Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      271423 20251209194724.174000
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES19/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G19_s20253431945000_e20253431946000_c20253431945300.nc
>>>>> 20251209T194831.717980Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      162728 20251209194815.425465
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES18/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G18_s20253431946000_e20253431947000_c20253431946300.nc
>>>>> 20251209T194831.718022Z notifyme[3157216]
>>>>> notifyme.c:notifymeprog_6:276       INFO      267687 20251209194824.354332
>>>>> NIMAGE 000
>>>>>  
>>>>> /data/ldm/pub/native/satellite/GOES/GOES19/Products/GLMISatSS/Level2/FullDisk/20251209/OR_GLM-L2-GLMF-M6_G19_s20253431946000_e20253431947000_c20253431946300.nc
>>>>> ^C20251209T194834.112035Z notifyme[3157216]
>>>>> notifyme.c:cleanup:73               NOTE  exiting
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Dec 8, 2025 at 5:48 PM Sebenste, Gilbert <sebensteg@xxxxxxx>
>>>>> wrote:
>>>>>
>>>>>> Hello everyone,
>>>>>>
>>>>>> It has been over a week (possibly more, due to a lack of weather) since
>>>>>> we last saw any GLM level 3 products. We ARE seeing lightning strike
>>>>>> data just fine (level 2 data):
>>>>>>
>>>>>> OR_GLM-L2-LCFA_G19_s20253242246000_e20253242246200_c20253242246218.nc
>>>>>>
>>>>>> However, we are NOT seeing:
>>>>>>
>>>>>> GOES GLM Total Optical Energy
>>>>>> GOES GLM Minimum Flash Area
>>>>>> GOES GLM Flash Extent Density
>>>>>>
>>>>>> A notifyme shows no level 3 products being received on idd or iddb:
>>>>>>
>>>>>> localserver % notifyme -v -h idd.unidata.ucar.edu -p
>>>>>> GLMISatSS/Level3 -o 3600 -O
>>>>>>
>>>>>> 20251120T225605.667426Z notifyme[3709300]
>>>>>> notifyme.c:main:592                 NOTE  Starting Up:
>>>>>> idd.unidata.ucar.edu: 20251120215605.667336 TS_ENDT {{ANY,
>>>>>> "GLMISatSS/Level3"}}
>>>>>>
>>>>>> 20251120T225605.668151Z notifyme[3709300]
>>>>>> inetutil.c:addrbyhost:964           INFO  Resolving
>>>>>> idd.unidata.ucar.edu to 128.117.135.3 took 0.000633 seconds
>>>>>>
>>>>>> 20251120T225605.696393Z notifyme[3709300]
>>>>>> notifyme.c:notifyme6:428            NOTE  OK
>>>>>>
>>>>>> 20251120T225635.793049Z notifyme[3709300]
>>>>>> notifyme.c:sendNotifyMe:346         ERROR NOTIFYME reclassified: "" -> ""
>>>>>>
>>>>>> 20251120T225635.793089Z notifyme[3709300]
>>>>>> notifyme.c:executeService:396       ERROR Connection to upstream LDM
>>>>>> timed-out
>>>>>>
>>>>>>
>>>>>> Is anyone seeing this missing data?
>>>>>>
>>>>>> Thank you for any help!
>>>>>>
>>>>>> Gilbert Sebenste
>>>>>>
>>>>>> Meteorology Support Analyst
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> _________________________________________________________
>>>>>> NOTE: All exchanges posted to NSF Unidata maintained email lists are
>>>>>> made publicly available through the web. Users who post to any of the
>>>>>> lists we maintain are reminded to remove any personal information that
>>>>>> they do not want to be made public.
>>>>>>
>>>>>> NSF Unidata ldm-users Mailing List
>>>>>> (ldm-users@xxxxxxxxxxxxxxxx)
>>>>>> For list information, to unsubscribe, or change your membership
>>>>>> options,
>>>>>> visit: https://mailinglists.unidata.ucar.edu/listinfo/ldm-users/
>>>>>>
>>>
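For anyone wanting to wire the script below into LDM, a pqact.conf sketch along these lines could file each incoming L2 product and kick off the aggregator. Everything here is site-specific: the pattern, the filing path, and the script name are hypothetical, and the fields must be separated by literal tabs.

```
# Hypothetical pqact.conf entries -- adapt pattern, paths, and script name
NIMAGE	^.*(OR_GLM-L2-GLMF-M6_G19_s[0-9]*_e[0-9]*_c[0-9]*\.nc)
	FILE	-close	/home/localdata/satellite/glm/GLMISatSS/level2/\1
NIMAGE	^.*OR_GLM-L2-GLMF-M6_G19_.*\.nc
	EXEC	/home/ldm/bin/glm_l2_to_l3.py
```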
#!/home/ldm/.conda/envs/dev/bin/python
"""Minimal GLM L2→L3 aggregation (5 min L3 from 5× 1-min L2)."""

import pathlib
import re
from datetime import datetime, timedelta
import xarray as xr
import numpy as np

L2_DIR = pathlib.Path("/home/localdata/satellite/glm/GLMISatSS/level2")
L3_DIR = pathlib.Path("/home/localdata/satellite/glm/GLMISatSS/level3")
L3_DIR.mkdir(parents=True, exist_ok=True)


def extract_time(path):
    """Extract the start time from an OR_GLM-L2 filename.

    The s-token carries 14 digits (YYYYJJJHHMMSS plus a tenths-of-second
    digit); only the first 13 are taken so they match %Y%j%H%M%S.
    """
    m = re.search(r"s(\d{13})", path.stem)
    if not m:
        raise ValueError(f"Cannot parse time from {path.name}")
    return datetime.strptime(m.group(1), "%Y%j%H%M%S")


def main():
    files = sorted(L2_DIR.glob("OR_GLM-L2-GLMF*.nc"))
    if not files:
        print("No L2 files found")
        return

    times = [extract_time(f) for f in files]

    # Process most recent complete 5-file window
    for i in range(len(times) - 1, -1, -1):
        t_end = times[i]
        t_start = t_end - timedelta(minutes=4)
        window = [f for f, t in zip(files, times) if t_start <= t <= t_end]
        if len(window) == 5:
            aggregate_and_write(window)
            break


def aggregate_and_write(window):
    """Aggregate 5 L2 files into 1 L3 file."""
    print(f"Aggregating {len(window)} files...")

    # Load first file as template
    ds_template = xr.open_dataset(window[0], decode_cf=False)

    # Initialize accumulators
    fed_sum = None
    toe_sum = None
    mfa_min = None

    # Aggregate in packed (decode_cf=False) integer space; this assumes all
    # five files share the same scale_factor and a zero add_offset, so the
    # summed integers still decode correctly under the copied attributes.
    for p in window:
        ds = xr.open_dataset(p, decode_cf=False)

        # FED: sum (counts up)
        if fed_sum is None:
            fed_sum = ds["Flash_extent_density"].values.astype(np.int32)
        else:
            fed_sum += ds["Flash_extent_density"].values.astype(np.int32)

        # TOE: sum (cumulative energy)
        if toe_sum is None:
            toe_sum = ds["Total_Optical_energy"].values.astype(np.int32)
        else:
            toe_sum += ds["Total_Optical_energy"].values.astype(np.int32)

        # MFA: keep the per-cell minimum.  Note this compares raw packed
        # integers (decode_cf=False); if the _FillValue sorts below valid
        # data it will win the minimum, so mask first if that matters.
        if mfa_min is None:
            mfa_min = ds["Minimum_flash_area"].values.astype(np.int32)
        else:
            mfa_min = np.minimum(
                mfa_min, ds["Minimum_flash_area"].values.astype(np.int32)
            )

        ds.close()

    # Convert back to int16
    fed_out = np.clip(fed_sum, 0, 32767).astype(np.int16)
    toe_out = np.clip(toe_sum, 0, 32767).astype(np.int16)
    mfa_out = np.clip(mfa_min, 0, 32767).astype(np.int16)

    print(f"FED: {fed_out.min()}–{fed_out.max()}, 
nonzero={np.count_nonzero(fed_out)}")
    print(f"TOE: {toe_out.min()}–{toe_out.max()}")
    print(f"MFA: {mfa_out.min()}–{mfa_out.max()}")

    # Get center time and attributes
    ds_mid = xr.open_dataset(window[len(window) // 2], decode_cf=False)
    time_val = ds_mid["time"].values
    time_attrs = dict(ds_mid["time"].attrs)
    ds_mid.close()

    # Extract attributes only for the 3 vars we keep
    fed_attrs = dict(ds_template["Flash_extent_density"].attrs)
    toe_attrs = dict(ds_template["Total_Optical_energy"].attrs)
    mfa_attrs = dict(ds_template["Minimum_flash_area"].attrs)
    proj_attrs = dict(ds_template["goes_imager_projection"].attrs)
    y_attrs = dict(ds_template.coords["y"].attrs)
    x_attrs = dict(ds_template.coords["x"].attrs)

    # Build minimal L3 dataset with ONLY the 3 gridded variables
    out = xr.Dataset(
        data_vars={
            "time": ((), time_val, time_attrs),
            "Flash_extent_density": (("y", "x"), fed_out, fed_attrs),
            "Total_Optical_energy": (("y", "x"), toe_out, toe_attrs),
            "Minimum_flash_area": (("y", "x"), mfa_out, mfa_attrs),
            "goes_imager_projection": ((), 0, proj_attrs),
        },
        coords={
            "y": (("y",), ds_template.coords["y"].values, y_attrs),
            "x": (("x",), ds_template.coords["x"].values, x_attrs),
        },
        attrs=ds_template.attrs,
    )

    # Update metadata
    out.attrs["title"] = "GLM L3 Lightning Detection Gridded Product"

    # Build filename (L2→L3)
    base = re.sub(r"_s\d+.*", "", window[0].name)
    base = base.replace("-L2-", "-L3-")
    s_tok = re.search(r"s(\d+)", window[0].name)
    e_tok = re.search(r"e(\d+)", window[-1].name)
    c_tok = re.search(r"c(\d+)", window[len(window) // 2].name)
    s = s_tok.group(1) if s_tok else ""
    e = e_tok.group(1) if e_tok else ""
    c = c_tok.group(1) if c_tok else ""
    filename = f"{base}_s{s}_e{e}_c{c}.nc"

    # Write
    out_path = L3_DIR / filename
    out.to_netcdf(out_path)
    print(f"Wrote {out_path}\n")


if __name__ == "__main__":
    main()
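As a quick sanity check on the naming logic at the bottom, here is the same rewrite factored into a standalone function (same regexes as the script, just isolated so the output is easy to eyeball):

```python
import re


def l3_filename(first, mid, last):
    """Build the L3 output name from the first, middle, and last
    L2 filenames in a five-file window (same logic as the script)."""
    base = re.sub(r"_s\d+.*", "", first).replace("-L2-", "-L3-")
    s = re.search(r"s(\d+)", first).group(1)
    e = re.search(r"e(\d+)", last).group(1)
    c = re.search(r"c(\d+)", mid).group(1)
    return f"{base}_s{s}_e{e}_c{c}.nc"


print(l3_filename(
    "OR_GLM-L2-GLMF-M6_G19_s20253431942000_e20253431943000_c20253431942300.nc",
    "OR_GLM-L2-GLMF-M6_G19_s20253431944000_e20253431945000_c20253431944300.nc",
    "OR_GLM-L2-GLMF-M6_G19_s20253431946000_e20253431947000_c20253431946300.nc",
))
# → OR_GLM-L3-GLMF-M6_G19_s20253431942000_e20253431947000_c20253431944300.nc
```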

Attachment: lvl2mcx.jpeg
Description: JPEG image

Attachment: lvl2cartopy.png
Description: PNG image

Attachment: lvl3cartopy.png
Description: PNG image

Attachment: lvl3mcx.jpeg
Description: JPEG image