Jim:
No offense taken.
very respectfully,
randy
---------- Original Message ----------------------------------
From: Jim Biard <jim.biard@xxxxxxxx>
Date: Thu, 23 Aug 2012 09:08:29 -0400
>Hi.
>
>I want to make sure that people don't think I feel that anything about GOES-R
>or its data products is sloppy.
>
>Grace and peace,
>
>Jim
>
>Jim Biard
>Research Scholar
>Cooperative Institute for Climate and Satellites
>Remote Sensing and Applications Division
>National Climatic Data Center
>151 Patton Ave, Asheville, NC 28801-5001
>
>jim.biard@xxxxxxxx
>828-271-4900
>
>On Aug 22, 2012, at 3:09 PM, Jim Biard wrote:
>
>> Interesting. According to the standard, that makes them Level 3 data
>> products. There seems to be a lot of sloppy use of the level definitions
>> out there.
>>
>> Jim Biard
>> Research Scholar
>> Cooperative Institute for Climate and Satellites
>> Remote Sensing and Applications Division
>> National Climatic Data Center
>> 151 Patton Ave, Asheville, NC 28801-5001
>>
>> jim.biard@xxxxxxxx
>> 828-271-4900
>>
>> On Aug 22, 2012, at 2:58 PM, Randy Horne wrote:
>>
>>> Note that in GOES-R (NOAA's next generation geostationary satellite
>>> system), the level 1b products are calibrated and resampled to a fixed
>>> grid, and a CF-compliant grid_mapping has been established to geolocate to
>>> lat/lon. Meteosat 2nd generation is very similar.
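For illustration only, here is a minimal netCDF4-python sketch of a
geostationary fixed-grid mapping of the kind Randy describes. The variable
names, dimension sizes, and parameter values are assumptions made for the
sketch, not the actual GOES-R product definitions, and the "geostationary"
grid_mapping_name should be checked against the CF version in use.

# Sketch: a geostationary grid_mapping container variable plus fixed-grid
# scan-angle coordinates. All names and values are illustrative only.
from netCDF4 import Dataset

nc = Dataset("l1b_fixed_grid_sketch.nc", "w")
nc.createDimension("y", 500)
nc.createDimension("x", 500)

proj = nc.createVariable("goes_imager_projection", "i4")   # scalar container
proj.grid_mapping_name = "geostationary"
proj.perspective_point_height = 35786023.0    # metres above the ellipsoid
proj.longitude_of_projection_origin = -75.0   # illustrative sub-satellite longitude
proj.latitude_of_projection_origin = 0.0
proj.sweep_angle_axis = "x"
proj.semi_major_axis = 6378137.0
proj.semi_minor_axis = 6356752.31414

x = nc.createVariable("x", "f4", ("x",))
x.standard_name = "projection_x_coordinate"
x.units = "rad"                               # fixed-grid scan angle
y = nc.createVariable("y", "f4", ("y",))
y.standard_name = "projection_y_coordinate"
y.units = "rad"

rad = nc.createVariable("radiance", "f4", ("y", "x"))
rad.grid_mapping = "goes_imager_projection"   # ties the data to the projection
nc.close()

A generic CF reader can then reconstruct per-pixel lat/lon from the x/y scan
angles and the grid_mapping parameters without instrument-specific code.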
>>>
>>> very respectfully,
>>>
>>> randy
>>>
>>>
>>> ---------- Original Message ----------------------------------
>>> From: Jim Biard <jim.biard@xxxxxxxx>
>>> Date: Wed, 22 Aug 2012 14:40:21 -0400
>>>
>>>> Hi.
>>>>
>>>> I'm currently building such a product for the NPP VIIRS sensor. It is,
>>>> strictly speaking, a NOAA Level 1b data set. I am storing the satellite
>>>> position, velocity, and attitude data with the measurement data, but for
>>>> various reasons (including space considerations) I am storing the
>>>> parameters needed for calibration and geolocation algorithms in a separate
>>>> file.
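As a rough picture of that layout, here is a minimal netCDF4-python sketch
with per-scan ephemeris and attitude stored alongside the measurements. Every
variable and dimension name below is hypothetical, not the actual VIIRS
product structure, and the separate calibration/geolocation parameter file is
not shown.

# Hypothetical sketch: measurement data plus per-scan position, velocity,
# and attitude in one file. Names and sizes are invented for illustration.
from netCDF4 import Dataset

nc = Dataset("l1b_with_ephemeris_sketch.nc", "w")
nc.createDimension("scan", 48)
nc.createDimension("pixel", 3200)
nc.createDimension("xyz", 3)

rad = nc.createVariable("radiance", "f4", ("scan", "pixel"))

pos = nc.createVariable("sc_position", "f8", ("scan", "xyz"))
pos.long_name = "spacecraft ECEF position at scan mid-time"
pos.units = "m"

vel = nc.createVariable("sc_velocity", "f8", ("scan", "xyz"))
vel.long_name = "spacecraft ECEF velocity at scan mid-time"
vel.units = "m s-1"

att = nc.createVariable("sc_attitude", "f8", ("scan", "xyz"))
att.long_name = "spacecraft roll, pitch, yaw at scan mid-time"
att.units = "rad"

nc.close()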
>>>>
>>>> Just for the record, the definitions of the various NOAA data product
>>>> levels, as defined in the Federal Geographic Data Committee (FGDC) Content
>>>> Standard for Digital Geospatial Metadata (CSDGM): Extensions for Remote
>>>> Sensing Metadata (FGDC-STD-012-2002) are:
>>>>
>>>> Level 0 data products are unprocessed telemetry data as received from the
>>>> observing platform
>>>> excluding communications artifacts introduced by the ground system.
>>>>
>>>> Level 1a data products are telemetry data that have been extracted but not
>>>> decommutated from Level
>>>> 0 and formatted into time-sequenced datasets for easier processing. The
>>>> Level 1a formats are NOAA's
>>>> internal formats and are only used for NOAA processing. They only exist
>>>> briefly for the purpose of
>>>> creating the Level 1b datasets. Levels 2-4 are the same as NASA levels 2-4.
>>>>
>>>> Level 1b data products are discrete, instrument-specific datasets derived
>>>> from Level 1a containing
>>>> unprocessed data at full resolution, time-referenced, and annotated with
>>>> ancillary information
>>>> including data quality indicators, calibration coefficients and
>>>> georeferencing parameters.
>>>>
>>>> Level 2 data products are derived geophysical variables at the same
>>>> resolution and locations as the
>>>> Level 1 source data.
>>>>
>>>> Notice that Level 1b products have been "decommutated" (split up into
>>>> variables), but no calibrations or geolocations have actually been done.
>>>>
>>>> NASA's definitions (which are also defined in the standard) are somewhat
>>>> different. The NASA equivalent for such a data product is Level 1A.
>>>>
>>>> Grace and peace,
>>>>
>>>> Jim
>>>>
>>>> Jim Biard
>>>> Research Scholar
>>>> Cooperative Institute for Climate and Satellites
>>>> Remote Sensing and Applications Division
>>>> National Climatic Data Center
>>>> 151 Patton Ave, Asheville, NC 28801-5001
>>>>
>>>> jim.biard@xxxxxxxx
>>>> 828-271-4900
>>>>
>>>> On Aug 22, 2012, at 2:01 PM, David Santek wrote:
>>>>
>>>>> Yes, when you have an L1 file, lat/lon by pixel is necessary for the
>>>>> reasons you and Jim state.
>>>>>
>>>>> But, will the CF conventions also be applied to a more raw format (Level
>>>>> 0) where orbit and scanning information would be useful to carry along?
>>>>>
>>>>> Dave
>>>>>
>>>>> On 8/22/12 11:20 AM, Armstrong, Edward M (388M) wrote:
>>>>>> Hi,
>>>>>>
>>>>>> I just wanted to add that from the perspective of a satellite data
>>>>>> center where I work, our experience is that the user generally wants to
>>>>>> start working with the data as quickly as possible. Thus, it's best to
>>>>>> have an explicit lon/lat for every pixel in L1/L2 data or a very
>>>>>> simple way to interpolate it.
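A minimal netCDF4-python sketch of the "explicit lon/lat for every pixel"
approach, using 2-D auxiliary coordinates; the names, sizes, and example
variable are illustrative only.

# Sketch: 2-D latitude/longitude auxiliary coordinates attached to a swath
# variable via the CF "coordinates" attribute.
from netCDF4 import Dataset

nc = Dataset("l2_swath_sketch.nc", "w")
nc.createDimension("along_track", 1000)
nc.createDimension("across_track", 1500)

lat = nc.createVariable("lat", "f4", ("along_track", "across_track"))
lat.standard_name = "latitude"
lat.units = "degrees_north"

lon = nc.createVariable("lon", "f4", ("along_track", "across_track"))
lon.standard_name = "longitude"
lon.units = "degrees_east"

sst = nc.createVariable("sea_surface_temperature", "f4",
                        ("along_track", "across_track"))
sst.standard_name = "sea_surface_temperature"
sst.units = "K"
sst.coordinates = "lat lon"   # per-pixel geolocation, no projection required

nc.close()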
>>>>>>
>>>>>> I think specifying scanning geometry in CF is overly complex and will
>>>>>> probably not be used very much (and will confuse some folks). I do agree
>>>>>> that these characteristics must be captured "upstream" in other metadata
>>>>>> formats for data/algorithm provenance purposes.
>>>>>>
>>>>>>
>>>>>> On Aug 22, 2012, at 7:41 AM, Jim Biard wrote:
>>>>>>
>>>>>>> Hi.
>>>>>>>
>>>>>>> From my experience, the algorithms used to determine the ray vector for
>>>>>>> each pixel are quite complicated and different for each satellite. I
>>>>>>> doubt there is a useful way to encode the necessary information in a
>>>>>>> standard format. In addition, for LEO satellites (non-geostationary),
>>>>>>> the TLE is not sufficient for the accuracy needed. I know that is the
>>>>>>> case for the NPP VIIRS and CrIS instruments, and it's also true for the
>>>>>>> commercial imaging satellites. You need a corrected time series of
>>>>>>> satellite position, velocity, and attitude covering the path over the
>>>>>>> time the image was acquired. And you then need a significant amount of
>>>>>>> information about the sensor geometry and position on the satellite
>>>>>>> frame relative to the satellite center of motion. You may want to
>>>>>>> store all of the needed information in the file, but you won't be using
>>>>>>> generic software to turn it into geolocations. (Well, you might be
>>>>>>> able to, but that will require developing an abstract sensor model and
>>>>>>> a set of parameters to that model that are sufficient to handle any
>>>>>>> satellite. This is akin to the issue with geographic coordinate systems.
>>>>>>> There are so many different ways that these have been defined that there
>>>>>>> is no single model that fully handles every case.)
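To make the division of labour concrete, here is a small numpy sketch of the
generic final step only: intersecting an already-computed ECEF pointing
vector with the WGS-84 ellipsoid. Everything before that step (deriving the
per-pixel vector from the position, velocity, and attitude time series plus
the sensor mounting geometry) is the instrument-specific part Jim describes
and is not shown.

# Sketch: nearest intersection of a per-pixel pointing ray with WGS-84.
import numpy as np

A_WGS84 = 6378137.0          # semi-major axis, metres
B_WGS84 = 6356752.314245     # semi-minor axis, metres

def ray_to_latlon(sat_ecef, ray_ecef):
    """Return (geodetic lat, lon) in degrees where the ray hits the
    ellipsoid, or None if it misses.

    sat_ecef : satellite position in ECEF metres, shape (3,)
    ray_ecef : unit pointing vector in ECEF, shape (3,)
    """
    scale = np.array([1.0 / A_WGS84, 1.0 / A_WGS84, 1.0 / B_WGS84])
    s = sat_ecef * scale
    u = ray_ecef * scale
    a = u @ u
    b = 2.0 * (s @ u)
    c = s @ s - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                       # ray misses the ellipsoid
    t = (-b - np.sqrt(disc)) / (2.0 * a)  # nearer of the two intersections
    x, y, z = sat_ecef + t * ray_ecef
    lon = np.degrees(np.arctan2(y, x))
    p = np.hypot(x, y)
    # exact geodetic latitude for a point on the ellipsoid surface
    lat = np.degrees(np.arctan2(z * A_WGS84**2, p * B_WGS84**2))
    return lat, lon

# Example: a nadir-pointing ray from a geostationary position over 75W
sat = 42164000.0 * np.array([np.cos(np.radians(-75.0)),
                             np.sin(np.radians(-75.0)), 0.0])
nadir = -sat / np.linalg.norm(sat)
print(ray_to_latlon(sat, nadir))          # approximately (0.0, -75.0)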
>>>>>>>
>>>>>>> Grace and peace,
>>>>>>>
>>>>>>> Jim
>>>>>>>
>>>>>>> Jim Biard
>>>>>>> Research Scholar
>>>>>>> Cooperative Institute for Climate and Satellites
>>>>>>> Remote Sensing and Applications Division
>>>>>>> National Climatic Data Center
>>>>>>> 151 Patton Ave, Asheville, NC 28801-5001
>>>>>>>
>>>>>>> jim.biard@xxxxxxxx
>>>>>>> 828-271-4900
>>>>>>>
>>>>>>> On Aug 22, 2012, at 10:05 AM, David Santek wrote:
>>>>>>>
>>>>>>>> Hello Ghansham,
>>>>>>>>
>>>>>>>> Yes, from an Earth perspective (latitude, longitude) the scanning
>>>>>>>> geometry is complicated. But, from an orbit and scanning perspective,
>>>>>>>> most polar satellites behave about the same.
>>>>>>>>
>>>>>>>> Polar Orbits: They have high inclinations to cover the poles or low
>>>>>>>> inclination for the tropics.
>>>>>>>> Scanning: cross track for most instruments; conical for microwave [to
>>>>>>>> keep incidence angle constant].
>>>>>>>>
>>>>>>>> So, the CF specification will need to include orbit information [the
>>>>>>>> Two Line Elements (TLE) define this] and scanning information
>>>>>>>> [incidence angle, sweep angle, etc.] so that the latitude/longitude
>>>>>>>> can be determined for each pixel.
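As a very rough illustration of the orbit half of that, here is a
self-contained sketch of the sub-satellite track for an idealized circular
orbit (spherical Earth, no SGP4). A real implementation would propagate the
actual TLE (for example with the python-sgp4 or skyfield libraries) and then
apply the per-pixel scan and attitude geometry, which is not attempted here.

# Sketch: sub-satellite (lat, lon) of an idealized circular orbit.
import numpy as np

R_E = 6371.0e3            # mean Earth radius, m (spherical Earth for brevity)
MU = 3.986004418e14       # Earth's gravitational parameter, m^3 s^-2
OMEGA_E = 7.2921159e-5    # Earth rotation rate, rad/s

def subsatellite_track(altitude_m, inclination_deg, t_seconds):
    """Sub-satellite (lat, lon) in degrees at the given times (seconds)."""
    t = np.asarray(t_seconds, dtype=float)
    r = R_E + altitude_m
    n = np.sqrt(MU / r**3)                # mean motion, rad/s
    u = n * t                             # argument of latitude
    inc = np.radians(inclination_deg)
    # position on the orbit, rotated by the inclination (inertial frame)
    x = r * np.cos(u)
    y = r * np.sin(u) * np.cos(inc)
    z = r * np.sin(u) * np.sin(inc)
    # subtract Earth rotation to get an Earth-fixed longitude
    lon = np.degrees(np.arctan2(y, x) - OMEGA_E * t)
    lat = np.degrees(np.arcsin(z / r))
    return lat, (lon + 180.0) % 360.0 - 180.0

# A low-inclination orbit like the one discussed elsewhere in this thread
# never reaches high latitudes:
lat, lon = subsatellite_track(800e3, 20.0, np.arange(0.0, 6000.0, 600.0))
print(lat.max())                          # stays below about 20 degrees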
>>>>>>>>
>>>>>>>> Dave
>>>>>>>>
>>>>>>>> On 8/22/12 8:52 AM, ghansham sangar wrote:
>>>>>>>>> Hello Sir,
>>>>>>>>> Hope you are doing fine.
>>>>>>>>>
>>>>>>>>> I understand your point about the frame of reference. I was also
>>>>>>>>> confused when I saw that dataset for the first time. But later, in a
>>>>>>>>> conversation with Tom Rink Sir, this is what came out (as mentioned
>>>>>>>>> in an earlier mail too):
>>>>>>>>> The orbit has an inclination as low as 20 deg (no coverage of the
>>>>>>>>> poles). The reason is to improve the temporal resolution over the
>>>>>>>>> tropics. The sensor scans across track with respect to this
>>>>>>>>> low-inclination track, and that is why the data is also packed in
>>>>>>>>> that manner (up/down).
>>>>>>>>> The best thing I can do is post a snapshot of one of the parameters,
>>>>>>>>> generated from toolsUI and displayed as an image, to give a better
>>>>>>>>> sense of what the data looks like. I know it's a pretty tough
>>>>>>>>> scanning geometry to understand.
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> Ghansham
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 22, 2012 at 2:35 AM, Tom Whittaker <whittaker@xxxxxxxx>
>>>>>>>>> wrote:
>>>>>>>>> Hello Ghansham...
>>>>>>>>>
>>>>>>>>> I hope you are well.
>>>>>>>>>
>>>>>>>>> I believe the "scan direction" (either "up/down" or "left/right") is a
>>>>>>>>> matter of perspective -- if the frame of reference is on the
>>>>>>>>> satellite, looking "forward" along the flight path, then I would be
>>>>>>>>> more inclined to say "left/right", as "up/down" would refer to some
>>>>>>>>> vertical scanning -- from my frame of reference on the satellite.
>>>>>>>>>
>>>>>>>>> Regarding CF conventions: there are no conventions for dealing with
>>>>>>>>> this. There have been discussions in the past dealing with "swath
>>>>>>>>> data", and you might have a Google of that (plus 'netcdf') and see
>>>>>>>>> what others have been thinking about.
>>>>>>>>>
>>>>>>>>> There is also at least one reference to some data already being
>>>>>>>>> written to hdf files, which might prove of interest. The sad fact is
>>>>>>>>> that the satellite community for the longest time did not embrace
>>>>>>>>> NetCDF, and so we must play "catch-up" with the people who have
>>>>>>>>> defined conventions for model/gridded data and in-situ data.
>>>>>>>>>
>>>>>>>>> My take is that some common characteristics (like 'band' and
>>>>>>>>> 'central_wavelength' (or _wavenumber)) should be defined using
>>>>>>>>> conventions and "standard_names", but that characteristics of
>>>>>>>>> particular platforms must, by necessity, be defined for those
>>>>>>>>> platforms. I also think that the use of the "standard_names" will go
>>>>>>>>> a long way toward helping application developers in writing file
>>>>>>>>> readers that can understand some of the basic structures of the data,
>>>>>>>>> while at the same time providing end users an opportunity to write
>>>>>>>>> specialized interfaces that meet their particular research or
>>>>>>>>> operational needs.
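A small netCDF4-python sketch of that kind of band metadata follows. The
standard_name string is shown as an example of the pattern and should be
checked against the current CF standard name table rather than taken as
authoritative; the wavelength values are illustrative only.

# Sketch: a band coordinate plus a per-band central wavelength auxiliary
# coordinate that generic readers can discover through its attributes.
from netCDF4 import Dataset

nc = Dataset("band_metadata_sketch.nc", "w")
nc.createDimension("band", 3)
nc.createDimension("y", 100)
nc.createDimension("x", 100)

band = nc.createVariable("band", "i4", ("band",))
band.long_name = "sensor band number"
band[:] = [1, 2, 3]

wl = nc.createVariable("central_wavelength", "f4", ("band",))
wl.standard_name = "sensor_band_central_radiation_wavelength"  # check CF table
wl.units = "um"
wl[:] = [0.47, 0.64, 0.86]                # illustrative values only

rad = nc.createVariable("radiance", "f4", ("band", "y", "x"))
rad.coordinates = "central_wavelength"    # auxiliary per-band coordinate

nc.close()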
>>>>>>>>>
>>>>>>>>> Best wishes,
>>>>>>>>>
>>>>>>>>> tom
>>>>>>>>>
>>>>>>>>
>>>>>
>>>>> _______________________________________________
>>>>> cf-satellite mailing list
>>>>> cf-satellite@xxxxxxxxxxxxxxxx
>>>>> For list information or to unsubscribe, visit:
>>>>> http://www.unidata.ucar.edu/mailing_lists/