Commits on Source (4)
ref-names: HEAD -> master, tag: v0.19.0
\ No newline at end of file
ref-names: HEAD -> master, tag: v0.19.1
\ No newline at end of file
......@@ -3,7 +3,7 @@ env:
global:
# Set defaults to avoid repeating in most cases
- PYTHON_VERSION=$TRAVIS_PYTHON_VERSION
- NUMPY_VERSION=stable
- NUMPY_VERSION=1.17
- MAIN_CMD='python setup.py'
- CONDA_DEPENDENCIES='xarray!=0.13.0 dask distributed toolz Cython sphinx cartopy pillow matplotlib scipy pyyaml pyproj pyresample coveralls coverage codecov behave netcdf4 h5py h5netcdf gdal rasterio imageio pyhdf mock libtiff geoviews zarr six python-eccodes'
- PIP_DEPENDENCIES='trollsift trollimage pyspectral pyorbital libtiff'
......
......@@ -27,8 +27,8 @@ The following people have made contributions to this project:
- [David Hoese (djhoese)](https://github.com/djhoese)
- [Marc Honnorat (honnorat)](https://github.com/honnorat)
- [Mikhail Itkin (mitkin)](https://github.com/mitkin)
- [JohannesSMHI (JohannesSMHI)](https://github.com/JohannesSMHI)
- [Tommy Jasmin (tommyjasmin)](https://github.com/tommyjasmin)
- [Johannes Johansson (JohannesSMHI)](https://github.com/JohannesSMHI)
- [Sauli Joro (sjoro)](https://github.com/sjoro)
- [Janne Kotro (jkotro)](https://github.com/jkotro)
- [Ralph Kuehn (ralphk11)](https://github.com/ralphk11)
......
## Version 0.19.1 (2020/01/10)
### Issues Closed
* [Issue 1030](https://github.com/pytroll/satpy/issues/1030) - Geostationary padding results in wrong area definition for AHI mesoscale sectors. ([PR 1037](https://github.com/pytroll/satpy/pull/1037))
* [Issue 1029](https://github.com/pytroll/satpy/issues/1029) - NetCDF (CF) writer doesn't include semi_minor_axis/semi_major_axis for new versions of pyproj ([PR 1040](https://github.com/pytroll/satpy/pull/1040))
* [Issue 1023](https://github.com/pytroll/satpy/issues/1023) - RTD "Edit on Github" broken in "latest" documentation
In this release 3 issues were closed.
### Pull Requests Merged
#### Bugs fixed
* [PR 1040](https://github.com/pytroll/satpy/pull/1040) - Fix geostationary axis handling in CF writer ([1029](https://github.com/pytroll/satpy/issues/1029))
* [PR 1037](https://github.com/pytroll/satpy/pull/1037) - Fix segment handling for non-FLDK sectors in the AHI HSD reader ([1030](https://github.com/pytroll/satpy/issues/1030))
* [PR 1036](https://github.com/pytroll/satpy/pull/1036) - Fix ABI L1b/L2 time dimension causing issues with newer xarray
* [PR 1034](https://github.com/pytroll/satpy/pull/1034) - Fix AMI geolocation being off by 1 pixel
* [PR 1033](https://github.com/pytroll/satpy/pull/1033) - Fix avhrr_l1b_aapp reader not including standard_name metadata
* [PR 1031](https://github.com/pytroll/satpy/pull/1031) - Fix tropomi_l2 reader not using y and x dimension names
#### Features added
* [PR 1035](https://github.com/pytroll/satpy/pull/1035) - Add additional Sentinel 3 OLCI 2 datasets
* [PR 1027](https://github.com/pytroll/satpy/pull/1027) - Update SCMI writer and VIIRS EDR Flood reader to work for pre-tiled data
#### Documentation changes
* [PR 1032](https://github.com/pytroll/satpy/pull/1032) - Add documentation about y and x dimensions for custom readers
In this release 9 pull requests were closed.
## Version 0.19.0 (2019/12/30)
### Issues Closed
......
......@@ -11,7 +11,7 @@ environment:
- PYTHON: "C:\\Python37_64"
PYTHON_VERSION: "3.7"
PYTHON_ARCH: "64"
NUMPY_VERSION: "stable"
NUMPY_VERSION: "1.16"
install:
- "git clone --depth 1 git://github.com/astropy/ci-helpers.git"
......
satpy (0.19.1-1) unstable; urgency=medium
* New upstream release.
-- Antonio Valentino <antonio.valentino@tiscali.it> Mon, 13 Jan 2020 21:31:18 +0000
satpy (0.19.0-1) unstable; urgency=medium
* New upstream release.
......
......@@ -466,6 +466,15 @@ needs to implement a few methods:
successful, containing the data and :ref:`metadata <dataset_metadata>` of the
loaded dataset, or return None if the loading was unsuccessful.
The DataArray should at least have a ``y`` dimension. For data covering
a 2D region on the Earth, there should be at least a ``y`` and an ``x``
dimension. This applies even to
non-gridded data like that of a polar-orbiting satellite instrument. The
latitude dimension is typically named ``y`` and the longitude dimension
``x``. This may require renaming dimensions from the file; see the
:meth:`xarray.DataArray.rename` method for more information and its use
in the example below.
- the ``get_area_def`` method, that takes as single argument the
:class:`~satpy.dataset.DatasetID` for which we want
the area. It should return a :class:`~pyresample.geometry.AreaDefinition`
......
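A minimal sketch of such a rename inside a custom reader's ``get_dataset``,
assuming hypothetical native dimension names from the file::

    import numpy as np
    import xarray as xr

    # Hypothetical swath data with instrument-specific dimension names.
    data = xr.DataArray(np.zeros((128, 409)),
                        dims=('along_track', 'across_track'))
    # Rename to the y/x names expected for 2D non-gridded data.
    data = data.rename({'along_track': 'y', 'across_track': 'x'})
    assert data.dims == ('y', 'x')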
......@@ -287,82 +287,66 @@ datasets:
file_types:
hsd_b01:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B01_{area}_R10_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B01_{area}_R10_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b02:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B02_{area}_R10_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B02_{area}_R10_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b03:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B03_{area}_R05_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B03_{area}_R05_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b04:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B04_{area}_R10_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B04_{area}_R10_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b05:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B05_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B05_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b06:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B06_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B06_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b07:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B07_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B07_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b08:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B08_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B08_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b09:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B09_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B09_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b10:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B10_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B10_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b11:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B11_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B11_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b12:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B12_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B12_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b13:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B13_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B13_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b14:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B14_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B14_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b15:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B15_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B15_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
hsd_b16:
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler ''
file_reader: !!python/name:satpy.readers.ahi_hsd.AHIHSDFileHandler
file_patterns: ['HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B16_{area}_R20_S{segment:2d}{total_segments:2d}.DAT',
'HS_{platform_shortname}_{start_time:%Y%m%d_%H%M}_B16_{area}_R20_S{segment:2d}{total_segments:2d}.DAT.bz2']
expected_segments: 10
......@@ -92,6 +92,8 @@ datasets:
- longitude
- latitude
file_type: avhrr_aapp_l1b
standard_name: solar_zenith_angle
units: degrees
sensor_zenith_angle:
name: sensor_zenith_angle
......@@ -100,6 +102,8 @@ datasets:
- longitude
- latitude
file_type: avhrr_aapp_l1b
standard_name: sensor_zenith_angle
units: degrees
sun_sensor_azimuth_difference_angle:
name: sun_sensor_azimuth_difference_angle
......@@ -108,18 +112,21 @@ datasets:
- longitude
- latitude
file_type: avhrr_aapp_l1b
units: degrees
latitude:
name: latitude
resolution: 1050
file_type: avhrr_aapp_l1b
standard_name: latitude
units: degrees_north
longitude:
name: longitude
resolution: 1050
file_type: avhrr_aapp_l1b
standard_name: longitude
units: degrees_east
file_types:
avhrr_aapp_l1b:
......
......@@ -16,6 +16,15 @@ file_types:
esa_l2_chl_oc4me:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCI2
file_patterns: ['{mission_id:3s}_OL_2_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/chl_oc4me.nc']
esa_l2_iop_nn:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCI2
file_patterns: ['{mission_id:3s}_OL_2_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/iop_nn.nc']
esa_l2_trsp:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCI2
file_patterns: ['{mission_id:3s}_OL_2_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/trsp.nc']
esa_l2_tsm_nn:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCI2
file_patterns: ['{mission_id:3s}_OL_2_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/tsm_nn.nc']
esa_l2_wqsf:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCI2
file_patterns: ['{mission_id:3s}_OL_2_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/wqsf.nc']
......@@ -339,6 +348,42 @@ datasets:
file_type: esa_l2_chl_nn
nc_key: CHL_NN
iop_nn:
name: iop_nn
sensor: olci
resolution: 300
calibration:
reflectance:
standard_name: cdm_absorption_coefficient
units: "lg(re m-l)"
coordinates: [longitude, latitude]
file_type: esa_l2_iop_nn
nc_key: ADG443_NN
trsp:
name: trsp
sensor: olci
resolution: 300
calibration:
reflectance:
standard_name: diffuse_attenuation_coefficient
units: "lg(re m-l)"
coordinates: [longitude, latitude]
file_type: esa_l2_trsp
nc_key: KD490_M07
tsm_nn:
name: tsm_nn
sensor: olci
resolution: 300
calibration:
reflectance:
standard_name: total_suspended_matter_concentration
units: "lg(re g.m-3)"
coordinates: [longitude, latitude]
file_type: esa_l2_tsm_nn
nc_key: TSM_NN
wqsf:
name: wqsf
sensor: olci
......
......@@ -10,6 +10,7 @@ file_types:
file_patterns:
- 'WATER_VIIRS_Prj_SVI_{platform_shortname}_d{start_time:%Y%m%d_t%H%M%S%f}_e{end_time:%H%M%S%f}_b{orbit:5d}_{source:8s}_{dim0:d}_{dim1:d}_01.hdf'
- 'WATER_VIIRS_Prj_SVI_{platform_shortname}_d{start_time:%Y%m%d_t%H%M%S%f}_e{end_time:%H%M%S%f}_b{orbit:5d}_{source:8s}_{aoi:3s}_{dim0:d}_{dim1:d}_01.hdf'
- 'WATER_COM_VIIRS_Prj_SVI_d{start_time:%Y%m%d}_d{end_time:%Y%m%d}_{dim0:d}_{dim1:d}_{unknown1:2d}_{total_days:3d}day_{tile_num:3d}.hdf'
datasets:
water_detection:
......
......@@ -79,7 +79,7 @@ def get_area_extent(pdict):
# count starts at 1
cols = 1 - 0.5
if (pdict['scandir'] == 'S2N'):
if pdict['scandir'] == 'S2N':
lines = 0.5 - 1
scanmult = -1
else:
......
......@@ -56,14 +56,17 @@ PLATFORM_NAMES = {4: 'NOAA-15',
def create_xarray(arr):
"""Create xarray DataArray from numpy array."""
res = da.from_array(arr, chunks=(CHUNK_SIZE, CHUNK_SIZE))
res = xr.DataArray(res, dims=['y', 'x'])
return res
class AVHRRAAPPL1BFile(BaseFileHandler):
"""Reader for AVHRR L1B files created from the AAPP software."""
def __init__(self, filename, filename_info, filetype_info):
"""Initialize object information by reading the input file."""
super(AVHRRAAPPL1BFile, self).__init__(filename, filename_info,
filetype_info)
self.channels = {i: None for i in AVHRR_CHANNEL_NAMES}
......@@ -89,23 +92,20 @@ class AVHRRAAPPL1BFile(BaseFileHandler):
@property
def start_time(self):
"""Get the time of the first observation."""
return datetime(self._data['scnlinyr'][0], 1, 1) + timedelta(
days=int(self._data['scnlindy'][0]) - 1,
milliseconds=int(self._data['scnlintime'][0]))
@property
def end_time(self):
"""Get the time of the final observation."""
return datetime(self._data['scnlinyr'][-1], 1, 1) + timedelta(
days=int(self._data['scnlindy'][-1]) - 1,
milliseconds=int(self._data['scnlintime'][-1]))
def shape(self):
# return self._data.shape
return self._shape
def get_dataset(self, key, info):
"""Get a dataset from the file."""
if key.name in CHANNEL_NAMES:
dataset = self.calibrate(key)
elif key.name in ['longitude', 'latitude']:
......@@ -128,6 +128,9 @@ class AVHRRAAPPL1BFile(BaseFileHandler):
dataset.attrs.update({'platform_name': self.platform_name,
'sensor': self.sensor})
dataset.attrs.update(key.to_dict())
for meta_key in ('standard_name', 'units'):
if meta_key in info:
dataset.attrs.setdefault(meta_key, info[meta_key])
if not self._shape:
self._shape = dataset.shape
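For clarity, the ``setdefault`` calls above mean attributes already derived
from the file take precedence and the YAML metadata only fills gaps; a
minimal sketch with hypothetical values:

    # An attribute already set from the file wins; missing keys come from YAML.
    attrs = {'units': 'K'}
    info = {'standard_name': 'toa_brightness_temperature', 'units': 'degrees'}
    for meta_key in ('standard_name', 'units'):
        if meta_key in info:
            attrs.setdefault(meta_key, info[meta_key])
    assert attrs == {'units': 'K',
                     'standard_name': 'toa_brightness_temperature'}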
......@@ -135,8 +138,7 @@ class AVHRRAAPPL1BFile(BaseFileHandler):
return dataset
def read(self):
"""Read the data.
"""
"""Read the data."""
tic = datetime.now()
with open(self.filename, "rb") as fp_:
header = np.memmap(fp_, dtype=_HEADERTYPE, mode="r", shape=(1, ))
......@@ -148,10 +150,7 @@ class AVHRRAAPPL1BFile(BaseFileHandler):
self._data = data
def get_angles(self, angle_id):
"""Get sun-satellite viewing angles"""
tic = datetime.now()
"""Get sun-satellite viewing angles."""
sunz40km = self._data["ang"][:, :, 0] * 1e-2
satz40km = self._data["ang"][:, :, 1] * 1e-2
azidiff40km = self._data["ang"][:, :, 2] * 1e-2
......@@ -177,15 +176,10 @@ class AVHRRAAPPL1BFile(BaseFileHandler):
(rows1km, cols1km), along_track_order, cross_track_order)
self.sunz, self.satz, self.azidiff = satint.interpolate()
logger.debug("Interpolate sun-sat angles: time %s",
str(datetime.now() - tic))
return create_xarray(getattr(self, ANGLES[angle_id]))
def navigate(self):
"""Return the longitudes and latitudes of the scene.
"""
tic = datetime.now()
"""Get the longitudes and latitudes of the scene."""
lons40km = self._data["pos"][:, :, 1] * 1e-4
lats40km = self._data["pos"][:, :, 0] * 1e-4
......@@ -209,16 +203,12 @@ class AVHRRAAPPL1BFile(BaseFileHandler):
(lons40km, lats40km), (rows40km, cols40km), (rows1km, cols1km),
along_track_order, cross_track_order)
self.lons, self.lats = satint.interpolate()
logger.debug("Navigation time %s", str(datetime.now() - tic))
def calibrate(self,
dataset_id,
pre_launch_coeffs=False,
calib_coeffs=None):
"""Calibrate the data
"""
tic = datetime.now()
"""Calibrate the data."""
if calib_coeffs is None:
calib_coeffs = {}
......@@ -268,9 +258,6 @@ class AVHRRAAPPL1BFile(BaseFileHandler):
ds.attrs['units'] = units[dataset_id.calibration]
ds.attrs.update(dataset_id._asdict())
logger.debug("Calibration time %s", str(datetime.now() - tic))
return ds
......@@ -458,13 +445,13 @@ def _vis_calibrate(data,
pre_launch_coeffs=False,
calib_coeffs=None,
mask=False):
"""Visible channel calibration only.
"""Calibrate visible channel data.
``calib_type`` in count, reflectance, radiance.
*calib_type* in count, reflectance, radiance
"""
# Calibration count to albedo, the calibration is performed separately for
# two value ranges.
if calib_type not in ['counts', 'radiance', 'reflectance']:
raise ValueError('Calibration ' + calib_type + ' unknown!')
......@@ -525,10 +512,11 @@ def _vis_calibrate(data,
def _ir_calibrate(header, data, irchn, calib_type, mask=False):
"""IR calibration
*calib_type* in brightness_temperature, radiance, count
"""
"""Calibrate for IR bands.
``calib_type`` in brightness_temperature, radiance, count
"""
count = data["hrpt"][:, :, irchn + 2].astype(np.float)
if calib_type == 0:
......
......@@ -53,7 +53,7 @@ class NC_ABI_BASE(BaseFileHandler):
mask_and_scale=False,
chunks={'lon': CHUNK_SIZE, 'lat': CHUNK_SIZE}, )
if 't' in self.nc.dims:
if 't' in self.nc.dims or 't' in self.nc.coords:
self.nc = self.nc.rename({'t': 'time'})
platform_shortname = filename_info['platform_shortname']
self.platform_name = PLATFORM_NAMES.get(platform_shortname)
......
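For context, newer xarray may expose ``t`` as a scalar coordinate without a
matching dimension, which the original ``dims``-only check missed. A minimal
sketch of the distinction, assuming nothing beyond stock xarray:

    import numpy as np
    import xarray as xr

    ds = xr.Dataset({'Rad': (('y', 'x'), np.zeros((2, 2)))},
                    coords={'t': np.datetime64('2017-09-20T17:30:40')})
    # A scalar time is a coordinate but not a dimension:
    assert 't' in ds.coords and 't' not in ds.dims
    ds = ds.rename({'t': 'time'})  # rename works on coordinates too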
......@@ -25,7 +25,7 @@ import xarray as xr
import dask.array as da
import pyproj
from satpy.readers._geos_area import make_ext, get_area_definition
from satpy.readers._geos_area import get_area_definition, get_area_extent
from pyspectral.blackbody import blackbody_wn_rad2temp as rad2temp
from satpy.readers.file_handlers import BaseFileHandler
from satpy import CHUNK_SIZE
......@@ -85,26 +85,19 @@ class AMIL1bNetCDF(BaseFileHandler):
obs_mode = self.nc.attrs['observation_mode']
resolution = self.nc.attrs['channel_spatial_resolution']
# Example offset: 11000.5
# the 'get_area_extent' will handle this half pixel for us
pdict['cfac'] = self.nc.attrs['cfac']
pdict['coff'] = self.nc.attrs['coff']
pdict['lfac'] = self.nc.attrs['lfac']
pdict['lfac'] = -self.nc.attrs['lfac']
pdict['loff'] = self.nc.attrs['loff']
# AMI grid appears offset, we can not use the standard get_area_extent
bit_shift = 2**16
ll_x = (0 - pdict['coff'] - 0.5) * bit_shift / pdict['cfac']
ll_y = -(0 - pdict['loff'] - 0.5) * bit_shift / pdict['lfac']
ur_x = (pdict['ncols'] - pdict['coff'] + 0.5) * bit_shift / pdict['cfac']
ur_y = -(pdict['nlines'] - pdict['loff'] + 0.5) * bit_shift / pdict['lfac']
area_extent = make_ext(ll_x, ur_x, ll_y, ur_y, pdict['h'])
pdict['scandir'] = 'N2S'
pdict['a_name'] = 'ami_geos_{}'.format(obs_mode.lower())
pdict['a_desc'] = 'AMI {} Area at {} resolution'.format(obs_mode, resolution)
pdict['p_id'] = 'ami_fixed_grid'
area_extent = get_area_extent(pdict)
fg_area_def = get_area_definition(pdict, area_extent)
return fg_area_def
def get_orbital_parameters(self):
......
......@@ -145,6 +145,15 @@ class TROPOMIL2FileHandler(NetCDF4FileHandler):
return metadata
def _rename_dims(self, data_arr):
"""Normalize dimension names with the rest of Satpy."""
dims_dict = {}
if 'ground_pixel' in data_arr.dims:
dims_dict['ground_pixel'] = 'x'
if 'scanline' in data_arr.dims:
dims_dict['scanline'] = 'y'
return data_arr.rename(dims_dict)
def get_dataset(self, ds_id, ds_info):
"""Get dataset."""
logger.debug("Getting data for: %s", ds_id.name)
......@@ -154,4 +163,5 @@ class TROPOMIL2FileHandler(NetCDF4FileHandler):
fill = data.attrs.pop('_FillValue')
data = data.squeeze()
data = data.where(data != fill)
data = self._rename_dims(data)
return data
......@@ -144,8 +144,8 @@ def get_geostationary_bounding_box(geos_area, nb_points=50):
# generate points around the north hemisphere in satellite projection
# make it a bit smaller so that we stay inside the valid area
x = np.cos(np.linspace(-np.pi, 0, nb_points / 2)) * (xmax - 0.001)
y = -np.sin(np.linspace(-np.pi, 0, nb_points / 2)) * (ymax - 0.001)
x = np.cos(np.linspace(-np.pi, 0, nb_points // 2)) * (xmax - 0.001)
y = -np.sin(np.linspace(-np.pi, 0, nb_points // 2)) * (ymax - 0.001)
# clip the projection coordinates to fit the area extent of geos_area
ll_x, ll_y, ur_x, ur_y = (np.array(geos_area.area_extent) /
......
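For context, under Python 3 ``nb_points / 2`` yields a float, and newer NumPy
rejects a non-integer ``num`` argument to ``np.linspace``, so floor division
is required. A minimal sketch:

    import numpy as np

    nb_points = 50
    # nb_points / 2 == 25.0 (a float) raises TypeError on newer NumPy;
    # nb_points // 2 == 25 (an int) is accepted.
    x = np.cos(np.linspace(-np.pi, 0, nb_points // 2))
    assert x.shape == (25,)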
......@@ -860,9 +860,36 @@ def _load_area_def(dsid, file_handlers):
class GEOSegmentYAMLReader(FileYAMLReader):
"""Reader for segmented geostationary data.
This reader pads the data to full geostationary disk.
This reader pads the data to full geostationary disk if necessary.
This reader uses an optional ``pad_data`` keyword argument that can be
passed to :meth:`Scene.load` to control if padding is done (True by
default). Passing `pad_data=False` will return data unpadded.
When using this class in a reader's YAML configuration, segmented file
types (files that may have multiple segments) should specify an extra
``expected_segments`` entry in the file_type metadata. This tells the
reader how many total segments to expect when padding data. Alternatively,
the file patterns for a file type can include a ``total_segments``
field, which is used if ``expected_segments`` is not defined. The number
of expected segments defaults to 1.
"""
def create_filehandlers(self, filenames, fh_kwargs=None):
"""Create file handler objects and determine expected segments for each."""
created_fhs = super(GEOSegmentYAMLReader, self).create_filehandlers(
filenames, fh_kwargs=fh_kwargs)
# add "expected_segments" information
for fhs in created_fhs.values():
for fh in fhs:
# check the filename for total_segments parameter as a fallback
ts = fh.filename_info.get('total_segments', 1)
# if the YAML has segments explicitly specified then use that
fh.filetype_info.setdefault('expected_segments', ts)
return created_fhs
@staticmethod
def _load_dataset(dsid, ds_info, file_handlers, dim='y', pad_data=True):
"""Load only a piece of the dataset."""
......@@ -932,8 +959,7 @@ def _stack_area_defs(area_def_dict):
def _pad_later_segments_area(file_handlers, dsid):
"""Pad area definitions for missing segments that are later in sequence than the first available."""
seg_size = None
expected_segments = file_handlers[0].filetype_info.get(
'expected_segments', 1)
expected_segments = file_handlers[0].filetype_info['expected_segments']
available_segments = [int(fh.filename_info.get('segment', 1)) for
fh in file_handlers]
area_defs = {}
......@@ -990,11 +1016,13 @@ def _find_missing_segments(file_handlers, ds_info, dsid):
failure = True
counter = 1
expected_segments = 1
# get list of file handlers in segment order
# (ex. first segment, second segment, etc)
handlers = sorted(file_handlers, key=lambda x: x.filename_info.get('segment', 1))
projectable = None
for fh in handlers:
if fh.filetype_info['file_type'] in ds_info['file_type']:
expected_segments = fh.filetype_info.get('expected_segments', 1)
expected_segments = fh.filetype_info['expected_segments']
while int(fh.filename_info.get('segment', 1)) > counter:
slice_list.append(None)
......
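A minimal usage sketch of the ``pad_data`` keyword documented on
``GEOSegmentYAMLReader`` above, assuming a hypothetical set of AHI HSD
segment files on disk:

    from glob import glob
    from satpy import Scene

    # Hypothetical path; any reader built on GEOSegmentYAMLReader accepts
    # pad_data the same way.
    filenames = glob('/data/HS_H08_20200110_0300_B03_FLDK_*.DAT')
    scn = Scene(filenames=filenames, reader='ahi_hsd')
    scn.load(['B03'], pad_data=False)  # load without padding missing segments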
......@@ -27,35 +27,6 @@ except ImportError:
import mock
class FakeDataset(object):
"""Act like an xarray Dataset object for testing."""
def __init__(self, info, attrs, dims=None):
"""Set properties to mimic a Dataset object."""
for var_name, var_data in list(info.items()):
if isinstance(var_data, np.ndarray):
info[var_name] = xr.DataArray(var_data)
self.info = info
self.attrs = attrs
self.dims = dims or tuple()
def __getitem__(self, key):
"""Get the info for the fake data."""
return self.info[key]
def __contains__(self, key):
"""Check if key is in the fake data."""
return key in self.info
def rename(self, *args, **kwargs):
"""Allow for dimension renaming."""
return self
def close(self):
"""Pretend to close."""
return
class Test_NC_ABI_L1B_Base(unittest.TestCase):
"""Common setup for NC_ABI_L1B tests."""
......@@ -81,16 +52,18 @@ class Test_NC_ABI_L1B_Base(unittest.TestCase):
'units': 'W m-2 um-1 sr-1'
}
)
rad['time'] = time
rad['x_image'] = x_image
rad['y_image'] = y_image
rad.coords['t'] = time
rad.coords['x_image'] = x_image
rad.coords['y_image'] = y_image
x__ = xr.DataArray(
range(5),
attrs={'scale_factor': 2., 'add_offset': -1.},
dims=('x',)
)
y__ = xr.DataArray(
range(2),
attrs={'scale_factor': -2., 'add_offset': 1.},
dims=('y',)
)
proj = xr.DataArray(
[],
......@@ -103,30 +76,38 @@ class Test_NC_ABI_L1B_Base(unittest.TestCase):
'sweep_angle_axis': u'x'
}
)
yaw_flip = xr.DataArray([1])
xr_.open_dataset.return_value = FakeDataset({
'Rad': rad,
'band_id': np.array(8),
'x': x__,
'y': y__,
'x_image': x_image,
'y_image': y_image,
'goes_imager_projection': proj,
'yaw_flip_flag': yaw_flip,
"planck_fk1": np.array(13432.1),
"planck_fk2": np.array(1497.61),
"planck_bc1": np.array(0.09102),
"planck_bc2": np.array(0.99971),
"esun": np.array(2017),
"nominal_satellite_subpoint_lat": np.array(0.0),
"nominal_satellite_subpoint_lon": np.array(-89.5),
"nominal_satellite_height": np.array(35786.02),
"earth_sun_distance_anomaly_in_AU": np.array(0.99)},
{
fake_dataset = xr.Dataset(
data_vars={
'Rad': rad,
'band_id': np.array(8),
# 'x': x__,
# 'y': y__,
'x_image': x_image,
'y_image': y_image,
'goes_imager_projection': proj,
'yaw_flip_flag': np.array([1]),
"planck_fk1": np.array(13432.1),
"planck_fk2": np.array(1497.61),
"planck_bc1": np.array(0.09102),
"planck_bc2": np.array(0.99971),
"esun": np.array(2017),
"nominal_satellite_subpoint_lat": np.array(0.0),
"nominal_satellite_subpoint_lon": np.array(-89.5),
"nominal_satellite_height": np.array(35786.02),
"earth_sun_distance_anomaly_in_AU": np.array(0.99)
},
coords={
't': rad.coords['t'],
'x': x__,
'y': y__,
},
attrs={
"time_coverage_start": "2017-09-20T17:30:40.8Z",
"time_coverage_end": "2017-09-20T17:41:17.5Z",
}, dims=('y', 'x'))
},
)
xr_.open_dataset.return_value = fake_dataset
self.reader = NC_ABI_L1B('filename',
{'platform_shortname': 'G16', 'observation_type': 'Rad',
'scene_abbr': 'C', 'scan_mode': 'M3'},
......@@ -173,6 +154,11 @@ class Test_NC_ABI_L1B(Test_NC_ABI_L1B_Base):
'units': 'W m-2 um-1 sr-1'}
self.assertDictEqual(res.attrs, exp)
# we remove any time dimension information
self.assertNotIn('t', res.coords)
self.assertNotIn('t', res.dims)
self.assertNotIn('time', res.coords)
self.assertNotIn('time', res.dims)
def test_bad_calibration(self):
"""Test that asking for a bad calibration fails."""
......
......@@ -19,7 +19,6 @@
import sys
import numpy as np
import xarray as xr
from .test_abi_l1b import FakeDataset
if sys.version_info < (2, 7):
import unittest2 as unittest
......@@ -53,10 +52,12 @@ class Test_NC_ABI_L2_base(unittest.TestCase):
x__ = xr.DataArray(
[0, 1],
attrs={'scale_factor': 2., 'add_offset': -1.},
dims=('x',),
)
y__ = xr.DataArray(
[0, 1],
attrs={'scale_factor': -2., 'add_offset': 1.},
dims=('y',),
)
ht_da = xr.DataArray(np.array([2, -1, -32768, 32767]).astype(np.int16).reshape((2, 2)),
......@@ -67,23 +68,24 @@ class Test_NC_ABI_L2_base(unittest.TestCase):
'_Unsigned': 'True',
'units': 'm'},)
xr_.open_dataset.return_value = FakeDataset({
'goes_imager_projection': proj,
'x': x__,
'y': y__,
'HT': ht_da,
"nominal_satellite_subpoint_lat": np.array(0.0),
"nominal_satellite_subpoint_lon": np.array(-89.5),
"nominal_satellite_height": np.array(35786020.),
"spatial_resolution": "10km at nadir",
fake_dataset = xr.Dataset(
data_vars={
'goes_imager_projection': proj,
'x': x__,
'y': y__,
'HT': ht_da,
"nominal_satellite_subpoint_lat": np.array(0.0),
"nominal_satellite_subpoint_lon": np.array(-89.5),
"nominal_satellite_height": np.array(35786020.),
"spatial_resolution": "10km at nadir",
},
{
attrs={
"time_coverage_start": "2017-09-20T17:30:40.8Z",
"time_coverage_end": "2017-09-20T17:41:17.5Z",
},
dims=('y', 'x'),
}
)
xr_.open_dataset.return_value = fake_dataset
self.reader = NC_ABI_L2('filename',
{'platform_shortname': 'G16', 'observation_type': 'HT',
'scan_mode': 'M3'},
......@@ -168,17 +170,23 @@ class Test_NC_ABI_L2_area_latlon(unittest.TestCase):
x__ = xr.DataArray(
[0, 1],
attrs={'scale_factor': 2., 'add_offset': -1.},
dims=('lon',),
)
y__ = xr.DataArray(
[0, 1],
attrs={'scale_factor': -2., 'add_offset': 1.},
dims=('lat',),
)
fake_dataset = xr.Dataset(
data_vars={
'goes_lat_lon_projection': proj,
'geospatial_lat_lon_extent': proj_ext,
'lon': x__,
'lat': y__,
'RSR': xr.DataArray(np.ones((2, 2)), dims=('lat', 'lon')),
},
)
xr_.open_dataset.return_value = FakeDataset({
'goes_lat_lon_projection': proj,
'geospatial_lat_lon_extent': proj_ext,
'lon': x__,
'lat': y__,
'RSR': np.ones((2, 2))}, {}, dims=('lon', 'lat'))
xr_.open_dataset.return_value = fake_dataset
self.reader = NC_ABI_L2('filename',
{'platform_shortname': 'G16', 'observation_type': 'RSR',
......