Commits on Source (5)
......@@ -5,7 +5,7 @@ env:
- PYTHON_VERSION=$TRAVIS_PYTHON_VERSION
- NUMPY_VERSION=stable
- MAIN_CMD='python setup.py'
- CONDA_DEPENDENCIES='xarray dask toolz Cython sphinx cartopy pillow matplotlib scipy pyyaml pyproj pyresample coveralls coverage codecov behave netcdf4 h5py h5netcdf gdal rasterio<1.0.14 imageio pyhdf mock libtiff pycoast pydecorate'
- CONDA_DEPENDENCIES='xarray dask distributed toolz Cython sphinx cartopy pillow matplotlib scipy pyyaml pyproj pyresample coveralls coverage codecov behave netcdf4 h5py h5netcdf gdal rasterio<1.0.14 imageio pyhdf mock libtiff pycoast pydecorate geoviews'
- PIP_DEPENDENCIES='trollsift trollimage pyspectral pyorbital libtiff'
- SETUP_XVFB=False
- EVENT_TYPE='push pull_request'
......
## Version 0.12.0 (2019/02/15)
### Issues Closed
* [Issue 601](https://github.com/pytroll/satpy/issues/601) - MultiScene 'save_animation' fails if "datasets=" isn't provided ([PR 602](https://github.com/pytroll/satpy/pull/602))
* [Issue 310](https://github.com/pytroll/satpy/issues/310) - Create MultiScene from list of files ([PR 576](https://github.com/pytroll/satpy/pull/576))
In this release 2 issues were closed.
### Pull Requests Merged
#### Bugs fixed
* [PR 616](https://github.com/pytroll/satpy/pull/616) - Fix geotiff writer being unimportable if gdal isn't installed
* [PR 615](https://github.com/pytroll/satpy/pull/615) - Fix confusing error in abi_l1b reader when file fails to open
* [PR 607](https://github.com/pytroll/satpy/pull/607) - Fix VIIRS 'histogram_dnb' compositor not returning new data
* [PR 605](https://github.com/pytroll/satpy/pull/605) - Fix enhancements using dask delayed on internal functions
* [PR 602](https://github.com/pytroll/satpy/pull/602) - Fix MultiScene save_animation not using dataset IDs correctly ([601](https://github.com/pytroll/satpy/issues/601))
* [PR 600](https://github.com/pytroll/satpy/pull/600) - Fix resample reduce_data bug introduced in #582
#### Features added
* [PR 614](https://github.com/pytroll/satpy/pull/614) - Support for reduced resolution OLCI data
* [PR 613](https://github.com/pytroll/satpy/pull/613) - Add 'crop' and 'save_datasets' to MultiScene
* [PR 609](https://github.com/pytroll/satpy/pull/609) - Add ability to use dask distributed when generating animation videos
* [PR 582](https://github.com/pytroll/satpy/pull/582) - Add 'reduce_data' keyword argument to disable cropping before resampling
* [PR 576](https://github.com/pytroll/satpy/pull/576) - Add group_files and from_files utility functions for creating Scenes from multiple files ([310](https://github.com/pytroll/satpy/issues/310))
* [PR 567](https://github.com/pytroll/satpy/pull/567) - Add utility functions for generating GeoViews plots ([541](https://github.com/pytroll/satpy/issues/541))
In this release 12 pull requests were closed.
## Version 0.11.2 (2019/01/28)
### Issues Closed
......
......@@ -3,7 +3,7 @@ environment:
PYTHON: "C:\\conda"
MINICONDA_VERSION: "latest"
CMD_IN_ENV: "cmd /E:ON /V:ON /C .\\ci-helpers\\appveyor\\windows_sdk.cmd"
CONDA_DEPENDENCIES: "xarray dask toolz Cython sphinx cartopy pillow matplotlib scipy pyyaml pyproj pyresample coverage netcdf4 h5py h5netcdf gdal 'rasterio<1.0.14' imageio pyhdf mock libtiff pycoast pydecorate"
CONDA_DEPENDENCIES: "xarray dask distributed toolz Cython sphinx cartopy pillow matplotlib scipy pyyaml pyproj pyresample coverage netcdf4 h5py h5netcdf gdal 'rasterio<1.0.14' imageio pyhdf mock libtiff pycoast pydecorate"
PIP_DEPENDENCIES: "trollsift trollimage pyspectral pyorbital libtiff"
CONDA_CHANNELS: "conda-forge"
......
satpy (0.11.2-1) UNRELEASED; urgency=medium
satpy (0.12.0-1) UNRELEASED; urgency=medium
* New upstream release (Closes: #919566).
* debian/patches:
......@@ -6,6 +6,9 @@ satpy (0.11.2-1) UNRELEASED; urgency=medium
applied upstream
- drop 0003-Skip-tests-that-require-pydecorate.patch: no longer necessary
(pydecorate is now available)
- new 0002-Disable-extre-dependency-form-geoviews.patch: geoviews is not
  currently available in Debian
- refresh remaining patches
-- Antonio Valentino <antonio.valentino@tiscali.it> Fri, 18 Jan 2019 07:45:38 +0000
......
......@@ -7,7 +7,7 @@ Subject: Fix pyhdf
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/setup.py b/setup.py
index 257f955..7f7308f 100644
index 0b2aa9c..bf8d4e8 100644
--- a/setup.py
+++ b/setup.py
@@ -36,7 +36,7 @@ requires = ['numpy >=1.13', 'pillow', 'pyresample >=1.10.3', 'trollsift',
......@@ -16,6 +16,6 @@ index 257f955..7f7308f 100644
# pyhdf (conda) == python-hdf4 (pip)
-test_requires = ['behave', 'h5py', 'netCDF4', 'pyhdf', 'imageio', 'libtiff',
+test_requires = ['behave', 'h5py', 'netCDF4', 'python-hdf4', 'imageio', 'libtiff',
'rasterio']
'rasterio', 'geoviews']
if sys.version < '3.0':
From: Antonio Valentino <antonio.valentino@tiscali.it>
Date: Sun, 17 Feb 2019 10:56:06 +0000
Subject: Disable extre dependency form geoviews
---
satpy/tests/test_scene.py | 7 +++++++
setup.py | 4 ++--
2 files changed, 9 insertions(+), 2 deletions(-)
diff --git a/satpy/tests/test_scene.py b/satpy/tests/test_scene.py
index 58ad1cf..672023f 100644
--- a/satpy/tests/test_scene.py
+++ b/satpy/tests/test_scene.py
@@ -1730,6 +1730,13 @@ class TestSceneSaving(unittest.TestCase):
os.path.join(self.base_dir, 'test_20180101_000000.tif')))
+try:
+ import geoviews
+except ImportError:
+ geoviews = None
+
+
+@unittest.skipIf(not geoviews, 'geoviews not available')
class TestSceneConversions(unittest.TestCase):
"""Test Scene conversion to geoviews, xarray, etc."""
diff --git a/setup.py b/setup.py
index bf8d4e8..82aa382 100644
--- a/setup.py
+++ b/setup.py
@@ -37,7 +37,7 @@ requires = ['numpy >=1.13', 'pillow', 'pyresample >=1.10.3', 'trollsift',
# pyhdf (conda) == python-hdf4 (pip)
test_requires = ['behave', 'h5py', 'netCDF4', 'python-hdf4', 'imageio', 'libtiff',
- 'rasterio', 'geoviews']
+ 'rasterio'] # , 'geoviews']
if sys.version < '3.0':
test_requires.append('mock')
@@ -71,7 +71,7 @@ extras_require = {
# Documentation:
'doc': ['sphinx'],
# Other
- 'geoviews': ['geoviews'],
+ #'geoviews': ['geoviews'],
}
all_extras = []
for extra_deps in extras_require.values():
0001-Fix-pyhdf.patch
0002-Disable-extre-dependency-form-geoviews.patch
......@@ -16,6 +16,7 @@
import os
import sys
from datetime import datetime
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
......@@ -75,7 +76,7 @@ master_doc = 'index'
# General information about the project.
project = u'SatPy'
copyright = u'2009-2016, The PyTroll Team'
copyright = u'2009-{}, The PyTroll Team'.format(datetime.utcnow().strftime("%Y"))
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
......@@ -244,8 +245,10 @@ intersphinx_mapping = {
'scipy': ('https://docs.scipy.org/doc/scipy/reference', None),
'xarray': ('https://xarray.pydata.org/en/stable', None),
'dask': ('https://docs.dask.org/en/latest', None),
'jobqueue': ('https://jobqueue.dask.org/en/latest', None),
'pyresample': ('https://pyresample.readthedocs.io/en/stable', None),
'trollsift': ('https://trollsift.readthedocs.io/en/stable', None),
'trollimage': ('https://trollimage.readthedocs.io/en/stable', None),
'pydecorate': ('https://pydecorate.readthedocs.io/en/stable', None),
'geoviews': ('http://geo.holoviews.org', None),
}
......@@ -297,13 +297,14 @@ than creating a delayed function. Similar to delayed functions the inputs to
the function are fully computed DataArrays or numpy arrays, but only the
individual chunks of the dask array at a time. Note that ``map_blocks`` must
be provided dask arrays and won't function properly on XArray DataArrays.
It is recommended that the function object passed to ``map_blocks`` **not**
be an internal function (a function defined inside another function), or it
may be unserializable and cause issues in some environments.
.. code-block:: python
my_new_arr = da.map_blocks(_complex_operation, my_dask_arr1, my_dask_arr2, dtype=my_dask_arr1.dtype)
http://dask.pydata.org/en/latest/array-api.html#dask.array.core.map_blocks
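A fuller sketch of the same idea, using the hypothetical ``_complex_operation``
name from the one-liner above and random inputs purely for illustration,
defines the function at module level (so it stays serializable) and maps it
over the chunks:

.. code-block:: python

    import dask.array as da
    import numpy as np

    def _complex_operation(chunk1, chunk2):
        # receives fully computed numpy chunks, one pair at a time
        return np.divide(chunk1, chunk2,
                         out=np.zeros_like(chunk1), where=chunk2 != 0)

    my_dask_arr1 = da.random.random((1000, 1000), chunks=500)
    my_dask_arr2 = da.random.random((1000, 1000), chunks=500)
    my_new_arr = da.map_blocks(_complex_operation, my_dask_arr1, my_dask_arr2,
                               dtype=my_dask_arr1.dtype)
    result = my_new_arr.compute()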
Helpful functions
*****************
......
......@@ -13,12 +13,18 @@ examples will walk through some basic use cases of the MultiScene.
These features are still early in development and may change over time as
more user feedback is received and more features are added.
Blending Scenes in MultiScene
-----------------------------
Scenes contained in a MultiScene can be combined in different ways.
Stacking scenes
---------------
***************
The MultiScene makes it easy to take multiple Scenes and stack them on top of
each other. The code below takes two separate orbits from a VIIRS sensor and
stacks them on top of each other.
The code below uses the :meth:`~satpy.multiscene.MultiScene.blend` method of
the ``MultiScene`` object to stack two separate orbits from a VIIRS sensor. By
default the ``blend`` method will use the :func:`~satpy.multiscene.stack`
function which uses the first dataset as the base of the image and then
iteratively overlays the remaining datasets on top.
>>> from satpy import Scene, MultiScene
>>> from glob import glob
......@@ -34,17 +40,73 @@ stacks them on top of each other.
>>> blended_scene = new_mscn.blend()
>>> blended_scene.save_datasets()
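The same behaviour can also be requested explicitly. As a small sketch
continuing the example above, the :func:`~satpy.multiscene.stack` function can
be passed via the ``blend_function`` keyword argument:

>>> from satpy.multiscene import stack
>>> blended_scene = new_mscn.blend(blend_function=stack)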
Timeseries
**********
Using the :meth:`~satpy.multiscene.MultiScene.blend` method with the
:func:`~satpy.multiscene.timeseries` function will combine
multiple scenes from different time slots by time. A single `Scene` with each
dataset/channel extended by the time dimension will be returned. If used
together with the :meth:`~satpy.scene.Scene.to_geoviews` method, creation of
interactive timeseries Bokeh plots is possible.
>>> from satpy import Scene, MultiScene
>>> from satpy.multiscene import timeseries
>>> from glob import glob
>>> from pyresample.geometry import AreaDefinition
>>> my_area = AreaDefinition(...)
>>> scenes = [
... Scene(reader='viirs_sdr', filenames=glob('/data/viirs/day_1/*t180*.h5')),
... Scene(reader='viirs_sdr', filenames=glob('/data/viirs/day_2/*t180*.h5'))
... ]
>>> mscn = MultiScene(scenes)
>>> mscn.load(['I04'])
>>> new_mscn = mscn.resample(my_area)
>>> blended_scene = new_mscn.blend(blend_function=timeseries)
>>> blended_scene['I04']
<xarray.DataArray (time: 2, y: 1536, x: 6400)>
dask.array<shape=(2, 1536, 6400), dtype=float64, chunksize=(1, 1536, 4096)>
Coordinates:
* time (time) datetime64[ns] 2012-02-25T18:01:24.570942 2012-02-25T18:02:49.975797
Dimensions without coordinates: y, x
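As a brief sketch of the interactive plotting mentioned above (assuming
geoviews and a Bokeh-capable notebook environment are available), the
time-extended dataset can then be converted with
:meth:`~satpy.scene.Scene.to_geoviews`:

>>> import geoviews as gv
>>> gv.extension('bokeh')
>>> gview = blended_scene.to_geoviews(vdims=['I04'])
>>> gview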
Saving frames of an animation
-----------------------------
The MultiScene can take "frames" of data and join them together in a single
animation movie file. Saving animations required the `imageio` python library
animation movie file. Saving animations requires the `imageio` python library
and for most available formats the ``ffmpeg`` command line tool suite should
also be installed. The below example saves a series of GOES-EAST ABI channel
1 and channel 2 frames to MP4 movie files. Note that currently there is no
easy way to map files from multiple time steps/orbits in to individual Scene
objects. The `glob` function and for loops are used to group files into Scene
objects that, if used individually, could load the data we want.
1 and channel 2 frames to MP4 movie files. We can use the
:meth:`MultiScene.from_files <satpy.multiscene.MultiScene.from_files>` class
method to create a `MultiScene` from a series of files. This uses the
:func:`~satpy.readers.group_files` utility function to group files by start
time.
>>> from satpy import Scene, MultiScene
>>> from glob import glob
>>> mscn = MultiScene.from_files(glob('/data/abi/day_1/*C0[12]*.nc'), reader='abi_l1b')
>>> mscn.load(['C01', 'C02'])
>>> mscn.save_animation('{name}_{start_time:%Y%m%d_%H%M%S}.mp4', fps=2)
.. versionadded:: 0.12
The ``from_files`` and ``group_files`` functions were added in SatPy 0.12.
See below for an alternative solution.
This will compute one video frame (image) at a time and write it to the MPEG-4
video file. For users with more powerful systems it is possible to use
the ``client`` and ``batch_size`` keyword arguments to compute multiple frames
in parallel using the dask ``distributed`` library (if installed).
See the :doc:`dask distributed <dask:setup/single-distributed>` documentation
for information on creating a ``Client`` object. If working on a cluster
you may want to use :doc:`dask jobqueue <jobqueue:index>` to take advantage
of multiple nodes at a time.
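A minimal sketch of the parallel case, assuming a local ``distributed``
cluster and an illustrative batch size:

>>> from dask.distributed import Client
>>> client = Client()  # local cluster; see the distributed docs for other setups
>>> mscn.save_animation('{name}_{start_time:%Y%m%d_%H%M%S}.mp4', fps=2,
...                     client=client, batch_size=4)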
For older versions of SatPy we can manually create the `Scene` objects used.
The :func:`~glob.glob` function and for loops are used to group files into
Scene objects that, if used individually, could load the data we want. The
code below is equivalent to the ``from_files`` code above:
>>> from satpy import Scene, MultiScene
>>> from glob import glob
......@@ -62,3 +124,22 @@ objects that, if used individually, could load the data we want.
GIF images, although supported, are not recommended due to the large file
sizes that can be produced from only a few frames.
Saving multiple scenes
----------------------
The ``MultiScene`` object includes a
:meth:`~satpy.multiscene.MultiScene.save_datasets` method for saving the
data from multiple Scenes to disk. By default this will operate on one Scene
at a time, but similar to the ``save_animation`` method above this method can
accept a dask distributed ``Client`` object via the ``client`` keyword
argument to compute scenes in parallel (see documentation above). Note however
that some writers, like the ``geotiff`` writer, do not support multi-process
operations at this time and will fail when used with dask distributed. To save
multiple Scenes use:
>>> from satpy import Scene, MultiScene
>>> from glob import glob
>>> mscn = MultiScene.from_files(glob('/data/abi/day_1/*C0[12]*.nc'), reader='abi_l1b')
>>> mscn.load(['C01', 'C02'])
>>> mscn.save_datasets(base_dir='/path/for/output')
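To compute several scenes in parallel instead, the same ``client`` keyword
argument can be passed here as well (a sketch, keeping in mind the ``geotiff``
writer caveat above):

>>> from dask.distributed import Client
>>> client = Client()
>>> mscn.save_datasets(base_dir='/path/for/output', client=client)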
......@@ -2,8 +2,8 @@
Quickstart
==========
Loading data
============
Loading and accessing data
==========================
.. testsetup:: *
>>> import sys
......@@ -106,7 +106,7 @@ SatPy allows loading file data by wavelengths in micrometers (shown above) or by
>>> global_scene.load(["VIS006", "VIS008", "IR_108"])
To have a look at the available channels for loading from your :class:`~satpy.scene.Scene` object use the
:meth:`available_datasets <satpy.scene.Scene.available_dataset_names>` method:
:meth:`~satpy.scene.Scene.available_dataset_names` method:
>>> global_scene.available_dataset_names()
['HRV',
......@@ -127,11 +127,32 @@ To access the loaded data use the wavelength or name:
>>> print(global_scene[0.6])
Visualizing data
================
To visualize loaded data in a pop-up window:
>>> global_scene.show(0.6)
To combine datasets and make a new dataset:
Alternatively, if working in a Jupyter notebook, the scene can be converted to
a `geoviews <http://geo.holoviews.org/index.html>`_ object using the
:meth:`~satpy.scene.Scene.to_geoviews` method. The geoviews package is not a
requirement of the base satpy install, so to use this feature the user needs
to install the geoviews package separately.
>>> import holoviews as hv
>>> import geoviews as gv
>>> import geoviews.feature as gf
>>> gv.extension("bokeh", "matplotlib")
>>> %opts QuadMesh Image [width=600 height=400 colorbar=True] Feature [apply_ranges=False]
>>> %opts Image QuadMesh (cmap='RdBu_r')
>>> gview = global_scene.to_geoviews(vdims=[0.6])
>>> gview[::5,::5] * gf.coastline * gf.borders
Creating new datasets
=====================
Calculations based on loaded datasets/channels can easily be assigned to a new dataset:
>>> global_scene["ndvi"] = (global_scene[0.8] - global_scene[0.6]) / (global_scene[0.8] + global_scene[0.6])
>>> global_scene.show("ndvi")
......
......@@ -1269,26 +1269,6 @@ class RatioSharpenedRGB(GenericCompositor):
return super(RatioSharpenedRGB, self).__call__((r, g, b), **info)
class SelfSharpenedRGB(RatioSharpenedRGB):
"""Sharpen RGB with ratio of a band with a strided-version of itself.
Example:
R - 500m resolution - shape=(4000, 4000)
G - 1000m resolution - shape=(2000, 2000)
B - 1000m resolution - shape=(2000, 2000)
ratio = R / four_element_average(R)
new_R = R
new_G = G * ratio
new_B = B * ratio
"""
@staticmethod
def four_element_average_dask(d):
"""Average every 4 elements (2x2) in a 2D array"""
def _mean4(data, offset=(0, 0), block_id=None):
rows, cols = data.shape
# we assume that the chunks except the first ones are aligned
......@@ -1311,10 +1291,30 @@ class SelfSharpenedRGB(RatioSharpenedRGB):
new_shape = (int(rows2 / 2.), 2, int(cols2 / 2.), 2)
data_mean = np.nanmean(av_data.reshape(new_shape), axis=(1, 3))
data_mean = np.repeat(np.repeat(data_mean, 2, axis=0), 2, axis=1)
data_mean = data_mean[row_offset:row_offset + rows,
col_offset:col_offset + cols]
data_mean = data_mean[row_offset:row_offset + rows, col_offset:col_offset + cols]
return data_mean
class SelfSharpenedRGB(RatioSharpenedRGB):
"""Sharpen RGB with ratio of a band with a strided-version of itself.
Example:
R - 500m resolution - shape=(4000, 4000)
G - 1000m resolution - shape=(2000, 2000)
B - 1000m resolution - shape=(2000, 2000)
ratio = R / four_element_average(R)
new_R = R
new_G = G * ratio
new_B = B * ratio
"""
@staticmethod
def four_element_average_dask(d):
"""Average every 4 elements (2x2) in a 2D array"""
try:
offset = d.attrs['area'].crop_offset
except (KeyError, AttributeError):
......
......@@ -327,6 +327,11 @@ def chand(phi, muv, mus, taur):
return rhoray, trdown, trup
def _sphalb_index(index_arr, sphalb0):
# FIXME: if/when dask can support lazy index arrays then remove this
return sphalb0[index_arr]
def atm_variables_finder(mus, muv, phi, height, tau, tO3, tH2O, taustep4sphalb, tO2=1.0):
tau_step = da.linspace(taustep4sphalb, MAXNUMSPHALBVALUES * taustep4sphalb, MAXNUMSPHALBVALUES,
chunks=int(MAXNUMSPHALBVALUES / 2))
......@@ -334,9 +339,6 @@ def atm_variables_finder(mus, muv, phi, height, tau, tO3, tH2O, taustep4sphalb,
taur = tau * da.exp(-height / SCALEHEIGHT)
rhoray, trdown, trup = chand(phi, muv, mus, taur)
if isinstance(height, xr.DataArray):
def _sphalb_index(index_arr, sphalb0):
# FIXME: if/when dask can support lazy index arrays then remove this
return sphalb0[index_arr]
sphalb = da.map_blocks(_sphalb_index, (taur / taustep4sphalb + 0.5).astype(np.int32).data, sphalb0.compute(),
dtype=sphalb0.dtype)
else:
......@@ -380,6 +382,10 @@ def G_calc(zenith, a_coeff):
return (da.cos(da.deg2rad(zenith))+(a_coeff[0]*(zenith**a_coeff[1])*(a_coeff[2]-zenith)**a_coeff[3]))**-1
def _avg_elevation_index(avg_elevation, row, col):
return avg_elevation[row, col]
def run_crefl(refl, coeffs,
lon,
lat,
......@@ -423,8 +429,6 @@ def run_crefl(refl, coeffs,
row[space_mask] = 0
col[space_mask] = 0
def _avg_elevation_index(avg_elevation, row, col):
return avg_elevation[row, col]
height = da.map_blocks(_avg_elevation_index, avg_elevation, row, col, dtype=avg_elevation.dtype)
height = xr.DataArray(height, dims=['y', 'x'])
# negative heights aren't allowed, clip to 0
......
......@@ -205,6 +205,10 @@ class HistogramDNB(CompositeBase):
sza_data (ndarray): Solar Zenith Angle data array
"""
# convert dask arrays to DataArray objects
dnb_data = xr.DataArray(dnb_data, dims=('y', 'x'))
sza_data = xr.DataArray(sza_data, dims=('y', 'x'))
good_mask = ~(dnb_data.isnull() | sza_data.isnull())
output_dataset = dnb_data.where(good_mask)
# we only need the numpy array
......@@ -228,21 +232,17 @@ class HistogramDNB(CompositeBase):
for mask in mixed_mask:
if mask.any():
LOG.debug("Histogram equalizing DNB mixed data...")
histogram_equalization(dnb_data,
mask,
out=output_dataset)
histogram_equalization(dnb_data, mask, out=output_dataset)
did_equalize = True
if night_mask.any():
LOG.debug("Histogram equalizing DNB night data...")
histogram_equalization(dnb_data,
night_mask,
out=output_dataset)
histogram_equalization(dnb_data, night_mask, out=output_dataset)
did_equalize = True
if not did_equalize:
raise RuntimeError("No valid data found to histogram equalize")
return dnb_data
return output_dataset
def __call__(self, datasets, **info):
"""Create the composite by scaling the DNB data using a histogram equalization method.
......@@ -255,7 +255,7 @@ class HistogramDNB(CompositeBase):
dnb_data = datasets[0]
sza_data = datasets[1]
delayed = dask.delayed(self._run_dnb_normalization)(dnb_data, sza_data)
delayed = dask.delayed(self._run_dnb_normalization)(dnb_data.data, sza_data.data)
output_dataset = dnb_data.copy()
output_data = da.from_delayed(delayed, dnb_data.shape, dnb_data.dtype)
output_dataset.data = output_data.rechunk(dnb_data.data.chunks)
......@@ -309,6 +309,10 @@ class AdaptiveDNB(HistogramDNB):
sza_data (ndarray): Solar Zenith Angle data array
"""
# convert dask arrays to DataArray objects
dnb_data = xr.DataArray(dnb_data, dims=('y', 'x'))
sza_data = xr.DataArray(sza_data, dims=('y', 'x'))
good_mask = ~(dnb_data.isnull() | sza_data.isnull())
# good_mask = ~(dnb_data.mask | sza_data.mask)
output_dataset = dnb_data.where(good_mask)
......@@ -424,8 +428,7 @@ class ERFDNB(CompositeBase):
dnb_data = datasets[0]
sza_data = datasets[1]
lza_data = datasets[2]
output_dataset = dnb_data.where(
~(dnb_data.isnull() | sza_data.isnull()))
output_dataset = dnb_data.where(~(dnb_data.isnull() | sza_data.isnull()))
# this algorithm assumes units of "W cm-2 sr-1" so if there are other
# units we need to adjust for that
if dnb_data.attrs.get("units", "W m-2 sr-1") == "W m-2 sr-1":
......@@ -471,14 +474,9 @@ class ERFDNB(CompositeBase):
# Update from Curtis Seaman, increase max radiance curve until less
# than 0.5% is saturated
if self.saturation_correction:
delayed = dask.delayed(self._saturation_correction)(
output_dataset.data, unit_factor,
min_val, max_val)
output_dataset.data = da.from_delayed(delayed,
output_dataset.shape,
output_dataset.dtype)
output_dataset.data = output_dataset.data.rechunk(
dnb_data.data.chunks)
delayed = dask.delayed(self._saturation_correction)(output_dataset.data, unit_factor, min_val, max_val)
output_dataset.data = da.from_delayed(delayed, output_dataset.shape, output_dataset.dtype)
output_dataset.data = output_dataset.data.rechunk(dnb_data.data.chunks)
else:
inner_sqrt = (output_dataset - min_val) / (max_val - min_val)
# clip negative values to 0 before the sqrt
......@@ -511,8 +509,7 @@ def make_day_night_masks(solarZenithAngle,
given, the whole terminator region will be one mask)
"""
# if the caller passes None, we're only doing one step
stepsDegrees = highAngleCutoff - \
lowAngleCutoff if stepsDegrees is None else stepsDegrees
stepsDegrees = highAngleCutoff - lowAngleCutoff if stepsDegrees is None else stepsDegrees
night_mask = (solarZenithAngle > highAngleCutoff) & good_mask
day_mask = (solarZenithAngle <= lowAngleCutoff) & good_mask
......
......@@ -146,6 +146,11 @@ def cira_stretch(img, **kwargs):
return apply_enhancement(img.data, func)
def _lookup_delayed(luts, band_data):
# can't use luts.__getitem__ for some reason
return luts[band_data]
def lookup(img, **kwargs):
"""Assign values to channels based on a table."""
luts = np.array(kwargs['luts'], dtype=np.float32) / 255.0
......@@ -155,10 +160,7 @@ def lookup(img, **kwargs):
lut = luts[:, index] if len(luts.shape) == 2 else luts
band_data = band_data.clip(0, lut.size - 1).astype(np.uint8)
def _delayed(luts, band_data):
# can't use luts.__getitem__ for some reason
return luts[band_data]
new_delay = dask.delayed(_delayed)(lut, band_data)
new_delay = dask.delayed(_lookup_delayed)(lut, band_data)
new_data = da.from_delayed(new_delay, shape=band_data.shape,
dtype=luts.dtype)
return new_data
......@@ -231,9 +233,15 @@ def create_colormap(palette):
return None
def _three_d_effect_delayed(band_data, kernel, mode):
from scipy.signal import convolve2d
band_data = band_data.reshape(band_data.shape[1:])
new_data = convolve2d(band_data, kernel, mode=mode)
return new_data.reshape((1, band_data.shape[0], band_data.shape[1]))
def three_d_effect(img, **kwargs):
"""Create 3D effect using convolution"""
from scipy.signal import convolve2d
w = kwargs.get('weight', 1)
LOG.debug("Applying 3D effect with weight %.2f", w)
kernel = np.array([[-w, 0, w],
......@@ -244,22 +252,14 @@ def three_d_effect(img, **kwargs):
def func(band_data, kernel=kernel, mode=mode, index=None):
del index
def _delayed(band_data, kernel, mode):
band_data = band_data.reshape(band_data.shape[1:])
new_data = convolve2d(band_data, kernel, mode=mode)
return new_data.reshape((1, band_data.shape[0],
band_data.shape[1]))
delay = dask.delayed(_delayed)(band_data, kernel, mode)
new_data = da.from_delayed(delay, shape=band_data.shape,
dtype=band_data.dtype)
delay = dask.delayed(_three_d_effect_delayed)(band_data, kernel, mode)
new_data = da.from_delayed(delay, shape=band_data.shape, dtype=band_data.dtype)
return new_data
return apply_enhancement(img.data, func, separate=True, pass_dask=True)
def btemp_threshold(img, min_in, max_in, threshold, threshold_out=None,
**kwargs):
def btemp_threshold(img, min_in, max_in, threshold, threshold_out=None, **kwargs):
"""Scale data linearly in two separate regions.
This enhancement scales the input data linearly by splitting the data
......
......@@ -10,6 +10,8 @@ reader:
sensors: [abi]
default_channels:
reader: !!python/name:satpy.readers.yaml_reader.FileYAMLReader
# file pattern keys to sort files by with 'satpy.readers.group_files'
group_keys: ['start_time', 'platform_shortname', 'scene_abbr']
file_types:
# NOTE: observation_type == product acronym in PUG document
......
......@@ -6,8 +6,9 @@ reader:
description: JMA HRIT Reader
name: ahi_hrit
sensors: [ahi]
default_channels: []
reader: !!python/name:satpy.readers.yaml_reader.FileYAMLReader
# file pattern keys to sort files by with 'satpy.readers.group_files'
group_keys: ['start_time', 'area']
file_types:
hrit_b01:
......
......@@ -6,7 +6,8 @@ reader:
name: ahi_hsd
reader: !!python/name:satpy.readers.yaml_reader.FileYAMLReader ''
sensors: [ahi]
default_datasets:
# file pattern keys to sort files by with 'satpy.readers.group_files'
group_keys: ['start_time', 'platform_shortname', 'area']
datasets:
B01:
......
......@@ -8,17 +8,25 @@ reader:
file_types:
esa_l1b:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCI1B
file_patterns: ['{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/{dataset_name}_radiance.nc']
file_patterns:
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/{dataset_name}_radiance.nc'
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}______{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/{dataset_name}_radiance.nc'
requires: [esa_cal]
esa_angles:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCIAngles
file_patterns: ['{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/tie_geometries.nc']
file_patterns:
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/tie_geometries.nc'
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}______{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/tie_geometries.nc'
esa_geo:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCIGeo
file_patterns: ['{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/geo_coordinates.nc']
file_patterns:
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/geo_coordinates.nc'
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}______{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/geo_coordinates.nc'
esa_cal:
file_reader: !!python/name:satpy.readers.olci_nc.NCOLCICal
file_patterns: ['{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/instrument_data.nc']
file_patterns:
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}_{frame:4d}_{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/instrument_data.nc'
- '{mission_id:3s}_OL_1_{datatype_id:_<6s}_{start_time:%Y%m%dT%H%M%S}_{end_time:%Y%m%dT%H%M%S}_{creation_time:%Y%m%dT%H%M%S}_{duration:4d}_{cycle:3d}_{relative_orbit:3d}______{centre:3s}_{platform_mode:1s}_{timeliness:2s}_{collection:3s}.SEN3/instrument_data.nc'
datasets:
longitude:
name: longitude
......