Commits on Source (7)
[bumpversion]
current_version = 1.7.1
current_version = 1.8.0
commit = True
tag = True
......
......@@ -2,6 +2,86 @@ Changelog
=========
v1.8.0 (2018-02-02)
-------------------
- update changelog. [Martin Raspaud]
- Bump version: 1.7.1 → 1.8.0. [Martin Raspaud]
- Merge branch 'develop' into new_release. [Martin Raspaud]
- Merge pull request #95 from pytroll/bugfix-pyproj-version. [Martin
Raspaud]
Provide the minimum version of pyproj needed
- Provide the minimum version of pyproj needed. [Martin Raspaud]
- Merge pull request #94 from pytroll/optimize-xarray. [Martin Raspaud]
Optimize xarray
- Add test for new wrap_and_check function. [davidh-ssec]
- Rename chunk size environment variable to PYTROLL_CHUNK_SIZE. [davidh-
ssec]
- Fix circular import between geometry and init's CHUNK_SIZE. [davidh-
ssec]
- Revert import removal in init and add easy access imports. [davidh-
ssec]
Includes attempt to remove circular dependency between utils and
geometry module.
- Use central CHUNK_SIZE constant for dask based operations. [davidh-
ssec]
- Add `check_and_wrap` utility function and fix various docstring
issues. [davidh-ssec]
- Remove tests for removed features. [davidh-ssec]
- Remove longitude/latitude validity checks in BaseDefinition. [davidh-
ssec]
This was causing issues with dask based inputs and was a performance
penalty for all use cases even when the arrays were valid. Removing
this check should not affect 99% of users.
- Combine dask operations to improve resampling performance. [davidh-
ssec]
Still a lot that could be done probably.
- Fix dask minimum version number for meshgrid support. [davidh-ssec]
- Add dask extra to setup.py to specify minimum dask version. [davidh-
ssec]
pyresample uses dask meshgrid which came in version 1.9
- Merge pull request #86 from pytroll/feature-multiple-dims. [Martin
Raspaud]
[WIP] Feature multiple dims
- Remove explicit chunksize. [Martin Raspaud]
- Clean up with pep8. [Martin Raspaud]
- Take care of coordinates when resampling. [Martin Raspaud]
- Define default blocksizes for dask arrays. [Martin Raspaud]
- Merge branch 'feature-optimize-dask' into feature-multiple-dims.
[Martin Raspaud]
- Style cleanup. [Martin Raspaud]
- Fix get_hashable_array for variations of np arrays. [Martin Raspaud]
- Print warning when wrapping is needed independently of type. [Martin
Raspaud]
- Change default blocksize to 5000. [Martin Raspaud]
- Make use of dask's map_blocks. [Martin Raspaud]
Instead of writing our own array definitions
- Revert "Make resampling lazy" [Martin Raspaud]
This reverts commit 5a4f9c342f9c8262c06c28986163fc682242ce75.
- Make resampling lazy. [Martin Raspaud]
- Revert yapf change. [Martin Raspaud]
- Clean up code (pycodestyle, pydocstyle) [Martin Raspaud]
- Make XR resampling work with more dimensions. [Martin Raspaud]
- Merge pull request #91 from avalentino/issues/gh-090. [David Hoese]
Fix test_get_array_hashable on big-endian machines (closes #90)
- Fix test_get_array_hashable on big-endian machines. [Antonio
Valentino]
v1.7.1 (2017-12-21)
-------------------
- update changelog. [davidh-ssec]
......@@ -612,7 +692,6 @@ v1.2.2 (2016-06-21)
Without this, the compilation of the ewa extension crashes.
v1.2.1 (2016-06-21)
-------------------
- update changelog. [Martin Raspaud]
......@@ -768,11 +847,9 @@ v1.2.0 (2016-06-17)
- Make kd_tree test work on older numpy version. [Martin Raspaud]
VisibleDeprecationWarning is not available in numpy <1.9.
- Adapt to newest pykdtree version. [Martin Raspaud]
The kdtree object's attribute `data_pts` has been renamed to `data`.
- Run tests on python 3.5 in travis also. [Martin Raspaud]
......@@ -784,7 +861,6 @@ v1.1.6 (2016-02-25)
A previous commit was looking for a 'data_pts' attribute in the kdtree
object, which is available in pykdtree, but not scipy.
- Merge pull request #32 from mitkin/master. [Martin Raspaud]
[tests] Skip deprecation warnings in test_gauss_multi_uncert
......@@ -796,7 +872,6 @@ v1.1.6 (2016-02-25)
The latest matplotlib (1.5) doesn't support python 2.6 and 3.3. This patch
chooses the right matplotlib version to install depending on the python
version at hand.
- Skip deprecation warnings. [Mikhail Itkin]
Catch the rest of the warnings. Check if there is only one, and
......@@ -838,7 +913,6 @@ Other
- Bugfix to address a numpy DeprecationWarning. [Martin Raspaud]
Numpy won't take non-integer indices soon, so make index an int.
- Merge branch 'release-1.1.3' [Martin Raspaud]
- Merge branch 'licence-lgpl' into pre-master. [Martin Raspaud]
- Switch to lgplv3, and bump up version number. [Martin Raspaud]
......@@ -1060,7 +1134,7 @@ Other
- Set svn:mime-type. [StorPipfugl]
- Corrected doc errors. [StorPipfugl]
- Removed dist dir. [StorPipfugl]
- [StorPipfugl]
- No commit message. [StorPipfugl]
- Updated documentation. New release. [StorPipfugl]
- Started updating docstrings. [StorPipfugl]
- Restructured API. [StorPipfugl]
......@@ -1073,8 +1147,9 @@ Other
- Removed unneeded function. [StorPipfugl]
- Mime types set. [StorPipfugl]
- Mime types set. [StorPipfugl]
- [StorPipfugl]
- No commit message. [StorPipfugl]
- Moved to Google Code under GPLv3 license. [StorPipfugl]
- moved to Google Code. [StorPipfugl]
pyresample (1.7.1-4) UNRELEASED; urgency=medium
pyresample (1.8.0-1) unstable; urgency=medium
* New upstream release
* Standard version bumped to v4.1.3 (no change)
* Update compat version
* debian/control
- recommend dask
* Drop debian/source.lintian-overrides: no longer necessary
* New python3-pyresample.lintian-overrides for:
python-package-depends-on-package-from-other-python-variant
* debian/patches
- drop 0003-fix-tests-on-big-endian-architectures.patch:
applied upstream
- refresh all patches
-- Antonio Valentino <antonio.valentino@tiscali.it> Sun, 14 Jan 2018 17:53:48 +0000
-- Antonio Valentino <antonio.valentino@tiscali.it> Mon, 05 Feb 2018 07:18:35 +0000
pyresample (1.7.1-3) unstable; urgency=medium
......
......@@ -77,7 +77,8 @@ Depends: ${misc:Depends},
python-pyresample-test
Recommends: python3-numexpr,
python3-pil,
python3-mpltoolkits.basemap
python3-mpltoolkits.basemap,
python3-dask
Suggests: python-pyresample-doc,
python3-xarray
Description: Resampling of remote sensing data in Python 3
......
......@@ -21,10 +21,10 @@ index 3c6ef3c..5346cb4 100644
YSIZE: 480
AREA_EXTENT: (-20037508.342789244, -10018754.171394622, 20037508.342789244, 10018754.171394622)
diff --git a/pyresample/test/test_geometry.py b/pyresample/test/test_geometry.py
index 4aec478..e26a26e 100644
index ae8990b..3befc7d 100644
--- a/pyresample/test/test_geometry.py
+++ b/pyresample/test/test_geometry.py
@@ -437,7 +437,7 @@ class Test(unittest.TestCase):
@@ -406,7 +406,7 @@ class Test(unittest.TestCase):
swath_def = geometry.SwathDefinition(lons, lats)
filter_area = geometry.AreaDefinition('test', 'test', 'test',
{'proj': 'eqc', 'lon_0': 0.0,
......@@ -33,7 +33,7 @@ index 4aec478..e26a26e 100644
8, 8,
(-20037508.34, -10018754.17, 20037508.34, 10018754.17))
filter = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
@@ -462,7 +462,7 @@ class Test(unittest.TestCase):
@@ -431,7 +431,7 @@ class Test(unittest.TestCase):
data = np.array([1, 2, 3, 4])
filter_area = geometry.AreaDefinition('test', 'test', 'test',
{'proj': 'eqc', 'lon_0': 0.0,
......@@ -42,7 +42,7 @@ index 4aec478..e26a26e 100644
8, 8,
(-20037508.34, -10018754.17, 20037508.34, 10018754.17))
filter = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
@@ -497,7 +497,7 @@ class Test(unittest.TestCase):
@@ -466,7 +466,7 @@ class Test(unittest.TestCase):
data = np.dstack((data1, data2, data3))
filter_area = geometry.AreaDefinition('test', 'test', 'test',
{'proj': 'eqc', 'lon_0': 0.0,
......@@ -51,7 +51,7 @@ index 4aec478..e26a26e 100644
8, 8,
(-20037508.34, -10018754.17, 20037508.34, 10018754.17))
filter = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
@@ -566,7 +566,7 @@ class Test(unittest.TestCase):
@@ -535,7 +535,7 @@ class Test(unittest.TestCase):
def test_latlong_area(self):
area_def = geometry.AreaDefinition('', '', '',
......
From: Antonio Valentino <antonio.valentino@tiscali.it>
Date: Sat, 23 Dec 2017 09:47:39 +0100
Subject: fix tests on big-endian architectures
---
pyresample/test/test_geometry.py | 34 ++++++++++++++++++++--------------
1 file changed, 20 insertions(+), 14 deletions(-)
diff --git a/pyresample/test/test_geometry.py b/pyresample/test/test_geometry.py
index e26a26e..4db031f 100644
--- a/pyresample/test/test_geometry.py
+++ b/pyresample/test/test_geometry.py
@@ -216,13 +216,24 @@ class Test(unittest.TestCase):
def test_get_array_hashable(self):
arr = np.array([1.2, 1.3, 1.4, 1.5])
-
- np.testing.assert_allclose(np.array([ 51, 51, 51, 51, 51, 51, 243,
- 63, 205, 204, 204, 204, 204,
- 204, 244, 63, 102, 102, 102, 102,
- 102, 102, 246, 63, 0, 0,
- 0, 0, 0, 0, 248, 63],
- dtype=np.uint8),
+ if sys.byteorder == 'little':
+ # arr.view(np.uint8)
+ reference = np.array([ 51, 51, 51, 51, 51, 51, 243,
+ 63, 205, 204, 204, 204, 204,
+ 204, 244, 63, 102, 102, 102, 102,
+ 102, 102, 246, 63, 0, 0,
+ 0, 0, 0, 0, 248, 63],
+ dtype=np.uint8)
+ else:
+ # on le machines use arr.byteswap().view(np.uint8)
+ reference = np.array([ 63, 243, 51, 51, 51, 51, 51,
+ 51, 63, 244, 204, 204, 204,
+ 204, 204, 205, 63, 246, 102, 102,
+ 102, 102, 102, 102, 63, 248,
+ 0, 0, 0, 0, 0, 0],
+ dtype=np.uint8)
+
+ np.testing.assert_allclose(reference,
geometry.get_array_hashable(arr))
try:
@@ -231,13 +242,8 @@ class Test(unittest.TestCase):
pass
else:
xrarr = xr.DataArray(arr)
- np.testing.assert_allclose(np.array([ 51, 51, 51, 51, 51, 51, 243,
- 63, 205, 204, 204, 204, 204,
- 204, 244, 63, 102, 102, 102, 102,
- 102, 102, 246, 63, 0, 0,
- 0, 0, 0, 0, 248, 63],
- dtype=np.uint8),
- geometry.get_array_hashable(arr))
+ np.testing.assert_allclose(reference,
+ geometry.get_array_hashable(arr))
xrarr.attrs['hash'] = 42
self.assertEqual(geometry.get_array_hashable(xrarr),
0001-fix-doc-build.patch
0002-fix-proj4-initialization.patch
0003-fix-tests-on-big-endian-architectures.patch
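The big-endian fix above works because `ndarray.view(np.uint8)` exposes the raw in-memory bytes of the array, and that byte order depends on the host. A quick self-contained check (the expected byte values are taken from the patch itself; 1.2 is `0x3FF3333333333333` as an IEEE-754 double):

```python
import sys
import numpy as np

arr = np.array([1.2, 1.3, 1.4, 1.5])
raw = arr.view(np.uint8)           # raw bytes, in host byte order
assert raw.size == arr.size * 8    # 8 bytes per float64

if sys.byteorder == 'little':
    # least-significant byte first: 33 33 33 33 33 33 F3 3F
    expected_first = [51, 51, 51, 51, 51, 51, 243, 63]
else:
    # most-significant byte first: 3F F3 33 33 33 33 33 33
    expected_first = [63, 243, 51, 51, 51, 51, 51, 51]
assert list(raw[:8]) == expected_first
```

This is why a single hard-coded reference array could not pass on both architectures and the test needed the `sys.byteorder` branch.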
......@@ -15,15 +15,16 @@ The function **plot.save_quicklook** saves the Basemap image directly to file.
.. doctest::
>>> import numpy as np
>>> import pyresample as pr
>>> from pyresample import load_area, save_quicklook, SwathDefinition
>>> from pyresample.kd_tree import resample_nearest
>>> lons = np.zeros(1000)
>>> lats = np.arange(-80, -90, -0.01)
>>> tb37v = np.arange(1000)
>>> area_def = pr.utils.load_area('areas.cfg', 'ease_sh')
>>> swath_def = pr.geometry.SwathDefinition(lons, lats)
>>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def,
>>> area_def = load_area('areas.cfg', 'ease_sh')
>>> swath_def = SwathDefinition(lons, lats)
>>> result = resample_nearest(swath_def, tb37v, area_def,
... radius_of_influence=20000, fill_value=None)
>>> pr.plot.save_quicklook('tb37v_quick.png', area_def, result, label='Tb 37v (K)')
>>> save_quicklook('tb37v_quick.png', area_def, result, label='Tb 37v (K)')
Assuming **lons**, **lats** and **tb37v** are initialized with real data the result might look something like this:
.. image:: _static/images/tb37v_quick.png
......@@ -49,14 +50,15 @@ Assuming the file **areas.cfg** has the following area definition:
**Example usage:**
>>> import numpy as np
>>> import pyresample as pr
>>> from pyresample import load_area, save_quicklook, SwathDefinition
>>> from pyresample.kd_tree import resample_nearest
>>> lons = np.zeros(1000)
>>> lats = np.arange(-80, -90, -0.01)
>>> tb37v = np.arange(1000)
>>> area_def = pr.utils.load_area('areas.cfg', 'pc_world')
>>> swath_def = pr.geometry.SwathDefinition(lons, lats)
>>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
>>> pr.plot.save_quicklook('tb37v_pc.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
>>> area_def = load_area('areas.cfg', 'pc_world')
>>> swath_def = SwathDefinition(lons, lats)
>>> result = resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
>>> save_quicklook('tb37v_pc.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
Assuming **lons**, **lats** and **tb37v** are initialized with real data the result might look something like this:
.. image:: _static/images/tb37v_pc.png
......@@ -81,14 +83,15 @@ Assuming the file **areas.cfg** has the following area definition for an ortho p
**Example usage:**
>>> import numpy as np
>>> import pyresample as pr
>>> from pyresample import load_area, save_quicklook, SwathDefinition
>>> from pyresample.kd_tree import resample_nearest
>>> lons = np.zeros(1000)
>>> lats = np.arange(-80, -90, -0.01)
>>> tb37v = np.arange(1000)
>>> area_def = pr.utils.load_area('areas.cfg', 'ortho')
>>> swath_def = pr.geometry.SwathDefinition(lons, lats)
>>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
>>> pr.plot.save_quicklook('tb37v_ortho.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
>>> area_def = load_area('areas.cfg', 'ortho')
>>> swath_def = SwathDefinition(lons, lats)
>>> result = resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
>>> save_quicklook('tb37v_ortho.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
Assuming **lons**, **lats** and **tb37v** are initialized with real data the result might look something like this:
.. image:: _static/images/tb37v_ortho.png
......@@ -105,15 +108,16 @@ AreaDefinition using the **plot.area_def2basemap(area_def, **kwargs)** function.
>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> import pyresample as pr
>>> from pyresample import load_area, save_quicklook, area_def2basemap, SwathDefinition
>>> from pyresample.kd_tree import resample_nearest
>>> lons = np.zeros(1000)
>>> lats = np.arange(-80, -90, -0.01)
>>> tb37v = np.arange(1000)
>>> area_def = pr.utils.load_area('areas.cfg', 'ease_sh')
>>> swath_def = pr.geometry.SwathDefinition(lons, lats)
>>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def,
>>> area_def = load_area('areas.cfg', 'ease_sh')
>>> swath_def = SwathDefinition(lons, lats)
>>> result = resample_nearest(swath_def, tb37v, area_def,
... radius_of_influence=20000, fill_value=None)
>>> bmap = pr.plot.area_def2basemap(area_def)
>>> bmap = area_def2basemap(area_def)
>>> bmng = bmap.bluemarble()
>>> col = bmap.imshow(result, origin='upper')
>>> plt.savefig('tb37v_bmng.png', bbox_inches='tight')
......
......@@ -6,6 +6,15 @@ Resampling of swath data
Pyresample can be used to resample a swath dataset to a grid, a grid to a swath or a swath to another swath.
Resampling can be done using the nearest neighbour method, Gaussian weighting, or weighting with an arbitrary radial function.
.. versionchanged:: 1.8.0
`SwathDefinition` no longer checks the validity of the provided longitude
and latitude coordinates to improve performance. Longitude arrays are
expected to be between -180 and 180 degrees, latitude -90 to 90 degrees.
This also applies to all geometry definitions that are provided longitude
and latitude arrays on initialization. Use
`pyresample.utils.check_and_wrap` to preprocess your arrays.
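Based on the `test_wrap_and_check` test added elsewhere in this changeset, the recommended preprocessing helper validates latitudes and wraps out-of-range longitudes. A minimal self-contained sketch of that inferred behaviour (not the exact upstream implementation):

```python
import numpy as np

def check_and_wrap(lons, lats):
    """Sketch of pyresample.utils.check_and_wrap, behaviour inferred from
    the test added in this changeset: reject invalid latitudes, wrap
    longitudes into [-180, 180)."""
    if lats.min() < -90.0 or lats.max() > 90.0:
        raise ValueError('Some latitudes are outside the [-90;+90] validity range')
    if lons.min() < -180.0 or lons.max() >= 180.0:
        lons = (lons + 180.0) % 360.0 - 180.0
    return lons, lats

lons = np.array([-135.0, 225.0, 45.0])   # 225 wraps to -135
lats = np.full_like(lons, 70.0)
new_lons, new_lats = check_and_wrap(lons, lats)
```

Calling it once before constructing a `SwathDefinition` restores the safety the removed in-constructor checks used to provide, without paying the cost on every resample.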
pyresample.image
----------------
The ImageContainerNearest and ImageContainerBilinear classes can be used for resampling of swaths as well as grids. Below is an example using nearest neighbour resampling.
......
......@@ -15,18 +15,28 @@
# You should have received a copy of the GNU Lesser General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
from __future__ import absolute_import
import os
CHUNK_SIZE = os.getenv('PYTROLL_CHUNK_SIZE', 4096)
from pyresample.version import __version__
# Backwards compatibility
from pyresample import geometry
from pyresample import grid
from pyresample import image
from pyresample import kd_tree
from pyresample import utils
from pyresample import plot
# Easy access
from pyresample.geometry import (SwathDefinition,
AreaDefinition,
DynamicAreaDefinition)
from pyresample.utils import load_area
from pyresample.kd_tree import XArrayResamplerNN
from pyresample.plot import save_quicklook, area_def2basemap
__all__ = ['grid', 'image', 'kd_tree',
'utils', 'plot', 'geo_filter', 'geometry']
'utils', 'plot', 'geo_filter', 'geometry', 'CHUNK_SIZE']
def get_capabilities():
......
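The `PYTROLL_CHUNK_SIZE` pattern introduced above can be exercised directly. One caveat worth noting: `os.getenv` returns a string whenever the variable is set, so code that needs a numeric chunk size should cast explicitly (this sketch casts; the upstream snippet as shown does not):

```python
import os

os.environ['PYTROLL_CHUNK_SIZE'] = '2048'
# os.getenv returns a str when the variable is set, so cast before
# using the value as a numeric chunk size.
chunk_size = int(os.getenv('PYTROLL_CHUNK_SIZE', '4096'))
```

With the variable unset, the fallback `'4096'` is used instead.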
......@@ -31,7 +31,7 @@ import numpy as np
import yaml
from pyproj import Geod, Proj
from pyresample import _spatial_mp, utils
from pyresample import _spatial_mp, utils, CHUNK_SIZE
try:
from xarray import DataArray
......@@ -41,11 +41,11 @@ except ImportError:
logger = getLogger(__name__)
class DimensionError(Exception):
class DimensionError(ValueError):
pass
class IncompatibleAreas(Exception):
class IncompatibleAreas(ValueError):
"""Error when the areas to combine are not compatible."""
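Re-parenting `DimensionError` (and `IncompatibleAreas`) onto `ValueError` means callers with generic `except ValueError` handlers now catch these errors without importing pyresample-specific exception classes. A minimal illustration using the class name from the diff:

```python
class DimensionError(ValueError):
    """Stand-in for pyresample.geometry.DimensionError (now a ValueError)."""
    pass

caught = None
try:
    raise DimensionError("lons and lats must have same shape")
except ValueError as err:   # generic handlers still catch the subclass
    caught = str(err)
```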
......@@ -63,7 +63,17 @@ class Boundary(object):
class BaseDefinition(object):
"""Base class for geometry definitions"""
"""Base class for geometry definitions
.. versionchanged:: 1.8.0
`BaseDefinition` no longer checks the validity of the provided
longitude and latitude coordinates to improve performance. Longitude
arrays are expected to be between -180 and 180 degrees, latitude -90
to 90 degrees. Use `pyresample.utils.check_and_wrap` to preprocess
your arrays.
"""
def __init__(self, lons=None, lats=None, nprocs=1):
if type(lons) != type(lats):
......@@ -76,31 +86,8 @@ class BaseDefinition(object):
raise ValueError('lons and lats must have same shape')
self.nprocs = nprocs
# check the latitutes
if lats is not None:
if isinstance(lats, np.ndarray) and (lats.min() < -90. or lats.max() > 90.):
raise ValueError(
'Some latitudes are outside the [-90.;+90] validity range')
elif not isinstance(lats, np.ndarray):
# assume we have to mask an xarray
lats = lats.where((lats >= -90.) & (lats <= 90.))
self.lats = lats
# check the longitudes
if lons is not None:
if isinstance(lons, np.ndarray) and (lons.min() < -180. or lons.max() >= +180.):
# issue warning
warnings.warn('All geometry objects expect longitudes in the [-180:+180[ range. ' +
'We will now automatically wrap your longitudes into [-180:+180[, and continue. ' +
'To avoid this warning next time, use routine utils.wrap_longitudes().')
# assume we have to mask an xarray
# wrap longitudes to [-180;+180[
lons = utils.wrap_longitudes(lons)
elif not isinstance(lons, np.ndarray):
lons = utils.wrap_longitudes(lons)
self.lons = lons
self.ndim = None
self.cartesian_coords = None
......@@ -133,26 +120,24 @@ class BaseDefinition(object):
return not self.__eq__(other)
def get_area_extent_for_subset(self, row_LR, col_LR, row_UL, col_UL):
"""Retrieves area_extent for a subdomain
rows are counted from upper left to lower left
columns are counted from upper left to upper right
:Parameters:
row_LR : int
row of the lower right pixel
col_LR : int
col of the lower right pixel
row_UL : int
row of the upper left pixel
col_UL : int
col of the upper left pixel
"""Calculate extent for a subdomain of this area
:Returns:
area_extent : list
Area extent as a list (LL_x, LL_y, UR_x, UR_y) of the subset
Rows are counted from upper left to lower left and columns are
counted from upper left to upper right.
Args:
row_LR (int): row of the lower right pixel
col_LR (int): col of the lower right pixel
row_UL (int): row of the upper left pixel
col_UL (int): col of the upper left pixel
:Author:
Returns:
area_extent (tuple):
Area extent (LL_x, LL_y, UR_x, UR_y) of the subset
Author:
Ulrich Hamann
"""
(a, b) = self.get_proj_coords(data_slice=(row_LR, col_LR))
......@@ -162,7 +147,7 @@ class BaseDefinition(object):
c = c + 0.5 * self.pixel_size_x
d = d + 0.5 * self.pixel_size_y
return (a, b, c, d)
return a, b, c, d
def get_lonlat(self, row, col):
"""Retrieve lon and lat of single pixel
......@@ -454,7 +439,7 @@ def get_array_hashable(arr):
try:
return arr.name.encode('utf-8') # dask array
except AttributeError:
return arr.view(np.uint8) # np array
return np.asarray(arr).view(np.uint8) # np array
class SwathDefinition(CoordinateDefinition):
......@@ -511,7 +496,7 @@ class SwathDefinition(CoordinateDefinition):
return self.hash
def get_lonlats_dask(self, blocksize=1000, dtype=None):
def get_lonlats_dask(self, chunks=CHUNK_SIZE):
"""Get the lon lats as a single dask array."""
import dask.array as da
lons, lats = self.get_lonlats()
......@@ -520,9 +505,9 @@ class SwathDefinition(CoordinateDefinition):
return lons.data, lats.data
else:
lons = da.from_array(np.asanyarray(lons),
chunks=blocksize * lons.ndim)
chunks=chunks)
lats = da.from_array(np.asanyarray(lats),
chunks=blocksize * lats.ndim)
chunks=chunks)
return lons, lats
def _compute_omerc_parameters(self, ellipsoid):
......@@ -967,37 +952,42 @@ class AreaDefinition(BaseDefinition):
return self.get_xy_from_proj_coords(xm_, ym_)
def get_xy_from_proj_coords(self, xm_, ym_):
"""Retrieve closest x and y coordinates (column, row indices) for a
location specified with projection coordinates (xm_,ym_) in meters.
A ValueError is raised, if the return point is outside the area domain. If
xm_,ym_ is a tuple of sequences of projection coordinates, a tuple of
masked arrays are returned.
def get_xy_from_proj_coords(self, xm, ym):
"""Find closest grid cell index for a specified projection coordinate.
:Input:
xm_ : point or sequence (list or array) of x-coordinates in m (map projection)
ym_ : point or sequence (list or array) of y-coordinates in m (map projection)
If xm, ym is a tuple of sequences of projection coordinates, a tuple
of masked arrays are returned.
Args:
xm (list or array): point or sequence of x-coordinates in
meters (map projection)
ym (list or array): point or sequence of y-coordinates in
meters (map projection)
Returns:
x, y : column and row grid cell indexes as 2 scalars or arrays
Raises:
ValueError: if the return point is outside the area domain
:Returns:
(x, y) : tuple of integer points/arrays
"""
if isinstance(xm_, list):
xm_ = np.array(xm_)
if isinstance(ym_, list):
ym_ = np.array(ym_)
if isinstance(xm, list):
xm = np.array(xm)
if isinstance(ym, list):
ym = np.array(ym)
if ((isinstance(xm_, np.ndarray) and
not isinstance(ym_, np.ndarray)) or
(not isinstance(xm_, np.ndarray) and
isinstance(ym_, np.ndarray))):
raise ValueError("Both projection coordinates xm_ and ym_ needs to be of " +
if ((isinstance(xm, np.ndarray) and
not isinstance(ym, np.ndarray)) or
(not isinstance(xm, np.ndarray) and
isinstance(ym, np.ndarray))):
raise ValueError("Both projection coordinates xm and ym need to be of " +
"the same type and have the same dimensions!")
if isinstance(xm_, np.ndarray) and isinstance(ym_, np.ndarray):
if xm_.shape != ym_.shape:
if isinstance(xm, np.ndarray) and isinstance(ym, np.ndarray):
if xm.shape != ym.shape:
raise ValueError(
"projection coordinates xm_ and ym_ is not of the same shape!")
"projection coordinates xm and ym are not of the same shape!")
upl_x = self.area_extent[0]
upl_y = self.area_extent[3]
......@@ -1007,8 +997,8 @@ class AreaDefinition(BaseDefinition):
yscale = (self.area_extent[1] -
self.area_extent[3]) / float(self.y_size)
x__ = (xm_ - upl_x) / xscale
y__ = (ym_ - upl_y) / yscale
x__ = (xm - upl_x) / xscale
y__ = (ym - upl_y) / yscale
if isinstance(x__, np.ndarray) and isinstance(y__, np.ndarray):
mask = (((x__ < 0) | (x__ > self.x_size)) |
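The index arithmetic in `get_xy_from_proj_coords` reduces to a linear map from projection metres to fractional grid indices, measured from the upper-left corner. A plain-number sketch with hypothetical area values (not taken from any real area definition):

```python
# Hypothetical 4x2 area covering (-2000, -1000, 2000, 1000) metres
area_extent = (-2000.0, -1000.0, 2000.0, 1000.0)  # (LL_x, LL_y, UR_x, UR_y)
x_size, y_size = 4, 2
upl_x, upl_y = area_extent[0], area_extent[3]     # upper-left corner
xscale = (area_extent[2] - area_extent[0]) / float(x_size)  # 1000 m/pixel
yscale = (area_extent[1] - area_extent[3]) / float(y_size)  # -1000 (y grows downward)

xm, ym = 500.0, 500.0          # query point in projection metres
x = (xm - upl_x) / xscale      # 2.5 -> falls in column 2
y = (ym - upl_y) / yscale      # 0.5 -> falls in row 0
```

Values outside `[0, x_size]` or `[0, y_size]` are what the mask in the code above flags as outside the area domain.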
......@@ -1038,36 +1028,25 @@ class AreaDefinition(BaseDefinition):
return self.get_lonlats(nprocs=None, data_slice=(row, col))
def get_proj_coords_dask(self, blocksize, dtype=None):
from dask.base import tokenize
from dask.array import Array
def get_proj_vectors_dask(self, chunks=CHUNK_SIZE, dtype=None):
import dask.array as da
if dtype is None:
dtype = self.dtype
vchunks = range(0, self.y_size, blocksize)
hchunks = range(0, self.x_size, blocksize)
token = tokenize(blocksize)
name = 'get_proj_coords-' + token
dskx = {(name, i, j, 0): (self.get_proj_coords_array,
(slice(vcs, min(vcs + blocksize, self.y_size)),
slice(hcs, min(hcs + blocksize, self.x_size))),
False, dtype)
for i, vcs in enumerate(vchunks)
for j, hcs in enumerate(hchunks)
}
res = Array(dskx, name, shape=list(self.shape) + [2],
chunks=(blocksize, blocksize, 2),
dtype=dtype)
return res
target_x = da.arange(self.x_size, chunks=chunks, dtype=dtype) * \
self.pixel_size_x + self.pixel_upper_left[0]
target_y = da.arange(self.y_size, chunks=chunks, dtype=dtype) * - \
self.pixel_size_y + self.pixel_upper_left[1]
return target_x, target_y
def get_proj_coords_array(self, data_slice=None, cache=False, dtype=None):
return np.dstack(self.get_proj_coords(data_slice, cache, dtype))
def get_proj_coords_dask(self, chunks=CHUNK_SIZE, dtype=None):
# TODO: Add rotation
import dask.array as da
target_x, target_y = self.get_proj_vectors_dask(chunks, dtype)
return da.meshgrid(target_x, target_y)
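The new `get_proj_vectors_dask` replaces the hand-built task graph with two 1-D coordinate vectors plus `da.meshgrid`. The same arithmetic in plain NumPy, with made-up attribute values standing in for the area definition's:

```python
import numpy as np

# Hypothetical area attributes
x_size, y_size = 4, 3
pixel_size_x, pixel_size_y = 1000.0, 1000.0
pixel_upper_left = (-1500.0, 1000.0)   # centre of the upper-left pixel

# x increases to the right, y decreases downward (hence the negation)
target_x = np.arange(x_size) * pixel_size_x + pixel_upper_left[0]
target_y = np.arange(y_size) * -pixel_size_y + pixel_upper_left[1]
xx, yy = np.meshgrid(target_x, target_y)   # full 2-D coordinate grids
```

Keeping the vectors 1-D until `meshgrid` lets dask chunk each axis independently instead of materialising the full 3-D coordinate stack the old implementation built.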
def get_proj_coords(self, data_slice=None, cache=False, dtype=None):
"""Get projection coordinates of grid
"""Get projection coordinates of grid.
Parameters
----------
......@@ -1080,8 +1059,8 @@ class AreaDefinition(BaseDefinition):
-------
(target_x, target_y) : tuple of numpy arrays
Grids of area x- and y-coordinates in projection units
"""
"""
def do_rotation(xspan, yspan, rot_deg=0):
rot_rad = np.radians(rot_deg)
rot_mat = np.array([[np.cos(rot_rad), np.sin(rot_rad)],
......@@ -1226,37 +1205,22 @@ class AreaDefinition(BaseDefinition):
Coordinate(corner_lons[2], corner_lats[2]),
Coordinate(corner_lons[3], corner_lats[3])]
def get_lonlats_dask(self, blocksize=1000, dtype=None):
from dask.base import tokenize
from dask.array import Array
import pyproj
def get_lonlats_dask(self, chunks=CHUNK_SIZE, dtype=None):
from dask.array import map_blocks
dtype = dtype or self.dtype
proj_coords = self.get_proj_coords_dask(blocksize, dtype)
target_x, target_y = proj_coords[:, :, 0], proj_coords[:, :, 1]
target_x, target_y = self.get_proj_coords_dask(chunks, dtype)
target_proj = pyproj.Proj(**self.proj_dict)
target_proj = Proj(**self.proj_dict)
def invproj(data1, data2):
return np.dstack(target_proj(data1.compute(), data2.compute(), inverse=True))
token = tokenize(str(self), blocksize, dtype)
name = 'get_lonlats-' + token
vchunks = range(0, self.y_size, blocksize)
hchunks = range(0, self.x_size, blocksize)
dsk = {(name, i, j, 0): (invproj,
target_x[slice(vcs, min(vcs + blocksize, self.y_size)),
slice(hcs, min(hcs + blocksize, self.x_size))],
target_y[slice(vcs, min(vcs + blocksize, self.y_size)),
slice(hcs, min(hcs + blocksize, self.x_size))])
for i, vcs in enumerate(vchunks)
for j, hcs in enumerate(hchunks)
}
res = Array(dsk, name, shape=list(self.shape) + [2],
chunks=(blocksize, blocksize, 2),
dtype=dtype)
return np.dstack(target_proj(data1, data2, inverse=True))
res = map_blocks(invproj, target_x, target_y, chunks=(target_x.chunks[0],
target_x.chunks[1],
2),
new_axis=[2])
return res[:, :, 0], res[:, :, 1]
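`map_blocks` applies `invproj` chunk by chunk; each block returns the lon/lat planes stacked along a new third axis, which the final line splits back apart. The stack-and-split step in plain NumPy (standing in for the per-chunk dask behaviour):

```python
import numpy as np

x = np.arange(6.0).reshape(2, 3)
y = x + 10.0
res = np.dstack((x, y))             # shape (2, 3, 2): two planes on axis 2
lons, lats = res[:, :, 0], res[:, :, 1]
```

The `new_axis=[2]` argument in the dask call tells `map_blocks` that this third axis exists only in the output, not in the input chunks.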
def get_lonlats(self, nprocs=None, data_slice=None, cache=False, dtype=None):
......@@ -1448,13 +1412,13 @@ class StackedAreaDefinition(BaseDefinition):
return self.lons, self.lats
def get_lonlats_dask(self, blocksize=1000, dtype=None):
def get_lonlats_dask(self, chunks=CHUNK_SIZE, dtype=None):
"""Return lon and lat dask arrays of the area."""
import dask.array as da
llons = []
llats = []
for definition in self.defs:
lons, lats = definition.get_lonlats_dask(blocksize=blocksize,
lons, lats = definition.get_lonlats_dask(chunks=chunks,
dtype=dtype)
llons.append(lons)
......
......@@ -91,43 +91,6 @@ class Test(unittest.TestCase):
self.assertTrue((cart_coords.sum() - exp) < 1e-7 * exp,
msg='Calculation of cartesian coordinates failed')
def test_base_lat_invalid(self):
lons = np.arange(-135., +135, 20.)
lats = np.ones_like(lons) * 70.
lats[0] = -95
lats[1] = +95
self.assertRaises(
ValueError, geometry.BaseDefinition, lons=lons, lats=lats)
def test_base_lon_wrapping(self):
lons1 = np.arange(-135., +135, 50.)
lats = np.ones_like(lons1) * 70.
with catch_warnings() as w1:
base_def1 = geometry.BaseDefinition(lons1, lats)
self.assertFalse(
len(w1) != 0, 'Got warning <%s>, but was not expecting one' % w1)
lons2 = np.where(lons1 < 0, lons1 + 360, lons1)
with catch_warnings() as w2:
base_def2 = geometry.BaseDefinition(lons2, lats)
self.assertFalse(
len(w2) != 1, 'Failed to trigger a warning on longitude wrapping')
self.assertFalse(('-180:+180' not in str(w2[0].message)),
'Failed to trigger correct warning about longitude wrapping')
self.assertFalse(
base_def1 != base_def2, 'longitude wrapping to [-180:+180] did not work')
with catch_warnings() as w3:
base_def3 = geometry.BaseDefinition(None, None)
self.assertFalse(
len(w3) != 0, 'Got warning <%s>, but was not expecting one' % w3)
self.assert_raises(ValueError, base_def3.get_lonlats)
def test_base_type(self):
lons1 = np.arange(-135., +135, 50.)
lats = np.ones_like(lons1) * 70.
......@@ -216,13 +179,24 @@ class Test(unittest.TestCase):
def test_get_array_hashable(self):
arr = np.array([1.2, 1.3, 1.4, 1.5])
np.testing.assert_allclose(np.array([ 51, 51, 51, 51, 51, 51, 243,
if sys.byteorder == 'little':
# arr.view(np.uint8)
reference = np.array([ 51, 51, 51, 51, 51, 51, 243,
63, 205, 204, 204, 204, 204,
204, 244, 63, 102, 102, 102, 102,
102, 102, 246, 63, 0, 0,
0, 0, 0, 0, 248, 63],
dtype=np.uint8),
dtype=np.uint8)
else:
# on le machines use arr.byteswap().view(np.uint8)
reference = np.array([ 63, 243, 51, 51, 51, 51, 51,
51, 63, 244, 204, 204, 204,
204, 204, 205, 63, 246, 102, 102,
102, 102, 102, 102, 63, 248,
0, 0, 0, 0, 0, 0],
dtype=np.uint8)
np.testing.assert_allclose(reference,
geometry.get_array_hashable(arr))
try:
......@@ -231,12 +205,7 @@ class Test(unittest.TestCase):
pass
else:
xrarr = xr.DataArray(arr)
np.testing.assert_allclose(np.array([ 51, 51, 51, 51, 51, 51, 243,
63, 205, 204, 204, 204, 204,
204, 244, 63, 102, 102, 102, 102,
102, 102, 246, 63, 0, 0,
0, 0, 0, 0, 248, 63],
dtype=np.uint8),
np.testing.assert_allclose(reference,
geometry.get_array_hashable(arr))
xrarr.attrs['hash'] = 42
......@@ -755,27 +724,6 @@ class TestSwathDefinition(unittest.TestCase):
self.assertFalse(id(lons1) != id(lons2) or id(lats1) != id(lats2),
msg='Caching of swath coordinates failed')
def test_swath_wrap(self):
lons1 = np.fromfunction(lambda y, x: 3 + (10.0 / 100) * x, (5000, 100))
lats1 = np.fromfunction(
lambda y, x: 75 - (50.0 / 5000) * y, (5000, 100))
lons1 += 180.
with catch_warnings() as w1:
swath_def = geometry.BaseDefinition(lons1, lats1)
self.assertFalse(
len(w1) != 1, 'Failed to trigger a warning on longitude wrapping')
self.assertFalse(('-180:+180' not in str(w1[0].message)),
'Failed to trigger correct warning about longitude wrapping')
lons2, lats2 = swath_def.get_lonlats()
self.assertTrue(id(lons1) != id(lons2),
msg='Caching of swath coordinates failed with longitude wrapping')
self.assertTrue(lons2.min() > -180 and lons2.max() < 180,
'Wrapping of longitudes failed for SwathDefinition')
def test_concat_1d(self):
lons1 = np.array([1, 2, 3])
lats1 = np.array([1, 2, 3])
......@@ -894,7 +842,7 @@ class TestSwathDefinition(unittest.TestCase):
lons = np.array([[-45., 0., 45.],
[-90, 0., 90.],
[-135., 180., 135.]]).T
[-135., -180., 135.]]).T
area = geometry.SwathDefinition(lons, lats)
lons, lats = area.get_edge_lonlats()
......
......@@ -188,6 +188,24 @@ class TestMisc(unittest.TestCase):
self.assertFalse(
(wlons.min() < -180) or (wlons.max() >= 180) or (+180 in wlons))
def test_wrap_and_check(self):
from pyresample import utils
lons1 = np.arange(-135., +135, 50.)
lats = np.ones_like(lons1) * 70.
new_lons, new_lats = utils.check_and_wrap(lons1, lats)
self.assertIs(lats, new_lats)
self.assertTrue(np.isclose(lons1, new_lons).all())
lons2 = np.where(lons1 < 0, lons1 + 360, lons1)
new_lons, new_lats = utils.check_and_wrap(lons2, lats)
self.assertIs(lats, new_lats)
# after wrapping lons2 should look like lons1
self.assertTrue(np.isclose(lons1, new_lons).all())
lats2 = lats + 25.
self.assertRaises(ValueError, utils.check_and_wrap, lons1, lats2)
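The wrapping that the test above exercises is the standard modular reduction of longitudes into [-180, 180). A common formulation (the exact upstream helper may differ in edge-case handling):

```python
import numpy as np

def wrap_longitudes(lons):
    # Reduce any longitude into the [-180, 180) interval
    return (lons + 180.0) % 360.0 - 180.0

lons = np.array([0.0, 90.0, 180.0, 270.0, 360.0])
wrapped = wrap_longitudes(lons)   # 180 -> -180, 270 -> -90, 360 -> 0
```

Note that +180 itself maps to -180, which matches the half-open `[-180:+180[` convention used throughout pyresample.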
def test_unicode_proj4_string(self):
"""Test that unicode is accepted for area creation.
"""
......
......@@ -30,10 +30,8 @@ import yaml
from configobj import ConfigObj
from collections import Mapping
import pyresample as pr
class AreaNotFound(Exception):
class AreaNotFound(KeyError):
"""Exception raised when specified are is no found in file"""
pass
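The hunk above changes `AreaNotFound` from subclassing `Exception` to subclassing `KeyError`, so existing `except KeyError` handlers around area lookups also catch it. A minimal standalone sketch of that behavior (the error message is invented for illustration):

```python
class AreaNotFound(KeyError):
    """Exception raised when the specified area is not found in file."""

# an existing `except KeyError` handler now also catches AreaNotFound
try:
    raise AreaNotFound("no area named 'euro4'")
except KeyError as err:
    print('caught:', err)
```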
......@@ -126,6 +124,7 @@ def _parse_yaml_area_file(area_file_name, *regions):
the files, using the first file as the "base", replacing things after
that.
"""
from pyresample.geometry import DynamicAreaDefinition
area_dict = _read_yaml_area_file_content(area_file_name)
area_list = regions or area_dict.keys()
......@@ -154,7 +153,7 @@ def _parse_yaml_area_file(area_file_name, *regions):
rotation = params['rotation']
except KeyError:
rotation = 0
area = pr.geometry.DynamicAreaDefinition(area_name, description,
area = DynamicAreaDefinition(area_name, description,
projection, xsize, ysize,
area_extent,
optimize_projection,
......@@ -230,6 +229,7 @@ def _parse_legacy_area_file(area_file_name, *regions):
def _create_area(area_id, area_content):
"""Parse area configuration"""
from pyresample.geometry import AreaDefinition
config_obj = area_content.replace('{', '').replace('};', '')
config_obj = ConfigObj([line.replace(':', '=', 1)
......@@ -257,7 +257,7 @@ def _create_area(area_id, area_content):
config['AREA_EXTENT'][i] = float(val)
config['PCS_DEF'] = _get_proj4_args(config['PCS_DEF'])
return pr.geometry.AreaDefinition(config['REGION'], config['NAME'],
return AreaDefinition(config['REGION'], config['NAME'],
config['PCS_ID'], config['PCS_DEF'],
config['XSIZE'], config['YSIZE'],
config['AREA_EXTENT'], config['ROTATION'])
......@@ -292,8 +292,9 @@ def get_area_def(area_id, area_name, proj_id, proj4_args, x_size, y_size,
AreaDefinition object
"""
from pyresample.geometry import AreaDefinition
proj_dict = _get_proj4_args(proj4_args)
return pr.geometry.AreaDefinition(area_id, area_name, proj_id, proj_dict,
return AreaDefinition(area_id, area_name, proj_id, proj_dict,
x_size, y_size, area_extent)
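The hunks above replace module-level `pr.geometry`, `pr.grid`, and `pr.kd_tree` attribute access with function-local imports, which is what breaks the circular import between `utils` and `geometry` mentioned in the changelog. A hypothetical two-module sketch of the pattern (module names `a` and `b` are invented; they stand in for the real pyresample modules):

```python
import sys
import types

# module "a" imports "b" at the top level; "b" defers its import of "a"
# into the function body, so it is resolved at call time, not load time
a_src = "import b\ndef area():\n    return 'area-from-a'\n"
b_src = "def use_area():\n    import a  # deferred import\n    return a.area()\n"

for name, src in (('b', b_src), ('a', a_src)):
    mod = types.ModuleType(name)
    sys.modules[name] = mod
    exec(src, mod.__dict__)

print(sys.modules['b'].use_area())  # -> 'area-from-a'
```

Because `sys.modules` caches each module after first load, the deferred import inside the function body costs almost nothing on subsequent calls.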
......@@ -314,9 +315,10 @@ def generate_quick_linesample_arrays(source_area_def, target_area_def,
-------
(row_indices, col_indices) : tuple of numpy arrays
"""
from pyresample.grid import get_linesample
lons, lats = target_area_def.get_lonlats(nprocs)
source_pixel_y, source_pixel_x = pr.grid.get_linesample(lons, lats,
source_pixel_y, source_pixel_x = get_linesample(lons, lats,
source_area_def,
nprocs=nprocs)
......@@ -350,8 +352,9 @@ def generate_nearest_neighbour_linesample_arrays(source_area_def,
(row_indices, col_indices) : tuple of numpy arrays
"""
from pyresample.kd_tree import get_neighbour_info
valid_input_index, valid_output_index, index_array, distance_array = \
pr.kd_tree.get_neighbour_info(source_area_def,
get_neighbour_info(source_area_def,
target_area_def,
radius_of_influence,
neighbours=1,
......@@ -513,6 +516,34 @@ def wrap_longitudes(lons):
return (lons + 180) % 360 - 180
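The modulo trick above maps any longitude into the half-open interval [-180, 180). A quick standalone check (input values chosen for illustration):

```python
import numpy as np

def wrap_longitudes(lons):
    # shift so -180 maps to 0, wrap with modulo 360, shift back
    return (lons + 180) % 360 - 180

lons = np.array([-270., -180., 0., 180., 183., 359.])
print(wrap_longitudes(lons))  # 90, -180, 0, -180, -177, -1
```

Note that Python's `%` returns a result with the sign of the divisor, so negative inputs such as -270 wrap correctly without a special case.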
def check_and_wrap(lons, lats):
"""Wrap longitude to [-180:+180[ and check latitude for validity.
Args:
lons (ndarray): Longitude degrees
lats (ndarray): Latitude degrees
Returns:
lons, lats: Longitude degrees in the range [-180:180[ and the original
latitude array
Raises:
ValueError: If latitude array is not between -90 and 90
"""
# check the latitudes
if lats.min() < -90. or lats.max() > 90.:
raise ValueError(
'Some latitudes are outside the [-90.:+90] validity range')
# check the longitudes
if lons.min() < -180. or lons.max() >= 180.:
# wrap longitudes to [-180:+180[
lons = wrap_longitudes(lons)
return lons, lats
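The new helper can be exercised standalone; a minimal sketch reproducing the `check_and_wrap` logic from this diff, with sample inputs chosen for illustration:

```python
import numpy as np

def wrap_longitudes(lons):
    return (lons + 180) % 360 - 180

def check_and_wrap(lons, lats):
    # reject invalid latitudes; wrap only when longitudes are out of range
    if lats.min() < -90. or lats.max() > 90.:
        raise ValueError('Some latitudes are outside the [-90:+90] validity range')
    if lons.min() < -180. or lons.max() >= 180.:
        lons = wrap_longitudes(lons)
    return lons, lats

lons = np.array([10., 200., 350.])
lats = np.array([45., 60., -30.])
new_lons, new_lats = check_and_wrap(lons, lats)
# 200 -> -160 and 350 -> -10; valid lats come back as the same object
```

Returning the original `lats` object unchanged when it is valid is what the `assertIs(lats, new_lats)` checks in `test_wrap_and_check` rely on.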
def recursive_dict_update(d, u):
"""Recursive dictionary update using
......
......@@ -15,4 +15,4 @@
# You should have received a copy of the GNU Lesser General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
__version__ = '1.7.1'
__version__ = '1.8.0'
......@@ -26,11 +26,12 @@ from setuptools.command.build_ext import build_ext as _build_ext
version = imp.load_source('pyresample.version', 'pyresample/version.py')
requirements = ['setuptools>=3.2', 'pyproj', 'numpy', 'configobj',
requirements = ['setuptools>=3.2', 'pyproj>=1.9.5.1', 'numpy', 'configobj',
'pykdtree>=1.1.1', 'pyyaml', 'six']
extras_require = {'pykdtree': ['pykdtree>=1.1.1'],
'numexpr': ['numexpr'],
'quicklook': ['matplotlib', 'basemap', 'pillow']}
'quicklook': ['matplotlib', 'basemap', 'pillow'],
'dask': ['dask>=0.16.1']}
test_requires = []
if sys.version_info < (3, 3):
......