......@@ -11,9 +11,7 @@ modification, are permitted provided that the following conditions are met:
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of the GeoDa Center for Geospatial Analysis and Computation
nor the names of its contributors may be used to endorse or promote products
derived from this software without specific prior written permission.
* Neither the name of the PySAL Developers nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND
CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES,
......
# Migrating to PySAL 2.0
<div align='left'>
[![Gitter](https://badges.gitter.im/pysal/pysal.svg)](https://gitter.im/pysal/pysal)
</div>
PySAL, the Python spatial analysis library, will be changing its package structure.
- We are changing the module structure to better reflect *what you do* with the library rather than the *academic disciplines* the components of the library come from.
- This also makes the library significantly more maintainable for us, since it reduces the bulk of the library and more evenly distributes the load for maintainers.
- As an added benefit, we will release these components, called *submodules*, independently. This lets end users only install components they need, which is helpful for our colleagues in restricted data centers.
- **The main reason** we are doing this, though, is to make it easier for us to implement new features, maintain existing features, and solicit new contributions and modules with less friction.
#### The Long & Short of it
Practically speaking, this means that `pysal` is a single *source redistribution* of many separately-maintained packages, called `submodules`. Each of these submodules is available by itself on `PyPI`. They are maintained by individuals more closely tied to their code base, and are released on their own schedules. Every six months, the main maintainers of PySAL will collate, test, and re-distribute a stable version of these submodules. The single source release is intended to support our re-packagers, like OSGeoLIVE, Debian, and Conda, as well as most end users. Each *subpackage*, an individual component available on PyPI, is the locus of development, and is pushed forward (or created anew) by its specific maintainers on their respective repositories. For developers interested in minimizing their dependency requirements, it is possible to depend on each `subpackage` alone.
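For instance, a downstream project that only needs graph construction and exploratory statistics could declare just those subpackages as dependencies. The sketch below is illustrative only: the project name is hypothetical, and the subpackage names (`libpysal`, `esda`) come from the package list later in this guide.
```python
# setup.py (illustrative sketch) for a hypothetical downstream project that
# depends on individual PySAL subpackages rather than the full meta-package.
from setuptools import setup, find_packages

setup(
    name='my_spatial_tool',      # hypothetical package name
    version='0.1.0',
    packages=find_packages(),
    install_requires=[
        'libpysal',              # core: graphs/weights, readers & writers
        'esda',                  # exploratory spatial data analysis
    ],
)
```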
### The Structure of PySAL 2.0
To make this happen, we must re-arrange some existing functionality in the library so that it can be packaged separately. In total, `pysal` 2.0 will be organized into four main thematic modules with the following functionality:
* `lib`: core functionality used by other modules to work with spatial data in Python, including:
    - the construction of graphs (also known as spatial weights) from spatial data
    - computational geometry algorithms:
        + alpha shapes
        + quadtrees, rTrees, & spherical KDTrees
        + methods to construct graphs from `scipy` Delaunay triangulation objects
        + pure Python data structures to represent planar geometric objects
    - pure Python readers/writers for graph/spatial weights files and some spatial data types
* `explore`: exploratory spatial data analysis of clusters, hotspots, and spatial outliers, plus spatial statistics on graphs and point patterns. These include:
    - point pattern analysis of centrography and colocation ($F$, $G$, $J$, $K$ statistics in space)
    - pattern analysis of point patterns snapped to a network ($F$, $G$, $K$ statistics on networks)
    - analysis of spatial patterning on lattices, including:
        + univariate join count statistics
        + Moran's $I$ statistics, including local & bivariate methods
        + Geary's $C$
        + Getis-Ord $G$, $G^*$, and local $G$ statistics
        + general $\Gamma$ statistics for spatial clustering
    - methods to construct & analyze the space-time dynamics and distributions of data, including Markov models and distributional dynamics statistics
* `model`: explicitly spatial modeling tools, including:
    - geographically-weighted regression, a generalized additive spatial model specification for local coefficients
    - spatially-correlated variance-components models, a type of Bayesian hierarchical model providing for group-level spatial mixed effects, as well as diagnostics for Bayesian modeling (Geweke, Markov Chain Monte Carlo Standard Error, Potential Scale Reduction Factors)
    - Bayesian spatially-varying coefficient process models (e.g. local random effects models)
    - Maximum Likelihood spatial econometric models, including:
        + mixed regressive-autoregressive (spatial lag) models
        + spatially-correlated error models
        + spatial regimes/spatial mixture models
        + seemingly-unrelated regression models
        + combinations of these various effects
    - econometric specification testing methods, including spatial Lagrange Multiplier, Anselin-Kelejian, Chow, and Jarque-Bera tests.
* `viz`: methods to visualize and analyze spatial datasets, specifically the output of exploratory spatial statistics.
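Put together, the thematic layout above maps directly onto import paths. The sketch below previews where some familiar pieces land, using the locations given in the [**Module Lookup**](#module-lookup) section below; it is not an exhaustive tour.
```python
# A quick tour of the pysal 2.0 thematic namespaces (paths per the Module
# Lookup section below; this is a sketch, not an exhaustive listing).
from pysal.lib import weights, cg, examples   # core: graphs, computational geometry, example data
from pysal.explore import esda, giddy         # exploratory statistics & distribution dynamics
from pysal.model import spreg                 # spatial econometric regression
from pysal.viz import mapclassify             # map classification for cartography
```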
This document provides a brief overview of what each of the new modules is, how they relate to the last **legacy** version of `pysal`, version `1.14.4`, and what you need to do in order to keep your code running the way you expect.
# Porting your code
The changes in `pysal` group together modules that are used for a similar purpose:
![migration_graph](https://raw.githubusercontent.com/ljwolf/ljwolf.github.io/master/images/migration_graph.png)
Many things are new in `pysal` 2.0. In order to ensure we can keep making new things easily and maintain what we have, we've released each of the *submodules* (those underneath `model`, `viz`, and `explore`) as its own Python package on PyPI. We will also release the `lib` module as its own package, `libpysal`, on PyPI.
This means that anyone can create a new submodule by following our [submodule contract](https://github.com/pysal/pysal/wiki/Submodule-Contract). Further, contributors should make pull requests to the submodules directly, not to [`pysal/pysal`](https://github.com/pysal/pysal).
## Using the standard `pysal` for a stable six-month release cycle
If you want to keep your usual stable `pysal` dependency, it is sufficient to update your imports according to the mappings we provide in the [**Module Lookup**](#module-lookup) section. `pysal` will continue to follow regular six-month releases put out by multiple maintainers, with bug-fix releases as needed throughout the year.
This version will still have nightly regression testing run and will be kept working with the latest releases of `numpy` and `scipy`. If you don't have an urgent need to trim your dependencies or control exactly which parts of PySAL are installed, continuing to depend on `pysal` directly is the right choice.
## Using the appropriate sub-module for fresher releases & more stable dependencies
If you only use one contained part of `pysal`, are interested in developing another statistical analysis package that only depends on `libpysal`, or simply want to keep your build as lean as possible, you can also install only the sub-modules you require, independently of `pysal`. This is the best option for those of you in restricted analysis environments where every line of code must be vetted by an expert, such as users in restricted data centers conducting academic work.
**To preview these changes, install the `pysalnext` package using `pip`. If your package works with `pysalnext`, it should work on `pysal` 2.0.**
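As a quick smoke test, and assuming `pysalnext` mirrors the `pysal` 2.0 layout described in this guide, you might try imports like the sketch below, swapping in whichever modules your own code touches.
```python
# Minimal smoke test against the preview package. Assumes pysalnext mirrors
# the pysal 2.0 layout (lib / explore / model / viz) described in this guide.
from pysalnext.lib import weights, examples
from pysalnext.explore import esda

graph = weights.W.from_file(examples.get_path('columbus.gal'))
print(graph.n)  # basic sanity check that the graph loaded
```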
All of the sub-packages included in `pysalnext` contain a significant amount of new functionality, as well as interoperability tools for other packages, such as `networkx` and `geopandas`. In addition, most of the old tools from `pysal` are reorganized. In total, there are `12` distinct packages in `pysalnext`, with more being added often. These packages are:
- `libpysal`: the core of the library, containing computational geometry, graph construction, and read/write tools.
- `esda`: the exploratory spatial data analysis toolkit, containing many statistical functions for characterizing spatial patterns.
- `pointpats`: methods and statistical functions to analyze spatial clustering in point patterns
- `spaghetti`: methods and statistical functions to analyze geographic networks, including the statistical distribution of points on network topologies.
- `giddy`: the geospatial distribution dynamics package, designed to study and characterize how distributions change and relate over space and time.
- `mapclassify`: a package with map classification algorithms for cartography
- `splot`: a collection of statistical visualizations for the analysis methods included across `pysalnext`
- `gwr`: geographically-weighted regression (both single- and multi-scale)
- `spglm`: methods & functions to fit sparse GLMs
- `spint`: spatial interaction models
- `spreg`: spatial econometric regression
- `spvcm`: spatially-correlated variance components models, plus diagnostics and plotting for Bayesian models fit in `pysalnext`.
Four main changes have occurred between legacy `pysal` and `pysalnext`:
1. `pysal.contrib`: removed due to a lack of unit testing and heterogeneous code quality; moved to independent modules where possible.
2. `pysal.esda.smoothing`: removed due to intractable and subtle numerical bugs in the code caused by porting to Python 3. There is an effort to re-implement this, and it will be added back when/if that effort finishes.
3. `pysal.region`: removed because the new version has significantly more dependencies, including `pulp` and `scikit-learn`. Grab this as a standalone package using `pip install region`.
4. `pysal.meta`: removed.
If you'd like to find where old code lives in the new submodules, there is usually a one-to-one replacement. This is discussed later in the [**Module Lookup**](#module-lookup) section.
# Module Lookup
Here is a list of the locations of all of the commonly-used modules in legacy `pysal`, and where they will move in the next release of `pysal`.
**To preview these changes, install the `pysalnext` package using `pip`. If your package works with `pysalnext`, it should work on `pysal` 2.0.**
### Modules in `libpysal`:
- `pysal.cg` will change to `pysal.lib.cg`
- `pysal.weights` will change to `pysal.lib.weights`, and many weights construction classes now have `from_dataframe` methods that can construct graphs directly from `geopandas` `GeoDataFrame`s.
- `pysal.open` will change to `pysal.lib.io.open`, and most of `pysal.core` will move to `pysal.lib.io`. *Note: using `pysal.lib.io.open` for **anything** but reading/writing spatial weights matrices/graphs is not advised; please migrate to `geopandas` for all spatial data read/write. Further, `WeightsConverter` has also been deprecated; if you need to convert weights, do so manually using sequential `open` and `write` statements, as in the sketch after this list.*
- `pysal.examples` will change to `pysal.lib.examples`
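To illustrate the note on converting weights above: a manual conversion is just a read followed by a write. The sketch below assumes `pysal.lib.io.open` behaves like the legacy `pysal.open`, and the output filename is hypothetical.
```python
# Sketch of a manual weights conversion: read a graph file, then write it back out.
# Assumes pysal.lib.io.open behaves like legacy pysal.open; the output path is hypothetical.
from pysal.lib import io, examples

graph = io.open(examples.get_path('columbus.gal'), 'r').read()
out = io.open('columbus_copy.gal', 'w')
out.write(graph)
out.close()
```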
### Modules from `spatial_dynamics`:
`pysal.spatial_dynamics` will change to `pysal.explore.giddy`
### Modules from `esda`:
These will mainly move into `pysal.explore.esda`, except for `smoothing` and `mixture_smoothing` (which will be deprecated) and `mapclassify`, which will move to `pysal.viz.mapclassify`.
### Modules from `network`:
These will move directly into `pysal.explore.spaghetti`
### Modules from `inequality`:
`pysal.inequality` has been published as its own package, `inequality`, and moved to `pysal.explore.inequality`.
### Modules from `spreg`:
`pysal.spreg` has been moved wholesale into `pysal.model.spreg`. The surrounding `pysal.model` namespace now contains many additional kinds of spatial regression models, including spatial interaction, Bayesian multilevel, and geographically-weighted regression methods.
## Examples
#### Reading/Writing Data:
```python
import numpy as np
import pysal
file_handler = pysal.open(pysal.examples.get_path('columbus.dbf'))
data = np.asarray(file_handler.by_col('HOVAL'))
```
becomes:
```python
import geopandas
from pysal.lib import examples
dataframe = geopandas.read_file(examples.get_path('columbus.dbf'))
data = dataframe['HOVAL'].values
```
#### Reading/Writing Graphs or Spatial Weights:
```python
import pysal
graph = pysal.open(pysal.examples.get_path('columbus.gal')).read()
```
becomes
```python
from pysal.lib import weights, examples
graph = weights.W.from_file(examples.get_path('columbus.gal'))
```
or, building directly on top of the developer-focused `libpysal` package:
```python
from libpysal import weights, examples
graph = weights.W.from_file(examples.get_path('columbus.gal'))
```
#### Making map classifications
```python
from pysal.esda.mapclassify import Jenks_Caspall
Jenks_Caspall(your_data).yb
```
becomes
```python
from pysal.viz import mapclassify
labels = mapclassify.Jenks_Caspall(features).yb
```
Or, built directly on top of the developer-focused package `mapclassify`, which may have newer features in the future:
```python
import mapclassify
labels = mapclassify.Jenks_Caspall(features).yb
```
#### Fitting a spatial regression:
```python
import numpy as np
import pysal
file_handler = pysal.open(pysal.examples.get_path('columbus.dbf'))
y = np.asarray(file_handler.by_col('HOVAL'))
X = file_handler.by_col_array(['CRIME', 'INC'])
graph = pysal.queen_from_shapefile(pysal.examples.get_path('columbus.shp'))
model = pysal.spreg.ML_Lag(y, X, w=graph,
                           name_x=['CRIME', 'INC'],
                           name_y='HOVAL')
```
becomes
```python
from pysal.model import spreg
from pysal.lib import weights, examples
import geopandas
dataframe = geopandas.read_file(examples.get_path("columbus.shp"))
graph = weights.Queen.from_dataframe(dataframe)  # Queen.from_shapefile also supported
model = spreg.ML_Lag(dataframe[['HOVAL']].values,
                     dataframe[['CRIME', 'INC']].values,
                     w=graph, name_x=['CRIME', 'INC'], name_y='HOVAL')
```
Or, building on top of the standalone `spreg` package, which may have new bugfixes or compatibility options in the future:
```python
import spreg
from libpysal import weights, examples
import geopandas
dataframe = geopandas.read_file(examples.get_path("columbus.shp"))
graph = weights.Queen.from_dataframe(dataframe)
model = spreg.ML_Lag(dataframe[['HOVAL']].values,
                     dataframe[['CRIME', 'INC']].values,
                     w=graph, name_x=['CRIME', 'INC'], name_y='HOVAL')
model2 = spreg.ML_Lag.from_formula('HOVAL ~ CRIME + INC',
                                   data=dataframe)
```
#### Computing a Moran statistic:
```python
import numpy as np
import pysal
file_handler = pysal.open(pysal.examples.get_path('columbus.dbf'))
y = np.asarray(file_handler.by_col('HOVAL'))
graph = pysal.open(pysal.examples.get_path('columbus.gal')).read()
moran_stat = pysal.Moran(y,graph)
print(moran_stat.I, moran_stat.p_z_sim)
```
becomes
```python
from pysal.explore import esda
from pysal.lib import weights, examples
import geopandas
dataframe = geopandas.read_file(examples.get_path("columbus.shp"))
graph = weights.Queen.from_dataframe(dataframe)
moran_stat = esda.Moran(dataframe['HOVAL'], graph)
print(moran_stat.I, moran_stat.p_z_sim)
```
or, building directly off of the developer-focused package `esda` , which may have features not yet available in `pysal` itself:
```python
import esda
from pysal.lib import weights, examples
import geopandas
dataframe = geopandas.read_file(examples.get_path("columbus.shp"))
graph = weights.Queen.from_dataframe(dataframe)
moran_stat = esda.Moran(dataframe['HOVAL'], graph, fancy_new_option=True)
print(moran_stat.I, moran_stat.p_z_sim)
```
# I really don't want to change anything; what can I do?
<font color='red'><b>This is not recommended.</b></font>
For a longer change window, feel free to `import pysal._legacy as pysal`. We urge you not to do this, since we plan to deprecate this shim as well. If you can make the changes described above, you will have a much more stable and future-proof API. We feel these changes are reasonable and will greatly enhance how easy it is for us to maintain `pysal` and move new functionality forward.
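If you do take this route, the alias is intended as a drop-in for the legacy namespace. A minimal sketch, assuming the legacy API (e.g. `pysal.open`, `pysal.examples`) is exposed under `pysal._legacy`:
```python
# Transitional shim only; this legacy alias is itself slated for deprecation.
# Assumes the legacy API is exposed unchanged under pysal._legacy.
import pysal._legacy as pysal

graph = pysal.open(pysal.examples.get_path('columbus.gal')).read()
```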
### Please contact us on [gitter](https://gitter.im/pysal/pysal) if there are any remaining concerns or questions, and for help or advice.
pysal (1.14.3-2) UNRELEASED; urgency=medium
pysal (1.14.4-1) unstable; urgency=medium
* Team upload.
* New upstream release.
* Update copyright-format URL to use HTTPS.
* Update Vcs-* URLs for Salsa.
* Bump Standards-Version to 4.1.5, no changes.
* Drop ancient X-Python-Version field.
* Strip trailing whitespace from control & rules files.
* Use filter instead of findstring to prevent partial matches.
* Drop test_examples.patch, applied upstream.
* Add overrides for package-contains-documentation-outside-usr-share-doc.
* Add lintian override for python-module-in-wrong-location.
-- Bas Couwenberg <sebastic@debian.org> Sun, 21 Jan 2018 10:32:03 +0100
-- Bas Couwenberg <sebastic@debian.org> Thu, 19 Jul 2018 06:58:36 +0200
pysal (1.14.3-1) unstable; urgency=medium
......
Description: Fix UnicodeDecodeError by changing UTF-8 character to ASCII.
======================================================================
ERROR: test_parser (pysal.examples.test_examples.Example_Tester)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/pysal-1.14.3/.pybuild/pythonX.Y_3.6/build/pysal/examples/test_examples.py", line 15, in test_parser
self.extext = ex.explain(example)
File "/build/pysal-1.14.3/.pybuild/pythonX.Y_3.6/build/pysal/examples/__init__.py", line 77, in explain
return _read_example(fpath)
File "/build/pysal-1.14.3/.pybuild/pythonX.Y_3.6/build/pysal/examples/__init__.py", line 55, in _read_example
title = io.readline().strip('\n')
File "/usr/lib/python3.6/encodings/ascii.py", line 26, in decode
return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe3 in position 384: ordinal not in range(128)
-------------------- >> begin captured stdout << ---------------------
Author: Bas Couwenberg <sebastic@debian.org>
Forwarded: https://github.com/pysal/pysal/pull/1002
Applied-Upstream: https://github.com/pysal/pysal/commit/8d1e59649d2ee14f5499ed5ba68d99e32fea89f8
--- a/pysal/examples/networks/README.md
+++ b/pysal/examples/networks/README.md
@@ -9,7 +9,7 @@ Datasets used for network testing
* eberly_net.shx: spatial index.
* eberly_net_pts_offnetwork.dbf: attribute data for points off network. (k=2)
* eberly_net_pts_offnetwork.shp: Point shapefile. (n=100)
-* eberly_net_pts_offnetwork.shx: spatial index。
+* eberly_net_pts_offnetwork.shx: spatial index.
* eberly_net_pts_onnetwork.dbf: attribute data for points on network. (k=1)
* eberly_net_pts_onnetwork.shp: Point shapefile. (n=110)
* eberly_net_pts_onnetwork.shx: spatial index.
# README files are parsed by the code.
package-contains-documentation-outside-usr-share-doc usr/lib/python*/dist-packages/pysal/examples/*
# README files are parsed by the code.
package-contains-documentation-outside-usr-share-doc usr/lib/python*/dist-packages/pysal/examples/*
# False positive? dh_python3 should do the right thing.
python-module-in-wrong-location usr/lib/python3.*/dist-packages/pysal/* usr/lib/python3/dist-packages/pysal/*
......@@ -47,9 +47,9 @@ copyright = u'2014-, PySAL Developers; 2009-13 Sergio Rey'
# built documents.
#
# The short X.Y version.
version = '1.14.3'
version = '1.14.4'
# The full version, including alpha/beta/rc tags.
release = '1.14.3'
release = '1.14.4'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
......
......@@ -14,21 +14,19 @@ Prepare the release
$ python tools/github_stats.py days >> chglog
- where `days` is the number of days to start the logs at
- Prepend `chglog` to `CHANGELOG` and edit
- where `days` is the number of days to start the logs at.
- Prepend `chglog` to `CHANGELOG` and edit.
- Edit THANKS and README and README.md if needed.
- Edit the file `version.py` to update MAJOR, MINOR, MICRO
- Edit the file `version.py` to update MAJOR, MINOR, MICRO.
- Bump::
$ cd tools; python bump.py
- Commit all changes.
- Push_ your branch up to your GitHub repos
- Push_ your branch up to your GitHub repos.
- On github issue a pull request, with a target of **upstream master**.
Add a comment that this is for release.
Make docs
---------
......@@ -37,14 +35,14 @@ As of version 1.6, docs are automatically compiled and hosted_.
Make a source dist and test locally (Python 3)
----------------------------------------------
On each build machine
On each build machine::
$ git clone http://github.com/pysal/pysal.git
$ cd pysal
$ python setup.py sdist
$ cp dist/PySAL-1.14.2.tar.gz ../junk/.
$ cd ../junk
$ conda create -n pysaltest3 python=3 pip
$ conda create -n pysaltest3 python=3 pip nose nose-progressive nose-exclude
$ source activate pysaltest3
$ pip install PySAL-1.14.2.tar.gz
$ rm -r /home/serge/anaconda3/envs/pysaltest3/lib/python3.6/site-packages/pysal/contrib
......@@ -56,15 +54,17 @@ You can modify the above to test for Python 2 environments.
Upload release to pypi
----------------------
- Make and upload_ to the Python Package Index in one shot!::
- Make and upload_ to the `Python Package Index`_ in one shot!::
$ python setup.py sdist upload
- if not registered_, do so. Follow the prompts. You can save the
login credentials in a dot-file, .pypirc
- if not registered_, do so by following the prompts::
$ python setup.py register
- You can save the login credentials in a dot-file, `.pypirc`.
- Make and upload the Windows installer to SourceForge.
- On a Windows box, build the installer as so::
- Make and upload the Windows installer to SourceForge_. On a Windows box, build the installer as so::
$ python setup.py bdist_wininst
......@@ -77,26 +77,29 @@ https://help.github.com/articles/creating-releases/
Announce
--------
- Draft and distribute press release on openspace-list, pysal.org, spatial.ucr.edu
- Draft and distribute press release on openspace-list, pysal.org, `spatial.ucr.edu`_ .
Bump master version
-------------------
- Change MAJOR, MINOR version in setup script.
- Change pysal/version.py to dev number
- Change the docs version from X.x to X.xdev by editing doc/source/conf.py in two places.
- Update the release schedule in doc/source/developers/guidelines.rst
- Change MAJOR, MINOR version in `setup.py`.
- Change `pysal/version.py`.
- Change the docs version by editing `doc/source/conf.py` in two places (version and release).
- Update the release schedule in `doc/source/developers/guidelines.rst`.
Update the `github.io news page <https://github.com/pysal/pysal.github.io/blob/master/_includes/news.md>`_
to announce the release.
.. _upload: http://docs.python.org/2.7/distutils/uploading.html
.. _registered: http://docs.python.org/2.7/distutils/packageindex.html
.. _upload: https://docs.python.org/2.7/distutils/packageindex.html#the-upload-command
.. _registered: https://docs.python.org/2.7/distutils/packageindex.html#the-register-command
.. _source: http://docs.python.org/distutils/sourcedist.html
.. _hosted: http://pysal.readthedocs.org
.. _hosted: http://pysal.readthedocs.io/en/latest/users/index.html
.. _branch: https://github.com/pysal/pysal/wiki/GitHub-Standard-Operating-Procedures
.. _policy: https://github.com/pysal/pysal/wiki/Example-git-config
.. _create the release: https://help.github.com/articles/creating-releases/
.. _Push: https://github.com/pysal/pysal/wiki/GitHub-Standard-Operating-Procedures
.. _Python Package Index: https://pypi.python.org/pypi/PySAL
.. _SourceForge: https://sourceforge.net
.. _spatial.ucr.edu: http://spatial.ucr.edu/news.html
......@@ -20,8 +20,8 @@ PySAL
.. sidebar:: Releases
- `Stable 1.14.3 (Released 2017-11-2) <users/installation.html>`_
- `Development <http://github.com/pysal/pysal/tree/dev>`_
- `Stable 1.14.4 (Released 2018-07-17) <users/installation.html>`_
- `Development <http://github.com/pysal/pysal/tree/master>`_
PySAL is an open source library of spatial analysis functions written in
Python intended to support the development of high level applications.
......
......@@ -10,7 +10,7 @@ References
.. [Anselin1997] Anselin, L. and Kelejian, H. H. (1997). Testing for spatial error autocorrelation in the presence of endogenous regressors. International Regional Science Review, 20(1-2):153–182.
.. [Anselin2011] Anselin, L. (2011). GMM Estimation of Spatial Error Autocorrelation with and without Heteroskedasticity.
.. [Arraiz2010] Arraiz, I., Drukker, D. M., Kelejian, H. H., and Prucha, I. R. (2010). A spatial Cliff-Ord-type model with heteroskedastic innovations: Small and large sample results. Journal of Regional Science, 50(2):592–614.
.. [Assuncao1999] Assuncao, R. M. and Reis, E. A. (1999). A new proposal to adjust moran’s i for population density. Statistics in medicine, 18(16):2147–2162.
.. [Assuncao1999] Assuncao, R. M. and Reis, E. A. (1999). A new proposal to adjust Moran’s I for population density. Statistics in medicine, 18(16):2147–2162.
.. [Baker2004] Baker, R. D. (2004). Identifying space–time disease clusters. Acta tropica, 91(3):291–299.
.. [Belsley1980] Belsley, D. A., Kuh, E., and Welsch, R. E. (1980). Regression diagnostics: Identifying influential data and sources of collinearity, volume 1.
.. [Bickenbach2003] Bickenbach, F. and Bode, E. (2003). Evaluating the Markov property in studies of economic convergence. International Regional Science Review, 26(3):363–392.
......
......@@ -7,7 +7,7 @@ Spatial Econometrics
Comprehensive user documentation on spreg can be found in
Anselin, L. and S.J. Rey (2014) `Modern Spatial Econometrics in Practice:
A Guide to GeoDa, GeoDaSpace and PySAL.
<http://www.amazon.com/Modern-Spatial-Econometrics-Practice-GeoDaSpace-ebook/dp/B00RI9I44K>`_
<http://a.co/aHZnDLW>`_
GeoDa Press, Chicago.
......
......@@ -52,7 +52,19 @@ from pysal.core import IOHandlers
# Assign pysal.open to dispatcher
open = pysal.core.FileIO.FileIO
from pysal.version import version
from pysal.version import version as __version__, new_api_date
from warnings import warn
from numpy import VisibleDeprecationWarning
warn("PySAL's API will be changed on {}. The last "
"release made with this API is version {}. "
"A preview of the next API version is provided in "
"the `pysalnext` package. The API changes and a "
"guide on how to change imports is provided "
"at https://migrating.pysal.org".format(new_api_date,
__version__
), VisibleDeprecationWarning)
# Load the IOHandlers
#from pysal.version import stable_release_date
#import urllib2, json
#import config
......
......@@ -158,7 +158,11 @@ class Arc_KDTree(temp_KDTree):
distance_upper_bound, self.radius)
d, i = temp_KDTree.query(self, self._toXYZ(x), k,
eps=eps, distance_upper_bound=distance_upper_bound)
if isinstance(d, float):
dims = 0
else:
dims = len(d.shape)
r = self.radius
if dims == 0:
return sphere.linear2arcdist(d, r), i
......
......@@ -9,14 +9,18 @@ import numpy as np
import pysal as ps
import matplotlib.pyplot as plt
__all__ = ['mplot']
def mplot(m, xlabel='', ylabel='', title='', custom=(7,7)):
'''
"""
Produce basic Moran Plot
...
Parameters
---------
m : array
values of Moran's I
----------
m : pysal.Moran instance
values of Moran's I Global Autocorrelation Statistic
xlabel : str
label for x axis
ylabel : str
......@@ -27,26 +31,45 @@ def mplot(m, xlabel='', ylabel='', title='', custom=(7,7)):
dimensions of figure size
Returns
---------
plot : png
image file showing plot
-------
fig : Matplotlib Figure instance
Moran scatterplot figure
'''
Examples
--------
>>> import matplotlib.pyplot as plt
>>> import pysal as ps
>>> from pysal.contrib.pdio import read_files
>>> from pysal.contrib.viz.plot import mplot
>>> link = ps.examples.get_path('columbus.shp')
>>> db = read_files(link)
>>> y = db['HOVAL'].values
>>> w = ps.queen_from_shapefile(link)
>>> w.transform = 'R'
>>> m = ps.Moran(y, w)
>>> mplot(m, xlabel='Response', ylabel='Spatial Lag',
... title='Moran Scatterplot', custom=(7,7))
>>> plt.show()
"""
lag = ps.lag_spatial(m.w, m.z)
fit = ps.spreg.OLS(m.z[:, None], lag[:,None])
## Customize plot
# Customize plot
fig = plt.figure(figsize=custom)
plt.xlabel(xlabel)
plt.ylabel(ylabel)
plt.suptitle(title)
ax = fig.add_subplot(111)
ax.set_xlabel(xlabel)
ax.set_ylabel(ylabel)
fig.suptitle(title)
plt.scatter(m.z, lag, s=60, color='k', alpha=.6)
plt.plot(lag, fit.predy, color='r')
ax.scatter(m.z, lag, s=60, color='k', alpha=.6)
ax.plot(lag, fit.predy, color='r')
plt.axvline(0, alpha=0.5)
plt.axhline(0, alpha=0.5)
plt.show()
ax.axvline(0, alpha=0.5)
ax.axhline(0, alpha=0.5)
return None
return fig
import matplotlib.pyplot as plt
import pysal as ps
from pysal.contrib.pdio import read_files
from pysal.contrib.viz.plot import mplot
def test_mplot():
    link = ps.examples.get_path('columbus.shp')
    db = read_files(link)
    y = db['HOVAL'].values
    w = ps.queen_from_shapefile(link)
    w.transform = 'R'
    m = ps.Moran(y, w)
    fig = mplot(m, xlabel='Response', ylabel='Spatial Lag',
                title='Moran Scatterplot', custom=(7,7))
    plt.close(fig)
......@@ -328,7 +328,7 @@ class Moran_BV(object):
Notes
-----
Inference is only based on permutations as analytical results are none too
Inference is only based on permutations as analytical results are not too
reliable.
Examples
......@@ -553,6 +553,16 @@ class Moran_Rate(Moran):
permutations : int
number of random permutations for calculation of pseudo
p_values
geoda_rate : boolean
If adjusted=False, geoda_rate is ignored.
If adjusted=True and geoda_rate=True, rates are adjusted and
conform with Geoda implementation: if a<0, variance
estimator is v_i = b/x_i for any i; otherwise v_i = a+b/x_i.
If adjusted=True and geoda_rate=False, conform with
Assuncao and Reis (1999) [Assuncao1999]_ :
assign v_i = a+b/x_i and check individual v_i: if v_i<0,
assign v_i = b/x_i.
Default is True.
Attributes
----------
......@@ -628,11 +638,11 @@ class Moran_Rate(Moran):
"""
def __init__(self, e, b, w, adjusted=True, transformation="r",
permutations=PERMUTATIONS, two_tailed=True):
permutations=PERMUTATIONS, two_tailed=True, geoda_rate=True):
e = np.asarray(e).flatten()
b = np.asarray(b).flatten()
if adjusted:
y = assuncao_rate(e, b)
y = assuncao_rate(e, b, geoda=geoda_rate)
else:
y = e * 1.0 / b
Moran.__init__(self, y, w, transformation=transformation,
......@@ -1220,6 +1230,16 @@ class Moran_Local_Rate(Moran_Local):
(default=False)
If True use GeoDa scheme: HH=1, LL=2, LH=3, HL=4
If False use PySAL Scheme: HH=1, LH=2, LL=3, HL=4
geoda_rate : boolean
If adjusted=False, geoda_rate is ignored.
If adjusted=True and geoda_rate=True, rates are adjusted and
conform with Geoda implementation: if a<0, variance
estimator is v_i = b/x_i for any i; otherwise v_i = a+b/x_i.
If adjusted=True and geoda_rate=False, conform with
Assuncao and Reis (1999) [Assuncao1999]_ :
assign v_i = a+b/x_i and check individual v_i: if v_i<0,
assign v_i = b/x_i.
Default is True.
Attributes
----------
y : array
......@@ -1293,11 +1313,11 @@ class Moran_Local_Rate(Moran_Local):
"""
def __init__(self, e, b, w, adjusted=True, transformation="r",
permutations=PERMUTATIONS, geoda_quads=False):
permutations=PERMUTATIONS, geoda_quads=False, geoda_rate=True):
e = np.asarray(e).flatten()
b = np.asarray(b).flatten()
if adjusted:
y = assuncao_rate(e, b)
y = assuncao_rate(e, b, geoda=geoda_rate)
else:
y = e * 1.0 / b
Moran_Local.__init__(self, y, w,
......
......@@ -502,11 +502,9 @@ def choynowski(e, b, n, threshold=None):
return np.array(p)
def assuncao_rate(e, b):
"""The standardized rates where the mean and stadard deviation used for
the standardization are those of Empirical Bayes rate estimates
The standardized rates resulting from this function are used to compute
Moran's I corrected for rate variables [Choynowski1959]_ .
def assuncao_rate(e, b, geoda=True):
"""Standardized rates used for computing Moran's I corrected for rate
variables.
Parameters
----------
......@@ -514,10 +512,19 @@ def assuncao_rate(e, b):
(n, 1), event variable measured at n spatial units
b : array
(n, 1), population at risk variable measured at n spatial units
geoda : boolean
If True, conform with Geoda implementation: if a<0, variance
estimator is v_i = b/x_i for any i; otherwise v_i = a+b/x_i.
If False, conform with Assuncao and Reis (1999) [Assuncao1999]_ :
assign v_i = a+b/x_i and check individual v_i: if v_i<0,
assign v_i = b/x_i.
Default is True.
Notes
-----
e and b are arranged in the same order
The mean and standard deviation used for standardizing rates are those
of Empirical Bayes rate estimates.
Based on Assuncao and Reis (1999) [Assuncao1999]_ .
Returns
-------
......@@ -526,13 +533,12 @@ def assuncao_rate(e, b):
Examples
--------
Creating an array of an event variable (e.g., the number of cancer patients)
Create an array of an event variable (e.g., the number of cancer patients)
for 8 regions.
>>> e = np.array([30, 25, 25, 15, 33, 21, 30, 20])
Creating another array of a population-at-risk variable (e.g., total population)
Create another array of a population-at-risk variable (e.g., total population)
for the same 8 regions.
The order for entering values is the same as the case of e.
......@@ -540,20 +546,35 @@ def assuncao_rate(e, b):
Computing the rates
>>> print assuncao_rate(e, b)[:4]
[ 1.03843594 -0.04099089 -0.56250375 -1.73061861]
>>> print(assuncao_rate(e, b)[:4])
[ 0.95839273 -0.03783129 -0.51460896 -1.61105841]
>>> import pysal
>>> w = pysal.open(pysal.examples.get_path("sids2.gal")).read()
>>> f = pysal.open(pysal.examples.get_path("sids2.dbf"))
>>> e = np.array(f.by_col('SID79'))
>>> b = np.array(f.by_col('BIR79'))
>>> print(assuncao_rate(e, b)[:4])
[-1.48875691 1.78507268 -0.34422806 0.26190802]
"""
y = e * 1.0 / b
e = np.array(e).astype(float)
b = np.array(b).astype(float)
y = e / b
e_sum, b_sum = sum(e), sum(b)
ebi_b = e_sum * 1.0 / b_sum
s2 = sum(b * ((y - ebi_b) ** 2)) / b_sum
ebi_a = s2 - ebi_b / (float(b_sum) / len(e))
ebi_v_raw = ebi_a + ebi_b / b
ebi_v = np.where(ebi_v_raw < 0, ebi_b / b, ebi_v_raw)
ebi_b = e_sum / b_sum
s2 = sum(b * (((y - ebi_b) ** 2)/ b_sum))
ebi_a = s2 - (ebi_b / (b_sum / len(e)))
ebi_v = ebi_a + ebi_b / b
if geoda:
if ebi_a < 0:
ebi_v = ebi_b / b
else:
ebi_v = np.where(ebi_v < 0, ebi_b / b, ebi_v)
return (y - ebi_b) / np.sqrt(ebi_v)
class _Smoother(object):
"""
This is a helper class that implements things that all smoothers should do.
......