Commit 5dba6262 authored by Bas Couwenberg's avatar Bas Couwenberg

Imported Upstream version 1.2.4

parent 318595f6
version 1.2.4 (tag v1.2.4rel)
==============================
* Fix for issue #554. Data is now guaranteed to be in native endian
byte order before it is passed to the netcdf-c library. Data read from a
variable with non-native byte order is also byte-swapped, so that the dtype
remains consistent with the netCDF variable. Behavior is now consistent
with h5py.
* raise a warning for HDF5 1.10.x (issue #549), since
backwards-incompatible files may be created.
* raise AttributeError instead of RuntimeError when an attribute operation
fails. raise IOError instead of RuntimeError when nc_create or
nc_open fails (issue #546).
* use NamedTemporaryFile instead of the deprecated mktemp in tests
(pull request #543).
* add AppVeyor automated Windows tests (pull request #540).
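The NamedTemporaryFile switch from pull request #543 can be sketched as follows (an illustrative sketch, not the exact test code):

```python
import os
import tempfile

# tempfile.mktemp() is deprecated: it only returns a candidate name,
# which another process could claim before the caller opens it.
# NamedTemporaryFile(delete=False) actually creates the file, so the
# name is reserved; the caller is then responsible for removing it.
tmp = tempfile.NamedTemporaryFile(suffix='.nc', delete=False)
FILE_NAME = tmp.name
tmp.close()

assert FILE_NAME.endswith('.nc')
assert os.path.exists(FILE_NAME)

os.remove(FILE_NAME)  # cleanup, as the tests do in tearDown
```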
version 1.2.3.1 (tag v1.2.3.1rel)
==================================
* fix bug in setup.py (pull request #539, introduced in issue #518).
......
recursive-include docs *
recursive-include man *
recursive-include include *
recursive-include conda.recipe *
include MANIFEST.in
include README.md
include COPYING
include Changelog
include appveyor.yml
include .travis.yml
include setup.cfg
include setup.cfg.template
include examples/*py
......
# netcdf4-python
[Python](http://python.org)/[numpy](http://numpy.org) interface to the netCDF [C library](https://github.com/Unidata/netcdf-c).
[![Linux Build Status](https://travis-ci.org/Unidata/netcdf4-python.svg?branch=master)](https://travis-ci.org/Unidata/netcdf4-python)
[![Windows Build Status](https://ci.appveyor.com/api/projects/status/fl9taa9je4e6wi7n/branch/master?svg=true)](https://ci.appveyor.com/project/jswhit/netcdf4-python/branch/master)
[![PyPI package](https://badge.fury.io/py/netCDF4.svg)](http://python.org/pypi/netCDF4)
## News
For the latest updates, see the [Changelog](https://github.com/Unidata/netcdf4-python/blob/master/Changelog).
4/11/2016: Version [1.2.4](https://pypi.python.org/pypi/netCDF4/1.2.4) released.
Bugs in handling of variables with specified non-native "endian-ness" (byte-order) fixed
([issue #554](https://github.com/Unidata/netcdf4-python/issues/554)). Build instructions updated and warning issued
to deal with potential backwards incompatibility introduced when using HDF5 1.10.x
(see [Unidata/netcdf-c/issue#250](https://github.com/Unidata/netcdf-c/issues/250)).
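The endianness fix can be illustrated with plain numpy (a sketch of the behavior, not the library's internal code):

```python
import numpy as np

# Data destined for a variable with an explicit byte order is now
# byte-swapped to native order before it is handed to the netcdf-c
# library, and data read back is swapped again, so the caller always
# sees a native-endian dtype (consistent with h5py).
native = np.arange(4, dtype=np.float32)      # native byte order
big_endian = native.astype(np.dtype('>f4'))  # explicit big-endian copy

# normalize to native order, as the library now does on write/read
round_trip = big_endian.astype(big_endian.dtype.newbyteorder('='))

assert round_trip.dtype.isnative
assert np.array_equal(round_trip, native)
```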
3/10/2016: Version [1.2.3](https://pypi.python.org/pypi/netCDF4/1.2.3) released. Various bug fixes.
All text attributes in ``NETCDF4`` formatted files are now written as type ``NC_CHAR``, unless they contain Unicode characters that
cannot be encoded in ASCII, in which case they are written as ``NC_STRING``. Previously,
......
environment:
CONDA_INSTALL_LOCN: "C:\\conda"
# SDK v7.0 MSVC Express 2008's SetEnv.cmd script will fail if the
# /E:ON and /V:ON options are not enabled in the batch script interpreter
# See: http://stackoverflow.com/a/13751649/163740
CMD_IN_ENV: "cmd /E:ON /V:ON /C obvci_appveyor_python_build_env.cmd"
# Workaround for https://github.com/conda/conda-build/issues/636
PYTHONIOENCODING: "UTF-8"
PYTHONUNBUFFERED: "1"
# We set a default Python version for the miniconda that is to be installed.
# This can be overridden in the matrix definition where appropriate.
CONDA_PY: "27"
matrix:
- TARGET_ARCH: x86
CONDA_NPY: 110
CONDA_PY: 27
- TARGET_ARCH: x64
CONDA_NPY: 110
CONDA_PY: 27
- TARGET_ARCH: x86
CONDA_NPY: 110
CONDA_PY: 35
- TARGET_ARCH: x64
CONDA_NPY: 110
CONDA_PY: 35
platform:
- x64
install:
# If there is a newer build queued for the same PR, cancel this one.
# The AppVeyor 'rollout builds' option is supposed to serve the same
# purpose but it is problematic because it tends to cancel builds pushed
# directly to master instead of just PR builds (or the converse).
# credits: JuliaLang developers.
- ps: if ($env:APPVEYOR_PULL_REQUEST_NUMBER -and $env:APPVEYOR_BUILD_NUMBER -ne ((Invoke-RestMethod `
https://ci.appveyor.com/api/projects/$env:APPVEYOR_ACCOUNT_NAME/$env:APPVEYOR_PROJECT_SLUG/history?recordsNumber=50).builds | `
Where-Object pullRequestId -eq $env:APPVEYOR_PULL_REQUEST_NUMBER)[0].buildNumber) { `
throw "There are newer queued builds for this pull request, failing early." }
# Set the CONDA_NPY, although it has no impact on the actual build.
# We need this because of a test within conda-build.
- cmd: set CONDA_NPY=19
# Use the pre-installed Miniconda for the desired arch.
- ps: if($env:TARGET_ARCH -eq 'x86')
{$root = "C:\Miniconda"}
else
{$root = "C:\Miniconda-x64"}
$env:path="$root;$root\Scripts;$root\Library\bin;$($env:path)"
# Update conda.
- cmd: conda update --yes --quiet conda
# Conda build tools.
- cmd: conda install --yes --quiet obvious-ci --channel conda-forge
- cmd: obvci_install_conda_build_tools.py
- cmd: conda info
# Skip .NET project specific build phase.
build: off
test_script:
- "%CMD_IN_ENV% conda build conda.recipe --quiet"
set SITECFG=%SRC_DIR%/setup.cfg
echo [options] > %SITECFG%
echo use_cython=True >> %SITECFG%
echo [directories] >> %SITECFG%
echo HDF5_libdir = %LIBRARY_LIB% >> %SITECFG%
echo HDF5_incdir = %LIBRARY_INC% >> %SITECFG%
echo netCDF4_libdir = %LIBRARY_LIB% >> %SITECFG%
echo netCDF4_incdir = %LIBRARY_INC% >> %SITECFG%
"%PYTHON%" setup.py install --single-version-externally-managed --record record.txt
if errorlevel 1 exit 1
#!/bin/bash
SETUPCFG=$SRC_DIR/setup.cfg
echo "[options]" > $SETUPCFG
echo "use_cython=True" >> $SETUPCFG
echo "[directories]" >> $SETUPCFG
echo "netCDF4_dir = $PREFIX" >> $SETUPCFG
${PYTHON} setup.py install --single-version-externally-managed --record record.txt
{% set version = "dev" %}
package:
name: netcdf4
version: {{ version }}
source:
path: ../
build:
number: 0
entry_points:
- ncinfo = netCDF4.utils:ncinfo
- nc4tonc3 = netCDF4.utils:nc4tonc3
- nc3tonc4 = netCDF4.utils:nc3tonc4
requirements:
build:
- python
- setuptools
- numpy x.x
- cython
- hdf5
- libnetcdf
run:
- python
- setuptools
- numpy x.x
- hdf5
- libnetcdf
test:
imports:
- netCDF4
- netcdftime
commands:
- ncinfo -h
- nc4tonc3 -h
- nc3tonc4 -h
about:
home: http://github.com/Unidata/netcdf4-python
license: OSI Approved
summary: Provides an object-oriented Python interface to the netCDF version 4 library.
import os
import netCDF4
# Check OPeNDAP functionality.
url = 'http://geoport-dev.whoi.edu/thredds/dodsC/estofs/atlantic'
nc = netCDF4.Dataset(url)
# Check that the filepath method returns the OPeNDAP URL.
assert nc.filepath() == url
# Run the unittests.
test_dir = os.path.join(os.environ['SRC_DIR'], 'test')
os.chdir(test_dir)
os.system('python run_all.py')
......@@ -449,7 +449,7 @@ else:
setup(name = "netCDF4",
cmdclass = cmdclass,
-      version = "1.2.3.1",
+      version = "1.2.4",
long_description = "netCDF version 4 has many features not found in earlier versions of the library, such as hierarchical groups, zlib compression, multiple unlimited dimensions, and new data types. It is implemented on top of HDF5. This module implements most of the new features, and can read and write netCDF files compatible with older versions of the library. The API is modelled after Scientific.IO.NetCDF, and should be familiar to users of that module.\n\nThis project has a `Subversion repository <http://code.google.com/p/netcdf4-python/source>`_ where you may access the most up-to-date source.",
author = "Jeff Whitaker",
author_email = "jeffrey.s.whitaker@noaa.gov",
......
......@@ -4,6 +4,8 @@ import sys
import unittest
import os
import tempfile
import warnings
import numpy as NP
from numpy.random.mtrand import uniform
import netCDF4
......@@ -14,7 +16,7 @@ except ImportError: # or else use drop-in substitute
from ordereddict import OrderedDict
# test attribute creation.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
VAR_NAME="dummy_var"
GROUP_NAME = "dummy_group"
DIM1_NAME="x"
......@@ -139,17 +141,25 @@ class VariablesTestCase(unittest.TestCase):
assert v.getncattr('foo') == 1
assert v.getncattr('bar') == 2
# check type of attributes using ncdump (issue #529)
-        dep=subprocess.Popen(['ncdump','-h',FILE_NAME],stdout=subprocess.PIPE).communicate()[0]
-        try: # python 2
-            ncdump_output = dep.split('\n')
-        except TypeError: # python 3
-            ncdump_output = str(dep,encoding='utf-8').split('\n')
-        for line in ncdump_output:
-            line = line.strip('\t\n\r')
-            if "stringatt" in line: assert line.startswith('string')
-            if "charatt" in line: assert line.startswith(':')
-            if "cafe" in line: assert line.startswith('string')
-            if "batt" in line: assert line.startswith(':')
+        try: # ncdump may not be on the system PATH
+            nc_proc = subprocess.Popen(
+                ['ncdump', '-h', FILE_NAME], stdout=subprocess.PIPE)
+        except OSError:
+            warnings.warn('"ncdump" not on system path; cannot test '
+                          'read of some attributes')
+            pass
+        else: # We do have ncdump output
+            dep = nc_proc.communicate()[0]
+            try: # python 2
+                ncdump_output = dep.split('\n')
+            except TypeError: # python 3
+                ncdump_output = str(dep,encoding='utf-8').split('\n')
+            for line in ncdump_output:
+                line = line.strip('\t\n\r')
+                if "stringatt" in line: assert line.startswith('string')
+                if "charatt" in line: assert line.startswith(':')
+                if "cafe" in line: assert line.startswith('string')
+                if "batt" in line: assert line.startswith(':')
# check attributes in subgroup.
# global attributes.
for key,val in ATTDICT.items():
......
......@@ -3,7 +3,7 @@ import numpy as np
import sys, os, unittest, tempfile
from numpy.testing import assert_array_equal
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
dimsize = np.iinfo(np.int64).max # max signed 64 bit integer
ndim = 100
arrdata = np.random.randint(np.iinfo(np.uint8).min,np.iinfo(np.uint8).max,size=ndim)
......
......@@ -75,7 +75,7 @@ cells = numpy.array([ (387, 289, 65.64321899414062, -167.90093994140625, 355
(396, 290, 65.71821594238281, -167.9770050048828, 3545, -10149, 8941, -16614, 9, 34164, 1, 0, 200, 526, 511, 301, 170, 65528, 35, 1480, 2350, 3029, 2645, 2928, 5907, 11842, 6208, 16528, 7384, 7988, 870, 527, 661, 3054, 2504, 3291, 3235, 2490, 3424, 354, 354, 10039, 10988, 7958, 7395, 7902, 8811, 14853, 16836, 17231, 20852, 13, 7, 6, 15, 15, 15, 15, 0, 10, 5, 8, 8, 4, 5, 4, 7, 0, 0, 12, 13, 15, 5, 12, 2, 2, 6, 3, 15, 15, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 28, 28, 6, 6, 6, 0, 6, 0, 0, 2, 0, 0, 0, 255, 255, [83, -97, 14, -111, 0, 0], [13, -128, -114, 4, 0, 0, 11, 3, 20, 1], [0, 0, 0, 0, 0])],
dtype=[('mxd03_granule_row', '<i2'), ('mxd03_granule_column', '<i2'), ('mxd03_latitude', '<f4'), ('mxd03_longitude', '<f4'), ('mxd03_sensor_zenith', '<i2'), ('mxd03_sensor_azimuth', '<i2'), ('mxd03_solar_zenith', '<i2'), ('mxd03_solar_azimuth', '<i2'), ('mxd03_height', '<i2'), ('mxd03_range', '<u2'), ('mxd03_land_sea_mask', '|u1'), ('mxd03_gflags', '|u1'), ('mxd02_band_1A', '<u2'), ('mxd02_band_2A', '<u2'), ('mxd02_band_3A', '<u2'), ('mxd02_band_4A', '<u2'), ('mxd02_band_5A', '<u2'), ('mxd02_band_6A', '<u2'), ('mxd02_band_7A', '<u2'), ('mxd02_band_8', '<u2'), ('mxd02_band_9', '<u2'), ('mxd02_band_10', '<u2'), ('mxd02_band_11', '<u2'), ('mxd02_band_12', '<u2'), ('mxd02_band_13lo', '<u2'), ('mxd02_band_13hi', '<u2'), ('mxd02_band_14lo', '<u2'), ('mxd02_band_14hi', '<u2'), ('mxd02_band_15', '<u2'), ('mxd02_band_16', '<u2'), ('mxd02_band_17', '<u2'), ('mxd02_band_18', '<u2'), ('mxd02_band_19', '<u2'), ('mxd02_band_20', '<u2'), ('mxd02_band_21', '<u2'), ('mxd02_band_22', '<u2'), ('mxd02_band_23', '<u2'), ('mxd02_band_24', '<u2'), ('mxd02_band_25', '<u2'), ('mxd02_band_26', '<u2'), ('mxd02_band_26B', '<u2'), ('mxd02_band_27', '<u2'), ('mxd02_band_28', '<u2'), ('mxd02_band_29', '<u2'), ('mxd02_band_30', '<u2'), ('mxd02_band_31', '<u2'), ('mxd02_band_32', '<u2'), ('mxd02_band_33', '<u2'), ('mxd02_band_34', '<u2'), ('mxd02_band_35', '<u2'), ('mxd02_band_36', '<u2'), ('mxd02_band_uncertainity_1A', '|u1'), ('mxd02_band_uncertainity_2A', '|u1'), ('mxd02_band_uncertainity_3A', '|u1'), ('mxd02_band_uncertainity_4A', '|u1'), ('mxd02_band_uncertainity_5A', '|u1'), ('mxd02_band_uncertainity_6A', '|u1'), ('mxd02_band_uncertainity_7A', '|u1'), ('mxd02_band_uncertainity_8', '|u1'), ('mxd02_band_uncertainity_9', '|u1'), ('mxd02_band_uncertainity_10', '|u1'), ('mxd02_band_uncertainity_11', '|u1'), ('mxd02_band_uncertainity_12', '|u1'), ('mxd02_band_uncertainity_13lo', '|u1'), ('mxd02_band_uncertainity_13hi', '|u1'), ('mxd02_band_uncertainity_14lo', '|u1'), 
('mxd02_band_uncertainity_14hi', '|u1'), ('mxd02_band_uncertainity_15', '|u1'), ('mxd02_band_uncertainity_16', '|u1'), ('mxd02_band_uncertainity_17', '|u1'), ('mxd02_band_uncertainity_18', '|u1'), ('mxd02_band_uncertainity_19', '|u1'), ('mxd02_band_uncertainity_20', '|u1'), ('mxd02_band_uncertainity_21', '|u1'), ('mxd02_band_uncertainity_22', '|u1'), ('mxd02_band_uncertainity_23', '|u1'), ('mxd02_band_uncertainity_24', '|u1'), ('mxd02_band_uncertainity_25', '|u1'), ('mxd02_band_uncertainity_26', '|u1'), ('mxd02_band_uncertainity_26B', '|u1'), ('mxd02_band_uncertainity_27', '|u1'), ('mxd02_band_uncertainity_28', '|u1'), ('mxd02_band_uncertainity_29', '|u1'), ('mxd02_band_uncertainity_30', '|u1'), ('mxd02_band_uncertainity_31', '|u1'), ('mxd02_band_uncertainity_32', '|u1'), ('mxd02_band_uncertainity_33', '|u1'), ('mxd02_band_uncertainity_34', '|u1'), ('mxd02_band_uncertainity_35', '|u1'), ('mxd02_band_uncertainity_36', '|u1'), ('mxd02_band_nsamples_1A', '|i1'), ('mxd02_band_nsamples_2A', '|i1'), ('mxd02_band_nsamples_3A', '|i1'), ('mxd02_band_nsamples_4A', '|i1'), ('mxd02_band_nsamples_5A', '|i1'), ('mxd02_band_nsamples_6A', '|i1'), ('mxd02_band_nsamples_7A', '|i1'), ('reserved_20120221a', '|u1'), ('mxd11_lst', '<u2'), ('mxd11_qc', '<u2'), ('mxd11_error_lst', '<u2'), ('mxd11_emis31', '<u2'), ('mxd11_emis32', '<u2'), ('mxd11_view_angle', '|u1'), ('mxd11_view_time', '|u1'), ('mxd35_cloud_mask', '|i1', (6,)), ('mxd35_quality_assurance', '|i1', (10,)), ('reserved_20120221b', '|u1', (5,))])
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
class CompoundAlignTestCase(unittest.TestCase):
......
......@@ -9,7 +9,7 @@ from numpy.testing import assert_array_equal, assert_array_almost_equal
# test compound attributes.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
DIM_NAME = 'time'
VAR_NAME = 'wind'
VAR_NAME2 = 'forecast_wind'
......
......@@ -9,7 +9,7 @@ from numpy.testing import assert_array_equal, assert_array_almost_equal
# test compound data types.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
#FILE_NAME = 'test.nc'
DIM_NAME = 'phony_dim'
GROUP_NAME = 'phony_group'
......
......@@ -8,7 +8,7 @@ ndim = 100000
ndim2 = 100
chunk1 = 10; chunk2 = ndim2
nfiles = 7
-files = [tempfile.mktemp(".nc") for nfile in range(nfiles)]
+files = [tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name for nfile in range(nfiles)]
array = uniform(size=(ndim,))
array2 = uniform(size=(ndim,ndim2))
lsd = 3
......
......@@ -6,7 +6,7 @@ import numpy as NP
from numpy.random.mtrand import uniform
import netCDF4
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
LAT_NAME="lat"
LAT_LEN = 25
LAT_LENG = 50
......
......@@ -12,8 +12,8 @@ n2dim = 73
n3dim = 144
ranarr = 100.*uniform(size=(n1dim,n2dim,n3dim))
ranarr2 = 100.*uniform(size=(n1dim,n2dim,n3dim))
-FILE_NAME = tempfile.mktemp(".nc")
-FILE_NAME2 = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=True).name
+FILE_NAME2 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
class DisklessTestCase(unittest.TestCase):
......
......@@ -4,8 +4,9 @@ import unittest, os, tempfile
from numpy.testing import assert_array_equal, assert_array_almost_equal
data = np.arange(12,dtype='f4').reshape(3,4)
-FILE_NAME = tempfile.mktemp(".nc")
-FILE_NAME2 = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
+FILE_NAME2 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
+FILE_NAME3 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
def create_file(file,format,data):
import warnings
......@@ -26,6 +27,29 @@ def create_file(file,format,data):
bb[:] = big
dataset.close()
def check_byteswap(file, data):
# byteswapping is done internally to native endian format
# when numpy array has non-native byte order. The byteswap was
# initially done in place, which caused the numpy array to
# be modified in the calling program. Pull request #555
# changed the byteswap to a copy, and this test checks
# to make sure the input numpy array is not modified.
dataset = netCDF4.Dataset(file,'w')
dataset.createDimension('time', None)
dataset.createDimension('space', 4)
dims = ('time', 'space')
bl = dataset.createVariable('big-little', np.float32, dims, endian='big')
data2 = data.copy()
bl[:] = data
dataset.close()
f = netCDF4.Dataset(file)
bl = f.variables['big-little'][:]
# check data.
assert_array_almost_equal(data, data2)
assert_array_almost_equal(bl, data)
f.close()
def check_data(file, data):
f = netCDF4.Dataset(file)
ll = f.variables['little-little'][:]
......@@ -52,9 +76,19 @@ def issue310(file):
var_big_endian = nc.createVariable(\
'obs_big_endian', '>f8', ('obs', ),\
endian=endian,fill_value=fval)
# use default _FillValue
var_big_endian2 = nc.createVariable(\
'obs_big_endian2', '>f8', ('obs', ),\
endian=endian)
# NOTE: missing_value must be written in the same byte order
# as the variable, or the masked array won't be masked correctly
# when the data is read in.
var_big_endian.missing_value = mval
var_big_endian[0]=np.pi
var_big_endian[1]=mval
var_big_endian2.missing_value = mval
var_big_endian2[0]=np.pi
var_big_endian2[1]=mval
var_native_endian = nc.createVariable(\
'obs_native_endian', '<f8', ('obs', ),\
endian='native',fill_value=fval)
......@@ -63,6 +97,8 @@ def issue310(file):
var_native_endian[1]=mval
assert_array_almost_equal(var_native_endian[:].filled(),
var_big_endian[:].filled())
assert_array_almost_equal(var_big_endian[:].filled(),
var_big_endian2[:].filled())
nc.close()
def issue346(file):
......@@ -90,16 +126,19 @@ class EndianTestCase(unittest.TestCase):
def setUp(self):
create_file(FILE_NAME,'NETCDF4_CLASSIC',data); self.file=FILE_NAME
create_file(FILE_NAME2,'NETCDF3_CLASSIC',data); self.file2=FILE_NAME2
self.file3 = FILE_NAME3
def tearDown(self):
# Remove the temporary files
os.remove(self.file)
os.remove(self.file2)
os.remove(self.file3)
def runTest(self):
"""testing endian conversion capability"""
check_data(self.file, data)
check_data(self.file2, data)
check_byteswap(self.file3, data)
issue310(self.file)
issue346(self.file2)
......
......@@ -6,7 +6,7 @@ from netCDF4 import Dataset
import numpy as np
from numpy.testing import assert_array_equal
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
ENUM_NAME = 'cloud_t'
ENUM_BASETYPE = np.int8
VAR_NAME = 'primary_cloud'
......
......@@ -19,7 +19,7 @@ to be checked.)
See test2unlim below for an example.
"""
-file_name = tempfile.mktemp(".nc")
+file_name = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
xdim=9; ydim=10; zdim=11
i = np.array([2,5,7],'i4')
i2 = np.array([0,8],'i4')
......
......@@ -7,8 +7,8 @@ import netCDF4
# test group creation.
-FILE_NAME1 = tempfile.mktemp(".nc")
-FILE_NAME2 = tempfile.mktemp(".nc")
+FILE_NAME1 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
+FILE_NAME2 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
DYNASTY=u"Tudor"
HENRY_VII=u"Henry_VII"
MARGARET=u"Margaret"
......
......@@ -9,7 +9,7 @@ import netCDF4
# in createVariable and createGroups (added in 1.1.8).
# also test Dataset.__getitem__, also added in 1.1.8.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
class Groups2TestCase(unittest.TestCase):
......
......@@ -12,7 +12,7 @@ import netCDF4
# packing/unpacking of short ints.
# create an n1dim by n2dim random ranarr.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
ndim = 10
ranarr = 100.*uniform(size=(ndim))
ranarr2 = 100.*uniform(size=(ndim))
......
......@@ -12,9 +12,9 @@ seterr(over='ignore') # don't print warning for overflow errors
# test automatic conversion of masked arrays, and
# packing/unpacking of short ints.
-FILE_NAME1 = tempfile.mktemp(".nc")
-FILE_NAME2 = tempfile.mktemp(".nc")
-FILE_NAME3 = tempfile.mktemp(".nc")
+FILE_NAME1 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
+FILE_NAME2 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
+FILE_NAME3 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
datacheck1 =\
ma.array([0,5000.0,4000.0,0],dtype=np.float,mask=[True,False,False,True])
datacheck2 =\
......
......@@ -15,7 +15,7 @@ class SetAutoMaskTestBase(unittest.TestCase):
def setUp(self):
-        self.testfile = tempfile.mktemp(".nc")
+        self.testfile = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
self.fillval = default_fillvals["i2"]
self.v = np.array([self.fillval, 5, 4, -9999], dtype = "i2")
......@@ -137,7 +137,7 @@ class GlobalSetAutoMaskTest(unittest.TestCase):
def setUp(self):
-        self.testfile = tempfile.mktemp(".nc")
+        self.testfile = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
f = Dataset(self.testfile, 'w')
......
......@@ -8,7 +8,7 @@ import tempfile, unittest, os, datetime
nx=100; ydim=5; zdim=1
nfiles = 10
ninc = nx/nfiles
-files = [tempfile.mktemp(".nc") for nfile in range(nfiles)]
+files = [tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name for nfile in range(nfiles)]
data = randint(0,10,size=(nx,ydim,zdim))
missval = 99
data[::10] = missval
......@@ -66,7 +66,7 @@ class NonuniformTimeTestCase(unittest.TestCase):
ninc = 365
def setUp(self):
-        self.files = [tempfile.mktemp(".nc") for nfile in range(2)]
+        self.files = [tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name for nfile in range(2)]
for nfile,file in enumerate(self.files):
f = Dataset(file,'w',format='NETCDF4_CLASSIC')
f.createDimension('time',None)
......
......@@ -8,7 +8,7 @@ import tempfile, unittest, os, datetime
nx=100; ydim=5; zdim=1
nfiles = 10
ninc = nx/nfiles
-files = [tempfile.mktemp(".nc") for nfile in range(nfiles)]
+files = [tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name for nfile in range(nfiles)]
data = randint(0,10,size=(nx,ydim,zdim))
missval = 99
data[::10] = missval
......@@ -68,7 +68,7 @@ class NonuniformTimeTestCase(unittest.TestCase):
ninc = 365
def setUp(self):
-        self.files = [tempfile.mktemp(".nc") for nfile in range(2)]
+        self.files = [tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name for nfile in range(2)]
for nfile,file in enumerate(self.files):
f = Dataset(file,'w',format='NETCDF4_CLASSIC')
f.createDimension('time',None)
......
......@@ -506,7 +506,7 @@ class TestDate2index(unittest.TestCase):
self.standardtime = self.TestTime(datetime(1950, 1, 1), 366, 24,
'hours since 1900-01-01', 'standard')
-        self.file = tempfile.mktemp(".nc")
+        self.file = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
f = Dataset(self.file, 'w')
f.createDimension('time', None)
time = f.createVariable('time', float, ('time',))
......
import unittest, netCDF4, tempfile, os
-file_name = tempfile.mktemp(".nc")
+file_name = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
class RefCountTestCase(unittest.TestCase):
......
......@@ -9,7 +9,7 @@ from netCDF4 import __has_rename_grp__
# test changing dimension, variable names
# and deleting attributes.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
LAT_NAME="lat"
LON_NAME="lon"
LON_NAME2 = "longitude"
......
......@@ -10,7 +10,7 @@ import math
VAR_NAME='temp'
VAR_TYPE='f4'
VAR_VAL=math.pi
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
GROUP_NAME = 'subgroup'
# test scalar variable creation and retrieval.
......
......@@ -16,7 +16,7 @@ class SetAutoScaleTestBase(unittest.TestCase):
def setUp(self):
-        self.testfile = tempfile.mktemp(".nc")
+        self.testfile = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
self.fillval = default_fillvals["i2"]
self.missing_value = -9999
......@@ -184,7 +184,7 @@ class GlobalSetAutoScaleTest(unittest.TestCase):
def setUp(self):
-        self.testfile = tempfile.mktemp(".nc")
+        self.testfile = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
f = Dataset(self.testfile, 'w')
......
......@@ -2,7 +2,7 @@ from netCDF4 import Dataset
import tempfile, unittest, os
import numpy as np
-file_name = tempfile.mktemp(".nc")
+file_name = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
xdim=None; ydim=121; zdim=169
datashape = (ydim,zdim)
data = np.ones(datashape,dtype=np.float)
......
......@@ -5,7 +5,7 @@ assert_array_almost_equal
import tempfile, unittest, os, random
import numpy as np
-file_name = tempfile.mktemp(".nc")
+file_name = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
xdim=9; ydim=10; zdim=11
#seed(9) # fix seed
data = randint(0,10,size=(xdim,ydim,zdim)).astype('u1')
......
......@@ -10,7 +10,7 @@ import netCDF4
# test primitive data types.
# create an n1dim by n2dim random ranarr.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
n1dim = 5
n2dim = 10
ranarr = 100.*uniform(size=(n1dim,n2dim))
......
......@@ -4,7 +4,7 @@ import sys, unittest, os, tempfile
netCDF4.default_encoding = 'utf-8'
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
ATT1 = u'\u03a0\u03a3\u03a9'
ATT2 = u'x\xb0'
ATT3 = [u'\u03a0',u'\u03a3',u'\u03a9']
......
......@@ -4,7 +4,7 @@ import sys, unittest, os, tempfile
netCDF4.default_encoding = 'utf-8'
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
ATT1 = '\u03a0\u03a3\u03a9'
ATT2 = 'x\xb0'
ATT3 = ['\u03a0','\u03a3','\u03a9']
......
......@@ -15,7 +15,7 @@ n1dim = 4
n2dim = 10
n3dim = 8
ranarr = 100.*uniform(size=(n1dim,n2dim,n3dim))
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
class UnlimdimTestCase(unittest.TestCase):
......
......@@ -9,7 +9,7 @@ import netCDF4
# test variable creation.
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
VAR_DOUBLE_NAME="dummy_var"
VAR_SHORT_NAME='dummy_var_short'
VARNAMES = sorted([VAR_DOUBLE_NAME,VAR_SHORT_NAME])
......
......@@ -6,7 +6,7 @@ from netCDF4 import Dataset
import numpy as np
from numpy.testing import assert_array_equal
-FILE_NAME = tempfile.mktemp(".nc")
+FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
VL_NAME = 'vlen_type'