xarray.Dataset.to_netcdf
- Dataset.to_netcdf(path=None, mode='w', format=None, group=None, engine=None, encoding=None, unlimited_dims=None, compute=True, invalid_netcdf=False, auto_complex=None)
Write dataset contents to a netCDF file.
- Parameters:
  - **path** (str, path-like, file-like or None, optional) – Path to which to save this dataset, or a file-like object to write it to (which must support read and write and be seekable), or None (default) to return in-memory bytes as a memoryview.
  - **mode** ({"w", "a"}, default: "w") – Write ("w") or append ("a") mode. If mode="w", any existing file at this location will be overwritten. If mode="a", existing variables will be overwritten.
{"NETCDF4", "NETCDF4_CLASSIC", "NETCDF3_64BIT", "NETCDF3_CLASSIC"}, optional) – File format for the resulting netCDF file:NETCDF4: Data is stored in an HDF5 file, using netCDF4 API features.
NETCDF4_CLASSIC: Data is stored in an HDF5 file, using only netCDF 3 compatible API features.
NETCDF3_64BIT: 64-bit offset version of the netCDF 3 file format, which fully supports 2+ GB files, but is only compatible with clients linked against netCDF version 3.6.0 or later.
NETCDF3_CLASSIC: The classic netCDF 3 file format. It does not handle 2+ GB files very well.
All formats are supported by the netCDF4-python library. scipy.io.netcdf only supports the last two formats.
The default format is NETCDF4 if you are saving a file to disk and have the netCDF4-python library available. Otherwise, xarray falls back to using scipy to write netCDF files and defaults to the NETCDF3_64BIT format (scipy does not support netCDF4).
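    Since scipy.io.netcdf cannot read netCDF4/HDF5 files, pinning a netCDF 3 format is one way to keep output readable without HDF5-based libraries. A sketch (file name illustrative):

    ```python
    import xarray as xr

    ds = xr.Dataset({"t": ("x", [1.0, 2.0, 3.0])})  # illustrative data
    ds.to_netcdf("example_v3.nc", format="NETCDF3_CLASSIC")
    ```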
  - **group** (str, optional) – Path to the netCDF4 group in the given file to open (only works for format="NETCDF4"). The group(s) will be created if necessary.
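    A sketch of writing into a nested group (the group path and file name are illustrative):

    ```python
    import xarray as xr

    ds = xr.Dataset({"t": ("x", [1.0, 2.0, 3.0])})  # illustrative data
    # Intermediate groups are created if they do not already exist.
    ds.to_netcdf("grouped.nc", group="/simulations/run1", format="NETCDF4")
    ```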
{"netcdf4", "h5netcdf", "scipy"}, optional) – Engine to use when writing netCDF files. If not provided, the default engine is chosen based on available dependencies, by default preferring “netcdf4” over “h5netcdf” over “scipy” (customizable vianetcdf_engine_orderinxarray.set_options()).encoding (
  - **encoding** (dict, optional) – Nested dictionary with variable names as keys and dictionaries of variable-specific encodings as values, e.g., {"my_variable": {"dtype": "int16", "scale_factor": 0.1, "zlib": True}, ...}. If encoding is specified, the original encoding of the variables of the dataset is ignored. The h5netcdf engine supports both the NetCDF4-style compression encoding parameters {"zlib": True, "complevel": 9} and the h5py ones {"compression": "gzip", "compression_opts": 9}. This allows using any compression plugin installed in the HDF5 library, e.g. LZF.
  - **unlimited_dims** (iterable of hashable, optional) – Dimension(s) that should be serialized as unlimited dimensions. By default, no dimensions are treated as unlimited. Note that unlimited_dims may also be set via dataset.encoding["unlimited_dims"].
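    A sketch combining both parameters, reusing the encoding example from above (variable and file names are illustrative):

    ```python
    import numpy as np
    import xarray as xr

    ds = xr.Dataset({"my_variable": (("time", "x"), np.zeros((4, 8)))})  # illustrative

    # Pack floats into int16 with zlib compression, and make "time" appendable.
    ds.to_netcdf(
        "encoded.nc",
        encoding={"my_variable": {"dtype": "int16", "scale_factor": 0.1, "zlib": True}},
        unlimited_dims=["time"],
    )
    ```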
  - **compute** (bool, default: True) – If true, compute immediately; otherwise return a dask.delayed.Delayed object that can be computed later.
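    A sketch of a deferred write, assuming dask is installed (chunking makes the dataset lazy):

    ```python
    import numpy as np
    import xarray as xr

    ds = xr.Dataset({"t": ("x", np.arange(10.0))}).chunk({"x": 5})

    delayed = ds.to_netcdf("lazy.nc", compute=False)  # dask.delayed.Delayed
    delayed.compute()  # perform the actual write now
    ```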
  - **invalid_netcdf** (bool, default: False) – Only valid along with engine="h5netcdf". If True, allow writing HDF5 files which are invalid netCDF, as described in h5netcdf/h5netcdf.
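    One common use is storing data the netCDF model cannot represent, such as complex values; a sketch assuming h5netcdf is installed:

    ```python
    import numpy as np
    import xarray as xr

    ds = xr.Dataset({"z": ("x", np.array([1 + 2j, 3 - 4j]))})  # illustrative data
    # The result is a valid HDF5 file but not a valid netCDF file.
    ds.to_netcdf("complex.h5", engine="h5netcdf", invalid_netcdf=True)
    ```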
- Returns:
  - memoryview if path is None
  - dask.delayed.Delayed if compute is False
  - None otherwise
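With path=None, the serialized file comes back as an in-memory buffer. A sketch of a round trip, assuming an engine that supports in-memory writes (scipy or h5netcdf) is available:

```python
import io

import xarray as xr

ds = xr.Dataset({"t": ("x", [1.0, 2.0])})  # illustrative data
buf = ds.to_netcdf()  # path=None: returns a memoryview of the file contents
restored = xr.open_dataset(io.BytesIO(bytes(buf)))  # read it back from memory
```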