[Solved] AttributeError: 'int' object has no attribute 'encode' HDF5
I'm trying to open an HDF5 file in Python using the following code:

```python
import h5py
import numpy as np

with h5py.File('example.hdf5', 'r') as f:
    ls = list(f.keys())
    dat = f.get('data')
    dt = np.array(dat)
```
However, I get this error when executing the last line:
AttributeError: 'int' object has no attribute 'encode'
dat has the following class:
Does anyone know where this error could come from?
Iterating over the contents of the file produces the output below. How can I access each part of the file?
```
checking hdf5 file
/data is a Group
/data/model_cints is a Dataset
/hdf5_track_times is a Dataset
/meta is a Group
/meta/package is a Group
/meta/package/h5py is an object Dataset
/meta/package/numpy is an object Dataset
/meta/package/pypfilt is an object Dataset
/meta/package/python is an object Dataset
/meta/package/scipy is an object Dataset
/meta/package/toml is an object Dataset
/meta/param is a Group
/meta/param/component is a Group
/meta/param/component/model is an object Dataset
/meta/param/component/obs is a Group
/meta/param/component/obs/LatLon is an object Dataset
/meta/param/component/summary_table is a Group
/meta/param/component/summary_table/model_cints is an object Dataset
/meta/param/component/time is an object Dataset
/meta/param/data_dir is an object Dataset
/meta/param/hist is a Group
/meta/param/hist/extra_cols is a Dataset
/meta/param/hist/px_count is a Dataset
/meta/param/hist/wind_shift is a Dataset
/meta/param/hist/wind_size is a Dataset
/meta/param/last_n_periods is a Dataset
/meta/param/minimal_estimation_run is a Dataset
/meta/param/model is a Group
/meta/param/model/param_max is a Dataset
/meta/param/model/param_min is a Dataset
/meta/param/obs is a Group
/meta/param/obs/LatLon is a Group
/meta/param/obs/LatLon/sdev is a Dataset
/meta/param/out_dir is an object Dataset
/meta/param/prng_seed is a Dataset
/meta/param/random is an object Dataset
/meta/param/resample is a Group
/meta/param/resample/method is an object Dataset
/meta/param/resample/reg_toln is a Dataset
/meta/param/resample/regularisation is a Dataset
/meta/param/resample/regularise_or_fail is a Dataset
/meta/param/resample/threshold is a Dataset
/meta/param/scenario is a Group
/meta/param/scenario/id is an object Dataset
/meta/param/scenario/name is an object Dataset
/meta/param/steps_per_unit is a Dataset
/meta/param/summary is a Group
/meta/param/summary/from_first_day is a Dataset
/meta/param/summary/meta is a Group
/meta/param/summary/meta/packages is an object Dataset
/meta/param/summary/only_forecasts is a Dataset
/meta/param/time is a Group
/meta/param/time/start is a Dataset
/meta/param/time/until is a Dataset
/meta/param/tmp_dir is an object Dataset
/meta/prior is a Group
/meta/prior/lat is an object Dataset
/meta/prior/lon is an object Dataset
/meta/prior/speed_lat is an object Dataset
/meta/prior/speed_lon is an object Dataset
/meta/sim is a Group
/meta/sim/cmdline is an object Dataset
```
Here is a code snippet to test your keys for type (Group or Dataset). It uses the visititems() method to recursively walk the nodes in your file and report each one as 1) a Group, 2) a Dataset, 3) an object Dataset, or 4) an unknown type. Once you find a dataset, you can read it and create a NumPy array.
However, that is not required. You can work with an h5py dataset object "as-if" it were a NumPy array.
```python
import h5py

def visitor_func(name, node):
    if isinstance(node, h5py.Group):
        print(node.name, 'is a Group')
    elif isinstance(node, h5py.Dataset):
        if node.dtype == 'object':
            print(node.name, 'is an object Dataset')
        else:
            print(node.name, 'is a Dataset')
    else:
        print(node.name, 'is an unknown type')

print('checking hdf5 file')
with h5py.File('example.hdf5', 'r') as h5f:
    h5f.visititems(visitor_func)
```
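To illustrate the "read a dataset into a NumPy array" step, here is a minimal round-trip sketch. It creates its own small test file so it is self-contained; the file name and the `data` dataset are placeholders, not your actual file:

```python
import os
import tempfile

import h5py
import numpy as np

# Build a throwaway file so the read below is reproducible.
path = os.path.join(tempfile.mkdtemp(), 'example.hdf5')
with h5py.File(path, 'w') as f:
    f.create_dataset('data', data=np.arange(12).reshape(3, 4))

with h5py.File(path, 'r') as f:
    dset = f['data']               # an h5py Dataset object; data not yet in memory
    print(dset.shape, dset.dtype)  # attributes work like a NumPy array's
    arr = dset[()]                 # read the entire dataset into a NumPy array
    row = dset[0]                  # or slice it "as-if" it were an array
```

Note that `dset` is only valid while the file is open; slicing (or `dset[()]`) copies the data into a NumPy array that you can keep using after the `with` block exits.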