Attempting to save an Experiment to the same file sometimes leads to a crash
This issue is hard to reproduce; current observations and guesses are given below.
An example of the error message:
```
HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 140296662921152:
  #000: ../../../src/H5F.c line 444 in H5Fcreate(): unable to create file
    major: File accessibilty
    minor: Unable to open file
  #001: ../../../src/H5Fint.c line 1364 in H5F__create(): unable to open file
    major: File accessibilty
    minor: Unable to open file
  #002: ../../../src/H5Fint.c line 1579 in H5F_open(): unable to truncate a file which is already open
    major: File accessibilty
    minor: Unable to open file
terminate called after throwing an instance of 'H5::FileIException'
Aborted
```
- The problem happens irregularly (at least, under Linux/Debian).
- The problem seems to be related to larger data files (e.g., trypsin raw data).
- The problem occurs when saving the Experiment to a file with the same name. Saving the Experiment under a different filename does not produce any problems.
First suspicions are:
- `ExperimentImporter`: opens HDF5 files several times without closing them. Could it be opening the file in different modes (e.g., `read-only`, then `read-write`)?
- `HDF5BloscFilter`: multiple initialization and release.
- Another reason could be an attempt to store empty (or null) data.
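The "unable to truncate a file which is already open" failure in the log above can be reproduced outside NSXTool. Below is a minimal sketch using Python's h5py bindings (assuming h5py is installed; the file name is arbitrary, and the C++ `H5::H5File` API hits the same HDF5-level check):

```python
import h5py

path = "experiment.h5"

# Create a small file, then keep a read-only handle open,
# mimicking ExperimentImporter holding the file.
with h5py.File(path, "w") as f:
    f.create_dataset("peaks", data=[1, 2, 3])

reader = h5py.File(path, "r")  # importer-style handle, never closed

# Now try to overwrite the same file, as ExperimentExporter does with
# H5F_ACC_TRUNC: HDF5 refuses to truncate a file that is still open.
error = None
try:
    h5py.File(path, "w")
except OSError as exc:
    error = str(exc)

reader.close()
print(error)
```

Note that HDF5 rejects the truncate before touching the file, so the original data survives; the corruption observed in the GUI must come from a later step.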
The crash seems to be related to the "Update Peaks" button: after it is pressed, an attempt to save the Experiment leads to an HDF5 abort message. The HDF5 file apparently becomes corrupt thereafter, in the sense that it cannot be saved again, even after closing and re-opening the NSXTool GUI. Checking the saved .nsx (HDF5) file with independent HDF5 tools shows that the found peaks are saved correctly, but the unit cell is not saved properly.
The main issue is that the peak collections are tiny compared with the arrays used to store the image data.
Probably a sign that a member wasn't populated during `Refiner::updatePredictions`. Can't see where it happens in `Refiner.cpp`, where the `Peak3D` objects in a vector are changed in place, so there's no initialisation.
`RefinementBatch` uses `UnitCellHandler` to generate a new unit cell per batch; it's stored in a map in the handler. Are we definitely iterating through the map and saving all unit cells?

One might reasonably assume that there's only one unit cell. It could just be a broken pointer to a unit cell somehow.
One of the roots of this problem is accessing the same HDF5 data file simultaneously in two different modes: `H5F_ACC_RDONLY` mode in the `ExperimentImporter` module and `H5F_ACC_TRUNC` mode in the `ExperimentExporter` module (provided the user attempts to overwrite the file). This leads to an `H5::FileIException` and an abort.
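Until the double-open itself is fixed, the overwrite path could be made safe by never truncating the live file: write the new data to a temporary file in the same directory and atomically rename it over the original once the write succeeds. The following is a stdlib-only sketch of that pattern (plain files for illustration, not actual NSXTool/HDF5 code; `safe_overwrite` is a hypothetical helper name):

```python
import os
import tempfile

def safe_overwrite(path: str, data: bytes) -> None:
    """Write data to path without truncating the existing file in place."""
    directory = os.path.dirname(os.path.abspath(path))
    # Temporary file in the same directory, so the final rename stays
    # on one filesystem and os.replace() is atomic.
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(data)
            tmp.flush()
            os.fsync(tmp.fileno())
        os.replace(tmp_path, path)
    except BaseException:
        os.unlink(tmp_path)
        raise

# Overwrite a file while another handle still has it open for reading:
with open("experiment.nsx", "wb") as f:
    f.write(b"old contents")

reader = open("experiment.nsx", "rb")       # stale importer-style handle
safe_overwrite("experiment.nsx", b"new contents")
print(reader.read())                        # on POSIX, the old handle keeps the old inode
reader.close()
```

On POSIX systems a reader holding the old file open simply keeps reading the old inode, so this sidesteps the `H5F_ACC_TRUNC`-on-open-file conflict entirely; proper handle management in `ExperimentImporter` would still be the cleaner fix.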