Changing argument units is overcomplicated
In my understanding, the only parameters needed to represent data in different units are:
1. The argument data itself, in the internal units of BA, e.g. in degrees. For uniform axes, just the min and max values.
2. The currently selected units, as a string or enum.
3. The wavelength value, for working with q-space units.
(3) can be taken from the linked instrument or set independently. Only a single wavelength value is needed, not a distribution.
The original units for (1) are provided by the instrument in the case of a uniform mesh, i.e. without an argument value for each point. For pointwise data, the original units are defined by the loader.
(2) is provided by the user; the current units should be taken from the widget.
Unit transformation should be done at the level of the plotting widget. The only thing to do is to convert the argument value for each point while adding data to the graph, without changing the original.
But the current implementation looks seriously overengineered. We do the conversion somewhere inside the data itself, consuming excessive resources in a long, non-obvious chain.
We start from `DataItem::updateCoords`. In the case of 2D data we also use `MaskUnitsConverter` (why do we need it?). Then we use `InstrumentItem::createCoordSystem()`, which creates a detector, a beam, and more, and returns an `ICoordSystem` "converter". This converter calls `ICoordSystem::convertedAxes`, which goes deeper into the `Device/Coord` directory, calling numerous functions one by one. After all of that, we recreate the `Datafield`.
That approach comes from the core, but it seems the GUI side of coordinates can be refactored independently. If the result of the refactoring proves successful, the same approach can then be applied to the core.