# Multi-experiment data extraction
## Explanation of changes
Based on issue #262 (closed) and MR !239 (closed).
This is a first draft of a multi-experiment data extraction function (a better, more descriptive name for the function is welcome). It extracts a varying parameter from a range of experiment folders and adds it to the datasets of those individual experiments.
First, the datasets within the given time range are concatenated, and a new coordinate (`concat-tuids`) with references to the original datasets is added (as in the advanced example). Then the varying parameter is extracted from the `snapshot.json` files, put into a numpy array, and extended to match the dimensions of the new dataset. It is then set as a new coordinate on the dataset.

In the following example, the parameter extracted from `snapshot.json` is the flux bias current for qubit 4. `Basic2DAnalysis` runs on this new dataset.
## Motivation of changes
I'm not sure about the placement in `handling.py`, so feedback is certainly welcome.
The data is extracted by the function:

```python
def multi_experiment_data_extractor(
    varying_parameter: Dict[str, str],
    experiment: str,
    new_name: Optional[str] = None,
    t_start: Optional[str] = None,
    t_stop: Optional[str] = None,
) -> xr.Dataset:
```
`t_start` and `t_stop` specify the start and end timestamps of the multifile analysis, and `new_name` gives the name of the dataset to be created. The `experiment` variable is a string containing the name of the experiment. With this name and `t_start` and `t_stop`, the TUIDs are found in the experiment folder.
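The lookup of TUIDs from the experiment name and time range could look roughly as follows. This is a minimal stdlib sketch, not the actual implementation: the helper name `find_experiment_tuids`, the flat folder layout, and the assumption that each experiment folder is named `<tuid>_<experiment name>` with the TUID starting with a `YYYYmmDD-HHMMSS` timestamp are all hypothetical.

```python
from datetime import datetime
from pathlib import Path
from typing import List, Optional


def find_experiment_tuids(
    data_dir: Path,
    experiment: str,
    t_start: Optional[str] = None,
    t_stop: Optional[str] = None,
) -> List[str]:
    """Return TUIDs of experiments whose timestamp lies in [t_start, t_stop].

    Assumes (hypothetically) that each experiment folder is named
    "<tuid>_<experiment name>" and that the TUID starts with a
    "YYYYmmDD-HHMMSS" timestamp.
    """
    fmt = "%Y%m%d-%H%M%S"
    lower = datetime.strptime(t_start, fmt) if t_start else datetime.min
    upper = datetime.strptime(t_stop, fmt) if t_stop else datetime.max

    tuids = []
    for folder in sorted(data_dir.iterdir()):
        name = folder.name
        if experiment not in name:
            continue
        try:
            # The first 15 characters are assumed to be the timestamp part.
            stamp = datetime.strptime(name[:15], fmt)
        except ValueError:
            continue  # not an experiment folder
        if lower <= stamp <= upper:
            # The TUID is assumed to be everything before the first underscore.
            tuids.append(name.split("_")[0])
    return tuids
```

With `t_start`/`t_stop` left as `None`, all folders matching the experiment name are returned.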
The varying parameter is a dictionary in which the parameter is specified such that it can be found in `snapshot.json`. For example:
```python
parameter = {
    "name": "flux",
    "long_name": "flux bias current",
    "instrument": "fluxcurrent",
    "parameter": "FBL_4",
    "units": "A",
}
```
From this, the varying parameter values are extracted by:

```python
def get_varying_parameter_values(
    tuids: List[str], varying_parameter: Dict[str, str]
) -> np.ndarray:
```

The values are retrieved using the `load_snapshot` function.
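The per-experiment lookup could be sketched as follows. This is a hedged sketch, not the actual implementation: the helper names (`get_parameter_value`, `values_from_snapshots`) are hypothetical, and a QCoDeS-style snapshot layout (`snapshot["instruments"][instrument]["parameters"][parameter]["value"]`) is assumed.

```python
from typing import Dict, List

import numpy as np


def get_parameter_value(snapshot: dict, varying_parameter: Dict[str, str]) -> float:
    """Look up one parameter value in a snapshot dictionary.

    Assumes (hypothetically) a QCoDeS-style layout:
    snapshot["instruments"][<instrument>]["parameters"][<parameter>]["value"].
    """
    instrument = varying_parameter["instrument"]
    parameter = varying_parameter["parameter"]
    return snapshot["instruments"][instrument]["parameters"][parameter]["value"]


def values_from_snapshots(
    snapshots: List[dict], varying_parameter: Dict[str, str]
) -> np.ndarray:
    # One value per experiment, in the same order as the snapshots/TUIDs.
    return np.array([get_parameter_value(s, varying_parameter) for s in snapshots])
```

After extraction, `np.repeat` can tile each value to the number of points per experiment so that the array matches the length of the concatenated dataset.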
The corresponding datasets are found and concatenated by:

```python
def concat_dataset(tuids: List[TUID], dim: str = "dim_0") -> xr.Dataset:
```

This outputs the concatenated dataset with a new coordinate that references the original TUIDs.
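The concatenation step can be sketched with plain xarray as follows. This is a sketch under assumptions, not the actual implementation: the helper name `concat_with_tuid_coord` and the coordinate name `concat_tuids` are hypothetical, and the datasets are assumed to share the dimension `dim_0`.

```python
from typing import List

import numpy as np
import xarray as xr


def concat_with_tuid_coord(
    datasets: List[xr.Dataset], tuids: List[str], dim: str = "dim_0"
) -> xr.Dataset:
    """Concatenate per-experiment datasets along `dim` and attach a
    coordinate recording which TUID each point came from."""
    combined = xr.concat(datasets, dim=dim)
    # One TUID label per point, repeated for the length of each source dataset.
    labels = np.concatenate(
        [np.repeat(tuid, ds.sizes[dim]) for tuid, ds in zip(tuids, datasets)]
    )
    return combined.assign_coords({"concat_tuids": (dim, labels)})
```

Keeping the TUID labels as a coordinate (rather than a data variable) means they travel with any later selection or slicing of the combined dataset.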
## Merge checklist

See also the merge request guidelines.

- [ ] Merge request has been reviewed and approved by a project maintainer.
- [ ] Merge request contains a clear description of the proposed changes and the issue it addresses.
- [ ] Merge request made onto appropriate branch (`develop` for most MRs).
- [ ] New code is fully tested.
- [ ] New code is documented and docstrings use numpydoc format.
- [ ] `CHANGELOG.rst` and `AUTHORS.rst` have been updated (when applicable).
- [ ] CI pipelines pass:
  - `pre-commit run --all-files --hook-stage commit` passes (gitlab-ci),
  - test suite passes (gitlab-ci),
  - no degradation in code coverage (codacy),
  - no (serious) new pylint code quality issues introduced (codacy),
  - documentation builds successfully (CI and readthedocs),
- [ ] Windows tests pass (manually triggered by maintainers before merging).

For reference, the issues workflow is described in the contribution guidelines.