Add API feature to initialise Data DB objects when a dataflow is created
As a .Stat Suite user
In order to simplify extracting data from newly created dataflows (because today an initial data upload is required before data can be extracted for a dataflow)
I want to be able to manually create all Data DB objects required to use a dataflow for extractions.
Scenario 1: Data DB objects are created (if not already present) when a correct request is made (the dataflow and its corresponding DSD already exist)
Given that I have access to the transfer API (after authentication)
When I run a dataflow initialise method specifying a precise dataflow (agency, id, version)
And the dataflow and the corresponding DSD exist in MappingStore DB
Then
- if the Data DB dataflow-related objects have already been initialised, the API returns HTTP code 200.
- if the Data DB dataflow-related objects have not yet been initialised, the corresponding Data DB objects are created and the API returns HTTP code 201. Note: this includes the generation of the actual content constraint (empty, since the fact table is empty; see: dotstatsuite-core-data-access#44).
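
A minimal client sketch of Scenario 1, in Python for illustration only; the endpoint route, payload field names and authentication scheme are assumptions, not the actual transfer API contract:

```python
import requests

# All names below are assumptions for illustration; the real transfer API
# route, payload field names and authentication scheme may differ.
TRANSFER_API = "https://transfer.example.org"
TOKEN = "<bearer token obtained after authentication>"

def init_dataflow(dataspace, agency, dataflow_id, version):
    """Ask the transfer service to initialise the Data DB objects of a dataflow."""
    response = requests.post(
        f"{TRANSFER_API}/{dataspace}/init/dataflow",  # assumed route
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "agencyId": agency,
            "dataflowId": dataflow_id,
            "dataflowVersion": version,
        },
    )
    if response.status_code == 200:
        print("Already initialised earlier; nothing was created.")
    elif response.status_code == 201:
        print("Data DB objects created (including the empty actual content constraint).")
    else:
        print(f"Initialisation failed: {response.text}")
    return response

init_dataflow("design", "OECD", "DF_EXAMPLE", "1.0")
```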
Scenario 2: Error on incorrect request: no related dataflow is found
Given that I have access to the transfer API (after authentication)
When I run a dataflow initialise method specifying a precise dataflow (agency, id, version)
And the dataflow doesn't exist (anymore) in MappingStore DB
Then the API returns an error specifying that the dataflow wasn't found:
"Error: The initialisation could not be done because the dataflow AGENCY:DATAFLOWID(VERSION) doesn't exist in data space XXXXXX. First create the dataflow using the SDMX API or the .Stat Suite DLM, and then re-submit your request."
Notes:
- This method can be called by a user or by the DLM. A related DLM feature will be implemented at a later stage: The DLM will call this method immediately before:
- getting the number of available observations for a dataflow,
- opening the (DE) preview,
- exporting data and
- transferring data between internal spaces (method to be called on source dataspace).
- The implementation should be structured so that it can be reused once the service bus and inter-service communication are in place, so that creating a dataflow via the NSI web services automatically triggers the initialisation in the Data DB by the transfer service (see the sketch after these notes).
- Scenario 1 should also provide a solution for ticket #79 (closed): repairing a database that is in an inconsistent state due to a database failure during the first import.
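
A minimal sketch of the reuse idea from the note above (Python used only for illustration; all names are hypothetical): the initialisation logic lives in a transport-agnostic service, so the HTTP endpoint and a future service-bus consumer can call the same code path:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataflowRef:
    agency: str
    dataflow_id: str
    version: str

class DataflowNotFoundError(Exception):
    """Raised when the dataflow does not exist in MappingStore DB (Scenario 2)."""

class DataflowInitialiser:
    """Creates the Data DB objects of a dataflow if they do not exist yet."""

    def initialise(self, dataspace: str, ref: DataflowRef) -> bool:
        """Return True if objects were created (HTTP 201 at the API level),
        False if they already existed (HTTP 200)."""
        raise NotImplementedError  # actual DB work lives in the transfer service

# The HTTP controller maps the returned boolean to 201/200 and the exception to
# the "dataflow wasn't found" error; a future service-bus consumer would call
# the same initialise() method when it receives a "dataflow created" event.
```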