Data import is a feature that allows importing entity data from a well-defined ZIP file. This ZIP file can be produced with the Export Feature, by exporting data from another OutSystems environment (or from the same environment when restoring data), or by an ETL process that fills the data files (*.CSV) of the ZIP file.
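As an illustration only, and assuming one data file and one description file per entity (the actual file names depend on your export), the contents of such a ZIP file might look like this:

    Export.zip
        Customer.dmm    (entity description)
        Customer.csv    (entity data; the first line is the header)
        OrderItem.dmm
        OrderItem.csv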
If you are importing data from a non-OutSystems source (for example, you exported only the structure and then filled the "only headers" CSVs with data from a non-OutSystems app using another ETL tool), remember:
Text fields must be wrapped with the escape sequence N' (an uppercase letter N followed by a single quote ') before the value, and closed with another '; for example, to load the car brand Range Rover into a string field, you would put N'Range Rover' in the corresponding field position in the CSV;
The CSV file structure must match the corresponding *.dmm file description; you are fine as long as you keep the same structure, in column names and order, as the "Export Only Structure" feature gave you. Remember that the first line of each CSV is the header;
All date fields must use the format yyyy-MM-ddTHH:mm:ss;
If you are not using Data Append Mode (that is, you want to keep the same record IDs in the destination), all records in the import files must be ordered by their ID field. See the sketch after this list.
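The following is a minimal Python sketch of how an external ETL step could prepare a compliant CSV, applying the rules above: N'...' escaping for text fields, the yyyy-MM-ddTHH:mm:ss date format, a header line, and records ordered by ID. The file name, column names, and sample records are hypothetical.

    import csv
    from datetime import datetime

    def text_value(value):
        # Wrap text fields with N'...' as the import format requires
        return "N'" + value + "'"

    def date_value(value):
        # Dates must use the yyyy-MM-ddTHH:mm:ss format
        return value.strftime("%Y-%m-%dT%H:%M:%S")

    # Hypothetical records: (Id, Name, CreatedOn)
    records = [
        (2, text_value("Range Rover"), date_value(datetime(2023, 5, 1, 10, 30, 0))),
        (1, text_value("Land Rover"), date_value(datetime(2023, 4, 15, 9, 0, 0))),
    ]

    with open("CarBrand.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Id", "Name", "CreatedOn"])      # first line: the header
        for row in sorted(records, key=lambda r: r[0]):   # ordered by Id
            writer.writerow(row)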
If your environment has multiple front-ends, the whole import process has to run on the same front-end. In this case, you have to configure DMM to use a Deployment Zone. You can find more information about this configuration in this link.
The import flow is somewhat different from the other executions, as it is not separated into a configuration phase and an execution phase.
The first step of the import process is selecting the database connection that points to the destination environment. Note that the Runtime Connection is always available and allows importing data into the environment where Infosistema DMM is installed. In this scenario, the Data Append Mode flag will be required.
After picking the destination database for the import, you will need to select the source of the ZIP file:
Upload a ZIP file through Browser Upload: use this option to upload the ZIP file directly through your browser.
Import ZIP file From SFTP Path: use this option to specify the SFTP connection and ZIP file name.
Import ZIP file From Mapped Folder: use this option to specify the full ZIP file path.
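For illustration only, and purely as hypothetical examples of the last two options: an SFTP import references the configured SFTP connection plus a file name such as export_2024.zip, while a mapped folder import expects the full path to the file, for example \\fileserver\dmm\export_2024.zip on Windows or /mnt/dmm/export_2024.zip on Linux.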
Infosistema DMM will open the ZIP file and detect which entities will be imported.
In the final step of the configuration phase, you can review all entities and their respective fields to better understand what is being imported. There are three import flags that appear pre-filled when needed, but you can change them.
The first one, "Generate SQL Commands Script," will save into the configured log folder all for SQL commands executed on the destination database.
The second one, "Execute Immediately," will bypass the usage of the parallelization engine and migrate data and create a single sequential execution. It can have better performance in scenarios with many tables but where those tables never have more than five thousand records.
The last one, "Data Append," will instruct Infosistema DMM to always insert records from the source database into the destination. This is mandatory when using Runtime Connection or a connection that does not have Alter Table privileges in the destination environment.
When expanding an entity, you will be able to see if it was flagged with the user mapped table feature.
Depending on your subscription version, a Validation & Subscription Check screen may appear here.
If it does, you can find more information about the process at this link.
Inside the OutSystems Component, there is a single screen that handles the execution.
On this screen, clicking "Start Importing" shows a confirmation dialog; after you confirm the action, the import process starts. Under the hood, DMM creates an execution plan, extracts the records from the ZIP file, and writes them to the destination database.
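As a rough conceptual sketch only (not DMM's actual code), the flow resembles reading each CSV entry from the ZIP and handing its rows to a writer in batches; the file layout, entity handling, and the write_batch callback are assumptions made for illustration.

    import csv
    import io
    import zipfile

    def import_zip(zip_path, write_batch, batch_size=1000):
        # Read every *.csv entry in the ZIP and pass its rows to write_batch in chunks.
        with zipfile.ZipFile(zip_path) as archive:
            for name in archive.namelist():
                if not name.lower().endswith(".csv"):
                    continue  # skip *.dmm description files and anything else
                with archive.open(name) as raw:
                    reader = csv.reader(io.TextIOWrapper(raw, encoding="utf-8"))
                    header = next(reader)  # the first line is the header
                    batch = []
                    for row in reader:
                        batch.append(row)
                        if len(batch) >= batch_size:
                            write_batch(name, header, batch)
                            batch = []
                    if batch:
                        write_batch(name, header, batch)

    # Example usage with a stand-in writer that just reports row counts
    import_zip("export.zip", lambda entity, header, rows: print(entity, len(rows)))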