
A QUIPU Dataflow can be composed visually on the Diagram. A Diagram shows Models (circle symbol), Data Containers (database or file symbols) or Modules (rounded square symbols). The Toolbox in the panel on the right shows all available Modules, whereas the Explorer on the left gives an overview of all configured components (Models, Modules, Data Containers) that exist on all Diagrams.
Flexibility by design
In QUIPU you define flows to interact with data models. Each interaction is managed via a variety of modules. The process usually starts with importing model metadata.
Modules are the flexible components (represented by the blue squares on the diagram) that each define and enable a specific function. All modules are fully configurable.
Import/Export modules
JDBC model generator
The JDBC Model Generator Module uses a JDBC connector to retrieve model metadata and store it in the QUIPU repository as a Data Model of type 'Source'. The advantage of using this module to retrieve the metadata is the abundant availability of JDBC drivers for all sorts of databases, and even structured files. It thus provides a very powerful way to bring source models into QUIPU.
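As an illustration of the kind of metadata a JDBC driver exposes, the sketch below queries the standard INFORMATION_SCHEMA views through the jaydebeapi Python package. The driver class, connection URL, credentials and jar path are placeholders, and this is not QUIPU's internal implementation.

```python
# Minimal sketch of metadata retrieval over JDBC (not QUIPU's internal code).
# The driver class, URL, credentials and jar path below are placeholders.
import jaydebeapi

conn = jaydebeapi.connect(
    "com.microsoft.sqlserver.jdbc.SQLServerDriver",          # driver class (assumption)
    "jdbc:sqlserver://localhost:1433;databaseName=SalesDB",  # JDBC URL (placeholder)
    ["quipu_user", "secret"],                                 # credentials (placeholder)
    "mssql-jdbc.jar",                                         # driver jar (placeholder)
)

# Standard SQL metadata views describe tables and columns, which is the
# kind of information a source data model is built from.
cur = conn.cursor()
cur.execute(
    "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE "
    "FROM INFORMATION_SCHEMA.COLUMNS ORDER BY TABLE_NAME, ORDINAL_POSITION"
)
for table, column, data_type, nullable in cur.fetchall():
    print(f"{table}.{column}: {data_type} (nullable={nullable})")

cur.close()
conn.close()
```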
Model delta detector
Compares the last two versions of a model and reports the differences.
The module reports deltas for entities, attributes, constraints and relations.
For each model element, the following deltas are reported.
Which model elements are added.
Which model elements are removed.
Which model elements have properties that have changed.
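A simplified sketch of such a comparison, assuming each model version is represented as a plain dictionary of element properties (the QUIPU repository model itself is richer than this):

```python
# Simplified sketch of a delta report between two model versions
# (illustrative only; not QUIPU's repository model).
def report_deltas(previous, current):
    """Each argument maps element name -> dict of properties."""
    added   = sorted(set(current) - set(previous))
    removed = sorted(set(previous) - set(current))
    changed = sorted(
        name for name in set(previous) & set(current)
        if previous[name] != current[name]
    )
    return {"added": added, "removed": removed, "changed": changed}

# Hypothetical attribute metadata for two versions of an entity.
v1 = {"customer_id": {"type": "int"}, "name": {"type": "varchar(50)"}}
v2 = {"customer_id": {"type": "int"}, "name": {"type": "varchar(100)"},
      "email": {"type": "varchar(255)"}}

print(report_deltas(v1, v2))
# {'added': ['email'], 'removed': [], 'changed': ['name']}
```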
Code generator
Generates code based on a model and a template. This code can be written to a user-specified data container (database or file).
QUIPU is able to generate code directly into a target database. For SQL Server targets, QUIPU offers a standard solution; to let QUIPU generate code into tables in other target databases, several queries can be specified.
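As a rough illustration of generating code from a model and a template, the sketch below fills a simple DDL template from invented entity metadata using Python's string.Template; QUIPU's own template mechanism and repository structures are not shown here.

```python
# Illustrative sketch of template-based code generation (not QUIPU's template engine).
from string import Template

# Hypothetical entity metadata taken from a data model.
entity = {
    "name": "Customer",
    "attributes": [("customer_id", "int NOT NULL"), ("name", "varchar(100)")],
}

ddl_template = Template("CREATE TABLE $table (\n$columns\n);")

columns = ",\n".join(f"    {col} {dtype}" for col, dtype in entity["attributes"])
code = ddl_template.substitute(table=entity["name"], columns=columns)

# The generated code could then be written to a file or executed
# against a target database through a database connector.
print(code)
```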
OData model generator
The OData model generator module reads a single schema from an OData $metadata XML file and generates a data model of type 'Source'. Currently, relations are not supported.
The OData model generator module reads and stores entity metadata, attribute metadata and constraint metadata.
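The sketch below shows the kind of information an OData V4 $metadata document contains and how entity, attribute and key metadata can be read from it; the schema content is invented and this is not QUIPU's implementation.

```python
# Minimal sketch of reading entity and attribute metadata from an OData V4
# $metadata document (illustrative; not QUIPU's implementation).
import xml.etree.ElementTree as ET

EDM = "{http://docs.oasis-open.org/odata/ns/edm}"

METADATA_XML = """<?xml version="1.0" encoding="utf-8"?>
<edmx:Edmx xmlns:edmx="http://docs.oasis-open.org/odata/ns/edmx" Version="4.0">
  <edmx:DataServices>
    <Schema xmlns="http://docs.oasis-open.org/odata/ns/edm" Namespace="Sales">
      <EntityType Name="Customer">
        <Key><PropertyRef Name="Id"/></Key>
        <Property Name="Id" Type="Edm.Int32" Nullable="false"/>
        <Property Name="Name" Type="Edm.String"/>
      </EntityType>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>"""

root = ET.fromstring(METADATA_XML)
for entity in root.iter(f"{EDM}EntityType"):
    keys = [ref.get("Name") for ref in entity.iter(f"{EDM}PropertyRef")]
    print(f"Entity {entity.get('Name')} (key: {', '.join(keys)})")
    for prop in entity.findall(f"{EDM}Property"):
        print(f"  {prop.get('Name')}: {prop.get('Type')}")
```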
Transformation modules
HDA model generator
Generates a Data Model of type HDA (Historic Data Archive) based on a Data Model of type Staging.
Staging model generator
Generates a Data Model of type Staging based on a Data Model of type Source.
Transformation generator
The transformation generator module transforms an input model by filtering, adding, converting or renaming objects.
A transformation module contains three optional parts:
1 - Pre filters. These filters are used to filter out source model objects. All objects can be filtered using include and exclude logic.
2 - Action groups. Action groups contain actions and possibly a filter.
3 - Post filters. These filters are used to filter out (possibly created) target model objects.
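A much-simplified sketch of this pre-filter, action, post-filter pipeline, using invented entity names and only an include/exclude filter and a rename action:

```python
# Much-simplified sketch of the pre-filter -> actions -> post-filter idea
# (illustrative only; the real module is configured declaratively in QUIPU).
import fnmatch

def apply_filters(names, include="*", exclude=None):
    kept = [n for n in names if fnmatch.fnmatch(n, include)]
    if exclude:
        kept = [n for n in kept if not fnmatch.fnmatch(n, exclude)]
    return kept

source_entities = ["stg_customer", "stg_order", "stg_audit_log"]

# 1 - Pre filter: keep staging entities, drop audit tables.
selected = apply_filters(source_entities, include="stg_*", exclude="*audit*")

# 2 - Action group: rename objects (here: change the prefix).
renamed = [name.replace("stg_", "hub_", 1) for name in selected]

# 3 - Post filter: drop any unwanted target objects the actions produced.
target_entities = apply_filters(renamed, exclude="hub_temp*")

print(target_entities)   # ['hub_customer', 'hub_order']
```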
Data set model generator
Generates data set entities that are specified by QUIPU users.
The entities in the resulting model represent data sets that are constructed by joining source entities based on child-parent relations defined in the source model.
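A rough sketch of how child-parent relations can be turned into a joined data set, assuming relations are given as simple (child, child key, parent, parent key) tuples; the entity names are invented and the actual module works on the models in the QUIPU repository.

```python
# Rough sketch of deriving a joined data set from child-parent relations
# (illustrative; the actual module works on models in the QUIPU repository).

# Hypothetical relations: (child entity, child key, parent entity, parent key).
relations = [
    ("OrderLine",  "order_id",    "SalesOrder", "order_id"),
    ("SalesOrder", "customer_id", "Customer",   "customer_id"),
]

def data_set_query(root_entity, relations):
    """Builds a SELECT that joins the root entity to its parents."""
    sql = f"SELECT *\nFROM {root_entity}"
    for child, child_key, parent, parent_key in relations:
        sql += (f"\nJOIN {parent}"
                f" ON {child}.{child_key} = {parent}.{parent_key}")
    return sql

print(data_set_query("OrderLine", relations))
```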
Implementation
Data containers
A Data Container is a physical instance that can either be input for Import Modules or output for Export Modules. Data containers can be databases or files.
and more!
Other distinguishing features of the QUIPU framework
Visual data flow design
The canvas displays a graphical visualisation of a user diagram, where the various modules are placed, configured and linked to one another to define the desired data flow. The toolbox shows all configurations available in the QUIPU4 repository, and both the explorer and the toolbox support drag-and-drop functionality. In the explorer overview, all configured objects such as data models and modules are categorised and presented, including versioning information.
The diagram overview is the starting point to create a new data flow and also gives an overview of all existing data flow diagrams per selected environment.
Visual data lineage
Having insight into data provenance, also called data lineage, is a must-have for modern companies. Lineage is key to establishing the trustworthiness, authenticity and credibility of data, especially when data is processed in a flow.
QUIPU makes data lineage visible at table level as well as at attribute level. Pick any table from a model as a starting point and expand the lineage both up- and downstream.
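To make the up- and downstream expansion concrete, the sketch below walks a small, invented lineage graph at table level; attribute-level lineage works the same way on a finer-grained graph.

```python
# Small sketch of expanding lineage up- and downstream over a table-level
# graph (illustrative; QUIPU presents this visually on the diagram).

# Hypothetical lineage edges: source table -> target table.
edges = [
    ("src.customer", "stg.customer"),
    ("stg.customer", "hda.customer"),
    ("hda.customer", "dm.dim_customer"),
]

def expand(table, edges, downstream=True):
    """Returns every table reachable from `table` in the chosen direction."""
    step = {}
    for src, tgt in edges:
        key, value = (src, tgt) if downstream else (tgt, src)
        step.setdefault(key, []).append(value)
    seen, todo = set(), [table]
    while todo:
        current = todo.pop()
        for nxt in step.get(current, []):
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return sorted(seen)

print(expand("stg.customer", edges, downstream=True))   # ['dm.dim_customer', 'hda.customer']
print(expand("stg.customer", edges, downstream=False))  # ['src.customer']
```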
Multiple environments
In many organisations, code lives not only in the production environment but also in one or more others, such as development, test or (user) acceptance. For data solutions, environments usually differ in their connections to source systems and other databases, and in the credentials used. Moreover, the sources themselves can also differ in structure per environment, impacting the data logistics flow further down the line.
In QUIPU you can now specify one or more environments, where each environment has its own colouring in the application so it can be clearly identified.
Our powerful global properties can now be defined per environment. Copying a data flow to another environment automatically promotes the full data flow with all its objects and applies the corresponding global properties; any generated code matches the environment-specific settings.
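A minimal sketch of the idea behind environment-specific global properties, assuming they are simple key-value pairs substituted into a connection-string template; the property names and values are invented.

```python
# Minimal sketch of environment-specific global properties driving generated
# output (property names and values are invented for illustration).
from string import Template

global_properties = {
    "development": {"server": "dev-sql01",  "database": "SalesDB_DEV"},
    "test":        {"server": "test-sql01", "database": "SalesDB_TST"},
    "production":  {"server": "prd-sql01",  "database": "SalesDB"},
}

connection_template = Template("Server=$server;Database=$database;")

for environment, properties in global_properties.items():
    print(f"{environment}: {connection_template.substitute(properties)}")
```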
JSON editor
QUIPU uses JSON for the configuration of its modules. JSON is the de-facto internet standard for exchanging data between web services. The Editor tabs in several Property panels in QUIPU use a JSON editor.
Scripting
A powerful feature of QUIPU is the option to execute most of its functions directly via the REST API. This opens up the possibility to script QUIPU actions and extend QUIPU's capabilities beyond what we offer out of the box. A very useful scenario is one where you build your solution by configuring modules and stringing them together using the powerful graphical user interface of QUIPU, test it, and then create a script that automatically executes all the steps specified on the diagram for future runs whenever your environment changes. This automates as much manual work as possible, because changes will happen.
REST API
QUIPU functions are exposed as web services that can be accessed via a REST API; QUIPU uses the standard web REST mechanism to expose its services. The QUIPU front end uses this API to offer you a user-friendly application for working with the QUIPU system, but it is this same mechanism that allows scripts to address QUIPU services directly. We will make a detailed description of the API available in the near future. For now, use the built-in script generator, which creates a script for any diagram you have drawn, as a quick start for your own scripts.
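To give a flavour of such a script, the sketch below runs a sequence of configured modules over HTTP with the requests package. The base URL, endpoint paths and module identifiers are invented placeholders rather than the documented QUIPU API; the built-in script generator remains the authoritative starting point.

```python
# Flavour of a scripted run over the REST API. The base URL, endpoint paths
# and module identifiers below are invented placeholders, not the documented
# QUIPU API; use the built-in script generator for real scripts.
import requests

BASE_URL = "https://quipu.example.com/api"        # placeholder
MODULES = ["jdbc-model-generator", "staging-model-generator", "code-generator"]

session = requests.Session()
session.auth = ("quipu_user", "secret")           # placeholder credentials

for module_id in MODULES:
    # Hypothetical endpoint that executes a configured module.
    response = session.post(f"{BASE_URL}/modules/{module_id}/execute", timeout=300)
    response.raise_for_status()
    print(f"{module_id}: {response.status_code}")
```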