
Configuration

Call Parameters

You can access advanced functions and settings by manually adjusting the import configurations.

  • The configuration is carried out via XML files, which are read and processed by the Implex.
  • The call is made via the call parameters of the Implex.
  • You must create a separate import configuration file for each source (CSV file, XML file, Cortex-to-Cortex transfer or other).
  • Several record types can be defined as targets within one configuration, so that records of several types can be created from one source.

Import without Data Model

It is possible to import data without first configuring a data model. However, this makes the data harder to use in the Uniplex, as no fields or record types have been defined. Configuring a data model for the Uniplex is therefore recommended. For individual developments without the Uniplex, this may not be necessary. The imported data can then only be found with your own developments or with the help of the CtxBrowser application.

Implex.jar Call Parameters

The Implex is a CLI-based Java application that can be called manually or by scheduled system services. The following call parameters are available.

Important

On Windows systems, the path to the installed Java runtime must be included in the environment variables.
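For example, the directory of the Java runtime can be appended to the PATH variable; the installation path shown here is only an assumption and must be adapted to your system:

set PATH=%PATH%;C:\Program Files\Java\jre\bin

Afterwards, java -version should work in any command prompt.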

Call

java -jar Implex.jar [OPTIONS] ... [DEFINITIONS_FILE] ...

Options

Call            Description
-i, --import    use the following import definition files (XML)
-l, --link      resolve link definition
-s, --simulate  simulate the action; no write operations are executed on the server
-v, --verbose   additionally output all log messages to the console
-h, --help      show this help

Options for Experts

Call                  Description
-e, --emulate         for an import definition, writes the internal representation of the first record read to a file and then terminates the program (only for the development of reader modules)
-d, --delay [NUMBER]  number of records after which a log message is output (default: 200)
--server [IP:PORT]    a CortexEngine can be specified here (e.g.: 127.0.0.1:29000)
--source [FILE]       if the reader module allows it, a source file name can be specified here (depending on the reader module)
--getinfo [FILE]      followed by a JSON file name, provides module-specific information (depending on the reader module)
--start [NUMBER]      starts importing from the specified record
--end [NUMBER]        reads as many records as specified

Any number of definition files can be specified one after the other, and several options can be specified in any order (see the example under Call).
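For example, a simulated and verbose run over two definition files might look like this; the file names are only placeholders:

java -jar Implex.jar -v -s -i persons-import.xml addresses-import.xml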

Note

  • The parameter -s (simulation) has a global effect on all modes passed in the call!
  • The parameters --start and --end only work for CSV and XML imports.
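The expert parameters --start and --end can, for example, be combined with a simulation run to test only part of a large CSV source; the definition file name is a placeholder:

java -jar Implex.jar -s -i persons-import.xml --start 5000 --end 100

Based on the parameter descriptions above, this reads 100 records starting at record 5000 and, because of -s, writes nothing to the server.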

Java Call Parameters

If the call is made manually or via a script, you should pass country-specific parameters to Java. This allows numbers, currencies, date and time formats and similar content to be imported correctly.

The following information is added to the Java call:

-Duser.country=DE -Duser.language=en

The call includes the complete jar file of the Implex and other parameters. The import and source file can also be passed:

java -jar -Duser.country=DE -Duser.language=en Implex.jar -i --server localhost:29000 --source www/Implex/sourceFile XML ./www/implex/Import-Config XML

The example shows the call of the Implex.jar with additional, country-specific Java parameters and the Implex parameters for the source and import configuration file.

General Definitions

The import configuration consists of three main areas: Global, ReaderModule and ImportSection, in which you make the respective settings.

  • Global: defines access to the CortexEngine in which data is to be changed and general working parameters for the import
  • ReaderModule: defines access to the source data
  • ImportSection: defines the field assignment from source to target fields

Changes for Implex version 4.0 and higher

With version 4.0, the identifiers for the parameters in the import configurations change. From this version onwards, configuration and calls are only possible with English identifiers. Old configurations must therefore be adapted!

Configuration
<?xml version="1.0" encoding="UTF-8"?>
<CtxImport>
 <Global>
    <LoginIP>[.....]</LoginIP>              <!-- IP or server name -->
    <LoginPort>[.....]</LoginPort>          <!-- database port; also via parameter for implex available -->
    <LoginUser>[.....]</LoginUser>          <!-- user -->
    <LoginPW>[.....]</LoginPW>              <!-- password -->
    <ImportMode>[.....]</ImportMode>        <!-- import mode; n, u, nu/un for new and/or update-->
    <DeltaList>[.....]</DeltaList>          <!-- optional - delta list ("l") means that all records with no change will be deleted -->
 </Global>

  <ReaderModule type="[.....]">             <!-- e.g.: CSV, XML, ctx or own type by using abstract class -->
    <Filename>[.....]</Filename>            <!-- relative path to implex in bin directory -->
    <Separator>[.....]</Separator>          <!-- field separator -->
    <Enclosure>[.....]</Enclosure>          <!-- field delimiter -->
    <RepSeparator>[.....]</RepSeparator>    <!-- separator for repeated content in one field; e.g.: "email 1; email 2; email 3; ..." -->
    <ColumnMode>[.....]</ColumnMode>        <!-- column mode HEADER, NUMERIC, ABC -->
    <Charset>[.....]</Charset>
  </ReaderModule>

  <ImportSection recordtype="[....]">
    <FilterFunction>getChar('...') != ''</FilterFunction>   <!-- write record only if the filter function returns true -->
    <IId>getChar('...')</IId>               <!-- internal database ID, if exactly this record is to be updated -->
    <HashFilter>[.....]</HashFilter>        <!-- build hash value on all imported values and write it to this field -->

    <Reference>PID</Reference>              <!-- reference to find record if update is necessary -->

    <Field>... = getChar('...')</Field>
    <Field>... = getChar('...')</Field>
    <Field>... = getChar('...')</Field>

    <RepGroup reference="1">                                <!-- a reference is needed to update the group -->
      <Field deltalist="d">... = getChar('...')</Field> <!-- deltalist ("d"): entries that receive no new value will be deleted -->
      <Field>... = getChar('...')</Field>
    </RepGroup>

  </ImportSection>
</CtxImport>

In the Global area, configure the access data for your data inventory and enter the desired import mode. Choose between n (new), u (update), or nu/un (update if possible, otherwise new).

Example for a Global Configuration
<CtxImport>
 <Global>
    <LoginIP>127.0.0.1</LoginIP>
    <LoginPort>29001</LoginPort>
    <LoginUser>admin</LoginUser>
    <LoginPW>adm#13qzy2!</LoginPW>
    <ImportMode>nu</ImportMode>
 </Global>
</CtxImport>

The ReaderModule and ImportSection areas depend on the respective data source.

Note

You can create several records of different record types from one data source (e.g. a person and an address record from one person). To do this, you must create the ImportSection several times.
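A sketch of this pattern with purely hypothetical record types, field names and source columns:

<ImportSection recordtype="PERSON">
    <Reference>PersonID</Reference>
    <Field>PersonID = getChar('Id')</Field>
    <Field>Name = getChar('Name')</Field>
</ImportSection>

<ImportSection recordtype="ADDRESS">
    <Reference>PersonID</Reference>
    <Field>PersonID = getChar('Id')</Field>
    <Field>City = getChar('City')</Field>
</ImportSection>

Both sections read from the same source record; the Reference field determines which existing records are updated.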

Conversion of Content

Source data such as numerical values (integer or decimal values) and date fields must be converted into the internal formats. Corresponding import functions are available for this purpose.

Date Fields

Source systems provide different date formats, so these must be converted into the format used by the CortexEngine before importing. The source format must be known.

Examples of dates from different sources

  • 111012
  • 12-10-11
  • 11-10-12
  • 2012-10-11
  • 2012-10

It is not clear whether the numbers represent the day, the month or the year. Therefore, dates must always be converted with this line: date('yyMMdd',getChar('source date'));

The pattern 'yyMMdd' defines the format of the source. The table shows the possible patterns that can be converted from a source. You can transfer extensive date and time information, also in combination, into the internal format.

Table for Date Information
Letter   Date or Time Component     Presentation         Examples
G        era designator             text                 AD
y        year                       year                 1996; 96
M        month in year              month                July; Jul; 07
w        week in year               number               27
W        week in month              number               2
D        day in year                number               189
d        day in month               number               10
F        day of week in month       number               2
E        day in week                text                 Tuesday; Tue
a        am/pm marker               text                 PM
H        hour in day (0-23)         number               0
k        hour in day (1-24)         number               24
K        hour in am/pm (0-11)       number               0
h        hour in am/pm (1-12)       number               12
m        minute in hour             number               30
s        second in minute           number               55
S        millisecond                number               978
z        time zone                  general time zone    Pacific Standard Time; PST; GMT-08:00
Z        time zone                  RFC 822 time zone    -0800

Patterns

Date and Time Pattern                    Result
yyyy.MM.dd G 'at' HH:mm:ss z             2001.07.04 AD at 12:08:56 PDT
EEE, MMM d, ''yy                         Wed, Jul 4, '01
h:mm a                                   12:08 PM
hh 'o''clock' a, zzzz                    12 o'clock PM, Pacific Daylight Time
K:mm a, z                                0:08 PM, PDT
yyyyy.MMMMM.dd GGG h:mm aaa              02001.July.04 AD 12:08 PM
EEE, d MMM yyyy HH:mm:ss Z               Wed, 4 Jul 2001 12:08:56 -0700
yyMMddHHmmssZ                            010704120856-0700
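Applied in an ImportSection, such a pattern is passed to the date() function; the field name and source column below are only placeholders:

<Field>LastContact = date('yyyy-MM-dd HH:mm', getChar('last_contact'))</Field>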

You can use this online test (external link) to check.

For developers

The date conversion functionality is based on the SimpleDateFormat class implemented in Java.

Values with Validity Date

  • A value in a record can have a validity date and remains valid until a new value with a new date is entered.
  • The old value remains visible, even in lists and pivot lists.
  • A selection is only possible if the field was created during field configuration with the option with field dictionary for history information.
  • To import values with a validity date, the corresponding import line is added.
Example of a data source with values over time
AcctNumber;Date;Value
DE123456789;01.01.2018;150
DE123456789;02.01.2018;200
DE123456789;03.01.2018;95
DE123456789;04.01.2018;80
DE123456789;05.01.2018;110
ImportSection for the example of a data source with values over time
<ImportSection recordtype="ACCT">
    <Reference>AccNr</Reference>
    <Field>AccNr=getChar('AcctNumber')</Field>
    <Field>Value[time(date('dd.MM.yyyy',getChar('Date')))]=getChar('Value')</Field>
</ImportSection>

Update mode for data import

To save values over time, it must be possible to update the records. This can only be done if the import mode is set to u (update) or nu (new and update).

Import without Data Model

When importing data without a data model for the Uniplex, you must use additional attributes for each field. These attributes define the field type in the Uniplex or in user-defined applications and the basic field type for data management. Both attributes are required.

<Field fieldType="[.....]" baseType="[.....]">

Example of an import without a data model
<Field fieldType="C" baseType="0">[.....] = [.....]</Field>
<Field fieldType="M" baseType="2">[.....] = [.....]</Field>

Values for Field Types

Table for valid combinations in the Uniplex data model

Field type   Meaning                                                     Basic field type   Meaning
C            characters                                                  0                  fields with field dictionary
N            positive integers                                           0                  fields with field dictionary
F            floating point numbers (float/double)                       0                  fields with field dictionary
D            date                                                        0                  fields with field dictionary
T            timestamp (mandatory conversion with the time() function)   0                  fields with field dictionary
I            internal reference                                          0                  fields with field dictionary
J            JSON field                                                  1                  data container / JSON fields
B            file                                                        1                  data container / JSON fields
M            multiline text field                                        2                  fields without field dictionary (multiline, short binary data or similar)
--           --                                                          3                  fields with field dictionary, which also includes the time slice information

Use permitted combinations only!

The permitted combinations must be observed, as some field types are only compatible with certain basic field types.
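As an illustration with placeholder field names and source columns, a date field and a timestamp field from the table above could be declared like this; the time() conversion, which is mandatory for field type T, is sketched here in the same way as in the validity date example above:

<Field fieldType="D" baseType="0">Birthday = date('dd.MM.yyyy', getChar('Birthday'))</Field>
<Field fieldType="T" baseType="0">Created = time(date('dd.MM.yyyy HH:mm', getChar('Created')))</Field>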