Command-Line Client Parameters
Integrator projects can be run from the command line using the CLI tool etlclient. For a list of valid parameters, start etlclient with the -h
option. For more information, see Integrator Command-Line Client.
Valid parameters are described below.
Option | Alternative Option | Parameter | Description
-a | --add | <projectfile> | Add the specified project to the server. The path to the project file can be given as an absolute or a relative path.
-c | --context | <variables> | Context variables to use for the execution of jobs, loads, and sources, and for the flow graph.
-cf | --contextfile | <filename> | Set the name of the XML file that contains the context variable values.
-co | --componentoutput | <locator> | Get the component output structure of an extract or transform. The locator can have the form sampleBiker.sources.<extract/transform name>, sampleBiker.extracts.<extractName>, or sampleBiker.transforms.<transformName>.
-cp | --copy | <sourceLocator targetLocator> | Copy components with or without their dependencies.
-d | --data | <sourcename> | Get the data from the specified source (extract or transform). The number of lines and the start line can be defined with the -n option. Valid only in combination with the -p option.
-df | --dataformat | <arg> | Set the transfer format of the data from a source: csv or xml. Valid only in combination with the -d option.
-dr | --drillthrough | <db.cube> | Get detailed drillthrough data of a datastore.
-dri | --drillthroughinfo | | Get storage information for all drillthrough operations.
-e | --execdisplay | <fromid toid> | Get the history of executions in an interval of execution IDs. Filter on project, job, and load with the -p, -j, and -l options.
-ed | --execdetails | <executionid> | Display the runtime details of an execution.
-eg | --execgraph | <executionid> | Display the runtime details of an execution as an SVG graph.
-eid | --execid | | Get the most recent execution ID.
-el | --execlog | <executionid> | Display the log of an execution.
-er | --execremove | <fromid toid> | Remove executions from the history in an interval of execution IDs. Filter on project, job, and load with the -p, -j, and -l options.
-ev | --execvariables | <executionid> | Display the initial variable settings of an execution.
-f | --format | <arg> | Set the output format of a tree source. Valid only in combination with the -d option.
-g | --get | <projectname> | Get the project configuration from the server.
-gr | --graph | <locator> | Get the flow graph of a component as an SVG definition. Set graph properties with the -c option. Write the output to a file with the -o option.
-h | --help | | Show the help message.
-i | --info | | Get information about the server status and the running jobs.
-j | --job | <jobname> | Execute the specified job of the project. Multiple jobs are separated with '.'. Valid only in combination with the -p option.
-k | --kill | <executionid> | Terminate a running execution by its ID, or "ALL" for all running or queued executions. Use the -i option to get information about running executions.
-l | --load | <loadname> | Execute the specified load of the project. Multiple loads are separated with '.'. Valid only in combination with the -p option.
-ll | --loglevel | <level> | Set the log level of this client. Possible values: OFF, FATAL, ERROR, WARN, INFO, DEBUG, ALL. Valid only with the -el option. Note: the logs are displayed from the history, so if, for example, the debug level was not set when an execution was run, "-ll DEBUG" will not show any debug logs.
-ls | --list | | List all projects on the server if no argument is given, or the details of the project given with the -p option.
-m | --migrate | <projectfile> | Migrate the project configuration to the current definition standard.
-n | --sample | <format> | Limit the data output to a sample of n lines. Valid only in combination with the -d, -dr, and -e options. A second parameter can be given for the start line; it is valid only with -d.
-o | --output | <filename> | Write the output to a file.
-p | --project | <projectname> | Specify the Integrator project. If no further option is set, the default job of the project is executed. The project has to have been added to the server with the -a option beforehand.
-pd | --documentation | <projectname> | Get the documentation of a project. Add the flow graphs of jobs with the -j option. Write the output to the file system by giving the target directory with the -o option.
-pg | --prototype | <arg> | Generate a prototype project.
-pw | --password | <password> | Give the password. Should be used in combination with -u.
-r | --remove | <projectname> | Remove the specified project from the server.
-rn | --rename | <locator newName> | Rename a component and update its dependencies.
-s | --server | <name[:port]> | Use this Integrator server: name[:port].
-sp | --serverprofile | <profilename> | Use an existing server profile.
-spc | --createserverprofile | <profilename> | Create or update a server profile.
-t | --test | <locator> | Validate and test a project or a project component at runtime without actually writing data. A component is specified as a locator of the form <projectname>.<componenttype>.<componentname>, e.g. sampleBiker.jobs.Cubedata.
-u | --user | <user> | Give the username. Should be used in combination with -pw.
-up | --update | <projectfile> | Update the specified project on the server. The path to the project file can be given as an absolute or a relative path.
-v | --validate | <projectfile> | Validate an existing project configuration without executing it.
-w | --wait | <wait> | Wait for the specified job/load of the project to finish. Takes the value true or false. Valid only in combination with the -j/-l options.
-wd | --withDependencies | <withDependencies> | If set to true, a component is copied with all its dependencies. Takes the value true or false. Valid only with the -cp option.
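For illustration, the commands below sketch a typical sequence with the options described above: add a project to the server, run one of its jobs and wait for it to finish, then look up the latest execution ID and display its log. The server address, port, credentials, project file path, and job name are placeholders for this example, not fixed values of the client:
etlclient -s myserver:<port> -u admin -pw admin -a ./sampleBiker.xml
etlclient -s myserver:<port> -u admin -pw admin -p sampleBiker -j Cubedata -w true
etlclient -s myserver:<port> -u admin -pw admin -eid
etlclient -s myserver:<port> -u admin -pw admin -el <executionid> -ll INFO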
Generating project documentation
Notation:
etlclient -pd <projectname> -j <jobname1> <jobname2> .... -o <directory>
Example:
- Create project documentation for project sampleBiker without flow graphs in directory C:/EtlDoku1:
etlclient -pd sampleBiker -o C:/EtlDoku1
- Create project documentation for project sampleBiker with flow graphs of jobs Masterdata and Cubedata:
etlclient -pd sampleBiker -j Masterdata Cubedata -o C:/EtlDoku2
- Create documentation for job Cubedata in project sampleBiker; it contains only the components referenced by this job and the flow graph of job Cubedata:
etlclient -pd sampleBiker.jobs.Cubedata -o C:/EtlDoku3
The documentation can be displayed by opening the file etldocumentation.html in a browser. Note that Internet Explorer 9 does not display the graphical flow graph due to insufficient support for the SVG format.
The following files are generated in the output directory:
etldoc.xml | Contains all project- and component-related information, SVG flow graphs, references, and all component fields (references etldoc.xslt).
etldoc.xslt | The XSLT transformation of etldoc.xml. By adapting this file, the layout and content of the documentation can be modified (references etldoc.css).
etldoc.css | A style sheet (references etldoc_bg.png).
etldoc_bg.png | A background image.
etldocumentation.html | The documentation file to be displayed in the browser. Technically, it is the etldoc.xslt transformation applied to etldoc.xml (requires only the files etldoc.css and etldoc_bg.png).
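As a minimal sketch of that relationship (assuming a standalone XSLT 1.0 processor such as xsltproc is available and the stylesheet is compatible with it; neither is guaranteed by Integrator), the HTML file could be regenerated from the stylesheet and the XML:
xsltproc -o etldocumentation.html etldoc.xslt etldoc.xml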
Progress logs
It is possible to generate additional info messages indicating the progress of a job or a data preview. This is helpful for the analysis of long-running Jedox Integrator jobs with high data volume.
For one or several components of the Jedox Integrator job (extract, transform, or load), a log block size can be specified as a command-line parameter. Each time the component has processed a block, an info message appears.
Notation: -c #<componentName>.logBlockSize=<n>
Example:
-p samples\sampleBiker -j Cubedata -c #sampleBiker.extracts.Orderlines.logBlockSize=500 #sampleBiker.loads.OrdersCube.logBlockSize=2000
This will lead to these additional log messages:
INFO : 1 rows processed from extract Orderlines
INFO : 500 rows processed from extract Orderlines
INFO : 1000 rows processed from extract Orderlines
INFO : 2000 rows processed from load OrdersCube
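Put together as one full invocation (server address, port, and credentials are placeholders for this example), the call above could look like this:
etlclient -s myserver:<port> -u admin -pw admin -p samples\sampleBiker -j Cubedata -c #sampleBiker.extracts.Orderlines.logBlockSize=500 #sampleBiker.loads.OrdersCube.logBlockSize=2000
On a Unix shell, the #... arguments should be quoted so that they are not interpreted as comments.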
Updated September 27, 2022