Hey,

I have a question. I am trying to build an ETL job that extracts and loads my data using a template. For this I use a job with two steps: in the first, I get my SQL queries and copy them to the result rows.
In the next step, I take the parameters, pass them to the sub-transformation (or job), and load my data with the Table Output step.


So, I have a job with two steps:
1. The first step gets my queries and copies them to the result rows.
2. The second step executes once for every input row and passes the result fields as parameters, so I can use them in the sub-transformation.
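To illustrate what I mean, here is a minimal Python sketch of the behavior I am after (the query list and the load function are just placeholders, not my actual PDI setup):

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder queries standing in for the rows produced by step 1.
queries = [
    "SELECT * FROM table_a",
    "SELECT * FROM table_b",
    "SELECT * FROM table_c",
]

def extract_and_load(query):
    # Stand-in for the sub-transformation: extract with the query,
    # then load into the target table. Here it just echoes the query.
    return f"loaded: {query}"

# Desired behavior: one worker per input row, all running at once,
# instead of processing the queries one after another.
with ThreadPoolExecutor(max_workers=len(queries)) as pool:
    results = list(pool.map(extract_and_load, queries))
```

So every input row should kick off its own extract-and-load run concurrently, rather than waiting for the previous one to finish.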

Now to my question:

I want to execute the transformation in parallel for every input row, so that the data is extracted and loaded in parallel using the template.

Is it possible to do that?

I tried enabling the option on the first step to launch the next entry in parallel, but this doesn't work. I also tried the Transformation Executor step, with the same result.

The tables are loaded in sequence, not in parallel.

Does anyone have an idea how I could achieve that?

Thanks in advance.