View Full Version : Can you send parameters from a Job to a Pentaho MapReduce transform?

06-05-2013, 08:14 PM
Let's say I have a transform that is written as a MapReduce task (as the Mapper in my case). From my Job that calls the transform via Pentaho MapReduce, I'd like to send a few parameters to the child transform so it can perform specific logic based on those parameters. This way, I can reuse the same transform for multiple purposes.

In the Pentaho MapReduce config dialog, I see a User Defined tab, but that appears to be for Hadoop-level parameters, not user-level parameters.

Any input would be appreciated.



06-05-2013, 08:42 PM
I haven't tried this, but have you tried sending PDI parameters to the job? For example, on the execution dialog, enter a parameter and value and see whether it reaches the mapper transformation.

I can take a look at the code tomorrow as well.

06-05-2013, 10:35 PM
I don't think job-level parameters would work in my situation. I have a parallel job (a Start step that has "Launch next entries in parallel"), which calls two Pentaho MapReduce steps. I would like to use the same .ktr file for the two MR jobs, but just send different parameters to each one. Therefore, I would think the parameters would have to be at the step level, possibly within the Mapper and Reducer tabs.