Hitachi Vantara Pentaho Community Forums
Results 1 to 4 of 4

Thread: PDI - Java API How to pass named parameters and log info.

  1. #1
    Join Date
    Jul 2007
    Posts
    7

    Default PDI - Java API How to pass named parameters and log info.

    Hi:
    I am using PDI 4.0.
    The Java API gave an error, so I switched to the PDI 3.2 version, which looks complete.
    Calling the transformation (XML) via the Java API works, but I need to pass parameters to the Kettle transformation, and that part is not working. Can you please tell me the syntax?

    In a shell script, the passing parameter works as below.

    sh -vx pan.sh -file:"$trans" -param:infilename=$infilename -param:outfilename='notused' -level:Minimal >$logfilename


    In the Java API, I tried the same syntax, as below, but it is not working.

    String[] inparms = {
        "-param:asofdate='2010-08-05' -log='c:\\temp\\log\\test.log' -level:Minimal"
    };
    runTransformation.runTransformationx("C:\\pdi\\transform\\test.xml", inparms);

    Can you please comment on this.

    Thank you.
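    (One likely cause, sketched under the assumption that runTransformationx forwards the array to Pan-style argument parsing: in Java there is no shell to split the string, so each option must be its own array element. The wrapper name runTransformationx is taken from the snippet above and is not a standard PDI API.)

    ```java
    // Each Pan-style option as a separate array element, mirroring how
    // the shell would split them on the pan.sh command line.
    String[] inparms = {
        "-param:asofdate=2010-08-05",
        "-log=c:\\temp\\log\\test.log",
        "-level:Minimal"
    };
    // runTransformation.runTransformationx("C:\\pdi\\transform\\test.xml", inparms);
    ```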



  2. #2
    Join Date
    Jul 2007
    Posts
    7

    Default

    Hi:

    Based on a search of this forum, I found the steps below. They work on the PDI-4.0.1-CE build.

    KettleEnvironment.init();
    EnvUtil.environmentInit();
    TransMeta transMeta = new TransMeta(ktrname);
    Trans trans = new Trans(transMeta);

    trans.setParameterValue(key1, value1);
    trans.setParameterValue(key2, value2);

    trans.execute(null); // You can pass arguments instead of null.
    trans.waitUntilFinished();
    if (trans.getErrors() > 0) {
        throw new RuntimeException("There were errors during transformation execution.");
    }


    I don't know what sequence of steps is required to do the same thing for jobs (.kjb) from the Java API with this PDI-4.0.1-CE build. The passed-in values do not show up in the job's write-log transformation as ${fld1} ${fld2}; the job always takes the default value, if any. Any comments would help.


    val1 = "test";
    val2 = "c:\\temp\\test.txt";
    Repository rep = null;

    LogWriter logWriter = LogWriter.getInstance();
    KettleEnvironment.init();

    JobMeta jobMeta = new JobMeta(kjbname, rep);
    Job job = new Job(rep, jobMeta);
    job.shareVariablesWith(jobMeta);

    job.setParameterValue(key1, val1);
    job.setParameterValue(key2, val2);

    jobMeta.setInternalKettleVariables(job);
    jobMeta.activateParameters();
    job.start();
    job.waitUntilFinished();
    if (job.getErrors() > 0) {
        throw new RuntimeException("There were errors during job execution.");
    }
    job = null;
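    (When a job keeps taking defaults, a first check is whether the names passed in actually match the parameters the .kjb defines. A hedged sketch using the NamedParams accessors, with jobMeta and job as in the snippet above; the enclosing method must declare throws for the checked UnknownParamException:)

    ```java
    // List every named parameter the job defines, with its default and the
    // value currently set on the Job; mismatched names show up immediately.
    for (String p : jobMeta.listParameters()) {
        System.out.println(p
            + " default=" + jobMeta.getParameterDefault(p)
            + " value=" + job.getParameterValue(p));
    }
    ```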




  3. #3
    Join Date
    Sep 2010
    Posts
    135

    Default

    Could anybody solve this? I have the same problem when executing jobs from Java; it always gets the default values of the parameters...

  4. #4
    Join Date
    Jul 2011
    Posts
    1

    Default

    Hello everybody,
    this works:

    ....
    RepositoryDirectoryInterface directory = rep.loadRepositoryDirectoryTree();
    String transformationDescr = "Job 1";
    JobMeta jobMeta = rep.loadJob(transformationDescr, directory, null, null);
    Job job = new Job(rep, jobMeta);
    // Set parameters on the JobMeta before starting the Job
    jobMeta.setParameterValue("fileName", "README.txt");
    job.start();
    job.waitUntilFinished();
    ...

    Regards
    Elia


Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.