Hitachi Vantara Pentaho Community Forums

Thread: Setting TransMeta attributes in a Job

  1. #1

    Default Setting TransMeta attributes in a Job

    I execute transformations using a Job, which I run through the Java API. I want to be able to obtain the TransMeta objects from a Job so I can set the database connections the transformations will use. How can I do this?


  2. #2
    Join Date
    Nov 1999


    > set the Database Connections the transformations will use
    I propose using variables or shared objects/shared connections - it's much easier and more stable for the future.
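
    To illustrate the variables approach (a sketch only - the connection and variable names are made up): define the connection once with variables in its fields, then set those variables per environment in kettle.properties or at the job level. In the .ktr (or shared.xml) the connection element then looks roughly like:

    ```xml
    <connection>
      <name>target_db</name>
      <server>${DB_HOST}</server>
      <type>MYSQL</type>
      <access>Native</access>
      <database>${DB_NAME}</database>
      <port>${DB_PORT}</port>
      <username>${DB_USER}</username>
      <password>${DB_PASSWORD}</password>
    </connection>
    ```

    At runtime Kettle substitutes the ${...} variables, so the same transformation can point at different servers without touching the file. Note that the type field stays literal here.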

  3. #3
    Join Date
    Nov 1999


    From an API point of view, a TransMeta object is not visible to the JobMeta object.
    The transformation metadata (TransMeta) is loaded at runtime, right before execution.
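
    If you really need programmatic access anyway, a heavily hedged sketch of one workaround: walk the job's entries, load each transformation's metadata yourself, and inject the connection before running it. The method names below are from the Kettle 3.x API and may differ in your version; this needs the PDI jars on the classpath, and `myDatabaseMeta` is a placeholder for a DatabaseMeta you have built yourself.

    ```java
    // Sketch only (Kettle 3.x-era API, verify against your version):
    // find the transformation job entries and load their TransMeta directly.
    for (int i = 0; i < jobMeta.nrJobEntries(); i++) {
        JobEntryInterface entry = jobMeta.getJobEntry(i).getEntry();
        if (entry instanceof JobEntryTrans) {
            JobEntryTrans jet = (JobEntryTrans) entry;
            TransMeta tm = new TransMeta(jet.getFilename()); // load it ourselves
            tm.addOrReplaceDatabase(myDatabaseMeta);         // inject the connection
            // ...then run tm directly, or save it back before the job executes
        }
    }
    ```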


  4. #4
    Join Date
    Dec 2008


    I also have a requirement to set database connections dynamically across multiple different databases (Oracle and SQL Server). Unfortunately, the DB type is not something that can be set through variables, so I have been attempting to load and modify the .ktr files, first using DOM (on the raw XML) and secondly via the Pentaho DI API. On a small .ktr file with an Excel input and SQL Server output I have had no success with either method and would greatly appreciate some assistance.

    Basically, I opened and parsed the .ktr XML files using DocumentBuilder and then saved them back out without modification, and noticed that the output was NOT the same. Most noticeably, markup like "&#47;" was being converted directly to a forward slash instead of being left as a character reference for the slash. I could not find any option to change this behaviour, so I went on to try the Pentaho DI API.
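
    For what it's worth, that DOM behaviour is by design rather than a Pentaho quirk: the parser resolves numeric character references while building the tree, and the serializer only re-escapes characters that must be escaped (& < >), so a slash comes back literal. A minimal pure-JDK illustration (class name is mine):

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.StringWriter;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;

    public class EntityRoundTrip {
        public static void main(String[] args) throws Exception {
            // A numeric character reference, like Kettle writes into .ktr files
            String xml = "<step><name>a&#47;b</name></step>";

            // DOM resolves the reference to '/' while building the tree...
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));

            // ...and the serializer only escapes characters that must be
            // escaped, so the slash is written back as a plain '/'.
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
            StringWriter out = new StringWriter();
            t.transform(new DOMSource(doc), new StreamResult(out));

            System.out.println(out); // the &#47; comes back as a literal slash
        }
    }
    ```

    So any parse-and-reserialize round trip will normalize those references; the output is semantically equivalent XML, just not byte-identical.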

    Pentaho API:
    I created the following, based on what I saw on the forums, to simply read a .ktr file and then write it back out to another:

    import java.io.DataOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import org.pentaho.di.trans.TransMeta;

    try {
        System.out.println("Opening File");
        TransMeta tm = new TransMeta("C:\\Documents and Settings\\jonathan\\Desktop\\Conversion.Template.0.9\\1.0 CreateStaging\\ImportDDLInfo.ktr");

        System.out.println("Outputting File");
        String xml = tm.getXML();
        DataOutputStream dos = new DataOutputStream(new FileOutputStream(new File("C:\\Documents and Settings\\jonathan\\Desktop\\Conversion.Template.0.9\\1.0 CreateStaging\\ImportDDLInfo2.ktr")));
        dos.write(xml.getBytes());
        dos.close();
    } catch (Exception e) {
        e.printStackTrace();
    }

    But I get the following output:
    Opening File
    Unable to find kettle engine jar file to set build date. (ignored)
    2009/01/19 15:23:26:764 PST [INFO] DefaultFileReplicator - Using "C:\DOCUME~1\jonathan\LOCALS~1\Temp\vfs_cache" as temporary files store.

    Error reading object from XML file
    Unable to load step info from XML step node
    org.pentaho.di.core.exception.KettleStepLoaderException:
    Unable to load class for step/plugin with id [ExcelInput]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.

    Unable to load class for step/plugin with id [ExcelInput]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.

    BUILD SUCCESSFUL (total time: 1 second)

    I checked the plugins directory and ExcelInput is not there, which confuses me because the transformation builds and runs just fine from Spoon. I have also read on the forums that there could be a conflict of jar file versions, but I am not sure how that is possible since there are no other versions of PDI installed. Additionally, I used the full pan.bat classpath settings on execution to ensure I was not missing anything (otherwise I would have gotten some other exceptions). Thank you.
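
    For what it's worth, an "Unable to load class for step/plugin with id [ExcelInput]" error at TransMeta construction time usually means the Kettle engine was never initialized in the embedding program - Spoon and Pan do this for you at startup. A hedged sketch (StepLoader is the 3.x-era class; I believe later PDI versions replace this with KettleEnvironment.init(), so verify against your version; the PDI jars must be on the classpath):

    ```java
    import org.pentaho.di.core.util.EnvUtil;
    import org.pentaho.di.trans.StepLoader;
    import org.pentaho.di.trans.TransMeta;

    // Initialize the engine BEFORE parsing any .ktr, so that step classes
    // such as ExcelInput can be resolved (sketch, 3.x-era API).
    EnvUtil.environmentInit(); // load kettle.properties / environment
    StepLoader.init();         // register the built-in steps
    // In PDI 4.x and later, use KettleEnvironment.init() instead.

    TransMeta tm = new TransMeta("ImportDDLInfo.ktr"); // steps now resolve
    ```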

  5. #5
    Join Date
    May 2006


    Don't hijack old threads... start a new one. And maybe there's a reason the database type cannot be changed on the fly?

    As for your .ktr hack: you're not meant to "hack" them, and if you do, some things break - it's a known issue.



Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.