Hitachi Vantara Pentaho Community Forums

Thread: I'm from China. I need to create a transformation with only Java code

  1. #1
    Join Date
    May 2013
    Posts
    7

    Default I'm from China. I need to create a transformation with only Java code

    First, please ignore my mistakes in English.

    Here is my question: I am not allowed to use the Kettle GUI or the repository; all I can use are the types and methods supplied by the Kettle source code. In the Kettle 3.2.0 stable release there is an example, 'TransBuilder.java', in the 'extra' folder (I'm not sure whether you know it), and my code is based on it, but it does not work, and I have struggled with this for a long time.

    I don't know if that is clear. Please help me find out what is wrong, or if you have any other examples, let me know. I really need your help.
    (Attachment: 360截图20130523093443214.jpg)
    Sorry for pasting the code inline; I don't know how to upload a 'txt' file.

    import java.io.DataOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;

    import org.pentaho.di.core.Const;
    import org.pentaho.di.core.NotePadMeta;
    import org.pentaho.di.core.database.Database;
    import org.pentaho.di.core.database.DatabaseMeta;
    import org.pentaho.di.core.exception.KettleException;
    import org.pentaho.di.core.logging.LogWriter;
    import org.pentaho.di.core.util.EnvUtil;
    import org.pentaho.di.trans.StepLoader;
    import org.pentaho.di.trans.Trans;
    import org.pentaho.di.trans.TransHopMeta;
    import org.pentaho.di.trans.TransMeta;
    import org.pentaho.di.trans.step.StepMeta;
    import org.pentaho.di.trans.step.StepMetaInterface;
    import org.pentaho.di.trans.steps.tableinput.TableInputMeta;
    import org.pentaho.di.trans.steps.tableoutput.TableOutputMeta;

    /**
     * This class builds a transformation dynamically.
     * @author 苏文波
     */
    public class table_to_table
    {
        // Database connection information
        public static final String[] databasesXML = {
            "<connection>"+
            "<name>kettle_bs</name>"+
            "<server>localhost</server>"+
            "<type>MYSQL</type>"+
            "<access>Native</access>"+
            "<database>123</database>"+
            "<port>3306</port>"+
            "<username>root</username>"+
            "<password>suwb</password>"+
            "<servername/>"+
            "<data_tablespace/>"+
            "<index_tablespace/>"+
            "</connection>"
        };

        /**
         * Builds a transformation from the input parameters (such as the table names).
         * @param transformationName the name of the transformation
         * @param sourceDatabaseName  the name of the source database
         * @param sourceTableName     the name of the source table to read from
         * @param sourceFields        the field names to read from the source table
         * @param targetDatabaseName  the name of the target database
         * @param targetTableName     the name of the target table to write to
         * @param targetFields        the field names in the target table (same count as the source fields)
         * @return A new transformation
         * @throws KettleException In the rare case something goes wrong
         */
        public static final TransMeta buildCopyTable(String transformationName,
            String sourceDatabaseName, String sourceTableName,
            String[] sourceFields, String targetDatabaseName,
            String targetTableName, String[] targetFields)
            throws KettleException
        {
            EnvUtil.environmentInit();
            try
            {
                // Create the transformation: a new TransMeta object
                TransMeta transMeta = new TransMeta();
                transMeta.setName(transformationName);

                // Add the database connection defined in the XML above
                // and register the DatabaseMeta object on the TransMeta object
                //for (int i=0;i<databasesXML.length;i++)
                //{
                DatabaseMeta databaseMeta = new DatabaseMeta(databasesXML[0]);
                transMeta.addDatabase(databaseMeta);
                //}

                DatabaseMeta sourceDBInfo = transMeta.findDatabase(sourceDatabaseName);
                DatabaseMeta targetDBInfo = transMeta.findDatabase(targetDatabaseName);

                // Add a note
                String note = "Reads information from table [" + sourceTableName + "] on database [" + sourceDBInfo + "]" + Const.CR;
                note += "After that, it writes the information to table [" + targetTableName + "] on database [" + targetDBInfo + "]";
                NotePadMeta txt = new NotePadMeta(note, 150, 10, -1, -1); // parameters: text, x, y, width, height
                transMeta.addNote(txt);

                // Create the source step: a table input
                String fromstepname = "read from [" + sourceTableName + "]";
                TableInputMeta t_input = new TableInputMeta();
                t_input.setDatabaseMeta(sourceDBInfo); // the database the source table lives in
                String selectSQL = "SELECT "+Const.CR; // Const.CR is a line separator
                for (int i=0;i<sourceFields.length;i++)
                {
                    if (i>0) selectSQL+=", ";
                    else selectSQL+=" ";
                    selectSQL+=sourceFields[i]+Const.CR;
                }
                selectSQL+="FROM "+sourceTableName;
                t_input.setSQL(selectSQL);
                // Sets the SQL used to fetch the data; if the fields are known up front,
                // a literal String selectSQL = "SELECT xxx FROM " + sourceTableName; would also work.

                // Register the source step on the TransMeta object
                StepLoader steploader = StepLoader.getInstance();
                String fromstepid = steploader.getStepPluginID(t_input);
                StepMeta fromstep = new StepMeta(fromstepid, fromstepname, (StepMetaInterface) t_input); // StepMeta holds everything needed to define a step
                fromstep.setLocation(150, 100);
                fromstep.setDraw(true);
                fromstep.setDescription("Reads information from table [" + sourceTableName + "] on database [" + sourceDBInfo + "]");
                transMeta.addStep(fromstep);

                /**
                // Add logic to rename the fields:
                // a SelectValues step that picks the given source columns and sends their data to the given target columns
                SelectValuesMeta v_input = new SelectValuesMeta();
                v_input.allocate(0, 0, sourceFields.length);

                for (int i = 0; i < sourceFields.length; i++)
                {
                    v_input.getMeta()[i].setName(sourceFields[i]);   // source column
                    v_input.getMeta()[i].setRename(targetFields[i]); // target column
                }

                // Register the step on the TransMeta object
                String selstepname = "Rename field names";
                String selstepid = steploader.getStepPluginID(v_input);
                StepMeta selstep = new StepMeta(selstepid, selstepname, (StepMetaInterface) v_input);
                selstep.setLocation(350, 100);
                selstep.setDraw(true);
                selstep.setDescription("Rename field names");
                transMeta.addStep(selstep);

                // Create a hop d_hop to connect the source step to the SelectValues step
                // and register it on the TransMeta object
                TransHopMeta d_hop = new TransHopMeta(fromstep, selstep);
                transMeta.addTransHop(d_hop);
                fromstep = selstep;
                */

                // Create the target step: a table output
                String tostepname = "write to [" + targetTableName + "]";
                TableOutputMeta t_ouput = new TableOutputMeta();
                t_ouput.setDatabaseMeta(targetDBInfo);  // the database the target table lives in
                t_ouput.setTablename(targetTableName);  // target table name; the table is created if it does not exist yet
                t_ouput.setCommitSize(200);
                t_ouput.setTruncateTable(true);
                // Register the target step on the TransMeta object
                String tostepid = steploader.getStepPluginID(t_ouput);
                StepMeta tostep = new StepMeta(tostepid, tostepname, (StepMetaInterface) t_ouput);
                tostep.setLocation(550, 100);
                tostep.setDraw(true);
                tostep.setDescription("Write information to table [" + targetTableName + "] on database [" + targetDBInfo + "]");
                transMeta.addStep(tostep);

                // Create a hop to connect the source step to the target step
                TransHopMeta c_hop = new TransHopMeta(fromstep, tostep);
                transMeta.addTransHop(c_hop);

                // If we're still here, return the transformation
                return transMeta;
            }
            catch (Exception e)
            {
                throw new KettleException("Sorry, something went wrong", e);
            }
        }

        /**
         * 1  Create a new transformation
         * 2  Save the transformation as an XML file
         * 3  Generate the SQL for the target table
         * 4  Execute the transformation
         * 5  Drop the target table so this program can be run repeatedly
         * @param args
         */
        public static void main(String[] args) throws Exception
        {
            // Initialize the environment
            EnvUtil.environmentInit();
            /**
            * LogWriter log = LogWriter.getInstance("TransBuilder.log", true, LogWriter.LOG_LEVEL_DETAILED);

            // Load all plugins and steps
            StepLoader stloader = StepLoader.getInstance();
            if (!stloader.read())
            {
                log.logError("TransBuilder", "Error loading Kettle steps & plugins... stopping now!");
                return;
            }
            */
            LogWriter.getInstance("table_to_table.log", true, LogWriter.LOG_LEVEL_DETAILED);

            // Load the steps
            StepLoader.init();

            // The parameters we want
            String fileName = "NewTrans.xml";
            String transformationName = "kettle_test";
            String sourceDatabaseName = "su1";
            String sourceTableName = "bs5";
            String sourceFields[] = {
                "no",
                "name",
                "height",
                "weight"
            };
            String targetDatabaseName = "su1";
            String targetTableName = "bs7";
            String targetFields[] = {
                "a",
                "b",
                "c",
                "d"
            };

            // Build the transformation
            TransMeta transMeta = table_to_table.buildCopyTable(
                transformationName,
                sourceDatabaseName,
                sourceTableName,
                sourceFields,
                targetDatabaseName,
                targetTableName,
                targetFields
            );

            // Save it to an XML file
            String xml = transMeta.getXML();
            //System.out.println(xml);
            DataOutputStream dos = new DataOutputStream(new FileOutputStream(new File(fileName)));
            dos.write(xml.getBytes("UTF-8"));
            dos.close();
            System.out.println("Saved transformation to file: "+fileName);

            // The SQL that creates the target table when the database does not have it yet
            String sql = transMeta.getSQLStatementsString();
            System.out.println(sql);

            // Execute the SQL against the target database
            Database targetDatabase = new Database(transMeta.findDatabase(targetDatabaseName));
            targetDatabase.connect();
            targetDatabase.execStatements(sql);

            // Execute the whole transformation
            Trans trans = new Trans(transMeta);
            trans.execute(null);
            trans.waitUntilFinished();

            // To make the program repeatable, drop the target table afterwards
            //targetDatabase.execStatement("drop table "+targetTableName);
            //targetDatabase.disconnect();
        }

    }

  2. #2
    Join Date
    May 2013
    Posts
    7

    Default

    It seems the picture can't be read; here is the exception:

    INFO 23-05 09:34:03,196 - Using "C:\Users\suwb\AppData\Local\Temp\vfs_cache" as temporary files store.
    INFO 23-05 09:34:03,651 - org.pentaho.di.core.util.ResolverUtil@f7c31d - Scanning for classes in [file:/F:/kettle3.2/data-integration/lib/kettle-engine.jar] matching criteria: [Lorg.pentaho.di.core.util.ResolverUtil$Test;@2acc65
    INFO 23-05 09:34:03,939 - org.pentaho.di.core.util.ResolverUtil@f7c31d - Could not examine class 'org/pentaho/di/trans/steps/formula/RowForumulaContext.class' due to a java.lang.NoClassDefFoundError with message: org/pentaho/reporting/libraries/formula/FormulaContext
    INFO 23-05 09:34:03,954 - org.pentaho.di.core.util.ResolverUtil@f7c31d - Could not examine class 'org/pentaho/di/trans/steps/getxmldata/GetXMLData$1.class' due to a java.lang.NoClassDefFoundError with message: org/dom4j/ElementHandler
    INFO 23-05 09:34:03,979 - org.pentaho.di.core.util.ResolverUtil@f7c31d - Could not examine class 'org/pentaho/di/trans/steps/infobrightoutput/KettleEtlLogger.class' due to a java.lang.NoClassDefFoundError with message: com/infobright/logging/EtlLogger
    INFO 23-05 09:34:03,980 - org.pentaho.di.core.util.ResolverUtil@f7c31d - Could not examine class 'org/pentaho/di/trans/steps/infobrightoutput/KettleValueConverter.class' due to a java.lang.NoClassDefFoundError with message: com/infobright/etl/model/ValueConverter
    INFO 23-05 09:34:04,116 - org.pentaho.di.core.util.ResolverUtil@f7c31d - Could not examine class 'org/pentaho/di/trans/steps/scriptvalues_mod/ScriptValuesAddedFunctions.class' due to a java.lang.NoClassDefFoundError with message: org/mozilla/javascript/ScriptableObject
    INFO 23-05 09:34:04,226 - org.pentaho.di.core.util.ResolverUtil@f7c31d - Could not examine class 'org/pentaho/di/trans/steps/webservices/wsdl/ControlWsdlLocator.class' due to a java.lang.NoClassDefFoundError with message: javax/wsdl/xml/WSDLLocator
    INFO 23-05 09:34:04,643 - DBCache - The database cache doesn't exist yet.
    Saved transformation to file: NewTrans.xml


    INFO 23-05 09:34:04,794 - - - New database connection defined
    Exception in thread "main" org.pentaho.di.core.exception.KettleDatabaseException:
    No valid database connection defined!


    at org.pentaho.di.core.database.Database.normalConnect(Database.java:274)
    at org.pentaho.di.core.database.Database.connect(Database.java:261)
    at org.pentaho.di.core.database.Database.connect(Database.java:223)
    at org.pentaho.di.core.database.Database.connect(Database.java:213)
    at table_to_table.main(table_to_table.java:250)

  3. #3

    Default

    Hello!
    It is still not entirely clear whether we can help you. Could you please explain once more the Pentaho/Kettle problem you are running into?
    Also, we often use www.pastebin.com here; perhaps you could give it a try.

    If you have already found a solution, please reply to let us know and briefly explain your approach.

    dat789


    Quote Originally Posted by suwb View Post
    I am not allowed to use the Kettle GUI or the repository; all I can use are the types and methods supplied by the Kettle source code. [...] my code is based on TransBuilder.java, but it does not work, and I have struggled with this for a long time.

  4. #4
    Join Date
    May 2013
    Posts
    7

    Default

    First, thank you for replying in Chinese. But I cannot access that site. Can you give me some Java examples like my code (http://forums.pentaho.com/showthread.php?143070-I-m-here-again-Who-can-help-me-with-an-API-example&p=342038#post342038)?

  5. #5
    Join Date
    May 2013
    Posts
    7

    Default

    Luckily, I finally fixed this. In my code, 'DatabaseMeta sourceDBInfo = transMeta.findDatabase(sourceDatabaseName);' should be 'DatabaseMeta sourceDBInfo = transMeta.findDatabase("kettle_bs");'. "kettle_bs" is the name of my database connection in Kettle; when I used sourceDatabaseName, sourceDBInfo was null.
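    In case anyone else hits the same "No valid database connection defined!" error, here is a small standalone sketch of the difference, using the connection XML from my first post (the class name below is only for this example, and I have only tested this against my own setup):

    import org.pentaho.di.core.database.DatabaseMeta;
    import org.pentaho.di.core.util.EnvUtil;
    import org.pentaho.di.trans.TransMeta;

    public class FindDatabaseDemo
    {
        // Same connection XML as in my first post: the <name> element is "kettle_bs",
        // while the <database> (schema) element is "123".
        private static final String CONNECTION_XML =
            "<connection>" +
            "<name>kettle_bs</name>" +
            "<server>localhost</server>" +
            "<type>MYSQL</type>" +
            "<access>Native</access>" +
            "<database>123</database>" +
            "<port>3306</port>" +
            "<username>root</username>" +
            "<password>suwb</password>" +
            "</connection>";

        public static void main(String[] args) throws Exception
        {
            EnvUtil.environmentInit();

            TransMeta transMeta = new TransMeta();
            transMeta.addDatabase(new DatabaseMeta(CONNECTION_XML));

            // findDatabase() matches the <name> element of the connection, not the schema name
            DatabaseMeta byConnectionName = transMeta.findDatabase("kettle_bs"); // found
            DatabaseMeta byOtherName      = transMeta.findDatabase("su1");       // null

            System.out.println("findDatabase(\"kettle_bs\") = " + byConnectionName);
            System.out.println("findDatabase(\"su1\")       = " + byOtherName);
        }
    }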

  6. #6
    Join Date
    May 2013
    Posts
    7

    Default

    I have another problem now. I want to use the type 'SelectValuesMeta.java' to reach a function , that is rename 'birthday' to 'birthday_re' and change the format from 'yyyy-mm-dd' to 'dd-mm-yyyy'. Who can tell me how to organize this step.(I am using kettle 3.2)
    Just like this : meta[i] = new SelectMetadataChange("birthday", "birthday_re", ValueMetaInterface.TYPE_DATE, null, null, -1, "dd-mm-yyyy", null, null, null);
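    Here is roughly how I think it would be wired in, following the commented-out SelectValuesMeta block from my first post. The SelectMetadataChange arguments are just the ones I quoted above, so I am not sure they are correct for the 3.2 jars; please correct me:

    // extra imports needed:
    // import org.pentaho.di.core.row.ValueMetaInterface;
    // import org.pentaho.di.trans.steps.selectvalues.SelectValuesMeta;
    // import org.pentaho.di.trans.steps.selectvalues.SelectMetadataChange;

    // inside buildCopyTable(), after the table input step (fromstep) has been added:
    SelectValuesMeta svm = new SelectValuesMeta();
    svm.allocate(0, 0, 1); // 0 selected fields, 0 removed fields, 1 metadata change

    // rename 'birthday' to 'birthday_re' and set the date mask
    // (using dd-MM-yyyy, since date masks use upper-case MM for the month)
    svm.getMeta()[0] = new SelectMetadataChange("birthday", "birthday_re",
            ValueMetaInterface.TYPE_DATE, null, null, -1, "dd-MM-yyyy", null, null, null);

    // register the step and hook it in after the table input
    String selStepId = steploader.getStepPluginID(svm);
    StepMeta selStep = new StepMeta(selStepId, "rename and reformat birthday", (StepMetaInterface) svm);
    selStep.setLocation(350, 100);
    selStep.setDraw(true);
    transMeta.addStep(selStep);

    transMeta.addTransHop(new TransHopMeta(fromstep, selStep)); // table input -> select values
    fromstep = selStep; // so the existing hop to the table output now starts from this step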
