View Full Version : Loading a kettle job through JAVA code, with SqoopImport fails



ayyappanm
06-28-2013, 12:22 PM
Hi All,

I'm trying to load a Kettle job with a Sqoop Import through Java code. While creating the JobMeta, I'm encountering a Kettle exception. The same happens for the HadoopCopyFiles plugin. Any suggestion would be helpful.


===============================================================
CODE
===============================================================
String path = getClass().getClassLoader().getResource("TestOracle.kjb").getPath();
log.info(path);
JobMeta meta = new JobMeta(path, null);
meta.setParameterValue("repos.directory", "jobs");


=================================================================================
STACKTRACE
=================================================================================
org.pentaho.di.core.exception.KettleXMLException:
Unable to load the job from XML file [/home/cloudera/workspace/Pentaho_Run/jobs/TestOracle.kjb]
Unable to load job info from XML node
Unable to read Job Entry copy info from XML node : org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for SqoopImport

No valid step/plugin specified (jobPlugin=null) for SqoopImport


at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:879)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:831)
at test.pentaho.bigdata.SqoopTest.runJob(SqoopTest.java:35)
at test.pentaho.bigdata.SqoopTest.main(SqoopTest.java:20)
Caused by: org.pentaho.di.core.exception.KettleXMLException:
Unable to load job info from XML node
Unable to read Job Entry copy info from XML node : org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for SqoopImport

MattCasters
06-28-2013, 12:28 PM
Did you initialize the Kettle environment? (KettleEnvironment.init())
Also, make sure the big data plugins are in a location where they can be found.
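
For reference, a minimal startup sequence might look like the sketch below. It's only a sketch: the class names come from the Kettle 4.x API and the install path is the one from this thread, so adjust both for your setup. The key point is that the plugin folder is registered before init(), so the plugin registry scans it when it is built:

// Minimal sketch, assuming the Kettle 4.x API and the
// /opt/pentaho install path mentioned in this thread.
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.plugins.JobEntryPluginType;
import org.pentaho.di.core.plugins.PluginFolder;
import org.pentaho.di.core.plugins.PluginFolderInterface;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class KettleJobRunner {
public static void main(String[] args) throws Exception {
// Register the big-data plugin folder BEFORE init() so the
// registry picks it up when it scans for job entry plugins.
PluginFolderInterface folder = new PluginFolder(
"/opt/pentaho/design-tools/data-integration/plugins/", true, true, true);
JobEntryPluginType.getInstance().getPluginFolders().add(folder);

KettleEnvironment.init();

// Only after init() can a .kjb that uses SqoopImport be parsed.
JobMeta meta = new JobMeta("TestOracle.kjb", null);
Job job = new Job(null, meta);
job.run(); // run() executes the job in the calling thread
}
}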

ayyappanm
06-28-2013, 12:32 PM
Yes. Here is my code.

EnvUtil.environmentInit();
KettleEnvironment.init();

String path = getClass().getClassLoader().getResource("TestOracle.kjb").getPath();
log.info(path);
JobMeta meta = new JobMeta(path, null);
meta.setParameterValue("repos.directory", "jobs");

Job job = new Job(null, meta);
JobEntryCopy start = meta.findStart();
if (start != null) {
    log.info(start.getName());
    job.setName("TestOracle");
    job.resetErrors();
    job.activateParameters();
    job.run();
}

Also, I'm running this application through Eclipse, and I added most of the jars under the data-integration install ("/opt/pentaho/design-tools/data-integration/plugins/pentaho-big-data-plugin/lib") to the build path.

Is there any specific jar I need to add to the build path to make this job run?

MattCasters
06-28-2013, 06:39 PM
Well, you could try to add folders to the various plugin types so that they would know where to find the plugins:


PluginFolderInterface folder = new PluginFolder("/opt/pentaho/design-tools/data-integration/plugins/", true, true, true);
StepPluginType.getInstance().getPluginFolders().add(folder);
JobEntryPluginType.getInstance().getPluginFolders().add(folder);
SpoonPluginType.getInstance().getPluginFolders().add(folder);

Or you could move the Eclipse launch config working directory to /opt/pentaho/design-tools/data-integration/
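
That works because Kettle, by default, looks for a plugins folder relative to the working directory. If I recall correctly you can also point the scan somewhere else with a system property; the KETTLE_PLUGIN_BASE_FOLDERS name below is from memory, so treat it as an assumption and verify it against your Kettle version:

// Alternative sketch: point Kettle at the plugin base folder
// explicitly, before KettleEnvironment.init().
// Property name KETTLE_PLUGIN_BASE_FOLDERS is an assumption.
System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS",
"/opt/pentaho/design-tools/data-integration/plugins");
KettleEnvironment.init();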

ayyappanm
06-30-2013, 01:02 AM
Thanks a lot. Setting the Eclipse launch config working directory solved the problem. :)