
java.lang.ClassNotFoundException: org.pentaho.di.core.exception.KettleException



ktadikamalla
09-17-2013, 06:11 PM
Hi, I am able to submit Pig scripts to my HDFS using the pdi-ce-4.4.0-stable version, but when I create a Pentaho MapReduce job it throws the error below: java.lang.ClassNotFoundException: org.pentaho.di.core.exception.KettleException. I saw this error in several forums but did not find a solution. Can somebody please help me resolve this issue?

ktadikamalla
09-17-2013, 10:30 PM
Hi, I was able to partially resolve the issue; now I get the error java.lang.ClassNotFoundException: org.pentaho.di.trans.step.RowListener. Can someone please help me resolve this?

MattCasters
09-18-2013, 02:48 AM
It simply means you didn't set up your environment correctly. You need to verify ports and so on. Check the wiki for a walk-through of how to set things up for your favorite Hadoop distribution.

ktadikamalla
09-18-2013, 02:58 AM
Thanks Matt for your response. Just thinking out loud: could it really be an environment issue? I am able to run Pig scripts, MapReduce jobs and Hive scripts; everything works fine. In fact, I am even able to submit Pig scripts from PDI. The issue only starts when I use Pentaho MapReduce jobs.

MattCasters
09-18-2013, 03:12 AM
Check on HDFS whether you have an /opt/pentaho folder with a copy of the PDI libraries somewhere in it. If not, then Spoon probably can't copy files into it, so check the folder permissions on HDFS for /opt/pentaho.
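For example, a quick check from the command line (a minimal sketch; the chmod is just one way of opening things up, assuming the hdfs user is your HDFS superuser):

# List the staging area Spoon writes to and check ownership/permissions
hadoop fs -ls /opt
hadoop fs -ls /opt/pentaho

# If the permissions block Spoon from writing, loosen them
# (run as the HDFS superuser; adjust the mode to your own security policy)
sudo -u hdfs hadoop fs -chmod -R 775 /opt/pentaho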

ktadikamalla
09-18-2013, 03:18 AM
Yes Matt, Spoon is able to copy the jar files. I checked the folder /opt/pentaho/.../..../..../lib and it has all the jar files from the big data plugin folder, and it has sufficient permissions as well (rw-r--r--).

MattCasters
09-18-2013, 03:22 AM
Did you also find the kettle-*.jar files?

ktadikamalla
09-18-2013, 03:37 AM
Yes Matt, the kettle-core, kettle-db and kettle-engine jar files are all available in the lib folder.

MattCasters
09-18-2013, 04:02 AM
Well, just for the record, you need an HDFS folder /opt/pentaho/mapreduce/<your version>

In there you need a lib/ folder, but also a plugins/ folder that in turn contains a pentaho-big-data-plugin/ folder.

Check to see if there's a commons-configuration-1.5.jar file present. Some later 0.2 versions need it. If you don't have it, add it to your configuration (e.g. pentaho-big-data-plugin/hadoop-configurations/hadoop-20/lib/client).
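A rough way to verify that layout and drop the jar in place (a sketch only; <your version> and the /opt/data-integration install path are placeholders for whatever your setup actually uses):

# Check the expected layout under the Pentaho staging area on HDFS
hadoop fs -ls /opt/pentaho/mapreduce
hadoop fs -ls /opt/pentaho/mapreduce/<your version>/lib
hadoop fs -ls /opt/pentaho/mapreduce/<your version>/plugins/pentaho-big-data-plugin

# Copy commons-configuration into the local shim's client lib folder
# (install path is a placeholder; adjust to where PDI lives on your machine)
cp commons-configuration-1.5.jar \
   /opt/data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/hadoop-20/lib/client/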

ktadikamalla
09-18-2013, 12:42 PM
Yes Matt, it has the same folder structure and jar files you mentioned above. The only difference is that I have commons-configuration-1.6.jar, which is the later version. I am using the hdp13 distribution on Windows 8; not sure if this is causing any issues. I am able to run all the MapReduce jobs, Pig scripts and Hive, so I can't see it being an issue with the Hadoop distribution. I am thinking some jar files are either not compatible or not available in the respective folder.

MattCasters
09-18-2013, 02:07 PM
Sorry, you're on your own on Windows. I don't know anybody doing that and I haven't tried it myself.

hallmit
06-18-2014, 04:09 AM
I was experiencing a similar case and I solved it by changing the classpath separator in spoon.sh: OPT="$OPT ... -Dhadoop.cluster.path.separator=,"
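A minimal sketch of what that edit looks like, assuming a stock spoon.sh where OPT is built up from a series of -D options (the "..." stands for whatever options your script already sets):

# In spoon.sh, append the separator property to the existing OPT assignment
OPT="$OPT -Dhadoop.cluster.path.separator=,"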

Duplicated problem here: http://forums.pentaho.com/showthread.php?149621-error-java-lang-ClassNotFoundException-org-pentaho-di-trans-step-RowListener&highlight=ClassNotFoundException+RowListener

cheranilango
10-26-2016, 09:13 AM
Hi everyone,

Just to add an update on this issue: for a strange Halloween reason, what we noticed was that /opt/pentaho/mapreduce/6.1.0.1-196-6.1.0.1-196-cdh55/lib was completely empty after a cluster & VM restart. We definitely had all the lib files one day before the cluster restart. To solve this we manually did a clean-up, recreated the directory and re-ran the Pentaho MapReduce job, and the issue was solved.


# Clean up the Pentaho MapReduce staging folder on HDFS

sudo -u hdfs hadoop fs -rm -r /opt/pentaho/

# Recreate the necessary folders and give ownership to the user submitting the job (admin here)

sudo -u hdfs hadoop fs -mkdir -p /opt/
sudo -u hdfs hadoop fs -mkdir -p /opt/pentaho/
sudo -u hdfs hadoop fs -mkdir -p /opt/pentaho/mapreduce/6.1.0.1-196-6.1.0.1-196-cdh55/lib
sudo -u hdfs hdfs dfs -chown -R admin /opt/pentaho/mapreduce/6.1.0.1-196-6.1.0.1-196-cdh55/lib

# Ensure HDFS is not in safe mode


sudo -u hdfs hdfs dfsadmin -safemode leave
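After re-running the Pentaho MapReduce job, a quick sanity check (the version folder is the one from this post; yours will differ) is to confirm the lib folder was repopulated and that safe mode really is off:

# Verify the libraries came back and safe mode is off
hadoop fs -ls /opt/pentaho/mapreduce/6.1.0.1-196-6.1.0.1-196-cdh55/lib
sudo -u hdfs hdfs dfsadmin -safemode get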