Hitachi Vantara Pentaho Community Forums
Results 1 to 6 of 6

Thread: MapReduce job fails: java.lang.RuntimeException: Error in configuring object

  1. #1

    MapReduce job fails: java.lang.RuntimeException: Error in configuring object

    Hello all,

    I could really use some help.

    I tried to execute the example from the Pentaho website http://wiki.pentaho.com/display/BAD/...ataset+in+MapR

    and got the error below. The weird thing is that a colleague of mine runs the same job with no problem at all, on the same Hadoop cluster.

    Has anybody any idea on this one?

    Thanks,
    Rainer

    2013/03/05 16:19:29 - Pentaho MapReduce - Configuring Pentaho MapReduce job to use Kettle installation from /opt/pentaho/mapreduce/4.4.0-1.3.0-cdh4
    2013/03/05 16:19:57 - Pentaho MapReduce - Setup Complete: 0.0 Mapper Completion: 0.0 Reducer Completion: 0.0
    2013/03/05 16:20:02 - Pentaho MapReduce - Setup Complete: 100.0 Mapper Completion: 0.0 Reducer Completion: 0.0
    2013/03/05 16:20:07 - Pentaho MapReduce - Setup Complete: 100.0 Mapper Completion: 0.0 Reducer Completion: 0.0
    2013/03/05 16:20:12 - Pentaho MapReduce - Setup Complete: 100.0 Mapper Completion: 0.0 Reducer Completion: 0.0
    2013/03/05 16:20:12 - Pentaho MapReduce - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : [FAILED] -- Task: attempt_201302271045_0061_m_000005_0 Attempt: attempt_201302271045_0061_m_000005_0 Event: 1
    2013/03/05 16:20:12 - Pentaho MapReduce - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:72)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:414)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
    Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.ja

  2. #2


    The TaskTracker logs the following message. Everything I find online is about older versions of Kettle, where you had to modify mapred-site.xml. That isn't necessary anymore, right? I'm using 4.4 stable.


    task-diagnostic-info for task attempt_201302271045_0066_m_000003_1 : java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:72)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:414)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
    Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:103)
    ... 9 more
    Caused by: java.lang.RuntimeException: Error loading transformation
    at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.createTrans(PentahoMapRunnable.java:197)
    at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.configure(PentahoMapRunnable.java:167)
    ... 14 more
    Caused by: org.pentaho.di.core.exception.KettleXMLException:
    Error reading object from XML file

    Unable to load step info from XML step nodeorg.pentaho.di.core.exception.KettleStepLoaderException:
    Unable to load class for step/plugin with id [HadoopEnterPlugin]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.


    Unable to load class for step/plugin with id [HadoopEnterPlugin]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.



    at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:3297)
    at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2844)
    at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2830)
    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2817)
    at org.pentaho.di.trans.TransConfiguration.<init>(TransConfiguration.java:69)
    at org.pentaho.di.trans.TransConfiguration.fromXML(TransConfiguration.java:76)
    at org.pentaho.hadoop.mapreduce.MRUtil.getTrans(MRUtil.java:68)
    at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.createTrans(PentahoMapRunnable.java:195)
    ... 15 more
    Caused by: org.pentaho.di.core.exception.KettleXMLException:
    Unable to load step info from XML step nodeorg.pentaho.di.core.exception.KettleStepLoaderException:
    Unable to load class for step/plugin with id [HadoopEnterPlugin]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.


    Unable to load class for step/plugin with id [HadoopEnterPlugin]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.


    at org.pentaho.di.trans.step.StepMeta.<init>(StepMeta.java:308)
    at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2947)
    ... 22 more
    Caused by: org.pentaho.di.core.exception.KettleStepLoaderException:
    Unable to load class for step/plugin with id [HadoopEnterPlugin]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.

    at org.pentaho.di.trans.step.StepMeta.<init>(StepMeta.java:250)
    ... 23 more

  3. #3
    Join Date
    Sep 2012
    Posts
    71


    It looks like you are trying to run the job on a CDH4 (Cloudera) Hadoop cluster. Does your plugin.properties file (under data-integration/plugins/pentaho-big-data-plugin) have the "active.hadoop.configuration" property set to "cdh4" (rather than the default value of "hadoop-20")?
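    For reference, the relevant line in data-integration/plugins/pentaho-big-data-plugin/plugin.properties would look roughly like this (property name taken from the post above; the value must match a folder under the plugin's hadoop-configurations directory, so adjust it for your cluster):

    ```properties
    # Selects which Hadoop configuration (shim) the Big Data plugin loads.
    # The value must match a directory name under
    # plugins/pentaho-big-data-plugin/hadoop-configurations/.
    # Default is hadoop-20; for a Cloudera CDH4 cluster set:
    active.hadoop.configuration=cdh4
    ```

    After changing it, restart Spoon/Kitchen so the plugin picks up the new shim.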

  4. #4
    Join Date
    Apr 2013
    Posts
    2


    Hi Rainer and Matt,

    Were you able to find the root cause for this? I get the same exception when I try to run the Pentaho MapReduce sample programs. I'm still unsure what configuration changes are needed: I'm using CDH3u4 and have set the same in the plugin.properties file. Your help in resolving the issue would be greatly appreciated.

  5. #5
    Join Date
    Aug 2010
    Posts
    87


    The easiest way to get back to a working environment is to delete the /opt/pentaho/mapreduce/4.4.0-1.3.0-cdh4 directory and let Kettle re-upload all required dependencies into HDFS. This happens automatically the next time you run a job that contains a Pentaho MapReduce (PMR) job entry. Just make sure the user who executes that first job has write permissions on /opt/pentaho/mapreduce!
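    For anyone following along, the cleanup described above can be sketched from a cluster node roughly like this (directory path taken from the log in post #1; your version suffix will differ, and these commands obviously need a live HDFS to run against):

    ```shell
    # Remove the Kettle runtime that PDI staged in HDFS; Kettle will
    # re-upload it on the next Pentaho MapReduce job run.
    hadoop fs -rm -r /opt/pentaho/mapreduce/4.4.0-1.3.0-cdh4

    # Make sure the user who runs the next job can write to the parent
    # directory, then verify the old installation is gone.
    hadoop fs -chmod 775 /opt/pentaho/mapreduce
    hadoop fs -ls /opt/pentaho/mapreduce
    ```

    Note these are HDFS paths, not local filesystem paths, so they must be run with `hadoop fs` (or as a user with HDFS superuser rights).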

  6. #6
    Join Date
    Apr 2013
    Posts
    2


    jganoff, that really helped... thanks for the valuable info! :-)
