Hitachi Vantara Pentaho Community Forums

Thread: How to specify Hadoop user to run MapReduce Job

  1. #1
    Join Date
    Nov 2010
    Posts
    16

    Default How to specify Hadoop user to run MapReduce Job

    Hello,

    I'm trying to run a Pentaho MapReduce job from a machine on which I'm logged in as "Administrator".

    The job is failing because that user has no access to the Hadoop cluster (which is correct) - see the error below.

    How can I set up the job so that it runs as a specific user, with a password?

    In the "Hadoop File Input" step I can specify a username and password (hdfs://user@<SERVER>:8020/).

    Is there a way to do the same in the MapReduce step? (If I use user@server I get an "unknown host" error.)

    2013/01/21 18:07:58 - Spoon - Starting job...
    2013/01/21 18:07:58 - SFW_MapReduce - Start of job execution
    2013/01/21 18:07:58 - SFW_MapReduce - Starting entry [SFW_MapReduce]
    2013/01/21 18:08:05 - SFW_MapReduce - Cleaning output path: hdfs://<SERVER>:8020/user/bio/SFW
    2013/01/21 18:08:06 - SFW_MapReduce - Installing Kettle to /opt/pentaho/mapreduce/4.4.0-1.3.0-cdh3u4
    2013/01/21 18:08:07 - SFW_MapReduce - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Kettle installation failed
    2013/01/21 18:08:07 - SFW_MapReduce - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=Administrator, access=WRITE, inode="/":hdfs:hadoop:drwxr-xr-x
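    For later readers: the log above shows the failure happens while installing Kettle into HDFS under /opt/pentaho/mapreduce, a path the "Administrator" user cannot write to. One workaround (a sketch, assuming you have shell access to the cluster as the HDFS superuser) is to pre-create that directory and make it writable before running the job:

    ```shell
    # Sketch of a workaround; assumes you can run these as the HDFS
    # superuser (often the "hdfs" user). The path matches the one in
    # the log above. On older Hadoop versions "fs -mkdir" creates
    # parent directories automatically; on newer ones add -p.
    hadoop fs -mkdir /opt/pentaho/mapreduce
    # Make it writable for the client user that launches Spoon:
    hadoop fs -chmod -R 777 /opt/pentaho/mapreduce
    ```

    This only unblocks the Kettle installation step; the job's own input/output paths still need appropriate permissions.
    
    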

  2. #2
    Join Date
    Oct 2013
    Posts
    3

    Default

    Has anyone found a solution for this? I'm facing the same issue, and searching the web hasn't turned up anything so far...

  3. #3
    Join Date
    Sep 2012
    Posts
    71

  4. #4
    Join Date
    Oct 2013
    Posts
    3

    Default

    Thanks, but it looks like there isn't much activity on this issue. It seems pretty urgent and important, not only to me but to the whole community, no?

  5. #5

    Default

    I see that the mentioned ticket has been marked closed and implemented in 5.1.0 GA. We are using 5.2.0 and have read through the tickets mentioned above, but we still do not see a way to specify a user/password in the Pentaho MapReduce entry. Any details on how we might set this up with our Hortonworks Hadoop instance? Thanks.
    Last edited by simon; 04-27-2015 at 05:23 PM.
    -Simon
    Pentaho Version: 5.2.0.0

  6. #6
    Join Date
    May 2015
    Posts
    2

    Default

    Try adding the property below to the hdfs-site.xml file (note: this disables HDFS permission checking for the whole cluster, so it is a workaround rather than a real fix):

    <property>
      <name>dfs.permissions</name>
      <value>false</value>
    </property>

    This might resolve the issue.
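    If disabling permissions cluster-wide is too blunt, another common workaround is to override the user name the Hadoop client libraries report, via the HADOOP_USER_NAME environment variable, before launching Spoon or Kitchen. A sketch, with the assumption that the cluster uses Hadoop's simple authentication (this has no effect on a Kerberos-secured cluster):

    ```shell
    # Assumption: simple (non-Kerberos) authentication, where the Hadoop
    # client trusts whatever user name HADOOP_USER_NAME supplies.
    # "bio" is the HDFS user from the output path in the original post.
    export HADOOP_USER_NAME=bio

    # Launch the Pentaho client from this same shell so the job inherits it:
    # ./spoon.sh   (path depends on your Pentaho installation)
    ```

    This makes HDFS and the JobTracker see the job as user "bio" instead of the local login "Administrator", which avoids the permission-denied error without touching cluster configuration.
    
    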


Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.