Hitachi Vantara Pentaho Community Forums

Thread: Hadoop Shim related errors

  1. #1
    Join Date
    Apr 2016
    Posts
    13

    Default Hadoop Shim related errors

    Hi Team,

    I'm trying to connect to an HDFS system (Hortonworks) for the first time. I'm using PDI 8 on my Windows machine. My Hadoop cluster is secured, so I'm using Kerberos for authentication. Here is the problem: I did kinit and placed the keytab file path in the Windows core-site.xml:

    authentication.superuser.provider=hdp-kerberos
    authentication.kerberos.principal=USER@DOMAIN.COM
    authentication.kerberos.keytabLocation=C:\Users\myuser\krb5cc_username
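
    (One detail worth checking, a sketch assuming these lines end up in the shim's config.properties as shown later in post #6: in a Java .properties file a backslash is an escape character, so a Windows path should use forward slashes or doubled backslashes. Also, files named krb5cc_* are normally Kerberos credential caches created by kinit, while keytabLocation expects an actual keytab file. The file name below is a placeholder:)

    # hypothetical keytab path; use forward slashes in .properties values
    authentication.kerberos.keytabLocation=C:/Users/myuser/myuser.keytab
    # or escape the backslashes instead:
    # authentication.kerberos.keytabLocation=C:\\Users\\myuser\\myuser.keytab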

    Now when I test, the shim is not recognized:

    [Screenshot attachment: shim-error.jpg]

    When I set NO_AUTH, restart PDI, and connect again, it shows the following:
    authentication.superuser.provider=NO_AUTH

    [Screenshot attachment: shim-success.jpg]

    But I'm unable to connect to the HDFS directory. I tried using the Hadoop Copy Files step, and when I selected my connection it showed the following:

    [Screenshot attachment: conncetion-error.PNG]

    Can anyone point me to the correct source where I can find the steps to configure and test this successfully?

    Regards,
    G.Sujay.

  2. #2
    Join Date
    Sep 2011
    Posts
    152

    Default

    Did you try copying the same core-site.xml file present in HDP?
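
    (For reference, in PDI 8 the cluster's site files usually go into the active shim's folder; the exact shim folder name depends on which shim is installed, so the path below is only an example:)

    data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/hdp26/core-site.xml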

  3. #3
    Join Date
    Apr 2016
    Posts
    13

    Default

    Hi Rajesh, yes, I did copy it.

  4. #4
    Join Date
    Apr 2016
    Posts
    13

    Default Pentaho 8 and Kerberos

    Hi All,

    I just have a few basic questions, such as: can Pentaho Data Integration (8.0) Community Edition support Kerberos authentication? I'm trying to connect to a secured HDP 2.4.3.2-1 cluster (Hortonworks).

    When I turn on authentication.superuser.provider=hdp-kerberos, my shim doesn't load.
    If I set authentication.superuser.provider=NO_AUTH, the shim loads and shows the following:
    [Screenshot attachment: Capture.jpg]

    But later, when I try accessing an HDFS folder, it shows "Unable to get VFS File object for filename 'hdfs://clustername'".
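
    (One possible cause, sketched with placeholder values: a VFS HDFS URL usually needs the NameNode host, and typically the port and a path, rather than just a cluster name. 8020 is a common default NameNode RPC port; the authoritative value is fs.defaultFS in the cluster's core-site.xml:)

    hdfs://namenode.example.com:8020/user/myuser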

    Any inputs will be helpful.

    Regards,
    G.Sujay.

  5. #5
    Join Date
    Sep 2011
    Posts
    152

    Default

    Please share your config.properties output.
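
    (That file normally lives inside the active shim's directory, along these lines:)

    data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/<shim>/config.properties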

  6. #6
    Join Date
    Apr 2016
    Posts
    13

    Default

    Hi @rajesh, please find my config.properties below.

    # Friendly name for this configuration
    name=HortonWorks HDP 2.6.x


    # Comma-separated list of directories and files to make available to this
    # configuration. Any resources found here will overwrite ones in lib/.
    classpath=lib/hive-service-1.2.1000.2.6.0.3-8.jar


    # Comma-separated list of paths that contain native libraries to load. These
    # could be added to LD_LIBRARY_PATH or set with -Djava.library.path instead.
    library.path=




    ignore.classes=java.security.Permission,org.apache.derby


    #HDP version that is used on the cluster
    #java.system.hdp.version=2.6.0.3-8
    java.system.hdp.version=2.4.3.2-1
    authentication.superuser.provider=NO_AUTH


    #authentication.superuser.provider=hdp-kerberos
    authentication.superuser.id=hdp-kerberos
    authentication.kerberos.principal=USER@DOMAIN.ABCD.COM
    authentication.kerberos.password=XXXXXXX
    #authentication.kerberos.keytabLocation=C:\Users\local_user\krb5cc_localuser

  7. #7
    Join Date
    Sep 2011
    Posts
    152

    Default

    I think you are using the wrong shim: your config.properties says "HortonWorks HDP 2.6.x", but your cluster is HDP 2.4.3.2-1.

  8. #8
    Join Date
    Sep 2011
    Posts
    152

    Default

    In the second screenshot, it seems like you are already connected to your HDP cluster and everything is working properly. For the VFS error, see:
    https://jira.pentaho.com/browse/PDI-9957
