Hitachi Vantara Pentaho Community Forums

Thread: Copying files into Hadoop error using Kettle 5.1.0

  1. #1


    Hi, I've been experimenting with copying log files into Hadoop. So far I'm not having much luck; I keep getting this error:

    2014/09/10 11:41:54 - Copy XForm Log Files to Hadoop - Unable to get VFS File object for filename 'hdfs://localhost:8020/user/hdfs' : Could not resolve file "hdfs://localhost:8020/user/hdfs".

    I tried other pathnames and swapped localhost for the server name, with no luck. The path /user/hdfs does seem to exist:

    root:/opt/pentaho-ce/di-jobs->hadoop fs -ls /user
    Found 8 items
    drwxr-xr-x - hdfs supergroup 0 2014-09-09 11:43 /user/hdfs
    drwxrwxrwx - mapred hadoop 0 2014-08-29 17:14 /user/history
    drwxrwxr-t - hive hive 0 2014-08-29 17:15 /user/hive
    drwxrwxr-x - hue hue 0 2014-09-09 11:44 /user/hue
    drwxrwxr-x - impala impala 0 2014-08-29 18:36 /user/impala
    drwxrwxr-x - oozie oozie 0 2014-08-29 17:21 /user/oozie
    drwxr-xr-x - sample sample 0 2014-09-09 11:44 /user/sample
    drwxrwxr-x - sqoop2 sqoop 0 2014-08-29 17:16 /user/sqoop2
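
    The NameNode URI itself can be double-checked from the shell. A minimal sanity check using standard HDFS client commands (8020 is only the usual NameNode RPC default; the real value comes from fs.defaultFS, so adjust as needed):

    # Ask the cluster which filesystem URI clients should use
    hdfs getconf -confKey fs.defaultFS
    # Then list the path fully qualified, exactly as Kettle will see it
    hadoop fs -ls hdfs://localhost:8020/user/hdfs

    If the fully qualified ls works from the shell but Kettle still can't resolve the URL, the problem is on the Kettle side rather than the cluster.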

    How do I fix this error? Is there a better way?

    Thanks.

    Kevin

  2. #2

    What Hadoop distribution and version are you using? If not Apache Hadoop 0.20, you need to set your Hadoop configuration as described here: http://wiki.pentaho.com/display/BAD/...ro+and+Version
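
    Concretely, the change lives in the big data plugin's plugin.properties file. A minimal sketch, assuming a default PDI 5.1 install layout (the shim name here is an assumption; it has to match a folder that actually exists under hadoop-configurations in your install):

    # File: data-integration/plugins/pentaho-big-data-plugin/plugin.properties
    # Switch the active shim from the Apache default to your distribution,
    # e.g. for a CDH 5.x shim:
    active.hadoop.configuration=cdh51

    Restart Spoon after editing it so the plugin picks up the new shim.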

  3. #3


    Matt, I'm using CDH 5.1. I made the changes in the properties files and now it works as expected. And it's pretty quick. Thanks.
