Hitachi Vantara Pentaho Community Forums

Thread: PDI + CDH4.4.0 : Unable to get VFS File object for filename

  1. #1

    PDI + CDH4.4.0 : Unable to get VFS File object for filename

    I am trying to use PDI + CDH4.4.0.

    My Hadoop cluster is working fine.

    When I try to copy files to the cluster I get the following error:

    Code:
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unable to copy the folder/file [d:/weblogs_rebuild.txt] to [hdfs://10.239.69.200:8020/test]. Exception: [
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : 
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unable to get VFS File object for filename 'hdfs://10.239.69.200:8020/test' : Could not resolve file "hdfs://10.239.69.200:8020/test".
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : 
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ]
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.core.exception.KettleFileException: 
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : 
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unable to get VFS File object for filename 'hdfs://10.239.69.200:8020/test' : Could not resolve file "hdfs://10.239.69.200:8020/test".
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : 
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : 
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:376)
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:324)
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.job.Job.execute(Job.java:589)
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.job.Job.execute(Job.java:728)
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.job.Job.execute(Job.java:443)
    2014/01/20 14:19:50 - Hadoop Copy Files - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :     at org.pentaho.di.job.Job.run(Job.java:363)
    Hadoop Packages on the Server:

    Code:
    hadoop-hdfs-namenode-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    hadoop-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    hadoop-mapreduce-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    hadoop-0.20-mapreduce-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    hadoop-hdfs-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    hadoop-yarn-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    hadoop-client-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    hadoop-0.20-mapreduce-jobtracker-2.0.0+1475-1.cdh4.4.0.p0.23.el6.x86_64
    I tried the following PDI packages:

    Code:
    PDI 4.4.0-stable;
    PDI 4.4.0-stable + Big-Data-Plugin Version 1.3.3.1; and
    PDI 5.0.1-stable
    All three options produced the same error message.


    And I can see this in the Hadoop log:

    Code:
    2014-01-20 08:24:27,686 WARN org.apache.hadoop.ipc.Server: Incorrect header or version mismatch from 10.239.69.20:53593 got version 3 expected version 7
    Can anyone help me?

    Regards.

  2. #2


    I've figured it out.

    Change this setting in the plugin.properties file:

    active.hadoop.configuration=cdh42
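    For anyone hitting the same thing, a minimal sketch of the change. It assumes a standard PDI install layout where the Big Data plugin lives under data-integration/plugins/pentaho-big-data-plugin (adjust the path to your install; the available shim names depend on your plugin build, so list them first):

    Code:
    ```shell
    # Location of the Big Data plugin config (assumed standard PDI layout).
    CONFIG=data-integration/plugins/pentaho-big-data-plugin/plugin.properties

    # See which Hadoop shims ship with this plugin build (e.g. cdh42, hadoop-20).
    ls data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations

    # Point the plugin at the CDH4 shim instead of the default, then restart PDI.
    sed -i 's/^active\.hadoop\.configuration=.*/active.hadoop.configuration=cdh42/' "$CONFIG"
    ```

    With the default shim active, the client speaks the old Hadoop RPC protocol, which matches the "got version 3 expected version 7" warning in the NameNode log; switching the shim to one built against CDH4 resolves it.
    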

    Regards.
