View Full Version : Hadoop File Input 'Browse -> Open File' does not respect hadoop configuration files?

11-07-2012, 11:18 AM
Hi all,
I believe this is a bug in PDI/Kettle/Spoon 4.3.0, but I would like your opinion on whether you agree before reporting it as a bug.

1) According to the instructions:
"Apply Hadoop client configuration files by placing the core-site, hdfs-site, and mapred-site.xml files in the $PDI_HOME directory."
I've placed my hadoop relevant config files in $PDI_HOME.
In particular, my core-site.xml is the following:
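For reference, a minimal core-site.xml of the shape I'm using looks like this ("some-host" stands in for my actual NameNode host):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <!-- NameNode URI; "some-host" is a placeholder for the real host name -->
    <name>fs.default.name</name>
    <value>hdfs://some-host</value>
  </property>
</configuration>
```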

2) I create a transformation with a "Hadoop File Input" step, open its properties, and click "Browse...". The "Open File" dialog appears.
My expectation is that the "Connection" fields in the dialog are pre-filled with the values from core-site.xml ("Server: some-host" and "Port: <empty>").
The current behaviour is that the "Connection" fields are pre-filled with "Server: localhost" and "Port: 9000".

Could you advise me whether this also happens on your configurations and is indeed a bug to report at http://jira.pentaho.com/ , or whether I might have skipped a configuration step?

Thank you,

11-12-2012, 04:31 PM

I'm having the same problem. I know it's not reading the right file, because I keep getting a warning saying fs.default.name is deprecated even though I've updated core-site.xml to use the new property name; I still get the same error message.
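For anyone hitting the same warning: the deprecated fs.default.name key was replaced by fs.defaultFS, so my updated entry looks like this (host and port are placeholders for my actual NameNode):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <!-- replaces the deprecated fs.default.name key -->
    <name>fs.defaultFS</name>
    <!-- placeholder NameNode host and port -->
    <value>hdfs://namenode-host:8020</value>
  </property>
</configuration>
```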

Any help would be gratefully received, as this is stopping me from using Kettle, not to mention driving me up the wall.