I followed the tutorial http://wiki.pentaho.com/display/BAD/...Hadoop+Cluster
I installed a Hadoop 2.6.0 cluster on Ubuntu. With Spoon 5.3.0 on Ubuntu I can load data to HDFS successfully, but when I installed Spoon 5.3.0 on Windows and tried to load data into HDFS, it failed with the following error:
Could not close the output stream for file "hdfs://192.168.1.186:9000/user/pdi/weblogs_rebuild.txt".


The details are:
2015/05/16 17:27:14 - Hadoop Copy Files - File [file:///F:/weblogs_rebuild.txt] was copied to [hdfs://192.168.1.186:9000/user/pdi\weblogs_rebuild.txt]
2015/05/16 17:27:14 - Hadoop Copy Files - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : File system exception: Could not copy "file:///F:/weblogs_rebuild.txt" to "hdfs://192.168.1.186:9000/user/pdi/weblogs_rebuild.txt".
2015/05/16 17:27:14 - Hadoop Copy Files - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Caused by: Could not close the output stream for file "hdfs://192.168.1.186:9000/user/pdi/weblogs_rebuild.txt".
2015/05/16 17:27:14 - Hadoop Copy Files - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Caused by: DataStreamer Exception:
2015/05/16 17:27:14 - Hadoop Copy Files - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Caused by: null
2015/05/16 17:27:14 - Hadoop Copy Files - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Caused by: org.apache.xerces.jaxp.DocumentBuilderFactoryImpl cannot be cast to javax.xml.parsers.DocumentBuilderFactory
2015/05/16 17:27:14 - myfirst_job - Finished job entry [Hadoop Copy Files] (result=[false])
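
The last "Caused by" looks to me like a classpath conflict rather than an HDFS problem: a cast from DocumentBuilderFactoryImpl to javax.xml.parsers.DocumentBuilderFactory can only fail like this if two different classloaders are involved, for example because an extra Xerces/xml-apis jar is bundled somewhere on the Windows install. To see which JAXP implementation my Windows JVM actually resolves, and from which jar, I can run a minimal check like this (plain Java, nothing Pentaho-specific assumed):

import javax.xml.parsers.DocumentBuilderFactory;

// Minimal check: print which DocumentBuilderFactory implementation the JVM
// resolves, and where it was loaded from. A null code source means it came
// from the JDK's bootstrap classpath; an org.apache.xerces class coming from
// a jar inside the Spoon install would point at the classpath conflict.
public class JaxpCheck {
    public static void main(String[] args) {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        System.out.println("Implementation: " + f.getClass().getName());
        System.out.println("Loaded from:    "
                + f.getClass().getProtectionDomain().getCodeSource());
    }
}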
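To rule out the cluster and the network, I can also try the same copy outside Spoon with a small sketch against the Hadoop FileSystem API, using the NameNode address and paths from the log above (roughly what the Hadoop Copy Files entry does; the HADOOP_USER_NAME line is my assumption that HDFS expects the "pdi" user seen in the path):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Copy the local file to HDFS directly, bypassing Spoon. If this works from
// the Windows machine, HDFS is reachable and the problem is local to Spoon.
public class HdfsCopyCheck {
    public static void main(String[] args) throws Exception {
        // Assumption: run as the "pdi" user so HDFS permissions match the
        // /user/pdi directory from the log.
        System.setProperty("HADOOP_USER_NAME", "pdi");
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://192.168.1.186:9000"), conf);
        fs.copyFromLocalFile(
                new Path("F:/weblogs_rebuild.txt"),
                new Path("/user/pdi/weblogs_rebuild.txt"));
        fs.close();
        System.out.println("Copy succeeded");
    }
}

If this succeeds from Windows, then the data transfer itself is fine and the failure is inside Spoon's Windows classpath. Has anyone seen this DocumentBuilderFactory cast error with Spoon 5.3.0 on Windows, and which jar needs to be removed or replaced?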