Accessing HDFS from a Hadoop program on Windows
Hadoop runs on Linux; when running the Java code locally in Eclipse on Windows, the following exception occurs: java.lang.IllegalArgumentException: Wrong FS: hdfs:/ expected file:///
The Java code:
FileSystem fs = FileSystem.get(conf);
in = fs.open(new Path("hdfs://192.168.2.6:9000/user/hadoop/output/part-00000"));
The exception thrown:
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.2.6:9000/user/hadoop/output/part-00000, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:357)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:125)
at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:283)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:356)
at com.netease.hadoop.HDFSCatWithAPI.main(HDFSCatWithAPI.java:23)
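The cause: FileSystem.get(conf) returns whatever filesystem fs.defaultFS (fs.default.name in older Hadoop releases) names, which defaults to file:///. The returned local filesystem then rejects the hdfs:// path inside checkPath because the schemes differ. A minimal stdlib-only sketch of that scheme comparison (class and method names are illustrative, not Hadoop's):

```java
import java.net.URI;

public class WrongFsDemo {
    // Mimics the essence of FileSystem.checkPath: a path is only
    // acceptable if its scheme matches the filesystem's own scheme
    // (a missing scheme means "relative to this filesystem").
    static boolean schemeMatches(String fsUri, String pathUri) {
        String fsScheme = URI.create(fsUri).getScheme();
        String pathScheme = URI.create(pathUri).getScheme();
        return pathScheme == null || pathScheme.equals(fsScheme);
    }

    public static void main(String[] args) {
        // The local filesystem (file:///) rejects an hdfs:// path...
        System.out.println(schemeMatches("file:///",
                "hdfs://192.168.2.6:9000/user/hadoop/output/part-00000"));
        // ...while an HDFS filesystem on that authority accepts it.
        System.out.println(schemeMatches("hdfs://192.168.2.6:9000",
                "hdfs://192.168.2.6:9000/user/hadoop/output/part-00000"));
    }
}
```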
Solution:
Copy the cluster's core-site.xml and hdfs-site.xml into the current project so they are on the classpath (under the bin folder of the Eclipse workspace project). FileSystem.get(conf) will then pick up the cluster's fs.defaultFS instead of the file:/// default.
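The same effect can be had programmatically, without copying the XML files, by setting the default filesystem before calling FileSystem.get(conf). A sketch, assuming the Hadoop client jars are on the classpath and the cluster address from the example above (on Hadoop 1.x the property is fs.default.name rather than fs.defaultFS):

```java
import java.io.InputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HDFSCatWithConf {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the default filesystem at the cluster instead of file:///
        // (equivalent to having the cluster's core-site.xml on the classpath).
        conf.set("fs.defaultFS", "hdfs://192.168.2.6:9000");

        FileSystem fs = FileSystem.get(conf); // now an HDFS filesystem
        InputStream in = null;
        try {
            in = fs.open(new Path("/user/hadoop/output/part-00000"));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
```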
Alternatively, because a remote HDFS is being accessed, obtain the FileSystem from the URI itself via FileSystem.get(URI, Configuration) rather than relying on the default configuration.
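A sketch of that URI-based variant (class name hypothetical; same path as in the failing code, and again assuming the Hadoop client jars are available):

```java
import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HDFSCatWithURI {
    public static void main(String[] args) throws Exception {
        String uri = "hdfs://192.168.2.6:9000/user/hadoop/output/part-00000";
        Configuration conf = new Configuration();
        // Passing the URI explicitly makes FileSystem.get return an HDFS
        // filesystem regardless of what fs.defaultFS is set to, so
        // checkPath no longer sees a scheme mismatch.
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
```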