Hadoop HelloWorld Examples - Operating on the Hadoop FileSystem from Java
My file operations in Hadoop used to be done entirely from the command line, but once you move beyond the basics you often need to manipulate HDFS directly from Java code, so today is some practice. Here is a simple demo that copies the contents of one HDFS file into another HDFS file.

Related reading:
"Hadoop in Action" Chinese edition + English text edition + source code [PDF] http://www.linuxidc.com/Linux/2012-10/71901.htm
Hadoop HelloWorld Examples - single-table join http://www.linuxidc.com/Linux/2013-08/89374.htm

import java.io.*;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;

public class ShortestPath {

    public static void main(String[] args) throws Exception
    {
Configuration conf = new Configuration();
conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
//The two lines of code below are quite useful when debugging Configuration; see reference[3].
//System.out.println(conf.getRaw("fs.default.name"));
//System.out.println(conf.toString());
FileSystem fs = FileSystem.get(conf);
FSDataInputStream in = fs.open(new Path(fs.getWorkingDirectory() + "/input/data"));
BufferedReader br = new BufferedReader(new InputStreamReader(in));
FSDataOutputStream out = fs.create(new Path(fs.getWorkingDirectory() + "/testInput/copyData.txt"));
String str = br.readLine();
while(str!=null)
{
out.writeBytes(str);
out.writeBytes("\n");
str = br.readLine();
}
out.close();
br.close();
}
}

The copy operation above can also be done through IOUtils, for example:

import java.io.*;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.IOUtils;

public class ShortestPath {

    public static void main(String[] args) throws Exception
    {
{
Configuration conf = new Configuration();
conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
//System.out.println(conf.getRaw("fs.default.name"));
//System.out.println(conf.toString());
FileSystem fs = FileSystem.get(conf);
FSDataInputStream in = fs.open(new Path(fs.getWorkingDirectory() + "/input/data"));
FSDataOutputStream out = fs.create(new Path(fs.getWorkingDirectory() + "/testInput/copyData.txt"));
IOUtils.copyBytes(in, out, conf);
in.close();
out.close();
}
}

The line conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml")); above puzzled me. I had always assumed that Configuration automatically loads default files such as core-site.xml in its constructor, but apparently not. What is more, after calling Configuration's toString(), it shows that multiple core-site.xml files were loaded, which confuses me even more. As a newbie I am not familiar with the configuration files; if anyone knows the details, please explain.

System.out.println(conf.toString());

Other file operations, such as deletion, work much the same way; see reference[1,2].

Reference
(1) Hadoop: The Definitive Guide [PDF] http://www.linuxidc.com/Linux/2012-01/51182.htm
(2) http://eclipse.sys-con.com/node/1287801/mobile
(3) http://www.opensourceconnections.com/2013/03/24/hdfs-debugging-wrong-fs-expected-file-exception/
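As a rough sketch of what those other operations look like, the following demo exercises mkdirs, exists, listStatus, and delete on a hypothetical directory (the class name and paths here are my own illustrative choices, not from the references), assuming the same Configuration setup as in the examples above:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FsOperations {

    public static void main(String[] args) throws Exception
    {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical example path under the working directory.
        Path dir = new Path(fs.getWorkingDirectory() + "/testInput");

        // Create the directory (returns true if it exists afterwards).
        fs.mkdirs(dir);

        // Check for existence before acting on a path.
        if (fs.exists(dir)) {
            // List the directory's contents.
            for (FileStatus status : fs.listStatus(dir)) {
                System.out.println(status.getPath());
            }
        }

        // Delete recursively; the second argument enables recursion
        // so non-empty directories can be removed.
        fs.delete(dir, true);
    }
}
```

Note that delete(Path, boolean) with recursion disabled will throw an exception on a non-empty directory, so the boolean matters when cleaning up directories rather than single files.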