
Hadoop Study Notes (3)

2018-07-25  by 总有人被感动

Running a MapReduce Program with Eclipse and Maven

Installing the MapReduce plugin in Eclipse
1. Download the MapReduce plugin: hadoop2x-eclipse-plugin-master.zip
2. Unzip it and copy hadoop-eclipse-plugin-2.6.0.jar from the release folder into the plugins folder of the Eclipse installation directory.
3. Restart Eclipse, and the MapReduce plugin will be installed.

Configuring a MapReduce project in Eclipse
1. Set up a MapReduce Location


hdfsDemo1.png

New Hadoop Location


HDFSDemo4.png

2. Configure the HDFS port (pseudo-distributed mode)


hdfsDemo2.png

3. Once configured, the files in HDFS appear in the MapReduce tab on the left.


HdfsDemo3.png
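The DFS Master port entered in step 2 has to match the fs.defaultFS setting in Hadoop's core-site.xml. A typical pseudo-distributed configuration looks like the fragment below (localhost and port 9000 are an assumed example; use whatever your installation actually configures):

```xml
<!-- core-site.xml (pseudo-distributed; host/port shown are an assumed example) -->
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```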

Running the MapReduce program in Eclipse

Configuring Maven

Add the Hadoop dependencies to the project's pom.xml:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.5.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.5.2</version>
    <exclusions>
                <exclusion>
                    <groupId>tomcat</groupId>
                    <artifactId>jasper-compiler</artifactId>
                </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.5.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.5.2</version>
</dependency>

WordCount example

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCount extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        int result = ToolRunner.run(new Configuration(), new WordCount(), args);
        System.exit(result);
    }

    public int run(String[] args) throws Exception {
        Path inputPath, outputPath;
        if (args.length == 2) {
            inputPath = new Path(args[0]);
            outputPath = new Path(args[1]);
        } else {
            System.err.println("usage: <input> <output>");
            return 1;
        }
        Configuration conf = getConf();
        Job job = Job.getInstance(conf, "word count");

        job.setJarByClass(WordCount.class);
        job.setMapperClass(WordCountMapper.class);
        job.setReducerClass(WordCountReducer.class);

        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, inputPath);
        FileOutputFormat.setOutputPath(job, outputPath);

        return job.waitForCompletion(true) ? 0 : 1;
    }

    // Mapper: split each input line on whitespace and emit (word, 1).
    public static class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer: sum the counts for each word and emit (word, total).
    public static class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }
}
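The map/reduce logic above can be checked without a cluster: tokenize on whitespace (as StringTokenizer does in the mapper) and sum the per-word counts (as the reducer does). A minimal plain-Java sketch of that pipeline (LocalWordCount is a hypothetical name for illustration, not part of the Hadoop job):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.StringTokenizer;

// Hypothetical local sketch of the WordCount pipeline: the "map" step
// tokenizes each line on whitespace, the "reduce" step sums the counts.
public class LocalWordCount {
    public static Map<String, Integer> count(String text) {
        Map<String, Integer> counts = new HashMap<>();
        StringTokenizer itr = new StringTokenizer(text); // same tokenizer as the mapper
        while (itr.hasMoreTokens()) {
            counts.merge(itr.nextToken(), 1, Integer::sum); // reducer-style sum per word
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count("hello world hello hadoop"));
    }
}
```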

Then open Run Configurations and set the two program arguments: the input file path and the output path. After the job runs, the expected output appears in the console.


hdfsdemo5.png