
Hadoop standalone deployment, cluster deployment, and remote configuration from a local Eclipse on Windows 7 (Part 3)

Source: collected from the web. Date: 2019-06-05

Create a new test class, WordCountTest. (The original log.info() messages were lost in extraction; the strings below are reconstructed placeholders.)

package com.hadoop.learn.test;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import org.apache.log4j.Logger;

/**
 * Run the test program.
 *
 * @author yongboy
 * @date 2012-04-16
 */
public class WordCountTest {

    private static final Logger log = Logger.getLogger(WordCountTest.class);

    public static class TokenizerMapper extends
            Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            log.info("map key : " + key);
            log.info("map value : " + value);
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                String wordStr = itr.nextToken();
                word.set(wordStr);
                log.info("map word : " + wordStr);
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer extends
            Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            log.info("reduce key : " + key);
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            log.info("reduce sum : " + sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args)
                .getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: wordcount <in> <out>");
            System.exit(2);
        }

        Job job = new Job(conf, "word count test");
        job.setJarByClass(WordCountTest.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
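The mapper above splits each input line with java.util.StringTokenizer, which breaks on whitespace (spaces, tabs, newlines) by default. A minimal standalone sketch of that tokenization step:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

// Standalone illustration of the whitespace tokenization that
// TokenizerMapper applies to every input line.
public class TokenizeDemo {
    static List<String> tokenize(String line) {
        StringTokenizer itr = new StringTokenizer(line);
        List<String> words = new ArrayList<String>();
        while (itr.hasMoreTokens()) {
            words.add(itr.nextToken());
        }
        return words;
    }

    public static void main(String[] args) {
        // Prints [hello, hadoop, hello, world]
        System.out.println(tokenize("hello  hadoop\thello world"));
    }
}
```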

Right-click the class and choose "Run Configurations". In the dialog that opens, click the "Arguments" tab and enter the following in the "Program arguments" field:

hdfs://192.168.153.129:9000/user/hadoop/test1 hdfs://192.168.153.129:9000/user/hadoop/output3

In the "VM arguments" field, enter: -DHADOOP_USER_NAME=hadoop
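-DHADOOP_USER_NAME=hadoop is an ordinary JVM system property; the Hadoop client consults HADOOP_USER_NAME when deciding which user to act as against HDFS. A small sketch (plain Java, no Hadoop dependency; the fallback value here is only for illustration) of how such a VM argument becomes visible inside the JVM:

```java
// Sketch: a -DHADOOP_USER_NAME=... VM argument arrives as a system property.
// The Hadoop client reads this property to pick the HDFS user; here we
// only read it back, with an illustrative fallback when it is unset.
public class HadoopUserNameDemo {
    static String effectiveUser(String fallback) {
        String user = System.getProperty("HADOOP_USER_NAME");
        return (user != null) ? user : fallback;
    }

    public static void main(String[] args) {
        System.out.println("HDFS user would be: " + effectiveUser("Administrator"));
    }
}
```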

(If the VM argument is not set, job submission may fail with a "Permission denied" error. If your Hadoop account is named hadooptest, or some other name xxx, set the argument to -DHADOOP_USER_NAME=hadooptest or -DHADOOP_USER_NAME=xxx accordingly.

An alternative workaround found online is to disable permission checking by adding the following to the hdfs-site.xml configuration file:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
)

Note: these settings are for local debugging only, not for a production environment.

Then click "Apply" and "Close". Now you can right-click the class, choose "Run on Hadoop", and run the job. At this point, however, an exception like the following appears:

12/04/24 15:32:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/04/24 15:32:44 ERROR security.UserGroupInformation: PriviledgedActionException as:Administrator cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
    at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
    at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:655)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
    at com.hadoop.learn.test.WordCountTest.main(WordCountTest.java:85)

This is a Windows file-permission problem; the same program runs fine on Linux, where the issue does not occur. The fix is to edit checkReturnValue in /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java and comment out its body (somewhat crude, but on Windows the check can simply be skipped).
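The change described above can be sketched as follows. This is a trimmed, standalone reproduction of the checkReturnValue logic, not the real FileUtil source (the real method also reports the permission bits it failed to set); it shows what commenting out the body suppresses:

```java
import java.io.File;
import java.io.IOException;

// Standalone sketch of the checkReturnValue(...) logic in
// hadoop-1.0.2 src/core/org/apache/hadoop/fs/FileUtil.java (simplified).
public class CheckReturnValueSketch {
    static void checkReturnValue(boolean rv, File p) throws IOException {
        // Commenting out this check, as the article suggests, makes the
        // failed chmod on the Windows staging directory non-fatal.
        if (!rv) {
            throw new IOException("Failed to set permissions of path: " + p);
        }
    }

    public static void main(String[] args) {
        try {
            // A false return value reproduces the exception from the log above.
            checkReturnValue(false, new File("\\tmp\\hadoop-Administrator"));
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```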
