Question:

Hadoop installation problem:

公西翊歌
2023-03-14

I followed this tutorial to install Hadoop. Unfortunately, when I run the start-all.sh script, the following errors are printed on the console:

hduser@dennis-HP:/usr/local/hadoop/sbin$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
hadoop config script is run...
hdfs script is run...
Config parameter : 
16/04/10 23:45:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
0.0.0.0: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
0.0.0.0: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out’ for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
16/04/10 23:45:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
yarn script is run...
starting yarn daemons
mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out
/usr/local/hadoop/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
head: cannot open ‘/usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out’ for reading: No such file or directory
/usr/local/hadoop/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
/usr/local/hadoop/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory

When I run the jps command, I get only the following output:

hduser@dennis-HP:/usr/local/hadoop/sbin$ jps
3802 Jps

I am new to Hadoop, so please point me to an article that will help me install Hadoop without running into these problems.

Or, if it is possible (and preferable) to fix the problem I am facing, please tell me what went wrong and how to resolve it.

3 answers in total

田鸿彩
2023-03-14

Check the permissions. If the Hadoop directory is not owned by hduser, change its ownership:

sudo chown -R username:group directory
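For example, assuming hduser is the user that runs Hadoop and hadoop is its group (both are assumptions here; adjust them to your own setup), the command for the installation directory from the question would be:

sudo chown -R hduser:hadoop /usr/local/hadoop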

钮博裕
2023-03-14

Please check whether the permissions on the folder are set correctly, using the chmod or chown commands.

Hadoop also lets you start and stop each service on a single node individually, i.e. hadoop-daemon.sh start [node].

Similarly, there are scripts to start/stop YARN. The post below has step-by-step details for installing Apache Hadoop: http://www.hadoopstrata.com/staticpost?postNbr=7
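As a rough sketch (assuming the Hadoop 2.x scripts under /usr/local/hadoop/sbin shown in the question), the daemons can be started one at a time like this:

hadoop-daemon.sh start namenode
hadoop-daemon.sh start datanode
hadoop-daemon.sh start secondarynamenode
yarn-daemon.sh start resourcemanager
yarn-daemon.sh start nodemanager

Note that these will hit the same permission error on /usr/local/hadoop/logs until the ownership or mode of that directory is fixed.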

凌华奥
2023-03-14

The current user has only limited permissions on /usr/local/hadoop. Try changing the permissions:

sudo chmod -R 777 /usr/local/hadoop/
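Once the permission problem is fixed (chown -R to hduser is a less drastic alternative to a blanket chmod 777), you can re-run the start scripts and verify with jps; assuming the same single-node layout as in the question, roughly:

start-dfs.sh
start-yarn.sh
jps

If the fix worked, jps should list NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager in addition to Jps.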
