Question:

Hadoop 2.2.0 64-bit installation, but it won't start

严柏
2023-03-14

I am trying to install a Hadoop 2.2.0 cluster on our servers. All of the servers are 64-bit. I downloaded Hadoop 2.2.0 and all the configuration files have been set up. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Apart from the 64-bit issue, are there any other errors? I have already set up passwordless login between the namenode and the datanodes. What do the other errors mean?

3 answers in total

宗政子琪
2023-03-14

You can also export the variables in hadoop-env.sh:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

/usr/local/hadoop is my Hadoop installation directory.

#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64 # your jdk install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
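hadoop-env.sh is sourced by the daemon start scripts, so the exports take effect the next time the daemons are started. A minimal check that the NativeCodeLoader warning is gone, assuming the sbin directory is on PATH as exported above:

# Restart HDFS and watch the console output for the NativeCodeLoader warning.
stop-dfs.sh
start-dfs.sh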
阎安邦
2023-03-14

The root cause is that the native library bundled with Hadoop is built for 32-bit. Solutions:

1) Set the relevant environment variables in your .bash_profile. See https://gist.github.com/ruo91/7154697, or

2) Rebuild the Hadoop native library; see http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html
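For option 2, a rough sketch of the steps. The file check uses the library path from the log in the question; the build assumes the prerequisites listed on the NativeLibraries page (JDK, Maven, protobuf 2.5.0, cmake, zlib and openssl dev headers) are installed and that the source tarball hadoop-2.2.0-src.tar.gz has been downloaded:

# Optional sanity check: confirm the shipped native library really is 32-bit.
file /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
# "ELF 32-bit LSB shared object, Intel 80386, ..." confirms the mismatch;
# a 64-bit build would report "ELF 64-bit LSB shared object, x86-64, ...".

# Rebuild the native libraries from source.
tar -xzf hadoop-2.2.0-src.tar.gz
cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar
# The 64-bit libraries are produced under hadoop-dist/target/hadoop-2.2.0/lib/native;
# copy them over the 32-bit ones in your installation (adjust the destination path).
cp hadoop-dist/target/hadoop-2.2.0/lib/native/* /home/hchen/hadoop-2.2.0/lib/native/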

羊舌诚
2023-03-14

Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
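These exports only affect new shells; to apply them to the current session, reload the file (a small sketch, assuming your login shell is bash):

# Reload .bashrc so the exported variables take effect in the current shell.
source ~/.bashrc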

In addition, run the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
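If ssh still prompts for a password afterwards, permissions on ~/.ssh are the usual culprit. A quick check, using the namenode host name that appears in the log above:

# sshd ignores authorized_keys if it is group- or world-writable.
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
# This should print the remote hostname without asking for a password.
ssh namenode hostname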