Question:

HBase cannot create its directory in HDFS

魏臻
2023-03-14

Found 7 items
drwxr-xr-x   - hbase users          0 2014-06-25 18:58 /hbase/.tmp

...

But when I run this command, I get /hbase: No such file or directory.
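The listing command was presumably the standard HDFS one (a guess based on the output format above; the original post does not show it):

hdfs dfs -ls /hbase
# lists the contents of /hbase, or fails with "No such file or directory"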

core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hdfs-site.xml

<configuration>
   <property>
      <name>dfs.replication</name>
      <value>1</value>
   </property>

   <property>
      <name>dfs.name.dir</name>
      <value>file:///home/marc/hadoopinfra/hdfs/namenode</value>
   </property>

   <property>
      <name>dfs.data.dir</name>
      <value>file:///home/marc/hadoopinfra/hdfs/datanode</value>
   </property>
</configuration>

mapred-site.xml

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

yarn-site.xml

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.env-whitelist</name>
        <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
    </property>
</configuration>

HBase configuration: hbase-site.xml

<configuration>
   <property>
      <name>hbase.rootdir</name>
      <value>hdfs://localhost:8030/hbase</value>
   </property>
   <property>
      <name>hbase.zookeeper.property.dataDir</name>
      <value>/home/marc/zookeeper</value>
   </property>
   <property>
      <name>hbase.cluster.distributed</name>
      <value>true</value>
   </property>
</configuration>
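
Note that hbase.rootdir here points at hdfs://localhost:8030 while fs.defaultFS in core-site.xml is hdfs://localhost:9000; the two are generally expected to match. One way to check which address HDFS clients actually use (a sketch using the stock hdfs CLI, not part of the original post):

hdfs getconf -confKey fs.defaultFS
# prints the configured NameNode URI, e.g. hdfs://localhost:9000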

I can browse http://localhost:50070 and http://localhost:8088/cluster.

In hbase-marc-master-marc-pc.log, I see the following exception. Is it related?

2017-07-01 20:31:59,349 FATAL [marc-pc:16000.activeMasterManager] master.HMaster: Failed to become active master
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN]
    at org.apache.hadoop.ipc.Client.call(Client.java:1411)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy15.setSafeMode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy15.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:602)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:279)
    at com.sun.proxy.$Proxy16.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2264)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:986)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:970)
    at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:525)
    at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:971)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:429)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:153)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:128)
    at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:693)
    at org.apache.hadoop.hbase.master.HMaster.access$600(HMaster.java:189)
    at org.apache.hadoop.hbase.master.HMaster$2.run(HMaster.java:1803)
    at java.lang.Thread.run(Thread.java:748)
2017-07-01 20:31:59,351 FATAL [marc-pc:16000.activeMasterManager] master.HMaster: Unhandled exception. Starting shutdown.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN]
    at org.apache.hadoop.ipc.Client.call(Client.java:1411)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy15.setSafeMode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy15.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:602)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:279)
    at com.sun.proxy.$Proxy16.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2264)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:986)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:970)
    at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:525)
    at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:971)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:429)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:153)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:128)
    at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:693)
    at org.apache.hadoop.hbase.master.HMaster.access$600(HMaster.java:189)
    at org.apache.hadoop.hbase.master.HMaster$2.run(HMaster.java:1803)
    at java.lang.Thread.run(Thread.java:748)

1 Answer

郑晨
2023-03-14

The log indicates that HBase had a problem becoming the active master, and therefore started shutting down.

My assumption is that HBase never managed to start correctly, so it could not create the /hbase directory on its own. That is also why the /hbase directory is still empty.
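
To see these startup failures directly, one can search the master log for FATAL entries (a minimal sketch; the log file name is the one quoted in the question):

grep -B 1 -A 2 FATAL hbase-marc-master-marc-pc.log
# shows the "Failed to become active master" lines with surrounding context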

I reproduced your error on my virtual machine and fixed it with the modified setup below.

Virtualization software: Vagrant and VirtualBox

Java

java -version
openjdk version "1.8.0_131"
OpenJDK Runtime Environment (build 1.8.0_131-b12)
OpenJDK 64-Bit Server VM (build 25.131-b12, mixed mode)

core-site.xml (HDFS)

<configuration>
   <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:8020</value>
   </property>
</configuration>

hbase-site.xml (HBase)

<configuration>
   <!-- Standalone-mode rootdir; because the same property appears again at the
        end of this file, the later hdfs:// value is the one that takes effect. -->
   <property>
      <name>hbase.rootdir</name>
      <value>file:/home/hadoop/HBase/HFiles</value>
   </property>

   <property>
      <name>hbase.zookeeper.property.dataDir</name>
      <value>/home/hadoop/zookeeper</value>
   </property>
   <property>
      <name>hbase.cluster.distributed</name>
      <value>true</value>
   </property>
   <property>
      <name>hbase.rootdir</name>
      <value>hdfs://localhost:8020/hbase</value>
   </property>
</configuration>
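
After editing these files, the daemons have to be restarted so the new hbase.rootdir takes effect. A typical restart order, assuming the standard Hadoop and HBase control scripts are on the PATH (the original answer does not spell this out):

stop-hbase.sh      # stop HBase first; it depends on HDFS
stop-yarn.sh
stop-dfs.sh
start-dfs.sh       # bring HDFS back up
start-yarn.sh
start-hbase.sh     # on startup the master creates /hbase under the new rootdir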
sudo su # Become root user
cd /usr/local/

# Hand the Hadoop and HBase installation directories to the hadoop user
# and make them world-readable/executable
chown -R hadoop:root hadoop
chmod -R 755 hadoop

chown -R hadoop:root Hbase
chmod -R 755 Hbase
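
A quick way to confirm the ownership changes took effect (plain coreutils, added here for illustration):

ls -ld /usr/local/hadoop /usr/local/Hbase
# both entries should now show owner hadoop and group root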
[hadoop@localhost conf]$ hdfs dfs -ls /hbase
Found 7 items
drwxr-xr-x   - hadoop supergroup          0 2017-07-03 14:26 /hbase/.tmp
drwxr-xr-x   - hadoop supergroup          0 2017-07-03 14:26 /hbase/MasterProcWALs
drwxr-xr-x   - hadoop supergroup          0 2017-07-03 14:26 /hbase/WALs
drwxr-xr-x   - hadoop supergroup          0 2017-07-03 14:26 /hbase/data
-rw-r--r--   1 hadoop supergroup         42 2017-07-03 14:26 /hbase/hbase.id
-rw-r--r--   1 hadoop supergroup          7 2017-07-03 14:26 /hbase/hbase.version
drwxr-xr-x   - hadoop supergroup          0 2017-07-03 14:26 /hbase/oldWALs
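
With the directories in place, the master should stay up; this can be double-checked from the HBase shell (a sketch, not part of the original answer):

echo "status" | hbase shell
# a healthy deployment reports an active master and at least one region server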