Hadoop pseudo distributed mode - Datanode and tasktracker not starting


I am running Red Hat Enterprise Linux Server release 6.4 (Santiago) with Hadoop 1.1.2 installed. I have made the required configuration changes to enable pseudo-distributed mode, but when I try to run Hadoop, the datanode and tasktracker do not start.

I am not able to copy any files to HDFS.

[hduser@is-joshbloom-hadoop hadoop]$ hadoop dfs -put README.txt /input
Warning: $HADOOP_HOME is deprecated.

13/05/23 16:42:00 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /input could only be replicated to 0 nodes, instead of 1

Also, after running hadoop-daemon.sh start datanode, I get this message:

starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-is-joshbloom-hadoop.out
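
The .out file only contains the launch banner; the stack trace that explains why the daemon dies ends up in the matching .log file in the same logs directory (a sketch; the .log name is assumed from the .out path above):

tail -n 50 /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-is-joshbloom-hadoop.log

That is the log quoted in the update below.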

The same goes for the tasktracker. But when I try the same command for the namenode, secondarynamenode, and jobtracker, they seem to be running:

namenode running as process 32933. Stop it first. 

I tried the following solutions:

  1. Reformatting the namenode (commands sketched after this list)
  2. Reinstalling Hadoop
  3. Installing a different version of Hadoop (1.0.4)
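
For reference, the reformat in step 1 amounts to the usual Hadoop 1.x cycle below (a sketch assuming the stock scripts are on the PATH; not necessarily the exact invocation used here):

stop-all.sh
hadoop namenode -format        # answer Y at the re-format prompt
start-all.sh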

None of them worked. I followed the same installation steps on my Mac and on an Amazon Ubuntu VM, and it works perfectly there.

How can I get Hadoop working? Thanks!

UPDATE:

Here is the log from the datanode:

2013-05-23 16:27:44,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = java.net.UnknownHostException: is-joshbloom-hadoop: is-joshbloom-hadoop
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.1.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1440782; compiled by 'hortonfo' on Thu Jan 31 02:03:24 UTC 2013
************************************************************/
2013-05-23 16:27:44,382 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2013-05-23 16:27:44,432 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
2013-05-23 16:27:44,446 ERROR org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Error getting localhost name. Using 'localhost'...
java.net.UnknownHostException: is-joshbloom-hadoop: is-joshbloom-hadoop
        at java.net.InetAddress.getLocalHost(InetAddress.java:1438)
        at     org.apache.hadoop.metrics2.impl.MetricsSystemImpl.getHostname(MetricsSystemImpl.java:463)
        at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configureSystem(MetricsSystemImpl.java:394)
        at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:390)
        at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:152)
        at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:133)
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:40)
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:50)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1589)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
Caused by: java.net.UnknownHostException: is-joshbloom-hadoop
        at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:866)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1258)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1434)
        ... 11 more
2013-05-23 16:27:44,453 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled     snapshot period at 10 second(s).
2013-05-23 16:27:44,453 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode     metrics system started
2013-05-23 16:27:44,768 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
2013-05-23 16:27:44,914 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
2013-05-23 16:27:45,212 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:     java.net.UnknownHostException: is-joshbloom-hadoop: is-joshbloom-hadoop
        at java.net.InetAddress.getLocalHost(InetAddress.java:1438)
        at org.apache.hadoop.security.SecurityUtil.getLocalHostName(SecurityUtil.java:271)
        at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:289)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:301)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
Caused by: java.net.UnknownHostException: is-joshbloom-hadoop
        at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:866)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1258)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1434)
        ... 8 more

2013-05-23 16:27:45,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at java.net.UnknownHostException: is-joshbloom-hadoop: is-joshbloom-hadoop
************************************************************/

UPDATE 2:

Content of /etc/hosts:

127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
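
With this file the machine's own hostname does not resolve, which matches the UnknownHostException in the log above (a sketch of the check; the hostname is taken from that log):

getent hosts is-joshbloom-hadoop     # prints nothing - no mapping exists
hostname -f                          # fails for the same reason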

Source: https://stackoverflow.com/questions/16725804

Accepted answer

The stack trace in the update gives the real cause: java.net.UnknownHostException: is-joshbloom-hadoop. During startup the datanode (and likewise the tasktracker) looks up the machine's own hostname via InetAddress.getLocalHost(); is-joshbloom-hadoop cannot be resolved, because it appears neither in DNS nor in /etc/hosts (the file you posted only maps localhost), so the daemon logs the exception and shuts down. That is also why hadoop dfs -put fails with "could only be replicated to 0 nodes, instead of 1": no datanode ever registers with the namenode.

Add the hostname to /etc/hosts (or have it registered in DNS) and restart the datanode and tasktracker; a sketch of the change follows.
