Fixing a NameNode That Won't Start in a Pseudo-Distributed Hadoop Setup on macOS


Reference: MAC下搭建Hadoop运行环境 (白程序员的自习室, CSDN blog). Following that post to set up Hadoop on my Mac, I ran into a NameNode that refused to start. It cost me a whole afternoon to figure out, so I'm writing down the fix here.

I configured everything following the steps in that post; see it for the finer details. My configuration is below.

Hadoop version: 3.2.1

JDK: 1.8
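For context, everything below assumes JAVA_HOME and HADOOP_HOME are exported in the shell. A minimal sketch (the Hadoop install path is illustrative; point it at wherever you unpacked 3.2.1):

# e.g. in ~/.zshrc -- paths are illustrative
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
export HADOOP_HOME=/usr/local/hadoop-3.2.1   # adjust to your install location
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin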


I. My configuration files

1. core-site.xml

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/Users/tangrenxin/clusterDataDirs/hadoop</value>
        <description>A base for other temporary directories.</description>
    </property>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
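A side note on this file: fs.default.name is the deprecated 1.x spelling of fs.defaultFS; Hadoop 3.x still honors it but warns about it. To confirm what the running configuration actually resolves to (assuming $HADOOP_HOME/bin is on the PATH):

hdfs getconf -confKey fs.defaultFS
# expected: hdfs://localhost:9000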

2. hdfs-site.xml

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <!-- where the NameNode keeps the filesystem image -->
        <name>dfs.namenode.name.dir</name>
        <value>/Users/tangrenxin/clusterDataDirs/hadoop/dfs/name</value>
    </property>
    <property>
        <!-- where the DataNode keeps its blocks -->
        <name>dfs.datanode.data.dir</name>
        <value>/Users/tangrenxin/clusterDataDirs/hadoop/dfs/data</value>
    </property>
</configuration>
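Neither directory needs to exist up front (formatting creates the name dir and the DataNode creates its own data dir), but pre-creating them surfaces permission problems early; a harmless preparatory step:

# pre-create the storage directories configured above
mkdir -p /Users/tangrenxin/clusterDataDirs/hadoop/dfs/name \
         /Users/tangrenxin/clusterDataDirs/hadoop/dfs/data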

3. mapred-site.xml

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

4. yarn-site.xml

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.resourcemanager.address</name>
        <value>localhost:9000</value>
    </property>
</configuration>
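One thing worth flagging in this file: yarn.resourcemanager.address reuses port 9000, which fs.default.name already claims for the NameNode RPC endpoint (YARN's own default is 8032), so the two daemons will fight over the port once both are up. On macOS you can check which process currently owns a port:

# show the process (if any) listening on TCP port 9000
lsof -nP -iTCP:9000 -sTCP:LISTEN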

The configuration above follows the referenced post step by step. Now for the problem I ran into.

Starting the NameNode: run ./start-dfs.sh (the namenode format step is skipped here).
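For reference, the format step being skipped is the standard one-time initialization of dfs.namenode.name.dir:

hdfs namenode -format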

➜  sbin ./start-dfs.sh
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [https://account.jetbrains.com:443]
sed: 1: "s/^/https://account.jet ...": bad flag in substitute command: '/'
2021-12-12 17:51:23,150 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
➜  sbin jps
5175 DataNode
5343 Jps

As you can see, this script starts the NameNode, the DataNode, and the secondary NameNode.
Note this line: Starting secondary namenodes [https://account.jetbrains.com:443]

Our configuration contains nothing about secondary namenodes, so Hadoop filled in a default, which somehow came out as https://account.jetbrains.com:443. And jps shows that neither the NameNode nor the secondary NameNode actually came up. First thought: something is missing from the config.
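A diagnostic trick worth knowing here: when a daemon dies right after start-dfs.sh, you can run it in the foreground so the fatal error prints straight to the terminal instead of hiding in a log file:

# run the NameNode in the foreground; Ctrl-C to stop
hdfs namenode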

Judging from the output alone, the secondary namenodes [https://account.jetbrains.com:443] part is clearly broken, so let's first give the secondary NameNode an explicit address in hdfs-site.xml:

    <property>
        <name>dfs.secondary.http.address</name>
        <value>localhost:50090</value>
    </property>
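Strictly speaking, dfs.secondary.http.address is the legacy 1.x name of this key; in Hadoop 3.x it is dfs.namenode.secondary.http-address (default 0.0.0.0:9868), and the old spelling is translated automatically. Either way, you can verify what address will take effect:

hdfs getconf -confKey dfs.namenode.secondary.http-address
# expected after this change: localhost:50090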

After adding it, reformat the NameNode and run ./start-dfs.sh again.

This time the SecondaryNameNode comes up fine.

But the NameNode is still down, and http://localhost:50070 cannot be reached.

Check the NameNode's startup log:
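The NameNode log lives under $HADOOP_HOME/logs, in a file named after the user and host, so the exact file name varies per machine:

# pattern: hadoop-<user>-namenode-<hostname>.log
tail -n 100 $HADOOP_HOME/logs/hadoop-$(whoami)-namenode-$(hostname).log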

************************************************************/
2021-12-12 19:44:34,510 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2021-12-12 19:44:34,631 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2021-12-12 19:44:34,848 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2021-12-12 19:44:35,014 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2021-12-12 19:44:35,014 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
2021-12-12 19:44:35,068 INFO org.apache.hadoop.hdfs.server.namenode.NameNodeUtils: fs.defaultFS is hdfs://127.0.0.1:9000
2021-12-12 19:44:35,069 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients should use 127.0.0.1:9000 to access this namenode/service.
2021-12-12 19:44:35,109 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2021-12-12 19:44:35,289 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2021-12-12 19:44:35,331 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://https://account.jetbrains.com:443:9870
2021-12-12 19:44:35,352 INFO org.eclipse.jetty.util.log: Logging initialized @1458ms
2021-12-12 19:44:35,494 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2021-12-12 19:44:35,506 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
2021-12-12 19:44:35,521 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2021-12-12 19:44:35,526 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
2021-12-12 19:44:35,526 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2021-12-12 19:44:35,526 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2021-12-12 19:44:35,562 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
2021-12-12 19:44:35,562 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2021-12-12 19:44:35,606 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
java.net.SocketException: Unresolved address
	at sun.nio.ch.Net.translateToSocketException(Net.java:131)
	at sun.nio.ch.Net.translateException(Net.java:157)
	at sun.nio.ch.Net.translateException(Net.java:163)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:76)
	at org.eclipse.jetty.server.ServerConnector.openAcceptChannel(ServerConnector.java:351)
	at org.eclipse.jetty.server.ServerConnector.open(ServerConnector.java:319)
	at org.apache.hadoop.http.HttpServer2.bindListener(HttpServer2.java:1205)
	at org.apache.hadoop.http.HttpServer2.bindForSinglePort(HttpServer2.java:1236)
	at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:1299)
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1154)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:181)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:885)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:707)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:953)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:926)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1692)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
Caused by: java.nio.channels.UnresolvedAddressException
	at sun.nio.ch.Net.checkAddress(Net.java:101)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	... 13 more
2021-12-12 19:44:35,608 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping NameNode metrics system...
2021-12-12 19:44:35,608 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system stopped.
2021-12-12 19:44:35,608 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
2021-12-12 19:44:35,608 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
java.net.SocketException: Unresolved address
	at sun.nio.ch.Net.translateToSocketException(Net.java:131)
	at sun.nio.ch.Net.translateException(Net.java:157)
	at sun.nio.ch.Net.translateException(Net.java:163)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:76)
	at org.eclipse.jetty.server.ServerConnector.openAcceptChannel(ServerConnector.java:351)
	at org.eclipse.jetty.server.ServerConnector.open(ServerConnector.java:319)
	at org.apache.hadoop.http.HttpServer2.bindListener(HttpServer2.java:1205)
	at org.apache.hadoop.http.HttpServer2.bindForSinglePort(HttpServer2.java:1236)
	at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:1299)
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1154)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:181)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:885)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:707)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:953)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:926)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1692)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
Caused by: java.nio.channels.UnresolvedAddressException
	at sun.nio.ch.Net.checkAddress(Net.java:101)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	... 13 more
2021-12-12 19:44:35,610 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: java.net.SocketException: Unresolved address
2021-12-12 19:44:35,612 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at MacBook-Pro-5.local/10.231.96.38

The cause, straight from the error log: java.net.SocketException: Unresolved address

So which address is it that cannot be resolved?

Look closely at this line: Starting Web-server for hdfs at: http://https://account.jetbrains.com:443:9870

That address is plainly bogus: http://https://account.jetbrains.com:443:9870 — two URL schemes glued together plus a stray port. (Hadoop logs this crucial clue at INFO level; I skimmed past it and paid for it with a whole afternoon.)
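Where does that string come from? My best guess at the mechanism: Hadoop's HTTP addresses default to 0.0.0.0, and on this machine /etc/hosts apparently carries an entry mapping 0.0.0.0 to that pseudo-hostname (such lines are commonly left behind by JetBrains license-blocking snippets), so Hadoop's hostname resolution picks up the garbage and later fails on it. Easy to check:

# look for a hijacked 0.0.0.0 entry in the hosts file
grep -n "account.jetbrains.com" /etc/hosts
grep -n "^0\.0\.0\.0" /etc/hosts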

Hadoop offers two ways to reach HDFS over HTTP: WebHDFS and HttpFS.

From "Starting Web-server for hdfs at: http://https://account.jetbrains.com:443:9870" we can tell the web server's address is missing from the config too, so let's add it as well. (The :9870 is the Hadoop 3.x default web-UI port; here I pin it to the 2.x-style 50070 instead:)

    <property>
        <name>dfs.namenode.http-address</name>
        <value>localhost:50070</value>
    </property>

After adding that, reformat the NameNode once more and run ./start-dfs.sh.
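Spelled out, the reformat-and-restart cycle looks like this. Wiping the old name and data dirs first avoids the classic DataNode clusterID-mismatch error after a reformat; obviously only do this on a playground cluster whose data is disposable:

./stop-dfs.sh
rm -rf /Users/tangrenxin/clusterDataDirs/hadoop/dfs/name/* \
       /Users/tangrenxin/clusterDataDirs/hadoop/dfs/data/*
hdfs namenode -format
./start-dfs.sh
jps   # NameNode, DataNode and SecondaryNameNode should all show up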

Finally, it works!

And visiting http://localhost:50070 in the browser now works as well.
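If you would rather check from the terminal than the browser, a one-line probe is enough:

curl -sI http://localhost:50070 | head -n 1
# expected: HTTP/1.1 200 OK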

Problem solved!
