
Hadoop file upload error: org.apache.hadoop.ipc.RemoteException(java.io.IOException)

Views: 90 | Published: 2023-12-22 15:26:42

Upload error: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /b.txt could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

Checking the cluster's status showed that the only datanode had registered itself as Hostname: localhost:

[hadoop@hecs-x-large-2-linux-20200331210616 hadoop]$ hdfs dfsadmin -report
20/05/19 10:41:09 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Configured Capacity: 42140401664 (39.25 GB)
Present Capacity: 30362079232 (28.28 GB)
DFS Remaining: 30362030080 (28.28 GB)
DFS Used: 49152 (48 KB)
DFS Used%: 0.00%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 1

-------------------------------------------------
Live datanodes (1):

Name: 192.168.0.106:50010 (hadoop)
Hostname: localhost
Decommission Status : Normal
Configured Capacity: 42140401664 (39.25 GB)
DFS Used: 49152 (48 KB)
Non DFS Used: 11778322432 (10.97 GB)
DFS Remaining: 30362030080 (28.28 GB)
DFS Used%: 0.00%
DFS Remaining%: 72.05%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 1
Last contact: Tue May 19 10:41:09 CST 2020

Check the hostname:

[hadoop@hecs-x-large-2-linux-20200331210616 ~]$ hostname
localhost

Change the hostname (set this way it only lasts until reboot; persist it in /etc/hostname or /etc/sysconfig/network as well):

[hadoop@hecs-x-large-2-linux-20200331210616 ~]$ hostname hadoop

Verify:

[hadoop@hecs-x-large-2-linux-20200331210616 ~]$ hostname
hadoop

Restart Hadoop:

stop-all.sh

start-all.sh

Check the report again; the Hostname field has changed to hadoop:

[hadoop@hecs-x-large-2-linux-20200331210616 ~]$ hdfs dfsadmin -report
20/05/19 11:12:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Configured Capacity: 42140401664 (39.25 GB)
Present Capacity: 30367408143 (28.28 GB)
DFS Remaining: 30367383552 (28.28 GB)
DFS Used: 24591 (24.01 KB)
DFS Used%: 0.00%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 1

-------------------------------------------------
Live datanodes (1):

Name: 192.168.0.106:50010 (hadoop)
Hostname: hadoop
Decommission Status : Normal
Configured Capacity: 42140401664 (39.25 GB)
DFS Used: 24591 (24.01 KB)
Non DFS Used: 11772993521 (10.96 GB)
DFS Remaining: 30367383552 (28.28 GB)
DFS Used%: 0.00%
DFS Remaining%: 72.06%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 1
Last contact: Tue May 19 11:12:37 CST 2020
After that, uploading a file from IDEA succeeded, and the file could also be downloaded from the web UI. The earlier failure happened because the datanode registered itself as localhost: when the local IDEA client received that name from the NameNode, it resolved it to its own loopback interface rather than to the remote machine, and editing the local hosts file alone could not fix that. The client can only reach the datanode by the remote Hadoop host's real name, so remember to add this entry to the local hosts file:

the Hadoop server's IP  hadoop
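The root cause is easy to reproduce with the plain JDK, without any Hadoop dependency: the name localhost always resolves to the caller's own loopback interface, so a remote client that is handed "localhost" by the NameNode ends up connecting to itself instead of the datanode. A minimal sketch (the class name `ResolveCheck` is mine, chosen for illustration):

```java
import java.net.InetAddress;

public class ResolveCheck {
    public static void main(String[] args) throws Exception {
        // "localhost" resolves to the caller's OWN loopback interface --
        // which is why the IDEA client, told "localhost" by the NameNode,
        // tried to write the block to itself and excluded the real datanode.
        InetAddress addr = InetAddress.getByName("localhost");
        System.out.println("localhost -> " + addr.getHostAddress()
                + ", loopback = " + addr.isLoopbackAddress());
    }
}
```

With the datanode registered under the real hostname hadoop and a matching entry in the client's hosts file, the same lookup returns the remote server's address instead of a loopback one.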

The full Hadoop configuration is as follows:

core-site.xml :

    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop:8020</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/app/tmp</value>
    </property>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.client.use.datanode.hostname</name>
        <value>true</value>
        <description>only needs to be set on the client side</description>
    </property>
</configuration>
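Note that dfs.client.use.datanode.hostname takes effect on whichever side reads it, so to help a remote client it must also sit on the client's classpath — for an IDEA/Maven project, a sketch of a client-side hdfs-site.xml placed under src/main/resources (that path is the usual Maven convention; adjust to your project layout):

```xml
<configuration>
    <!-- client side: connect to datanodes by hostname, not by the IP
         the NameNode reports (often an internal/private address) -->
    <property>
        <name>dfs.client.use.datanode.hostname</name>
        <value>true</value>
    </property>
</configuration>
```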

hadoop-env.sh:

export JAVA_HOME=${JAVA_HOME}
# the explicit path below overrides the stock passthrough above:
export JAVA_HOME=/home/hadoop/app/jdk1.8.0_221

slaves:

hadoop

/etc/hosts:

the internal (private) IP  hadoop
the Hadoop server's IP  hadoop

~/.bash_profile:

export PATH
export JAVA_HOME=/root/app/jdk1.8.0_221
export PATH=$JAVA_HOME/bin:$PATH
export HADOOP_HOME=/root/app/hadoop-2.6.0-cdh5.7.0
export PATH=$HADOOP_HOME/bin:$PATH
export PATH=$HADOOP_HOME/sbin:$PATH
