hadoop: port appears open locally but not remotely
Posted by miguel on Server Fault
Published on 2012-10-01T23:47:10Z
I am new to Linux and Hadoop, and I am having the same issue as in this question. I think I understand what is causing it, but I don't know how to solve it (I don't know what they mean by "Edit the Hadoop server's configuration file so that it includes its NIC's address"). The other post they link to says that the configuration files should refer to the machine's externally accessible host name. I think I have this right, as every Hadoop configuration file refers to "master" and the /etc/hosts file lists the master by its private IP address. How can I solve this?
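To illustrate the symptom being described (this is a minimal sketch of the underlying socket behaviour, not Hadoop itself): when a hostname resolves to 127.0.0.1, a daemon that binds to that name listens only on the loopback interface, so the port is reachable locally but refused from other machines.

```python
import socket

# Bind a listener to the loopback address only, mimicking what happens
# when /etc/hosts maps the server's own hostname to 127.0.0.1: the
# daemon resolves its name to loopback and never listens on the NIC.
loopback = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loopback.bind(("127.0.0.1", 0))  # port 0 = pick an ephemeral port
loopback.listen(1)
port = loopback.getsockname()[1]

# A connection via 127.0.0.1 succeeds...
local = socket.create_connection(("127.0.0.1", port), timeout=1)
print("loopback connect: ok")
local.close()

# ...but connecting to the machine's non-loopback (NIC) address would
# fail with ConnectionRefusedError -- exactly the "open locally but
# not remotely" symptom.
loopback.close()
```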
Edit: I have 5 nodes: master, slavec, slaved, slavee and slavef, all running Debian. This is the hosts file in master:
127.0.0.1 master
10.0.1.201 slavec
10.0.1.202 slaved
10.0.1.203 slavee
10.0.1.204 slavef
This is the hosts file in slavec (it looks similar in the other slaves):
10.0.1.200 master
127.0.0.1 slavec
10.0.1.202 slaved
10.0.1.203 slavee
10.0.1.204 slavef
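Based on the linked answers, one reading (an assumption on my part, not a verified fix) is that master's hosts file should map its own hostname to the NIC address rather than loopback, matching how the slaves already resolve it. That would look something like:

```
127.0.0.1  localhost
10.0.1.200 master
10.0.1.201 slavec
10.0.1.202 slaved
10.0.1.203 slavee
10.0.1.204 slavef
```

Here 10.0.1.200 is taken from the slaves' hosts files, where it is listed as master's address.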
The masters file in master:
master
The slaves file in master:
master
slavec
slaved
slavee
slavef
The masters and slaves files in each slavex have only one line: slavex