
Hadoop Can't access datanode without using the IP


Finally, I found the solution to this problem.

The steps are:

1- Use the hostname tag in the docker-compose file for all the services, as @smart_coder suggested in a comment:

hostname: datanode
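
For reference, here is a minimal sketch of where that key sits in a docker-compose.yml; the image name and port are illustrative assumptions, not necessarily what the original setup uses:

services:
  datanode:
    image: bde2020/hadoop-datanode   # illustrative image name, substitute your own
    hostname: datanode               # the hostname tag from step 1
    ports:
      - "9864:9864"                  # assumes Hadoop 3's default datanode web port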

2- Edit (in Linux) the /etc/hosts file and add the IP that routes to my service (in this case I needed to map the hostname 'datanode' to its IP), so I added this line to /etc/hosts:

192.168.56.1 datanode

(This is a real IPv4 address. Adding 10.0.1.21, the Docker IP created in my docker-compose file, also works in Linux, but I'm not sure it would work when accessing from Windows.) With this second step we let the system resolve the name 'datanode' to the IP 192.168.56.1, and this works (only) inside my Linux guest.

But please remember from my first comment that I have mapped my Windows IP (192.168.56.1) to my Docker (Linux) IP (10.0.1.21), so if you are using only Linux, you can use the IP defined in your docker-compose file and it will work.
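
If you prefer the command line, a sketch like this appends the entry and checks that the name now resolves (the IP and hostname are the ones from my setup; substitute your own):

echo "192.168.56.1 datanode" | sudo tee -a /etc/hosts
ping -c 1 datanode   # should now reach 192.168.56.1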

3- Edit (in Windows) the hosts file by following these steps:

  • Press the Windows key
  • Type Notepad
  • Right-click -> Run as administrator
  • From Notepad, open the file C:\Windows\System32\Drivers\etc\hosts (C: is my hard drive, so the path may differ if your disk uses another letter).
  • I added:

192.168.56.1 datanode

  • Save

This third step lets the Windows host resolve the name 'datanode' to the IP 192.168.56.1. After these steps I am able to download the files both from my Linux guest (which runs inside VirtualBox) and from my Windows host.
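
To verify the mapping from Windows, you can ping the hostname or fetch the datanode web UI with curl (the port is an assumption: 9864 is Hadoop 3's default datanode HTTP port, while Hadoop 2 uses 50075):

ping datanode
curl http://datanode:9864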