Oh my god! Yours are the only instructions I followed that actually worked!!! Appreciate that, bro!
Glad you were able to follow the instructions and get it working!
Words cannot express how grateful I am for this video! Please keep up the good work.
Can you help me... I am stuck in between. I didn't understand where to copy the keypair.
This is a wonderful video. I completed the whole task even with zero background in Hadoop. Good job, Sir. Keep up the good work.
Very nice
Thanks
Thank you!!
It's only showing one live datanode. Please help.
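A common cause, guessing without your logs: if "hdfs namenode -format" was re-run after the DataNodes had already started once, their stored clusterID no longer matches the NameNode's and they fail to register. A sketch of the usual fix, assuming the DataNode storage directory from hdfs-site.xml is /usr/local/hadoop/data/dataNode (adjust to your own path):
$HADOOP_HOME/sbin/stop-dfs.sh                 # on the NameNode
rm -rf /usr/local/hadoop/data/dataNode/*      # on each missing DataNode; wipes its stale storage
$HADOOP_HOME/sbin/start-dfs.sh                # on the NameNode
Also check that every worker hostname is listed in $HADOOP_HOME/etc/hadoop/workers.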
What type of job roles can we search for with these Hadoop-with-AWS skills?
Hey! First of all, thanks for your content, it's very valuable :).
I just completed the tutorial and successfully ran the cluster; however, I haven't had success launching the web UI. How do I get to the web UI from my local machine? Thanks.
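The NameNode web UI listens on port 9870 in Hadoop 3.x (50070 in Hadoop 2.x). Open that port in the NameNode instance's security group, then point your local browser at it; the DNS below is a placeholder:
http://<namenode-public-dns>:9870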
Hi, after I added the variables to ~/.bashrc and ran the command
source ~/.bashrc
I couldn't run any command except cd; the rest of the commands can't be found by Ubuntu. Any advice?
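A likely cause, an assumption since the exact .bashrc lines aren't shown: PATH was overwritten instead of appended to (e.g. export PATH=$HADOOP_HOME/bin), which drops /usr/bin and /bin from the search path, so only shell builtins like cd keep working. The line should append:
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin   # append, don't replace
To edit the file from the broken shell, call the editor by its absolute path, then reload:
/usr/bin/nano ~/.bashrc
source ~/.bashrc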
@DineshVarma I am unable to copy the private key from the NameNode to all the other nodes. I am getting a permission denied error. Please guide me.
I had the same issue; I tried removing ".pub" and updating the file with chmod 600, but it still isn't working.
sudo chmod 400 pem-file.pem
Please run the above command before copying the private key from the NameNode. I got the same error and was able to resolve it this way.
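For reference, a minimal sketch of the sequence, run from the NameNode with placeholder names (use your own key file and DataNode addresses):
chmod 400 pem-file.pem                                            # ssh/scp refuse keys with open permissions
scp -i pem-file.pem pem-file.pem ubuntu@<datanode-public-dns>:~/  # copy the key to each DataNode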
If we do a Hadoop single-node installation on a single machine in VMware, then how does it create the other nodes like the NameNode and Secondary NameNode? I mean to ask: how are 3 nodes created on a single machine?
And likewise for a multi-node installation: how does it create the NameNode, Secondary NameNode, and multiple DataNodes in a single VM?
Could you please answer this question? 🙏
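They aren't separate machines at all. In a single-node (pseudo-distributed) setup, the NameNode, DataNode, and Secondary NameNode are just separate Java daemon processes running on the same host; a multi-node cluster simply spreads those same daemons across different machines. After starting HDFS you can list them with jps (the PIDs below are illustrative):
jps
2481 NameNode
2690 DataNode
2912 SecondaryNameNode
3050 Jps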
I get stuck at the last part, the Hadoop daemon startup. I successfully issue the command "hdfs namenode -format", but when I issue "$HADOOP_HOME/sbin/start-dfs.sh" and it tries to start the namenode, it gives an "access denied: public key" error; then it tries to start the datanode with the same error. Please help. I've repeated everything 3 times and the error continues.
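That error usually means the NameNode can't SSH into the workers (and itself) without a password, which start-dfs.sh requires. A sketch of the usual fix, assuming the default ubuntu user and key path:
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa          # on the NameNode; skip if the key already exists
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys   # authorize it for localhost
Then append the same ~/.ssh/id_rsa.pub to ~/.ssh/authorized_keys on every DataNode and verify:
ssh ubuntu@<datanode-private-dns> hostname        # should print the hostname with no password prompt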
Hi, how do we handle the public DNS that changes every time we stop and start the nodes?
You need to allocate an Elastic IP to each one of your instances before doing the Hadoop setup.
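For example, with the AWS CLI (the instance and allocation IDs below are placeholders):
aws ec2 allocate-address --domain vpc
aws ec2 associate-address --instance-id i-0123456789abcdef0 --allocation-id eipalloc-0abc12345def67890
An Elastic IP stays attached across stop/start cycles, so the addresses in your Hadoop config remain valid.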
When I stopped and started the AWS Ubuntu machines, their IP addresses changed, so Hadoop does not know their new IP addresses. What is the solution for this situation?
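As noted above, Elastic IPs are the clean fix. Without them, after every restart you would have to refresh the address mappings on each node by hand, e.g. (hostnames and IPs are placeholders, assuming Hadoop 3.x):
sudo nano /etc/hosts                    # update lines like: 172.31.x.x  namenode
nano $HADOOP_HOME/etc/hadoop/workers    # confirm the worker hostnames still match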
Bro, how did you connect to Ubuntu at 9:30?
I used the hadoop.pem file as a key to connect to Ubuntu. You can check the entire command I used to connect.
Can you please share that command clearly?
@arjunvenkat3329 scp -i Hadoop.pem
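For anyone following along, the general form with placeholder names would be:
ssh -i hadoop.pem ubuntu@<ec2-public-dns>                   # log in to the instance
scp -i hadoop.pem <local-file> ubuntu@<ec2-public-dns>:~/   # copy a file to it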