Multinode Hadoop installation in AWS | Hadoop Cluster Setup

  • Published on Sep 5, 2024
  • Hi Everyone!
    Here is the link to the Medium blog where I posted the step-by-step process for the multi-node Hadoop installation. The commands I used in this video can also be found in that article.
    / setting-up-multi-node-...
    Below are the other playlists from my channel, if you are interested:
    Big Data Playlist - • Big Data Hadoop
    AWS Playlist - • AWS
    MinIO Playlist - • MinIO
    Python Leetcode Solutions - • Python LeetCode Solutions
    Interesting Java videos playlist - • Java
    Medium Blog - / dineshvarma.guduru

Comments • 22

  • @tickmagnet7155 2 years ago +8

    Words cannot express how grateful I am for this video! Please keep up the good work.

    • @gandepallilavanya8317 2 years ago

      Can you help me? I am stuck in between. I didn't understand where to copy the keypair.

  • @drshahidqamar 2 years ago +2

    This is a wonderful video. I completed the whole task even with zero background in Hadoop. Good job, sir. Keep up the good work.

  • @ahamadshekh2736 2 years ago +2

    Very nice

  • @shot_freeze 5 months ago

    What type of job descriptions can we search for with this Hadoop-on-AWS skill set?

  • @MuhammadKhankhan 2 years ago +1

    It's only showing one live DataNode. Please help.
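
    A generic first step for debugging this (not from the video; it assumes the cluster is up and you are on the NameNode):

    # Show which DataNodes the NameNode considers live or dead
    hdfs dfsadmin -report

    # In Hadoop 3.x, every worker hostname must be listed here
    cat $HADOOP_HOME/etc/hadoop/workers

    # On a worker missing from the report: is the DataNode JVM running?
    jps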

  • @berkayates6254 5 months ago

    When I stopped and started the AWS Ubuntu machines, their IP addresses changed, so Hadoop does not know their new IP addresses. What is the solution for this situation?

  • @balasubramaniyanjayachandr1186 2 years ago

    Thank you!!

  • @arjunshrivastva9093 1 year ago

    If we are installing Hadoop as a single-node installation on one machine in VMware, how does it create the other nodes like the NameNode and Secondary NameNode? I mean to ask, how are three nodes created on a single machine?
    And the same in the case of a multi-node installation: how are the NameNode, Secondary NameNode, and multiple DataNodes created on a single VM?
    Could you please answer this question? 🙏
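
    For context, each Hadoop "node" here is just a separate JVM daemon process, so one machine or VM can run all of them side by side; a multi-node cluster simply spreads the same daemons across machines. A quick way to see this on a running single-node setup:

    # jps lists running JVM processes; expect to see NameNode, DataNode
    # and SecondaryNameNode as separate entries on one machine
    jps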

  • @ajayviswam5994 2 years ago +1

    @DineshVarma I am unable to copy the private key from the NameNode to all the other nodes; I am getting a permission denied error. Please guide me.

    • @bernadettehoffman231 2 years ago

      I had the same issue; I tried removing ".pub" and updating the file permissions with chmod 600, but it still isn't working.

    • @tharindurandula8355 2 years ago +3

      sudo chmod 400 pem-file.pem
      Please run the above command before copying the private key from the NameNode. I got the same error and was able to resolve it this way.
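
      Building on this reply, a minimal sketch of the whole copy sequence (the key file name and DataNode address are placeholders, not the exact values from the video):

      # Tighten the key's permissions first; ssh/scp refuse world-readable keys
      chmod 400 hadoop.pem

      # Copy the key to a worker node (the default user on Ubuntu AMIs is "ubuntu")
      scp -i hadoop.pem hadoop.pem ubuntu@<datanode-public-dns>:~/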

  • @franciscobelenguervicente4364 1 year ago

    Hey! First of all, thanks for your content; it's very valuable :).
    I just completed the tutorial and successfully ran the cluster; however, I haven't had success launching the web UI. How do I get to the web UI from my local machine? Thanks
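
    Two common ways to reach the NameNode web UI, sketched under the assumption of a default Hadoop 3.x setup (9870 is the default NameNode UI port; the key file and user are placeholders):

    # Option 1: open TCP port 9870 in the EC2 security group, then browse to
    #   http://<namenode-public-dns>:9870

    # Option 2: forward the port over SSH and browse to http://localhost:9870
    ssh -i hadoop.pem -L 9870:localhost:9870 ubuntu@<namenode-public-dns>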

  • @mohammedradman2756 9 months ago

    Hi, after I added the variables to ~/.bashrc and ran the command
    source ~/.bashrc
    I couldn't run any other command except cd; the rest of the commands can't be found by Ubuntu. Any advice?
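
    That symptom usually means PATH was overwritten rather than extended: cd still works because it is a shell builtin, while external commands need PATH. A sketch of the usual fix in ~/.bashrc (the Hadoop path is an example; use your actual install location):

    # Likely culprit: a line like this REPLACES the existing PATH
    #   export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin

    # Fix: append to PATH instead of replacing it
    export HADOOP_HOME=/home/ubuntu/hadoop
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

    # To rescue the current broken shell session without closing it:
    export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin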

  • @antoniooriol5823 2 years ago

    I get stuck at the last part, the Hadoop daemon startup. I can successfully issue the command "hdfs namenode -format", but when I issue "$HADOOP_HOME/sbin/start-dfs.sh", it tries to start the NameNode and gives the error "access denied: public key"; then it tries to start the DataNode with the same error. Please help. I've repeated everything 3 times and the error continues.
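
    start-dfs.sh logs in to every node (including localhost) over SSH, so this error usually means passwordless SSH is not set up for the user running Hadoop. A minimal sketch, assuming the same user name on all nodes:

    # On the NameNode: generate a key pair without a passphrase (skip if one exists)
    ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

    # Authorize it locally too, since start-dfs.sh also SSHes into localhost
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

    # Append ~/.ssh/id_rsa.pub to ~/.ssh/authorized_keys on every DataNode, then
    # verify: this must log in without prompting for a password
    ssh ubuntu@<datanode-dns> hostname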

  • @PaOne24T 1 year ago

    Hi, how do we handle the public DNS that changes every time we stop and start the nodes?

    • @brunocarvalho3229 1 year ago

      You need to allocate an Elastic IP to each one of your instances before doing the Hadoop setup.
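
      If you prefer doing that from the AWS CLI, a rough sketch (the instance ID and allocation ID are placeholders; repeat for each node — an Elastic IP keeps its address across stop/start):

      # Allocate a new Elastic IP in the VPC; note the AllocationId it prints
      aws ec2 allocate-address --domain vpc

      # Bind the Elastic IP to an instance
      aws ec2 associate-address --instance-id i-0123456789abcdef0 --allocation-id eipalloc-xxxxxxxx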

  • @arjunvenkat3329 2 years ago +1

    Bro, how did you connect to Ubuntu at 9:30?

    • @atomicengineering 2 years ago

      I used the hadoop.pem file as the key to connect to Ubuntu. You can check the entire command I used to connect in the video.

    • @arjunvenkat3329 2 years ago

      Can you please state that command clearly?

    • @atomicengineering 2 years ago

      @arjunvenkat3329 scp -i Hadoop.pem
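
      The reply above is truncated. For reference, a typical login to an EC2 Ubuntu instance with a .pem key looks like the sketch below; the user name and DNS placeholder are assumptions based on a standard Ubuntu AMI, not values from the video:

      # Interactive login uses ssh with the key file
      ssh -i Hadoop.pem ubuntu@<instance-public-dns>

      # Copying a file instead uses scp with the same key
      scp -i Hadoop.pem somefile ubuntu@<instance-public-dns>:~/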