Easy Hadoop Installation on Windows 10/11 - Step-by-Step Guide | Kundan Kumar

Share
Embed
  • Published Jan 4, 2025

Comments • 289

  • @jtrickzz6959
    @jtrickzz6959 7 months ago +3

    Watched a lot of videos, but yours worked the best. You even dealt with the Program Files space error; many videos don't even cover that. Awesome work

    • @kundankumar011
      @kundankumar011  7 months ago

      Glad to know that this video was so helpful to you ❤️. Thank you for appreciating it. Please subscribe to my channel and encourage me.

  • @mauricioocampo8989
    @mauricioocampo8989 9 months ago +2

    Excellent, I followed the steps and it worked very well for me. Thank you so much

    • @kundankumar011
      @kundankumar011  9 months ago

      Pleasure to know that you followed the steps and it worked fine. ❤️ Please encourage me by subscribing to my channel, dear.

  • @Oracle.mp4
    @Oracle.mp4 7 months ago +1

    I have been trying to install Hadoop for a week. Finally it works now. Thank you so much, sir, for this video 🥺🩷🩷

    • @kundankumar011
      @kundankumar011  7 months ago +1

      I am happy to know that this video helped you install Hadoop successfully ❤️. Please subscribe to my channel if you haven't yet and encourage me

    • @kundankumar011
      @kundankumar011  7 months ago

      Thank you so much for subscribing to my channel 💖💖💖

  • @sivashankar47
    @sivashankar47 3 months ago +1

    Thank you, followed the steps and it worked as expected.

    • @kundankumar011
      @kundankumar011  3 months ago

      @@sivashankar47 Glad to know that it is working for you as expected ❤️ Keep learning and enjoy. Please subscribe to my channel to encourage me and so you don't miss notifications about upcoming relevant videos.

  • @SaiNirmalReddyGavini-k2o
    @SaiNirmalReddyGavini-k2o 6 months ago +1

    Awesome. After many tutorials I couldn't do it, but now I made it in one go. Thank you, sir.

    • @kundankumar011
      @kundankumar011  6 months ago +1

      @@SaiNirmalReddyGavini-k2o I am glad this tutorial was helpful for you as well ❤️. You're welcome, dear Nirmal Reddy. I hope you have subscribed to my channel

  • @kids_toon9
    @kids_toon9 5 months ago +1

    After wasting a whole day, I found this video ❤️ It helped me with the installation. Thank you 💯

    • @kundankumar011
      @kundankumar011  5 months ago

      @@kids_toon9 Glad to know this video helped you too ❤️ Please subscribe to my channel to encourage me, dear

  • @AjayKumar-yz8rq
    @AjayKumar-yz8rq 1 month ago +1

    Super, Hats off to You SIR

    • @kundankumar011
      @kundankumar011  1 month ago +1

      Thank you so much for the appreciation, dear Ajay. I hope you didn't forget to subscribe to my channel to encourage me

  • @amazighkab9904
    @amazighkab9904 2 months ago +1

    Thank you bro !! this is the best video that explains how to install HADOOP

    • @kundankumar011
      @kundankumar011  2 months ago +1

      Thank you so much for the appreciation, dear 💗. It motivates me. I hope you subscribed to my channel and pressed the bell 🔔 icon so that you don't miss notifications about upcoming relevant videos

  • @yercoarancibia3366
    @yercoarancibia3366 2 months ago +1

    This video is the most complete one I've seen on YouTube. Excellent video and explanation; you did a great job, and it is the best tutorial

    • @kundankumar011
      @kundankumar011  2 months ago

      @@yercoarancibia3366 Thank you so much for appreciating this Hadoop installation tutorial. It motivates me. I hope you subscribed to my channel dear.

  • @momotarodadumpling4065
    @momotarodadumpling4065 25 days ago +1

    thank you. awesome video !!! best video to setup hadoop

    • @kundankumar011
      @kundankumar011  24 days ago

      The pleasure is all mine ❤️. Glad to know this video was awesome for you as well. I hope you didn't forget to subscribe to my channel to encourage me and to catch upcoming video notifications.

  • @EvrythingSimple
    @EvrythingSimple 9 months ago +1

    Thank You So Much... Great Video For Installing Hadoop... ❤❤

    • @kundankumar011
      @kundankumar011  9 months ago

      @EvrythingSimple Great to know that this was helpful for you, and thank you so much for appreciating this video. Please subscribe to my channel and encourage me.

  • @MohammedSalman-nb2pu
    @MohammedSalman-nb2pu 4 months ago +1

    Yes, the video really is very useful. After watching several different videos, only here did I get the correct answer

    • @kundankumar011
      @kundankumar011  4 months ago

      @@MohammedSalman-nb2pu I am glad this video was very helpful for you and that you got your answers here ❤️. Please subscribe to my channel to encourage me and to get notifications for upcoming relevant videos.

  • @mahran34
    @mahran34 6 months ago +1

    well done, Kumar ...
    it was so useful.

    • @kundankumar011
      @kundankumar011  6 months ago +1

      Glad to know that this video was helpful to you as well ❤️ Thank you so much for subscribing to my channel. 🙌

  • @mayureebawane1938
    @mayureebawane1938 7 months ago +1

    Thank you so much. You explain everything in detail, which helped me a lot.

    • @kundankumar011
      @kundankumar011  7 months ago

      I'm glad to know that this was very helpful to you ❤️ If you have not yet subscribed to my channel, please subscribe and encourage me, dear. 🙏

  • @PrabhatSingh-tf6km
    @PrabhatSingh-tf6km 5 months ago +2

    thank you so much sir your video really helped me install hadoop successfully

    • @kundankumar011
      @kundankumar011  5 months ago

      @@PrabhatSingh-tf6km Glad to know this video helped you install Hadoop successfully as well ❤️ Please subscribe to my channel to encourage me if you haven't yet 🙌

    • @PrabhatSingh-tf6km
      @PrabhatSingh-tf6km 5 months ago +1

      @@kundankumar011 already did sir 👍

    • @kundankumar011
      @kundankumar011  5 months ago

      @@PrabhatSingh-tf6km pleasure ❤️

  • @shedidahmeni2910
    @shedidahmeni2910 1 year ago +1

    May god bless you sir. It worked just fine

    • @kundankumar011
      @kundankumar011  1 year ago

      Pleasure to know ❤️

    • @kundankumar011
      @kundankumar011  1 year ago

      If you have not yet subscribed to my channel, please do subscribe. Your subscription will motivate me to make more videos ❤️

  • @quanvan684
    @quanvan684 10 months ago +1

    You help me a lot, thank you! Hope you will make more videos like this.

    • @kundankumar011
      @kundankumar011  10 months ago

      Pleasure to know that this video was helpful for you. Yes, for sure, I will keep trying to make such helpful videos. I kindly request you to subscribe to my channel if you haven't yet. ❤️

  • @VidhiSharma-k6r
    @VidhiSharma-k6r 1 month ago +1

    sir thank you so much!!!🙏🙏🙏🙏🙏

    • @kundankumar011
      @kundankumar011  1 month ago

      Pleasure, dear ❤️. Please subscribe to my channel to encourage me

  • @Sanjaysview
    @Sanjaysview 11 months ago +1

    Thank you for the instructions sir❤❤

    • @kundankumar011
      @kundankumar011  11 months ago

      Pleasure to know, dear @Sanjay. If you have not subscribed to my channel, please do subscribe to motivate me ❤️

  • @mohamedmoawad9670
    @mohamedmoawad9670 3 months ago +1

    you saved my life❤

    • @kundankumar011
      @kundankumar011  3 months ago

      @@mohamedmoawad9670 Glad to know that it was very helpful and saved you a lot of time. Enjoy your learning, dear ❤️ I hope you subscribe to my channel to encourage me and to catch notifications about upcoming relevant videos

  • @sethusuresh9196
    @sethusuresh9196 9 months ago

    Very helpful video. Works perfectly. It saved me a lot of time. Thanks a lot

    • @kundankumar011
      @kundankumar011  9 months ago

      Pleasure to know that this video was very helpful to you. Please subscribe to my channel and encourage me ❤️

  • @ASHWINIM-v2q
    @ASHWINIM-v2q 3 months ago +1

    Thanks for the steps

  • @SugunaDinesh-k1g
    @SugunaDinesh-k1g 3 months ago +1

    Thank you, sir, really helpful

    • @kundankumar011
      @kundankumar011  3 months ago

      Glad to know that it was very helpful to you as well. I hope you didn't forget to subscribe so you receive notifications for upcoming relevant videos.

  • @shumbushojackson983
    @shumbushojackson983 1 year ago +2

    This is very wonderful and helpful

  • @AbdullahKhan-pl7py
    @AbdullahKhan-pl7py 7 months ago +1

    Perfect, my man. Keep it up

    • @kundankumar011
      @kundankumar011  7 months ago

      Thank you so much, Abdullah, for motivating me ❤️ I hope you didn't forget to subscribe to my channel.

    • @AbdullahKhan-pl7py
      @AbdullahKhan-pl7py 7 months ago

      @@kundankumar011 Already done. And I was wondering: when I try to upload a dataset into HDFS, I see the error "Couldn't upload the file jfk_weather_cleaned.csv.". Have you made the next video you mentioned at the end of this one, so I could get better insights on it? Regards

    • @kundankumar011
      @kundankumar011  7 months ago

      @@AbdullahKhan-pl7py Dear Abdullah, you can upload the file to HDFS using the command line or the GUI. At 22:50 in the video I just showed an interface for uploading files to HDFS. May I know what the error is?

    • @AbdullahKhan-pl7py
      @AbdullahKhan-pl7py 7 months ago

      @@kundankumar011 "Couldn't upload the file Perecipitationhourly.csv." That is the error that comes up whenever I try to upload it. BTW, thanks for cooperating; your reply is highly appreciated, man.

    • @kundankumar011
      @kundankumar011  7 months ago

      @@AbdullahKhan-pl7py Thank you so much for the appreciation. I will look into this and get back to you, dear.
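The upload error discussed in this thread can usually be sidestepped by loading the file from the command line, since the NameNode web UI's upload button is a WebHDFS CREATE call under the hood. A minimal sketch of how that request URL is formed; the host, port, and target directory here are assumptions based on the tutorial's defaults, not details from the thread:

```python
from urllib.parse import quote

def webhdfs_create_url(host, port, hdfs_path, overwrite=False):
    """Build the WebHDFS CREATE URL the NameNode web UI uses for uploads.

    The NameNode answers with a 307 redirect to a DataNode, and the
    file bytes are then PUT to that redirect location.
    """
    flag = "true" if overwrite else "false"
    return (f"http://{host}:{port}/webhdfs/v1{quote(hdfs_path)}"
            f"?op=CREATE&overwrite={flag}")

# Hypothetical local setup (Hadoop 3.x web UI on port 9870);
# the CSV name comes from the comment above.
url = webhdfs_create_url("localhost", 9870, "/user/data/jfk_weather_cleaned.csv")
print(url)
```

From a Windows command prompt, `hadoop fs -put jfk_weather_cleaned.csv /user/data/` performs the same upload without involving the browser at all.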

  • @wafasalah-369
    @wafasalah-369 10 months ago +1

    Thank you for this video; it helped me a lot

    • @kundankumar011
      @kundankumar011  10 months ago +1

      Dear Salah, it's a pleasure to know that this video was very helpful for you ❤️. If you have not yet subscribed to my channel, please subscribe and encourage me 🙌

    • @wafasalah-369
      @wafasalah-369 10 months ago

      @@kundankumar011 I already did 👌

  • @DashingData66666
    @DashingData66666 1 month ago

    Sir, the "open in the editor" step you did at 8:10 isn't working for me; the file won't open in the editor. What should I do? Please help

    • @kundankumar011
      @kundankumar011  1 month ago

      Sorry to know that you are facing issues. I used the Sublime editor, dear, but you can use any editor, even Notepad. Let me know if you are asking about anything else. Please subscribe to my channel to encourage me, dear

    • @DashingData66666
      @DashingData66666 1 month ago

      @@kundankumar011 Sir, I pinged you on LinkedIn. That issue has been resolved, but here is the thing: my C drive was full, so I did all those steps on the D drive, and I specified all the paths correctly. But when you asked us to download the msvcr120.dll file and copy it into the C drive's System32 folder, it created an error. After this, when I opened CMD as administrator and ran the command hdfs namenode -format, it shows "The system cannot find the path specified". Sir, kindly guide me on what I can do now; I am badly stuck

    • @DashingData66666
      @DashingData66666 1 month ago +1

      @@kundankumar011 Is it mandatory to save that msvcr120.dll file into System32? What will happen if we did all these steps on the D drive instead of the C drive? Sir, please guide me

    • @kundankumar011
      @kundankumar011  1 month ago

      @DashingData66666 It depends on where your System32 is, dear. It should work.

    • @DashingData66666
      @DashingData66666 1 month ago

      @kundankumar011 in C drive

  • @sandip_kanzariya8476
    @sandip_kanzariya8476 5 months ago +1

    Helpful man!

    • @kundankumar011
      @kundankumar011  5 months ago

      Glad to know that it helped you as well. ❤ Please subscribe to my channel if you have not yet

  • @janewang8353
    @janewang8353 1 year ago +1

    Thank you so much. It's very useful for me.

    • @kundankumar011
      @kundankumar011  1 year ago

      Pleasure to know that this video was so useful for you ❤️. If you haven't subscribed to my channel yet, please subscribe and press the bell icon so you don't miss the next videos.

  • @sathishkolla3190
    @sathishkolla3190 1 year ago +1

    Thank you very much; it is a very useful video.
    If possible, could you please make one more video on how to install Apache Spark on top of this Hadoop environment, and how to integrate Hadoop, Hive, and Spark?

    • @kundankumar011
      @kundankumar011  1 year ago

      @Sathish Thank you so much for appreciating this video tutorial and subscribing to my channel ❤️. Let me work on your request for Hadoop, Spark, and Hive integration and upload it as early as possible. 👍

    • @kundankumar011
      @kundankumar011  6 months ago

      Hi dear Satish. Hope you are learning well. I am sorry for the delay making the Hive video you requested long ago; I was quite busy with different projects. But now I have made a video on easy Hive installation for Hadoop. You can find it here th-cam.com/video/CL6t2W8YC1Y/w-d-xo.htmlsi=Gx_jMbUZK6vxy76t

  • @rt7687
    @rt7687 11 months ago +1

    After installing WinRAR, it shows only "Console RAR manual", "What is new in the latest version", and "WinRAR help", but it is not showing the Hadoop file. Help me

    • @kundankumar011
      @kundankumar011  11 months ago +1

      Thank you so much for watching my video; it's always a pleasure to help. Here is the link to download the latest WinRAR, dear: www.win-rar.com/download.html?&L=0 . If you have not yet subscribed to my channel, please subscribe. Your subscription is motivation for me. ❤️

    • @rt7687
      @rt7687 11 months ago +1

      @@kundankumar011 still showing same

    • @kundankumar011
      @kundankumar011  11 months ago

      @@rt7687 After installing WinRAR, extract the Hadoop .tar file; then you will find the extracted Hadoop folder. Follow the instructions in the video on how to extract the hadoop.tar file and the rest.

    • @rt7687
      @rt7687 11 months ago

      @@kundankumar011 thank you

  • @Work-q8g
    @Work-q8g 1 year ago +2

    Sir, for me that port number is not showing Hadoop, but a port in the 8000 range shows the NameNode details. How do I find out which port will display the Hadoop information?

    • @kundankumar011
      @kundankumar011  1 year ago

      Hello dear. Thank you for watching my video and asking a question. Did you try localhost:9870?
      Ports: The port information is indeed specified in the Hadoop configuration files. Here's how to find it:
      In the core-site.xml file, look for the fs.defaultFS property. It specifies the default filesystem name, which includes the hostname and port. For example:
      fs.defaultFS = hdfs://namenode-hostname:9000
      Hope that makes sense, dear. If you like my video, please subscribe to my channel to motivate me ❤️

    • @Work-q8g
      @Work-q8g 1 year ago +1

      @@kundankumar011
      Thanks for replying, sir. I tried with port 9000 also. Is there any way to check which port Hadoop is working on?

    • @Work-q8g
      @Work-q8g 1 year ago +1

      It's working, thanks

    • @kundankumar011
      @kundankumar011  1 year ago

      @@Work-q8g cool. Enjoy learning Hadoop

    • @kundankumar011
      @kundankumar011  1 year ago

      @@Work-q8g Please subscribe to my channel, dear. Your subscription will motivate me ❤️
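The port confusion in this thread comes down to which property each port lives under: fs.defaultFS in core-site.xml is the NameNode RPC port (9000 in the tutorial), while the web UI is dfs.namenode.http-address in hdfs-site.xml (9870 on Hadoop 3.x, 50070 on 2.x). A small sketch of reading the RPC port out of core-site.xml; the XML content here is a made-up example matching the reply above, not the commenter's actual config:

```python
import xml.etree.ElementTree as ET

# Hypothetical core-site.xml content, mirroring the reply above.
CORE_SITE = """
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
"""

def defaultfs_port(xml_text):
    """Return the RPC port declared in fs.defaultFS, or None if absent."""
    root = ET.fromstring(xml_text)
    for prop in root.iter("property"):
        if prop.findtext("name") == "fs.defaultFS":
            value = prop.findtext("value")   # e.g. hdfs://localhost:9000
            return int(value.rsplit(":", 1)[1])
    return None

print(defaultfs_port(CORE_SITE))  # the RPC port, not the web UI port
```

On a live installation, `hdfs getconf -confKey dfs.namenode.http-address` prints the web UI address directly.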

  • @bibingeorge7493
    @bibingeorge7493 1 year ago +1

    Thank you so much sir ji

    • @kundankumar011
      @kundankumar011  1 year ago

      Pleasure, dear ❤️ Please subscribe to my channel to motivate me

  • @ttvgolive3436
    @ttvgolive3436 10 months ago +1

    Sir, under the jps command my DataNode is not running

    • @kundankumar011
      @kundankumar011  10 months ago +1

      Thank you so much for watching my video. I hope you didn't forget to format the namenode and datanode folders; the same video has instructions on how to format them. In addition, if this video really helped you, please subscribe to my channel, dear. Your subscription is my motivation ❤️

    • @SatyamSingh-ll7ib
      @SatyamSingh-ll7ib 10 months ago

      is it running now?

    • @ttvgolive3436
      @ttvgolive3436 10 months ago

      @@SatyamSingh-ll7ib Yeah, it's running smoothly

  • @shivapatankar6912
    @shivapatankar6912 1 year ago +1

    Thank you sir 😊

  • @rabeaahsan2070
    @rabeaahsan2070 1 month ago

    'C:\Program' is not recognized as an internal or external command,
    operable program or batch file.
    hadoop version error

    • @jsam-c1b
      @jsam-c1b 18 days ago

      I am facing the same issue. Were you able to solve it? Please help
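The 'C:\Program' is not recognized error above is the classic space-in-path problem the video's Program Files fix addresses: when JAVA_HOME points under C:\Program Files, Hadoop's unquoted batch scripts break at the space. One common workaround is the 8.3 short name. A sketch of the check; PROGRA~1 is the usual default alias, but it can differ per machine, so treat it as an assumption and confirm it with `dir /x`:

```python
def needs_short_path(java_home):
    """Hadoop's Windows batch scripts choke on unquoted spaces, so a
    JAVA_HOME containing a space needs the 8.3 short form."""
    return " " in java_home

def to_short_path(java_home):
    """Rewrite the common 'C:\\Program Files' prefix to its usual 8.3
    alias. Real systems should confirm the alias with `dir /x C:\\`."""
    return java_home.replace("C:\\Program Files", "C:\\PROGRA~1")

# Hypothetical JDK location under Program Files:
jh = r"C:\Program Files\Java\jdk1.8.0_202"
if needs_short_path(jh):
    print("set JAVA_HOME=" + to_short_path(jh))
```

Alternatively, installing the JDK to a space-free location such as C:\Java avoids the issue entirely.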

  • @claudebuhanga4262
    @claudebuhanga4262 3 months ago +1

    It works well on Win 11; none of the errors occurred

    • @kundankumar011
      @kundankumar011  3 months ago

      Glad to know it's working well dear.

  • @golddroger4352
    @golddroger4352 10 months ago +1

    What size of RAM do you need? Afterwards, can you show us the installation of Kerberos and Warden?

    • @kundankumar011
      @kundankumar011  8 months ago

      I missed your comment earlier, and I apologize for that. I'll make sure to prioritize working on the requested video. Regarding Hadoop, 8GB of RAM is sufficient, but it depends on different factors, like the size of your datasets; upgrading to 16GB would indeed improve performance.

  • @aakashkumar9961
    @aakashkumar9961 1 year ago +2

    Sir, my DataNode is not displaying in jps.
    What should I do?

    • @kundankumar011
      @kundankumar011  1 year ago

      Sorry for the late response, dear Akash; I missed your comment. 1) Check configuration files: verify that your hdfs-site.xml and core-site.xml are correctly set up. Pay attention to the dfs.datanode.data.dir property in hdfs-site.xml, which defines the directory where the DataNode stores its data; ensure the paths are correct and accessible. 2) Make sure your Java version is compatible. 3) Format the namenode before starting the daemons.

  • @Akhileshhgaming
    @Akhileshhgaming 5 months ago +1

    Good morning, sir.
    It is not showing any localhost in my Google, sir. What can I do? Please help, sir

    • @kundankumar011
      @kundankumar011  5 months ago +1

      Type localhost in the address bar of your web browser, dear, not in Google. See in the video how I access it

    • @Akhileshhgaming
      @Akhileshhgaming 5 months ago +1

      @@kundankumar011 Sir, I have tried it in the address bar only; the Hadoop cluster page is working, but 50070 is not working

    • @kundankumar011
      @kundankumar011  5 months ago

      @@Akhileshhgaming Cross-check the hdfs-site.xml file. This content should be there:

      dfs.namenode.http-address
      localhost:50070

      Also check that the core-site.xml file has this:
      fs.defaultFS
      hdfs://localhost:9000
      Let me know if it worked.

  • @jsam-c1b
    @jsam-c1b 18 days ago

    I am facing the following error.
    Please help, sir:
    The system cannot find the path specified.
    Error: JAVA_HOME is incorrectly set.
    Please update C:\hadoop\hadoop-3.4.0\etc\hadoop\hadoop-env.cmd
    '-Dhadoop.security.logger' is not recognized as an internal or external command,
    operable program or batch file.

  • @charlene-brendandjoumba-ik3448
    @charlene-brendandjoumba-ik3448 10 months ago +1

    Thank you

    • @kundankumar011
      @kundankumar011  10 months ago +1

      You're welcome ❤️, and thank you so much for watching this video. Please subscribe to my channel to offer support and encouragement

    • @charlene-brendandjoumba-ik3448
      @charlene-brendandjoumba-ik3448 10 months ago

      Okay @@kundankumar011

  • @Akhileshhgaming
    @Akhileshhgaming 5 months ago +1

    Sir, I got all 5 nodes in jps, but in Google I am not able to run what you have said. What to do, sir? Please help

    • @kundankumar011
      @kundankumar011  5 months ago

      @@Akhileshhgaming I am glad to know that you also managed to configure it successfully ❤️👍. I think you are asking how to open the Hadoop GUI; see the video description, where I have shared the URL to access it, dear. If I misunderstood you, let me know

    • @Akhileshhgaming
      @Akhileshhgaming 5 months ago +1

      There you have written "daemons", sir. How do I start them?

    • @kundankumar011
      @kundankumar011  5 months ago

      @@Akhileshhgaming I have shown it in the video, dear. Let me share it here: open a command prompt as administrator, then run the command start-all.cmd

    • @Akhileshhgaming
      @Akhileshhgaming 5 months ago +1

      @@kundankumar011 I have done that, sir: start-cmd.all

    • @Akhileshhgaming
      @Akhileshhgaming 5 months ago +1

      What should I do next, sir?

  • @utukurutharunkumarreddy9994
    @utukurutharunkumarreddy9994 10 months ago +2

    JPS is not working

    • @kundankumar011
      @kundankumar011  10 months ago

      Thank you for watching this video. I hope you have configured the files properly. Afterward, try using start-all.cmd to start all the daemons. If you are still encountering issues, please provide more information so I can assist you more effectively. Please subscribe to my channel to motivate me.

  • @VIJAYVARDHAN_V_PATIL.
    @VIJAYVARDHAN_V_PATIL. 5 months ago +1

    Which editor did you use for the hadoop-env dragging step? Name, please?

    • @kundankumar011
      @kundankumar011  5 months ago

      @@VIJAYVARDHAN_V_PATIL. Thank you so much for watching this video; I hope it has been helpful for you so far. I used the Sublime editor, but feel free to use any editor of your choice, dear. Please subscribe to my channel and encourage me 💖

  • @vulcaneir2758
    @vulcaneir2758 1 month ago

    What does it mean if the data node is not showing at 22:03?

    • @kundankumar011
      @kundankumar011  29 days ago

      Thank you so much for watching my video; hope it was helpful, and sorry for the late reply. Please cross-check that the DataNode is configured correctly in Hadoop's configuration files:
      1) core-site.xml: check that the fs.defaultFS property points to the correct NameNode address.
      2) hdfs-site.xml: verify that the directories for DataNode storage are set and accessible.
      Let me know if this guide helped you. In addition, I hope you didn't fail to subscribe to my channel to encourage me.

  • @MichaelReynard-o2y
    @MichaelReynard-o2y 6 months ago +1

    Sir, will this tutorial work for Hadoop 2.7?

    • @kundankumar011
      @kundankumar011  6 months ago

      Thank you so much, dear, for reaching out to clear your doubt about the Hadoop installation. We follow the same steps for installing any version of Hadoop, so this video should help you configure it. If not, let me know what issues occur.

  • @DeepakPrajapat-o1r
    @DeepakPrajapat-o1r 1 year ago +1

    For me, the yarn manager and namenode manager are not running 😞

    • @kundankumar011
      @kundankumar011  1 year ago

      Hi dear. Thank you for watching my video. I hope you didn't forget to format the namenode and datanode using the command line?

  • @heydarnaje6036
    @heydarnaje6036 1 year ago +1

    Thank you from the heart for your beautiful explanation

    • @kundankumar011
      @kundankumar011  1 year ago +1

      Pleasure to know, Heydar, that you like my explanations 💖. Please subscribe to my channel if you haven't yet.

    • @kundankumar011
      @kundankumar011  1 year ago

      If you have not yet subscribed to my channel, please do subscribe, dear. ❤️

  • @vivian8667
    @vivian8667 1 month ago

    Sir, after running start-all.cmd, it said it is not recognized as an internal or external command, operable program or batch file. How do I fix this? 😢

    • @vivian8667
      @vivian8667 1 month ago

      After navigating to C:\hadoop\sbin, it works, but from C:\Windows\System32 it doesn't work…

    • @kundankumar011
      @kundankumar011  1 month ago +1

      Sorry to know that you are facing issues. From your error, it's clear that the environment variables/Path are not set correctly. Check whether you made a mistake while setting the Path, dear; it is shown in this video

    • @kundankumar011
      @kundankumar011  1 month ago +1

      Please subscribe to my channel, dear, to encourage me ❤️

    • @vivian8667
      @vivian8667 1 month ago +1

      @@kundankumar011 thank you

    • @kundankumar011
      @kundankumar011  1 month ago

      @@vivian8667 pleasure
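The symptom in this thread (start-all.cmd works inside C:\hadoop\sbin but nowhere else) means %HADOOP_HOME%\sbin is missing from the Path variable, so cmd.exe can't resolve the script from other directories. A tiny sketch of that check; the C:\hadoop install location is the tutorial's, and the Path value shown is a hypothetical example:

```python
def missing_from_path(path_var, required_dir):
    """True if required_dir is not one of the entries in a Windows Path
    value (compared case-insensitively, as Windows treats paths)."""
    entries = [p.strip().rstrip("\\").lower()
               for p in path_var.split(";") if p.strip()]
    return required_dir.rstrip("\\").lower() not in entries

# Hypothetical Path value that is missing the sbin entry:
path = r"C:\Windows\System32;C:\hadoop\bin"
print(missing_from_path(path, r"C:\hadoop\sbin"))  # True -> start-all.cmd won't resolve
```

The fix is to add both %HADOOP_HOME%\bin and %HADOOP_HOME%\sbin to Path in the Environment Variables dialog, then open a fresh command prompt.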

  • @vocalistrohu7844
    @vocalistrohu7844 11 months ago +1

    When I click on winutils, I get the error "the application was unable to start correctly". Please help

    • @kundankumar011
      @kundankumar011  11 months ago

      Hi dear. Thank you for watching this video; I hope it was helpful. If I understand you correctly, try downloading the winutils file from the resources given in my video description and paste it into your Hadoop configuration as guided in the video, then see if you still get the error. ❤️ If you have not subscribed to my channel, please subscribe, like, and share. Your subscription will motivate me more 🙏

  • @bhavanakunta9260
    @bhavanakunta9260 5 days ago

    Does Java 23 support Hadoop 3.4.0?

    • @kundankumar011
      @kundankumar011  4 days ago

      Thank you so much for watching this video. To my knowledge, Java 8 is the most compatible and stable, dear. Let me know if Java 23 is working fine for you. I trust you didn't forget to subscribe to my channel to encourage me

  • @DeepaDeepa-yb6bw
    @DeepaDeepa-yb6bw 1 year ago +1

    When I run the jps command, it's not showing the daemons

    • @kundankumar011
      @kundankumar011  1 year ago

      Hello dear. Make sure you have started Hadoop daemons.

    • @DeepaDeepa-yb6bw
      @DeepaDeepa-yb6bw 1 year ago +1

      @@kundankumar011 Yes, I started them

    • @kundankumar011
      @kundankumar011  1 year ago

      @@DeepaDeepa-yb6bw If you started the daemons and they still aren't showing, check the following:
      1) Check configuration: ensure that your Hadoop configuration files (core-site.xml, hdfs-site.xml, and yarn-site.xml) are correctly configured with the right settings. Pay attention to file paths, ports, and other configurations.
      2) Hadoop formatting: if this is a fresh installation, make sure you've formatted the HDFS file system using the hdfs namenode -format command before starting the daemons.
      If you really like my video and found it helpful, please subscribe to my channel, dear.

  • @kisivaniAIntelligence
    @kisivaniAIntelligence 6 months ago +2

    Hello sir. Thank you for this video, but I am facing this error when I run the hdfs namenode -format command: "Error: Could not find or load main class Dev
    Caused by: java.lang.ClassNotFoundException: Dev". My computer name is Malaperi Dev. Thank you

    • @kundankumar011
      @kundankumar011  6 months ago +1

      Thank you so much for watching this video, and sorry about the error. I am traveling; give me a few minutes to settle down and then I'll help you fix it. Hope that's okay with you, dear. If you have not yet subscribed to my channel, please do subscribe and encourage me. I'll get back to you soon ❤️

    • @kundankumar011
      @kundankumar011  6 months ago

      Hello dear, there could be many reasons causing this error. I am assuming you have configured all the Hadoop files correctly and set up the environment variables. Please try this first:
      I noticed there is a space in your computer name, "Malaperi Dev". Rename your computer to "MalaperiDev" without any spaces; make sure there are no spaces in the name. After renaming, restart your computer and check if the error is resolved. If not, let me know what the error is again.

    • @kisivaniAIntelligence
      @kisivaniAIntelligence 6 months ago +1

      Still negative. I renamed it from Malaperi Dev to kisvan, but it's still the same error with that "Dev" in front

    • @kundankumar011
      @kundankumar011  6 months ago

      @@kisivaniAIntelligence Ensure the Hadoop and Java installation paths do not contain spaces, and also review your Hadoop configuration files (core-site.xml, hdfs-site.xml) for any paths or classpath entries that reference the computer name directly.

    • @kisivaniAIntelligence
      @kisivaniAIntelligence 6 months ago

      If I run the command start-all, it runs right away; but if and only if I run hdfs namenode -format, it brings up the same error
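The "Could not find or load main class Dev" error in this thread is the unquoted-space problem in miniature: if a path containing "Malaperi Dev" ends up unquoted on the java command line, the shell splits it at the space, and java treats the first stray word as the class name to run. A toy illustration; the command line below is a hypothetical reconstruction using the commenter's path, and the splitting shown is plain whitespace tokenization, not Hadoop's actual scripts:

```python
# An unquoted java invocation whose -D option value contains a space.
cmdline = r"java -Dhadoop.home.dir=C:\Users\Malaperi Dev org.apache.hadoop.hdfs.server.namenode.NameNode"

tokens = cmdline.split()  # naive whitespace split, as cmd.exe performs
# java takes the first token that is not an option as the main class:
main_class = next(t for t in tokens[1:] if not t.startswith("-"))
print(main_class)  # -> Dev  (hence "Could not find or load main class Dev")
```

This is why removing the space (renaming the account/machine, or moving Hadoop and Java to space-free paths) makes the error disappear.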

  • @aashitAgrawal
    @aashitAgrawal 5 months ago +1

    Working.
    Thank you!

    • @kundankumar011
      @kundankumar011  5 months ago

      I am glad to know that it's working for you as well ❤️. Enjoy learning, dear Aashit. I hope you subscribed to my channel to encourage me. You will also find a video on Hadoop commands on my channel, in the Bigdata Essentials playlist.

  • @PhilemonIransubije
    @PhilemonIransubije 6 months ago +1

    Hello sir, the NodeManager shuts down immediately when I start the Hadoop services, showing this error: "Permissions incorrectly set for dir /tmp/hadoop-phile/nm-local-dir/usercache, should be rwxr-xr-x, actual value = rwxrwxr-x". I tried to set the permissions it needs, but it still shuts down. Also, I am wondering how the tmp folder is related to the Hadoop services, since it is created automatically when they are started. Any hint? Thanks :)

    • @kundankumar011
      @kundankumar011  6 months ago

      Hello dear. Check the owner of the tmp folder. Who is it? I mean, you log into the system as some user and then run Hadoop; make sure that owner has all the permissions.

    • @PhilemonIransubije
      @PhilemonIransubije 6 months ago +1

      @@kundankumar011 I found that I was using a Microsoft account, which caused the error.

    • @kundankumar011
      @kundankumar011  6 months ago

      @@PhilemonIransubije congratulations 👏

  • @nguyenvantruc9629
    @nguyenvantruc9629 10 months ago +1

    I got this error when running start-all.cmd in C:\hadoop\sbin (SHUTDOWN_MSG: Shutting down NodeManager), and I was not able to open localhost:8080.
    Please help me

    • @kundankumar011
      @kundankumar011  8 หลายเดือนก่อน

      Hi dear. I am so sorry for being late to reply this serious issues. I apologies. It seems like there might be an issue with your Hadoop setup, specifically with the NodeManager shutting down. Check Configuration: Ensure that your Hadoop configuration files (core-site.xml, hdfs-site.xml, yarn-site.xml, etc.) are properly configured. Pay special attention to the NodeManager configuration in yarn-site.xml. Let me know if this helped you?

  • @HoneyPatel-w7y
    @HoneyPatel-w7y 4 months ago +1

    C:\Windows\System32>hdfs namenode -format
    Error: Could not find or load main class Patel
    It gives this error

    • @kundankumar011
      @kundankumar011  4 months ago

      @@HoneyPatel-w7y Thank you so much for watching my video, and sorry to know that you are facing the "could not find or load main class" error. This error may occur when the JAVA_HOME or HADOOP_HOME path is not set properly, or it could be a compatibility issue between the Java version and the Hadoop version. First, cross-check the Java and Hadoop paths, dear. Let me know if this helped you.

  • @Harsh-se4pm
    @Harsh-se4pm 10 หลายเดือนก่อน +1

    my hadoop 9870 port is working but 8000 port is not working help!

    • @shatrughnpinjari5752
      @shatrughnpinjari5752 9 หลายเดือนก่อน +1

      Try 8088

    • @kundankumar011
      @kundankumar011  8 หลายเดือนก่อน

      Hi Harsh, I am sorry for the late reply, it just slipped my mind. I hope port 8088 is working, as @shatrughnpinjari5752 suggested (8088 is the YARN ResourceManager web UI; 8000 is not a default Hadoop port). Please let me know if I can still help you, dear Harsh.

  • @saranyasaran5624
    @saranyasaran5624 หลายเดือนก่อน

    ERROR namenode.FSNamesystem: FSNamesystem initialization failed. how to fix this

    • @kundankumar011
      @kundankumar011  หลายเดือนก่อน

      The "FSNamesystem initialization failed" error in Hadoop’s NameNode usually points to an issue with the HDFS setup, which can be due to improper formatting, data corruption, or configuration errors. Here are steps to troubleshoot and resolve it:
      1. Check that HDFS was formatted correctly. If you're setting up Hadoop for the first time or reconfiguring it, make sure the NameNode has been formatted. Run this command to format it (only if this is a new setup or you don't need the data in HDFS): hdfs namenode -format
      2. After formatting, start the Hadoop services again.
      Let me know if these steps helped you troubleshoot the error. Please do subscribe to my channel to encourage me, dear.
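      On Windows, that sequence is roughly the following (a sketch assuming Hadoop's bin and sbin folders are on your PATH; answer Y to the re-format prompt only if you don't need the existing HDFS data):

      ```
      hdfs namenode -format
      start-dfs.cmd
      start-yarn.cmd
      jps
      ```

      After jps, you should see NameNode, DataNode, ResourceManager and NodeManager listed.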

  • @beardboyphotography
    @beardboyphotography 4 หลายเดือนก่อน

    Can we get on a google meet? I have few queries, I googled but not able to figure out

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      Thank you so much for watching this video, dear. Please share any queries you have related to it; I will do my best to address them in the comments. If needed, we will find a way to provide technical support.

    • @beardboyphotography
      @beardboyphotography 4 หลายเดือนก่อน

      @@kundankumar011 Followed all the steps but unable to launch the localhost via GUI. And I am getting NameNode: Failed to start namenode.

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      @@beardboyphotography sorry for the issues dear. Let me clarify first. When you use the "jps" command to list the running daemons, what does this command display?

    • @beardboyphotography
      @beardboyphotography 4 หลายเดือนก่อน

      @@kundankumar011 13944 Jps

  • @ttvgolive3436
    @ttvgolive3436 10 หลายเดือนก่อน +1

    all the daemons are working except the datanode

    • @kundankumar011
      @kundankumar011  10 หลายเดือนก่อน +2

      I hope you managed to format the namenode and datanode folders before running the daemons. The same video shows how to format the namenode and datanode. Please do subscribe to my channel ❤️

  • @wizardop2100
    @wizardop2100 3 หลายเดือนก่อน

    i Encountered this problem.
    'jps' is not recognized as an internal or external command,
    operable program or batch file
    can you please provide the solution?

    • @kundankumar011
      @kundankumar011  3 หลายเดือนก่อน +1

      Thank you so much for watching my video. The error 'jps' is not recognized as an internal or external command typically occurs because the Java Development Kit (JDK) tools (like jps) are not properly set up, or the PATH environment variable is not configured correctly. Please follow these steps to fix it:
      1) Check that the JDK is installed: open a command prompt and check the installed Java version with:
      java -version
      2) Ensure the JDK bin directory is on the PATH environment variable.
      Please do subscribe to my channel to encourage me, dear, and don't forget to press the bell icon to receive notifications about upcoming relevant videos.
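      To verify this on Windows, you can check where jps resolves from (the JDK folder name below is only an example; use your actual install path):

      ```
      java -version
      where jps
      rem if 'where jps' fails, add the JDK bin folder to PATH, e.g.:
      set PATH=%PATH%;C:\Program Files\Java\jdk1.8.0_202\bin
      ```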

    • @kundankumar011
      @kundankumar011  3 หลายเดือนก่อน +1

      @@wizardop2100 Hello dear, encourage me by subscribing to my channel.

  • @mounikagumpu3960
    @mounikagumpu3960 5 หลายเดือนก่อน

    Hi sir
    Everything is fine but when i typed hdfs namenode -format i got the error like couldn't find or load main class
    Please help

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      @@mounikagumpu3960 Hello dear. Thank you so much for watching this video. Sorry to hear that you are facing issues. Ensure that the JAVA_HOME and HADOOP_HOME environment variables are set properly.

    • @mounikagumpu3960
      @mounikagumpu3960 5 หลายเดือนก่อน

      @@kundankumar011 Should the "h" in the Hadoop folder name be capital or small, sir?

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      @@mounikagumpu3960 I think you are talking about the hadoop folder where you extracted all the components. Better keep it lowercase 'hadoop', dear. But ensure that you set the JAVA_HOME and HADOOP_HOME environment variables as guided in this video. Hope that helps, dear.

    • @mounikagumpu3960
      @mounikagumpu3960 5 หลายเดือนก่อน

      @@kundankumar011 everything is fine sir but I am getting that error

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      @@mounikagumpu3960 make sure there are no stray commas in the environment variable values. If all is well, just restart your computer and see; sometimes that helps, funny as it sounds 😂

  • @bhaavanabs
    @bhaavanabs ปีที่แล้ว +1

    Sir it's showing couldn't find or load main class

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      Hello Bhavana, I appreciate your time watching my video. Could you provide more details? That will help me guide you in fixing the error.

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      Hi Bhavana. Did you cross-check that the Hadoop classpath is set correctly?

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      Hi Bhavana. If you like my video and it's helpful for you, please subscribe and press the bell icon to receive a notification on the next video upload. ❤️

  • @malvikanagoor929
    @malvikanagoor929 5 หลายเดือนก่อน

    I got everything and in cmd when I gave Jps it is giving as not recognised

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      Thank you so much for watching my video, and sorry you are facing this issue. The error message 'jps' is not recognized indicates that the jps tool (Java Virtual Machine Process Status Tool) is not found on your system's PATH. Please verify your Java installation and cross-check that the Java path and JAVA_HOME are set in the environment variables. Let me know if this fixed your error.

  • @MayankGadiya-uq1el
    @MayankGadiya-uq1el 5 หลายเดือนก่อน

    I cant find etc folder inside hadoop.....can some one please help...also not able to get hadoop -env file to change java path

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      Hi Mayank. Thank you for watching this video. Sorry to hear that you cannot find the etc folder in Hadoop. Ensure that you downloaded Hadoop successfully and extracted the archive without errors.

  • @glavanya9998
    @glavanya9998 3 หลายเดือนก่อน

    At first when i installed everything is fine and all the environment varibales are correct but now im facing issue with jps after entering start-all.cmd also im getting only jps no namenode,datanode,resource manager,nodemanager can you pls help sir

    • @kundankumar011
      @kundankumar011  3 หลายเดือนก่อน

      Thank you so much for watching this video, and happy to know that you had successfully configured it. But unfortunately it stopped working. Since you say all the environment variables are set well, you must check the log files to see what caused the error. Review the logs for any issues with starting the daemons; they are typically located in $HADOOP_HOME/logs. Please subscribe to my channel if you have not yet, to encourage me and so you don't miss notifications on upcoming relevant videos.

  • @GlobalTalesChronicles
    @GlobalTalesChronicles 4 หลายเดือนก่อน

    Hii sir followed every step you adviced ...but after running hdfs namenode -format I'm getting error ...big is not recognised and classpatch is not recognised

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      @@GlobalTalesChronicles Thank you so much for watching this video, dear. Sorry to hear that you are getting errors while formatting the namenode. What is the "big" word here? Can you copy-paste the whole error?

  • @ZC-d8s
    @ZC-d8s 2 หลายเดือนก่อน

    Won't this work with jdk 17 or 21 version

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน +1

      @@ZC-d8s Thank you so much for watching this video. Yes, it should work, but JDK 8 is the most stable and compatible choice for Hadoop. Please do subscribe to my channel to encourage me, dear, and so you don't miss notifications for relevant videos.

    • @ZC-d8s
      @ZC-d8s 2 หลายเดือนก่อน

      @kundankumar011 in which port we need to run to know that it is really working

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน

      @@ZC-d8s try this dear, it is given in the video description: localhost:50070/

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน +1

      @@ZC-d8s After successful installation of Hadoop. you can watch another video on Hadoop Commands on my channel th-cam.com/video/YurtxHSwkdU/w-d-xo.htmlsi=IevzPFeOtbsFUhNs

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน

      @@ZC-d8s please do you subscribe to my channel dear

  • @amazighkab9904
    @amazighkab9904 2 หลายเดือนก่อน

    when I execute the command "start-all.cmd" and I type the command "jps" at the beginning there are all the daemon (NodeManager, Namenode, ResourceManager, Datanode) after a moment I retype the command "jps" I see only the daemons (RessourceManager and DataNode) is this normal? Thank you

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน +1

      @@amazighkab9904 Sometimes the NameNode and ResourceManager may stop automatically due to a lack of memory or disk space, but it should not happen every time, dear.

  • @saranyasaran5624
    @saranyasaran5624 หลายเดือนก่อน

    ERROR namenode.FSNamesystem: FSNamesystem initialization failed.

    • @kundankumar011
      @kundankumar011  หลายเดือนก่อน

      Thank you for watching my video. The "FSNamesystem initialization failed" error in Hadoop’s NameNode usually points to an issue with the HDFS setup, which can be due to improper formatting, data corruption, or configuration errors. Here are steps to troubleshoot and resolve it:
      1. Check that HDFS was formatted correctly. If you're setting up Hadoop for the first time or reconfiguring it, make sure the NameNode has been formatted. Run this command to format it (only if this is a new setup or you don't need the data in HDFS): hdfs namenode -format
      2. After formatting, start the Hadoop services again.
      Let me know if these steps helped you fix the error, dear. Please don't forget to subscribe to my channel to encourage me, and press the bell icon so you don't miss notifications on upcoming relevant videos.

  • @gayathribetha9880
    @gayathribetha9880 4 หลายเดือนก่อน

    sir once the datanode is missing once namenode is missing and also everytime i try to format ive been asked Re-format filesystem in Storage Directory root= C:\hadoop\data\namenode; location= null ? (Y or N)

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      @@gayathribetha9880 Hi dear. Choose Y

    • @gayathribetha9880
      @gayathribetha9880 4 หลายเดือนก่อน

      @@kundankumar011 Even after that the datanode is missing sir, what can I do? It shows once I format, but on running again, or while checking in the GUI, the datanode info is not there sir

  • @DeepaDeepa-yb6bw
    @DeepaDeepa-yb6bw ปีที่แล้ว +1

    Can I install Hadoop in windows 4 GB RAM?

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      @DeepaDeepa Yes, you can install Hadoop on a Windows system with 4 GB of RAM, but note that Hadoop is a resource-intensive framework, and running it on a machine with limited RAM may lead to performance issues or even failures for certain workloads. More RAM is recommended for better performance and stability, especially if you plan to work with large datasets. You may encounter memory-related problems with only 4 GB, so be prepared for potential limitations. Hope that answers your question. If you like my videos, please subscribe to my channel.

  • @vishalkanvajiya-j8t
    @vishalkanvajiya-j8t 4 หลายเดือนก่อน

    JAVA_HOME is incorrectly set.
    Please update C:\hadoop\etc\hadoop\hadoop-env.cmd
    '-Dhadoop.security.logger' is not recognized as an internal or external command,
    operable program or batch file.
    i check all but same show

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      The error message "JAVA_HOME is incorrectly set" indicates that Hadoop is unable to locate the Java installation it needs to run. This is a common issue and can be resolved by setting JAVA_HOME correctly in the Hadoop configuration files. Please navigate to C:\hadoop\etc\hadoop\hadoop-env.cmd, open it in a text editor (e.g., Notepad), and find the line that looks like this:
      set JAVA_HOME=%JAVA_HOME%
      Replace the existing path with the correct path to your Java installation directory. It should look something like this:
      set JAVA_HOME=C:\Java\jdk
      You can rewind this video to the 8:00 mark to see how I set the Java path in the Hadoop configuration files.
      This path is based on my computer; make sure you set it based on the Java installation path on yours. Let me know if it helped you, and please don't forget to subscribe to my channel, dear, to encourage me.

  • @abhinavdahiya2423
    @abhinavdahiya2423 5 หลายเดือนก่อน

    Hello sir, everything works fine except for the datanode daemon. It stops running with a SHUTDOWN message.

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      Thank you for watching my video, dear Abhinav. There could be several reasons for the datanode shutting down automatically after starting:
      1) Verify that the user running the Datanode has the necessary permissions to read and write the data directories specified in hdfs-site.xml. Right-click the datanode folder and grant full access (Read, Write and Execute), then simply restart the computer, run all the daemons, and see whether it worked. If not, check the logs.
      2) Check the logs: look at the Datanode logs for any error messages. They are typically located in the Hadoop logs directory; check hadoop/logs/hadoop-hduser-datanode-.log for relevant information.
      Let me know if this helped you fix the issue.

  • @nihalsingh6667
    @nihalsingh6667 2 หลายเดือนก่อน

    Sir there is an error showing
    Usage Error: Unrecognized or legacy configuration settings found: dir - run "yarn config -v" to see the list of settings supported in Yarn (in )
    $ yarn run [--inspect] [--inspect-brk] [-T,--top-level] [-B,--binaries-only] [--require #0] ...
    C:\Windows\System32>

    • @nihalsingh6667
      @nihalsingh6667 2 หลายเดือนก่อน

      Localhost 8088 is not working

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน

      Apologies for the late reply, dear. If localhost:8088 (the YARN ResourceManager web interface) is not working, it typically means YARN is either not running correctly or a configuration issue is preventing access. Cross-check the yarn-site.xml configuration file, and also check your firewall settings to make sure the port is not blocked.

  • @adityanuriskandar5670
    @adityanuriskandar5670 ปีที่แล้ว

    thanks sir

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      Thank you so much Aditayan ❤️. If you like my video, please subscribe to my channel; your subscription will motivate me.

  • @فروتي-ي5ش
    @فروتي-ي5ش 2 หลายเดือนก่อน

    localhost doesn't work, what should i do

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน +1

      Sorry to hear that you are facing a challenge with localhost not working. This could be due to various reasons, dear:
      1) Cross-check with the jps command that the Hadoop daemons (NameNode, ResourceManager) are still running while you access localhost.
      2) Cross-check that the Hadoop configuration files are set properly.
      3) Check the firewall settings; sometimes the firewall blocks ports like 8088, 9000, 50070, etc. If so, change the settings to allow them.
      Please do subscribe to my channel to encourage me, dear, and so you don't miss upcoming relevant videos. Thank you ❤️
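      A quick way to check the first point on Windows (jps lists the running Java daemons; 9870 is the NameNode web port on Hadoop 3.x, 50070 on older releases):

      ```
      jps
      netstat -ano | findstr :9870
      ```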

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน

      @@فروتي-ي5ش Hello dear. Hope you managed to fix the issues. Please don't forget to subscribe to my channel to encourage me

  • @SakshiGajjar-u3i
    @SakshiGajjar-u3i 4 หลายเดือนก่อน

    Nodemanager and resourcemanager shutting immediately jps shows only
    2820 Jps
    10328 NameNode
    21484 DataNode
    Pls help

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      Thank you so much for watching this video ❤️ Sorry to hear that you are facing issues. If the NodeManager and ResourceManager shut down immediately after starting, it typically indicates a configuration issue.
      Please validate the configuration files; double-check the following for any misconfiguration:
      "yarn-site.xml"
      "core-site.xml"
      Ensure that yarn.resourcemanager.hostname and the other essential parameters are correct. If these are all OK, the cause could be insufficient disk space. Let me know if this helped you, dear. Please do subscribe to my channel and encourage me.
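      For a single-node setup, the hostname property in yarn-site.xml can simply point at localhost; a sketch of the entry (merge it into your existing configuration block):

      ```xml
      <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>localhost</value>
      </property>
      ```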

  • @SanjayNeelarapu
    @SanjayNeelarapu 5 หลายเดือนก่อน

    C:\Windows\System32>hdfs namenode -format
    'C:\Program' is not recognized as an internal or external command,
    operable program or batch file.
    '-classpath' is not recognized as an internal or external command,
    operable program or batch file.
    help....

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      Thank you so much for watching this video, and sorry to hear that you are facing an issue. Looking at the error you shared, it indicates that there are spaces in the path to your Java installation directory, causing the command to fail. This is a common issue on Windows when Java is installed in the default directory (C:\Program Files).
      Here are the steps to resolve it:
      1. Set the JAVA_HOME variable correctly: ensure that the JAVA_HOME environment variable is set, and that the value is enclosed in quotes if there are spaces in the path.
      2. Alternatively, you can set the environment variable directly in the Command Prompt before running the format command:
      set "JAVA_HOME=C:\Program Files\Java\jdk"
      Note: change the path based on your JDK installation location.
      Let me know if this helped you, dear.
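      Another common workaround is the 8.3 short name C:\Progra~1, which removes the space from the path entirely. In hadoop-env.cmd, it would look like this (the jdk folder name below is only an example; use your own):

      ```
      set JAVA_HOME=C:\Progra~1\Java\jdk1.8.0_202
      ```

      After editing hadoop-env.cmd, open a new Command Prompt and rerun hdfs namenode -format.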

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน

      Hi dear. Hope you managed to fix the issues as I guided you above. Please you do subscribe my channel and encourage me.

  • @RiteshSingh-zb5ss
    @RiteshSingh-zb5ss 4 หลายเดือนก่อน

    sir localhost cluster is not opening

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      @@RiteshSingh-zb5ss Thank you so much for watching this video, dear, and happy to know that you managed to configure it successfully. Try to use localhost:50070/
      Let me know whether you can open it. The link is also given in the video description, dear. Please do subscribe to my channel to encourage me.

    • @RiteshSingh-zb5ss
      @RiteshSingh-zb5ss 4 หลายเดือนก่อน +1

      ​@@kundankumar011 sir it is not opening , other two site is opening but cluster site is not opening

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      @@RiteshSingh-zb5ss That's good progress. What error does the web browser show when you open the cluster page? Please do subscribe to encourage me, dear.

    • @RiteshSingh-zb5ss
      @RiteshSingh-zb5ss 4 หลายเดือนก่อน

      ​@@kundankumar011 it is showing site can't be reached and in the jps it is showing 1856 namenode ,5248 datanode ,10228 jps only

    • @RiteshSingh-zb5ss
      @RiteshSingh-zb5ss 4 หลายเดือนก่อน

      ​@@kundankumar011 localhost is not working for cluster , but it is working for other datanode etc.

  • @amazighkab9904
    @amazighkab9904 2 หลายเดือนก่อน

    before executing hdfs namenode -format , you must execute the command rmdir /S "C:\hadoop\data\datanode" otherwise you will get the error Incompatible clusterIDs in C:\hadoop\data\datanode: namenode clusterID = CID-9ffe06b7-e903-46f6-a3fe-508ccc12e155; datanode clusterID = CID-d349

    • @kundankumar011
      @kundankumar011  2 หลายเดือนก่อน

      Thank you for watching my video, and apologies for the delayed response. I hope you were able to resolve the error; if not, here's a quick guide. The "Incompatible clusterIDs" error occurs when the NameNode's and DataNode's cluster IDs do not match. This typically happens if Hadoop is reinstalled, or if the namenode -format command is run again without resetting the DataNode. To resolve it, delete the contents of the DataNode directory before formatting the NameNode.
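      Concretely, for a layout like the one in this video (datanode directory assumed to be C:\hadoop\data\datanode; check your hdfs-site.xml for the actual path), the sequence would be:

      ```
      stop-all.cmd
      rmdir /S /Q C:\hadoop\data\datanode
      hdfs namenode -format
      start-all.cmd
      ```

      Note that this deletes everything stored in HDFS on that node.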

  • @priyaprakash3150
    @priyaprakash3150 ปีที่แล้ว +1

    pls send a link

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      @Priya Here you can find winutils under the bin folder of shared linked here. (This link was attached in video descriptions as well) drive.google.com/drive/folders/1WTSipe8W8OXkpFPDVzhI71otRvh4htmP?usp=sharing

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      if you like my video and you find its helpful, please you subscribe my channel, your subscription motivate me

  • @ummekhanam
    @ummekhanam 25 วันที่ผ่านมา

    Native library issue

    • @kundankumar011
      @kundankumar011  24 วันที่ผ่านมา

      Sorry to hear that you are facing issues. It could be an incorrect library path: the Hadoop configuration does not point to the directory containing the native libraries. Check the HADOOP_HOME environment variable, or it might be a 32-bit vs 64-bit mismatch (e.g., the wrong winutils build). Let me know if this helped you, dear. Please subscribe to my channel to encourage me.

  • @meghanajyothinagaram
    @meghanajyothinagaram 5 หลายเดือนก่อน +1

    Hi sir at times im getting all the namenode, datanode everything but sometimes im not getting those and sometimes im getting it for 2 times can you pls help me with that

    • @kundankumar011
      @kundankumar011  5 หลายเดือนก่อน +1

      Dear Meghana, good to know that you managed to install and configure it successfully. Regarding the namenode and datanode showing twice: this could be due to running multiple instances of the same service; use a command like 'netstat' to check for port conflicts. As for the datanode sometimes not showing, it could be that there was not enough disk space to allocate to the datanode at that point in time. Hope that helps, dear. If you find this video helpful, please do subscribe to my channel so you get notifications for upcoming videos on big data and other tech. Thank you

    • @meghanajyothinagaram
      @meghanajyothinagaram 5 หลายเดือนก่อน

      @@kundankumar011 what can i do to get the datanode at that time sir, should i delete Hadoop completely and download it again? because at times it shows and at times it doesn't

  • @priyaprakash3150
    @priyaprakash3150 ปีที่แล้ว +2

    sir i cant find 3.2.4 winutils

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      @Priya Here you can find winutils under the bin folder of shared linked here. (This link was attached in video descriptions as well) drive.google.com/drive/folders/1WTSipe8W8OXkpFPDVzhI71otRvh4htmP?usp=sharing

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      if you like my video and you find its helpful, please you subscribe my channel, your subscription motivate me

  • @Ro-ko454
    @Ro-ko454 10 หลายเดือนก่อน +1

    Jps error

    • @kundankumar011
      @kundankumar011  10 หลายเดือนก่อน

      Thank you so much for watching this video. There could be many reasons:
      1) Make sure Java is installed correctly on your system and that the JAVA_HOME environment variable is set properly.
      2) Make sure the Hadoop environment is set up well: if you're running "jps" outside of the Hadoop environment, or if Hadoop is not properly configured, it may not recognize the Hadoop processes.

    • @kundankumar011
      @kundankumar011  10 หลายเดือนก่อน

      If really this video was helpful please you do subscribe my channel and encourage me ❤️

  • @sanoberfarooqui1048
    @sanoberfarooqui1048 4 หลายเดือนก่อน

    Datanode not working

    • @kundankumar011
      @kundankumar011  4 หลายเดือนก่อน

      Thank you so much for watching this video. If the Hadoop DataNode is not working or not starting, it could be due to several reasons, including misconfiguration, resource issues, or problems with the underlying storage. To troubleshoot, please check that the datanode path in the "hdfs-site.xml" configuration file is specified correctly. Also check that you have sufficient disk space. You can also try restarting the computer, starting all the daemons again, and seeing if that worked. If possible, please subscribe to my channel to encourage me.
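      The relevant entry in hdfs-site.xml looks like this (the path follows this video's layout; replace it with your own datanode folder):

      ```xml
      <property>
        <name>dfs.datanode.data.dir</name>
        <value>C:\hadoop\data\datanode</value>
      </property>
      ```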

  • @SwethaT-z1f
    @SwethaT-z1f ปีที่แล้ว +1

    now 25.08.2023 ,when i download hadoop i dont get in a zip file i got in .tar.gz format ?

    • @kundankumar011
      @kundankumar011  ปีที่แล้ว

      @Swetha. Thank you for watching my video ❤️. Yes, it's true that you'll receive a .tar.gz file. However, in the same video I've demonstrated how to unzip this downloaded file using the WinRAR application. To unzip the hadoop .tar.gz file, please make sure to install WinRAR; you can visit www.win-rar.com/download.html?&L=0 If you like my video, please subscribe to my channel, press the bell icon, like and share. If you still have a challenge, please let me know, I will be happy to guide you.

  • @archimedeirakoze8789
    @archimedeirakoze8789 6 หลายเดือนก่อน +1

    Hello Dr, I have a problem with Hadoop. I got this error: Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error

    • @kundankumar011
      @kundankumar011  6 หลายเดือนก่อน

      Thank you so much for the comment, dear. Can you provide more details? I mean, what activity caused this error?

  • @sanoberfarooqui1048
    @sanoberfarooqui1048 3 หลายเดือนก่อน

    shutting down datanode
    Error:
    2024-09-07 15:28:30,891 INFO common.Storage: Lock on C:\tmp\hadoop-pc\dfs\data\in_use.lock acquired by nodename 9528@DESKTOP-TNSAH2H
    2024-09-07 15:28:30,891 WARN common.Storage: Failed to add storage directory [DISK]file:/C:/tmp/hadoop-pc/dfs/data
    java.io.IOException: Incompatible clusterIDs in C:\tmp\hadoop-pc\dfs\data: namenode clusterID = CID-6679ba00-f42f-4835-8440-3f7fe4748513; datanode clusterID = CID-13c1262e-453e-41e2-ac80-0542976931df

    • @kundankumar011
      @kundankumar011  3 หลายเดือนก่อน

      Thank you so much for watching my video, and sorry for the delay in replying. If you are still facing this issue: the error you are encountering indicates a mismatch between the ClusterID of the NameNode and the DataNode. Steps to fix:
      Clear the DataNode data directory. If you're okay with clearing the DataNode's data and starting fresh, delete the DataNode storage directory and restart the DataNode; this will cause the DataNode to re-register with the NameNode using the current ClusterID.
      Then reformat the namenode and restart the daemons. Hope this will help you; otherwise let me know, dear.

  • @dilshanaceemahal7440
    @dilshanaceemahal7440 8 หลายเดือนก่อน +1

    I cant change the destination folder.After status it doesnot shows destination folder .

    • @kundankumar011
      @kundankumar011  8 หลายเดือนก่อน

      Thank you so much for watching my video. I think you are talking about changing the destination folder for installing Java? The installer should allow you to change it, but if not, no worry: you can choose the default destination and then use that same path when setting up the environment variables, dear. Just continue watching the video. Let me know if this helped you, and encourage me by subscribing to my channel, dear ❤️

  • @saranyasaran5624
    @saranyasaran5624 หลายเดือนก่อน

    ERROR namenode.NameNode: Failed to start namenode.
    java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed.
    how to fix this

    • @kundankumar011
      @kundankumar011  หลายเดือนก่อน

      The "java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed" error typically occurs when Hadoop is trying to use secure mode (Kerberos authentication) without proper configuration or permissions. Here are some ways to address this issue:
      =================================================
      Steps to Fix the UnsupportedOperationException in Hadoop:
      =================================================
      1. Disable Hadoop Security (if not needed)
      - If you’re not using secure mode (Kerberos authentication) for Hadoop, you can disable security in your configuration files. In your `core-site.xml` file, set the following property:
      <property>
        <name>hadoop.security.authentication</name>
        <value>simple</value>
      </property>
      - This setting ensures that Hadoop operates in non-secure mode, which should prevent the `getSubject` error.
      2. Check Hadoop Configuration Files
      - Ensure that `core-site.xml` and `hdfs-site.xml` do not contain configurations that imply secure mode (Kerberos).
      - Verify the following configuration in `core-site.xml`:
      <property>
        <name>hadoop.security.authorization</name>
        <value>false</value>
      </property>
      - Make sure no settings related to `hadoop.security.authentication` are set to `kerberos` unless Kerberos is actually configured and available.
      3. Run Hadoop Without a Security Manager
      - If you are testing locally or have no specific requirement for security, you can run Hadoop without a security manager.
      - Set the following in `hadoop-env.sh` (on Windows, use `set` instead of `export` in `hadoop-env.cmd`):
      export HADOOP_OPTS="$HADOOP_OPTS -Djava.security.manager="
      - This will start Hadoop without a security manager, avoiding the `getSubject` call.
      4. Ensure Proper Permissions (If Using Kerberos)
      - If you require Kerberos, confirm that all security-related configurations are correctly set up, including:
      - `hadoop.security.authentication` set to `kerberos`.
      - Kerberos principal and keytab files are correctly specified.
      - The necessary permissions and access control lists (ACLs) are configured in `hdfs-site.xml`.
      5. Restart Hadoop
      - After making the changes above, restart Hadoop services to apply the new configuration:
      stop-dfs.sh
      stop-yarn.sh
      start-dfs.sh
      start-yarn.sh
      Let me know if this resolves the issue or if there’s more detail from the error logs!