Hi sir, I followed every step you advised, but after running hdfs namenode -format I'm getting an error: "big" is not recognised and "classpatch" is not recognised.
@@GlobalTalesChronicles Thank you so much for watching this video, dear. Sorry to know that you got errors while formatting the namenode. What is the word "big" here? Can you copy-paste the whole error?
@@ZC-d8s Thank you so much for watching this video. Yes, it should work, but JDK 8 is more stable and compatible. Please do subscribe to my channel to encourage me, dear, and to not miss notifications for relevant videos.
@@ZC-d8s After successful installation of Hadoop, you can watch another video on Hadoop Commands on my channel: th-cam.com/video/YurtxHSwkdU/w-d-xo.htmlsi=IevzPFeOtbsFUhNs
When I execute the command "start-all.cmd" and then type the command "jps", at the beginning all the daemons are there (NodeManager, NameNode, ResourceManager, DataNode). After a moment I retype the command "jps" and I see only two daemons (ResourceManager and DataNode). Is this normal? Thank you
@@amazighkab9904 Basically, sometimes the NameNode and NodeManager may stop automatically due to a lack of space or resources. But it should not happen every time, dear.
Thank you for watching my video. The "FSNamesystem initialization failed" error in Hadoop’s NameNode usually points to an issue with the Hadoop filesystem (HDFS) setup, which can be due to improper formatting, data corruption, or configuration errors. Here are steps to troubleshoot and resolve this issue:
1. Check if HDFS was formatted correctly. If you're setting up Hadoop for the first time or reconfiguring it, make sure that the NameNode has been formatted. Run this command to format the NameNode (only if this is a new setup or you don’t need the data in HDFS): hdfs namenode -format
2. After formatting, start the Hadoop services again.
Let me know if these steps helped you fix the error, dear. Please don't forget to subscribe to my channel to encourage me, and press the bell icon so you don't miss notifications on upcoming relevant videos.
Sir, once the datanode is missing, once the namenode is missing, and also every time I try to format I am asked: Re-format filesystem in Storage Directory root= C:\hadoop\data\namenode; location= null ? (Y or N)
@@kundankumar011 Even after that, the datanode is missing, sir; what can I do? It shows once I format, but on running again, or while checking in the GUI, the datanode info is not there, sir.
@DeepaDeepa Yes, you can install Hadoop on a Windows system with 4 GB of RAM, but it's important to note that Hadoop is a resource-intensive framework, and running it on a machine with limited RAM may lead to performance issues or even failures for certain workloads. It's recommended to have more RAM for better performance and stability, especially if you plan to work with large datasets. You may encounter memory-related problems with only 4 GB of RAM, so be prepared for potential limitations. Hope you got me well. If you like my videos, please do subscribe to my channel.
JAVA_HOME is incorrectly set. Please update C:\hadoop\etc\hadoop\hadoop-env.cmd '-Dhadoop.security.logger' is not recognized as an internal or external command, operable program or batch file. I checked everything but it shows the same.
The error message "JAVA_HOME is not set properly" indicates that Hadoop is unable to locate the Java installation required to run. This is a common issue and can be resolved by correctly setting the JAVA_HOME environment variable in the Hadoop configuration files. Please navigate to C:\hadoop\etc\hadoop\hadoop-env.cmd, open the hadoop-env.cmd file in a text editor (e.g., Notepad), and find the line that looks like this: set JAVA_HOME=%JAVA_HOME% — replace the existing path with the correct path to your Java installation directory, so it looks something like this: set JAVA_HOME=C:\Java\jdk. You can rewind this video to the 8:00 mark to see how I set the Java path in the Hadoop configuration files. That is based on my computer; make sure you set it based on your computer's Java installation path. Let me know if it helped you. Please don't forget to subscribe to my channel, dear, to encourage me.
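For reference, a minimal hadoop-env.cmd edit might look like the lines below. The jdk1.8.0_202 folder name is only an assumption for illustration; use your actual installed JDK folder:

@rem In C:\hadoop\etc\hadoop\hadoop-env.cmd
set JAVA_HOME=C:\Java\jdk1.8.0_202
@rem If Java lives under "C:\Program Files", the 8.3 short name avoids the space
@rem (8.3 names can be disabled on some volumes, so verify the name exists first):
@rem set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_202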
Thank you for watching my video, dear Abhinav. For the issue of the datanode shutting down automatically after starting in your Hadoop installation, there could be many reasons:
1) Verify that the user running the DataNode has the necessary permissions to read and write to the data directories specified in hdfs-site.xml. Right-click on the datanode folder and give full-access permissions (Read, Write and Execute), then simply restart the computer, run all the daemons, and see whether it worked.
2) If not, check the logs: look at the DataNode logs for any error messages. The logs are typically located in the Hadoop logs directory; check hadoop/logs/hadoop-hduser-datanode-.log for relevant information.
Let me know if this helped you to fix the issues.
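A quick way to scan those logs from the command prompt — the file-name pattern is an assumption (Hadoop normally names logs hadoop-<user>-datanode-<host>.log), so adjust it to what you actually see in the folder:

cd %HADOOP_HOME%\logs
dir hadoop-*-datanode-*.log
findstr /i "ERROR Exception" hadoop-*-datanode-*.log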
Sir, there is an error showing: Usage Error: Unrecognized or legacy configuration settings found: dir - run "yarn config -v" to see the list of settings supported in Yarn (in ) $ yarn run [--inspect] [--inspect-brk] [-T,--top-level] [-B,--binaries-only] [--require #0] ... C:\Windows\System32>
Apologies for the late reply, dear. If localhost:8088 (the YARN ResourceManager web interface) is not working, it typically indicates that YARN is either not running correctly or there are configuration issues preventing access. Cross-check the YARN configuration file yarn-site.xml, and also check the firewall settings to make sure the port is not blocked.
Sorry to know that you are facing a challenge with localhost not working. This could be due to various reasons, dear:
1) Check whether the Hadoop daemons (NameNode, ResourceManager) are still running, using the jps command, while accessing localhost.
2) Cross-check whether the Hadoop configuration files are set properly.
3) Check the firewall settings. Sometimes the firewall blocks ports like 8080, 9000, 50070, etc.; if so, change the settings to allow them.
Please do subscribe to my channel to encourage me, dear, and to not miss upcoming relevant videos. Thank you ❤️
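A small sketch of those checks from an administrator command prompt — the port number follows this video's setup, and the firewall rule name is made up purely for illustration:

jps
netstat -ano | findstr ":9870"
:: If the port is listening but the page is still blocked, allow it through the firewall:
netsh advfirewall firewall add rule name="Hadoop NameNode UI" dir=in action=allow protocol=TCP localport=9870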
Thank you so much for watching this video ❤️ Sorry to know that you are facing issues. If the NodeManager and ResourceManager are shutting down immediately after starting, it typically indicates a configuration issue. Please validate the configuration files: double-check "yarn-site.xml" and "core-site.xml" for any misconfigurations, and ensure that yarn.resourcemanager.hostname and the other essential parameters are correct. If these are all OK, then there could be an issue of insufficient space. Let me know if this helped you, dear. Please do subscribe to my channel and encourage me.
C:\Windows\System32>hdfs namenode -format 'C:\Program' is not recognized as an internal or external command, operable program or batch file. '-classpath' is not recognized as an internal or external command, operable program or batch file. help....
Thank you so much for watching this video. Sorry to know that you are facing an issue. Looking at the error you shared with me: this error message indicates that there are spaces in the path to your Java installation directory, causing the command to fail. This is a common issue on Windows when Java is installed in the default directory (C:\Program Files). Here are the steps to resolve it:
1. Set the JAVA_HOME variable correctly. Ensure that the JAVA_HOME environment variable is set correctly and that it is enclosed in quotes if there are spaces in the path.
2. Alternatively, you can set the environment variable directly in the Command Prompt before running the format command: set "JAVA_HOME=C:\Program Files\Java\jdk" (change the path based on your JDK installation location).
Let me know if this helped you, dear.
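If the quoted path still trips Hadoop's scripts, another commonly used workaround is pointing JAVA_HOME at the 8.3 short name so no space appears at all. A sketch only, assuming 8.3 names are enabled on your volume (they can be turned off) and an illustrative JDK folder name:

:: Discover the short name for "Program Files":
dir /x C:\ | findstr /c:"Program Files"
:: Typically it is PROGRA~1, so:
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_202
hdfs namenode -format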
@@RiteshSingh-zb5ss Thank you so much for watching this video, dear, and happy to know that you managed to configure it successfully. Try to use localhost:50070/ and let me know if you can open it. This link is also given in the video description, dear. Please do subscribe to my channel to encourage me.
@@RiteshSingh-zb5ss That's good progress. What error is prompted in the web browser on opening the cluster? Please do subscribe to encourage me, dear.
Before executing hdfs namenode -format, you must execute the command rmdir /S "C:\hadoop\data\datanode"; otherwise you will get the error: Incompatible clusterIDs in C:\hadoop\data\datanode: namenode clusterID = CID-9ffe06b7-e903-46f6-a3fe-508ccc12e155; datanode clusterID = CID-d349
Thank you for watching my video. Apologies for the delayed response. I hope you were able to resolve the error. If not, here's a quick guide to help you. The "Incompatible clusterIDs" error occurs when there is a mismatch between the NameNode's and DataNode's cluster IDs. This typically happens if Hadoop is reinstalled or the namenode -format command is run again without resetting the DataNode. To resolve the incompatible clusterIDs error, you should delete the contents of the DataNode directory before formatting the NameNode.
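Putting the two comments above together, the full reset sequence would look roughly like this. A sketch only: the C:\hadoop\data\datanode path follows this video's layout, and this wipes the DataNode's data, so use it only if you don't need what is stored there:

stop-all.cmd
rmdir /S /Q C:\hadoop\data\datanode
hdfs namenode -format
start-all.cmd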
@Priya Here you can find winutils under the bin folder of the link shared here (this link is attached in the video description as well): drive.google.com/drive/folders/1WTSipe8W8OXkpFPDVzhI71otRvh4htmP?usp=sharing
Sorry to know that you are facing issues. It could be an incorrect library path: the Hadoop configuration does not point to the directory containing the native libraries. Check the environment variable HADOOP_HOME. Or it might be a 32-bit vs 64-bit issue. Let me know if this helped you, dear. Please subscribe to my channel to encourage me.
Hi sir, at times I'm getting all of them (the namenode, datanode, everything), but sometimes I'm not getting those, and sometimes I'm getting them twice. Can you please help me with that?
Dear Meghana. Good to know that you managed to install and configure successfully. Regarding your issue of the namenode and datanode showing twice: this could be due to running multiple instances of the same service. Use commands like 'netstat' to check for port conflicts. In addition, when the datanode sometimes does not show, it could be due to a lack of space to allocate to the datanode at that point in time. Hope you got me well, dear. If you find this video helpful, please do subscribe to my channel so that you can get notifications for upcoming interesting videos on big data and other tech. Thank you
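A sketch of that netstat check — the ports follow this video's configuration, and the PID in the tasklist line is only a placeholder for whatever netstat prints in its last column:

netstat -ano | findstr ":9000 :9870 :8088"
:: Note the PID at the end of each line, then see which process owns it:
tasklist /FI "PID eq 1234"

If a Hadoop port is already owned by a stale process, stop all the daemons, end that process, and start again.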
@@kundankumar011 What can I do to get the datanode at that time, sir? Should I delete Hadoop completely and download it again? Because at times it shows and at times it doesn't.
Thank you so much for watching this video. There could be many reasons: 1) Make sure Java is installed correctly on your system and that the JAVA_HOME environment variable is set properly. 2) Make sure the Hadoop environment is set up well: if you're running "jps" outside of the Hadoop environment, or if Hadoop is not properly configured, it may not recognize the Hadoop processes, etc.
Thank you so much for watching this video. If the Hadoop DataNode is not working or not starting, it could be due to several reasons, including misconfiguration, resource issues, or problems with the underlying storage. To troubleshoot and fix the issue, please check that the datanode path is specified correctly in the "hdfs-site.xml" configuration file. In addition, check that you have sufficient space. You can also try to restart the computer, start all the daemons, and see if that worked. If possible, please do subscribe to my channel to encourage me.
@Swetha Thank you for watching my video ❤️. Yes, it's true that you'll receive a .tar.gz file. However, in the same video, I've demonstrated the process of unzipping this downloaded file using the WinRAR application. To unzip the hadoop.tar.gz file, please make sure to install WinRAR; you can visit this site: www.win-rar.com/download.html?&L=0 . If you like my video, please do subscribe to my channel, press the bell icon, like, and share. If you still have a challenge, please let me know; I will be happy to guide you.
Thank you so much for watching my video, and sorry for the delay in replying. If you are still facing issues: the error you are encountering indicates a mismatch between the ClusterID of the NameNode and DataNode. One of the steps to fix it is to clear the DataNode data directory. If you're okay with clearing the DataNode's data and starting fresh, you can delete the DataNode storage directory and restart the DataNode. This will cause the DataNode to re-register with the NameNode using the current ClusterID. Then reformat the namenode and restart the daemons. I hope this will help you; otherwise, let me know, dear.
Thank you so much for watching my video. I think you are talking about changing the destination folder to install Java? If yes, the system should allow you to change it. If not, no worry: you can choose the default destination and then use that same destination path when setting up the environment variable, dear. Just continue watching the video. Let me know if this helped you, and encourage me by subscribing to my channel, dear ❤️
ERROR namenode.NameNode: Failed to start namenode. java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed. How do I fix this?
The "java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed" error typically occurs when Hadoop is trying to use secure mode (Kerberos authentication) without proper configuration or permissions. Here are some ways to address this issue: ================================================= Steps to Fix the UnsupportedOperationException in Hadoop: ================================================= 1. Disable Hadoop Security (if not needed) - If you’re not using secure mode (Kerberos authentication) for Hadoop, you can disable security in your configuration files. In your `core-site.xml` file, set the following property:
<property>
  <name>hadoop.security.authentication</name>
  <value>simple</value>
</property>
- This setting ensures that Hadoop operates in non-secure mode, which should prevent the `getSubject` error.
2. Check Hadoop Configuration Files
- Ensure that `core-site.xml` and `hdfs-site.xml` do not contain configurations that imply secure mode (Kerberos).
- Verify the following configuration in `core-site.xml`:
<property>
  <name>hadoop.security.authorization</name>
  <value>false</value>
</property>
- Make sure no settings related to `hadoop.security.authentication` are set to `kerberos` unless Kerberos is actually configured and available.
3. Run Hadoop Without a Security Manager
- If you are testing locally or have no specific requirement for security, you can run Hadoop without a security manager.
- Set the following in `hadoop-env.sh` (on Windows, use `set HADOOP_OPTS=...` in `hadoop-env.cmd` instead): export HADOOP_OPTS="$HADOOP_OPTS -Djava.security.manager="
- This will start Hadoop without a security manager, avoiding the `getSubject` call.
4. Ensure Proper Permissions (If Using Kerberos)
- If you require Kerberos, confirm that all security-related configurations are correctly set up, including:
  - `hadoop.security.authentication` set to `kerberos`.
  - Kerberos principal and keytab files correctly specified.
  - The necessary permissions and access control lists (ACLs) configured in `hdfs-site.xml`.
5. Restart Hadoop
- After making the changes above, restart the Hadoop services to apply the new configuration:
stop-dfs.sh
stop-yarn.sh
start-dfs.sh
start-yarn.sh
Let me know if this resolves the issue or if there’s more detail from the error logs!
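One extra note, as an assumption on my side rather than something from the video: this exact exception is known to appear when Hadoop runs on newer JDKs (18 and later), where Subject.getSubject refuses to run unless the JVM is started with the security manager explicitly allowed. A commonly cited workaround on Windows is:

@rem In hadoop-env.cmd — a sketch; verify against your JDK and Hadoop versions:
set HADOOP_OPTS=%HADOOP_OPTS% -Djava.security.manager=allow

If that applies to you, downgrading to JDK 8, which this video uses, is the simpler fix.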
Watched a lot of videos, but yours worked the best. You even dealt with the Program Files space error; many videos don't even deal with that. Awesome work!
Glad to know that this video was so helpful to you ❤️. Thank you for appreciating the video. Please do subscribe to my channel and encourage me.
Excellent, I followed the steps and it worked very well for me. Thank you so much
Pleasure to know that you followed the steps and it worked very well. ❤️ Please encourage me by subscribing to my channel, dear.
I had been trying to install Hadoop for a week. Finally it works now. Thank you so much, sir, for this video 🥺🩷🩷
I am happy to know that this video was very helpful for you to install Hadoop successfully ❤️. Please subscribe to my channel if you haven't yet, and encourage me.
Thank you so much for subscribing to my channel 💖💖💖
Thank you, followed the steps and it worked as expected.
@@sivashankar47 Glad to know that it is working for you as expected ❤️ Keep learning and enjoy. Please do subscribe to my channel to encourage me and to not miss any notifications on upcoming relevant videos.
Awesome! After many tutorials I couldn't do it, but now I made it in one go. Thank you, sir!
@@SaiNirmalReddyGavini-k2o I am glad to know that this tutorial was helpful for you as well ❤️. You're welcome, dear Nirmal Reddy. I hope you have subscribed to my channel.
After wasting one whole day, I found this video ❤️ It helped me with the installation. Thank you 💯
@@kids_toon9 Glad to know this video also helped you ❤️. Please subscribe to my channel to encourage me, dear.
Super, hats off to you, sir!
Thank you so much for the appreciation, dear Ajay. I hope you've subscribed to my channel to encourage me.
Thank you bro!! This is the best video that explains how to install Hadoop.
Thank you so much for the appreciation, dear 💗. It motivates me. I hope you subscribed to my channel and pressed the bell 🔔 icon so that you don't miss notifications on upcoming relevant videos.
This video is the most complete that I've seen on YouTube. Excellent video and explanation; you did a great job, and it is the best tutorial.
@@yercoarancibia3366 Thank you so much for appreciating this Hadoop installation tutorial. It motivates me. I hope you subscribed to my channel, dear.
Thank you. Awesome video!!! The best video to set up Hadoop.
The pleasure is all mine ❤️. Glad to know that this video is awesome to you as well. I know you didn't forget to subscribe to my channel to encourage me and to not miss upcoming video notifications.
Thank you so much.... Great video for installing Hadoop... ❤❤
@EvrythingSimple Great to know that this was helpful for you, and thank you so much for appreciating this video. Please do subscribe to my channel and encourage me.
Yes, really, the video is very useful. After seeing multiple different videos, I got the correct answer only here.
@@MohammedSalman-nb2pu I am glad to know that this video was very helpful for you and that you got your answers here ❤️. Please do subscribe to my channel to encourage me, dear, and to get notifications for upcoming relevant videos.
Well done, Kumar ...
It was so useful.
Glad to know that this video was helpful to you as well ❤️ Thank you so much for subscribing to my channel. 🙌
Thank you so much. You explain everything in detail, which helped me a lot.
I'm glad to know that this was very helpful to you ❤️ If you have not yet subscribed to my channel, please do subscribe and encourage me, dear. 🙏
Thank you so much, sir. Your video really helped me install Hadoop successfully.
@@PrabhatSingh-tf6km Glad to know that this video was very helpful to you as well in installing Hadoop successfully ❤️ Please subscribe to my channel, dear, to encourage me, if you haven't yet 🙌
@@kundankumar011 already did sir 👍
@@PrabhatSingh-tf6km pleasure ❤️
May God bless you, sir. It worked just fine.
Pleasure to know ❤️
If you have not yet subscribed to my channel, please do subscribe. Your subscription will motivate me to make more videos ❤️
You helped me a lot, thank you! I hope you will make more videos like this.
Pleasure to know that this video is helpful for you. Yes, sure, I will keep trying to make such helpful videos. I kindly request you to subscribe to my channel if you haven't yet. ❤️
sir thank you so much!!!🙏🙏🙏🙏🙏
Pleasure, dear ❤️. Please subscribe to my channel to encourage me.
Thank you for the instructions sir❤❤
Pleasure to know, dear @Sanjay. If you have not subscribed to my channel, please do subscribe to motivate me ❤️
you saved my life❤
@@mohamedmoawad9670 Glad to know that it was very helpful for you and saved you lots of time. Enjoy your learning, dear ❤️ I hope you subscribe to my channel to encourage me and to not miss notifications on upcoming relevant videos.
Very helpful video. Works perfectly. It saved me a lot of time. Thanks a lot
Pleasure to know that this video was very helpful to you. Please do subscribe to my channel and encourage me ❤️
Thanks for the steps
Pleasure dear
Thank u sir really helpful
Glad to know that it was very helpful to you as well. I hope you didn't forget to subscribe, to receive notifications for upcoming relevant videos.
This is very wonderful and helpful.
Thank you so much Jackson ❤️
Perfect, my man! Keep it up.
Thank you so much, Abdullah, for motivating me ❤️ I hope you didn't forget to subscribe to my channel.
@@kundankumar011 Already done. And I was wondering: when I try to upload a dataset into HDFS, I see an error, "Couldn't upload the file jfk_weather_cleaned.csv.". So have you made the next video you mentioned at the end of this video, so that I would be better able to get insights on it? Regards
@@AbdullahKhan-pl7py Dear Abdullah. You can upload the file to HDFS using the command line or the GUI interface. At the end of the video, at 22:50, I just showed an interface for how to upload files to HDFS. May I know what the error is?
@@kundankumar011 "Couldn't upload the file Perecipitationhourly.csv." — that is the error which comes up whenever I try to upload it. BTW, thanks for cooperating; your reply is highly appreciated, man.
@@AbdullahKhan-pl7py Thank you so much for the appreciation. I will look into this and get back to you dear.
Thank you for this video; it helped me a lot.
Dear Salah. It's a pleasure to know that this video was very helpful for you ❤️. If you have not yet subscribed to my channel, please do subscribe and encourage me 🙌
@@kundankumar011 I already did 👌
Sir, the "open in the editor" step you did at 8:10 is not working for me; it is not opening in the editor. What to do? Please help.
Sorry to know that you are facing issues. I used the Sublime editor, dear. You can use any editor, even Notepad, dear. Let me know if you are asking about anything else. Please subscribe to my channel to encourage me, dear.
@@kundankumar011 Sir, I pinged you on LinkedIn, sir.
The issue has been resolved, but the thing is that my C drive was full, so I did all those steps on the D drive.
All the paths have been specified correctly by me, but when you asked us to download the msvcr120.dll file and copy it into the C drive's System32 folder, it created an error:
after this, when I opened CMD as an administrator and wrote the command hdfs namenode -format, it showed "the system cannot find the path specified".
Sir, kindly guide me on what I can do now; I am stuck so badly.
@@kundankumar011 Is it mandatory to save that msvcr120.dll file into System32? What will happen if we did all these steps on the D drive instead of the C drive? Sir, please guide me.
@DashingData66666 It depends on where your System32 folder is, dear. It should work.
@kundankumar011 in C drive
Helpful man!
Glad to know that it helped you as well. ❤ Please do subscribe to my channel if you have not yet.
Thank you so much. It's very useful for me.
Pleasure to know that this video was so useful for you ❤️. If you haven't subscribed to my channel yet, please do subscribe and press the bell icon so you don't miss the next videos.
Thank you very much; it is a very useful video.
If possible, could you please do one more video on how to install Apache Spark on top of this Hadoop environment, and how to integrate Hadoop, Hive, and Spark?
@Sathish Thank you so much for appreciating this video tutorial and subscribing to my channel ❤️. Let me work on your request for Hadoop, Spark, and Hive integration as early as possible and upload it. 👍
Hi dear Satish. I hope you are learning well. I am sorry for the delay in making the video on Hive that you requested long ago, but I was quite busy on different projects. Now I have made a video on easy Hive installation for Hadoop. You can find it here: th-cam.com/video/CL6t2W8YC1Y/w-d-xo.htmlsi=Gx_jMbUZK6vxy76t
After installing WinRAR, it is showing only "Console RAR manual", "What is new in the latest version", "WinRAR help", etc., but not the Hadoop file. Help me.
Thank you so much for watching my video; it's always a pleasure to help. Here is the link to download the latest WinRAR, dear: www.win-rar.com/download.html?&L=0 . If you have not yet subscribed to my channel, please do subscribe. Your subscription is motivation for me. ❤️
@@kundankumar011 It's still showing the same.
@@rt7687 After installing WinRAR, extract the hadoop.tar file; then you will find the extracted Hadoop folder. Follow the instructions in the video on how to extract the hadoop.tar file and the rest.
@@kundankumar011 thank you
Sir, for me that port number is not showing Hadoop, but an 8000-series port is showing the NameNode details. How do I get to know which port will display the Hadoop information?
Hello dear. Thank you for watching my video and asking a question. Did you try localhost:9870?
Ports: the port information is indeed specified in the Hadoop configuration files. Here's how to find the port information in those files:
In the core-site.xml file, look for the fs.defaultFS property. It specifies the default filesystem name, which includes the hostname and port. For example:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode-hostname:9000</value>
</property>
Hope you got me well, dear. If you like my video, please do subscribe to my channel to motivate me ❤️
@@kundankumar011 Thanks for replying, sir. I tried with port 9000 also. Is there any way to know which port Hadoop is working on?
its working thx
@@Work-q8g cool. Enjoy learning Hadoop
@@Work-q8g Please subscribe to my channel, dear. Your subscription will motivate me ❤️
Thank you so much sir ji
Pleasure, dear ❤️ Please subscribe to my channel to motivate me.
Sir, under the jps command my DataNode is not running.
Thank you so much for watching my video. I hope you didn't forget to format the namenode and datanode folders; in the same video there are instructions on how to format them. In addition, if this video was really helpful for you, please do subscribe to my channel, dear. Your subscription is my motivation ❤️
is it running now?
@@SatyamSingh-ll7ib yeah its running smoothly
Thank you sir 😊
Pleasure Dear Shiva ❤️
Please do subscribe to my channel 🥰
'C:\Program' is not recognized as an internal or external command,
operable program or batch file.
hadoop version error
I am facing the same issue; were you able to solve it? Please help.
It works well on Win 11; none of the errors occurred.
Glad to know it's working well dear.
What size of RAM do you need? Afterwards, can you show us the installation of Kerberos and Warden?
I missed your comment earlier, and I apologize for that. I'll make sure to prioritize working on the requested video. Regarding Hadoop, 8 GB of RAM is sufficient, but it depends on different factors, like what size of datasets you have; upgrading to 16 GB of RAM would indeed improve performance.
Sir, my DataNode is not displaying in jps. What should I do?
Sorry for the late response, dear Akash; I missed your comment.
1) Check the configuration files: verify that your hdfs-site.xml and core-site.xml configuration files are correctly set up. Pay attention to the dfs.datanode.data.dir property in hdfs-site.xml, which defines the directory where the DataNode stores its data. Ensure the paths are correct and accessible.
2) I hope the Java version is OK for you.
3) Format the namenode before starting the daemons.
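For reference, a minimal hdfs-site.xml entry for that property might look like the snippet below. The path is an assumption that follows this video's C:\hadoop\data layout; point it at your own datanode folder:

<property>
  <name>dfs.datanode.data.dir</name>
  <value>C:\hadoop\data\datanode</value>
</property>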
Good morning sir
It is not showing any localhost in my Google, sir. What can I do? Please help, sir.
Type localhost in the address bar of the web browser, dear, not in Google. See in the video how I access it.
@@kundankumar011 Sir, I have tried in the address bar only; the Hadoop cluster page is working, but 50070 is not working.
@@Akhileshhgaming Cross-check the hdfs-site.xml file; this content should be there:
<property>
  <name>dfs.namenode.http-address</name>
  <value>localhost:50070</value>
</property>
Also check that the core-site.xml file has this:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>
Let me know if it worked.
I am facing the following error; please help, sir:
The system cannot find the path specified.
Error: JAVA_HOME is incorrectly set.
Please update C:\hadoop\hadoop-3.4.0\etc\hadoop\hadoop-env.cmd
'-Dhadoop.security.logger' is not recognized as an internal or external command,
operable program or batch file.
Thank you
Welcome ❤️ and thank you so much for watching this video. Please do subscribe to my channel to offer support and encouragement.
Okay @@kundankumar011
Sir, I got all 5 entries in jps, but I am not able to open in the browser what you have said. What to do, sir? Please help.
@@Akhileshhgaming I am glad to know that you also managed to configure it successfully ❤️👍. I think you are trying to ask how to open the Hadoop GUI; see the video description, where I have shared the URL to access the Hadoop GUI, dear. If I mistook you, let me know.
There you have written "daemons", sir; how do I start that?
@@Akhileshhgaming I have guided this in the video, dear. Let me share it here: open Command Prompt as an administrator, then run the command start-all.cmd
@@kundankumar011 I have done that, sir: start-all.cmd
What should I do next, sir?
jps is not working.
Thank you for watching this video. I hope you have configured the files properly. Afterwards, try using start-all.cmd to initiate all the daemons. If you are still encountering issues, please provide more information so I can assist you more effectively. Please do subscribe to my channel to motivate me.
Which editor did you use for the hadoop-env dragging step? ...Name, please?
@@VIJAYVARDHAN_V_PATIL Thank you so much for watching this video. I hope it's been helpful for you as well so far. I used the "Sublime" editor, but feel free to use any of your choice, dear. Please do subscribe to my channel, dear, and encourage me 💖
What does it mean if the data node is not showing at 22:03?
Thank you so much for watching my video; I hope it was helpful, and sorry for the late reply. Please cross-check that the DataNode is configured correctly in Hadoop's configuration files:
1) core-site.xml: check that the fs.defaultFS property points to the correct NameNode address.
2) hdfs-site.xml: verify that the directories for DataNode storage are set and accessible.
Let me know if this guide helped you. In addition, I hope you didn't forget to subscribe to my channel to encourage me.
Sir, will this tutorial work for Hadoop 2.7?
Thank you so much, dear, for reaching out to clear your doubt on the Hadoop installation. We actually follow the same steps for installing any version of Hadoop so far, so this video should help you configure it. If not, let me know what issues occur.
For me, the YARN manager and NameNode manager are not running 😞
Hi dear. Thank you for watching my video. I hope you didn't forget to format the namenode and datanode using the command line?
Thank you from the bottom of my heart for your beautiful explanation.
Pleasure to know, Heydar, that you like my explanation 💖. Please do subscribe to my channel if you haven't yet.
If you have not yet subscribed to my channel, please do subscribe, dear. ❤️
Sir, after running start-all.cmd, it said it is not recognized as an internal or external command, operable program or batch file. How to fix this? 😢
After navigating to C:\hadoop\sbin, it works, but with C:\Windows\System32, doesn’t work…
Sorry to know that you are facing issues. From your error, it's clear that you have not set the environment variables/Path correctly. Try to see if you made some mistakes while setting the Path, dear. It is shown in this video.
Please do subscribe to my channel, dear, to encourage me ❤️
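Since start-all.cmd works from C:\hadoop\sbin but not from elsewhere, the Path is almost certainly missing Hadoop's script folders. A quick check, assuming this video's C:\hadoop layout:

echo %HADOOP_HOME%
echo %PATH%
:: PATH should contain both of these entries:
::   %HADOOP_HOME%\bin
::   %HADOOP_HOME%\sbin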
@@kundankumar011 thank you
@@vivian8667 pleasure
When I am clicking on winutils I am getting an error: "the application was unable to start correctly". Please help.
Hi dear. Thank you for watching this video of mine; I hope it was helpful. If I am getting you well, try to download the winutils file from the resources given in my video description and paste it into your Hadoop setup as guided in the video, then see if you still get an error. ❤️ If you have not subscribed to my channel, please do subscribe, like, and share. Your subscription will motivate me more 🙏
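Two things worth checking here, offered as assumptions rather than a confirmed diagnosis: winutils.exe must sit in Hadoop's bin folder, and that particular "unable to start correctly" pop-up is often the missing msvcr120.dll runtime that this video covers copying into System32:

echo %HADOOP_HOME%
dir %HADOOP_HOME%\bin\winutils.exe
dir C:\Windows\System32\msvcr120.dll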
Can Java 23 support Hadoop 3.4.0?
Thank you so much for watching this video. As per my knowledge, Java 8 is more compatible and stable, dear. Let me know if Java 23 is working fine for you. I believe you didn't forget to subscribe to my channel to encourage me.
When I give the jps command, it's not showing the daemons.
Hello dear. Make sure you have started the Hadoop daemons.
@@kundankumar011 yes I started
@@DeepaDeepa-yb6bw If you started the daemons and they are still not showing, check the following:
1) Check the configuration: ensure that your Hadoop configuration files (core-site.xml, hdfs-site.xml, and yarn-site.xml) are correctly configured with the right settings. Pay attention to file paths, ports, and other configurations.
2) Hadoop formatting: if this is a fresh installation, make sure you've formatted the HDFS filesystem using the hdfs namenode -format command before starting the daemons.
If you really like my video and found it helpful, please do subscribe to my channel, dear.
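For comparison, after a healthy start-all.cmd, jps should list all four daemons plus itself, something like the output below (the PIDs are illustrative, not real values):

C:\>jps
9204 NameNode
11876 DataNode
8992 ResourceManager
10524 NodeManager
12288 Jps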
Hello Sir. Thank you for this video, but I am facing this error: "Error: Could not find or load main class Dev
Caused by: java.lang.ClassNotFoundException: Dev" when I run the hdfs namenode -format command. And my computer name is Malaperi Dev. Thank you
Thank you so much for watching this video. Sorry about the error prompted. I am traveling; give me a few minutes to settle down, and then I will help you fix this error. I hope that's OK with you, dear. If you have not subscribed to my channel yet, please do subscribe and encourage me. I will get back to you soon ❤️
Hello dear, there could be many reasons causing this error. I am assuming you have configured all the Hadoop files correctly and set up the environment variables. Please try this first:
I noticed there is a space in your computer name, "Malaperi Dev." Rename your computer to "MalaperiDev" without any spaces; make sure there are no spaces in the name. After renaming, restart your computer and check if the error is resolved. If not, let me know what the error is again.
Still negative. I renamed it from Malaperi Dev to kisvan, but still the same error with that "Dev" in front.
@@kisivaniAIntelligence Ensure the Hadoop and Java installation paths do not contain spaces, and also review your Hadoop configuration files (core-site.xml, hdfs-site.xml) for any paths or classpath entries that reference the computer name directly.
If I run this command: >start-all, it runs right away; but if and only if I run that namenode -format, it brings me the same error.
working
thankyouu!
I am glad to know that it's working for you as well ❤️. Enjoy learning, dear Aashit. I hope you subscribed to my channel to encourage me. You can also find a video on Hadoop Commands on my channel, in the Bigdata Essentials playlist.
Hello Sir, the NodeManager is shutting down immediately when I start the Hadoop services, and it is showing this error: Permissions incorrectly set for dir /tmp/hadoop-phile/nm-local-dir/usercache, should be rwxr-xr-x, actual value = rwxrwxr-x. I tried to set the permissions it needs, but it is still shutting down. Also, I am wondering how the tmp folder is related to the Hadoop services, since it is created automatically when they are started. Any hint? Thanks :)
Hello dear. Check the owner of the tmp folder. Who is it? I mean, you are logging into the system as some user and then running Hadoop. Make sure that that owner has all the permissions.
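On Windows you can inspect and adjust those permissions with icacls. A sketch only — the path mirrors the error message above (Hadoop resolves /tmp onto the current drive), and %USERNAME% assumes the same account that starts the daemons:

icacls C:\tmp\hadoop-phile\nm-local-dir\usercache
icacls C:\tmp\hadoop-phile\nm-local-dir\usercache /grant "%USERNAME%:(OI)(CI)F"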
@@kundankumar011 I found that I was using a Microsoft account, which caused the error.
@@PhilemonIransubije congratulations 👏
I got this error when running start-all.cmd in C:\hadoop\sbin (SHUTDOWN_MSG: Shutting down NodeManager), and I was not able to open localhost:8080. Please help me.
Hi dear. I am so sorry for being late to reply to this serious issue; I apologize. It seems like there might be an issue with your Hadoop setup, specifically with the NodeManager shutting down. Check the configuration: ensure that your Hadoop configuration files (core-site.xml, hdfs-site.xml, yarn-site.xml, etc.) are properly configured. Pay special attention to the NodeManager configuration in yarn-site.xml. Let me know if this helped you.
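For reference, the usual baseline NodeManager entries in yarn-site.xml look like the snippet below. This is a sketch of a typical single-node setup, not a dump of the video's exact file:

<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
  <value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>

Also note that the YARN web UI is on port 8088 by default, so try localhost:8088 rather than 8080.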
C:\Windows\System32>hdfs namenode -format
Error: Could not find or load main class Patel
It gives this error.
@@HoneyPatel-w7y Thank you so much for watching my video. Sorry to know that you are facing a "could not find or load main class" error. This error may occur because the JAVA_HOME and HADOOP_HOME classpaths are not set properly, or it could be a compatibility issue between the Java version and the Hadoop version. First, try to cross-check the paths of Java and Hadoop, dear. Let me know if this helped you.
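A quick way to cross-check both paths from a command prompt — hadoop version and hadoop classpath are standard Hadoop CLI commands, and the printed values should match where you actually installed the JDK and Hadoop:

echo %JAVA_HOME%
echo %HADOOP_HOME%
hadoop version
hadoop classpath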
My Hadoop 9870 port is working, but the 8000 port is not working. Help!
Try 8088
Hi Harsh, I am sorry for being late to reply; it just slipped my mind. I hope the 8088 port is working, as @shatrughnpinjari5752 suggested. Please let me know if I can still help you, dear Harsh.
ERROR namenode.FSNamesystem: FSNamesystem initialization failed. How do I fix this?
The "FSNamesystem initialization failed" error in Hadoop’s NameNode usually points to an issue with the Hadoop filesystem (HDFS) setup, which can be due to improper formatting, data corruption, or configuration errors. Here are steps to troubleshoot and resolve this issue:
1. Check if HDFS was formatted correctly. If you're setting up Hadoop for the first time or reconfiguring it, make sure that the NameNode has been formatted. Run this command to format the NameNode (only if this is a new setup or you don’t need the data in HDFS): hdfs namenode -format
2. After formatting, start the Hadoop services again.
Let me know if these steps helped you troubleshoot the error. Please do subscribe to my channel to encourage me, dear.
Can we get on a Google Meet? I have a few queries; I googled but was not able to figure them out.
Thank you so much for watching this video, dear. Please share any queries you have related to it. I will do my best to address them in the comments. If needed, we will find a way to provide technical support. I hope you got me well, dear.
@@kundankumar011 Followed all the steps but unable to launch the localhost via GUI. And I am getting NameNode: Failed to start namenode.
@@beardboyphotography Sorry about the issues, dear. Let me clarify first: when you use the "jps" command to list the running daemons, what does it display?
@@kundankumar011 13944 Jps
All the daemons are working except the DataNode.
I hope you managed to format the namenode and datanode folders before running the daemons. In the same video, instructions are given on how to format the namenode and datanode. Please do subscribe to my channel ❤️
I encountered this problem:
'jps' is not recognized as an internal or external command,
operable program or batch file
Can you please provide the solution?
Thank you so much for watching my video. The error "'jps' is not recognized as an internal or external command, operable program or batch file" typically occurs because the Java Development Kit (JDK) tools (like jps) are not properly set up, or the JAVA_HOME environment variable is not configured correctly. Please follow these steps to fix it:
1) Check if JDK is Installed: Open a terminal or command prompt and type the following command to check the installed version of Java:
java -version
2) Ensure the JDK's bin folder is included in the PATH environment variable (see the check below)
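For example, from a command prompt (a minimal check; the set line only fixes the current session, so make the change permanent in the Environment Variables dialog):
echo %JAVA_HOME%
where jps
REM If 'where jps' finds nothing, add the JDK bin folder to PATH for this session:
set "PATH=%JAVA_HOME%\bin;%PATH%"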
Please do subscribe to my channel to encourage me, dear, and don't forget to press the bell icon to receive notifications on upcoming relevant videos.
@@wizardop2100 Hello dear, please encourage me by subscribing to my channel.
Hi sir,
everything is fine, but when I typed hdfs namenode -format I got an error like "couldn't find or load main class".
Please help.
@@mounikagumpu3960 Hello dear. Thank you so much for watching this video. Sorry to know that you are facing issues. Ensure that the JAVA_HOME and HADOOP_HOME environment variables are set properly.
@@kundankumar011 Should the "h" in the Hadoop folder name be capital or small, sir?
@@mounikagumpu3960 I think you are talking about the hadoop folder where you extracted all the components. Better to keep it lowercase, dear: hadoop with a small 'h'. But ensure that you set up the JAVA_HOME and HADOOP_HOME environment variables as guided in this video. Hope that is clear, dear.
@@kundankumar011 Everything is fine, sir, but I am still getting that error.
@@mounikagumpu3960 Make sure there are no stray commas or similar characters in the environment variable values. If all is well, just restart your computer and see; sometimes it helps, funny as that sounds 😂
Sir, it's showing "couldn't find or load main class".
Hello Bhavana, I appreciate your time watching my video. Could you provide more details? That will help me guide you in fixing the error.
Hi Bhavana. Did you cross-check that the Hadoop classpath is set correctly?
Hi Bhavana. If you like my video and it's helpful for you, please do subscribe and press the bell icon to receive a notification on the next video upload. ❤️
I got everything working, but in cmd when I type jps it says the command is not recognised.
Thank you so much for watching my video, and sorry you are facing this issue. The error "'jps' is not recognized" indicates that the Java command jps (Java Virtual Machine Process Status Tool) is not found in your system's PATH. Please verify your Java installation, and cross-check that the Java path and JAVA_HOME are set in the environment variables. Let me know if this fixed your error.
I can't find the etc folder inside hadoop... can someone please help? I am also not able to find the hadoop-env file to change the Java path.
Hi Mayank. Thank you for watching this video. Sorry to know that you cannot find the etc folder in Hadoop. Ensure that you downloaded Hadoop successfully and extracted the Hadoop archive without any issues.
At first, when I installed it, everything was fine and all the environment variables were correct, but now I am facing an issue with jps: after entering start-all.cmd I get only Jps, with no NameNode, DataNode, ResourceManager, or NodeManager. Can you please help, sir?
Thank you so much for watching this video, and happy to know that you had successfully configured it. Unfortunately it has now stopped working. Since you say all the environment variables are set well, to fix this you must check the log files to see what causes the error. Review the logs for any errors or issues with starting the daemons; they are typically located in $HADOOP_HOME/logs (see the example below). Please do subscribe to my channel if you have not yet, to encourage me and not to miss notifications on upcoming relevant videos.
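For example, a quick way to scan those logs from a command prompt (assuming the default log location under your Hadoop folder):
cd /d %HADOOP_HOME%\logs
findstr /i "error fatal exception" *.log
The lines findstr prints usually name the exact daemon and the cause of the failure.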
Hi sir, I followed every step you advised, but after running hdfs namenode -format I'm getting an error: "big" is not recognised and "classpatch" is not recognised.
@@GlobalTalesChronicles Thank you so much for watching this video, dear. Sorry to know that you are getting errors while formatting the namenode. Where is the word "big" coming from? Can you copy-paste the whole error?
Won't this work with the JDK 17 or 21 versions?
@@ZC-d8s Thank you so much for watching this video. Yes, it should work, but JDK 8 is more stable and compatible with Hadoop. Please do subscribe to my channel to encourage me, dear, and not to miss notifications for relevant videos.
@kundankumar011 On which port do we need to check to know that it is really working?
@@ZC-d8s Try this, dear; it is given in the video description: localhost:50070/
@@ZC-d8s After successful installation of Hadoop, you can watch another video on Hadoop commands on my channel: th-cam.com/video/YurtxHSwkdU/w-d-xo.htmlsi=IevzPFeOtbsFUhNs
@@ZC-d8s Please do subscribe to my channel, dear.
When I execute the command "start-all.cmd" and type the command "jps", at the beginning all the daemons are there (NodeManager, NameNode, ResourceManager, DataNode), but after a moment, when I retype "jps", I see only two daemons (ResourceManager and DataNode). Is this normal? Thank you.
@@amazighkab9904 Basically, the NameNode and NodeManager can sometimes stop automatically due to a lack of disk space or memory, but it should not happen every time, dear.
ERROR namenode.FSNamesystem: FSNamesystem initialization failed.
Thank you for watching my video. The "FSNamesystem initialization failed" error in Hadoop’s NameNode usually points to an issue with the Hadoop filesystem (HDFS) setup, which can be due to improper formatting, data corruption, or configuration errors. Here are steps to troubleshoot and resolve this issue.
1. Check if HDFS Was Formatted Correctly
If you're setting up Hadoop for the first time or reconfiguring it, make sure that the NameNode has been correctly formatted.
Run this command to format the NameNode (only if this is a new setup or you don’t need the data in HDFS): hdfs namenode -format
2. After formatting, start Hadoop services again
Let me know if this step helped you fix the error, dear. Please don't forget to subscribe to my channel to encourage me, and press the bell icon so you don't miss notifications on upcoming relevant videos.
Sir, sometimes the datanode is missing and sometimes the namenode is missing, and every time I try to format I am asked: Re-format filesystem in Storage Directory root= C:\hadoop\data\namenode; location= null ? (Y or N)
@@gayathribetha9880 Hi dear. Choose Y.
@@kundankumar011 Even after that, the datanode is missing, sir. What can I do? It shows once, right after I format, but when I run it again, or check in the GUI, the datanode info is not there, sir.
Can I install Hadoop on Windows with 4 GB of RAM?
@DeepaDeepa Yes, you can install Hadoop on a Windows system with 4 GB of RAM, but it's important to note that Hadoop is a resource-intensive framework, and running it on a machine with limited RAM may lead to performance issues or even failures for certain workloads. It's recommended to have more RAM for better performance and stability, especially if you plan to work with large datasets. You may encounter memory-related problems with only 4 GB of RAM, so be prepared for potential limitations. Hope that answers your question. If you like my videos, please do subscribe to my channel.
JAVA_HOME is incorrectly set.
Please update C:\hadoop\etc\hadoop\hadoop-env.cmd
'-Dhadoop.security.logger' is not recognized as an internal or external command,
operable program or batch file.
I checked everything, but it still shows the same.
The error message "JAVA_HOME is incorrectly set" indicates that Hadoop is unable to locate the Java installation required to run. This is a common issue and can be resolved by correctly setting the JAVA_HOME environment variable in the Hadoop configuration files. Please navigate to C:\hadoop\etc\hadoop\hadoop-env.cmd, open the hadoop-env.cmd file in a text editor (e.g., Notepad), and find the line that looks like this:
set JAVA_HOME=%JAVA_HOME%
Replace the existing path with the correct path to your Java installation directory. It should look something like this:
set JAVA_HOME=C:\Java\jdk
You can rewind this video to the 8:00 minute mark to see how I set the Java path in the Hadoop configuration files.
This path is based on my computer; ensure that you set it based on your own Java installation path. Let me know if it helped you. Please don't forget to subscribe to my channel, dear, to encourage me.
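One extra caveat worth adding: if your JDK sits under C:\Program Files, the space in the path often breaks hadoop-env.cmd even when quoted. A common workaround (a sketch; the version folder below is just an example, use your own) is the 8.3 short name:
REM Check the actual short name first; it is usually PROGRA~1, but can vary:
dir /x C:\
REM Then, in hadoop-env.cmd:
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_202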
Hello sir, everything works fine except for the datanode daemon. It stops running with a SHUTDOWN message.
Thank you for watching my video, dear Abhinav. Regarding the datanode shutting down automatically after starting in your Hadoop installation, there could be several reasons:
1) Verify that the user running the DataNode has the necessary permissions to read and write to the data directories specified in hdfs-site.xml. Right-click on the datanode folder and give full-access permissions (Read, Write and Execute); see the icacls example after these steps.
Then simply restart the computer, run all the daemons, and see whether it worked. If not, check the logs.
2) Check the logs: look at the DataNode logs for any error messages. The logs are typically located in the Hadoop logs directory; check hadoop/logs/hadoop-hduser-datanode-.log for relevant information.
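For step 1, a one-line alternative to the right-click dialog, run from an elevated command prompt (assuming the datanode folder used in this video):
REM Grant the current user full, inherited access to the datanode folder:
icacls "C:\hadoop\data\datanode" /grant "%USERNAME%":(OI)(CI)F /T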
Let me know if this helped you fix the issue.
Sir, there is an error showing:
Usage Error: Unrecognized or legacy configuration settings found: dir - run "yarn config -v" to see the list of settings supported in Yarn (in )
$ yarn run [--inspect] [--inspect-brk] [-T,--top-level] [-B,--binaries-only] [--require #0] ...
C:\Windows\System32>
Localhost 8088 is not working
Apologies for the late reply, dear. If localhost:8088 (the YARN ResourceManager web interface) is not working, it typically indicates that YARN is either not running correctly or there are configuration issues preventing access. Cross-check the YARN configuration file yarn-site.xml (see the snippet below), and also check your firewall settings in case the port is blocked.
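For reference, the ResourceManager web UI address can also be pinned explicitly in yarn-site.xml (a minimal sketch, not a complete file; adjust to your setup):
<property>
  <!-- Bind the ResourceManager web UI on port 8088 on all interfaces -->
  <name>yarn.resourcemanager.webapp.address</name>
  <value>0.0.0.0:8088</value>
</property>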
thanks sir
Thank you so much, Aditayan ❤️. If you like my video, please do subscribe to my channel; your subscription will motivate me.
localhost doesn't work. What should I do?
Sorry to know that you are facing a challenge with localhost not working. This could be due to various reasons, dear:
1) Check whether the Hadoop daemons (NameNode, ResourceManager) are still running, using the jps command, while accessing localhost.
2) Cross-check whether the Hadoop configuration files are set properly.
3) Check the firewall settings. Sometimes the firewall blocks ports like 8080, 9000, 50070, etc.; if so, change the settings to allow them.
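A quick way to check points 1 and 3 together from a command prompt (the ports here are common defaults; yours may differ):
jps
netstat -ano | findstr "9870 8088 50070"
If a port shows up in the netstat output, the daemon is listening and the problem is more likely on the browser or firewall side.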
Please do subscribe to my channel to encourage me, dear, and not to miss upcoming relevant videos. Thank you ❤️
@@فروتي-ي5ش Hello dear. Hope you managed to fix the issue. Please don't forget to subscribe to my channel to encourage me.
The NodeManager and ResourceManager are shutting down immediately; jps shows only:
2820 Jps
10328 NameNode
21484 DataNode
Please help.
Thank you so much for watching this video❤️ Sorry to know that you are facing issues. If the NodeManager and ResourceManager are shutting down immediately after starting, it typically indicates a configuration issue.
Please Validate Configuration Files:
Double-check the following configuration files for any misconfigurations:
"yarn-site.xml"
"core-site.xml"
Ensure that the configuration for yarn.resourcemanager.hostname and the other essential parameters is correct (see the sketch below). If these are all OK, then it could be an issue of insufficient space. Let me know if this helped you, dear. Please do subscribe to my channel and encourage me.
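For a local single-node setup, the relevant part of yarn-site.xml often looks something like this (a sketch only, not a complete file; adjust values to your machine):
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>localhost</value>
</property>
<property>
  <!-- Needed so MapReduce jobs can shuffle data between tasks -->
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>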
C:\Windows\System32>hdfs namenode -format
'C:\Program' is not recognized as an internal or external command,
operable program or batch file.
'-classpath' is not recognized as an internal or external command,
operable program or batch file.
help....
Thank you so much for watching this video. Sorry to know that you are facing this issue. Looking at the error you shared with me: this message indicates that there are spaces in the path to your Java installation directory, causing the command to fail. This is a common issue on Windows when Java is installed in the default directory (C:\Program Files).
Here are the steps to resolve this issue:
1. Set the JAVA_HOME Variable Correctly
Ensure that the JAVA_HOME environment variable is set correctly, and that the value is enclosed in quotes if there are spaces in the path.
2. Set the Environment Variables in the Command Prompt
Alternatively, you can set the environment variables directly in the Command Prompt before running the format command:
set "JAVA_HOME=C:\Program Files\Java\jdk"
Note: change the path based on your JDK installation location. If the error persists even with the quotes, the space in "Program Files" is usually the culprit inside hadoop-env.cmd; pointing JAVA_HOME at the 8.3 short path (for example C:\PROGRA~1\Java\...) typically resolves it.
Let me know if this helped you, dear.
Hi dear. Hope you managed to fix the issue as I guided above. Please do subscribe to my channel and encourage me.
Sir, the localhost cluster page is not opening.
@@RiteshSingh-zb5ss Thank you so much for watching this video, dear, and happy to know that you managed to configure it successfully. Try to use localhost:50070/
Let me know if you can open it. This link is also given in the video description, dear. Please do subscribe to my channel to encourage me.
@@kundankumar011 Sir, it is not opening. The other two sites are opening, but the cluster site is not.
@@RiteshSingh-zb5ss That's good progress. What error is shown in the web browser when opening the cluster page? Please do subscribe to encourage me, dear.
@@kundankumar011 It is showing "site can't be reached", and jps shows only 1856 NameNode, 5248 DataNode, 10228 Jps.
@@kundankumar011 localhost is not working for the cluster, but it is working for the others, like the datanode page.
Before executing hdfs namenode -format, you must execute the command rmdir /S "C:\hadoop\data\datanode"; otherwise you will get the error: Incompatible clusterIDs in C:\hadoop\data\datanode: namenode clusterID = CID-9ffe06b7-e903-46f6-a3fe-508ccc12e155; datanode clusterID = CID-d349
Thank you for watching my video. Apologies for the delayed response. I hope you were able to resolve the error; if not, here's a quick guide to help you. The "Incompatible clusterIDs" error occurs when there is a mismatch between the NameNode's and DataNode's cluster IDs. This typically happens if Hadoop is reinstalled or the namenode -format command is run again without resetting the DataNode. To resolve it, delete the contents of the DataNode directory before formatting the NameNode; the typical command sequence is below.
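Assuming the datanode directory from this video (C:\hadoop\data\datanode), the full reset usually looks like this; note that it deletes the DataNode's data, so only do it on a test setup:
stop-all.cmd
rmdir /S /Q "C:\hadoop\data\datanode"
hdfs namenode -format
start-all.cmd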
Please send a link.
@Priya You can find winutils under the bin folder of the link shared here (this link is attached in the video description as well): drive.google.com/drive/folders/1WTSipe8W8OXkpFPDVzhI71otRvh4htmP?usp=sharing
If you like my video and find it helpful, please subscribe to my channel; your subscription motivates me.
Native library issue
Sorry to know that you are facing issues. It could be an incorrect library path: the Hadoop configuration does not point to the directory containing the native libraries. Check the HADOOP_HOME environment variable. It might also be a 32-bit vs 64-bit mismatch. Let me know if this helped you, dear. Please subscribe to my channel to encourage me.
Hi sir, at times I get all of the NameNode, DataNode, etc., but sometimes I don't, and sometimes I get them twice. Can you please help me with that?
Dear Meghana. Good to know that you managed to install and configure it successfully. Regarding the namenode and datanode showing twice: this could be due to running multiple instances of the same service; use a command like netstat to check for port conflicts. As for the datanode sometimes not showing, it could be due to a lack of space to allocate to the datanode at that point in time. Hope that is clear, dear. If you find this video helpful, please do subscribe to my channel so that you get notifications for upcoming videos on big data and other tech. Thank you.
@@kundankumar011 What can I do to get the datanode back at that time, sir? Should I delete Hadoop completely and download it again? Because at times it shows and at times it doesn't.
Sir, I can't find the 3.2.4 winutils.
@Priya You can find winutils under the bin folder of the link shared here (this link is attached in the video description as well): drive.google.com/drive/folders/1WTSipe8W8OXkpFPDVzhI71otRvh4htmP?usp=sharing
If you like my video and find it helpful, please subscribe to my channel; your subscription motivates me.
Jps error
Thank you so much for watching this video. There could be many reasons:
1) Make sure Java is installed correctly on your system and that the JAVA_HOME environment variable is set properly.
2) Make sure the Hadoop environment is set up well: if you're running "jps" outside of the Hadoop environment, or if Hadoop is not properly configured, it may not recognize the Hadoop processes.
If this video was really helpful, please do subscribe to my channel and encourage me ❤️
Datanode not working
Thank you so much for watching this video. If the Hadoop DataNode is not working or not starting, it could be due to several reasons, including misconfiguration, resource issues, or problems with the underlying storage. To troubleshoot, please check that the datanode path is specified correctly in the hdfs-site.xml configuration file (see the snippet below). In addition, check that you have sufficient disk space. You can also try restarting the computer, starting all the daemons again, and seeing if that works. If possible, please do subscribe to my channel to encourage me.
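For reference, the datanode path mentioned above is the one configured in hdfs-site.xml, typically like this (the path is from this video's setup; use your own):
<property>
  <name>dfs.datanode.data.dir</name>
  <value>C:\hadoop\data\datanode</value>
</property>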
As of now (25.08.2023), when I download Hadoop I don't get a zip file; I get it in .tar.gz format?
@Swetha Thank you for watching my video ❤️. Yes, it's true that you'll receive a .tar.gz file. However, in the same video, I've demonstrated the process of unzipping this downloaded file using the WinRAR application. To unzip the hadoop .tar.gz file, please make sure to install WinRAR; you can visit www.win-rar.com/download.html?&L=0. If you like my video, please do subscribe to my channel, press the bell icon, like, and share. If you still have a challenge, please let me know; I will be happy to guide you.
Hello Dr, I have a problem with Hadoop. I got this error: Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error
Thank you so much for the comment, dear. Can you provide more details? I mean, what activity causes or prompts this error?
The datanode is shutting down. Error:
2024-09-07 15:28:30,891 INFO common.Storage: Lock on C:\tmp\hadoop-pc\dfs\data\in_use.lock acquired by nodename 9528@DESKTOP-TNSAH2H
2024-09-07 15:28:30,891 WARN common.Storage: Failed to add storage directory [DISK]file:/C:/tmp/hadoop-pc/dfs/data
java.io.IOException: Incompatible clusterIDs in C:\tmp\hadoop-pc\dfs\data: namenode clusterID = CID-6679ba00-f42f-4835-8440-3f7fe4748513; datanode clusterID = CID-13c1262e-453e-41e2-ac80-0542976931df
Thank you so much for watching my video, and sorry for the delay in replying. If you are still facing this issue: the error you are encountering indicates a mismatch between the clusterID of the NameNode and the DataNode. Steps to fix:
Clear the DataNode Data Directory
If you're okay with clearing the DataNode's data and starting fresh, you can delete the DataNode storage directory and restart the DataNode. This will cause the DataNode to re-register with the NameNode using the current ClusterID.
Then reformat the namenode and restart the daemons (the exact commands are sketched below); hope this helps. Otherwise, let me know, dear.
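Using the directory from your own log, the reset would look roughly like this (again, this wipes the DataNode's data, so only for a fresh or test setup):
stop-all.cmd
rmdir /S /Q "C:\tmp\hadoop-pc\dfs\data"
hdfs namenode -format
start-all.cmd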
I can't change the destination folder. After the status screen it doesn't show the destination folder.
Thank you so much for watching my video. I think you are talking about changing the destination folder when installing Java? If yes, the system should allow you to change it. If it still doesn't, no worries: you can choose the default destination and then use that same destination path when setting up the environment variables, dear. Just continue watching the video. Let me know if this helped you, and encourage me by subscribing to my channel ❤️
ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed.
How do I fix this?
The "java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed" error typically occurs when Hadoop is trying to use secure mode (Kerberos authentication) without proper configuration or permissions. Here are some ways to address this issue:
=================================================
Steps to Fix the UnsupportedOperationException in Hadoop:
=================================================
1. Disable Hadoop Security (if not needed)
- If you’re not using secure mode (Kerberos authentication) for Hadoop, you can disable security in your configuration files. In your `core-site.xml` file, set the following property:
<property>
  <name>hadoop.security.authentication</name>
  <value>simple</value>
</property>
- This setting ensures that Hadoop operates in non-secure mode, which should prevent the `getSubject` error.
2. Check Hadoop Configuration Files
- Ensure that `core-site.xml` and `hdfs-site.xml` do not contain configurations that imply secure mode (Kerberos).
- Verify the following configurations in `core-site.xml`:
<property>
  <name>hadoop.security.authorization</name>
  <value>false</value>
</property>
- Make sure no settings related to `hadoop.security.authentication` are set to `kerberos` unless Kerberos is actually configured and available.
3. Run Hadoop Without a Security Manager
- If you are testing locally or have no specific requirement for security, you can run Hadoop without a security manager.
- Set the following in `hadoop-env.cmd` or `hadoop-env.sh`:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.security.manager="
- This will start Hadoop without a security manager, avoiding the `getSubject` call.
4. Ensure Proper Permissions (If Using Kerberos)
- If you require Kerberos, confirm that all security-related configurations are correctly set up, including:
- `hadoop.security.authentication` set to `kerberos`.
- Kerberos principal and keytab files are correctly specified.
- The necessary permissions and access control lists (ACLs) are configured in `hdfs-site.xml`.
5. Restart Hadoop
- After making the changes above, restart the Hadoop services to apply the new configuration (these are the Linux scripts; on Windows, use the .cmd equivalents):
stop-dfs.sh
stop-yarn.sh
start-dfs.sh
start-yarn.sh
Let me know if this resolves the issue or if there’s more detail from the error logs!