Access English, Hindi courses here- www.unfolddatascience.com/s/store
Don't forget to register on the website guys ☺
bro @Unfold Data Science asap very urgent
plz help during installation of this step ie sudo nano .bashrc it says below
sudo nano .bashrc
[sudo] password for hdoop1:
hdoop1 is not in the sudoers file.
I literally installed it 3 times in my college lab system but the instructions were so unclear, I couldn't install it properly at all. I watched this video and installed it on my Virtual Box Ubuntu and I FINALLY FINALLY FINALLY GOT IT 😭😭😭😭😭🎉🎉🎉 Thank you so so so much for this video!!!
Worked like charm
THANK YOU SOO MUCH
Your video stopped me from smashing my laptop ;)
guys, there is a slight change while editing .bashrc file @9:22 of the video where the last line is
export HADOOP_OPTS"-Djava.library.path=$HADOOP_HOME/lib/nativ"
please change it to
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
later when you type
source .bashrc
it won't throw any errors
thank you.
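For anyone typing this section of .bashrc from scratch: the Hadoop block usually looks like the sketch below. The install path /home/hdoop/hadoop-3.2.1 is an assumption, adjust it to your own user and version.

```shell
# Hadoop environment variables appended to ~/.bashrc
# (paths are assumptions -- match them to your own install)
export HADOOP_HOME=/home/hdoop/hadoop-3.2.1
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
# note the "=" after HADOOP_OPTS and the full word "native"
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
```

After saving, run `source ~/.bashrc` in the same shell so the variables take effect.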
Thats what heroes do.
@@juhisharma6873 same error, can you fix it
Hope u are able to fix it, if not please read other comments or just google it, It may be a very minor issue
Thanks
Thanks
Brother, I was struggling with the installation for the last month till coming across your video. It was a god send. I am glad that I could complete following your steps. The best part is how to tackle the errors during installation. Looking forward to more such stuff from you
did you install successfully ?
Thank You soooo much for such video
It took 6 hrs to finally do everything successfully after resolving many hurdles.
If anyone faces any issue then maybe I can help. Please reply with your problem in a comment
Awesome. Love your comment. Let's keep helping each other.
where to get that localhost gui
in the end when we are putting the local file to hadoop i am getting a error please can you help me resolve it.
This is the error:
hdfs dfs -put home/abdeali/demo.txt /
2022-10-02 12:00:36,644 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `home/abdeali/demo.txt': No such file or directory
Thank you in advance
I have the same problem that @Abdeali Hazari has. Any help?
@@dhilipkarthik1807 after starting your Hadoop nodes
- go to port 9870 in browser to see it running locally
highly recommend :- with the help of this video i installed Hadoop successfully, but make sure u follow all the steps properly.
Thanku.
Cheers Harsh. Your comments mean a lot 😃
i spent 1/2month to install hadoop but every time i failed, but allah showed me your channel and after i watched your video now i installed it thanks.
Glad to hear that Anwar. Happy Learning!
Thanks a lot, this finally helped after so many searching here and there found one perfect video for Hadoop Installation
Thanks for watching Abhijit.
Thanks a lot, definitely the video to watch if you want to install hadoop ! keep posting this kind of quality videos !
Thanks a lot.
Thank you for making this video, it really helped me! I'm in a Big Data training right now, and I've been trying to install Hadoop properly for weeks at this point. The only error I got this time was "port 22: Connection Refused" when using the ssh localhost command. I followed a suggestion from Stack Overflow and did "sudo service ssh restart". Then ssh localhost worked fine for me, and I was able to continue with the rest of your instructions. Thank you!!
Welcome Morgan.
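For others hitting the same "port 22: Connection refused": the usual cause is that the SSH server isn't installed or isn't running yet. A sketch of the check (package name assumes Ubuntu/Debian):

```shell
# make sure an SSH server exists and is running before "ssh localhost"
sudo apt install openssh-server   # install if missing (Ubuntu/Debian)
sudo service ssh restart          # (re)start the daemon
ssh localhost                     # should now connect instead of refusing
```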
thanks a lot, that works
@20:00 when you try to enter the folder it should be 3.2.3 not 3.2.1 if you are having problems. Thank you again!
U're a life saver tysm
thank you Sir for this tutorial. It's very helpful
Made it look so simple. Thanks a lot. Finally installed after so many tries. Good Work...!!!
Glad it helped Kashish.
thanks, it installed properly and working without any errors.
🍻
Thank you very very much......this was really helpful. Hoping to see more from you.
Thanks Aniket.
While running ~/hadoop-2.3.1/sbin/
It is showing an error that no such file/directory and before that while modifying the yarn files, it was asking for rewrite y/n? I gave yes to it. Please help with this
Thank you so much, finally installed hadoop, god bless you!
You're welcome!
Great Video Bro!!!!!!!!!!!!!!!! Excellent explanation and the file provided saves a lot of time. Thanks a lot bro!!!!!!!!!!!!!
Welcome Satyam.
Loved it bro, was struggling to install from other sites, nicely explained. Keep it up....:-)
Glad it helped. Thanks for watching.
BRO I JUST WANTED TO SAY, THANK YOU SO MUCH !!!
best video for downloading hadoop thank you sir g
Thanks a lot . This process is really working.
Welcome Riya.
Thanks for u r efforts.. i am able to install hadoop :)
Thanks very much a million times, your tutorial was very helpful. Thanks again and stay bless..
You're welcome!
When I paste the wget link to download hadoop, it's showing: HTTP request sent, awaiting response... 404 Not Found. Please help me with this soon
Yes, same issue, please help if anyone has tackled this error
Google the link, it may have changed
Google it, may be link has changed
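One more tip for the 404: mirror links drop old releases, but Apache keeps every version on its archive server. A sketch (the version number below is only an example; browse archive.apache.org/dist/hadoop/common/ to see what actually exists):

```shell
# download an older Hadoop release from the Apache archive
# (3.2.4 is an example version -- pick one listed on the archive page)
wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.4/hadoop-3.2.4.tar.gz
tar xzf hadoop-3.2.4.tar.gz
```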
COME ON MAN! Thank you so much. This video really helpful for me as a beginner
Thanks Taufiq.
@@juhisharma6873 I have same issue.. can u pls help?
If I’d like to install one name node and two data nodes, should I install 3 Ubuntu systems on virtual box and link then?
Could you do a video on creating 1 master and 2 slave nodes?
Sir , while starting hdfs format , it is showing error.
same here.
when i try with "sudo nano .bashrc", i received the error "hdoopuser is not in the sudoers file. This incident will be reported". How can i solve this ?
did u solve this error?
@@shreyachauhan8890 is this error solved?
Log in as the root and add hdoop to sudo. There are some good guides on how to do that on google
THanks Guys for helping each other.
@@JM-hj3sm how i can do that can explin more
Is this Hadoop installation done in pseudo distributed mode?
Brother while downloading hadoop its showing that link as 404 error what to do
First of all, let me thank you for explaining so beautifully. It was really helpful. Now coming to my concern - I am facing issue in running Hadoop. I followed all the steps in this video, but when am running the command "ssh localhost" or trying to start services, I am getting connection refused message. Would really appreciate your help here.
This is a very common issue, below link will help:
stackoverflow.com/questions/17335728/connect-to-host-localhost-port-22-connection-refused/29868328
@@UnfoldDataScience
Im getting error in 2nd file sudo nano $HADOOP_HOME/etc/Hadoop/Hadoop -env.sh command
@@deekshanayak895 did the error ressolved for u?
@@deekshanayak895 same for me
thank u so much i learned alot and it worked for me
Glad it helped
hello sir,
I want to create a cluster with one master and two slaves,
so I will have to follow these steps for all VMs, right??
Is there any problem using the current superuser to install hadoop, or do we need to adduser for the purpose of installing hadoop? I studied hadoop years ago so I have forgotten everything in hadoop
@UnfoldDataScience i was facing some problems regarding ssh localhost and i tried the commands from stack overflow too but it is not working .. even after the service ssh restart command it is asking for a password, and after entering the password it is showing "hdoop is not in the sudoers file. This incident will be reported"
Goods Tutorial😍
Glad you like it pls share with friends also.
will it run spark-shell --master yarn commands?
It's giving an error when i am adding a txt file for the first time, what is the problem? please help me out. error: put: File /data.txt._COPYING_ could only be written to 0 of the 1 minReplication nodes. There are 0 datanode(s) running and 0 node(s) are excluded in this operation.
Sir i don't know why but put command didn't worked it said no such file or directory
Helpful Video :) Thanks
Directory '/home/hdoop/hadoop-3.3.1/etc/hadoop' does not exist
is what I see when i type sudo nano $HADOOP_HOME/etc/hadoop/hadoop-env.sh 😢
discussed before in comments
Where have u discussed can't find the sol
Thanks a lot for this tutorial!
Thanks Manoj for watching.
Really good one. thank you
Welcome.
Hi sir. I followed the same as mentioned in video. But finally when I type jps , datanode is not showing. I tried restarting 2 commands start-dfs.sh and start-yarn.sh but still not getting datanode in jps. please help me out. I tried many things present in internet but nothing worked.
i have the same problem too....
Does stop-all.sh and start-all.sh help?
Go through the comments on this video. I think people faced similar issue and resolved
I fixed the problem. It was in the first file and specifically this line: "Djava.library.path=$HADOOP_HOME/lib/nativ". It's supposed to be native and not nativ. When I fixed that then my program started working perfectly.
@@simery9066 Nope still not working.
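If the "native" fix alone doesn't bring the DataNode back, the other common cause is a clusterID mismatch left over from reformatting the NameNode. A sketch of the usual reset, assuming the dfsdata/tmpdata paths from the video's XML files; note this wipes everything already stored in HDFS:

```shell
# stop everything, clear old HDFS metadata, reformat, restart
# WARNING: this deletes all data currently stored in HDFS
stop-dfs.sh && stop-yarn.sh
rm -rf ~/dfsdata/datanode/* ~/dfsdata/namenode/* ~/tmpdata/*
hdfs namenode -format
start-dfs.sh && start-yarn.sh
jps   # DataNode should now appear in the list
```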
11:28 - The file that you asked us to edit doesn't exist. Please help.
did you find solution to this
hello sir, I tried following your installation but hadoop 3.2.1 is no longer available for download, and the .bashrc is completely different in hadoop 3.2.2. Will this be a problem, since I am missing some important alteration in that part?
Try with the version available, I dont think its a problem.
@@UnfoldDataScience Sir the .bashrc file is completely different
I dont know how to geta java runtime that supports apt. I downloaded a new java jdk yet it doesn't work
This is a java issue. U can follow different solution on various websites one may work
bro while i gonna put this sudo nano .bashrc command it showing like "this hadoop user is not in sudoers file this incident will be reported it" ,will u help me to come out from this
This is a very common issue, if you google you will easily find solution, you need to add user in sudo group.
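To spell that out: the change has to be made from an account that already has admin rights (your original Ubuntu user, not hdoop). A sketch, assuming the Hadoop user is named hdoop:

```shell
# run these from your original admin account, not from hdoop
su - <your_admin_user>        # or log out and back in as that user
sudo usermod -aG sudo hdoop   # add hdoop to the sudo group
su - hdoop                    # log in again so the group change takes effect
sudo -v                       # should now accept hdoop's password
```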
is it compulsory to install jdk 8, can't we install latest version of java like jdk 17
Not compulsory
I had a problem with the HADOOP_OPTS, was that supposed to be an "=" and not a dash??
Yes
hdoop is not in the sudoers file. This incident will be reported.
I am getting error when i type sudo nano .bashrc
Very common error, see here:
unix.stackexchange.com/questions/179954/username-is-not-in-the-sudoers-file-this-incident-will-be-reported
in the end when we are putting the local file to Hadoop I am getting a error please can you help me resolve it.
This is the error:
hdfs dfs -put home/abdeali/demo.txt /
2022-10-02 12:00:36,644 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `home/abdeali/demo.txt': No such file or directory
Thank you in advance
This has been discussds previously on comments. Check once
@@UnfoldDataScience i cannot find the solution to this please help
@@UnfoldDataScience No, it is not in any comment. I've reviewed it many times but nothing
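A likely cause here: the path has no leading `/`, so `hdfs dfs -put home/abdeali/demo.txt` is resolved relative to the current directory, and as the hdoop user you may also lack read access to another user's home folder. A sketch (usernames and paths are taken from the comment above, adjust to yours):

```shell
# copy the file somewhere the hdoop user can read, then use an absolute path
cp /home/abdeali/demo.txt /home/hdoop/demo.txt   # run as a user who can read it
hdfs dfs -put /home/hdoop/demo.txt /             # note the leading slash
hdfs dfs -ls /                                   # verify it landed in HDFS root
```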
thank you sir best video
Thanks Abhishek
Hello,
When i’m trying to unzip Hadoop file using tar xzf command, it is throwing “no space left on device” error. Can you please help me on how to resolve this.
Please help, when i run the command sudo nano .bashrc
this message shows, what can i do?
(hadoop is not in the sudoers file. This incident will be reported.)
Check previous comments, people have answered it in comment.
you can check this link as well:
www.tecmint.com/fix-user-is-not-in-the-sudoers-file-the-incident-will-be-reported-ubuntu/
Thank you so much for this video it really worked! I have only one question How to install hadoop in multiple nodes in ubuntu?
Thanks, in multinode, we need to do some setting change in different files, process wise 80% its same
Hey sir. Lovely video. It worked flawlessly. Thanks.
I have few questions.
1) where exactly is this data getting stored ?? India itself?? Where??
2) How can we store data for free?? Why isn't Hadoop paid???
1. If you are doing local installation, in your local machine, if u do in a cluster, wherever cluster machines are located physically.
2. You will need to pay for hardware - not for using hadoop
Hello sir, in the second file (HADOOP_HOME) I am getting an empty file, saying that directory '/home/hadoop/hadoop-3.2.1/etc/hadoop' does not exist
Plz help me
same for me any solution ?
same i tried using 3.3.3 .. still i am facing issue.. any solution ?
@@jyothiprasadkr6189 export HADOOP_HOME=/home/hdoop/hadoop-3.2.3 (1st file, 1st line); replace hdoop with your own user id (here in the video it is hdoop) and 3.2.3 with your version
sir i am not able to start the namenode on localhost; it is starting on my virtual machine and at the end it is showing permission denied
Sir iam having error in downloading the Hadoop ...it shows 404 not found
Go through google directly
Hi Sir,
1st of all, thank you for the proper-made video, it is really nice. However, I have a question:
If I download the hadoop package via web browser and extract it in Download file, do I need to modify any directory? Or should I just follow your instruction as suppose?
You can install from downloads folder as well, just give the correct path
Very Nice Explanation. Please mention in which Linux and Version you are working Thank You
Superb !!!!
It is showing error when I execute the $hdfs namenode -format
hdfs not found error
Hadoop is not in the sudoers file. This incident will be reported. it is coming like that to me
This is a simple error, you try to fix by googling. Then I can help u.
i solved my error thankyou
See you did it, cheers 🍻
Tell me how u solved it
great job bro
Thanks alot
somehow secondaryname node is not popping up when i run jps. However i was able to execute put statement and placed the file in hadoop root directory. Is it an issue if secondarynamenode is not running and still others are working ??
stop all and start all once.
@@UnfoldDataScience
Somehow started working when I login in as root. Thanks
@21:59 when i type the jps command and hit enter it shows only Jps, NameNode, SecondaryNameNode; the remaining NodeManager and ResourceManager are not showing. what is the reason ?
Did u run start-all.sh, try stopping and restarting again.
Hi sir, Thanks for the video.
I have one question please clarify it.
In the core-site.xml and in the other .XML files you gave path like
home/hadoop/tmpdata
home/hadoop/ dfsdata/namenode
home/hadoop/ dfsdata/datanode
And all...
Now my question is will system automatically create this directories at the back end ?
Or we have to manually create and give the mentioned path that are given in the document?
Thanks
System will pick from these places, i mean its already there if i am getting question right.
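To be safe, the directories referenced in the XML files can be created up front, since Hadoop will not always create them for you. A sketch, assuming the home-directory layout used in the video:

```shell
# create the directories referenced in core-site.xml and hdfs-site.xml
mkdir -p ~/tmpdata ~/dfsdata/namenode ~/dfsdata/datanode
ls -R ~/dfsdata   # quick check that both subfolders exist
```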
unable to put a file from /home/Desktop to hdfs. it says put: `/home/qsadmin/test.txt': No such file or directory. can you help? couldn't find any right explanation on google
File is not existing in source
after command jps secondarynamenode is not showing.. what should i do now...
Restart hadoop
It shows hdoop not in sudoers file. When I run sudo nano .bashrc command what to do?
please read previous comments, its an easy issue to fix.
Thanks a lot...!!🙏❣️
Welcome.
Is creating a new user for hadoop is necessary ?
I followed your steps, but at the first step, sudo apt update, it's showing an error: "E: sub error occurred". Please help sir
Good one.
Thanks a ton.
Thanks for the video, question: When I run ssh localhost it gives me this: ssh: connect to host localhost port 22: Connection refused. Any idea why and how to solve it?
Please read previous comments, it's a common issue
getting the following error:
WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
ERROR: Invalid HADOOP_YARN_HOME
I have seen some posts on stackoverflow and modified HADOOP_PREFIX to HADOOP_HOME and am still unable to resolve the error. can you please help to resolve this.
See if below link helps:
dbversity.com/warning-hadoop_home-is-deprecated/
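If it helps anyone else: in Hadoop 3.x the HADOOP_PREFIX variable is deprecated, so one approach is to drop any HADOOP_PREFIX export entirely and point the YARN variables at HADOOP_HOME instead. A sketch for ~/.bashrc (the install path is an assumption):

```shell
# in ~/.bashrc (or hadoop-env.sh): drop HADOOP_PREFIX, keep HADOOP_HOME
unset HADOOP_PREFIX
export HADOOP_HOME=/home/hdoop/hadoop-3.2.1   # adjust to your install
export HADOOP_YARN_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
```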
After typing jps, only Jps, ResourceManager and NodeManager are coming, not NameNode and DataNode
Restart hadoop
Sir can i use the same instructions to install hadoop on ubuntu server?
Yes
@@UnfoldDataScience thank you sir....
sir when i wrote (Readlink -f /usr/bin/java) i got ( /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java)
is the java home written as
(export JAVA_HOME=/usr/bin/jvm/java-8-openjdk-amd64/jre) or (export JAVA_HOME=/usr/bin/jvm/java-8-openjdk-amd64)?
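One way to answer this generically: strip the trailing /bin/java from whatever readlink printed (note also that the directory is /usr/lib/jvm, not /usr/bin/jvm). For JDK 8, either the path ending in jre or the one without it generally works as JAVA_HOME. A sketch using the path from the comment above:

```shell
# derive JAVA_HOME by stripping /bin/java from the resolved binary path
# (example path taken from the comment; on your machine use: readlink -f /usr/bin/java)
java_bin=/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
java_home=${java_bin%/bin/java}       # strip the trailing /bin/java
echo "export JAVA_HOME=$java_home"
# prints: export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/jre
```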
nandri thala (thank you, boss)... thanks
Welcome.
Great !!!
Great video
Thanks Adithi
i have a problem it says that command 'hdfs' not found
Check if Hadoop is installed correctly by running start-all.sh inside sbin folder of hadoop
Hadoop installed properly
@@sathvikreddy8602 Follow this step to solve your problem it will work definitely hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html
@@UnfoldDataScience hadoopusr@pop-os:~/hadoop-3.3.1/sbin$ start-all.sh
start-all.sh: command not found
[ Directory '/home/hdoop/hadoop-3.2.1/etc/hadoop' does not exist ]
^ this is the error i am facing when I try to get into the env.sh of hadoop. also i have the 3.2.2 version of hadoop installed, I am in desperate need of help.
check manually if files and folders exist at right location, if not delete and re download
Hello sir
In your video at 10:36 you talk about that command which is sudo adduser Hadoop sudo before which we type like in your case it is (su - aman)
For me it is
su - dhruv
After that it asks for password
And it comes as
su : Authentication failure
What should I do now?
give correct password or create a new user and start fresh.
I know it's a late reply but for anyone with this issue. You need to enter your original root user's password not the hadoop user's password
Thank you very much
After ssh localhost
it was showing "host key verification failed", what do I need to do?
Hope u are able to fix it, if not please read other comments or just google it, It may be a very minor issue.
Sir you installed hadoop on Ubuntu in a VM... i have a pc with Ubuntu as the system, can i use the same installation of hadoop on Ubuntu ?
Yes same way u can do, no need of VM.
when i try to edit any file there's an error occuring...
"hdoop is not in the sudoers file. This incident will be reported."
Kindly, provide the solution.
issue discussed before
When I am running the command "hdfs namenode -format", I am getting command 'hdfs' not found.
how can it be solved?
I had run all the previous commands properly.
You may be in wrong directory
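A "command not found" for hdfs usually means $HADOOP_HOME/bin isn't on the PATH in the current shell. A quick sketch to check (the install path is an assumption):

```shell
# check whether hdfs is reachable from the current shell
which hdfs || echo "hdfs not on PATH"
# if missing, re-source .bashrc or add the bin/sbin dirs explicitly:
export HADOOP_HOME=/home/hdoop/hadoop-3.2.1   # adjust to your install
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
hdfs version
```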
Receiving this error: (in hdoop user)
Command - hdfs dfs -put /home/logan/emp.csv /
error - put: `/home/logan/emp.csv': No such file or directory
Even though the file location is correct I am receiving this error. Any solutions please?
i think the command should be the other way, check the syntax of source and destination.
Not able to download Hadoop it shows tar: error is not recoverable: exiting now pls help
Hi Satish, pls search in google once "download hadoop". Maybe the link has changed or something.
Thanks a lot!
Welcome.
Hello sir, can u plz tell where to get that localhost gui
Great work, but i am getting an issue after completing the hadoop installation... I could not send my local files to the hdoop user, where hdfs exists. I followed the same procedure... no idea why it is not recognising them
hey did you get any answer to your issue as i am getting the same error
hdfs dfs -put home/abdeali/demo.txt /
2022-10-02 12:00:36,644 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `home/abdeali/demo.txt': No such file or directory
can you please help me regarding this. Thank you
@@abdealihazari4484 I am getting the same error. help me please.
i am getting command hdfs not found error at the command hdfs namenode -format can u help me with that
installation did not happen or u r not in right directory
Not able to see datanode on jps command, what can be the issue?
restart services