Thank you very much, brother. Underrated channel! Seriously! I wish I had known about this channel 2-3 months earlier. But it's never too late!
Great content with simple explanation
Brother, can you make a video on frequently used Linux shell commands? Thank you.
Bro, can you explain what the data science use cases are?
Awesome video 👍👍
great lesson sir
Dear sir,
I am importing data from MSSQL to HDFS to Hive using a Sqoop job. Once the data is in Hive, I use Spark SQL to create a DataFrame according to the business requirement, then I convert the Spark DataFrame to a pandas DataFrame and write the pandas data to CSV. After getting the data into CSV, I import the CSV into Power BI in DirectQuery mode to make reports.
I selected DirectQuery mode while importing the CSV into Power BI because I need up-to-date data in the reports.
That is my data flow.
Question
1. I manually run the Sqoop job / Spark SQL code every day at 11 PM. Is there any solution that runs both jobs automatically at 11 PM?
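One way to automate this is to schedule both steps with cron (or an orchestrator such as Oozie or Apache Airflow). Below is a minimal Python sketch, assuming the Sqoop job has been saved as a named job and the Spark SQL code lives in a script; the job name and script path are placeholders, not your actual ones.

# nightly_pipeline.py - run the saved Sqoop job, then the Spark script
# that writes the CSV for Power BI. A crontab entry such as
#   0 23 * * * python /path/to/nightly_pipeline.py
# triggers it at 11 PM every day.
import subprocess

def run(cmd):
    # check=True stops the pipeline immediately if a step fails,
    # so the Spark step never runs on stale or missing data.
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Step 1: Sqoop import from MSSQL into HDFS/Hive (job name is a placeholder)
    run(["sqoop", "job", "--exec", "mssql_to_hive_job"])
    # Step 2: Spark SQL -> pandas -> CSV (script path is a placeholder)
    run(["spark-submit", "/path/to/build_report_csv.py"])

The same two commands can equally be chained in a shell script; cron only needs a single entry either way.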
Will you please upload PySpark videos with code? It would be very helpful.
Please post about the basic Apache server and its configs.
Hi bro, is it possible to get trained by you, at least for the short term, and long term if possible? I did a Diploma in Data Engineering and need your assistance.
Is it possible to migrate data from ebd to Teradata using Sqoop? If yes, is there any online material to refer to?
And also about the role of the infra guy.
Oh God, where have you been all these days? If you had done this sooner, there would have been no need to struggle for 6 months to learn Hadoop!!! 🙏🙏🙏
Thank you bro, going to upload complete big data tech stack videos soon, stay in touch with my channel 🙂🙏
What does localhost actually mean in commands? What does localhost signify?
localhost means the host that is running locally, i.e. the same system on which the program is run.
If you give a normal IP, the connection goes out over the network to reach that machine, but localhost:8080 means it will connect to port 8080 on the same machine.
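A tiny Python illustration of the same point, assuming some service is already listening on port 8080 of the local machine:

import socket

# "localhost" resolves to the loopback address, so nothing below ever
# leaves the machine this script runs on.
print(socket.gethostbyname("localhost"))   # usually prints 127.0.0.1

try:
    # Same idea as opening localhost:8080 in a browser: connect to whatever
    # service is listening on port 8080 of this same machine.
    conn = socket.create_connection(("localhost", 8080), timeout=2)
    conn.close()
    print("connected to port 8080 on this machine")
except OSError:
    print("nothing is listening on port 8080 here")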
"Sqoop job is a map only job". This is True of False ? Can anyone confirm and explain for me ? Pls, my english is not good so i can't hearing right