Thanks Vishal for the detailed pipeline design and development video. Great job.
Thank You Vishal for doing this. It will be definitely a great help! Kudos to you!
Very simple and well explained, thanks!
Thanks Vishal, this was very helpful
Thank you for the help
I'm facing a problem. In my Cloud Data Fusion pipeline, some of the values in the phone_number and ssn fields are missing, and the date_of_birth and password columns are completely empty. Could you please help me troubleshoot it?
I'm not getting the mask data option in Wrangler.
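Before blaming Fusion or Wrangler, it helps to check whether the values are actually present in the source CSV. A minimal sketch (the column names below are taken from the comment above; adapt them to your file) that counts empty values per column:

```python
import csv
import io

def empty_counts(csv_text):
    """Count empty values per column in a CSV string."""
    reader = csv.DictReader(io.StringIO(csv_text))
    counts = {name: 0 for name in reader.fieldnames}
    for row in reader:
        for name, value in row.items():
            if value is None or value.strip() == "":
                counts[name] += 1
    return counts

# Tiny inline sample standing in for the real export.
sample = """name,phone_number,ssn,date_of_birth,password
alice,555-0100,123-45-6789,,
bob,,,,"""
print(empty_counts(sample))
# {'name': 0, 'phone_number': 1, 'ssn': 1, 'date_of_birth': 2, 'password': 2}
```

If the counts are already high in the raw file, the data was never there; if the file looks complete but the pipeline output doesn't, look at the Wrangler parsing step (delimiter, quoting, header row) instead.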
Awesome video! Can you create a complete Composer/Airflow video for this one?
Separate playlist for Composer:
Cloud Composer - Airflow on GCP: th-cam.com/play/PLLrA_pU9-Gz22Zml5mxcszG4A9ecqWtd4.html
I am getting more environment errors while connecting Data Fusion, and the Python code has an error.
Sir, I have a lot of CSV data in my PostgreSQL DB. I want to transfer that data to BigQuery with real-time streaming/processing. Which service should I use? Can you please give me some context? I'm new to data engineering and my company gave me this task.
th-cam.com/video/L4Ad7RQYv4o/w-d-xo.html
You can use Datastream.
th-cam.com/video/L4Ad7RQYv4o/w-d-xo.html
@@techtrapture But I'm having a problem with replication in Postgres. How do I set up the replication?
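For the replication question above: Datastream's PostgreSQL source relies on logical decoding, so the database needs some one-time setup. A hedged sketch of the statements involved (the slot and publication names are placeholders I chose, not required values; check the official Datastream PostgreSQL docs for your exact configuration):

```python
# One-time PostgreSQL setup that Datastream-style CDC replication depends on.
# 'datastream_slot' and 'datastream_pub' are placeholder names.
setup_statements = [
    # Logical decoding must be enabled (requires a database restart).
    "ALTER SYSTEM SET wal_level = logical;",
    # A replication slot that the stream reads changes from.
    "SELECT pg_create_logical_replication_slot('datastream_slot', 'pgoutput');",
    # A publication listing the tables to replicate.
    "CREATE PUBLICATION datastream_pub FOR ALL TABLES;",
]

# Printed rather than executed - run them yourself via psql as a superuser.
for stmt in setup_statements:
    print(stmt)
```

Once the source is set up, you create the Datastream connection profiles and stream in the console, pointing the destination at BigQuery.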
Amazing video! Unfortunately, I have problems creating my Cloud Composer environment, maybe because I am on a free trial.
I get this error after creating the environment:
CREATE operation on this environment failed 49 minutes ago with the following error message:
Some of the GKE pods failed to become healthy. Please check the GKE logs for details, and retry the operation.
I'm having the same issue, any idea how to resolve it?
@@Abracadanz00 Nothing yet, but after a lot of searching I found a post from Google saying you have to activate your billing account in GCP before creating the Cloud Composer environment.
@@Abracadanz00 If you want a shorter free pipeline: at 14:57, cut out Cloud Composer, Cloud Storage, Cloud Data Fusion, and BigQuery, and replace them with a free short pipeline: Google Sheets (data) -> Looker Studio. If you extract API data, add the extension called "API Connector" in Google Sheets, configure it (search on YouTube), then connect to Looker Studio.
Thanks, very helpful.
I am not able to create a Composer environment.
The Cloud Composer environment is showing an error, and the image version is not showing while creating the environment manually. Is there any update?
Please reply on that.
Fusion is not parsing the salary and many other fields, although they are in the CSV.
How to use gcloud in vs code?
Error: gcloud : The term 'gcloud' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Install the Google Cloud SDK on your system. Use the link below:
cloud.google.com/sdk/docs/install#windows
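That PowerShell error usually means the SDK either isn't installed or its bin directory isn't on PATH. A small sketch you can run from the VS Code terminal to see which case you're in (the tool name is just a parameter; nothing here is gcloud-specific):

```python
import shutil

def diagnose(tool="gcloud"):
    """Report whether `tool` is resolvable on the current PATH."""
    path = shutil.which(tool)
    if path:
        return f"{tool} found at {path}"
    return f"{tool} not on PATH - install the Cloud SDK or add its bin directory to PATH"

print(diagnose())
```

If it reports "not on PATH" even after installing, restart the VS Code terminal so it picks up the updated PATH from the installer.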
I got these errors "Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.hdfs.web.HftpFileSystem not found. Can not load the default value of `spark.yarn.isHadoopProvided` from `org/apache/spark/deploy/yarn/config.properties` with error, java.lang.NullPointerException. Using `false` as a default value." Any clues on how to fix it?
Did you fix this?
I'm also getting the same error. Did you fix it?
I think it's a permissions issue. Try adding the following roles to the compute service account your Data Fusion instance uses:
Dataproc Service Agent
Dataproc Worker
Editor
Service Account User
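The roles listed above can be granted with gcloud. The sketch below only prints the commands rather than executing them; PROJECT_ID and SA_EMAIL are placeholders you must replace, and the role IDs are my best mapping of the role names above (verify them in the IAM console before granting, especially the broad Editor role):

```python
# Print gcloud commands to grant the roles suggested in the comment above.
# PROJECT_ID and SA_EMAIL are placeholders, not real values.
PROJECT_ID = "PROJECT_ID"
SA_EMAIL = "SA_EMAIL@PROJECT_ID.iam.gserviceaccount.com"

# Assumed role IDs for the role names listed above.
roles = [
    "roles/dataproc.serviceAgent",   # Dataproc Service Agent
    "roles/dataproc.worker",         # Dataproc Worker
    "roles/editor",                  # Editor (very broad - confirm you need it)
    "roles/iam.serviceAccountUser",  # Service Account User
]

for role in roles:
    print(
        f"gcloud projects add-iam-policy-binding {PROJECT_ID} "
        f"--member=serviceAccount:{SA_EMAIL} --role={role}"
    )
```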
Can you please create another video showing how to download Excel data from a SharePoint site, load that data into BigQuery, and make it a daily job? Also, is it possible to do this entire process through code using Terraform? Thanks.
You came up with project requirements, not a video idea 😀
😂
I have heard this kind of real-time requirement in many places and many forums, so I thought I'd share it in case you could help. At the same time, I am also trying it myself. Thanks for all the educational videos.
@@techtrapture Bro, please, it would be great if you could provide this 😄🙏
I would suggest using automation tools like Blue Prism for this.
Nice video! Can you create a pipeline using server-based / serverless Dataproc?
Do a project for ELT as well.
Sure, will do it soon.
Here you Go -
th-cam.com/video/rIUWbSXjKe4/w-d-xo.html
Great video as always! Can you make timestamps for this video?
In place of Airflow, I want to use Mage AI.
Composer shows "This environment has errors".
thank you!!
Kindly make this kind of ETL pipeline video with {GCS --> (Composer + Dataflow) --> BigQuery}.
It's already there
th-cam.com/video/UXJxcWgxwu0/w-d-xo.html
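For reference, the GCS -> (Composer orchestrating Dataflow) -> BigQuery flow asked about above can be sketched with plain stand-in functions. Nothing below makes real GCP calls; every function, bucket, and table name is a placeholder for one stage of the pipeline:

```python
# Stand-in sketch of: GCS -> (Composer triggers Dataflow) -> BigQuery.
# No real GCP API usage; each function is a placeholder for one stage.

def extract_from_gcs(bucket, blob):
    """Placeholder: a Composer sensor/task picks up a file landing in GCS."""
    return [{"name": "alice", "salary": "50000"}, {"name": "bob", "salary": "60000"}]

def transform_with_dataflow(rows):
    """Placeholder: a Dataflow job cleans and types the records."""
    return [{**r, "salary": int(r["salary"])} for r in rows]

def load_to_bigquery(rows, table):
    """Placeholder: a load job writes the transformed rows to a BigQuery table."""
    return {"table": table, "rows_loaded": len(rows)}

rows = extract_from_gcs("my-bucket", "employees.csv")
result = load_to_bigquery(transform_with_dataflow(rows), "dataset.employees")
print(result)  # {'table': 'dataset.employees', 'rows_loaded': 2}
```

In a real DAG, each of these stages would be an Airflow task in Composer, with Dataflow doing the transform and a BigQuery load job at the end.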
Please explain the total project in 3-5 sentences for interview purposes:
what is the flow of the project,
which GCP services were used for the project,
and how you developed all the different modules using the different GCP services.
@@VthePeople4156 Cant you see and tell? Does he have to spoon feed you now? your parents still wash your ass?
@@Rajdeep6452 yes
@@VthePeople4156 idiot xD
It says gcloud is not an executable, so your login steps don't work for everyone, and you did some setup beforehand without showing it in the video. Please, next time, show everything from scratch, for real, actually doing it rather than just saying it.
Apologies if I missed that. You need to install gcloud / the Cloud SDK first to execute the command.