Informative
Hi. It’s very informative. Please make a complete session on GCP with Terraform if possible.
Thank you sir for the wonderful video. All are very nice and helpful. My only concern is that the voice is very low; not sure if it is just for me or for everyone.
Anjan - I am trying to set up a cross-project sink. The sink destination is BigQuery in another project, but the logs are not showing up in the dataset. Am I missing something here? Please advise.
@@VishalPatankar-t6g There is a service account associated with every sink you create. Check whether that service account has enough permission to access BigQuery in the other project.
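For anyone hitting the same issue, a minimal sketch of that check with the gcloud CLI (the project IDs and sink name below are placeholders): first look up the sink's writer identity in the source project, then grant it BigQuery Data Editor in the destination project.

```shell
# Print the service account (writer identity) the sink writes as
gcloud logging sinks describe SINK_NAME \
    --project=SOURCE_PROJECT \
    --format='value(writerIdentity)'

# Grant that identity write access to BigQuery in the destination project
# (use the serviceAccount:... value printed by the command above as --member)
gcloud projects add-iam-policy-binding DEST_PROJECT \
    --member='serviceAccount:WRITER_IDENTITY_FROM_ABOVE' \
    --role='roles/bigquery.dataEditor'
```

Note that sink exports are not retroactive: only log entries written after the permission is granted will appear in the dataset.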
Bro, upload more topics on GCP. This channel is easy to understand, and I am in the learning phase.
Sure, stay tuned and keep watching. Lots more to come.
Hi,
I'm getting this error when trying to grant an IAM permission to the sink:
"Email addresses and domains must be associated with an active Google Account, Google Workspace account or Cloud Identity account."
What kind of account are you using? Can you please share that? It looks like it is not a Google Workspace or Gmail account.
@@anjangcpdataengineering5209 Can you please give me a fix? Can I use this email to make it a Workspace email? Thank you.
Damn Good🎉
How do I get admin access in IAM? It is showing "You do not have the required permission".
Can you check who the owner of your GCP project is?
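One quick way to check, assuming you have the gcloud CLI installed and can at least read the project's IAM policy (PROJECT_ID is a placeholder):

```shell
# List the members that hold roles/owner on the project
gcloud projects get-iam-policy PROJECT_ID \
    --flatten='bindings[].members' \
    --filter='bindings.role:roles/owner' \
    --format='value(bindings.members)'
```

If this command itself fails with a permission error, ask whoever created the project (or your organization admin) to grant you the roles you need.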
@@anjangcpdataengineering5209 actually I did not create a project yet in GCP how to create a project do you have any tutorials for creating a project can you send it
@@Akash-dg5jt Hello, I am getting this error in my Promtail node running in a Kubernetes cluster:
"level=error ts=2023-05-01T15:01:10.42495235Z caller=pull_target.go:98 msg="failed to receive pubsub messages" error="rpc error: code = PermissionDenied desc = User not authorized to perform this action.""
I created a Pub/Sub topic, created a sink, and gave the service account permission. I'm not sure what else to do.
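In case it helps, that PermissionDenied usually means the identity Promtail runs as cannot pull from the subscription. A hedged sketch of the grant (the subscription name, service account, and project ID below are placeholders; verify which service account your pods actually use, e.g. via Workload Identity or a mounted key file):

```shell
# Grant the Pub/Sub Subscriber role on the subscription Promtail pulls from
gcloud pubsub subscriptions add-iam-policy-binding MY_SUBSCRIPTION \
    --project=MY_PROJECT \
    --member='serviceAccount:promtail-sa@MY_PROJECT.iam.gserviceaccount.com' \
    --role='roles/pubsub.subscriber'
```

Also double-check that Promtail is configured with the subscription (not the topic), and that the credentials inside the cluster actually belong to the service account you granted the role to.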
@@anjangcpdataengineering5209 How do I check this?
How can I do this with Terraform, please?
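Not covered in the video, but as a rough sketch, a cross-project sink can be declared with the Terraform Google provider roughly like this (all names and project IDs below are placeholders):

```hcl
# Log sink in the source project, exporting to a BigQuery dataset in another project
resource "google_logging_project_sink" "bq_sink" {
  name        = "my-bq-sink"
  project     = "source-project-id"
  destination = "bigquery.googleapis.com/projects/dest-project-id/datasets/my_dataset"
  filter      = "severity >= ERROR"

  # Create a dedicated service account (writer identity) for this sink
  unique_writer_identity = true
}

# Grant the sink's writer identity permission to write into the destination project
resource "google_project_iam_member" "sink_writer" {
  project = "dest-project-id"
  role    = "roles/bigquery.dataEditor"
  member  = google_logging_project_sink.bq_sink.writer_identity
}
```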
All your videos are useful, but please remove the background music; it is really distracting and irritating.
Hi, th-cam.com/channels/DzNddqknlFyDM4wRHU8ApA.html -- it's a great channel. Can you create a video on the Data Lineage API? I need it for my project (thanks in advance). Also, what architecture would you prefer for microservices? I need to build one that reads data from APIs and loads it into a database, both batch and event-driven, so I need to know how to set it up. I am confused between Cloud Run / GKE / Dataflow due to cost and benefits. You can reply to me separately, or we can have a 5-minute talk.