Feel free to discuss your approach towards the solution.
Hello, this solution does not handle the partner ecosystem, nor does it provide any API management. I think this solution needs Apigee X in order to centrally manage APIs and support the partner ecosystem.
That's a really good observation. Thank you for the input.
Google Cloud IoT Core has been discontinued. Is there an alternative?
Any MQTT platform can be leveraged, as can Cloud Pub/Sub directly.
@thecloudpilot So the data goes straight to Pub/Sub from the vehicles?
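With IoT Core retired, that is one option: the vehicles (or an MQTT bridge in front of them) publish their telemetry to a Pub/Sub topic. A minimal sketch of a vehicle-side publisher; the project, topic, and payload fields are placeholders, not part of the case study:

```python
# Rough sketch: a vehicle-side client publishing telemetry straight to a
# Pub/Sub topic. Project, topic, and payload fields are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "vehicle-telemetry")

def publish_telemetry(vehicle_id, fuel_level):
    payload = json.dumps({
        "vehicle_id": vehicle_id,
        "fuel_level": fuel_level,
    }).encode("utf-8")
    # publish() returns a future; result() blocks until the message ID comes back.
    future = publisher.publish(topic_path, payload, origin="vehicle")
    return future.result()

publish_telemetry("truck-123", 0.72)
```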
Very informative. Thanks Udesh :)
I'm glad you found it helpful. Thank you for the feedback.
Good job, Udesh, in helping PCA aspirants with all four case studies!!!
I had a question about sharing the results from Data Studio to GKE. Could you please elaborate? Thanks.
Thanks for the question, Vijay. I kept it as an optional addition. All the results from Data Studio could be surfaced in an application for end users to consume. The case study's main requirements don't call for a user-facing app, so I kept it optional; GKE could host that application. This is purely my addition for a full-fledged solution.
Why do we need to use Pub/Sub to upload the daily data? Why can't we directly upload the data to Bigtable or Cloud Storage?
I proposed Pub/Sub because it is the best fit for streaming data with minimal latency, so the data can be accessed in near real time.
@thecloudpilot I believe the daily upload of data from GCS to BQ or BT should be via Dataflow. That makes more sense in my view.
@HardikShah10 That is correct. Data reaches BT or BQ through Dataflow. For the streaming data, I used Pub/Sub from IoT Core, which is then connected to Dataflow for a smooth flow of data.
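A minimal sketch of that Pub/Sub → Dataflow → BigQuery leg, using an Apache Beam streaming pipeline; the topic, table, and schema fields below are placeholders, not taken from the case study:

```python
# Sketch of the streaming leg: read telemetry from Pub/Sub, parse it,
# and stream it into BigQuery. All resource names are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw telemetry messages published by the vehicles.
            | "ReadTelemetry" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/vehicle-telemetry")
            # Decode and parse each JSON payload into a dict matching the BQ schema.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Stream the rows into BigQuery for analysis / BQML.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:telemetry.vehicle_events",
                schema="vehicle_id:STRING,timestamp:TIMESTAMP,fuel_level:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```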
Dear @thecloudpilot,
Thanks for sharing your solution. I have two points to share.
#1 The ML model runs on the data in BQ, and the outcomes are normally stored in BQ itself, either for sharing with users or for training the model later. Why is it sending them to Pub/Sub, and what objective are we achieving there?
#2 Updates sent to a device for adjustments are only identified once you analyze the data, streaming and batch (via BQML or some other custom model). Only once the analysis is complete and the adjustments are identified are they sent to the vehicles.
Please advise why Pub/Sub is sending data back to the vehicles without analysis. I think that is the flaw we need to address.
The rest is all good... :)
Regards,
Hardik
Thanks for the question, Hardik. Pub/Sub is not sending the device config back directly; it goes through Cloud Functions and Cloud Dataflow. I used the following reference for the architecture: i.stack.imgur.com/csFMe.png
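As a rough sketch of that path (project, region, and registry names are placeholders, and it uses the now-retired IoT Core API discussed earlier in this thread), a Pub/Sub-triggered Cloud Function could push the adjustment computed upstream back to the device:

```python
# Hedged sketch: a Pub/Sub-triggered Cloud Function that writes a config
# update back to a Cloud IoT Core device. Resource names are placeholders.
import base64
import json
from google.cloud import iot_v1

def push_device_config(event, context):
    """Background Cloud Function; 'event' carries the Pub/Sub message."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    client = iot_v1.DeviceManagerClient()
    device_path = client.device_path(
        "my-project", "us-central1", "vehicle-registry", payload["device_id"]
    )

    # Send the adjustment computed upstream (e.g. by Dataflow / BQML)
    # as the device's cloud-to-device configuration.
    client.modify_cloud_to_device_config(
        request={
            "name": device_path,
            "binary_data": json.dumps(payload["config"]).encode("utf-8"),
        }
    )
```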
@thecloudpilot
Thanks again for taking the time to respond to the query.
However, I feel that, given the context of the case study, the reference architecture may not be applicable as is. I may be wrong here, but I'm trying to put some thought into it.
Thanks,
Hardik
@HardikShah10 The reference architecture diagram is a general case for IoT, so in my view it wouldn't fall short of what we want. Thanks again for sharing your thoughts. I like how healthy this discussion stayed.
How can Dataflow send information and update it on the IoT devices? Isn't it only a data streaming service?
Thanks for the question.
I hope this helps: stackoverflow.com/questions/62325238/how-do-i-update-iot-device-config-in-cloud-iot-core-using-dataflow
To modify the device configuration, you will need a Cloud IoT Edge device.
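Inside a Dataflow pipeline, that config call can be made from a DoFn once the analysis step has produced an adjustment. A minimal, hedged sketch along the lines of the Stack Overflow answer above; the element shape and all resource names are placeholders:

```python
# Hedged sketch: a Beam DoFn that pushes computed adjustments back to a
# Cloud IoT Core device from inside a Dataflow pipeline. Names are placeholders.
import json
import apache_beam as beam
from google.cloud import iot_v1

class UpdateDeviceConfig(beam.DoFn):
    def setup(self):
        # Create one API client per worker, not per element.
        self.client = iot_v1.DeviceManagerClient()

    def process(self, adjustment):
        # 'adjustment' is assumed to look like:
        # {"device_id": "truck-123", "config": {"engine_mode": "eco"}}
        device_path = self.client.device_path(
            "my-project", "us-central1", "vehicle-registry", adjustment["device_id"]
        )
        self.client.modify_cloud_to_device_config(
            request={
                "name": device_path,
                "binary_data": json.dumps(adjustment["config"]).encode("utf-8"),
            }
        )
        yield adjustment
```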
Hi! Thank you so much for the solution.
Could you also discuss the security part?
Since the requirements are not focused on security solutions, it's not necessary to use dedicated security services. But at a basic level, we need to make sure the data and the network are secure. So we clearly need to follow the security best practices of each service used in the architecture to make the solution strong from a security standpoint.
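One small, concrete example of those best practices is giving each component its own least-privilege service account instead of a broad default identity. A hedged sketch; the key file path, account, and topic are placeholders:

```python
# Hedged sketch: run the ingestion component with a narrowly scoped
# service account (e.g. only roles/pubsub.publisher on the telemetry topic).
# The key file path and resource names are placeholders.
from google.oauth2 import service_account
from google.cloud import pubsub_v1

credentials = service_account.Credentials.from_service_account_file(
    "/secrets/vehicle-ingest-sa.json"
)

# This client can only do what the service account's IAM roles allow.
publisher = pubsub_v1.PublisherClient(credentials=credentials)
topic_path = publisher.topic_path("my-project", "vehicle-telemetry")
publisher.publish(topic_path, b'{"vehicle_id": "truck-123"}').result()
```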