So the bottom line is that a simple black-box method of capturing system dynamics is not the solution. For a digital twin, you need to apply first principles to model the system and THEN reinforce it with dynamic field data.
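A minimal sketch of what that could look like in Python (the mass-spring-damper model, the gain values, and the fake sensor stream are all my own illustrative assumptions, not anything from the talk): a first-principles model is integrated forward in time, then nudged toward incoming field measurements with an observer-style correction.

```python
import numpy as np

# Illustrative first-principles model: a mass-spring-damper written as
# x_dot = A x + B u, with state x = [position, velocity].
m, c, k = 1.0, 0.4, 3.0          # assumed physical parameters
A = np.array([[0.0, 1.0],
              [-k / m, -c / m]])
B = np.array([0.0, 1.0 / m])
C = np.array([1.0, 0.0])          # we only measure position
L = np.array([0.8, 1.5])          # hand-tuned correction (observer) gain

def twin_step(x, u, y_meas, dt=0.01):
    """Advance the twin state one step: physics-based prediction
    (Euler integration of the ODE), then a correction that pulls the
    state toward the field measurement -- the 'reinforce with dynamic
    data' part."""
    x_pred = x + dt * (A @ x + B * u)      # first-principles prediction
    innovation = y_meas - C @ x_pred       # mismatch with sensor data
    return x_pred + dt * L * innovation    # data-driven correction

# Toy usage: track a noisy position signal with the hybrid model.
x = np.zeros(2)
for t in range(1000):
    y_sensor = np.sin(0.01 * t) + 0.05 * np.random.randn()  # fake field data
    x = twin_step(x, u=0.0, y_meas=y_sensor)
```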
😊
The term DT is gaining traction in research these days, but I find it a repetition of ideas already used in the product/building life cycle. Similarly, Reduced Order Models have long existed in prototype-model work and in phenomenological elements in finite element analysis.
I have seen that researchers sometimes coin new terminology or buzzwords that help them publish their work as new tech.
thank you so much for this wonderful presentation
Very good talk, great!
amazing presentation!
Does anyone have a link to the digital twin blog by Stephen Ferguson from Siemens that she mentions at 5:00?
Amazing talk
Awesome video - thanks for sharing 👋
Thank you for your presentation!!!
How can the costs of digital twins be kept sustainable?
My take: by focusing on commercially viable features/solutions. A high level of fidelity is expensive, so the work to achieve it should be limited to problems that are likely to pay off when solved, rather than trying to build a high-fidelity digital twin of the whole system. This requires enough understanding of the problem area to identify which problems are actually worth solving with a DT before development starts, so that the developed features are geared towards solving a specific problem. Creating digital twins of reusable multi-system components, e.g. motors, also helps limit the work needed for every new system.
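A rough sketch of that "reusable component twin" idea (the class names, interfaces, and parameter values here are purely illustrative assumptions of mine): model the motor once, then compose the same component twin into different system twins instead of rebuilding it each time.

```python
from dataclasses import dataclass

@dataclass
class MotorTwin:
    """Reusable component twin for a DC motor (illustrative parameters)."""
    resistance: float   # ohms
    kt: float           # torque constant, N*m/A

    def torque(self, voltage: float, speed: float) -> float:
        # Steady-state DC motor approximation: back-EMF reduces current.
        current = (voltage - self.kt * speed) / self.resistance
        return self.kt * current

@dataclass
class ConveyorTwin:
    """System twin that reuses the motor component instead of remodeling it."""
    motor: MotorTwin
    roller_radius: float  # m

    def belt_force(self, voltage: float, belt_speed: float) -> float:
        shaft_speed = belt_speed / self.roller_radius   # rad/s
        return self.motor.torque(voltage, shaft_speed) / self.roller_radius

# The same MotorTwin class can be dropped into a pump twin, a fan twin,
# etc., limiting the per-system modeling effort.
conveyor = ConveyorTwin(motor=MotorTwin(resistance=1.2, kt=0.05),
                        roller_radius=0.1)
print(conveyor.belt_force(voltage=24.0, belt_speed=0.5))
```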
Awesome talk
Can someone explain, for prediction tasks, why we should do all of this modeling instead of building a deep learning model on the historical data of the physical asset and retraining it every x amount of time to stay up to date? I can understand the interpretability advantage of the physics-driven approach, but is there any other advantage?
Yes, spatial interpretation.
Because prediction would require constantly training the models? That is a dumb, brute-force way to do things; it would quickly get out of hand, and you would need a lot of computing power.
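Not from the talk, just a toy contrast in Python to make the trade-off concrete (the cooling model, the sample size, and the polynomial standing in for a deep network are all assumptions): a physics-parameterized model needs only a few parameters calibrated from sparse data and extrapolates sensibly, while a purely data-driven fit tends to break outside the training window and has to be retrained as new data arrives.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fake "field data": temperature decay of an asset, with sensor noise.
t = np.linspace(0, 10, 40)                      # only 40 samples
y = 80.0 * np.exp(-0.3 * t) + 20.0 + np.random.normal(0, 0.5, t.size)

# Physics-driven twin: Newton's law of cooling, just 3 parameters to calibrate.
def cooling_model(t, T0, k, T_amb):
    return T0 * np.exp(-k * t) + T_amb

params, _ = curve_fit(cooling_model, t, y, p0=[70.0, 0.1, 25.0])

# Black-box stand-in: a high-degree polynomial "learned" from the same data.
poly = np.polyfit(t, y, deg=9)

# Extrapolate beyond the training window (t = 15): the calibrated physics
# model settles near the ambient temperature, while the black-box fit
# typically diverges and would need retraining on new data.
print("physics model @ t=15:", cooling_model(15.0, *params))
print("black-box fit @ t=15:", np.polyval(poly, 15.0))
```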
This is just an observer?
There was no machine learning in this talk.