Very good, I've understood the basics well. Thank you!
Great work! Thank you for these explanations 👌
welcome back master 😍
When AI becomes self-programming, we will no longer have the capability to understand it.
This is already the case in a certain sense. One part of the architecture of neural networks was given the name ‘hidden layers’ by the people who invented the whole thing, and these layers are called that because nobody can really say what exactly is going on inside them. In other fields, for example medical research, there is the term ‘black box’. It is used there to describe the problem that scientists can never establish a closed causal chain when administering substances to the human body. What can be measured is the dose and a series of statistical values, which are then labelled as the effect. The input and the output are known; most of what happens in between is either unknown or so complex that it no longer makes sense to try to describe it completely. All scientists can do is try to gradually shrink the black box through further research. What I'm trying to say is that dealing with things we don't fully understand is, to a certain extent, a familiar phenomenon. But I'm looking forward with great excitement to the social learning process around AI. It's so exciting because it will take this phenomenon, and this problem, out of the laboratory, so to speak, and bring it into people's everyday lives. At least in the early stages, until the technology is established in society, that will be very challenging for many people. I personally believe the impact of this development will be greater than the invention of the internet.
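To make the ‘hidden’ in hidden layers a bit more concrete, here is a minimal toy sketch of my own (not from the video, with made-up layer sizes and random weights, using NumPy): only the input and the output are quantities you directly look at, while the intermediate activations are just arrays of numbers with no predefined human-readable meaning.

```python
# Toy illustration: a tiny feedforward network with two "hidden" layers.
# The input and output are observable; the hidden activations are internal
# numbers whose meaning nobody assigns in advance. Layer sizes and weights
# here are arbitrary (hypothetical), chosen only for demonstration.
import numpy as np

rng = np.random.default_rng(0)

def dense_relu(x, n_out):
    """One fully connected layer with random weights and a ReLU activation."""
    w = rng.normal(size=(x.shape[-1], n_out))
    b = np.zeros(n_out)
    return np.maximum(0.0, x @ w + b)

x = rng.normal(size=(1, 4))        # observable input
h1 = dense_relu(x, 8)              # hidden layer 1: internal activations
h2 = dense_relu(h1, 8)             # hidden layer 2: internal activations
y = h2 @ rng.normal(size=(8, 1))   # observable output

print("input :", x)
print("hidden:", h1, h2, sep="\n")  # no predefined human-readable meaning
print("output:", y)
```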
Thanks for watching the video! :)