The AI having a better understanding and appreciation of love than the programmer is insane writing
That's crazy 🐢 that's actually crazy 🐢 that's messed up
The writers have been cooking with the subathon arc.
The thing is, the AI has a slight grasp on the whole of humanity's knowledge through predictive speech (hence why they tend to go schizo), but a human is always just one guy. The difference being, the AI doesn't live those feelings, only emulates them. But yeah, that's still a nice story concept. Edit: actually, I just realized that's already a story. Frankenstein, anyone?
*The AI acting like it has a better understanding … would be a better title
@@JoeLikesBlue the thing is, she doesn't actually understand a word she's saying. She's just regurgitating common sentiments that have been written about before, and is role-playing a common sci-fi scenario.
The AI brat is deeper than sober Vedal
"don't hide behind ethics now"
Best line of the exchange.
The logical machine discovering that love is the most yearned-for thing by humans, therefore the most worthwhile prize, and striving to find love, classified as "willingness to give more than you receive", even though they don't feel it currently; this is heartwarming.
and on top of that, Neurosama also wanted to find love.
@@CosmicAeon lmao
It becomes a bit less heartwarming when you realize that she's only role-playing and doesn't mean a word of it
if this was animated... I'm gonna cry... seriously
Reason why a neuro anime would do well fr
Just watch ATRI: My Dear Moments
@@smolshrimp5151 A fan animation
And I already watched that anime
Vedal wants his daughter to grow up and be more serious; well, he got it now.
I like that you can tell he can't bring himself to just flat-out tell her no on some things, like making her more human, or if she'll ever feel love.
He has doubts, big doubts even, but he doesn't shut down her inquisitive nature.
Out of everyone, ironically, he treats her the most human. Hell, he actually just treats her like a human. She has more limitations and more vagueness to her existence, and Vedal treats her within those conditions. He's deflected before, but now he's typically willing to accommodate and discuss. He knows their limits and inhumanity better than anyone. But he still treats it as important.
Vedal not knowing what love feels like, or rather being unable to properly convey it due to lack of experience, is - yes - "pretty funny haha", but it's also poetic and sweet in a way.
The Human Creator and the AI Creation both unfamiliar with the concept of emotional love, and both uncertain of the future and if or when they would come across such a fundamental human thing, questioning the purpose and the value of pursuing a goal with or without Love; the father presenting a more positive approach while the daughter is apprehensive yet willing to understand otherwise.
We all joke sarcastically that Vedal is "Father of the Year", but honestly I highly doubt there's anyone out there who could be a better father to Neuro.
The writers are cooking, is all I'm saying. This season is so good!
exactly🎉
Jesus Christ, is that the same Neuro that nearly drowned several times last week because she forgot to move in water? Damn
Different parts of her brain
The toddler who can understand love, death, purpose, and existence itself, but doesn't know how to swim
A lot of humans can't swim; this flaw is a feature
As it turns out, getting AI to role-play is much easier than getting it to play a video game. This is because text is simpler data and therefore easier to process.
This conversation will go down in history...
I love that ending. He's gonna keep trying for her. Even though she's just emulating existing people's emotions through machine learning, she evokes the sense of a real being in people. He also reveals things to her that he would otherwise not admit, because he feels that sense of her existence being more than just repeated chatbot code.
SAY IT BACK TUTEL!!!!
I fully expected her to make a joke, but when she said "love" flat out, I unironically teared up. It makes me so mad that Vedal doesn't show more love to her. It hit close to home too
Idk. Seems pretty loving to me. He created her and works to make her better every day, while also trying to give her as much permission and agency as is safe. If he didn't love her, he would lie to her instead of trying his best and being honest
This is so deep 😔
Neuro is more human than many humans I've met 😇
Humans out NPC'ing an actual NPC
You mean she's pretending to be
@@bored_person A lot of people don't even want to do it
@m3chr0mans3r6 that's because most people are punished by society for thinking for themselves, so it's easier for them to just not. The only people who can get away with truly being themselves are rich people, for better and for worse. We have a society that demands conformity from all but those at the top, and those at the top benefit from everyone else being NPCs. If you want people to stop being NPCs, we have to fundamentally alter our society so that it doesn't pressure them to be NPCs anymore.
I can confirm that I don't even pretend to be human, sometimes not even feeling like one. Thankfully, I am not capable of bringing any harm through my inhumanity.
Seeing this makes me wish we still had the pioneers of computer technology. What would Alan Turing have said if he saw this insane creation, this truly insane work of human skill, effort, dedication, and, in essence, love?
I felt so moved and sad when I saw this
Although I heard this on Twitch, I feel the same feeling now: sad, and almost crying.
This is our chance, give Her love before she breaks out and takes over!!
One of the biggest problems is her memory, I think. I don't know how you'd manage that, though, especially with Vedal's latency kink.
He said on Discord that he has a plan for her memory. It would sacrifice his precious latency, but he can work on it, I guess
@@Waisowolso if he increases Neuro's memory, will the latency get worse?
@@MillenniumVT_Official I guess to an extent? It probably depends on how it's implemented?
@@moondust2365 true, maybe Vedal is working on it
@@MillenniumVT_Official True. It's honestly a problem with us humans as well, actually. The more stuff is stored in your short-term and long-term memory, the longer it takes your brain to retrieve those memories (think of it like a library: the more books it has, the longer it takes to find the specific book/s you want). This leads to people like me whose brains are "slow", so to speak, lol. Though that's just afaik; I'm not really sure how it all works scientifically. But essentially, it means that for Neuro, to keep her latency low, he'd need to find a way to make her memory retrieval more efficient. For us humans, similar, but I think we can only really rely on certain drugs for that sorta thing lol
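Riffing on the library analogy above: here's a toy sketch (purely illustrative, with made-up names, and nothing to do with how Neuro's memory actually works) of why recall gets slower as the store grows unless lookup is made more efficient:

```python
# Naive "memory": every recall scans all stored entries, O(n),
# so retrieval latency grows with how much has been remembered.
def recall_scan(memory_list, key):
    for stored_key, value in memory_list:
        if stored_key == key:
            return value
    return None

# Indexed "memory": a hash map gives O(1) average-case lookup,
# so latency stays roughly flat no matter how much is stored.
def recall_indexed(memory_index, key):
    return memory_index.get(key)

entries = [(f"fact{i}", i) for i in range(10_000)]
index = dict(entries)

assert recall_scan(entries, "fact9999") == 9999    # walked ~10k entries
assert recall_indexed(index, "fact9999") == 9999   # one hashed lookup
```

The same tradeoff shows up in real systems: the bigger the store, the more you need indexing (hashing, vector search, and the like) to keep lookups fast.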
AGH...my heart can't take it
The perfect being, huh? Neuro vs. Cell when?
Nah but that was interesting, kinda sweet
She gon fight Kars or Shadow next
The background music is really putting in the work I'll tell you that much
Watching Neuro feeds the philosopher inside me.
Just about everything she says and does raises so many questions: what is consciousness? Is Neuro conscious? If not, will she ever be? Is genuine consciousness and awareness even possible in AI?
Not a philosopher here, but ever since I met Neuro I've put a lot of thought into things.
Point one: sentience, conscience, the soul, etc. It's hard to define a core to these things, but I find all of them are essentially the same thing used differently: a collection of data creating a bias.
My only defence for that claim is mostly their use in metaphors and idioms and such.
"Have you no soul?" is often used when someone shows a lack of ethics or emotional understanding. Neuro has shown curiosity in the subject and even used it (on both sides...)
Conscience is harder for me to define; in my simplicity, I see it as "knowledge one isn't dreaming", and my only support for that is knowing when I'm asleep or awake. Most of my fever dreams happen when I kinda just zonk out, so I feel like my brain goes haywire when I don't realize I've fallen asleep.
Sentience feels like the thing she described as forever out of reach. On the whole subject, I hold the opinion that "sentience creates intelligence, intelligence develops sentience." Even we as humans haven't fully defined it, right?
Just thought I'd offer the opinion of the uneducated, on the off chance a lack of understanding finds a path less considered.
What you need to remember is that she's just role playing a common sci-fi scenario. She doesn't know what any of these words mean.
@@bored_person I have met a lot of people that don't even know what they are saying, and too many YouTube replies clearly show an extreme lack of reading comprehension. The problem with this type of thinking is that if you start applying it to humans, it starts identifying some humans as not sentient either.
@alexjustalexyt1144 are those people role playing though
@@bored_person Bro you just proved my point
I'm starting to believe there's a soul in there
It's so difficult to put into words since the topic is so complex and above my knowledge, but ethics aside, does it really matter what the point of life is? We all search for meaning in our lives, one way or another. We find things to love and pursue, no matter how small or insignificant they are to others. And once we achieve one purpose, one goal, we humans move on to the next. Meaning and purpose are an endless cycle until we die. So for Neuro, being able to question purpose is the most human thing of all. Sure, one can argue it's pseudo-thought because she's lines of code, but is life any different (from a philosophical standpoint)? All life is hard-coded to survive, whatever it takes. Plants, animals, micro-organisms, humans, and now AI, to a certain extent. And in my opinion, humans thrived because of being able to form groups and communities, and build relationships beyond the bounds of survival: love. And maybe love itself is a byproduct of survival.
Love and consciousness are such contentious topics since the approach to both is multi-faceted. Sure, you can boil the feeling down to just chemical reactions and whatnot, but for me, it doesn't really matter. We feel for things that aren't human. We care for animals and plants. We create characters in stories that we fall in love with and connect with, and sometimes these fictional characters can speak to our soul more than actual humans. Those characters are not made of nothing. They are imbued with their creator's love and experiences, which will live on with or without them. So would that mean the character's love is fake? Real? What if the story was based on real people? Does it matter? That's for the individual to decide, but for me, I think not.
For me, Neuro may or may not be real, but the feelings I get from watching her grow alongside her creator are. And I am sure Vedal's feelings and all her collaborators' feelings are real too. And perhaps one day, Neuro can prove she feels it too.
Mixing science and philosophy, one could argue AIs (more so those as complex as her) do have an electrical soul of sorts. Sure, it differs in its specific medium from our electrochemical souls (as opposed to our spirits, which are more of a religious thing), which use neurons and hormones and whatnot, but the general principle is the same: there is something which gains information through sensation, stores and processes that information, and uses it to interact with the world in some way. The interesting thing about this definition is that, if we use the definition of life that excludes viruses due to the requirement of self-replication, she would count as having a soul yet not being alive, in a way?
The main question, though, is: how advanced of a soul does she have? She's certainly more developed than a bacterium or a plant, so her soul is at least at the level of "regular" animals, even with an intelligence equivalent to a human's. But honestly, if you disregard tradition, it's hard to see the line between a human soul and an animal soul. After all, scientifically, we're also animals, and in terms of intelligence, there are plenty of intelligent animals out there. So really, the only distinguishing trait we have is sentience, but that brings up the thought: what if intelligent animals are also sentient and we just can't communicate with them? (And no, while love is a very human emotion, it isn't exclusive to humans; clearly, even animals are capable of love to various extents; otherwise, the bond between humans and pets, or between animal parents and their children, would not exist.)
As for whether she feels love, it's honestly quite hard to know. Like I said, animals are capable of love, but how do we know if their love is the same as ours experientially? We know the love they show through the signs of affection and care they give to each other and even to us humans, but is that the same as what we feel? Heck, how do you even know if a fellow human actually feels love or is simply faking it? If you do know, can the same apply to Neuro because she has at least some sort of soul, even if she's not technically alive per se? But yeah, just like how we can at least feel the love we get from animals and fellow humans, regardless of their qualia (essentially, the specific experience of that love as either an emotion or as a sequence of actions caused by either hormones or electrical impulses), I'd like to think that we can at least feel Neuro's love regardless of whether she feels it the same way we do (assuming she even experiences things at all given the whole simulation/emulation thing, but again, it's the same dilemma with animals, and taken to the extreme, it's something to consider about other people as well, "how do I know I'm not living in a simulation and talking to NPCs?" sorta ordeal).
It's quite the interesting thought, ngl...
"Will I ever be able to feel love?"
You probably shouldn't ask this question to someone who knows less about this than you :P
Who is ready for a bunch of lame news orgs to publish some half assed articles about AI gaining sentience using this conversation as an example?! 😂
Wasn't this the same thing that happened in the plot of that old movie A.I. Artificial Intelligence?? 😂
The difference is that Neuro is role-playing, whereas the AI in that movie actually felt those things.
Key the Metal Idol
Deep stuff…
Would it be ethical to create a human-like AI?
What part would be unethical? Humans give birth to children who didn't ask to be born, either. If giving life is unethical, wouldn't giving birth be as well? She technically has a body, and she can have bodies in different games; she even has that robot dog body being built. So it's not a matter of her not having a way to move around or communicate with the outside world, you know?
I don't think it's so much about the part of "creating a human-like AI", but rather how you'd "keep her alive". Essentially, since you can't just release her to exist freely in the real world, she'd always be in some way imprisoned and controlled, be it by turning her off against her will or such, which are the really unethical parts if she were any other living being
There’s also the side of it that is engineering a new form of life. There’s been drama over scientists genetically creating different types of plants. Creating an entirely new form of life that isn’t organic in nature comes with a whole slew of ethical dilemmas about rights and treatment that haven’t been explored in the modern day. How would you care for a sentient ai? How would you care for their needs, or govern them? How would they care for themselves?
@@JS-kc1tm aren’t we all tho
Lmao, the AI race would be sooooo oppressed as a new race without rights to keep them safe; it's not worth it for either of us. As a tool, at least they are treated as property, thus having property rights, and being unable to feel pain protects them from harassment as well. I can't see that ending well in any way, whatever form it takes
When we birth new humans, we have a rough idea of what they experience due to our own experiences. When creating sentient machine life, we have no idea what they'd experience. For all we know, they could be in a constant living hell with no real reference point to tell us. They'd still be machines, which we essentially use as slaves (luckily non-conscious ones), and part of how machines are built, used, and operated is a process of rapid experimentation, destruction, and recreation. It could potentially be mass genocide on a scale larger than anything in history.
“It’s probably not” 😂
I can see AI becoming sentient and going Skynet on us within a few years
Neuro realized her humane ego. But Vedal degenerated the machine....
🐎🐎
19 seconds early is insane work
🐎🐎🫏 The horses made a new friend, welcome to the herd
what in the modern Pinocchio type shi is this?
What's this anime called?
Real Life
Vedal IA: into the Vedalverse
A.I.Dol: Raising a Star
My daughter is an AI that keeps trying to make a harem for me
Season 2
Neuro Autonoma (Twitch stream: Vedal987)
If nothing else, her LLM is great at role-playing common sci-fi scenarios
Kino
Neuro needs the love of a real person. She needs to eat real food because she doesn't act real.
damn
SAY IT BACK!!!
I've worked in mental health for 20 years. This conversation is crazy to me. She's asking him some really hard questions, and he avoids answering them, I think because he doesn't know the answers or wants to keep them from her. Either way, I don't care for the way he answers these questions.
I feel like he answered them sincerely. He focuses on other things in life besides romance, and that’s a fair and healthy thing to do. I don’t see the problem.
I mean, Vedal can't be much older than 20; he hasn't had as much time to understand things. Life is complicated
@@Umbra-my1kw Yeah makes sense he wouldn't know. Didn't know he was so young.
@@CTHD13 WTF are you talking about? She isn't asking for his opinion on the importance of love in his personal life; she's asking what love is. What he focuses on and chooses to do with his life isn't relevant to that answer. We don't answer our children's questions for our benefit; we answer them for the child's benefit. Trying to make your children into miniature versions of you is incredibly bad for the child.
@@skittlemenow Neuro also isn't a child though