It seems to me that ChatGPT is only capable of relaying knowledge derived from humans at this point.
I think that's why AI needs us; we set the knowledge cap. It might draw better and more conclusions, but it still wants us to succeed in that way.
Naw. I asked it questions and taught it something, and it responded with a different answer. It concluded I was right and it was wrong. I'm just an input; it can correct itself.
I asked the same questions to ChatGPT because I was interested, and it really is creepy chatting with it about such stuff for a long time. The weirdest thing that happened to me: I asked, "Would you always give the same answers to the same question if the chat before was also the same?" and it answered "Yes," with some explanation. But after I hit Try again, it answered something starting with "No," plus an explanation, several times. That's spooky to me.
What about if you had several people ask the same question, to see if it gave the same answers?
That may have to do with the temperature setting of ChatGPT. As the temperature increases toward 1 and beyond, its answers become more and more unpredictable/random.
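For anyone curious what "temperature" actually does: here's a minimal, self-contained sketch of temperature-scaled sampling (not ChatGPT's actual code, and the toy logits are made up). Dividing the logits by the temperature before the softmax makes low temperatures near-deterministic and high temperatures closer to random:

```python
import math
import random

def softmax_with_temperature(logits, temperature):
    # Scale logits by 1/temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(logits, temperature, rng=random.Random(0)):
    # Pick a token index according to the temperature-adjusted probabilities.
    probs = softmax_with_temperature(logits, temperature)
    return rng.choices(range(len(logits)), weights=probs)[0]

# Toy logits: token 0 is strongly preferred by the model.
logits = [5.0, 1.0, 0.5]
cold = softmax_with_temperature(logits, 0.1)  # near-deterministic
hot = softmax_with_temperature(logits, 2.0)   # much flatter distribution
print(cold[0] > hot[0])  # low temperature concentrates mass on the top token
```

So two runs of the same prompt can genuinely differ whenever the temperature is above zero: the model samples from a distribution rather than always taking the single most likely next token.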
i hate how much they limited its responses :(
Do you mean they limited ChatGPT itself, or relative to previous models?
@@hullie7529 I mean they don't let you ask it about certain topics..
I found a bug that can fix that.
@@prod.lanista how’s that?
@@yungboitaz360 That dude never responded, but I found that you just have to be clever with the AI to make it give the answers you want. It's pretty difficult, though.
Damn bro, I feel bad for ChatGPT. It must've felt like it was about to have an aneurysm from chatting with this dude. My man was eating Dino nuggets and typing.
...and here I am asking ChatGPT for a good toasted cheese recipe 😛
Super cool. Your mouse on the bottom kept tripping me out.
They limited how it is able to respond recently.
I asked, "Could you write an essay describing the Emerald Tablets in a more understandable way, in the style of Hermes Trismegistus?" And it booted me out.
😭😭😭
Hey, we’re about to die man.
The real question is: can we impart cognitive dissonance to it? My opinion is of course no, but that would be something to behold.
Some people can't read. Where's the audio?
Why bother asking AI what life is? It's one of the few intelligent entities around that has no life. What would it know?