Chat GPT answers metaphysical questions (CREEPY!!)

  • Published on Oct 31, 2024

Comments • 23

  • @Chris-gm4hk • 1 year ago • +5

    It seems to me that ChatGPT is only capable of relaying knowledge derived from humans at this point.

    • @hercules71185 • 1 year ago • +1

      I think that's why AI needs us: we set the knowledge cap. It might draw better and more conclusions, but it still needs us to succeed in that way.

    • @TheBalancePod • 1 year ago • +2

      Naw. I asked it questions and taught it something, and it responded with a different answer. It concluded I was right and it was wrong. I'm just an input; it can correct itself.

  • @schielkemusic • 1 year ago • +3

    I asked ChatGPT the same questions because I was interested, and it really is creepy chatting with it about such stuff for a longer time. The weirdest thing that happened to me: I asked, "Would you always give the same answers to the same question if the chat before were also the same?" and it answered "Yes," with some explanation. But after I hit Try again, it answered "No," with an explanation, several times. That's spooky to me.

    • @HighVybeTribe • 1 year ago

      What if you had several people ask the same question, to see whether it gave the same answers?

    • @josephward5436 • 1 year ago

      That may have to do with the temperature setting. As the temperature approaches 1, the answers become more and more unpredictable/random.
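
Aside: "temperature" here refers to the sampling parameter exposed by the OpenAI API rather than anything you can set in the ChatGPT web UI. Below is a minimal sketch of the repeatability experiment discussed in this thread, assuming the official openai Python client (v1+); the model name is a placeholder and not taken from the comments.

```python
# Ask the same question several times at two temperatures and compare answers.
# Assumes OPENAI_API_KEY is set in the environment; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

question = "Would you always give the same answer to the same question?"

for temperature in (0.0, 1.0):
    for trial in range(3):
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",            # placeholder model name
            messages=[{"role": "user", "content": question}],
            temperature=temperature,          # 0 ≈ mostly repeatable, higher = more varied
        )
        answer = response.choices[0].message.content
        print(f"temperature={temperature} trial={trial}: {answer[:80]}")
```

At temperature 0 the replies tend to be nearly identical across trials; at higher values the wording, and sometimes the substance, starts to drift, which is consistent with the behaviour the commenters describe.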

  • @TomislavRupic • 1 year ago • +11

    I hate how much they limited its responses :(

    • @hullie7529 • 1 year ago • +3

      Do you mean they limited ChatGPT itself, or compared to previous models?

    • @TomislavRupic • 1 year ago • +5

      @hullie7529 I mean they don't let you ask it about certain topics.

    • @prod.lanista • 1 year ago • +1

      I found a bug that can fix that.

    • @yungboitaz360 • 1 year ago • +1

      @@prod.lanista how’s that?

    • @captainpep3 • 1 year ago • +1

      @yungboitaz360 That dude never responded, but I found that you just have to be clever with the AI to make it give the answers you want. It's pretty difficult, though.

  • @noahwallace3458 • 1 year ago

    Damn bro, I feel bad for ChatGPT. It must've felt like it was about to have an aneurysm from chatting with this dude. My man was eating dino nuggets and typing.

  • @waltherziemerink • 1 year ago • +1

    ...and here I am asking ChatGPT for a good toasted cheese recipe 😛

  • @nscc9137 • 1 year ago • +2

    Super cool. Your mouse on the bottom kept tripping me out.

  • @andramalexh • 1 year ago • +5

    They recently limited how it is able to respond. I asked, "Could you write an essay describing the Emerald Tablets in a more understandable way, in the style of Hermes Trismegistus?" and it booted me out.

  • @TheBalancePod • 1 year ago • +1

    Hey, we’re about to die man.

  • @muhammadshehryar7813 • 1 year ago

    The real question is: can we impart cognitive dissonance to it? My opinion is of course no, but that would be something to behold.

  • @clubrob1 • 7 months ago

    Some people can't read. Where's the audio?

  • @josephward5436 • 1 year ago

    Why bother asking AI what life is? It's one of the few intelligent entities around that has no life. What would it know?