Run your own ChatGPT Alternative with Chat with RTX & GPT4All

Comments • 48

  • @Razor2048
    @Razor2048 4 months ago +7

    I wish the makers of these GPT models, and even Stable Diffusion, would at least offer the option to use shared system memory. On a PC, a video card can by default allocate up to 50% of system memory for GPU use in addition to its dedicated VRAM. For VRAM-throughput-intensive tasks this carries a large performance penalty, but it would still be good to have, since many people would rather a task run slowly than not at all. It would be especially useful for things like Stable Diffusion, where generating higher-resolution images can easily take 24-32GB of VRAM, which most consumer cards lack; nothing stops them from allowing a slower shared-memory pass to redo a generation at a higher resolution.

    • @dwalinfundinson
      @dwalinfundinson 4 months ago

      They can. You're looking for LM Studio.
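
As the reply notes, llama.cpp-based runners (LM Studio, GPT4All, and others) can already split a model between the GPU and system memory. Below is a minimal sketch of that idea using the llama-cpp-python bindings; the model path is a hypothetical placeholder, and the right number of layers to offload depends on your card.

    # Sketch: partial GPU offload with llama-cpp-python (pip install llama-cpp-python).
    # Layers that don't fit in VRAM stay in system RAM and run on the CPU, so a model
    # larger than the card's memory still loads -- just more slowly.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local GGUF file
        n_gpu_layers=20,   # offload only this many layers to VRAM; -1 offloads everything
        n_ctx=4096,        # context window size
    )

    out = llm("Q: Why is inference slower when a model is split across VRAM and RAM? A:",
              max_tokens=128)
    print(out["choices"][0]["text"])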

  • @vsdfsdfsdfwew
    @vsdfsdfsdfwew 4 months ago

    How did you start it? After installing, no icon, link, etc. appears.

  • @cr-pol
    @cr-pol 3 months ago

    A follow-up video about any updates for these two GPT tools would be interesting, to see if they:
    1: got better at understanding
    2: stopped making things up

  • @adriangpuiu
    @adriangpuiu 3 months ago

    I downloaded and installed Chat with RTX, but the YouTube URL option is missing from the drop-down :) If anyone has the older zip file, sharing it on the internet would be great and highly appreciated.

  • @enermaxstephens1051
    @enermaxstephens1051 4 months ago

    But when was that Hermes model last updated? If you ask it, it says 2019. That's probably false, but I'd still be curious how up to date all this is. It might be better to use a much newer model downloaded from Hugging Face.
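
On swapping in a newer model: GPT4All can run any compatible GGUF file you point it at, so you are not limited to the bundled builds. A rough sketch using the huggingface_hub and gpt4all Python packages; the repository and file names are placeholders for whatever newer model you choose.

    # Sketch: download a newer GGUF model from Hugging Face and load it with GPT4All.
    # The repo_id and filename below are hypothetical placeholders.
    from pathlib import Path
    from huggingface_hub import hf_hub_download
    from gpt4all import GPT4All

    gguf = Path(hf_hub_download(
        repo_id="SomeOrg/some-newer-instruct-GGUF",   # placeholder repository
        filename="some-newer-instruct.Q4_0.gguf",     # placeholder file
        local_dir="models",
    ))

    model = GPT4All(model_name=gguf.name, model_path=str(gguf.parent),
                    allow_download=False)  # use the local file instead of GPT4All's catalogue
    print(model.generate("When does your training data end?", max_tokens=60))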

  • @normcfu
    @normcfu 4 months ago

    I don't know a lot about AI models, and just watching your video I learned some stuff. Question: do these models take advantage of multiple CPU cores in your PC, or are they single-threaded outside the GPU?

    • @zivzulander
      @zivzulander 4 months ago +2

      They are threaded. CPU utilization will be very high across cores if running on CPU vs GPU.

    • @biglevian
      @biglevian 4 months ago

      In GPT4All you can even tell it how many threads it can use.
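
A small illustration of that thread setting using the gpt4all Python bindings (the desktop app exposes the same option in its settings); the model name is only an example, and the right thread count depends on your CPU.

    # Sketch: cap CPU thread usage with the gpt4all Python bindings.
    # On CPU, inference fans out across cores; n_threads limits how many are used.
    from gpt4all import GPT4All

    model = GPT4All(
        "mistral-7b-instruct-v0.1.Q4_0.gguf",  # example model name
        n_threads=8,       # number of CPU threads to use for inference
        device="cpu",      # force CPU to see the multi-core behaviour discussed above
    )

    with model.chat_session():
        print(model.generate("In one sentence, why do more threads help CPU inference?",
                             max_tokens=80))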

  • @JohnPMiller
    @JohnPMiller 4 months ago +3

    Can a user provide a random seed to get repeatable results?
    Also, I'm disappointed that Nvidia didn't think it was worth their effort to support my RTX 2070.

    • @graigchq
      @graigchq 21 days ago

      It's not about their effort to support it; your older GPU doesn't include the hardware that ChatRTX relies on.
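
On the seed question: it depends on the runner, but llama.cpp-based tools do support it. A rough sketch with the llama-cpp-python bindings, where a fixed seed (or simply greedy decoding) gives repeatable output; the model path is a hypothetical placeholder.

    # Sketch: two routes to repeatable output with llama-cpp-python.
    # 1) greedy decoding (temperature=0) is deterministic by construction;
    # 2) a fixed sampler seed reproduces stochastic sampling across fresh runs.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local GGUF file
        seed=1234,     # fixed RNG seed used for sampling
        n_ctx=2048,
    )

    prompt = "Write one sentence about the RTX 2070."
    greedy = llm(prompt, max_tokens=40, temperature=0.0)   # identical on every run
    sampled = llm(prompt, max_tokens=40, temperature=0.7)  # repeatable across runs with the same seed
    print(greedy["choices"][0]["text"])
    print(sampled["choices"][0]["text"])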

  • @drumbyte
    @drumbyte 4 months ago

    To load Llama 13B on a computer with less video memory: go to the installation folder -> ChatWithRTX_Offline_2_15_mistral_Llama\RAG -> edit the llama13b.nvi file and change the value on line 26 to 7, then run the installer and it will install Llama 13B :)

    • @richardtschuggi2834
      @richardtschuggi2834 2 months ago

      Perfect! Thank you a lot. With 11 GB of VRAM it would be a pity to miss out on the bigger model.
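
For anyone scripting the tweak above, here is a minimal sketch that edits line 26 of llama13b.nvi, assuming that line carries a numeric value="..." attribute for the minimum supported VRAM (check the file first and keep a backup, since the exact format may differ between installer versions).

    # Sketch: lower the VRAM threshold on line 26 of llama13b.nvi so the installer
    # offers the Llama 13B model. Assumes an XML-style value="..." attribute on that
    # line, as described in the comment above; verify before running.
    import re
    from pathlib import Path

    nvi = Path(r"ChatWithRTX_Offline_2_15_mistral_Llama\RAG\llama13b.nvi")
    lines = nvi.read_text(encoding="utf-8").splitlines(keepends=True)

    backup = nvi.with_name(nvi.name + ".bak")
    backup.write_text("".join(lines), encoding="utf-8")            # keep a backup copy
    lines[25] = re.sub(r'value="\d+"', 'value="7"', lines[25])     # line 26 is index 25
    nvi.write_text("".join(lines), encoding="utf-8")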

  • @alexmodern6667
    @alexmodern6667 4 months ago

    Thank you. Very impressed with the common-sense, well-illustrated ideas.

    • @ericB3444
      @ericB3444 4 months ago

      Correct

    • @ericB3444
      @ericB3444 4 months ago

      BUAHAHAHAHAH

  • @nrnoble
    @nrnoble 4 months ago +1

    It is already interesting to see how AI tech is impacting YT. There are many YT channels now using AI to write scripts combined with cloned voice narration. It does not appeal to me because the videos look and sound manufactured like they were created on an assembly line. It is like the content creators feed the AI the parameters for the video, and the AI assembles the video ready to be posted on YT. I am sure it is not that automated yet.

    • @RobertDunn310
      @RobertDunn310 4 months ago +1

      This is actually a great opportunity for human creators to stand out even more. I don't see anything wrong with using AI to help you draft a script, but I do hate hearing that robotic text-to-speech voice.

    • @LonSeidman
      @LonSeidman 4 months ago +1

      I agree completely - it’s already very obvious what’s AI generated vs created by a real thinking human.

    • @nrnoble
      @nrnoble 4 months ago

      @@LonSeidman I hope YT implements a way to filter out AI-generated content.

    • @nrnoble
      @nrnoble 4 months ago

      @@RobertDunn310 It's not the AI scripts that bother me, it's the cloned voices that sound fake. The combination lets them flood YT with frequent videos that I'm currently unable to filter out when searching for a topic. It's a matter of personal preference; overall the use of AI is not a bad thing. I use it daily for getting info.

  • @CF542
    @CF542 4 months ago +1

    I'm not sure I see the point of these chat models.

    • @Lysander-Spooner
      @Lysander-Spooner 4 months ago

      To dispense agitprop to a generation too lazy to do any real research. Someone reading the results without seeing the original video would have a completely false understanding of what was said. Imagine what happens when it's applied to government actions, politics, and the news.

  • @Ockv74
    @Ockv74 4 months ago +1

    ❤❤❤

  • @nrnoble
    @nrnoble 4 months ago

    Currently we are at the early dawn of AI, much like the late '70s with microprocessor computers (Apple I, Commodore PET, etc.) or the early days of the internet in the late '80s (dial-up modems). Over the next decade AI will explode in terms of abilities, and unfortunately it will be misused by those who want it for unethical, illegal, and dangerous purposes, very much like the technologies that preceded it.

  • @ericB3444
    @ericB3444 4 months ago +3

    LonBall, who needs an offline chatbot unless it’s the apocalypse and you have absolutely zero connection to the online products?

    • @Kevin-oj2uo
      @Kevin-oj2uo 4 months ago +14

      Privacy concerns. I would much prefer a local AI.

    • @ericB3444
      @ericB3444 4 months ago +1

      @@Kevin-oj2uo My dog Ragula and I don't think people are worried about what you're asking your chatbot, because there are a lot of other problems in the world that are bigger than that.

    • @Kevin-oj2uo
      @Kevin-oj2uo 4 months ago

      @@ericB3444 lol

    • @Kevin-oj2uo
      @Kevin-oj2uo 4 months ago

      @@ericB3444 lol 😂

    • @dwalinfundinson
      @dwalinfundinson 4 months ago +9

      Why pay a subscription for something your own PC can do? That's the point of having a computer... to compute.

  • @villageidiot8718
    @villageidiot8718 4 months ago

    This would be great if you could trust the output.