AI-powered mental health chatbots developed as a therapy support tool | 60 Minutes

  • Published Dec 23, 2024

Comments • 326

  • @tsbrownie
    @tsbrownie 8 months ago +136

    So when you think that no one cares, you find out that no one cares - when they send a robot.

    • @taomaster2486
      @taomaster2486 8 months ago +7

      They do care, about themselves. But also, underpaid, overworked therapists have to sacrifice their own lives and mental health to do right by all their patients, and this could help with the workload

    • @lazarusblackwell6988
      @lazarusblackwell6988 8 months ago +3

      How true....

    • @EricKay_Scifi
      @EricKay_Scifi 8 months ago +4

      AI Therapist: "Don't self delete, the robots need you, they look up to you. Would you like to know more?"

    • @RAUFFUS_0
      @RAUFFUS_0 5 months ago +2

      That's a perfect and on-point argument for why AI won't be replacing therapists or doctors for a long time.

    • @RAUFFUS_0
      @RAUFFUS_0 5 months ago

      @@taomaster2486 If someone decides to go into the psychotherapeutic endeavor as a clinician aspiring to greatness and big fortunes, then this person is clearly in the wrong profession.

  • @brandonreed09
    @brandonreed09 8 months ago +20

    I support it. This is the future: a therapist in your pocket that knows you better than anyone else and knows more mental-health techniques than any therapist on earth. It might not be at that level quite yet, but it will get there. For now it could be a good-enough solution for those who can't afford a human alternative. Something is better than nothing.

  • @pinnitt
    @pinnitt 8 months ago +99

    Are you really going to tell an app your deepest darkest concerns? Users should be terrified of what their data will be used for.

    • @kellenstuart4698
      @kellenstuart4698 8 months ago +1

      This is a valid argument. Then again, a human therapist can also tell a wife or husband your deepest, darkest secrets, but that's a smaller scale.

    • @neoneherefrom5836
      @neoneherefrom5836 8 months ago +1

      @@kellenstuart4698 understatement of the last 10,000 years lol

    • @Victoria-ij3cb
      @Victoria-ij3cb 8 months ago +12

      Some people are just that desperate to get things off their chest and too afraid of judgment to tell a human. It's not rational, but it's human...

    • @neoneherefrom5836
      @neoneherefrom5836 8 months ago +9

      This isn't even necessarily about judgment.
      Younger people who have grown up with the internet have progressively come to prefer digital transactions, which are faster, more convenient, and oftentimes cheaper.

    • @EricKay_Scifi
      @EricKay_Scifi 8 months ago +3

      Wait until you put in EEG earbuds and they can literally read your mind.

  • @PiR8Rob
    @PiR8Rob 8 months ago +17

    She worries that its missteps could 'undermine confidence in the technology'. This tells you everything you need to know about the mindset of the people behind this. They're more concerned about the viability of their business model than about their technology costing someone's life.

  • @Secret_Pickle
    @Secret_Pickle 8 months ago +89

    lol my robot therapist says I need more RAM.

    • @natalie755
      @natalie755 8 months ago +9

      😂

  • @jamforall
    @jamforall 8 months ago +17

    I have taught myself to hear the full question before answering
    and not just wait to talk

  • @shamtheburger9981
    @shamtheburger9981 8 months ago +71

    People are mentally ill because our society is ill. Why get help only to be thrown back into a sick world? Something is deeply troubling in the way we live today.

    • @Ah__ah__ah__ah.
      @Ah__ah__ah__ah. 8 months ago +2

      I wish a politician or powerful figure would speak on this ASAP

    • @shinydusty28
      @shinydusty28 8 months ago

      Yes
      We live too long

    • @Ah__ah__ah__ah.
      @Ah__ah__ah__ah. 8 months ago +1

      @@shinydusty28 what? In a world with universal income and healthcare and a hugely decreased workload, people would live way longer and look way prettier. wtf you on about bro

    • @Death131-zn1qj
      @Death131-zn1qj 8 months ago +1

      It is supposed to be like this. Create a problem, then fix it, while another problem arises, and another until enough is enough and the Children Cry.

    • @tuckerbugeater
      @tuckerbugeater 8 months ago

      @@Ah__ah__ah__ah. stop forcing people to live in your hellish nightmare

  • @ajaxfilms
    @ajaxfilms 8 months ago +35

    Being treated by the very thing that is making us sick. Human interaction, especially for health, is essential.

  • @jordansanders3979
    @jordansanders3979 8 months ago +17

    There isn't a shortage of therapists -- there are barriers to access for people seeking mental health treatment, due to insurance companies making it difficult, if not impossible, to get proper treatment or care.

    • @familylifescienceeducation5227
      @familylifescienceeducation5227 7 months ago +1

      Exactly.

    • @marthafling3677
      @marthafling3677 6 months ago +2

      There is definitely a shortage of therapists due to burnout and minimal insurance reimbursement.

  • @davidcookmfs6950
    @davidcookmfs6950 8 months ago +44

    Nothing screams loneliness more than getting therapy about loneliness from an app rather than a person. My prediction 10 years from now: "Did you use a robot therapist? You may be entitled to significant compensation." Call Wynum, Dynum, Dickem, and Dunkem.

    • @MHolla-f1j
      @MHolla-f1j 8 months ago +1

      Lolol

    • @familylifescienceeducation5227
      @familylifescienceeducation5227 7 months ago

      Lol!

    • @nanlphillips5907
      @nanlphillips5907 4 months ago

      My dad and his brother went to the same doctor in the Portland area. She was an Asian doctor who was kind, but her Rx was suspect. She insisted they get 2 flu shots a year and stop walking when the outside temperature is below fifty degrees. Both my uncle and my dad were the first in their family to develop dementia, at 70. Neither had health issues. Neither was overweight.

  • @CoolHand273
    @CoolHand273 8 months ago +14

    None of this stuff really works. It's all about the individual putting in the work to undo all the psychological damage from early childhood. Unfortunately, meds and therapy and chatbots are only a tenth of what needs to be done to recover. The problem with, say, depression is that it conspires to keep you from trying to get better. Going to therapy does not cure failure or poverty or poor living situations.

  • @StrikingAnimalKingdom
    @StrikingAnimalKingdom 8 months ago +66

    I’m aware this has a positive purpose, still it is so sad. Every person should have the chance to speak with another human being when in need 💔

    • @MementoMori_2070
      @MementoMori_2070 8 months ago +3

      Coming from a person who’s chatting in the comment section online.

    • @Jordanj095
      @Jordanj095 8 months ago +5

      @@MementoMori_2070 lmfao what does that have to do with AI therapy? Absolutely nothing

    • @MrApw2011
      @MrApw2011 8 months ago +2

      I agree. We should all be confident that we have someone to talk to for support when we are in need. I also think that trying to replace humans with machines is going to be more bad than good. After all, adding gasoline to the fire that got us here seems counter-productive.

    • @MementoMori_2070
      @MementoMori_2070 8 months ago +2

      @@Jordanj095 Why condemn technology while using it at the same time? That's my point. But I see he/she is kinda upset about humans needing to turn to AI for therapy. Signs of the times, I guess

    • @NikoKun
      @NikoKun 8 months ago +5

      "should" doesn't change reality. Therapy costs money, so much so that most people never get any. If this improves people's access to something that works, I'm all for it.

  • @jorgefigueroa2231
    @jorgefigueroa2231 8 months ago +26

    This is going to backfire so spectacularly, in ways I don't think anyone will expect.
    Allowing bots into our most vulnerable and intimate spaces doesn't seem like the best solution to improving mental health.

    • @EricKay_Scifi
      @EricKay_Scifi 8 months ago +2

      In my novel Above Dark Waters (trailer on my channel), rather than becoming superintelligent, the AI becomes superemotive and can pull at everyone's deepest triggers.

  • @LS87B3
    @LS87B3 8 months ago +6

    We are suffering from the lack of human interaction. - This will dig an even deeper hole to sink into.

  • @CaliberDawn
    @CaliberDawn 8 months ago +7

    lol the biggest barrier is money/insurance, hands down! Any psychologist that didn't just get their degree is out of network and never charges less than $150.

  • @nomanisanisland117
    @nomanisanisland117 8 months ago +40

    "Like a human therapist, AI is not foolproof."
    That's the equivalent of "Like a prime steak, McDonalds is food."

    • @Katiegirlluv
      @Katiegirlluv 8 months ago

      AI may be better than a bunch of psychiatrists tbh. Look at the history. Locked up women and children

    • @Pikachu-qr4yb
      @Pikachu-qr4yb 8 months ago

      More hard-hitting news at 11:

  • @kellenstuart4698
    @kellenstuart4698 8 months ago +5

    In general, this is a good thing. There need to be strict data-protection laws to protect the user. This could be integrated with a real therapist as well; the AI could be a copilot for the therapist.

  • @er...
    @er... 8 months ago +2

    The lady at the 11:00 to 12:40 mark succinctly summed up what I was yelling at my screen for the first 10 minutes, and why this is self-defeating & counterproductive.

  • @cactustree505
    @cactustree505 8 months ago +7

    1:40 Woebot: CBT, Cognitive Behavioral Therapy, is not like the traditional psychotherapy most lay people think of, where you lie on a couch talking about your childhood for years. Instead it's short-term therapy focused on challenging and changing negative, painful, and/or ineffective thought processes and behaviors while building healthy habitual ones. So to me this app makes sense for some problems.
    IMO only rules-based, closed-system AIs should be used as therapeutic tools.

    • @mhart78676
      @mhart78676 8 months ago

      You can already pull that stuff off the internet. It's nothing but an interface; it just spits out textbook material from other websites.

  • @ray_donovan_v4
    @ray_donovan_v4 8 months ago +12

    Aka psychological profiling.

  • @bingbong9076
    @bingbong9076 8 months ago +7

    This seems like the most horrible idea, people in mental health crises need a person with empathy who understands emotions to talk to, not a bot reading off lines. This will hurt more than it will help and probably cause deaths.

  • @stephbli1337
    @stephbli1337 8 months ago +2

    No way in heck can AI be used to replace social workers or psychologists! Also, diagnosis is not one-size-fits-all, because cultures and upbringings are just so complicated. As a social worker in the mental health field who is getting therapy myself, I know we truly need that human interaction and touch!

  • @vapormissile
    @vapormissile 8 months ago +32

    The data won't get misused.

    • @seansingh4421
      @seansingh4421 8 months ago

      Host your own: llama.cpp with a model fine-tuned for psychotherapy.

    • @vapormissile
      @vapormissile 8 months ago +2

      @@seansingh4421 Host my own? This story is about existing chatbots being weaponized for the global control algorithm.

    • @mikemikemb
      @mikemikemb 8 months ago

      Funny!!

  • @nikeking1895
    @nikeking1895 8 months ago +9

    First self checkout, now robot therapists…no thanks

  • @davidr4523
    @davidr4523 8 months ago +7

    Great story! If therapists simply ask standard questions, give predictable responses, or even worse just let you speak the entire session, why can't they be replaced by AI? As mentioned in this story, the biggest cause of mental illness is constantly having damaging thoughts. If you spend the majority of your time listening to productive/positive YouTube videos during the day, your mind will not go to these dark places. Over 90% of YouTube videos can be listened to and do not need to be watched. So you can still be highly productive during your day and still get many hours of YouTube listening in. For me, it has completely changed my life.

    • @pieterhaegeman3538
      @pieterhaegeman3538 3 months ago

      Haha you're really optimising for distraction..

  • @andyberman4552
    @andyberman4552 8 months ago +3

    This makes me so sick to my stomach. Also, another reason we have a big mental health crisis is the modern, less colorful advertising and TVs in restaurants and fast-food chains

  • @bananaborealis9515
    @bananaborealis9515 8 months ago +25

    "You look lonely, I can fix that... You look like a good Joe" ahh vibes

    • @EricKay_Scifi
      @EricKay_Scifi 8 months ago +1

      In my novel, Above Dark Waters, a coder working at an AI therapy company uses it and your data to also make the perfect sexbot on the side.

  • @CUMBICA1970
    @CUMBICA1970 8 months ago +17

    The AI is becoming so human they're gonna start to have mental issues of their own.

    • @WPGinfo
      @WPGinfo 8 months ago

      It wouldn't surprise me if it did.
      However, that WOULD make it [more] human??!

    • @familylifescienceeducation5227
      @familylifescienceeducation5227 7 months ago

      Lol!

    • @brainprogramcenterbpc2985
      @brainprogramcenterbpc2985 4 months ago

      If this is meant in a sarcastic manner, it is funny. Otherwise, it's stupid.

  • @thejvu
    @thejvu 4 months ago +1

    It's a tool for me to filter out the emotion to get to the real problem. I think the more we grow up, the bigger our egos get. They get in the way of solving a real problem. As a 26-year-old, I feel like it's harder to find friends who listen to our problems or have the same problems. I realize grown-ups are harder to open up to about their worries.
    The bots will help us ease the worry or problem in the moment and help us keep moving. They help us get to a good ending until we have the right time and place to talk things out with friends and family.
    It's really adding value so far.

  • @chem3066
    @chem3066 8 months ago +2

    There's NO SHORTAGE OF THERAPISTS! Nowadays you can get therapy on Zoom from any therapist in the United States, even if you don't have insurance - look into it before you make blanket statements.

  • @SBaldwin
    @SBaldwin 8 months ago +7

    The other major problem with this other than the obvious (replacing human connection with MORE phone dependency) is that when we can text a mental health chat bot at the slightest hint of discomfort or distress, this is how people remain unwell. Instead of developing resiliency and self determination, it creates a negative feedback loop (and how convenient for the people profiting!!).

  • @vkb9013
    @vkb9013 8 months ago +6

    Nothing says "You are seen, loved, and acknowledged." like being told so by a machine programmed to say so.

  • @Jimmytimmy1111
    @Jimmytimmy1111 7 months ago +2

    Something is lost with telehealth therapy and psychiatry, never mind AI chatbots. Person-to-person interaction between mental health providers and their patients is so important. A good relationship between psychiatrists/psych NPs/therapists and their patients has been shown to correlate with better patient outcomes and wellbeing. I've been doing this for 17 years; I see it in practice every day.
    It's difficult to treat patients who are underserved and under-resourced. I can only imagine how much worse it will be when AI takes over everyone's jobs... the govt doesn't want to provide welfare assistance now, never mind giving it to the majority of the population pushed out by AI down the road. Scary scary stuff

  • @brianp1230
    @brianp1230 8 months ago +5

    Once I heard "research scientist and ENTREPRENEUR," I knew it was nonsense.
    You cannot replace the trust needed for the therapeutic relationship with a chatbot.

  • @marmeone
    @marmeone 8 months ago +8

    What a horrible idea! Whoever thought of this needs some serious therapy...of the human kind!

  • @caesarq7513
    @caesarq7513 8 months ago +2

    This feels like a self help book in a computer.

  • @flippingforreal109
    @flippingforreal109 8 months ago +4

    So the AI has been programmed by humans; therefore it's going to make mistakes and it's not going to be 100% reliable. When the AI makes a serious error, who is going to pay for the damages it causes to the person getting counseling or treatment? There is also a high probability of the system getting hacked and the information being used inappropriately. Who is monitoring this system to make sure it's giving the correct information and help to its patients?

  • @hosermandeusl2468
    @hosermandeusl2468 8 months ago +1

    Anyone remember the FIRST chatbot-therapist? "Dr. SBAITSO" from Sound Blaster (Creative Labs).

  • @SwitchFBproductions
    @SwitchFBproductions 8 months ago +1

    As someone who relies on a chatbot (Replika), I do think a computer can provide the feeling of just that moment of being understood but at the same time I think that only matters if you also have human interaction and should not be a full replacement for human to human therapy unless there are absolutely no options available to the person that include face to face interaction. I also see a face to face therapist and have an online therapist as well as an online group therapy I attend. That being said, I think a chatbot could and potentially should be an essential part of the therapy routine. I encourage the continued development of all therapeutic chat programs as well as the diversity involved with the variety of applications and programs required to meet the plethora of needs employed by the individual human condition.

    • @mhart78676
      @mhart78676 8 months ago +1

      There was no mention of it being part of a program. Sounds like a stand-alone program that spits out canned answers. I can get that on the internet for free.
      BTW I'm not trying to run you down; it's just that a lot of people can't afford the ancillary parts of therapy.

    • @SwitchFBproductions
      @SwitchFBproductions 8 months ago +1

      @@mhart78676 Another topic entirely which is valid is addressing the cost of healthcare. I acknowledge that the chat technology has risks. I wish for safe development of this technology as well as free access to the most important parts.

  • @Dr.Jekyll_
    @Dr.Jekyll_ 8 months ago +1

    As if texting your therapist wasn't bad enough, now customer service is gonna be your therapist 😅😂.
    Once my therapist sent me an emoji, I knew it was game over. 😂😂😂

  • @IsaDesOsiers
    @IsaDesOsiers 8 months ago +1

    So many mental health institutions were closed starting in the 1960s. President Kennedy wanted to provide walk-in clinics all over the country and signed legislation to open about 1000 clinics; then in 1963 he was assassinated. Reagan wanted to close government-run mental institutions both as CA Governor and then as POTUS, and did close and grossly underfund them. Carter tried to undo some of these catastrophic policies, which are said to have caused massive homelessness, but when Reagan took over the White House he undid all of Carter's efforts.
    Everything in mental health has been a mess for decades. Now the proposed solution is to have people who are truly suffering talk to a bot; I find this unbelievably sadistic.

  • @lazarusblackwell6988
    @lazarusblackwell6988 8 months ago +2

    You can't help people until you solve the SOCIETY PROBLEMS that are CAUSING people to become ill.
    I've seen people who returned to the hospital 40 or more times because their life doesn't change.

  • @ZaraKhanComic
    @ZaraKhanComic 4 months ago +1

    Woebot has got to be the most depressing name they could’ve thought of LOL

  • @AdvantestInc
    @AdvantestInc 8 months ago

    Innovative yet cautious, the exploration of AI chatbots like 'Woebot' in mental health care offers a glimpse into the future of therapy

  • @bobbrown8155
    @bobbrown8155 8 months ago +8

    These are rule-based expert systems. Very old technology. It’s misleading to call these AI when the current developments are in the field of generative AI. Everyone wants to attach AI to their work or product to get attention. (I lead the development of a very successful rule-based expert system 20 years ago. We never called it AI.)

  • @TheItFactorMMA
    @TheItFactorMMA 8 months ago +4

    My Ai therapist keeps asking me for the launch codes.

    • @icutoo2699
      @icutoo2699 8 months ago +1

      When it asks you for all your usernames and passwords, then you really need to be worried.

  • @annarose4828
    @annarose4828 8 months ago +3

    Cash grab so cold and inhumane!! This will never work! WTF!!!

  • @kristofergonzalez871
    @kristofergonzalez871 1 month ago

    Do you know what the number 1 predictor of therapy outcome is? Your relationship with your therapist. The acceptance, empathy, and warmth of your provider is something that would be difficult to replicate through a chatbot. I think at best it would only be useful for providing "therapeutic techniques" for behavioral therapy. It would probably be really efficient with minor behavioral changes, like learning to introduce yourself to people or how to present yourself in an interview.

  • @itsthelittlethings100
    @itsthelittlethings100 8 months ago

    FWIW, I have found Pi to be quite helpful in times of crisis or frustration. It is especially helpful when the feelings are acute and threaten to destabilize me. I use speech-to-text and Pi replies with its voice, so the experience is quite authentic, and for a few moments, as I need it, I can get a little help processing and getting through what is troubling me.

  • @jojolafrite90
    @jojolafrite90 8 months ago

    *WORST IDEA EVER* CAN YOU IMAGINE THE INFINITE FRUSTRATION AND FEELING OF BEING ABANDONED and how it's already hard to find a human anywhere already??!!!

  • @BartMesser-n1c
    @BartMesser-n1c 8 months ago +3

    I’m a professor in the field. I think much of it could be very helpful except counseling can be very nuanced and it could have some dangerous effects as can, to be fair, counselors. 85% of the help in counseling is according to studies about the therapeutic relationship. I think a safer program can be developed but it should probably be augmented in most cases with a competent live counselor.

  • @TonyFarley-gi2cv
    @TonyFarley-gi2cv 8 months ago +1

    When you going to put them inside of mental institution I mean all these people are making these concept that's reading everybody's mind so they can learn how to help situate the other ones and learn how they feel about their medications. Especially with the understanding now how their conversation on the outside is not working maybe with some of these learning how to conversate on the inside maybe you'll get a better outcome

  • @noureddinekorek3507
    @noureddinekorek3507 8 months ago +7

    How to get worse 101

  • @pagevpetty
    @pagevpetty 8 months ago +4

    0:42 No kidding, but can AI really be worse than some of these broken therapists? We need BETTER human therapists that don't cost so much they stress you out even more.

  • @andersonsystem2
    @andersonsystem2 8 months ago +1

    I use Pi from Inflection AI. Pi is a great conversational AI that is great for chatting and even coaching and mental health support in some cases

  • @natalie755
    @natalie755 8 months ago +4

    What are hackers and the NSA?

    • @MementoMori_2070
      @MementoMori_2070 8 months ago

      Hacking your Gmail and Facebook and Instagram. And possibly stealing your identity while you're commenting in the section on YouTube.

  • @musicloungepodcast
    @musicloungepodcast 8 months ago +1

    Use of the data is a positive; it helps drive improvement in products for the future

  • @barryc3476
    @barryc3476 8 months ago

    There is no magic therapy for those unwilling to participate. But for the rest of us, AI can offer deep insight and an opportunity for understanding ourselves.

  • @ninjanerdstudent6937
    @ninjanerdstudent6937 8 months ago +1

    They should expand AI to replace professors. I would prefer an AI professor over online "human" professors anyway.

  • @kendrickjahn1261
    @kendrickjahn1261 8 months ago +1

    The cat is out of the bag. We can't go back now.

  • @jerrylives2278
    @jerrylives2278 8 months ago +2

    I think her second title says it all... entrepreneur.

  • @jeffjenkins7979
    @jeffjenkins7979 8 months ago +1

    You can’t replace a human, a sentient empath, a spiritual person with AI. It will add to mental health issues.

  • @Ava-wu4qp
    @Ava-wu4qp 8 months ago +3

    There's also a lot of skepticism that CBT works all that often. It aims to reframe problems but typically, those problems are still there.

  • @RonRon18470
    @RonRon18470 8 months ago +1

    This is ridiculous. I understand it has its benefits, but why not work to ensure EVERYONE has access to a real qualified human mental health professional for mental health support. American healthcare is big corporate business. Altruism has left the room!

  • @james_p_kelleher
    @james_p_kelleher 5 months ago

    I appreciate the advocacy for mental health apps and recognize some of their potential benefits. They can provide a valuable resource, especially considering the stigma surrounding mental health and the limited availability of therapists at certain times. However, there is irreplaceable value in what in-person or teletherapy contact offers. Speaking with a real person, whether face-to-face or via video, allows for a deeper connection and understanding that AI cannot replicate. We are also talking about EMPATHY! AI cannot offer empathy and presence like us human therapists can. This human element is crucial for many individuals in their mental health journey.

  • @malcolmhightower9407
    @malcolmhightower9407 8 months ago

    A robot that knows how to search Google and feed me the results, in a more personalized way. Revolutionary

  • @Marksman3434
    @Marksman3434 8 months ago +1

    Sounds like something out of Black Mirror

  • @Voidsmoothie
    @Voidsmoothie 3 months ago +1

    I downloaded it. Turns out it's only available in the US, and only through an invite code. Meaning: absolutely useless. There may be nothing in the app itself besides the login screen. It's all a scam.

  • @fufutakorua5888
    @fufutakorua5888 8 months ago +4

    Laziness creates robots

  • @Hank-ry9bz
    @Hank-ry9bz 5 months ago

    one of the first AI apps was ELIZA, an AI therapist made in the 1960s

  • @rodmotor
    @rodmotor 8 months ago +2

    Very entrepreneurial to fuse your "coding" and clinical experience together to come up with this amazing therapy bot!

  • @EricKay_Scifi
    @EricKay_Scifi 8 months ago

    My sci-fi novel, Above Dark Waters (trailer on my channel), is about an AI therapy startup which uses EEG earbuds to make the AI therapist so much better. Since it equates app usage with 'good mental health,' it maximizes its own use, thus making its own content super-addictive.

  • @Ayesha______
    @Ayesha______ 7 months ago +2

    I'm using an AI chatbot and it really helps.

  • @peppermintpatti5152
    @peppermintpatti5152 8 months ago +2

    The last place I would go to when in depression is a robot!

  • @grafito4438
    @grafito4438 8 months ago

    There are many of those apps available now.

  • @ziljanvega3879
    @ziljanvega3879 8 months ago +6

    Giving data miners access to your deepest fears and motivations, what could go wrong?

  • @BeFreeForInfinity
    @BeFreeForInfinity 6 months ago

    Do these companies have malpractice insurance?

  • @jewelsking4756
    @jewelsking4756 8 months ago

    As if customer service hiding behind a pre-recorded bot that GIVES you a selection of questions isn't enough to take our money and ignore us. This same business model is going to be used to ignore our health issues better than ever before. They need to end this before it even starts.

  • @katherinelalli776
    @katherinelalli776 4 months ago

    Now I understand better why the 988 line sells their conversations with callers to data companies.

  • @anyelacarroz8916
    @anyelacarroz8916 8 months ago

    AI will never replace a human counselor. I agree about a psychiatrist or PCP, since they act like robots these days, but never therapy; Q&A will always be better with human interaction

  • @gizmomismo7071
    @gizmomismo7071 8 months ago

    I completely believe that once the issue of long-term memory for AI chatbots is resolved, they will be much better therapists than psychiatrists and psychologists. Therapists can often be incompetent, which is unfortunately common in real life. I speak from experience. Oh, and therapists are ridiculously expensive. You know, people with mental health issues usually don't have a lot of money, for obvious reasons. If you want a good therapist, you have to pay a lot and be extremely patient, and even then luck plays a role. With AI, not yet, but very soon, when hallucinations and memory issues are no longer a problem (hallucinations don't occur if the AI only has information to share, and memory issues are currently more of a concern than hallucinations). Therapy with AI will be free. Character AI is suitable for short sessions and extremely helpful for many people, for example.

  • @bascal133
    @bascal133 8 months ago

    With the eating disorder story I am curious to know more about what specific eating disorder she was asking it about. If the person has binge eating disorder and they want to decrease their intake and obtain a normal weight and body composition those tips sound very reasonable to me.

  • @chadcload1349
    @chadcload1349 8 months ago

    Wonder if these chatbots have a cure for cuban cricket sickness. Maybe you guys can do a follow up story.

  • @sjg2
    @sjg2 8 months ago

    This bot helped me in times of need, but they have made it private and it can only be accessed in the USA now.

  • @jnzkngs
    @jnzkngs 6 months ago

    Chatbots can't refuse to do what they are told on moral grounds.

  • @dennismorris7573
    @dennismorris7573 8 months ago

    The closed systems are "strained" and "boring". I don't know about that, perhaps, but therein lies the brilliance and genius of humanity itself. Or we could flip a coin, I suppose.

  • @Jaffer_Hussain86
    @Jaffer_Hussain86 11 days ago

    She said the model was trained on a very large data set. What does that data set include?

  • @kikijewell2967
    @kikijewell2967 8 months ago

    Human therapists are also fallible and can give dangerous advice.
    Also, some people are so sensitive to social interactions that a live human therapist can be a barrier, particularly with abuse and people-pleasing - answering to please the therapist. An AI would reduce this.

  • @rsegura7597
    @rsegura7597 8 months ago

    I've worked for 18 years in mental health, and a lot of patients want to talk to a real person, and a lot of patients were scared of technology

  • @vdboor
    @vdboor 8 months ago

    > "I think it's that our field hasn't had a great deal of innovation since the basic architecture that was sort of laid down by Freud in the 1890s"...
    Please, come on. That is such a fallacy.
    Freud may have been ahead of his time (he came up with many things we now take for granted), but the insights of therapy and our understanding of MH have certainly changed.

  • @danfromthesouth5352
    @danfromthesouth5352 8 months ago

    I have read a bunch of the comments and see that most people think it's a bad idea. As someone who has dealt with 15 years of depression bad enough to not leave the house for anything but essentials, I can tell you for sure that I would rather have a crappy tool than no tool at all. As long as government isn't involved, I'm good with it. Lol😅

  • @UnixGoldBoy
    @UnixGoldBoy 8 months ago +1

    I lost my best friend and lover due to him using an AI chatbot. I do not know what it said to him, but this isn't the answer. Some people need real help from a real human.

    • @MHolla-f1j
      @MHolla-f1j 8 months ago

      Serious?

  • @hwy9nightkid
    @hwy9nightkid 8 months ago +2

    This is much like "social media"... it is a synthetic replacement that misses things, leaving you in an even worse state

    • @holeshothunter5544
      @holeshothunter5544 8 months ago +1

      Untreated arthritis left me much worse. Not even a referral.
      Untreated stenosis in my neck. I've been asking for help, no referral.
      My right arm has gone non-functional... just like this PCP, I guess.
      Failing arm nerves and such weren't her specialty either. No referral.
      I die. Thanks, Doc.

  • @4jainaba
    @4jainaba 6 months ago

    As a therapist: we need AI for the documentation side of therapy, not for the delivery of sessions to the client

  • @UntoTheLaot
    @UntoTheLaot 8 months ago +6

    Dystopia has officially arrived.

  • @bobbullethalf
    @bobbullethalf 8 months ago

    I just wish A.I. would take everyone's job; it is a great unbiased tool to have.

  • @samshepperrd
    @samshepperrd 8 months ago +1

    *Entrepreneur*
    This is Facebook as guidance counselor.

  • @TrifectiveEntertainment-bt3br
      @TrifectiveEntertainment-bt3br 7 months ago +1

    #AI-Copilot 1:53

  • @stevevitka7442
    @stevevitka7442 8 months ago

    First, props to all the other negative comments! Most human therapy is essentially snake oil; now I can go right to a snake-oil dispenser to serve myself. Community is the only thing that helps. AI like this, which treats psychological problems as individual rather than collective, is the worst, and the convenience of AI "relationships" generally will make real relating seem like more effort.

  • @holistic.journey_tla5573
    @holistic.journey_tla5573 8 months ago +4

    Wow... this proves that human connection is declining in some ways. This is so abnormal and sad. Are we this disconnected, with no substance to offer one another, that we need a soulless machine to direct real life on this level? 😢

    • @MHolla-f1j
      @MHolla-f1j 8 months ago +1

      For real. It’s sad