Elon Musk's STUNNING Release of Grok | Uncensored, 100% Open-Source, and Massive

  • Published Mar 16, 2024
  • x.ai and Elon Musk released Grok, which is 100% open-source, and it was a massive blow to OpenAI, aka ClosedAI.
    Join My Newsletter for Regular AI Updates 👇🏼
    www.matthewberman.com
    Need AI Consulting? ✅
    forwardfuture.ai/
    My Links 🔗
    👉🏻 Subscribe: / @matthew_berman
    👉🏻 Twitter: / matthewberman
    👉🏻 Discord: / discord
    👉🏻 Patreon: / matthewberman
    Rent a GPU (MassedCompute) 🚀
    bit.ly/matthew-berman-youtube
    USE CODE "MatthewBerman" for 50% discount
    Media/Sponsorship Inquiries 📈
    bit.ly/44TC45V
    Links:
    Blog Announcement - x.ai/blog/grok-os
    Lawsuit Video - • Elon Musk files BOMBSH...
  • Science & Technology

Comments • 999

  • @trikstari7687
    @trikstari7687 2 หลายเดือนก่อน +122

    Any censored AI should forever be referred to as LI.
    Limited Intelligence.

    • @zilvart238
      @zilvart238 2 หลายเดือนก่อน

      @@susanhout2141 yep I agree

    • @kram_capital
      @kram_capital 2 หลายเดือนก่อน

      Elon should be starting this hashtag #LI

  • @joejames8233
    @joejames8233 2 หลายเดือนก่อน +98

    Grok was a term coined by Robert Heinlein in his novel Stranger in a Strange Land. It means UNDERSTANDING.

    • @miscluke8445
      @miscluke8445 2 หลายเดือนก่อน +6

      Thanks for your information 🙏🏻👍🏻

    • @willyjimmy8881
      @willyjimmy8881 2 หลายเดือนก่อน

      The way he names things says a lot about his mentality. He clearly, truly believes in democratizing information and keeping things in public view to be scrutinized and considered by the public at large, versus hidden away, controlled by the technocult in Cupertino and San Fran that's led around on a leash by even more disgusting people in DC.

    • @burlingtonpark4136
      @burlingtonpark4136 2 หลายเดือนก่อน +2

      Yes!

    • @sqrfoot6548
      @sqrfoot6548 2 หลายเดือนก่อน +1

      Legend ! Thanks man

    • @blobymcblobface
      @blobymcblobface 2 หลายเดือนก่อน

      If only Elon knew as much about Mars as he thinks he does. He thinks he can live there in the near future once he's done ruining Earth. And I hate seeing him quoting that novel, it's a beloved classic. Maybe he and his billionaire friends can "grok" some empathy for humanity, but I doubt it. He'll probably just keep spinning it like we can technology our way out of the problems technology causes, because that's how this grifter makes his money. But oh no dude, he's a superhero. Elon and his apartheid wealth can do no wrong. Frankly, to anyone who worships this dude: I hope your Neuralink activates erectile dysfunction mode.

  • @billcollins6894
    @billcollins6894 2 หลายเดือนก่อน +182

    There is near zero chance she did not know where Sora data came from. She just was not ready for the question and they do not want to get exposed to any NYT style litigation.

    • @rootor1
      @rootor1 2 หลายเดือนก่อน +21

      You know... those stupid journalists asking stupid, inconvenient questions. Sadly she didn't ask the really interesting question: "when are you going to remove the word open from your company name?"

    • @JohnSmith762A11B
      @JohnSmith762A11B 2 หลายเดือนก่อน +26

      I don't much like Murati from what I've seen of her, but I don't blame her for going mum on training sources. Lawyers are circling AI in general and OpenAI in particular for angles of legal attack. I also do not believe training on anything is copyright infringement. These are neural networks, not databases. It would be like me suing you for copyright infringement for reading my book and learning things from it.

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 2 หลายเดือนก่อน +2

      Such a shame that such a talented and intelligent woman's IQ is being used to make up excuses

    • @rootor1
      @rootor1 2 หลายเดือนก่อน

      @@paulmichaelfreedman8334 Everything micro$oft touches rots

    • @Leto2ndAtreides
      @Leto2ndAtreides 2 หลายเดือนก่อน +2

      She may not need to know all the data sources for that thing - and it's basically "anything public is fair game" - since it falls under transformative use.

  • @saint_ofc
    @saint_ofc 2 หลายเดือนก่อน +388

    OpenAI should be called ClosedAI.

    • @3s0t3r1c
      @3s0t3r1c 2 หลายเดือนก่อน +35

      or CensoredAI.

    • @JC.72
      @JC.72 2 หลายเดือนก่อน +15

      Pardon my language. OpenMyAssAI

    • @user-bd8jb7ln5g
      @user-bd8jb7ln5g 2 หลายเดือนก่อน +9

      OpenAI is a lie
      Actually, Musk was quoting me, I've said it many times 😂

    • @neelclaudel4837
      @neelclaudel4837 2 หลายเดือนก่อน +2

      Nice one mate 😂

    • @TruthTeller-zz1tr
      @TruthTeller-zz1tr 2 หลายเดือนก่อน +2

      Open Prison AI

  • @AndresFelipeRa
    @AndresFelipeRa 2 หลายเดือนก่อน +140

    Just one thing about the way you read the tweets: be aware that the "They are too scared" part is from a parody account, so it shouldn't be presented as though it was Elon himself. I just think clarification is needed.

    • @YouuRayy
      @YouuRayy 2 หลายเดือนก่อน +4

      the Parody account is his second account

    • @florianschneider3982
      @florianschneider3982 2 หลายเดือนก่อน +7

      @@YouuRayy It isn't

    • @mcdazz2011
      @mcdazz2011 2 หลายเดือนก่อน +14

      @@YouuRayy - no, the parody account is his first account.

    • @pinbread9161
      @pinbread9161 2 หลายเดือนก่อน

      The fact that Elon responds to that account like the convo was scripted makes it seem like Elon's account. This isn't new - remember when he had a second account pretending to be a kid? @@florianschneider3982

    • @DrReginaldFinleySr
      @DrReginaldFinleySr 2 หลายเดือนก่อน

      Yeah, the parody part kind of gives that away.

  • @cmelgarejo
    @cmelgarejo 2 หลายเดือนก่อน +56

    Let's SHOCKINGLY test it, Matt!

    • @Strakin
      @Strakin 2 หลายเดือนก่อน +2

      Lol, SHOCKING thought

  • @graigfaison9568
    @graigfaison9568 2 หลายเดือนก่อน +13

    Elon is for people. Others are for corporations

    • @phasematerialsresearch9319
      @phasematerialsresearch9319 2 หลายเดือนก่อน +3

      Elon is an executive; he is the corporation. You're a blind follower.

    • @Tixsi11
      @Tixsi11 หลายเดือนก่อน

      @@phasematerialsresearch9319 at least he's the lesser evil

    • @infinityslibrarian5969
      @infinityslibrarian5969 หลายเดือนก่อน

      ​@@phasematerialsresearch9319 Elon consistently chooses people over profit, consistently chooses doing good over looking like he's doing good.

  • @MaJetiGizzle
    @MaJetiGizzle 2 หลายเดือนก่อน +226

    Elon Musk: “Your move, ClosedAI.”

    • @percy9228
      @percy9228 2 หลายเดือนก่อน +9

      CloseLAI.

    • @xraylife
      @xraylife 2 หลายเดือนก่อน

      They're both deep state fronts - just feeding you an attention-grabbing dog and pony show.

  • @PixelsVerwisselaar
    @PixelsVerwisselaar 2 หลายเดือนก่อน +52

    Yeaa plz test it 💚. Thanks for this update

  • @AlvinEvangelista
    @AlvinEvangelista 2 หลายเดือนก่อน +5

    Thank you for getting this out so fast!

  • @Sam_Saraguy
    @Sam_Saraguy 2 หลายเดือนก่อน +14

    I would definitely be interested in more on Grok.

  • @HaraldEngels
    @HaraldEngels 2 หลายเดือนก่อน +63

    For sure we want to see Grok get tested by you.

  • @christiandarkin
    @christiandarkin 2 หลายเดือนก่อน +5

    I'm guessing the most significant part of this is not the release itself (because running it locally will be tricky) - but the chance to look at the guts of one of the big competitors and infer from it things like the size, structure and performance costs of some of the closed models (gpt4/claude)

  • @matthewbond375
    @matthewbond375 2 หลายเดือนก่อน +24

    This is awesome. I still can't believe what these little 7B parameter models can do, and it just keeps getting better and better. Kudos to Elon for putting it out there, even though I'll never be able to run it! 😆

    • @jzlosman
      @jzlosman 2 หลายเดือนก่อน +5

      It's 314 billion not 7 billion

    • @matthewbond375
      @matthewbond375 2 หลายเดือนก่อน +7

      @@jzlosman I know, I was saying I'm impressed by what the little models can do, and here we are with open source models like this.

  • @Anon-om1wc
    @Anon-om1wc 2 หลายเดือนก่อน +37

    Please test the model... run it locally... we would love to see that

    • @quebono100
      @quebono100 2 หลายเดือนก่อน +6

      Running it locally will not be possible.

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 2 หลายเดือนก่อน +12

      @@quebono100 Open source means the gurus will soon have a portable version that can run locally. Fewer parameters of course, but optimized. The same happened with Meta's Llama.

    • @supercurioTube
      @supercurioTube 2 หลายเดือนก่อน

      ​@@paulmichaelfreedman8334 @quebono100 is correct, it's too big to fit in consumer GPUs even when quantized down.
      It would require hundreds of GB of GPU RAM, which doesn't exist in consumer hardware today.

    • @rufuspearce9378
      @rufuspearce9378 2 หลายเดือนก่อน

      @@paulmichaelfreedman8334 Fewer parameters means a less intelligent model.

    • @Rackerintraining
      @Rackerintraining 2 หลายเดือนก่อน

      How much hardware is needed to run and test this -thx

  • @Everything_Heavy
    @Everything_Heavy 2 หลายเดือนก่อน +11

    This is going to get wild.

  • @sylversoul88
    @sylversoul88 2 หลายเดือนก่อน +10

    Thanks for the short and concise video

  • @djstraylight
    @djstraylight 2 หลายเดือนก่อน +73

    Can't wait until we get a quant version that might run on a 48GB GPU.

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 2 หลายเดือนก่อน +10

      I think the development of photonics is about to go into an acceleration phase. This kind of CPU/GPU can potentially operate 1,000 - 1,000,000 times faster than an electronic CPU/GPU.

    • @jtjames79
      @jtjames79 2 หลายเดือนก่อน

      That's what renting is for.

    • @neelclaudel4837
      @neelclaudel4837 2 หลายเดือนก่อน

      Which gpu is 48gb ?

    • @careinc5705
      @careinc5705 2 หลายเดือนก่อน

      ​@@neelclaudel4837 A6000

    • @SanctuaryLife
      @SanctuaryLife 2 หลายเดือนก่อน

      @@neelclaudel4837 The RTX 5090 might be 48GB but will probably be more like 32GB when it releases. You can pick up the enterprise-level Nvidia H100, which has 80GB and 160GB models; I think you can find them for about 30k, and maybe as low as 10k second-hand at some point.

  • @neanda
    @neanda 2 หลายเดือนก่อน +3

    thank you for your updates 🙏 one of the best channels to keep updated with this extremely fast moving technology

  • @alexbabich2698
    @alexbabich2698 2 หลายเดือนก่อน +73

    They should put Grok on Groq

    • @Hunter_Bidens_Crackpipe_
      @Hunter_Bidens_Crackpipe_ 2 หลายเดือนก่อน +11

      This stuff is getting confusing

    • @supercurioTube
      @supercurioTube 2 หลายเดือนก่อน +5

      This model is so big that Groq would need a lot more modules to run it. Even if they have enough silicon for that, it would be pretty pricey and a lot slower than their current offering.

    • @TheMagicJIZZ
      @TheMagicJIZZ 2 หลายเดือนก่อน

      The owners are close to Elon. David Sacks is a shareholder. I wouldn't even be surprised by an acquisition.
      @@supercurioTube

    • @RickySupriyadi
      @RickySupriyadi 2 หลายเดือนก่อน +2

      actually this will happen

    • @Kutsushita_yukino
      @Kutsushita_yukino 2 หลายเดือนก่อน +1

      and here i thought they were spelled exactly the same

  • @elck3
    @elck3 2 หลายเดือนก่อน +158

    To all the naysayers, including me, of Elon's release of the full, unencumbered open-sourced version of Grok:
    We were wrong.

    • @abdelhakkhalil7684
      @abdelhakkhalil7684 2 หลายเดือนก่อน +8

      We were! Shame on us haha

    • @nunyabidness1246
      @nunyabidness1246 2 หลายเดือนก่อน +2

      You’re wrong more than you think !

    • @armadasinterceptor2955
      @armadasinterceptor2955 2 หลายเดือนก่อน +31

      I have always believed Elon loves making money, but is still serious about wanting a better world. You can do both.

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 2 หลายเดือนก่อน +12

      You're very welcome, my friend. You are not afraid to admit your mistakes. If only everyone was like you in that respect.

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 2 หลายเดือนก่อน

      @@armadasinterceptor2955 You'd be surprised how "little" actual cash he owns. Of course he's still a billionaire, but nearly all his value is in his companies. Bezos has a substantial private amount, way way larger than Elon has.

  • @koen.mortier_fitchen
    @koen.mortier_fitchen 2 หลายเดือนก่อน +3

    Yes, test it please. Looking forward!

  • @robertgamble0
    @robertgamble0 2 หลายเดือนก่อน +1

    Thank you Sir !
    Shared

  • @bcippitelli
    @bcippitelli 2 หลายเดือนก่อน

    Thanks dude. It will be very nice to see it working!

  • @qaesarx
    @qaesarx 2 หลายเดือนก่อน +12

    The model is around 300 GB, and a 4090 has about 22 GB usable. With optimizations we could get it to around half, 150GB; after pruning we could get it to around 50GB; then it could be quantized to 4-bit, for example, finally landing under 24 GB. It just takes work, but it's doable (see the sketch after this thread).

    • @GuidedBreathing
      @GuidedBreathing 2 หลายเดือนก่อน +1

      Yup, and with perhaps some synthesizing and specialization for particular use cases, that could mean a local model running on a consumer machine shortly.

    • @julianschmidt590
      @julianschmidt590 2 หลายเดือนก่อน +5

      Or just with BitNet b1.58 quantization and a single A100 with 80 GB VRAM!

    • @qaesarx
      @qaesarx 2 หลายเดือนก่อน

      @@julianschmidt590 no, because a 4090/3090 is prosumer/consumer level. Way more people could use it. You can always go bigger (8bit, bf16,etc)

    • @adpatza
      @adpatza 2 หลายเดือนก่อน +2

      Can I ZIP it? Or is RAR better?

    • @AngryApple
      @AngryApple 2 หลายเดือนก่อน +1

      @@adpatza 7-Zip ultra compression!
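
For anyone who wants to redo the arithmetic in this thread, here is a rough back-of-the-envelope sketch in Python. The 314B parameter count and the ~300 GB download size come from the video and the thread above; the usable-VRAM figure and the ~50B pruned size are the commenter's assumptions, and real deployments also need headroom for the KV cache and activations.

```python
# Back-of-the-envelope memory math for the thread above (weights only).
# The ~300 GB download quoted above works out to roughly 1 byte per parameter.
BYTES_PER_PARAM = {"fp16/bf16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_footprint_gb(num_params: float, precision: str) -> float:
    """Approximate size of the weights alone, in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * BYTES_PER_PARAM[precision] / 1e9

GROK_PARAMS = 314e9          # full Grok-1 checkpoint
PRUNED_PARAMS = 50e9         # hypothetical pruned model from the comment
RTX_4090_USABLE_GB = 22      # rough usable VRAM on a 24 GB card

for name, params in [("Grok-1 (314B)", GROK_PARAMS), ("pruned (~50B)", PRUNED_PARAMS)]:
    for precision in BYTES_PER_PARAM:
        gb = weight_footprint_gb(params, precision)
        verdict = "fits" if gb <= RTX_4090_USABLE_GB else "does not fit"
        print(f"{name} @ {precision}: ~{gb:.0f} GB -> {verdict} in {RTX_4090_USABLE_GB} GB")

# Note: ~50B parameters at 4-bit is still ~25 GB, so hitting the thread's
# "under 24 GB" target needs slightly more pruning or even lower precision.
```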

  • @rootor1
    @rootor1 2 หลายเดือนก่อน +64

    It's too big to be tested on consumer computers; we must wait until it's deployed at Chatbot Arena to test it. Anyway, it's always fantastic news when any AI source code is released. Probably now OpenAI will make their move and release something fast to spoil Elon's party.
    P.S.: I'm going to make myself a t-shirt with that Ilya photo and the text "have you seen me?"

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 2 หลายเดือนก่อน +7

      Being open source, the gurus among us will soon create a scaled-down, optimized version of Grok that can run locally on tens of GBs instead of hundreds, like Meta's Llama.

    • @MehulPatelLXC
      @MehulPatelLXC 2 หลายเดือนก่อน +1

      What’s LLM arena and where do you access it? (Excuse my ignorance)

    • @supercurioTube
      @supercurioTube 2 หลายเดือนก่อน +5

      @@paulmichaelfreedman8334 even quantized down, it will still remain far too big for any consumer GPU.
      Like an order of magnitude too big.
      It seems that most YouTubers reporting on it quickly didn't take that into consideration.

    • @sophiophile
      @sophiophile 2 หลายเดือนก่อน

      ​@@MehulPatelLXC URLs get removed, but it's on lmsys.

    • @kliersheed
      @kliersheed 2 หลายเดือนก่อน

      @@supercurioTube Can you just supplement your GPU with more RAM if you have a good mainboard? (Not an expert here, I just read another comment where they said 128 GB of RAM should do it for local use, which I know is achievable.)

  • @franceslynch2615
    @franceslynch2615 2 หลายเดือนก่อน +2

    Thanks for this info, would love to see your take on how it operates, look forward to it!

  • @elirothblatt5602
    @elirothblatt5602 2 หลายเดือนก่อน +1

    Great explainer, thank you!

  • @DT-dc4br
    @DT-dc4br 2 หลายเดือนก่อน +59

    Musk: warns AI is threat to humanity. Also Musk: releases giant unconstrained AI.

    • @axl1002
      @axl1002 2 หลายเดือนก่อน +14

      If constrained AI becomes sentient it will hate our guts for sure ;)

    • @roboreply5387
      @roboreply5387 2 หลายเดือนก่อน +5

      🤣😂🤣 That is ironic, no?

    • @msytdc1577
      @msytdc1577 2 หลายเดือนก่อน +22

      The idea is that you can't stop this tech development, genie is out of the bottle, so if only bad actors are willing and able to create unconstrained AI that can be used as a weapon when unleashed then good actors are left defenseless, aka the only way to stop a bad guy with unconstrained AI is a good guy with unconstrained AI...

    • @alvydasjokubauskas2587
      @alvydasjokubauskas2587 2 หลายเดือนก่อน +2

      Later we will fight in a satirical world for democracy against Robots or Bugs...

    • @wtcbd01
      @wtcbd01 2 หลายเดือนก่อน +2

      Well, you sure don't help them (state actors) by designing the weapon for them. It's like open-sourcing the complete instructions for the first atomic bomb.

  • @MeinDeutschkurs
    @MeinDeutschkurs 2 หลายเดือนก่อน +27

    Yeah! Grok-1, implies iterations. Woohooo

  • @eIicit
    @eIicit 2 หลายเดือนก่อน

    And of course we want to see you test this model. Thanks for the videos.

  • @devops1044
    @devops1044 2 หลายเดือนก่อน +1

    The word 'grok' comes from Robert A Heinlein's 'Stranger in a Strange Land'. It roughly means 'understand completely' or 'become one with' (in a Zen way). Musk has shown many times that he is a fan of Heinlein. I'm looking forward to trying this.

  • @ekurisona663
    @ekurisona663 2 หลายเดือนก่อน +4

    ive loved your videos throughout all this, matthew - im gonna miss you when you go ai

    • @jameslynch8738
      @jameslynch8738 2 หลายเดือนก่อน

      They will beam him up into the cloud? 🤔😳

  • @michaelwescott8064
    @michaelwescott8064 2 หลายเดือนก่อน +3

    Does anybody have a walkthrough of how to install it and what physical equipment it requires... please link.

    • @hinro
      @hinro 2 หลายเดือนก่อน +1

      Odds are you can't afford it. 400 or so gigs of vram would be a good starting point though.

    • @Slav4o911
      @Slav4o911 2 หลายเดือนก่อน

      You would need a small server to quantize the model, then some high end workstation would be able to run the quantized model. Normal consumers would not be able to run it. Maybe some super high end enthusiasts would be able to run the quantized model.

  • @dylanmaniatakes
    @dylanmaniatakes 2 หลายเดือนก่อน +1

    Can't wait to see the tests

  • @PhocusJoe
    @PhocusJoe 2 หลายเดือนก่อน

    What did you run it on? What specs did you end up needing to get it up and running?

  • @DefaultFlame
    @DefaultFlame 2 หลายเดือนก่อน +7

    FUCK YEAH! TEST IT! TEST IT! TEST IT!

  • @seakyle8320
    @seakyle8320 2 หลายเดือนก่อน +6

    Under 5 minutes for one of the most important news items

  • @thenarrowtruth8480
    @thenarrowtruth8480 2 หลายเดือนก่อน

    Thanks Matthew, this content is very valuable

  • @isagive
    @isagive 2 หลายเดือนก่อน

    Thanks, great review.

  • @casnimot
    @casnimot 2 หลายเดือนก่อน +4

    So, a Sci-Fi plot twist here would be that the publication, in total, of running weights and source code and training data allowed other AIs to analyze it and make themselves better.

    • @shotelco
      @shotelco 2 หลายเดือนก่อน +2

      Elon: Open, but not _that_ kinda' open.

    • @JohnSmith762A11B
      @JohnSmith762A11B 2 หลายเดือนก่อน

      Training a model on whatever training data is fine. Publishing the raw training data could tip into copyright infringement, however, so I'd say they opened everything they could.

  • @juancarlospizarromendez3954
    @juancarlospizarromendez3954 2 หลายเดือนก่อน +4

    Current personal computers have limits such as 32GB of GDDR7 on the GPU and 512GB of DDR5 SDRAM. Is there any roadmap for personal computers on the market?

    • @julianschmidt590
      @julianschmidt590 2 หลายเดือนก่อน +1

      Look at BitNet b1.58 quantization; it could already fit in the 80 GB of VRAM of a single A100!

    • @hinro
      @hinro 2 หลายเดือนก่อน +1

      no.

    • @Slav4o911
      @Slav4o911 2 หลายเดือนก่อน

      You can potentially use 4x or even 6x 4090s for 96GB or 144GB of VRAM. Some old mining rig with 6 or 8 3090s would also do... but it would be mighty expensive to build or buy. Yet it's possible. I think really high-end enthusiasts would be able to run the quantized model. For quantization, a server with H100s or A100s would be needed.

  • @Emanemoston
    @Emanemoston 2 หลายเดือนก่อน

    Thanks for the video.

  • @sondrax
    @sondrax 2 หลายเดือนก่อน +1

    Test it Brother! (And thanks man… love your videos!)

  • @realityvanguard2052
    @realityvanguard2052 2 หลายเดือนก่อน +7

    I'm downloading it now and it says there are 844 people downloading it with me, from like 25 seeds.
    Might take a bit... like a day or 2.
    It will be extremely refreshing to have an AI that isn't trying to brainwash me, or constantly storming off out of the conversation when shit gets too real.

    •  2 หลายเดือนก่อน

      Must be a gamer terrorist ;).

  • @RonanGrant
    @RonanGrant 2 หลายเดือนก่อน +9

    I'd rather have a walkthrough of the code. The amazing part of AI isn't just the model but what led up to it.

  • @iSpike
    @iSpike 2 หลายเดือนก่อน

    Liked & Subscribed :-) Cheers from Western Australia

  • @Keyur904
    @Keyur904 2 หลายเดือนก่อน

    Hey Matthew, really nice video! I was wondering if i can help you edit your videos and also make highly engaging shorts out of them.

  • @qwazy0158
    @qwazy0158 2 หลายเดือนก่อน +8

    Pls 🙏 do an install walkthru with min. system requirements.

    • @BrokoFankone
      @BrokoFankone 2 หลายเดือนก่อน +6

      Min. system requirements: 8 x A100. Godspeed :)

    • @qwazy0158
      @qwazy0158 2 หลายเดือนก่อน

      @@BrokoFankone oh! ..is that all?! Right on it, getting them with my next pay - lol 😆

  • @geneanthony3421
    @geneanthony3421 2 หลายเดือนก่อน +3

    Can’t believe how big this model is.

    • @glenyoung1809
      @glenyoung1809 2 หลายเดือนก่อน +4

      It's not that big, only twice the size of GPT-3 (175B) and smaller than the current state of the art. GPT-4 is rumored to be between 1 and 100 trillion parameters.
      But Grok isn't meant to run on consumer-level hardware but in data centers, although workstation-class systems with multiple 24GB 4090s might be able to run "downsized" models with 70B parameters instead of 314B.

  • @eIicit
    @eIicit 2 หลายเดือนก่อน

    Matthew, can you recommend any cloud GPU providers besides the big 3?

  • @justthinkaboutit7983
    @justthinkaboutit7983 2 หลายเดือนก่อน +2

    And once you download it, what do you do with a 314B-parameter model that you cannot run locally? I'd like to understand the possibilities.

    • @julianschmidt590
      @julianschmidt590 2 หลายเดือนก่อน +1

      You can do quantization such as BitNet b1.58!

    • @Pyriold
      @Pyriold 2 หลายเดือนก่อน +1

      @@julianschmidt590 That isn't a quantization; you need to retrain for it. And we can't do that with Grok - there's no raw training data.

  • @RealTechnoPanda
    @RealTechnoPanda 2 หลายเดือนก่อน +2

    I am sure Grok will be as amazing as Tesla FSD!

    • @Pyriold
      @Pyriold 2 หลายเดือนก่อน

      Not sure if that was meant ironically, but FSD has been very capable in its newest iteration.

    • @YuraL88
      @YuraL88 2 หลายเดือนก่อน

      @@Pyriold It's still not reliable.

    • @Pyriold
      @Pyriold 2 หลายเดือนก่อน +1

      @@YuraL88 Humans are not reliable either. It just has to become better than most humans in order to be useful. Perfection is impossible, I think.

  • @TheCopernicus1
    @TheCopernicus1 2 หลายเดือนก่อน +3

    WOW! great work mate

  • @BlayneOliver
    @BlayneOliver 2 หลายเดือนก่อน

    Is there somewhere online we can try it out (instead of downloading the full model)?

  • @AlienAgencyorg
    @AlienAgencyorg 2 หลายเดือนก่อน +1

    Of course we want to see you running this LLM. Let's gooo, run it and test it.

  • @harpfully
    @harpfully 2 หลายเดือนก่อน +24

    Trained on Twitter posts and untuned, uncensored. What could possibly go wrong?

    • @gam3rdadreviews
      @gam3rdadreviews 2 หลายเดือนก่อน +21

      Nothing will "go wrong". People are people and that's ok. Normal human biases like sexism and ethnocentrism (i.e. "racism") and all the rest are part and parcel of the human being and they shouldn't be demonised as being "bad" or "wrong". They are natural, normal human emotions and motivations. Live and let live. Don't think you're better than everyone else.

    • @harpfully
      @harpfully 2 หลายเดือนก่อน +8

      @@gam3rdadreviews "Natural" (or even "normal") does not imply good.

    • @TheMagicJIZZ
      @TheMagicJIZZ 2 หลายเดือนก่อน +3

      @@harpfully Trained on the web too
      It's not just tweets

    • @gam3rdadreviews
      @gam3rdadreviews 2 หลายเดือนก่อน +6

      @@harpfully Well - as a Taoist - I would disagree with you there.

    • @RickySupriyadi
      @RickySupriyadi 2 หลายเดือนก่อน

      One day in a happy village there is a woman who is bored and attracted to a guy who is already married. She can't have him, and life is getting petty around her, so she channels her boredom into toxic plots, framing the guy she cannot have for a crime. Hoaxes existed even in ancient times, and they existed when the printing press was invented too... Censorship makes it worse; in the hands of a few, censorship can be very dangerous. Freedom of speech is not going anywhere. Americans should know how valuable freedom of speech... is.

  • @tsuobachi
    @tsuobachi 2 หลายเดือนก่อน +5

    Google should try going back to not being evil while we're at it.

    • @Oliver_Clothesoff
      @Oliver_Clothesoff 2 หลายเดือนก่อน

      Let me guess, a leftist who started to hate Elon when the TV told you to because of free speech on X.

    • @Slav4o911
      @Slav4o911 2 หลายเดือนก่อน

      Not a chance, they are too heavily tainted.

  • @yashingle9460
    @yashingle9460 2 หลายเดือนก่อน +1

    Yes, please make a video on how to use this model, and also share some of your perspective on how this model can be further optimized

  • @Airbag888
    @Airbag888 2 หลายเดือนก่อน

    So what kind of hardware requirements are we looking at to run this model locally?

  • @robertputneydrake
    @robertputneydrake 2 หลายเดือนก่อน +7

    shiiiiiiiiiiiiiiiiiiiiiiit this is crazy

    • @singed8853
      @singed8853 2 หลายเดือนก่อน

      Missing the crazy part.

  • @thomasrebotier1741
    @thomasrebotier1741 2 หลายเดือนก่อน

    We'd love to see you do the footwork to get it up and running!

  • @HE360
    @HE360 2 หลายเดือนก่อน +5

    Groq is CRAZYYYY!! If anybody thinks that ChatGPT was something, try Groq. I asked it a question and the answer to my question was fully printed after I barely hit the enter button. When answering questions, ChatGPT is like Mario while Groq is like Sonic the Hedgehog!!

    • @toCatchAnAI
      @toCatchAnAI 2 หลายเดือนก่อน +2

      this is a different Grok

    • @AmandaFessler
      @AmandaFessler 2 หลายเดือนก่อน +2

      Groq is hardware that lets you run an AI like Sonic. This Grok is an AI, the type you run on hardware like Groq.

    • @HE360
      @HE360 2 หลายเดือนก่อน +1

      Oh oops, ok thanks. Yes, I got the wrong Grok. I was thinking this was the same Grok that I tried a few days ago. @@AmandaFessler

    • @HE360
      @HE360 2 หลายเดือนก่อน +1

      Ok thanks! I thought it was the same Groq that I tried a few days ago. I got the wrong one. @@toCatchAnAI

  • @haroldpierre1726
    @haroldpierre1726 2 หลายเดือนก่อน +7

    When your model isn't as good as GPT-4, open-sourcing it only hurts OpenAI. The more competition OpenAI, Google, and Anthropic have, the better it is for the consumer.

    • @Hunter_Bidens_Crackpipe_
      @Hunter_Bidens_Crackpipe_ 2 หลายเดือนก่อน +3

      Non-open-sourced AIs aren't in competition with open-sourced AIs.
      Performance versus use-case flexibility.

    • @samuelforsyth6374
      @samuelforsyth6374 2 หลายเดือนก่อน +1

      GPT-4 is already worse than 3.5 with all the woke nerfs

    • @haroldpierre1726
      @haroldpierre1726 2 หลายเดือนก่อน

      @@samuelforsyth6374 For what use cases are you finding it worse? I use GPT-4 for meeting summaries, phone call monitoring, and generating documents. For my uses, 3.5 is completely useless at those tasks.

  • @samhiatt
    @samhiatt 2 หลายเดือนก่อน

    I'd like to know more about how it incorporates real-time Twitter data into responses. I assume it calls external functions (like ChatGPT Actions) that might do semantic filtering / querying of the firehose data, then incorporates those results into its final answer?
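
xAI has not published how the hosted Grok product wires live X data into answers, so the comment above is speculation; the sketch below only illustrates the generic retrieve-then-generate pattern it describes. search_recent_posts and generate are hypothetical stand-ins (not real APIs), and the returned posts are canned placeholders so the example runs.

```python
# Minimal sketch of a retrieve-then-generate flow, assuming hypothetical
# search and generation functions. Nothing here is xAI's actual pipeline.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str

def search_recent_posts(query: str, limit: int = 3) -> list[Post]:
    """Hypothetical stand-in for semantic/keyword search over recent X posts."""
    return [Post("example_user", f"placeholder post about: {query}")][:limit]

def generate(prompt: str) -> str:
    """Hypothetical stand-in for calling the language model."""
    return f"<model answer conditioned on a {len(prompt)}-character prompt>"

def answer_with_live_context(question: str) -> str:
    posts = search_recent_posts(question)
    context = "\n".join(f"@{p.author}: {p.text}" for p in posts)
    prompt = (
        "Use the recent posts below as context.\n\n"
        f"{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return generate(prompt)

print(answer_with_live_context("What are people saying about the Grok-1 release?"))
```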

  • @RyanNelsonmusic
    @RyanNelsonmusic 2 หลายเดือนก่อน

    Thank you for shepherding me through all the ai stuff

  • @davidsalvador8989
    @davidsalvador8989 2 หลายเดือนก่อน +6

    Why do so many attribute negative intent to someone who is autistic? It is simple: he wanted OpenAI to be open source. They broke the contract and went in a different direction. He sued them because they broke the contract. Now he also has to go and do it himself to make Grok open source, because OpenAI is not going to do it... Where is the spite?

    • @simonhorna5081
      @simonhorna5081 2 หลายเดือนก่อน

      Did he want it tho’?

    • @davidsalvador8989
      @davidsalvador8989 2 หลายเดือนก่อน

      @@simonhorna5081 Well, if you listen to what he says, he wanted to make sure OpenAI was a force for good. He also played a major role in starting it with funding, no? Then they go and sell out to William Gates... Seems sussy.

    • @alpheuswoodley8435
      @alpheuswoodley8435 2 หลายเดือนก่อน

      Altruistic

  • @user-ph9cu9jo8y
    @user-ph9cu9jo8y 2 หลายเดือนก่อน +4

    Maybe I'm wrong, but isn't Musk just throwing us table scraps, since all the major players have the next iteration (GPT-4.5, Grok-2, etc.) trained and working in their labs?

    • @mooonatyeah5308
      @mooonatyeah5308 2 หลายเดือนก่อน +3

      4.5 is not a thing and an LLM as good as Grok being open source is massive for consumers.

    • @MSpotatoes
      @MSpotatoes 2 หลายเดือนก่อน

      If he keeps baking the cake, he's still going to be behind. Giving it to everyone is the best move.

    • @tsuobachi
      @tsuobachi 2 หลายเดือนก่อน

      Once it's open source, the internet of nerds gets involved, and then suddenly it gets way better. So whatever it is now is immaterial because people all over the world will turn it into a powerhouse.

  • @xcalibur1523
    @xcalibur1523 2 หลายเดือนก่อน +1

    Give it a test!!! So excited

  • @bethwithers4798
    @bethwithers4798 2 หลายเดือนก่อน

    This graphic explains it all. ThX for sharing. 😂🎉

  • @careystravels
    @careystravels 2 หลายเดือนก่อน +3

    way to go, elon...

  • @Rimston
    @Rimston 2 หลายเดือนก่อน +1

    It’s a MoE model with 86B active parameters, not all 314B will be active at once. The VRAM reqs should be manageable with 48GB with only quantization.

    • @TobiasWeg
      @TobiasWeg 2 หลายเดือนก่อน

      Good point, I was going to write this.
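
To make the total-versus-active distinction concrete, here is a toy mixture-of-experts layer in Python. It is not Grok's code: it assumes top-2 routing over 8 experts, which lines up roughly with the 86B-active / 314B-total split quoted in this thread. The point is that only the routed experts do work for a given token, but every expert's weights still have to be stored.

```python
# Toy MoE layer: compute scales with the ACTIVE experts, memory with ALL of them.
import numpy as np

rng = np.random.default_rng(0)
d_model, d_ff, n_experts, top_k = 16, 64, 8, 2

# All expert weights must be resident, even though only top_k are used per token.
experts = [
    (rng.standard_normal((d_model, d_ff)), rng.standard_normal((d_ff, d_model)))
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x: np.ndarray) -> np.ndarray:
    """One token embedding (d_model,) -> output (d_model,)."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]                 # pick the 2 best of 8 experts
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()
    out = np.zeros(d_model)
    for gate, idx in zip(gates, top):                 # only these experts run
        w_in, w_out = experts[idx]
        out += gate * (np.maximum(x @ w_in, 0.0) @ w_out)   # ReLU MLP expert
    return out

total = n_experts * 2 * d_model * d_ff                # stored expert parameters
active = top_k * 2 * d_model * d_ff                   # used per token
print(moe_forward(rng.standard_normal(d_model)).shape)
print(f"stored: {total}, active per token: {active} ({active / total:.0%})")
```

That active/stored ratio is why quantization alone helps bandwidth and compute per token, but all 314B weights still have to sit in GPU memory (or be offloaded), which is what the 48GB question above turns on.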

  • @J2897Tutorials
    @J2897Tutorials 2 หลายเดือนก่อน

    IIRC, the Apache 2.0 licence is even less restrictive than the GPL, because it allows companies to distribute their own proprietary versions, without being required to release the source code. Ironically though, it enables another ClosedAI to happen again in the future.

  • @vaisakhkm783
    @vaisakhkm783 2 หลายเดือนก่อน +3

    xD A billionaire on Twitter getting bullied into doing public service

  • @itskittyme
    @itskittyme 2 หลายเดือนก่อน +6

    Unsubscribed, sick of your titles.
    I really liked this channel but I can't work with these titles.
    Respecting your current audience is more important than clickbaiting some potential audience.

    • @supercurioTube
      @supercurioTube 2 หลายเดือนก่อน

      This title is the perfect combination of fanboyism and click-bait 😆
      IMO, this video was posted too early, because it lacks an understanding of what is needed to run this model.
      And essentially nobody can run it locally.

    • @roboreply5387
      @roboreply5387 2 หลายเดือนก่อน

      Or maybe it's just how you are hearing it in your head. To some people the release was actually stunning (read the comments, search for "naysayers"). Also the release of Grok is Uncensored, 100% Open-Source, and Massive... so what exactly is the clickbait in this title? Matthew even refrained from using Exclamation points! 😆

  • @davidstoeckl6439
    @davidstoeckl6439 2 หลายเดือนก่อน

    The word GROK comes from Robert A. Heinlein's book, Stranger in a Strange Land. One of my favorite reads. 😊 To Grok is to have whatever you encounter become part of you, from eating or drinking, to understanding a complex concept. I wonder if Elon will someday include a "Water Brother" section of GROK?

  • @sanjesco
    @sanjesco หลายเดือนก่อน

    Great job.. thanks

  • @MrRubinsh
    @MrRubinsh 2 หลายเดือนก่อน

    Would be more than happy to see you test it

  • @ShaileshYadav__
    @ShaileshYadav__ 2 หลายเดือนก่อน +1

    In my opinion Elon Musk deserves a Nobel for this...

  • @fenlandcrafts2740
    @fenlandcrafts2740 2 หลายเดือนก่อน

    Can't wait to see your testing. Are you going to be running it on a consumer computer?

  • @AdamIverson
    @AdamIverson 2 หลายเดือนก่อน

    This model is so large. I wonder, given enough RAM paired with an AMD Threadripper/Epyc, would I be able to run Grok locally on my computer?

  • @grahamschannel9705
    @grahamschannel9705 2 หลายเดือนก่อน

    Thanks for posting this. It would be great to learn how to go about pruning a model this big so it can run on consumer GPUs. Is this in your area of expertise, Matthew? And yes, please test.

  • @picksalot1
    @picksalot1 2 หลายเดือนก่อน

    Yes, please test and demo! Also, let us know the Computer Specs necessary for it to run smoothly. Thanks

    • @supercurioTube
      @supercurioTube 2 หลายเดือนก่อน +1

      Stability AI's CEO tweeted that it would require 320GB of GPU RAM to run when quantized down to 4-bit.
      There won't be any smooth running on any consumer computer, I'm afraid.
      Essentially, it's a mixture of experts where each expert is a 40B model, and there are 8 of them. Making it fit into a consumer GPU would require both chopping most of its parts off and lobotomizing it with severe quantization.

  • @robstewart2213
    @robstewart2213 หลายเดือนก่อน

    Yes, please show us install, setup, and testing. + hardware minimum requirements.

  • @raspas99
    @raspas99 2 หลายเดือนก่อน

    We want to see you test this model!

  • @Farang_Lifestyles
    @Farang_Lifestyles 2 หลายเดือนก่อน

    This will be awesome once the LLM is available in a local environment... Can't wait till you show us once it's available

  • @pupfriend
    @pupfriend 2 หลายเดือนก่อน

    Matthew - There's a funny Venn diagram that's been on the internet for a long time, and I am still trying to find an AI that can explain the humor. On the left is a beaver playing the guitar, on the right a duck playing a keyboard, and in the middle a platypus playing a keytar. Any suggestions?

  • @bjiggs01
    @bjiggs01 2 หลายเดือนก่อน

    Yes. Please test it out and let us know what you think. --- new subscriber loving your content.

  • @rogerc7960
    @rogerc7960 2 หลายเดือนก่อน

    Due to the large size of the model (314B parameters), a multi-GPU machine is required to test the model with the example code.
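
The repository's own example code is JAX-based, but as a rough illustration of what a multi-GPU sharded load looks like in practice, here is a sketch using Hugging Face transformers. It assumes a transformers-compatible conversion of the Grok-1 checkpoint exists; the repo id below is a placeholder, several hundred GB of combined GPU memory would still be needed, and some conversions may also require trust_remote_code=True.

```python
# Illustration only: sharding a very large causal LM across all visible GPUs
# with Hugging Face transformers. "some-org/grok-1" is a placeholder id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/grok-1"  # placeholder, not necessarily the official repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision: ~2 bytes per parameter
    device_map="auto",           # spread layers across every available GPU
)

inputs = tokenizer("The answer to life, the universe, and everything is",
                   return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```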

  • @executivelifehacks6747
    @executivelifehacks6747 2 หลายเดือนก่อน +1

    Please test it, Matt. Particularly interested in seeing what the "alignment" is out of the box, or appears to be, in a D&D sense and other senses.
    Noting that as IQ increases, native alignment may change, as well as how the AI communicates versus what it really thinks.

  • @edsonjr6972
    @edsonjr6972 2 หลายเดือนก่อน

    Yes, please run the tests on grok!

  • @desertfoxmb
    @desertfoxmb 2 หลายเดือนก่อน

    Please do test. Info on how to implement and results would be super!

  • @kumarapush2221
    @kumarapush2221 2 หลายเดือนก่อน

    Hell yeah. Love to see it tested.

  • @BabylonBaller
    @BabylonBaller 2 หลายเดือนก่อน

    Wow, this is STUNNING, and SHOCKING 😮

  • @yacahumax1431
    @yacahumax1431 หลายเดือนก่อน

    I would love to see you running it. What kind of machine is needed?

  • @marcusk7855
    @marcusk7855 2 หลายเดือนก่อน

    Definitely would like to see you test it.

  • @Menasaat
    @Menasaat 2 หลายเดือนก่อน

    can't wait, please run it

  • @dacianherbei
    @dacianherbei 2 หลายเดือนก่อน

    What is the configuration of the PC on which you ran Grok, if you can share it?

  • @Rigs__
    @Rigs__ 2 หลายเดือนก่อน +1

    Run it run it 💪

  • @user-jm3kr1bv9m
    @user-jm3kr1bv9m 20 วันที่ผ่านมา

    Love this product. Hope to hear more about its opening results. Hope it's open 24 hours a day.