Merge LLMs to Make Best Performing AI Model

Comments • 125

  • @maya-akim
    @maya-akim  9 months ago +66

    i hope you find the video useful and don't forget to show (and brag about) your blended models!

    • @TheEarlVix
      @TheEarlVix 9 months ago +1

      Thank you. Found this from your post on X.

    • @chrisBruner
      @chrisBruner 9 months ago +1

      Your videos are always very good and cutting edge.

    • @qiqqaqwerty1713
      @qiqqaqwerty1713 9 months ago +2

      Thanks for the very informative video.
      Cheers from "Down Under"!

    • @XUHANSHENG-i3c
      @XUHANSHENG-i3c 9 months ago +1

      Your videos are so great! I'll stay humble and learn as much as I can!

    • @Karthikarjunan16
      @Karthikarjunan16 6 months ago

      Hi, your YouTube video links can't be embedded in other apps like Notion. Before publishing a video, please check in YouTube Studio that the embed permission is enabled.

  • @qiqqaqwerty1713
    @qiqqaqwerty1713 9 months ago +3

    🎯 Key takeaways for quick navigation; note that this summary is no substitute for watching the complete video for a more in-depth understanding:
    Main Ideas:
    - 🌍 Model blending is an innovative approach to surpass the performance of high-cost models with limited resources.
    - 🤖 Non-experts can effectively blend models, demonstrating the technique's accessibility.
    - 💡 The blend allows for specialized functionality, combining models tuned for diverse tasks into a powerhouse model.
    - 🛠 The merging process involves selecting compatible models, defining parameters, and executing the blend with basic command line knowledge.
    - 🔄 Various blending methods like task vector arithmetic and SLERP offer unique advantages for custom model creation.
    - 📚 Proper selection and preparation of models are crucial, with a focus on architecture compatibility and avoiding common pitfalls.
    - 🏆 Blended models can achieve top rankings on leaderboards, though their position may fluctuate.
    - 🤔 The effectiveness of benchmarks in evaluating model intelligence is questioned, highlighting the issue of data contamination.
    Takeaways:
    00:00 *🤖 Introduction to Model Blending*
    - Introduction to the concepts of model blending, showcasing the power of combining models to overcome resource limitations and improve performance.
    - Highlights two models, Mixtral and Ramonda, emphasizing the potential of model blending even with limited resources.
    01:24 *📘 Basics of Model Blending*
    - Detailed explanation of model blending, its significance, and the methodology behind efficient blending.
    - Discusses the blending process, the importance of model selection, and the steps involved in creating a blended model.
    02:05 *💡 The Promise of Blending*
    - Explores the potential of blending models to create top-performing LLMs without the need for extensive resources.
    - Focus on the accessibility of fine-tuning and blending for personalized model development.
    03:33 *🛠️ How to Blend Models*
    - Provides a practical guide on blending models using MergeKit, including setup and execution steps.
    - Emphasizes the ease of blending models with basic knowledge and the right tools, offering an approachable method for enthusiasts and professionals alike.
    05:33 *🧪 Detailed Blending Methods*
    - Deep dive into various blending techniques such as task vector arithmetic, SLERP, TIES, and DARE, explaining their unique applications and benefits.
    - Discusses the technical aspects of model blending, offering insights into choosing the right method for specific goals.
    08:17 *🖥️ Preparing for Blending*
    - Guidelines on selecting compatible models for blending, emphasizing the importance of architecture and layer compatibility.
    - Instructions for downloading models from Hugging Face and preparing for the blending process.
    10:33 *📝 Configuring YAML for Blending*
    - Step-by-step instructions on setting up YAML files for blending, highlighting the importance of specifying base models, merge methods, and parameters.
    - Offers practical tips for configuring blending parameters to optimize the blending process (a minimal example config is sketched after this summary).
    13:42 *🚀 Executing the Blend and Evaluation*
    - Detailed walkthrough of the blending execution using MergeKit and subsequent evaluation through a text generation interface.
    - Encourages testing and fine-tuning of the blended model before submission to benchmarks or public use.
    15:45 *📊 Performance Testing and Data Contamination*
    - Discusses the significance of performance testing on open LLM leaderboards and addresses the issue of data contamination in model training.
    - Highlights the importance of careful model selection and blending strategy to avoid overfitting and ensure genuine improvements in model performance.
    I hope this helps everybody!
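
    As a rough illustration of the YAML-configuration step above (a minimal sketch assuming MergeKit's documented SLERP config format; the model names, layer ranges, and the 0.5 interpolation factor are placeholder assumptions, not the exact settings from the video):

    # Minimal sketch: write a SLERP merge config and run it with MergeKit's CLI.
    # Model names are placeholders, not the models used in the video.
    import subprocess
    import yaml  # pip install pyyaml

    config = {
        "merge_method": "slerp",
        "base_model": "mistralai/Mistral-7B-v0.1",                   # assumed placeholder
        "slices": [{
            "sources": [
                {"model": "mistralai/Mistral-7B-v0.1", "layer_range": [0, 32]},
                {"model": "openchat/openchat-3.5-0106", "layer_range": [0, 32]},
            ],
        }],
        "parameters": {"t": 0.5},   # 0.5 = halfway between the two parent models
        "dtype": "bfloat16",
    }

    with open("merge-config.yaml", "w") as f:
        yaml.safe_dump(config, f)

    # mergekit-yaml is MergeKit's command-line entry point; --cuda uses a GPU if available.
    subprocess.run(["mergekit-yaml", "merge-config.yaml", "./merged-model", "--cuda"], check=True)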

  • @Santiino
    @Santiino 9 months ago +22

    It's mind-blowing to me how good your videos are, yet you are still so unknown. Keep it up!

  • @Ludecan
    @Ludecan 9 months ago +10

    Awesome video!! Been following your series on building AI agents and they're very good! Thanks for sharing!

  • @EduGuti9000
    @EduGuti9000 9 months ago

    Thanks!

  • @mysticaltech
    @mysticaltech 8 months ago +1

    Maya, you are good at this stuff. You are averaging my internal mind vectors to make AI easy. Keep doing so!

  • @s-miguel
    @s-miguel 9 months ago +2

    This is destined to evolve into a meticulously curated, go-to channel of human reliability for years to come. Thank you very much for the exceptional quality you provide!

  • @ivandukic
    @ivandukic 9 months ago +3

    Wow, what an incredible explanation of merge methods. Thank you.

  • @SebastianKreutzberger
    @SebastianKreutzberger 9 months ago +3

    Fantastic video, so well prepared, explained in a fool-proof way, and a really cutting-edge topic. Best AI YouTuber out there - thank you 🙏

  • @seanhynes9516
    @seanhynes9516 7 months ago +2

    Awesome, thanks for the great video. Very well explained, great diagrams! :)

  • @Nifty-Stuff
    @Nifty-Stuff 8 months ago +2

    Blending LLMs is a fascinating idea. The idea left me wondering: why hasn't anybody developed a system/app that takes the APIs from the top LLMs, creates agents for each, and then has these agents all work together to brainstorm, debate, review, and solve problems? I often get 4 different answers from 4 LLMs, so why not have them all set up as agents "in one room" working together to come up with the "best" solution? I can't find anybody that's tried this... why not? Wouldn't having the "top minds" (LLMs) working together produce better results?

    • @bamh1re318
      @bamh1re318 7 months ago

      They could become worse, or give you 4 different answers, or keep talking in circles among themselves

    • @knawdawg
      @knawdawg 3 months ago

      A few reasons: APIs don't give you any information about model structure and are pre-customized. Most of the data is proprietary, and for it to even make sense you have to train it over and over. We are FAR from perfect; innovation is found by understanding flaws. Trial and error, societal brute force. I somehow have a local model on my phone that is very, very generative. What's next?

  • @Dan-zd1qr
    @Dan-zd1qr 4 months ago

    I've learned a lot more about LLMs, architecture, and merging principles in just a few minutes.

  • @robinmordasiewicz
    @robinmordasiewicz 9 months ago

    Wow, most sophisticated YouTuber ever. New favorite channel.

  • @scienceandmind3065
    @scienceandmind3065 9 months ago +2

    Great video and exactly what I need at the moment. I have a lot of specialized models for science, translation, coding, finance, etc., but no good way of combining them.

    • @maya-akim
      @maya-akim  9 months ago

      best of luck! and share with us your results if you want :)

  • @blim420
    @blim420 9 months ago +4

    Excellent walkthrough, thanks!

  • @Rob_Steele
    @Rob_Steele 9 months ago +3

    Great video Maya! Keep em coming! 😎

  • @VFraunhofer
    @VFraunhofer 2 months ago

    That was a very interesting video, thanks for your work!

  • @GetzAI
    @GetzAI 9 months ago +3

    Thanks Maya!

  • @overcuriousity
    @overcuriousity 9 months ago

    Interesting, easy to follow, well researched, and the results are critically scrutinized. Like your content!

  • @HealthyNutrition-y
    @HealthyNutrition-y 2 months ago

    🔴🔵 "Intelligence is compression of information." This is one of the most useful videos I believe I have ever watched on YouTube.

  • @jeremybristol4374
    @jeremybristol4374 9 months ago +4

    Love the props with storytelling! Great instructional video!

  • @hand-eye4517
    @hand-eye4517 9 months ago +1

    We thank you for all the amazing content, and as such, being a great content creator, I don't wanna sound nitpicky, but since you are already attracting and leaning towards the DIY crowd, you may as well be using the open-source tools as well (VSCodium, etc.). Just a small critique because I love the content.

    • @maya-akim
      @maya-akim  9 months ago

      Hey, thanks for the support and feedback 🙏🏻 I'm not sure I totally follow. Do you suggest that I switch to Codium? Honestly, before your comment I assumed that VS Code is open source, but after googling a bit I realized that the product itself isn't actually. But it looks like Codium is open source, so you think that's a better fit for the channel?

  • @mayorc
    @mayorc 9 months ago +1

    Great video Maya. Keep it up ❕❕❕

  • @leumas_tai
    @leumas_tai 9 months ago +4

    Great video. How does this differ from Mixture of Experts (MoE)?

    • @maya-akim
      @maya-akim  9 months ago +3

      That's an excellent question! First of all, I noticed that the community doesn't consider MoE to be merged models, even though you can use mergekit to create an MoE yourself (instructions in the description box). My understanding is that blended models become "fixed" when it comes to their capabilities.
      MoE capabilities change dynamically thanks to a gating mechanism that decides how much of each expert's advice to follow for a given input. You specify prompts (or simple strings with mergekit) that activate a specific expert. For example, here's a configuration that I used for MoE: huggingface.co/mayacinka/West-Ramen-7Bx4 As you can see, positive and negative prompts will "guide" the model.
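
      For a rough sense of the shape such an MoE config takes (a sketch only, assuming MergeKit's MoE config format; this is not the West-Ramen-7Bx4 recipe, and the model names and prompts are placeholders):

      # Sketch of a MergeKit MoE config; model names and prompts are placeholder assumptions.
      import subprocess
      import yaml  # pip install pyyaml

      moe_config = {
          "base_model": "mistralai/Mistral-7B-v0.1",            # assumed placeholder
          "gate_mode": "hidden",                                # route using hidden-state similarity
          "dtype": "bfloat16",
          "experts": [
              {"source_model": "openchat/openchat-3.5-0106",
               "positive_prompts": ["chat", "explain this"],    # inputs that should activate this expert
               "negative_prompts": ["write code"]},
              {"source_model": "teknium/OpenHermes-2.5-Mistral-7B",
               "positive_prompts": ["write code", "python"]},
          ],
      }

      with open("moe-config.yaml", "w") as f:
          yaml.safe_dump(moe_config, f)

      # mergekit-moe assembles the listed experts and gates into a single MoE checkpoint.
      subprocess.run(["mergekit-moe", "moe-config.yaml", "./moe-model"], check=True)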

    • @leumas_tai
      @leumas_tai 9 months ago

      @@maya-akim Interesting, thanks for sharing your thoughts. I'll look into it.

  • @iasdfasdfasdfasdf
    @iasdfasdfasdfasdf 3 months ago

    Wow, where did you go? Your video editing and the use of paper cutouts is so unique.

  • @minae1423
    @minae1423 9 months ago

    well articulated and educational video, thank you Maya!🙏🏼

  • @ajay--yadav
    @ajay--yadav 9 months ago

    A lot of information about so many topics, presented nicely.

  • @geekyprogrammer4831
    @geekyprogrammer4831 9 months ago +3

    Very underrated channel!! This is enlightening. How can a person be so smart and beautiful at the same time 😭😭

    • @InitGet
      @InitGet 3 months ago

      *thwap* First warning.
      Next time you go to horny jail...
      She is quite lovely though

  • @synchro-dentally1965
    @synchro-dentally1965 9 months ago

    Excellent video! The development outlook seems open to so many possibilities. I'm curious if anyone will find advantages in networks built via diffusion (similar to image generation), or if there will be more real-time dynamics implemented as the model responds to a query.

  • @doomstertech8305
    @doomstertech8305 9 months ago

    Great video, loved the explanation of all the technical stuff. Would love to know your process for reading and understanding these topics in depth.

  • @xspydazx
    @xspydazx 9 months ago

    Very good lesson and explanation! So far the best on this subject, as the main problem I had was running the models afterwards. I could not find a definitive method that works... Despite one of the models scoring high, it could not run in the HF Inference plugin on the model card.

  • @deter3
    @deter3 4 months ago

    You are a really experienced AI engineer.

  • @rein436
    @rein436 9 months ago +1

    Very insightful 👍

  • @lokeshart3340
    @lokeshart3340 9 months ago +5

    Can we blend multimodal models like LLaVA, Mistral, and Gemini Vision? Can you make a video on it please? ❤❤

    • @maya-akim
      @maya-akim  9 months ago +3

      Oh, that's interesting. I've got to say I didn't try, but I'm curious myself! I'll see how it goes and either I'll make a video or I'll let you know somehow.

    • @lokeshart3340
      @lokeshart3340 9 months ago

      @@maya-akim sure.

    • @VFraunhofer
      @VFraunhofer 2 months ago

      That's exactly what I thought about; it would be nice not to have to use a lot of different models and make them communicate. @maya-akim: Did you try it since?

    • @lokeshart3340
      @lokeshart3340 2 months ago

      @@VFraunhofer yes i am still waiting

  • @gerykis
    @gerykis 9 months ago

    Very good explanation. I'm looking for a similarly easy-to-understand video on how to fine-tune a model locally.

  • @Cloudvenus666
    @Cloudvenus666 9 months ago +2

    What happens if you merge two models of the same family but they each have different context lengths? Does the model with the larger token window take precedence?

    • @maya-akim
      @maya-akim  9 months ago +2

      It will depend on the "base model". But in the cases that don't require defining a base model (like passthrough), or this hacky case here: huggingface.co/mayacinka/chatty-djinn-14B, when I merged models with 32K and 8K context windows, the 32K models overpowered the 8K OpenChat model.

    • @Cloudvenus666
      @Cloudvenus666 9 months ago

      @@maya-akim thank you

  • @DemiGoodUA
    @DemiGoodUA 9 months ago +1

    Nice video! Do we have the ability to fine-tune the model on our own codebase?

  • @amandamate9117
    @amandamate9117 9 months ago +1

    Can you test agent frameworks like CrewAI with Claude 3 Opus?

  • @abdallamosa8836
    @abdallamosa8836 8 months ago

    Is combining tools like SWE-Agent, CrewAI, and OS-Copilot into a cohesive agentic workflow possible?

  • @Linguisticsfreak
    @Linguisticsfreak 7 months ago

    Since we don't have access to the training data, it is simply impossible/unfeasible to choose models based on whether they have or don't have contaminated data.

  • @MiguelLopez-mu1ss
    @MiguelLopez-mu1ss 9 months ago

    Thank you for the insights

  • @johnefan
    @johnefan 9 months ago

    Great Video👏🏻

  • @CDIGS-EI-hv3cf
    @CDIGS-EI-hv3cf 4 months ago

    Hi Maya, thank you for this video, which is one of the best on this topic recently. But I'm sorry, I still haven't understood it yet... What do I have to do after the mergekit installation? How do I run the process? Can you make a video where you show the process? I also tried the model tree, but in my case it hasn't shown anything...

  • @nimesh.akalanka
    @nimesh.akalanka 7 months ago

    How can I fine-tune the LLAMA 3 8B model for free on my local hardware, specifically a ThinkStation P620 Tower Workstation with an AMD Ryzen Threadripper PRO 5945WX processor, 128 GB DDR4 RAM, and two NVIDIA RTX A4000 16GB GPUs in SLI? I am new to this and have prepared a dataset for training. Is this feasible?

  • @Alf-Dee
    @Alf-Dee 9 months ago

    Amazing video! I didn't know it could be done.
    I am definitely going to make my own uncensored blended model for coding.
    I am tired of OpenAI telling me that I should not modify/hack code without the owner's permission even if I am the owner and I am trying to test how solid the code is…

  • @abdallahtoba7568
    @abdallahtoba7568 5 months ago

    What's the difference between task arithmetic and SLERP?
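
    For context, the rough intuition behind the two methods (a toy sketch of the general idea, not MergeKit's actual implementation or a quote from the video): task arithmetic adds weighted "task vector" deltas from fine-tuned models onto a base model, while SLERP spherically interpolates between exactly two sets of weights.

    # Toy numpy sketch of the two ideas, for intuition only.
    import numpy as np

    def task_arithmetic(base, finetuned_models, weights):
        # Each "task vector" is (finetuned - base); add them to the base, scaled by weights.
        merged = base.copy()
        for ft, w in zip(finetuned_models, weights):
            merged += w * (ft - base)
        return merged

    def slerp(a, b, t):
        # Spherical linear interpolation between two weight vectors a and b (0 <= t <= 1).
        a_n, b_n = a / np.linalg.norm(a), b / np.linalg.norm(b)
        omega = np.arccos(np.clip(np.dot(a_n, b_n), -1.0, 1.0))
        if np.isclose(omega, 0.0):
            return (1 - t) * a + t * b   # nearly parallel vectors: plain linear interpolation
        return (np.sin((1 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)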

  • @ulrichbeutenmuller8101
    @ulrichbeutenmuller8101 9 months ago

    thanks, great video!

  • @SanctuaryGardenLiving
    @SanctuaryGardenLiving 9 months ago

    The combination of spending time messing with AI and watching your videos is inspiring me to build my own workstation.
    Not sure if that's smart considering I don't know how to code.
    So far I have ordered:
    super micro x12dai mobo
    2 platinum 8352s
    2 rtx 3090s
    2 sata 12 tb
    2 optane nvmes for os and quick retrieval stuff
    128 gb lrdimm ddr4
    E-ATX case, cables & ps
    Do you do any consulting work via zoom? I may need some direction soon.

  • @JoelSiby-ju5pf
    @JoelSiby-ju5pf 8 months ago

    After that, could I use my customized model from Hugging Face, or locally, in my apps?

    • @JoelSiby-ju5pf
      @JoelSiby-ju5pf 8 months ago

      Also, now that I have decided to use this model for creating my gen-AI apps, how would I load it?
      llm = ??? # provide me the syntax for this
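
      One common way to load a merged model (a sketch using the Hugging Face transformers library; the repo id below is a placeholder, and device_map="auto" assumes the accelerate package is installed):

      # Sketch only: load a merged model from the Hugging Face Hub (or a local path).
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "your-username/your-merged-model"   # placeholder; can also be "./merged-model"
      tokenizer = AutoTokenizer.from_pretrained(model_id)
      llm = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

      prompt = "Hello, what can you do?"
      inputs = tokenizer(prompt, return_tensors="pt").to(llm.device)
      outputs = llm.generate(**inputs, max_new_tokens=64)
      print(tokenizer.decode(outputs[0], skip_special_tokens=True))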

  • @yougaming8217
    @yougaming8217 7 months ago

    Is it possible to merge 7B with 8B models?

  • @noblewarrior4776
    @noblewarrior4776 9 months ago

    You are amazing… thank you

  • @amandamate9117
    @amandamate9117 5 months ago +2

    Why did you stop posting videos on this channel?

  • @BrandonFurtwangler
    @BrandonFurtwangler 9 months ago

    Why does SLERP only support two models?
    Can't you just slerp between pairs, then slerp the slerps, etc., until you have one?

    • @maya-akim
      @maya-akim  9 months ago +1

      yep, you absolutely can slerp the slerps of the previously slerped slerps. That's what a lot of people do.

  • @tiberiumihairezus417
    @tiberiumihairezus417 9 months ago

    Great content.

  • @johntdavies
    @johntdavies 9 months ago

    Maya, a great video, thank you. Quick question: where are you based? The reason I ask is that I'm looking for an AI speaker in the UK; you came to mind, so I was just wondering. Again, excellent video, amazing depth.

    • @maya-akim
      @maya-akim  9 months ago

      hey John, thanks a lot for the support! I live in Austin, TX, so I'm afraid I won't be of any help :/

    • @johntdavies
      @johntdavies 9 months ago

      @@maya-akim Damn, that's a long way away! Never mind, keep up the great work and thanks for getting back 🙂

  • @AacisMistake
    @AacisMistake 9 months ago +1

    My only question while watching was: why should I make a model? I figure there is going to be an infinite number of models created by people, and soon AI models created by AI models. So my question is, what is the point of making a custom model aside from fine-tuning on data? I use AutoGen; would creating a model like you're doing empower a local model to, let's say, chat on my data and be good at function calling? Maybe this would be an experimental way to make my own model specifically for AutoGen? I know someone out there is already working on that specifically, and you even showed those models specifically used for function calling in one of your other vids.

    • @maya-akim
      @maya-akim  9 months ago +2

      Oh, that's a great question! Here's how I would use it: 1. Find a model that scores highly on the MMLU benchmark (which means that it has diverse knowledge). Blend it with a model that you like because of its "vibe". For me that would be OpenChat, because I like how conversational it is. The blended model would perform better than the two "parent" models. 2. I'm actually working on this one. I'm trying to fine-tune one model to specifically be good at crafting YouTube titles, and another one to write good YouTube scripts. Then, I'll try to blend those two.

  • @RahulBhalley
    @RahulBhalley 9 months ago

    Thanks! What software are you running for loading and inferencing your merged LLM using localhost in browser?

    • @maya-akim
      @maya-akim  9 months ago

      That's oobabooga's text-generation web UI. It allows you to run any model, whether it's saved locally or on Hugging Face's hub.

    • @RahulBhalley
      @RahulBhalley 9 months ago

      thanks@@maya-akim

  • @justinwhite2725
    @justinwhite2725 9 months ago +1

    LLMs catching up to something Stable Diffusion users have been doing for a while.
    Open source is the way.

  • @axe863
    @axe863 8 months ago

    Stacked and cascading ensembling have been around for a while.

  • @inout3394
    @inout3394 9 months ago

    LLMs: tokenization vs. Mamba, please make a video about this.

  • @ashgtd
    @ashgtd 6 months ago +1

    it's been 3 months. More please

  • @wiktormigaszewski8684
    @wiktormigaszewski8684 2 months ago

    cool!

  • @bgNinjashows
    @bgNinjashows 9 months ago

    Genius

  • @oryxchannel
    @oryxchannel 7 months ago

    Visual Studio for Mac is being discontinued on August 31, 2024.

  • @chuchel3156
    @chuchel3156 9 months ago

    Nice video

  • @LouvoresPauloRicardo
    @LouvoresPauloRicardo 9 months ago

    Maya.. you are the girl!!!

  • @GuidedBreathing
    @GuidedBreathing 9 months ago

    3:40 and now add robots 🤖 cheers🥂

  • @Maisonier
    @Maisonier 8 months ago +1

    So it's like mixing colors back in kindergarten: you'd always blend everything together hoping to create this amazing hue, but it always just ended up a muddy, ugly brown.

  • @gareththomas3234
    @gareththomas3234 8 months ago

    Why not just use AutoGen?

    • @free_thinker4958
      @free_thinker4958 8 months ago +1

      AutoGen is full of crap

  • @SinanAkkoyun
    @SinanAkkoyun 9 months ago

    Bach, WTC 1, Prelude 21 😍

  • @khhamsafar
    @khhamsafar 2 months ago

    @maya-akim
    Why are you not uploading new videos these days? Are you alive? 🙄

  • @siddhubhai2508
    @siddhubhai2508 5 months ago +1

    Despite the fact that India has the most talented developers, in terms of AI development I think India is falling behind 😢

  • @Dhirajkumar-ls1ws
    @Dhirajkumar-ls1ws 9 months ago

    👍

  • @PaulSchwarzer-ou9sw
    @PaulSchwarzer-ou9sw 9 months ago

    🎉

  • @zippytechnologies
    @zippytechnologies 8 months ago

    At first I was excited to see a new video with useful info - but when it got to that crime-scene mapping thing you do - well... sorta creepy, no? What is that method called? Conspiracy mapping? Good visuals, but wow... I lost track of what was going on with it... maybe it was more of a "Why are you putting holes in your walls?" Some poor guy is gonna be like "...where's the spackle and putty knife? Some tenant/wife/daughter/kid poked a bunch of holes in my wall"... I never understood how so many holes got poked into my daughter's walls or even our living room walls (ahem... the wife), but maybe this is just something that is fun to do? Now, do a video on how to patch all those little holes and get a paint roller with medium nap to repaint and cover everything up - but don't just paint a small area... no... gonna probably have to paint the whole wall so there are no more streaks and visible coverups... or at least learn how to feather out the edges so they blend better with the existing paint on the walls... ugh... can't plug those holes and paint with an AI agent (yet)... so at least some skills are still worth knowing and learning... go get a guy or gal with some handiwork skills - mechanical skills or something useful that AI can't do well and likely never will (for a long time), and you at least know your guy/gal will be useful given that AI will be putting lots of other people out of work (and is already doing so). I need to hire some people to help me get this working for our company - but I can't afford to keep paying drywall contractors every time we get a new idea... lol

    • @yellowboat8773
      @yellowboat8773 8 months ago

      Wow, you have too much time on your hands

    • @iasdfasdfasdfasdf
      @iasdfasdfasdfasdf 3 months ago

      Textbook ADHD comment right here.

  • @ServerGamingTop100
    @ServerGamingTop100 9 months ago +1

    It's not about collecting links with information and adding them below the video... The important information is: which models are compatible and how to write the configuration file, which you barely mention! I can find all these links myself.

  • @user-he8qc4mr4i
    @user-he8qc4mr4i 8 months ago

    Monolithic LLMs < Multi-Agents

    • @rinokpp1692
      @rinokpp1692 8 months ago

      Can I use an agent on my mobile device?

  • @NickDoddTV
    @NickDoddTV 9 months ago

    Good soup

  • @JINGWA64
    @JINGWA64 7 months ago

    The problem with making vids that require prior knowledge and experience is that those who would find the information most useful cannot make use of it, because they lack that prior knowledge and experience, yet at the same time the information provided in the vid is pitched at a level that serves a novice who had no prior interest. So who is the audience being catered to?

  • @LukasSmith827
    @LukasSmith827 9 months ago

    your timing is scary

    • @maya-akim
      @maya-akim  9 months ago

      what do you mean?

  • @VegascoinVegas
    @VegascoinVegas 5 months ago

    I noticed almost no accent, but you say ".sli" instead of ".cli", and that seems to confirm you also speak an Eastern European language. I want to learn Serbian, so I think your name is Maja. Just guessing.
    I wonder, if you wanted to use AI to deep-fake your voice to produce the exact same English narration in a different language, would you immediately know which model you would train with this textual script?

  • @PazLeBon
    @PazLeBon 9 months ago

    so a claude haha

  • @florentflote
    @florentflote 9 months ago

  • @MichaelDomer
    @MichaelDomer 8 months ago +1

    Too many videos by AI nerds on YouTube... for AI nerds. Hardly anyone makes videos for the average John and Jane, resulting in a large group of people detached from AI.

    • @maya-akim
      @maya-akim  8 months ago

      What types of videos would appeal to the average John and Jane?

  • @MichaelDomer
    @MichaelDomer 8 months ago +1

    Stop saying it's so simple... yes, for you it is.

    • @DudeBlox-y8m
      @DudeBlox-y8m 5 months ago

      Haha... true, best to avoid that.

  • @usmanthechamp123
    @usmanthechamp123 9 months ago

    @maya-akim "Blending" would be the best word for this, right? "Merging", I think, is the word people are using for it, don't you think?

  • @EduGuti9000
    @EduGuti9000 9 months ago

    Thanks!

    • @maya-akim
      @maya-akim  9 months ago +1

      Thanks a lot 🤗