Intro to JAX: Accelerating Machine Learning research

  • Published 21 Sep 2024

62 Comments

  • @domenicovalles2498 2 years ago +31

    This guy is so epic. He looks like he's enjoying every second of life.

  • @iva1389 2 years ago +88

    NumPy on steroids

  • @EnricoRos 2 years ago +87

    This video maximizes dInsights/dtime, is well written and easy to understand! I want to see more videos from Jake!

    • @SinDarSoup 2 years ago +5

      JAke X

    • @oncedidactic 2 years ago +3

      EduTube needs a like button for specifically this metric 🤜🤛

    • @nrrgrdn 2 years ago +2

      It maximizes Insights/time, not the derivative

    • @ilyboc 2 years ago

      @@nrrgrdn yeah, maybe that's better, but I think he means you gain continuously more insights as you advance through the video

  • @emiljanQ3 2 years ago +8

    Looks great! I tend to default to NumPy when I want to do something that isn't fully supported in Keras or PyTorch, and if I can get parallelization on GPU this easily, that's perfect!

  • @OtRatsaphong 2 years ago +5

    Thank you for this good intro to JAX. Very easy to follow and understand, Jake. Definitely going to add this to my toolkit. 👍🙏

  • @pablo_brianese 2 years ago +6

    I burst out laughing at the EspressoMaker that overloads the + operator.

  • @karansarkar1710 2 years ago +4

    This sounds very good, especially the grad and vmap functionality. I think more libraries would have to be released to compete with PyTorch.

  • @valshaev1145 1 year ago

    Thanks! This helps a lot! Being a C/C++/Python developer, I had somehow left such an important framework/library behind.

  • @lacasadeacero 2 years ago +3

    I have a question: what's the purpose of making so many frameworks? Time? Efficiency? Because I don't see it.

  • @subipan4593 2 years ago +5

    JAX seems to be more similar to PyTorch, i.e., a dynamic graph instead of a static graph as in TensorFlow.

    • @bender2752 2 years ago +2

      There's something called AutoGraph in TensorFlow actually

    • @geekye 2 years ago

      That's Flax. Jax is more like the backbone of that
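
The dynamic-vs-static distinction in this thread can be sketched in code (illustrative only; the function name f is made up): JAX executes eagerly by default, like PyTorch, while jax.jit traces a function into a static XLA computation, closer to TensorFlow's graph mode.

```python
import jax
import jax.numpy as jnp

def f(x):
    # An ordinary Python function over arrays.
    return x * 2.0 + 1.0

# Eager by default, like PyTorch: each op runs as it is called.
eager = f(jnp.arange(3.0))

# jax.jit traces f into a static XLA computation (closer to
# TensorFlow's graph mode) and reuses the compiled version.
staged = jax.jit(f)(jnp.arange(3.0))
```

Both calls produce the same values; only the execution model differs.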

  • @joshuasmith2450 2 years ago +1

    How are you going to compare torch to tf/jax when they run on a different GPU? There is no way you can argue the two GPUs are comparable; they will be faster/slower at different types of computation regardless of the software used. They should have compared the three on a common GPU if for some reason torch couldn't be run on the TPUv3.

  • @Shikalegend 2 years ago +2

    This typically looks like a problem that could be easily solved with a language that supports multi-stage programming, i.e., meta-programming as a first-class citizen, which is not really the case with Python. Like Rust, or Elixir via the Nx library, which is actually directly inspired by Jax.

  • @gaborenyedi637 2 years ago +2

    Why do you need a new lib? TensorFlow can do 90+% of this, can't it? Is it a good idea to make a completely new thing instead of extending the old one?
    One more question: do/will you have Keras support?

  • @TohaBgood2 2 years ago +5

    Ok, this is seriously cool. Is this brand new? Haven't seen it before.
    Also, in the first code sample did you mean to import vmap and pmap instead of map, or is that some kind of namespace black magic I don't understand?

    • @enricoshippole2409 2 years ago +2

      It has been around for over 2 years now I believe

    • @linminhtoo 2 years ago +1

      ya it's a typo, there's no magic

  • @AlphaMoury 2 years ago +4

    I thought JAX was running as default in tensorflow, am I missing something here?

  • @markoshivapavlovic4976 2 years ago +1

    Nice talk; that will be interesting.

  • @sitrakaforler8696 1 year ago

    Great content! BRAVO and THANKS!

  • @CharlesMacKay88 8 months ago

    2:14 Why is inputs reassigned in the predict function but never used? Should it be outputs = np.tanh(outputs)?
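
For context, here is a sketch of the kind of predict function the question refers to (reconstructed as an assumption from the standard JAX MLP demo, not necessarily the video's exact code). The reassigned inputs is consumed on the next loop iteration; only the final tanh assignment goes unused, which is likely what prompted the question.

```python
import jax.numpy as jnp

def predict(params, inputs):
    # params is a list of (W, b) pairs, one per layer.
    for W, b in params:
        outputs = jnp.dot(inputs, W) + b
        inputs = jnp.tanh(outputs)  # used by the NEXT loop iteration
    return outputs  # final layer's pre-activation
```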

  • @AJ-et3vf 2 years ago

    Awesome video! Thank you!

  • @nightwingphd8580 2 years ago

    This is wild!

  • @srinivastadepalli9431 2 years ago

    Awesome intro!

  • @L4rsTrysToMakeTut 2 years ago +2

    Why not use the Julia language?

  • @HibeePin 2 years ago +2

    Active: Jax enters Evasion, a defensive stance, for up to 2 seconds, causing all basic attacks against him to miss.

    • @dl8083 2 years ago

      I knew this was going to come up lol

  • @brandomiranda6703 2 years ago +2

    What is the difference between numerical and automatic differentiation?

    • @amitxi-y5q 2 years ago +1

      Numerical differentiation computes f'(x) by evaluating the function around x: (f(x+h) - f(x-h)) / 2h with a small h. Automatic differentiation represents the function's expression or code as a computational graph; it looks at the actual code of the function. The final derivative is obtained by propagating the values of local derivatives of simple expressions through the graph via the chain rule. The simple expressions are functions like +, -, cos(x), and exp(x), for which we know the derivatives at a given x.

  • @marcosanguineti2710 2 years ago

    Really interesting!

  • @captainlennyjapan27 2 years ago +1

    Top Jax OP

  • @rickhackro 2 years ago

    Amazing!

  • @markoshivapavlovic4976 2 years ago

    Nice framework.

  • @RH-mk3rp 1 year ago

    Something's wrong with the audio. His voice gets so soft it's hard to hear at the end of some sentences.

  • @eddisonlewis8099 9 months ago

    Interesting Stuff

  • @satwikram2479 2 years ago

    Amazing👏

  • @kuretaxyz 2 years ago +1

    Seeing JAX on the TensorFlow channel, now I am scared they'll mess this codebase too. Please don't, k thx.

  • @bicarrio 2 years ago +3

    It says "from jax import map", but it seems it should be vmap?

    • @boffo25 2 years ago

      from jax import map as vmap
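
For what it's worth, the batching transform discussed in this thread is jax.vmap; simply aliasing an import, as the reply jokes, would not give the batching behavior. A sketch of what vmap does (dot and batched_dot are illustrative names):

```python
import jax
import jax.numpy as jnp

def dot(a, b):
    return jnp.dot(a, b)  # written for single vectors

# vmap lifts the single-example function to a batched one,
# mapping over the leading axis of both arguments.
batched_dot = jax.vmap(dot)

a = jnp.ones((8, 3))
b = jnp.ones((8, 3))
out = batched_dot(a, b)  # shape (8,), one dot product per row
```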

  • @sashanktalakola 3 months ago

    1:14 lol they compared TPU runtimes with GPU runtimes

  • @mominabbas125 2 years ago

    Wow! 🏋️

  • @hfkssadfrew 2 years ago

    Seems TensorFlow is fast enough?

  • @chrisioannidis2295 2 years ago +2

    Imagine if it had a real weapon

  • @matthewpublikum3114 2 years ago

    Is this much better than SIMD?

  • @brandomiranda6703 2 years ago +1

    Does this support Apple's GPUs in the M1 Max?

    • @toastrecon 2 years ago

      I also wonder if they utilize the neural processors, too?

  • @brandomiranda6703 2 years ago +1

    I don't get it. Why do we need this if PyTorch and Keras/TF already exist?

    • @simonb.979 2 years ago +3

      I mean, it is kind of niche, but suppose you solve a problem that heavily relies on many custom functions, e.g., a very specific algebra like quaternion operations. Then you can write super-fast basic operations and compose them to build a complicated loss function that, as a whole, you can then jit-compile and let get optimized. Or differentiate it, or vectorize it, all with a tiny decorator.
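
A toy version of what the reply above describes (softplus here is a stand-in for a custom operation, not something from the video): write a basic op, compose it into a loss, then get compilation, gradients, and batching from one-line transformations.

```python
import jax
import jax.numpy as jnp

# A hand-written building block (stand-in for a custom algebra op).
def softplus(x):
    return jnp.log1p(jnp.exp(x))

# A loss composed from the custom op.
def loss(w, x):
    return jnp.sum(softplus(x * w))

fast_loss = jax.jit(loss)                      # compile the whole composition
dloss_dw = jax.grad(loss)                      # differentiate it w.r.t. w
per_batch = jax.vmap(loss, in_axes=(None, 0))  # vectorize over batches of x
```

The transformations compose, so jax.jit(jax.grad(loss)) and similar stacks work the same way.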

    • @tclf90 2 years ago

      torch and keras are "slow" and only meant for the development phase. Not sure by how much jax can outperform them.
      edit: "slow" as in computation/inference time

    • @MrAmgadHasan 1 year ago

      @@tclf90 So what frameworks are "fast"?

  • @HealthZo 7 months ago

    😮😮😮😮 0:28

  • @harryali4601 2 years ago

    Is it me, or does the backend technology of JAX sound very similar to the one in TensorFlow?

  • @RoyRogersMusicShop 1 year ago

    Google's Bard sent me here. Anyone know why?

  • @AnimeshSharma1977 2 years ago +1

    My Call Jax Son #AI ?

  • @jakewong6305 2 years ago

    JAX came out because of torch

  • @example.com. 2 years ago +1

    I'm quitting numpy

  • @millco-.- 2 years ago

    it's tiresome