APPLE, Seriously??? Is This lying? 🤦‍♂️- 64-core M1 Ultra vs RTX 3090

  • Published 26 Sep 2024

Comments • 2K

  • @kb420ps
    @kb420ps 2 years ago +4560

    Apple's marketing department is a BEAST!!! They'll never let a simple thing like the TRUTH get in the way of marketing their products!!!!

    • @theTechNotice
      @theTechNotice 2 years ago +303

      🤦

    • @kb420ps
      @kb420ps 2 years ago +62

      @@theTechNotice lmao!!!

    • @ItumelengS
      @ItumelengS 2 years ago +73

      Truth is inconvenient at times

    • @Teluric2
      @Teluric2 2 years ago +126

      It's true. Apple, along with many YouTubers, claimed the M1 was faster than an i9, and that was a lie

    • @tiromandal6399
      @tiromandal6399 2 years ago +48

      @@Teluric2 You mean "iShills"?

  • @niceone1456
    @niceone1456 2 years ago +1421

    Apple engineer: the 64-core GPU is a beast at video encoding, it outperforms the RTX 3090. BUT it's nowhere near an Nvidia GPU's performance in other fields.
    Apple marketing genius: ok, "outperforms Nvidia GPU"

    • @ja-kidnb6416
      @ja-kidnb6416 2 years ago +39

      yeah JUST video encoding. I had to edit on an M1 Ultra and it basically was an annoyance compared to my 5950x with its 2060 Super. Never again... (and I loved my 2011 Macbook Pro!)

    • @chaos7360
      @chaos7360 2 years ago +39

      not even that, my coworker has an M1 Ultra and it's 4x slower than my PC in video rendering

    • @rebel2809
      @rebel2809 2 years ago +6

      i feel like that's genuinely what happened. that sounds like such an Apple thing to do.

    • @fayenotfaye
      @fayenotfaye 2 years ago +14

      @@ja-kidnb6416 you're probably using an application that either isn't optimised for Apple Silicon or doesn't support a codec that the M1 Ultra is efficient at encoding

    • @destinyhero4795
      @destinyhero4795 2 years ago +9

      Me with 1050ti and 2nd gen Ryzen, "hey, m1 ultra is not that bad :("

  • @attilaherbert6227
    @attilaherbert6227 2 years ago +624

    I think what most people missed is the axis description: "Relative performance". Basically, this graph means absolutely nothing, it's a fancy gimmick that makes your low-power computer look strong.

    • @MrFWStoner
      @MrFWStoner 2 years ago +5

      Thank you! 🙏

    • @iffy_too4289
      @iffy_too4289 2 years ago +17

      Exactly. I saw that and thought: WTF, relative to what? It can't be power, as that's the other axis, so what is it?

    • @noturbeezwaxbeaulac1383
      @noturbeezwaxbeaulac1383 2 years ago +11

      no. it shows that the system does NOT consume as much electrical power. period.

    • @jovaraszigmantas
      @jovaraszigmantas 2 years ago +11

      And even then it still doesn't measure up. At times the 3090 was drawing 4 times more power but posting 5 times better benchmarks, meaning the relative performance is not always better on Apple. But you are definitely correct: comparing anything mobile-grade to desktop-grade will always be useless, as it is not a fair apples-to-apples comparison.

    • @jessepatterson8897
      @jessepatterson8897 2 years ago +4

      it's comparing performance at matched TDPs, no one remotely lied, but good job falling for clickbait

  • @sunraiii
    @sunraiii 2 years ago +375

    Buy the product, not the marketing, my friends. Always good to double-check companies' claims. Great video!

    • @arjunsingh68112
      @arjunsingh68112 2 years ago +10

      Nvidia/Intel/AMD won't give you the flex Apple provides. My mobile 1050 Ti is still better than most M1 Macs out there, for a fraction of the price.

    • @goshu7009
      @goshu7009 2 years ago +3

      @@arjunsingh68112 Every computer is better than a Mac. That's because a Mac is not a computer, right? At least that's what they say?

  • @xjet
    @xjet 2 years ago +511

    The M1 uses one fifth the power (instantaneous measurement) but takes five times as long -- therefore the total energy consumed per task is about the same. No significant win there either 😞

    • @zadsazhad
      @zadsazhad 2 years ago +1

      Ikr

    • @VatchGaming
      @VatchGaming 2 years ago

      what about the entry level?

    • @zadsazhad
      @zadsazhad 2 years ago +5

      @@VatchGaming the M1 is good enough for everyday use with great battery life, but the optimization still isn't there and you can still get better performance for the price; the trade-offs are battery and audio

    • @Real_MisterSir
      @Real_MisterSir 2 years ago +19

      Except that with the M1 you will never have fast workflow, so it's a net loss for Apple in that regard. What you have is portability, that's about it. For any professional workload outside of specific media encoding that the M1 is optimized for, as a professional you are going to lose money by using the M1 simply because of the time you're wasting, and how the performance per watt doesn't scale up linearly (aka the more performance you require, the worse the power consumption for that performance gets).
      And then there's the price, which isn't in the M1's favor either. Nor is the fact that many professional programs for rendering allow for cpu+gpu rendering, so you aren't losing out by having a separate cpu and gpu in your system.

    • @Real_MisterSir
      @Real_MisterSir 2 years ago +1

      @@zadsazhad But even for everyday use and battery life, who spends 1k over the regular M1 which is already incredibly pricey..
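
The arithmetic in this thread can be sanity-checked in a few lines. A minimal sketch, assuming the round numbers used in the comments (450 W for the 3090 system, one fifth the power and five times the runtime for the M1 Ultra; these are illustrative, not measured):

```python
# Energy per task = average power draw x time to finish the job.
# Illustrative numbers only (assumed, not measured): say the RTX 3090
# system draws ~450 W and finishes a render in 30 minutes, while the
# M1 Ultra draws one fifth the power but takes five times as long.

def energy_wh(power_watts: float, minutes: float) -> float:
    """Energy consumed, in watt-hours."""
    return power_watts * minutes / 60

rtx_energy = energy_wh(450, 30)         # 225.0 Wh
m1_energy = energy_wh(450 / 5, 30 * 5)  # 225.0 Wh -> same energy per task

print(rtx_energy, m1_energy)
```

When power and time scale inversely by the same factor, the energy per task is identical, which is the point xjet is making.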

  • @super_hero2
    @super_hero2 2 years ago +775

    So basically the 3090 is 5x faster than the Ultra but draws 5x more power; that is pretty much linear scaling. Good job NVIDIA.

    • @JustRuda
      @JustRuda 2 years ago +175

      Yes, but you can still optimize the voltage. I just undervolted an RTX 3080 12G; I now have about 100 W less power consumption, but the performance is literally almost the same (3-5% difference).

    • @tkpenalty
      @tkpenalty 2 years ago +52

      @@JustRuda Also add the fact that Samsung 8nm isn't a great process compared to TSMC 5nm / N7

    • @whitedawn215
      @whitedawn215 2 years ago +49

      Power doesn't generally scale linearly; the same wattage through an M1 chip would not give it nearly the same performance.

    • @super_hero2
      @super_hero2 2 years ago +91

      @@whitedawn215 It means the Ultra still has a long way to go to catch up to NVIDIA. It can't even beat NVIDIA in efficiency in these tests: it uses 5x less power but is also 5x slower. That is terrible performance for the Ultra.

    • @juslenjeyatharan1004
      @juslenjeyatharan1004 2 years ago +2

      @@super_hero2 Actually, that's on par in performance-per-watt. It's just that the Nvidia can also go 5x faster.

  • @timotmon
    @timotmon 2 years ago +967

    You nailed it Lauri. Apple has been trying to appeal to 3D artists since the M1 when they made it big news that Cinema 4D was optimized for the new chip. So to come out with a graph that misleading is a downright dirty play in my opinion. Nice work!

    • @Teluric2
      @Teluric2 2 years ago

      You don't optimize for a chip, you write a native version for the chip.
      Same crap Apple always does: trying to fool people, as if everybody has the IQ
      of an Apple fanboy

    • @1.1kSubChallengeWithoutAnyVid
      @1.1kSubChallengeWithoutAnyVid 2 years ago +12

      Where are the 2 replies? I want to read them.

    • @jmun3688
      @jmun3688 2 years ago +7

      @@1.1kSubChallengeWithoutAnyVid fr

    • @timotmon
      @timotmon 2 years ago +36

      @Garrus Vakarian He doesn't do game benchmarks; that's an entirely different thing. His channel focuses exclusively on creative production. So yeah, the M1 Ultra can't compete with a 3090 in GPU-centric creative software on almost any level. It certainly can't compete in 3D rendering applications like Redshift and Octane, plus it's not scalable.

    • @timotmon
      @timotmon 2 years ago +3

      @@dannybcreative Danny, I love Macs, and small-scale 3D projects are perfectly fine on that platform. FCP is exceptionally optimized for the new M1 architecture, plus the low energy consumption is phenomenal. I have to say most users would be very pleased and very impressed with the new Mac Studio. There's no reason you can't create the highest quality visual content on the new Macs. The only exception would be rendering very dense polygonal objects with high-resolution bitmaps for 3D work; PCs are exceptionally focused on performance like that, though not quite as elegant. I would love to drive an F1 car from time to time, but I'd rather have a high-end Mercedes as my daily driver.

  • @vlogsingh
    @vlogsingh 2 years ago +10

    You can't play games on the M1, but you can on a 3090

  • @pirojfmifhghek566
    @pirojfmifhghek566 2 years ago +307

    If "efficiency" is all the M1 has going for it, has anyone tried matching the M1's performance with lower grade parts? The 3050 can smoke the thing and it only draws 130w. Put them side by side and see where that gets you. Is that 30-40w of power savings worth the extra three thousand dollars you spent on a device you can never DIY repair?

    • @noturbeezwaxbeaulac1383
      @noturbeezwaxbeaulac1383 2 years ago +8

      ya, the M1 is not a gaming video card. it's an all-in-one chip that is strong for its type. if you want to compare it to anything in the PC world, start looking at motherboards with integrated graphics, not plug-in video cards.

    • @pirojfmifhghek566
      @pirojfmifhghek566 2 years ago +69

      @@noturbeezwaxbeaulac1383 That's the space the chip itself exists in, but that's not the desktop computing space. The M1 makes sense for cheap, light iPads and laptops, where space and power are at a premium, but if you have even a little leeway to add a dedicated GPU to a system, it absolutely trounces Apple's all-in-one design. And that's not good for the Mac Studio, which was designed AND MARKETED to compete directly with other desktop behemoths.
      Unfair comparison? That's the comparison Apple is trying to sell us on. And they fudged a lot of numbers to feed us that lie.

    • @gamechannel1271
      @gamechannel1271 2 years ago +48

      @@noturbeezwaxbeaulac1383 Apple decided to start the discrete GPU comparison, not us.

    • @akshobhyamanthati2304
      @akshobhyamanthati2304 2 years ago +5

      @@pirojfmifhghek566 he's talking about the Mac Studio, the one for pros, so performance matters more than power efficiency

    • @noturbeezwaxbeaulac1383
      @noturbeezwaxbeaulac1383 2 years ago

      @@gamechannel1271 Apple has been comparing its products for a long time, and that's normal. And if you do want to compare things, make a point of keeping the comparison correct. Otherwise this was good clickbait.

  • @kewk
    @kewk 2 years ago +347

    I was appalled at the real-world performance versus what they stated, so I returned mine shortly after I bought it. It essentially kept up with my RTX 2070 laptop, whereas my main PC smoked it in everything.

    • @kevinweber5129
      @kevinweber5129 2 years ago +8

      It's essentially a laptop chip doubled. If they wanted power they would be pumping more power through it and running the chip faster. It's limited just like the trash-can Mac Pro; at least they made accommodations for a real GPU there. What I think is a big loss is that you can't buy it with one Max chip and then add a second one later.

    • @trevorlafave
      @trevorlafave 2 years ago +7

      @@kevinweber5129 For video editing, music production, and photo editing, this is still one of the better options. A lot of producers use logic pro, and many creators use final cut. The thing is, most of these use cases don’t require that much GPU power. You can barely game using MacOS, so you will probably not need the extra GPU power anyway. The CPU is what makes the M1 Ultra great.

    • @jakejakedowntwo6613
      @jakejakedowntwo6613 2 years ago +4

      @@trevorlafave makes sense, the M1 is marketed as a media machine and the 3090 is a gaming card

    • @kevinweber5129
      @kevinweber5129 2 years ago +8

      @@trevorlafave I've read comments that Logic Pro plug-ins have had problems, so it doesn't work for some. I guess it's great if Final Cut is your one and only software.

    • @noturbeezwaxbeaulac1383
      @noturbeezwaxbeaulac1383 2 years ago +7

      @@kevinweber5129 jesus, it doesn't work like that. the 2 chips in the Ultra are fused together in the package. You can't "just plug in" a second one; it's not like the old dual-socket systems. Good luck making those 0.5-nanometer solder joints

  • @davidbrennan5
    @davidbrennan5 2 years ago +42

    When I was a kid Apple products were overpriced and underpowered, not much has changed in the last 35 years.

    • @reddbendd
      @reddbendd 2 years ago +1

      you can get a cracked iPhone XR for under $100 USD now though; replacing the screen yourself is 2-3x cheaper. buying an iPhone in good condition just isn't worth it

    • @charlesreid9337
      @charlesreid9337 1 year ago

      Their claim to fame was the Macintosh and their program to put computers in (upper-class) schools. When it came out it was amazing. Shortly after that everyone beat them, but they've built this sort of elitist image and turned it into a cult.

  • @Mark-yp9dl
    @Mark-yp9dl 2 years ago +29

    Clearly Apple made the comparison of their M1 Ultra chip against the GTX 1650.

    • @jasonsykes4199
      @jasonsykes4199 2 years ago +5

      Don't give Apple credit. They clearly benchmarked it against an original Gateway computer.

  • @DigitalImageWorksVFX
    @DigitalImageWorksVFX 2 years ago +298

    For desktop computers, I'm not interested in how much power they consume; I'm interested in how they perform. Apple is still far behind in raw GPU performance, but the future may be interesting.

    • @simplyruben3184
      @simplyruben3184 2 years ago +24

      power consumption should matter, especially nowadays; energy bills are higher than ever, so having a GPU that can push fast refresh rates at the cost of drawing 900 W (coughs in 4090), yeah no.. I'd rather not pay $1000+ monthly for power. efficiency matters: the more efficient a card is, the better its performance on top of reduced power bills. but I guess your family pays the bills, so you don't care

    • @llothar68
      @llothar68 2 years ago +2

      That's why we need a law for this. It's insane to have more than 1 W in standby and more than 20 W in sleep. Intel/ATX mainboards draw 20 W in standby.

    • @flashback4588
      @flashback4588 2 years ago +33

      @@simplyruben3184 maybe for the average consumer, but I think these workstations are meant for big companies with millions of dollars, where performance is more important than the electricity bill

    • @spirit9087
      @spirit9087 2 years ago +42

      @@simplyruben3184 bruh
      you're going to wait 5x longer for a 3D render on a Mac than on a PC, so you still pay about the same energy bill for the job, and if you're talking about power consumption, you also pay 5x the electricity for your monitor while you wait

    • @GUY-on-Earth
      @GUY-on-Earth 2 years ago +1

      true, but first they need 10 years of milking and then they'll start to mean what they say, by which time they'll still be behind on innovation

  • @No-mq5lw
    @No-mq5lw 2 years ago +17

    Don't forget there's the 3060 Laptop, which takes up 80W on paper (out of a max 180W), with an OpenCL Geekbench 5 score of 100903, CUDA score of 108249, and Blender scores of 1249.47 on monster, 738.999 on junkshop, and 641.888 on classroom. V-Ray CUDA is at 873.
    Sample size of 1, btw. Did I also mention that this system cost me around $900?

    • @No-mq5lw
      @No-mq5lw 2 years ago +1

      @Tevin Gigabyte G5 KD. Happened to get it on sale though.

  • @TheLonelyMoon
    @TheLonelyMoon 2 years ago +149

    I was thinking, hey you know, PCs with 3090 are probably more expensive, the marketing is off by a long shot but it doesn't sound half bad since it should be a cheaper opt---
    "Starting price: from $1999" "M1 Ultra: from $3999"
    Aight imma head out

    • @randomanimegalaxy6859
      @randomanimegalaxy6859 2 years ago +7

      it's very comparable with a 3060, so you should compare the price of a 3060 with the M1, which is way cheaper.
      The 3090 is just overkill for the M1 Ultra, and so is its price.

    • @TheLonelyMoon
      @TheLonelyMoon 2 years ago +1

      @@randomanimegalaxy6859 Yeah, I thought it was gonna be priced around 1500 to 2000 for the Ultra, not 4000 👁👄👁

    • @kiro253
      @kiro253 2 years ago +10

      Haha, do you really expect a cheap price from this company called Apple? It literally only sells premium garbage

    • @HoodedMushroom
      @HoodedMushroom 2 years ago +8

      @@randomanimegalaxy6859 Right now, looking at prices where I'm from (EU), with stock available, I could buy a full build with a 3090 Ti and a 12900K with DDR5 for 4k dollars. Apple's pricing is just ridiculous.

    • @Nyaruko166
      @Nyaruko166 2 years ago +3

      @@HoodedMushroom they sell a brand, not a product 🐸☕

  • @iancurrie8844
    @iancurrie8844 2 years ago +225

    Apple lies all the time. They claimed that the M1 ran Photoshop with 8GB of RAM better than PCs with 32GB, and a bunch of bought-and-paid-for YouTubers backed them up. They claimed that unified memory allowed an 8GB system to do miracles and that it's all you'd ever need. That's absolutely impossible. As a heavy Photoshop user, I can tell you that some projects can exceed 16GB in a single instance of the program. No amount of "unification" can make that work on 8GB of total system memory.
    Then when the higher-end M1 chips came out, those same YouTubers claimed they fixed the slow productivity of the M1, which apparently didn't exist until the higher ones launched.
    It's all a pack of lies.
    Look at Cinebench. They rank the lowly Core i5 12400 above the M1 Max.

    • @itsmeadmiral
      @itsmeadmiral 2 years ago +8

      where is your evidence? or is this just more speculation?

    • @iancurrie8844
      @iancurrie8844 2 years ago +63

      @@itsmeadmiral It's all over YouTube. Just have a look. Watch the initial M1 reviews and now the Max ones; the narrative changes. It's all lies. As for Cinebench, those benchmarks are freely available.

    • @itsmeadmiral
      @itsmeadmiral 2 years ago +4

      @@iancurrie8844 got it.

    • @priyadarsihalder9242
      @priyadarsihalder9242 2 years ago +15

      just blindly trust Apple and you will be faster than Superman himself. The thing you need most is to trust Apple blindly 🙂.

    • @cheeeeezewizzz
      @cheeeeezewizzz 2 years ago +4

      Gotta say though, for us lighter Photoshop users, Affinity Photo on the iPad is so much more approachable than Adobe

  • @RealityStudioLLC888
    @RealityStudioLLC888 2 years ago +199

    My best guess is Apple did a video editing "benchmark" that decoded/encoded from/to ProRes and had no effects whatsoever. That way the M1 Ultra gets to use all its ProRes hardware decoders/encoders, whereas the Nvidia 3090 can't.

    • @llothar68
      @llothar68 2 years ago +27

      As far as I remember, and I've watched the Apple events since 2006 when they moved to Intel, Apple has never compared GPU compute power. They always mean encoding.

    • @Teluric2
      @Teluric2 2 years ago +23

      @@llothar68 it's not the first time Apple has used deceptive marketing and lies.

    • @PanosPitsi
      @PanosPitsi 2 years ago +14

      @@Teluric2 or any other company for that matter. Remember how ray tracing would make video games look like real life, or how the PS5 would compete with 2k-dollar PCs? Or how Huawei would make an Android competitor? Or how Cyberpunk would be the game of the decade? Apple saying their GPU is good in some irrelevant test was to be expected; their GPUs have always sucked, so this is no different

    • @krellin
      @krellin 2 years ago +1

      well, to be fair, the fact that Apple's CPU and GPU are integrated makes a huge difference in real-world workflow when both are involved in whatever you're doing.
      I agree that the graph is complete crap,
      but make no mistake, no one comes even close on the final product; I can't buy any other laptop because everything else is garbage in both the software/build and performance departments.
      I have a PC with a 3090 which I use for gaming, but the Apple MacBook Pro I use for everything else, plus more and more gaming on it as well.
      I prefer the PC for ergonomics and my monitors, but it would be obsolete if I ignored those reasons and the Mac's price.
      My future setup is just a MacBook Pro, with Windows Boot Camp for gaming

    • @muthukumarannm398
      @muthukumarannm398 2 years ago +4

      @@krellin what you said doesn't make sense. I read it twice.

  • @markdalbey
    @markdalbey 2 years ago +16

    The graph says Apple uses 200 watts less than the RTX 3090. So? I live in Las Vegas. The cost difference in electricity per year, at 8 hours per day, 365 days a year, would be $70.95. In Hawaii, which has the highest electricity price in the industrialized world, you would save $221.45. While significant to me, I am not spending $4,000 on a computer or $2,200 on a graphics card, so that cost of electricity probably is not a factor if you are in the industrialized world. It might be a factor in the Solomon Islands, which has the highest electricity rate in the world; the difference there would be $578.16. If you live anywhere other than the Solomon Islands, I don't see the electricity savings being a factor in your decision-making. As for fan noise: my A/C is a lot louder than any computer fan. I just wear noise-canceling headphones. They make everything really quiet, and I get to listen to B.B. King as a bonus.

    • @tjcarr70
      @tjcarr70 2 years ago

      What you should consider is where that power comes from. If it's a clean and efficient source, then no problem. If people thought more about the environment, this would be a major factor, especially if you come from a third-world country like the UK, where people cannot afford to put food on the table….

    • @RunForPeace-hk1cu
      @RunForPeace-hk1cu 2 years ago

      you are talking about ONE Mac Studio.
      Corporations buy them for data centers and all around their offices ...
      It's NOT about you. When Apple made that presentation, they were talking to corporate executives, not you 😂
      There are companies buying racks full of Mac Minis ... replace them with Mac Studios ... BAM ... money saved.
      You just don't get it because you are a simpleton.

    • @mikldude9376
      @mikldude9376 2 years ago +1

      @@tjcarr70 if you can't afford to put food on the table, then you won't be worrying about high-priced computers, will you.

    • @TrioLOLGamers
      @TrioLOLGamers 2 years ago +1

      Same issue here: a computer doesn't always draw that many watts, only when it's working hard; on light tasks it sometimes draws barely a watt. That's why some cheap, laggy Windows/Linux laptops on Amazon last longer on battery than a PC with a 3080.
      The reason you buy a big power supply is not that it will constantly drain that many watts, but that there are peaks, and when they come, an undersized supply causes CPU/GPU throttling; a cheap one can even die... Believe me, it happens, and it can even damage your RAM or your SSD. That's why everyone said Apple made a big mistake with the M1 Ultra's power supply: it's cutting it too close and isn't isolated from everything else.
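
The cost estimate a few comments up can be reproduced with simple arithmetic. A sketch assuming illustrative electricity rates (roughly $0.12/kWh and $0.38/kWh, chosen to land near the quoted dollar figures; actual rates vary):

```python
# Rough reconstruction of the electricity-cost estimate above: a 200 W
# difference, running 8 hours/day, 365 days/year. The $/kWh rates here
# are assumptions picked to roughly match the quoted dollar figures.

def yearly_cost_usd(extra_watts: float, hours_per_day: float,
                    rate_per_kwh: float) -> float:
    """Yearly cost of an additional constant power draw."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

print(round(yearly_cost_usd(200, 8, 0.12), 2))  # ~$70: a Las Vegas-like rate
print(round(yearly_cost_usd(200, 8, 0.38), 2))  # ~$222: a Hawaii-like rate
```

The 200 W gap works out to 584 kWh per year at this duty cycle, so the dollar savings scale directly with the local rate.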

  • @OrestisRovakis
    @OrestisRovakis 2 years ago +127

    for power consumption you should also consider how long that power is drawn. If the RTX system needs 30 minutes at 450 W, that's the same energy as a system that takes 5 times longer at 5 times less power, which is close to what we saw here.

    • @lyte561
      @lyte561 2 years ago +2

      Doesn't mean you game less if you get more fps

    • @piotrj333
      @piotrj333 2 years ago +8

      There are 2 issues. First, Nvidia already runs on the edge of efficiency; a stock RTX Ampere card is basically an overclocked Ampere. I have a 3070 Ti and, without any unstable tricks, I dropped the TDP to 75% yet performance was still 96%. I also tested all the way down to 40% TDP and found the 3070 Ti's best efficiency per watt at 55% TDP. I know from Puget Systems that this is a common thing to do when building render farms. If you do more aggressive tricks like undervolting, efficiency gets even better.
      Second, Ampere is an old 8nm process, more than 2 years old. The RTX 40 series is coming soon and will be much closer in release date to the M1 Ultra than the M1 Ultra is to the RTX 3090.

    • @theyoutuber273
      @theyoutuber273 2 years ago +10

      No one buys a top-end desktop for efficiency. I can understand efficiency on a laptop, where you have a battery, but here it's nonsense.
      Also, efficiency isn't linear with power, meaning that even if you gave the M1 Max the same wattage as the RTX 3090, it wouldn't have anywhere near the RTX 3090's performance.

    •  2 years ago +1

      and then there is the saying "time is money"

    • @Funcle
      @Funcle 2 years ago

      @@lyte561 nobody buys Apple for games

  • @jonathanodude6660
    @jonathanodude6660 2 years ago +95

    small correction at 2:59: the score is 244% *of* the Mac score. It's not 244% *faster*, because it's only 144% "faster". Using percentages to compare values like this is pretty confusing to most people; remember you can never be more than 100% less without using negatives. It's probably better to use the larger value as 100% and say something like "x is y% of z" rather than "z is w% better than x" for clarity, (edit:) like you did at 3:35.

    • @Digipiction
      @Digipiction 2 years ago +8

      I'd always use multipliers when talking about differences. 244% means it's about 2.5x as fast, while interestingly "144% faster" at first sight doesn't seem like much, even though it's correct.

    • @ameera8423
      @ameera8423 2 years ago +3

      @@Digipiction yeah, percentages can be confusing to a lot of people, especially when the video switches between "x% faster/slower" and "x% of..." multiple times throughout

    • @SuperJoshio
      @SuperJoshio 2 years ago

      I agree. This is exactly why I wanted to comment. For example, in the V-Ray test the 3050 is listed at 113.46% faster, BUT in reality it is only 13.46% faster (550.3 compared to Apple's 485). That's because of the way the comparison is presented. Very annoying and confusing.

    • @papiezguwniak
      @papiezguwniak 2 years ago +1

      Yeah, this channel is so unprofessional.

    • @EDENTGP
      @EDENTGP 2 years ago +1

      @@papiezguwniak go get an m1 ultra
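
The "percent of" versus "percent faster" distinction this thread is making can be pinned down in a few lines; the scores are the V-Ray numbers quoted above:

```python
# The distinction argued in this thread: "X% of" vs "X% faster".
# Scores below are the V-Ray numbers quoted above (550.3 vs 485).

def percent_of(a: float, b: float) -> float:
    """a expressed as a percentage of b."""
    return a / b * 100

def percent_faster(a: float, b: float) -> float:
    """Percentage gain of a over b."""
    return (a / b - 1) * 100

print(round(percent_of(550.3, 485), 2))      # 113.46 -> "113% of" the Mac score
print(round(percent_faster(550.3, 485), 2))  # 13.46  -> only "13% faster"
print(round(percent_faster(244, 100), 2))    # 144.0  -> "244% of" == "144% faster"
```

The two functions differ only by the subtracted baseline, which is exactly why mixing the two phrasings in one video is confusing.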

  • @nikhilpaleti3872
    @nikhilpaleti3872 2 years ago +100

    This needs to reach everyone.
    Apple literally has nothing but lies in its arsenal: weaker than a 3050, but crying that they beat the 3090.
    Just like they lied that the M1 could beat an i9 when it would struggle to beat a Ryzen 5.
    No one cares about a 10W power draw when you get no performance, have to give up x86 compatibility, and lose ALL upgradability for it

    • @Busy-B.
      @Busy-B. 2 years ago +16

      Not to mention, if your power consumption is 5x lower but you need 5x the time for the same tasks, it comes out to nothing.

    • @RiceCubeTech
      @RiceCubeTech 2 years ago +3

      The issue is that video editors vastly outnumber other professionals, so it's a loud majority who act like the chips are bleeding edge, because for normal NLE work Final Cut and the M1 chips are extremely fast and efficient. But that's mostly due to the TWO video decoder blocks where most chips have one.
      It falsely leads "pros" in other fields to think it's somehow magically faster. But for any 3D rendering, you'd be better off with any Nvidia card.

    • @nikhilpaleti3872
      @nikhilpaleti3872 2 years ago +4

      @@RiceCubeTech Nah, it's just Mac fans that outweigh everyone's arguments.
      Even for video editing, remove the advantage that specific Apple applications get, and they aren't sky-high either. It's still much more on par with a 3060 or so than with a 3090.
      And of course, the point about anything outside video editing stands.

    • @darealduck6945
      @darealduck6945 2 years ago

      don't have a Mac, but: that 10W power draw is essential to MacBook users, and I have seen videos of how users love how the M1 sips power compared to the power-hungry i9 MacBook Pros of yesteryear, and how the M1 outpaces the i9 laptops in video work and rendering

    • @Rave_Etherメ
      @Rave_Etherメ 2 years ago +1

      let's not forget that the RTX 3090 costs about $2,100 while the Apple is about $4,000; it would take a long time to recover that amount through Apple's energy savings

  • @eneveasi
    @eneveasi 2 years ago +28

    Apple seems to power-throttle their machines rather than building real thermal headroom into the product: efficient performance at a low point on the relative benchmark curves. That ends up being useless for anyone with anything major to get done

    • @Byronic19134
      @Byronic19134 2 years ago +2

      I mean, Apple has clearly figured out something about thermal design, because their most popular unit at the moment is the MacBook Air, which doesn't even have a fan.

    • @julian5857
      @julian5857 2 years ago +2

      @@Byronic19134 they just put a very efficient but not very strong chip in there

    • @PanosPitsi
      @PanosPitsi 2 years ago +8

      It's an ARM chip; it's basically an iPhone chip on steroids

    • @PanosPitsi
      @PanosPitsi 2 years ago +4

      @@julian5857 the average user won't even be able to use its true potential, but consider that people who actually need to 3D render must use Nvidia exclusively anyway, because most 3D rendering apps are CUDA-exclusive

    • @m.l.9385
      @m.l.9385 2 years ago

      @@julian5857 Yeah, but the M1 in the Air can actually burn more watts, in other words has a higher TDP, than its former lousy 2-core Intel counterpart (30W instead of 13W), without having a fan anymore. So they did improve the thermal design of this model drastically. Even the new MacBook Pro 16" is thicker than its Intel counterpart; Apple choked all the Intel CPUs close to their thermal death. So yes, they did improve thermal design in this generation. And it is indeed the case that they would rather let the new M1 Ultra throttle than ramp up the fans, because they want it "absolutely" noiseless.

  • @emberparadox458
    @emberparadox458 2 years ago +10

    I sat in on an Apple presentation to retailers, and they oh so loved talking about how great the graphical performance was on the M1. However, when it came time for the Q&A, I actually asked how the M1 performs in comparison to Nvidia's RTX series, since they went on and on about how great it was for gaming. Their response? The M1 is fantastic for games in Apple Arcade... and that's all they said about it. They immediately pivoted away from my question and refused to answer any more of them. Between how they are assembling their devices and what the actual capabilities are, Apple is turning into a snake-oil salesman that is somehow managing to con millions upon millions of people into thinking their PC products are soooooo high quality and soooooooo much better than Windows PCs that perform better at half the cost.

    • @theTechNotice
      @theTechNotice 2 years ago +5

      Makes you think, doesn't it? 🤔

    • @racistpixel1017
      @racistpixel1017 2 years ago +2

      Apple knows that with the M1 they dug themselves into a deep hole. AMD is steadily approaching the M1's efficiency and performance, Intel has plans too, and in terms of process nodes the others have tons of room to grow, whereas Apple can only shrink transistors so much further. Apple is in deep trouble; it's just not obvious to the average consumer yet

  • @rubik1452
    @rubik1452 2 ปีที่แล้ว +81

    Seems like Apple wants people to spend huge money on a processor that performs worse than cheaper ones.
    What an amazing company.

    • @matthewgamman4303
      @matthewgamman4303 2 ปีที่แล้ว +7

      yah because most of the people that buy their products would just go "oh, apple says its good, im gonna buy"

    • @rubik1452
      @rubik1452 2 ปีที่แล้ว +1

      @@matthewgamman4303 Well, I don't understand them, dude. Maybe they just follow the crowd's decision.

    • @justinlikesme19
      @justinlikesme19 2 ปีที่แล้ว +1

      @@matthewgamman4303 Because people in Western countries have no problem affording an Apple product, especially in Japan and Korea.

    • @ChibiNaruto
      @ChibiNaruto 2 ปีที่แล้ว +2

      Isn't that what they have been doing all along? They make people pay more than something should cost while other brands offer products that work like a charm for less. The price-performance ratio of Apple products was never good, in my opinion.

    • @Teluric2
      @Teluric2 ปีที่แล้ว

      The teenager company.

  • @fingerhorn4
    @fingerhorn4 2 ปีที่แล้ว +14

    This is why virtually no-one ever uses Apple machines for things like flight simulation which requires a high-end CPU for processing and a very high powered GPU for 3d graphics. Windows plus a decent graphics card absolutely slays any Apple machine. This by extension also rubbishes the claim Apple constantly makes about its machines being the best for video editing and similar functions. A good Windows system with a decent graphics card, at typically half the cost, makes mince meat of Apple. They may be fashionable machines but they have never been anywhere near as fast processing graphics.

    • @digvijaybokey3585
      @digvijaybokey3585 2 ปีที่แล้ว

      i mean their laptops smoke every other laptop when it comes to display, battery life, and power for said battery life rn. Their speakers are also insanely amazing.

    • @maharaj8460
      @maharaj8460 2 ปีที่แล้ว +3

      @@digvijaybokey3585 They don't smoke nothing. If battery power and display is the only thing important to you then you should really change your standards

    • @camilandtati
      @camilandtati 2 ปีที่แล้ว

      I think you nailed it here. I’m a big Apple fan, but, nothing will beat my i9 10900 & RTX3090 running MSFS2020 with ultra settings. Case in point, why hasn’t Apple produced a system that can smoothly run a similar flight simulator in VR equivalent to the Aero yet?!

    • @eclipticpath
      @eclipticpath 2 ปีที่แล้ว +1

      Not graphics, but for video editing you use encoders and decoders. Mac Studio is the only consumer computer that supports 18 streams of 8K ProRes 422 video.
      Plus Macs come with better support for media workflows, whether it’s editing ProRes video or bouncing mp3. On windows you often need to install bunch of add-ons like LAME encoder etc. to get stuff to work. Then apps like Davinci Resolve, which is one of the most popular color grading and video editing app, won’t even support ProRes on Windows which is a bummer since that’s the the typical format for professional video cameras like Black Magic. And if you’re ever gone to a gig, it’s always a Mac. For audio processing, there’s no good native support for PC so you have to use ASIO4ALL. Even then, it’s not as good as Mac. Or for music production, Macs’ midicore is way better than PC. Trying to get midi devices to work with windows is hit and miss. Then even something as simple as trackpad don’t work well on PC. Like you can’t even drag something with your mouse and scroll the timeline/change desktops with your trackpad at the same time on PC. Basically, what’s a super useful tool for gestures, scrolling, navigating, zooming, clicking when dragging, swiping etc. on Mac is practically unusable on Windows. Like how do you click a folder or change a desktop when dragging files?
      It’s not that Macs are better for media stuff, I personally just prefer them. I use PC for gaming and server for Plex, homebridge and arr services, but Macs for everything else, since everything just works and works well.

    • @VapidSlug
      @VapidSlug 2 ปีที่แล้ว

      @@digvijaybokey3585 Um... no, they don't. If you take the substantial price of a MacBook and instead buy a custom build like a Eurocom, it will absolutely annihilate anything Apple can build. If you are in the market for a powerful laptop, basic physics and chemistry won't get you out of using up battery.

  • @RenormalizedAdvait
    @RenormalizedAdvait 2 ปีที่แล้ว +89

    84.5*AVG(489.13%, 580.29%, 510.95%) = 445 W, i.e. performance per watt is almost the same across Nvidia and Apple. The only catch is that Nvidia uses an 8 nm process while Apple uses the vastly superior 5 nm process, which makes Nvidia's GPU architecture light years ahead of Apple's.
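    The commenter's arithmetic can be sketched as follows; the 84.5 W GPU draw and the three relative scores are the comment's own numbers, not official figures:

```python
# Sketch of the comment's estimate (all figures are the commenter's):
# take the M1 Ultra's GPU draw (~84.5 W) and multiply by the average
# factor by which the RTX 3090 outscored it in the video's benchmarks.
m1_gpu_watts = 84.5
speedups = [4.8913, 5.8029, 5.1095]  # 489.13%, 580.29%, 510.95% as factors
avg_speedup = sum(speedups) / len(speedups)
equivalent_watts = m1_gpu_watts * avg_speedup
print(round(equivalent_watts))  # ~445, roughly a 3090's real-world draw
```

    In other words: scaling the M1 Ultra up to 3090-level performance at its measured efficiency would cost about as many watts as a 3090 actually burns.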

    • @maximseryakov2373
      @maximseryakov2373 2 ปีที่แล้ว

      What? Why is it ahead?

    • @mariuspuiu9555
      @mariuspuiu9555 2 ปีที่แล้ว +38

      @@maximseryakov2373 because it achieves similar perf/W using a much less efficient process node.

    • @Knightfire66
      @Knightfire66 2 ปีที่แล้ว +16

      Yes, and imagine Nvidia switches to 6nm or 4nm... it will be about 200% faster, so it would be 800-1000% faster. W/fps AND $/fps are way better. Apple should beat the RTX 3090's little mini brother first... the 3050 is way better than the M1, and still only $300.

    • @andrewmicro16123
      @andrewmicro16123 2 ปีที่แล้ว +4

      Another thing to consider is the M1 cannot simply scale up to match Nvidia's performance. There comes a point of diminishing returns which points to Nvidia's architecture just being simply much superior to Apple's.

    • @mariuspuiu9555
      @mariuspuiu9555 2 ปีที่แล้ว +2

      @@andrewmicro16123 It is, after all, just a mobile architecture retrofitted to work in a workstation.

  • @Anthony-nj2mz
    @Anthony-nj2mz 2 ปีที่แล้ว +12

    Although I am a big Apple fan, I really do wish that they would correctly recognize their competition and try better to surpass them and not focus more largely on marketing

    • @talkysassis
      @talkysassis 2 ปีที่แล้ว

      All their good hardware designers work for HP or IBM now. The M1 is just an ARM SoC; they bought the blueprint from Arm and put some VCUs inside. (You can't make an ARM SoC without using their blueprints, as it is not open.)
      ARM dev kits are monsters when compared to the M1 if you look at raw CPU performance.

  • @ThinLineMedia
    @ThinLineMedia 2 ปีที่แล้ว +16

    Apple always lies - not a big deal

  • @opticalip1
    @opticalip1 2 ปีที่แล้ว +2

    @9:29 I think you are misinterpreting the wording of the chart. It's performance relative to power usage, so it's most likely comparing performance at the same power draw. If you kneecapped the GPU's power usage, it would most likely be more in line with what Apple is presenting.

  • @Golemisoptics
    @Golemisoptics 2 ปีที่แล้ว +58

    What we are seeing here is absolutely expected. You cannot beat physics. This is Samsung 8N vs TSMC 5N, and as we can see, perf per watt is way better on Apple silicon, but in absolute perf the RTX is destroying it. We as consumers benefit from this level of competition.

    • @aviroblox6624
      @aviroblox6624 2 ปีที่แล้ว +7

      Do the math, consuming a fifth the power doesn't make it more efficient if it's also a fifth the performance...

    • @Golemisoptics
      @Golemisoptics 2 ปีที่แล้ว +1

      @@aviroblox6624 M1 ultra is 21tflops with 100-120w(?) max power draw (gpu) and rtx3090 is 35.58tflops with 300-320w power draw. So 50% more efficient which seems about right.

    • @PanosPitsi
      @PanosPitsi 2 ปีที่แล้ว +2

      @@aviroblox6624 if you take the 3090 machine and force it to use as much power as the mac then apples graph is true. This means nothing for a desktop but imagine this monster on a bulkier laptop or a tablet, we are talking about a tiny apu dealing blows with a top of the line gpu

    • @PanosPitsi
      @PanosPitsi 2 ปีที่แล้ว +2

      @@aviroblox6624 also apples on arm now so it’s really hard for them to make high power draw gpu on that architecture, arm is good for low power and good performance not quite extreme. Again I don’t see the appeal of having an arm chip on a desktop, they should be reserved for laptops tablets and maybe a Mac mini that is the size of a bowl

    • @Bramagon
      @Bramagon 2 ปีที่แล้ว +1

      @@Golemisoptics You can't compare teraflops between different architectures. It isn't that simple.

  • @Tossen98
    @Tossen98 2 ปีที่แล้ว +10

    2:55 just a measurement detail, but if you use M1 as your 100% value, then if the 3090 scores 244% of the M1 score value, then it isn’t 244% better, it’s (244 - 100)%, so 144% better.
    Y=2.4X => Y-X = 1.4X
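    The arithmetic in this comment, as a tiny helper (the 244% and 489% figures are the relative scores quoted elsewhere in the video):

```python
# A relative benchmark score of 244% against a 100% baseline means
# 144% faster, not 244% faster - subtract the baseline first.
def percent_faster(score_pct: float, baseline_pct: float = 100.0) -> float:
    """How much faster a relative benchmark score is than its baseline."""
    return score_pct - baseline_pct

print(percent_faster(244.0))  # 144.0
print(percent_faster(489.0))  # 389.0
```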

    • @unknownstrike2949
      @unknownstrike2949 2 ปีที่แล้ว

      you are genius lol
      When I say 5
      then will it become 3 more than 2? or 2 more than 3?

    • @zb8939
      @zb8939 2 ปีที่แล้ว

      I'm not gonna rewatch it again to try to prove this, but I think what he put on the screen wasn't 244% better, just that it runs 244 vs the Mac m1. Could be wrong but that's what I got out of it. So you're right, but it's redundant

  • @omarspost
    @omarspost 2 ปีที่แล้ว +25

    A professional's hourly rate dwarfs the cost of the electricity used by the Mac.
    It's common sense to compare productivity against time, not watts.
    This is not a laptop. Performance per watt is only a priority on portable devices, due to battery capacity.
    As fantastic as Apple products are, they sometimes shoot themselves in the foot for no reason.

    • @maharaj8460
      @maharaj8460 2 ปีที่แล้ว

      They are like we consume less power and make 2db less sound. Like ok bud no one cares about that.

  • @Prime_Rabbit
    @Prime_Rabbit 2 ปีที่แล้ว +23

    Apple lying about their products' performance by an insanely large margin? I'm shocked. Shocked I say, SHOCKED!

    • @digit313
      @digit313 2 ปีที่แล้ว +1

      nvidia 8nm vs Apple 5nm not fair.
      Bring RTX 4x00 5nm vs Apple m1 in August 2022.

    • @gn1l262
      @gn1l262 2 ปีที่แล้ว +1

      @@digit313 Apple is literally the one claiming to be stronger.

  • @reinhardtwilhelm5415
    @reinhardtwilhelm5415 2 ปีที่แล้ว +14

    This is nothing new. Apple has been gaslighting consumers for almost their entire existence as a company, and it should come as no surprise that they’d sell you an “RTX 3090 killer” that loses to the RTX 3060 across the board.
    I could make a PC that beats this thing for $1500. That’s a disgrace.

  • @Crowdrender
    @Crowdrender 2 ปีที่แล้ว +4

    The Benchmark that you could try running would involve encoding/decoding 4K and 8K video. You mentioned this yourself in the video that there are 'better' or 'more' encoders in the M1, this might be the area where Apple could claim better performance, but we'll await your benchmarks to say for sure!

  • @Tigerex966
    @Tigerex966 2 ปีที่แล้ว +31

    You just saved a lot of people who are into 3D thousands of dollars.
    A simple $1,599 RTX 3050 build will outperform an $8,000 Mac Studio Ultra 64-core beast build in 3D. That's just amazing.
    And it will give it a good fight with Intel Quick Sync in video editing in many areas, and the 12900K CPU will beat it in single- and multi-core in Cinebench, which exercises all the threads in multi-core.

    • @breechcomet9724
      @breechcomet9724 2 ปีที่แล้ว +3

      Wait, even a 3050 can destroy the Mac Studio? 😮 So PCs still offer better performance for the price compared to a Mac, I guess? 🤔

    • @Tigerex966
      @Tigerex966 2 ปีที่แล้ว +5

      @@breechcomet9724 Apple is the king of hype.
      That's a Mac Studio Ultra with 64 GPU cores and 64 GB of RAM.
      A PC with the 12900K, an RTX 3050, a PCIe 4 SSD, and the same RAM in a case with good cooling will beat the Mac.
      It will lose some video editing where Apple's media engine encoders come into play (in M1-optimized video editing using those encoders it's a beast), but Intel will win a few with Quick Sync.
      And it will win most 3D and games.
      Software optimization is the key for both.
      And newer, twice-as-fast GPUs are not far away this fall, at $2,000 to $5,000.

    • @llothar68
      @llothar68 2 ปีที่แล้ว +3

      @@hollyc5417 People into 3D follow the advice of Linux Tech Tipps 🤣

    • @super_hero2
      @super_hero2 2 ปีที่แล้ว

      Well you also have to talk about the package as a whole though. It is like 20x smaller than the PC tower.

    • @llothar68
      @llothar68 2 ปีที่แล้ว +1

      @@super_hero2 And every office still has space for a 20x larger box. I get the idea of smallness for mobile things, but otherwise, when we have space, big is beautiful. And it's always so much easier to repair big stuff than to have the motor skills to fiddle with small screws (yeah, the screws that hold NVMe drives are insane).

  • @Tigerex966
    @Tigerex966 2 ปีที่แล้ว +21

    Truth is always appreciated.
    Apple's wins are the dual media encoders/renderers, the graphics memory potential, power draw, noise, and size.
    Those can best be taken advantage of by video editing apps.
    Not 3D: there they are at about a 3060's power, even in the $5,000 Ultra Studio version.
    If you actually play Tomb Raider you will get pauses and glitches not seen in the benchmark.

  • @wrenvfx
    @wrenvfx 2 ปีที่แล้ว +7

    I appreciate the production quality of ur videos because I want to be an editor for a channel like this 1 day :)

  • @1spark
    @1spark 2 ปีที่แล้ว +5

    Didn't the graph say "highest end *discrete* GPU", and the RTX 3090 GPU shown is a dedicated one. The RTX 3090 laptop GPU would have compared better. Or am i misunderstanding something?

    • @TacohMann
      @TacohMann 2 ปีที่แล้ว +3

      Discrete and dedicated mean the same thing in this context bud.

    • @1spark
      @1spark 2 ปีที่แล้ว +3

      @@TacohMann But they have vastly different power and efficiency levels...
      Please explain.

    • @brychaus9059
      @brychaus9059 2 ปีที่แล้ว +2

      They just mean the same thing in general. Dedicated desktop GPU's are called discrete too.

    • @niranjanr.m2763
      @niranjanr.m2763 2 ปีที่แล้ว

      there is no rtx 3090 laptop gpu

  • @DC9V
    @DC9V 2 ปีที่แล้ว +9

    3:30 Actually, it's 244% of the M1, meaning that it's (244% - 100%) = 144% faster.

    • @tanzeelsalfi
      @tanzeelsalfi 2 ปีที่แล้ว

      But why would you subtract it? Per Apple it's just "244% more powerful than the M1 Ultra".

    • @tanzeelsalfi
      @tanzeelsalfi 2 ปีที่แล้ว

      Then according to your logic the M1 Ultra is 0% powerful compared to the RTX 3090.

    • @EHouseFreak
      @EHouseFreak 2 ปีที่แล้ว

      Yeah, I think so too... and the RTX 3050's "-28% slower" is wrongly calculated if the M1 is the 100% baseline.

    • @Sasoon2006
      @Sasoon2006 2 ปีที่แล้ว

      Exactly

  • @williamcousert
    @williamcousert 2 ปีที่แล้ว +30

    I'm glad I saw this before running out to buy a new Mac Studio. My $1200 MSI laptop outperforms the best Apple silicon currently on the market.

    • @aviatedviewssound4798
      @aviatedviewssound4798 2 ปีที่แล้ว

      Facts, man. What did you think was gonna happen when Apple doesn't know how to make a good iGPU? But at least their CPUs win in the power-efficiency department.

    • @gytispranskunas4984
      @gytispranskunas4984 2 ปีที่แล้ว +5

      Yes, and pretty easily. Any 3060 laptop destroys this M1 in entertainment, usability, and workflow.

    • @jozelavric4259
      @jozelavric4259 2 ปีที่แล้ว +2

      @@gytispranskunas4984 no it doesnt lmao

    • @gytispranskunas4984
      @gytispranskunas4984 2 ปีที่แล้ว

      @@jozelavric4259 how much fps your apple gets in cyberpunk ? Oh sorry it doesn't even run it.

  • @imnoodlehaus
    @imnoodlehaus 2 ปีที่แล้ว +109

    The performance per watt is ridiculous on the M1 Ultra. Unfortunately, for 3D workstations and gaming, power consumption is not really a concern. If we're measuring this in terms of performance per watt, then yes, M1 Ultra beats the 3090. But in terms of actual unbound performance? 3090, no contest. I wish Apple would allow more thermal headroom in their configurations.

    • @Beofware
      @Beofware 2 ปีที่แล้ว +10

      Apple needs to make server hardware. Actually no... scratch that. I'm sorry for even bringing that out of the aether.

    • @fenixyoutubo
      @fenixyoutubo 2 ปีที่แล้ว +1

      In my opinion, with Metal Apple is aiming for performance at low wattage, so it will be up to programmers to change the measurement method; after all, benchmarks are software.

    • @Teluric2
      @Teluric2 2 ปีที่แล้ว +14

      @@Beofware who is gonna buy soldered disposable hardware?

    • @Beofware
      @Beofware 2 ปีที่แล้ว +6

      @@Teluric2 What I was getting at is that enterprise systems generally looks for the best performance-per-watt in their hardware.. it was a joke. No one in their right mind would. That's why I said "Actually no... Scratch that."
      Did you even read my comment?

    • @tkpenalty
      @tkpenalty 2 ปีที่แล้ว +16

      Performance Per Watt, the M1 ultra matches the 3090.
      The m1 ultra is 4x slower on raw compute, and uses 4x less power. However for non-parallel tasks the M1 ultra is even slower than the 3090 because the GPU dies are not directly connected.
      The only area where Apple trounces everyone is in certain encode tasks, due to the ASICs they have built in.

  • @maxmakesfilms69
    @maxmakesfilms69 2 ปีที่แล้ว +9

    When you consider "Discrete GPU", they're likely comparing to Intel's GPU on CPU offerings, not Nvidia's RTX range.

    • @Bonekinz
      @Bonekinz 2 ปีที่แล้ว +5

      Discrete GPU means a GPU separate from the CPU, and since Intel's Arc isn't out yet, if they're comparing against integrated graphics they should note that.

  • @nickjacobsss
    @nickjacobsss 2 ปีที่แล้ว +5

    Not to defend Apple's terrible marketing graphs, but the graph doesn't state that the Ultra is better than the 3090; it shows that it outperforms the 3090 watt for watt. Obviously a flagship dedicated GPU that uses 6-7x the power is going to get higher peak numbers.

  • @NCPhotography
    @NCPhotography 2 ปีที่แล้ว +10

    Did you know walking is more energy efficient than driving a car? I guess I'm going to spend the next 6 hours walking 20 miles to work.

    • @theTechNotice
      @theTechNotice  2 ปีที่แล้ว

      🤣

    • @jsullivan10
      @jsullivan10 2 ปีที่แล้ว

      Lol 😂 , yeah apple 🍏 smh….And I’m an apple fan 👦

    • @Noobmaster._.69
      @Noobmaster._.69 2 ปีที่แล้ว

      Lol nice one

  • @echohotelsix
    @echohotelsix 2 ปีที่แล้ว +2

    me: "i have exactly one of those hardware in the video"
    friend: "oh, which one"
    me: "the power socket meter"

  • @theneonbop
    @theneonbop 2 ปีที่แล้ว +1

    They labeled the axis relative performance, not overall performance

  • @matthewjessey12
    @matthewjessey12 2 ปีที่แล้ว +13

    That graph was showing what happens when you force the 3090 to use 200 fewer watts. It was comparing efficiency, not overall performance.

    • @KnightHasen
      @KnightHasen 2 ปีที่แล้ว +1

      Efficiency also needs to be compared against operation time in a rendering workflow. The 3090 at full tilt uses almost twice as much power but gets the job done in less than half the time, meaning it's not only more efficient at getting a given job done, but also frees up the 3090 for another job before the M1 Ultra is half done.

    • @matthewjessey12
      @matthewjessey12 2 ปีที่แล้ว +5

      @@KnightHasen This is very true, but they didn't necessarily lie; they only showed one side of the data. It happens constantly. The OP just kind of misunderstood the intent. They didn't outright lie; he just didn't set the parameters to properly test their claim, or really understand what they were trying to say.

    • @raak23
      @raak23 2 ปีที่แล้ว +1

      Yes, this. People are standing in line to bash Apple for what literally every other commercial company does: manipulate data so that it looks in your favor. They're not lying, just trying to sell their product. I'm not a big fan of this tactic, however, virtually EVERY company does this. Bottom line is: choose the hardware that does the job for YOU. I personally don't mind sacrificing some performance, in exchange for some user experience.

  • @Lazzerman42
    @Lazzerman42 2 ปีที่แล้ว +6

    So what Apple says, without really saying it, is that at the SAME power consumption the Apple is faster... but they word it in a way that makes us hear "twice the performance at half the power draw"... tricky Apple as usual.

  • @starshipupdates5217
    @starshipupdates5217 2 ปีที่แล้ว +12

    The M1 Ultra has two major problems that no one is talking about: a wattage cap and a data-transfer bottleneck. This even applies to the M1 Max. The M1 Ultra's data lane isn't fast enough to feed the GPU and CPU with data; it's the equivalent of giving an F1 car regular fuel. It will only perform at 40-50 percent usage. When the M1 Ultra is completing a render or doing a demanding task, you will see GPU usage under 50 percent, because data can't get to the GPU fast enough.
    This shows up in the wattage: the GPU's on-chip TDP is 120 watts, yet we see the M1 Ultra using 60 watts in some cases, which means the chip isn't being utilized to its full potential. If we treat watts as a proxy for utilization, 60 watts is equivalent to about 50 percent GPU usage. Data blocking is also a problem here: sometimes the GPU will drop to idle or under 10 percent usage (12 watts), which hurts its scores. Essentially, it's like packaging items on a slow conveyor belt - sometimes you sit there waiting for products to come through, just like the M1 Ultra waits for data so it can be processed.
    In this video you failed to provide any information about clock speed, GPU percentage, and CPU percentage, which would help with analysis of the scores. We also need to consider emulation in some of these tasks, which decreases performance by 30 percent on average, and in some cases over 50 percent if the translation encoders are pushed past 100 percent. If the GPU is at 50 percent usage and the emulation is a further 50 percent bottleneck, performance would be cut to roughly a quarter in Rosetta 2 emulated benchmarks.
    M2 chip:
    Apple is aware of this issue but obviously wouldn't publicly announce it. The M2 should solve these problems, not only with better performance per core but also by fixing the data channels so the GPU and CPU can be used at 100 percent, not 50 percent.
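    The utilization-from-wattage estimate in the comment above can be sketched directly; the ~120 W on-chip TDP and ~60 W observed draw are the commenter's figures, not measured here:

```python
# Treat observed power draw as a rough proxy for GPU utilization,
# as the comment does (a simplification - clocks and load vary).
tdp_watts = 120.0       # claimed on-chip GPU TDP for the M1 Ultra
observed_watts = 60.0   # draw seen during a demanding render
utilization = observed_watts / tdp_watts
print(f"GPU utilization estimate: {utilization:.0%}")  # 50%
```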

    • @PanosPitsi
      @PanosPitsi 2 ปีที่แล้ว +2

      Ehh, it's a first-gen product: problematic and expensive. Remember how bad DLSS was when it came out? I'm an Apple user, but I'd never pay more than 1k for a desktop that can't even play a video game that isn't League of Legends. Right now Apple's desktop chips are only good for laptops and for value (ironically, yes, it's true 😂)

    • @starshipupdates5217
      @starshipupdates5217 2 ปีที่แล้ว +2

      @@PanosPitsi haha apples value for money is terrible, even if by some miracle Apple fixes the bottleneck and gets all games to run on macOS native, we still face problems such as no ray tracing, no dlss and the requirement of spending a lot of money to get Apple hardware. It’s now been confirmed that apples m2 chip isn’t really next gen at all. The only difference between m1 and m2 will be higher clocks and more cores. It’s not confirmed if the data bottleneck will be fixed yet so perhaps we could see a huge performance improvement even though we would still be on the same nm node (5nm)

    • @PanosPitsi
      @PanosPitsi 2 ปีที่แล้ว

      @@starshipupdates5217 I got a Mac mini for 650 euros on release and it hasn't lagged since, even though I've been running engineering software on it. I was a PC guy, but last year the prices were insane.

    • @PanosPitsi
      @PanosPitsi 2 ปีที่แล้ว

      @@starshipupdates5217 The point of Macs is to get a computer that is work-exclusive, and a console if you want games. I got a Mac mini and a PS5 and it came to around 1,100 euros. At this price you can't really complain, especially during the pandemic.

    • @PanosPitsi
      @PanosPitsi 2 ปีที่แล้ว +1

      @@starshipupdates5217 Also, you are looking at this the wrong way: Macs are specifically designed not to be used for gaming. And it makes sense; look at those glass mice and tiny keyboards, and they didn't even have high refresh rates until recently. There is more to a machine than the OS and the chip.

  • @wiffmastermase
    @wiffmastermase 2 ปีที่แล้ว

    Apple claiming they can defeat Nvidia is like listening to anyone who owned a PS4 talk about how important your refresh rate is.

  • @adamdeane4675
    @adamdeane4675 2 ปีที่แล้ว +1

    Mac for creators, PC for consumers. This is nothing new. Though, I didn't buy those stats when they announced it, but I bought the Max M1 studio and I like it a lot. Much better than my PC. My studio does everything I want, though I am a mobile programmer and super simple gamer. I can code in anything and it compiles code faster than my PC and my intel macbook pro. I have a couple macs, I have a couple PC's, you dont have to just have one. You can have them all.

  • @superyu1337
    @superyu1337 2 ปีที่แล้ว +4

    You shouldn't compare CUDA and Metal, they are entirely different things.
    Cuda is NVidia's GPU computing framework while Metal is a graphics API for rendering made by Apple.
    Metal is basically Apple's (proprietary) Vulkan.

    • @dsrbby
      @dsrbby 2 ปีที่แล้ว

      He is comparing what apple compared... Apple marketing is full of lies. 😵‍💫😵‍💫😵‍💫

    • @remkojacobs9309
      @remkojacobs9309 2 ปีที่แล้ว +3

      Metal is probably the fastest way to do Gpu computing on Apple and Cuda is probably the fastest way to do gpu computing on Nvidia hardware. I'd say it's fair to compare them.

    • @superyu1337
      @superyu1337 2 ปีที่แล้ว +1

      @@remkojacobs9309 Metal and CUDA are fundamentally different though. OpenCL would be the equivalent for CUDA.

    • @remkojacobs9309
      @remkojacobs9309 2 ปีที่แล้ว +1

      I'm not familiar with Metal but doesn't it include something like OpenCL? I assumed it did since what is a serious GPU API these days without compute?

    • @superyu1337
      @superyu1337 2 ปีที่แล้ว

      @@remkojacobs9309 As far as I know, it can compile C functions to Metal Shading Language to run it on the GPU.
      But the major focus for Metal is on rasterization graphics for games if im not wrong.

  • @AvgDan
    @AvgDan 2 ปีที่แล้ว +15

    6:30 It's funny how you correctly say how much faster the 3050 is most/all the time, such as saying 135% is 35% faster than the M1, but when you talk about the 3090 you incorrectly say a 489% result is 489% faster than the M1 instead of subtracting 100% and saying 389% faster.

    • @Mrmysko1
      @Mrmysko1 2 ปีที่แล้ว +3

      I noticed that too.

    • @Trancelistic
      @Trancelistic 2 ปีที่แล้ว +1

      Indeed.

    • @AvgDan
      @AvgDan 2 ปีที่แล้ว +3

      @@jazanoneyar6945 I've written software that calculates payments based on percentages for government programs. Those hundreds of millions in annual reimbursements are heavily audited, and I've never had a failed audit. I've written software to detect potential fraud as well, where percentage-based variations were used in various ways. I understand percentages just fine 🙂

    • @aspartam_
      @aspartam_ 2 ปีที่แล้ว +1

      @@AvgDan This is the best answer I've ever seen. Chapeau!

  • @JortBasement
    @JortBasement 2 ปีที่แล้ว +4

    Apple didn't lie... the chart literally says "relative performance" over power consumption (watts). The 3090 is 5 times faster than the M1, yes, but it also draws 5x the power. That's what the chart is displaying. Can it be misleading? Yes, if you don't pay attention. This just seems like someone who doesn't like Apple trying to slander them, even though they used "relative performance"... It says relative for a reason.

    • @darreno1450
      @darreno1450 2 ปีที่แล้ว +2

      The graph seems purposely misleading. IMO, that's just as bad as if they straight up lied.

    • @jasonsykes4199
      @jasonsykes4199 2 ปีที่แล้ว

      @@darreno1450 < - This guy knows his shit.

  • @freztino
    @freztino 2 ปีที่แล้ว +2

    Cool comparison, but it annoyed me a bit how you kept saying the percentage difference wrong for the 3090. I.e., when the 3090 scored 460% you said it was 460% faster (when it's actually 360% faster), but when the 3060 scored 150%, you correctly said it was 50% faster.
    These early benchmark tests were also the reason I decided on an Nvidia build instead of getting a Mac Studio to use for Blender, even though I've always been a Mac guy.

    • @ABDTalk1
      @ABDTalk1 2 ปีที่แล้ว

      That's true

  • @craigasketch
    @craigasketch 2 ปีที่แล้ว +6

    I think Apple graphed this as performance per watt not overall performance. Thing is I'd rather have the 3090 for sure.

    • @randomcomment9992
      @randomcomment9992 2 ปีที่แล้ว

      They're not just checking benchmark results; they also ran some real-world performance tests, where the computed data is also transferred to the monitor. And that is where they are way better than Windows: unified memory has 800 GB/s of transfer speed, but the PCIe 4.0 x16 slot where you stick your RTX 3090 only has 32 GB/s. The steps on Windows are also ridiculous: the GPU RAM needs to transfer the result to system RAM (that's where the PCIe 4.0 x16 link sits), and only after that can it go to the CPU (with another transfer limited to roughly 20-100 GB/s depending on your RAM; e.g. dual-channel DDR5 has "only" a 105 GB/s limit). So because of the extra steps in the communication and the huge transfer-speed difference, latency is higher on Windows; you get the result on your screen later, but this is not visible in any benchmark. And a higher benchmark value only gives you a bonus if you hit 100% workload; below that, it doesn't matter whether you are using 20% or 50% of the max capacity, the response speed depends on your latency (transfer speed).
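      The bus-bandwidth gap this comment leans on can be put in numbers; both figures are peak-spec headline values from the comment (real-world throughput is lower on both sides, and a 3090's own GDDR6X is much faster than the bus feeding it):

```python
# Headline bandwidth comparison behind the comment's latency argument.
unified_memory_gbps = 800.0  # M1 Ultra unified memory, per Apple's spec
pcie4_x16_gbps = 32.0        # PCIe 4.0 x16 link to a discrete GPU
ratio = unified_memory_gbps / pcie4_x16_gbps
print(ratio)  # 25.0 - the gap for data crossing the bus, not for on-card work
```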

  • @Ferdam
    @Ferdam 2 ปีที่แล้ว +24

    No big news then: Apple's M1 chip is awesome when it is used to power mobile devices/small form-factor. But obviously M1 still can't match hardware that is fully designed to deliver best performance possible, which in turn means you'll get huge power draw, higher temps, higher noise and also will be physically bigger in size.

    • @Real_MisterSir
      @Real_MisterSir 2 ปีที่แล้ว +14

      Yup. Especially the noise level argument is so dumb. My calculator has better noise efficiency than a 3090! That's literally the argument..

    • @frostilver
      @frostilver 2 ปีที่แล้ว +7

      @@Real_MisterSir SIR, you've got my W for you

    • @stookla4942
      @stookla4942 2 ปีที่แล้ว +4

      The point of the video was to show Apple was straight-up lying. Though, going by your points, you have never actually owned a proper high-end PC.

    • @Ferdam
      @Ferdam 2 ปีที่แล้ว +1

      @@stookla4942 never mentioned that apple's marketing is acceptable.
      I have owned a few high end PCs since 2008. What's your point, exactly?

    • @Freestyle80
      @Freestyle80 2 ปีที่แล้ว

      @@Ferdam that you are anal about noise

  • @M3gaHaloNoob
    @M3gaHaloNoob 2 ปีที่แล้ว +5

    I pretty much dislike Apple for their pricing and marketing schemes, but this video is also marketed pretty sensationally, showing the graph only for split seconds.
    The graph showed relative performance per watt of consumption. Meaning Apple didn't lie, but tried to fool the average Joe who doesn't look at a graph for longer than a minute. Which pretty much sums up the target group they are overselling stuff to.

  • @hanswurstusbrachialus5213
    @hanswurstusbrachialus5213 2 years ago +1

    Maybe you should have included the math to get a comparison for performance per Watt.

  • @SylvainDuford
    @SylvainDuford 2 years ago +1

    Apple is a marketing company first and foremost.

  • @BlueBaron3x7
    @BlueBaron3x7 2 years ago +22

    Apple is amazing at selling junk, for years, for thousands more.

  • @winstonthompson6210
    @winstonthompson6210 2 years ago +10

    The CPU of the M1 Ultra is basically 2 times faster than the M1 Max's. However, this is not true for its GPU, which is bottlenecked by a hardware limitation that Apple apparently wasn't expecting, and the more GPU cores, the worse the bottleneck, resulting in less gain in performance. The graph Apple showed was performance per watt, not performance. Apple showed what it could do at a certain wattage without mentioning that the other computer can draw even more power beyond that point, meaning it can give even higher performance. Sometimes only 25% of the M1 Ultra GPU's full capability is being used during testing. Apple's graph showed it reaching 105 watts, and with linear scaling like the other M1 chips have it should be able to draw 120 watts, but I haven't seen any tester get it to use more than 80-odd watts, and this video just showed me 90-odd watts. Supposedly the 48-core M1 Ultra gets up to 1.7 times faster GPU performance, but no M1 Ultra gives basically 2.0 times the performance unless it's the CPU. They say that by optimizing for the tile memory, apps can get better performance, but that is much harder than just optimizing for Apple's ARM chips and will take a long time, assuming developers bother. So I wouldn't expect a proper fix (a hardware fix) until the M2 Ultra.

    • @eneveasi
      @eneveasi 2 years ago +2

      The M2 will still fall short, if I'm honest. I feel like they intentionally power-throttle their machines to avoid overheating, because the thermal design of the Studio just does not have the same cooling capability as a standard decent build.

    • @winstonthompson6210
      @winstonthompson6210 2 years ago +1

      I assume you mean the M2 Ultra, because the M1 doesn't fall short, only the M1 Ultra. The fans in the M1 Ultra Mac Studio are currently overkill; they are considerably heavier than the ones in the M1 Max Mac Studio because Apple says the M1 Ultra requires a better cooling system. Yet they don't spin past idle while the Ultra is at work, and the device remains cool. So right now the Ultra is not getting hot enough to come anywhere near bringing the thermal system to its knees. Even the M1 MacBooks with fans get hot after gaming for a while, and their fans can get pretty audible; there is no reason why the Mac Studio should be any different if it gets hot enough. So clearly thermals are not the problem.

    • @Teluric2
      @Teluric2 2 years ago

      What's the HW limitation?

    • @winstonthompson6210
      @winstonthompson6210 2 years ago +1

      The hardware limitation is outlined in the TH-cam channel Max Tech’s video entitled ‘Mac Studio Review: What Apple DOESN’T want you to know..’

    • @PanosPitsi
      @PanosPitsi 2 years ago

      @@eneveasi it's an ARM chip; the base M1 doesn't even need fans
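The GPU-scaling numbers discussed in this thread (ideal 2.0x from doubling the cores vs the quoted ~1.7x observed) reduce to one division; a quick sketch, using only the figures from the comment above:

```python
# Fraction of ideal linear scaling actually achieved when going from
# a base GPU core count to a larger one. The 1.7x figure below is the
# speedup quoted in the comment above, not my own measurement.
def scaling_efficiency(cores_base, cores_big, observed_speedup):
    """observed speedup divided by the ideal linear speedup."""
    ideal = cores_big / cores_base
    return observed_speedup / ideal

print(scaling_efficiency(32, 64, 2.0))              # perfect linear scaling
print(round(scaling_efficiency(32, 64, 1.7), 2))    # ~85% of linear
```

At 85% of linear, each doubling of cores leaves a growing gap, which matches the comment's point that more GPU cores make the bottleneck more visible.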

  • @akhil_1210
    @akhil_1210 2 years ago +1

    Dear Apple, Windows isn't Android. Don't try to mess with that.

  • @Bob1997654321
    @Bob1997654321 2 years ago +1

    Hey this is a nitpick, but you're overstating the performance of the Nvidia GPU's by quite a bit. For example at 6:03, when you have the Apple M1 as the baseline 100% you state that the RTX3090 is 489% faster than the M1 ultra on the Monster render. The difference is 389% not 489%. You do this a couple times for other things. Just something to be mindful of. Good video though!

  • @13dma1rz
    @13dma1rz 2 years ago +5

    Thanks for running the tests. I don't much care about the power consumption issue. I'm not running a mining operation. Apple always appeals to creatives but they really can't compete on the hardware side.

    • @melaniodanilosindayen9011
      @melaniodanilosindayen9011 2 years ago

      Just wait. Intel said the same thing.

    • @PaulStoffregen
      @PaulStoffregen 2 years ago

      For creatives, Apple does indeed compete on the hardware side. The Mac tested with 3D rendering benchmarks in this video isn't the machine for heavy weight 3D rendering. That's the expensive cheese grater, which can run 4 W6800X or 2 W6900X GPUs. Yeah, it costs $20K. Maybe you could somehow cram 4 RTX3090s into a PC for less, but the consumer GeForce cards (even 3090) have less memory and some limits imposed by nVidia's drivers to force pros to buy their much more expensive Quadro cards.

    • @PvtAnonymous
      @PvtAnonymous 2 years ago

      @@PaulStoffregen then who is the Mac Studio for? That's the only question that needs to be asked. Is it the replacement for the 27" iMac? If so, then great. Although it gets pretty expensive pretty quickly. If it's supposed to be a machine for pro-users and creatives, I don't really see how. They market it as an absolute powerhouse ("Empower Station"), but it fails the most basic tasks compared to x86 hardware. Some people even returned the Studio because of performance issues, saying that they expected the Studio to perform better than the 16" MBP, which it didn't. So I am still confused about the Studio.

    • @PaulStoffregen
      @PaulStoffregen 2 years ago

      @@PvtAnonymous My guess is it's mostly meant for people who use Logic & Final Cut.

  • @KaoukabiJaouad
    @KaoukabiJaouad 2 years ago +10

    Great review, on point. Putting in the RTX 3050 was smart; it kinda helps people settle down. The M1 chips are insane, but their native performance is at low wattage: on laptops it's revolutionary, on desktops it's pretty stupid. Even on your electricity bill, for professional use, you won't see much difference between a top-spec M1 Ultra desktop and an RTX 3090 PC with a 12900K, just because 99 percent of the time you use it in low-performance mode; the Intel will pull 40W from the wall in that mode, the whole PC probably 150W max. It's when you use it at full performance that you see the big difference in power draw (300 to 400W on average), and it's not like a 3D artist is rendering all the time.

    • @Teluric2
      @Teluric2 2 years ago

      Engineers run models on their desktops for days and weeks nonstop using Ansys and FEA

    • @KaoukabiJaouad
      @KaoukabiJaouad 2 years ago

      @@Teluric2 That is something else; that's a company with people hired for that. If it's one guy, say someone at George Hotz's kind of level with machine learning models (and even he is one of a kind, top 0.1 percent in intelligence across 5 generations), he doesn't run training models for a long time; when he's serious, he has his company's compute farm doing that kind of stuff. At this point you're just splitting hairs.

    • @DigitalJedi
      @DigitalJedi 2 years ago +1

      @@Teluric2 In my experience those types of workloads are almost always given to a company render / compute farm. You don't want the engineer(s) to be unable to do much work for days while their desktops are doing a simulation, when a compute farm would do it within the day in most cases and let them get on to their next thing.

  • @brian2590
    @brian2590 2 years ago +1

    M1s are great for low-power setups. I use a cheap M1 Air laptop to remote-connect to workstations with NVIDIA GPUs. I can live minimally on solar energy with almost no monthly bills... the NVIDIA-based workstations are on the company power bill... It's a win-win all around. Use them both lol

  • @WaspMedia3D
    @WaspMedia3D 2 years ago +1

    The craziest part is that there are apple fanbois running around actually bragging about how the M1 can render faster than 3090 ... because they just prefer to believe what they are told instead of actually finding out for themselves. Video editing might be an area where the M1 ultra does well though.

  • @murutattack
    @murutattack 2 years ago +6

    So basically they charged $1000 more because of the speech. 😹

  • @tsuitsui100
    @tsuitsui100 2 years ago +3

    the fanboys are not happy

  • @Jimwatl
    @Jimwatl 2 years ago +8

    In a perfect world Apple would accept that we could connect an external GPU

    • @powerhouse884
      @powerhouse884 2 years ago +2

      I believe you can, but you still get bottlenecked by the cable.

    • @PanosPitsi
      @PanosPitsi 2 years ago +2

      @@powerhouse884 you can't anymore, because GPUs don't work with ARM

  • @pompomaddons
    @pompomaddons 2 years ago +1

    You have to use the base 3090 from Nvidia, not partner cards, making this invalid.

    • @kvxtr
      @kvxtr 2 years ago

      Even if you did get the baseline one, the difference would still be way off, 2-3x more than the M1, so it's not even worth the effort, and the results are very similar despite the partner card.

  • @vilcsith
    @vilcsith 2 years ago +1

    I don't have a Mac Studio, but I'd be lying if I said that presentation didn't make me at least a bit excited. (After all, competition is amazing for the consumer.)
    I'm so angry that they straight up lied; imagine how the people who believed that graph must feel stumbling upon this video. Desktop ARM chips have a bright future, so why not focus on the power draw instead of hyperfocusing on "WE'LL OUTPERFORM EVERYONE OUT OF THE BOX"? Why must desktop ARM debut with a blatant lie?

  • @fLaMePr0oF
    @fLaMePr0oF 2 years ago +6

    Yes, in the real-world performance scores a PC+GPU is WAY faster than an M1 Ultra system. However, if you divide the benchmark scores by the watts consumed by each system, then the M1 Ultra system gives slightly greater performance per watt in the Monster and Classroom renders and slightly lower in Junkshop (i.e. divide the M1 results by 84 and the PC results by 452). This is what Apple mean on their graphs, as they are clearly charting performance vs power consumption.
    Yes, it's a ridiculous comparison which bears no relation to actual system performance; the results of such a calculation are always going to be very similar, since you're basically just calculating the power efficiency of the silicon, and both are built on similar processes. And yes, Apple deserve to be called out for such hokey and misleading 'benchmarks', but they are not actually lying.

    • @einz600
      @einz600 2 years ago

      No, sorry. They literally lied: "M1 delivers faster performance than the highest end GPU available while using 200 W less power."
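The per-watt division described in this thread is easy to reproduce. The 84 W and 452 W system figures are the ones quoted in the comment above; the benchmark scores below are placeholders for illustration only, not measured results:

```python
# Performance-per-watt: benchmark score divided by whole-system power draw.
# The wattage figures come from the comment above; the scores are
# PLACEHOLDERS, not real Blender results.
SYSTEM_WATTS = {"M1 Ultra": 84.0, "12900K + RTX 3090": 452.0}

def perf_per_watt(score, watts):
    """Benchmark score per watt of whole-system power draw."""
    return score / watts

placeholder_scores = {"M1 Ultra": 600.0, "12900K + RTX 3090": 2900.0}
for system, score in placeholder_scores.items():
    print(f"{system}: {perf_per_watt(score, SYSTEM_WATTS[system]):.2f} points/W")
```

Note that a system can win this ratio while losing badly on absolute performance, which is exactly the ambiguity the thread is arguing about.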

  • @1235-g7r
    @1235-g7r 2 years ago +3

    apple is a total clown 😂 go back to making phones

  • @Saeed89
    @Saeed89 2 years ago +1

    RTX 3090 is an ABSOLUTE BEAST and has the definite power to annihilate M1, 2, 3 or whatever crap Apple has to offer.

  • @BabuMosahi
    @BabuMosahi 2 years ago +1

    Their marketing department is the kind of beast that will sell expensive products in the name of privacy, but anyone who knows Apple's "ACTUAL" privacy policy knows they say they will collect your data but not make any profit out of it, while still sharing your data with their partners and their own third parties. Those third parties are even allowed to collect your data through a loophole (not to be discussed here), so that they technically and lawfully cannot get sued.

  • @peterfuentes5893
    @peterfuentes5893 2 years ago +6

    As a professional video editor, I’ll never understand why people in my field of work insist on getting overpriced apple products.

    • @powerhouse884
      @powerhouse884 2 years ago +2

      Cuz Windows sucks and is more problematic to work with.
      I have a PC with a 3080 and I still find macOS better to work with.

    • @peterfuentes5893
      @peterfuentes5893 2 years ago +1

      @@powerhouse884 I haven't had any problems with Windows. I will say that Mac OS is a bit easier and more intuitive to use but it's not worth the premium price tag in my opinion.

  • @TheAacharge
    @TheAacharge 2 years ago +4

    ARM can help the M1 beat x86, but it can't help on the GPU side, since GPUs don't use x86.
    Basically, the total number of GPU transistors determines most of the GPU's performance.

    • @UhOhUmm
      @UhOhUmm 2 years ago +3

      But it can't even beat x86. Sure, it's fast at a low TDP, but x86 pushes performance way above the M1. ARM is good for laptops; that's about it.

  • @FujinBlackheart
    @FujinBlackheart 2 years ago +5

    Apple made a really good processor there in terms of what it can do with less, but when it comes to raw power, and even price, you lose out, and even more if you ever dream of upgrading your machine. Also, let's not talk about their dodgy customer service.

  • @SafetyFooT
    @SafetyFooT 2 years ago

    Any of my professional audio colleagues who have "upgraded" to an m1 machine have been dealing with compatibility nightmares.

    • @theTechNotice
      @theTechNotice  2 years ago +1

      Yep, I've heard a lot about that as well.

  • @SWxYicia
    @SWxYicia 2 years ago +1

    Apple mistakenly used a 1050 Ti instead of a 3090

  • @theTechNotice
    @theTechNotice  2 years ago +10

    Here's my testing on the performance/watt, let's see who wins in that category? 👉 th-cam.com/video/jCs27cNz6_k/w-d-xo.html 👈

    • @Tigerex966
      @Tigerex966 2 years ago

      But it's not, except per watt.

    • @audiovid.
      @audiovid. 2 years ago +3

      Man, you are manipulating the information, and when caught on this, instead of saying "sorry, guys", you say "I am also not an independent tester". Unsubbed... I do not like Apple's products, but even less do I like dishonest guys.

    • @blueben1224
      @blueben1224 2 years ago

      Extrapolation goes with caution, risk, and disaster.

    • @andrewmicro16123
      @andrewmicro16123 2 years ago

      Performance can't just be extended like that. Think about it: in a desktop machine, if Apple could have simply upped the power to beat the 3090, they would have done so. Power and performance don't scale linearly. Plus, the 3090 runs on a worse process than the M1. All these factors point to the Nvidia architecture being much superior in terms of efficiency and scalability.

  • @seplon9328
    @seplon9328 1 year ago +1

    Apple do be capping about their product's power.🤣

  • @SodaiGoku
    @SodaiGoku 2 years ago +1

    You do realize Apple users have their fingers in their ears going "la la la la la" right now, don't you?

    • @SodaiGoku
      @SodaiGoku 2 years ago +1

      @obimk1 only when it comes to facts.

    • @SodaiGoku
      @SodaiGoku 2 years ago +1

      @obimk1 we're not talking about what's new and innovative here, we're talking about those charts Apple put up, comparing the performance of the two. And I know my 3090 eats apples for breakfast and sh*ts them out the next day. And don't even get me started with my new 4090 to be...

    • @SodaiGoku
      @SodaiGoku 2 years ago

      @obimk1 expert? What are you on about, I just want performance, raw power! 😆

  • @Bollalillo
    @Bollalillo 2 years ago

    I've never understood why people buy anything associated with Apple

  • @dachef03
    @dachef03 2 years ago +2

    I'm thinking Apple maybe tested the 3090 at a similar power draw to the M1 Ultra. That could be the only reason why they'd make such an outlandish claim like this. However, less power draw equals less performance, so Apple is either lying or cheating.

    • @Techy93
      @Techy93 2 years ago

      No, Apple just didn't make the claim in the first place. The graph this vid is based on actually says RELATIVE performance, i.e. performance per watt. Going by the numbers the first time he reads out the power draw, the M1 used about 5x less power, so it's actually trading blows in the performance-per-watt department.

    • @darko6275
      @darko6275 2 years ago

      @@Techy93 come on bro, we all know who Apple is targeting...

    • @Techy93
      @Techy93 2 years ago

      @@darko6275 how does their target market impact whether they lied or not?

    • @darko6275
      @darko6275 2 years ago

      @@Techy93 It depends on the reader of that chart. If I show that to my brother, who is an Apple die-hard fan, he will say F your RTX 3090, we have the best GPU/PC. I'm not saying they lied; I'm saying they target people who think Apple is always on top and who have zero tech knowledge. We are talking about people who think wireless charging is an innovation when it comes to the iPhone :)

    • @Techy93
      @Techy93 2 years ago

      @@darko6275 OK, dude, I get that it's super shady and shitty marketing; I commented on the vid too and mentioned it. I'm more commenting with enthusiasts in mind. We should be able to tell the difference between a lie and shady marketing. IMO this whole vid should have given a more balanced view, showing the actual efficiency (since that's Apple's actual claim, despite them trying to make it look otherwise), because at the end of the day, they didn't lie.
      Also, for reference, very much not an Apple guy.

  • @samgao
    @samgao 2 years ago

    I have a 5950X and an FE 3090 for gaming, and an M1 iPad Pro for video editing. Best combo. Both cost over 2k. (The PC cost over 6K.)

  • @phoenixyt124
    @phoenixyt124 2 years ago +1

    I love how the 3050 is faster than the M1 Ultra

  • @variamente6855
    @variamente6855 2 years ago

    They never really lied; they just left out all the other graphs and cut short the only graph present.

  • @jakejoyride
    @jakejoyride 2 years ago +2

    Where is the FPS comparison in Cyberpunk?

  • @MARKXHWANG
    @MARKXHWANG 1 year ago +1

    By messing with Nvidia, Apple lost gaming. Now they will lose any AI opportunity...

  • @TaisetsuGadget
    @TaisetsuGadget 2 years ago +1

    Well, yeah, the marketing is too much, but the power consumption to performance is very good: it's small and low-consumption. The price is actually a bit too much, but it's worth it for a company.

  • @skildfrix
    @skildfrix 2 years ago

    Apple didn't know that their charts were placed upside down

  • @ChibiTheEdgehog
    @ChibiTheEdgehog 2 years ago

    I mean come on, don't tell me you've ever actually bought into any Apple marketing. They are the kings of alternative facts.

  • @daves.software
    @daves.software 2 years ago

    FYI, 489% as fast is not the same as 489% faster. If the scores were identical, they would both be 100%, and you wouldn't say that one was 100% faster than the other. So if you're going to say "x faster" you have to subtract 100%.
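The distinction this comment makes ("as fast" vs "faster") is pure arithmetic and easy to check; the 489/389 numbers are the ones from the comment:

```python
# "489% as fast" is not "489% faster": if B's score is 4.89x A's,
# B is 489% AS FAST as A, but only 389% FASTER than A.
def percent_as_fast(a, b):
    """B's score as a percentage of A's score."""
    return b / a * 100.0

def percent_faster(a, b):
    """How much faster B is than A, as a percentage (baseline subtracted)."""
    return (b / a - 1.0) * 100.0

base, other = 100.0, 489.0
print(round(percent_as_fast(base, other), 2))   # 489.0
print(round(percent_faster(base, other), 2))    # 389.0
```

Identical scores give 100% "as fast" but 0% "faster", which is the sanity check the commenter describes.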

  • @HiVikas
    @HiVikas 2 years ago

    People who are willing to spend the money for an Apple M1 Ultra will not even think about the power consumption; they are only concerned about performance.

  • @MichaeltheORIGINAL1
    @MichaeltheORIGINAL1 2 years ago +1

    Wait, so you are telling me that Apple is misleading their customers? NO WAY!

    • @kvxtr
      @kvxtr 2 years ago +2

      Didn't see that coming?
      Can't wait to see what graph Apple comes up with next time

  • @MarcinSzyniszewski
    @MarcinSzyniszewski 2 years ago +1

    Classic Apple marketing tactic (and sometimes others' too): don't give enough details for the claim to actually mean something, yet make it sound so nice that everyone is interested.