The differences in AMD's and Nvidia's power monitoring

  • Published on Nov 20, 2024

Comments • 111

  • @matthewfennell7886 · 7 years ago · +20

    Previously in the RTG labs:
    Marketing: Why do we have two power ratings?
    Engineer: The GPU has a limit of 110W and the card uses about 150W.
    Marketing: About 150W?
    Engineer: Something around that yeah. Don't worry about it.

  • @folterknecht1768 · 7 years ago · +33

    Great content, learned something useful today. That's the kind of content I'd like to see more of, as I'm personally not that interested in OC that goes beyond 24/7 use. Though of course, knowing how much headroom a card/system has is usually interesting when it comes to long-term use (to some degree at least).

  • @yungpepe · 7 years ago · +5

    This actually makes a lot of sense now with everyone's WattMan settings and GPU-Z readings vs readings @ the wall with Watt meters.

  • @JohnDoe-qm5uo · 7 years ago · +9

    Kepler was when Nvidia started implementing this type of power monitoring; the 600 series had GPU Boost 1.0, which was based solely on power consumption.

    • @mmllmmll22 · 7 years ago

      The GTX 650 didn't have GPU Boost 1.0 at all, so no power monitoring Keepo

  • @kn00tcn · 7 years ago · +6

    Well, it won't cost $1; it adds PCB complexity like wiring, driver support, GPU core support, R&D, testing, etc.
    GPU-Z reports GPU-"only" power, so I'm not sure how people are so stupid as to think it means the whole card. Besides, the only true measurement is physical: if not intercepted directly, then estimated from the wall.

  • @pikerdm7466 · 7 years ago · +2

    Very useful vid, this is the type of thing I love seeing.

  • @ortgfer478 · 7 years ago · +1

    Pretty sure the reason AMD does this is a "well, we could just use what's already here" type of thing. This kind of power monitoring is available even on cards like the HD 6870 and earlier, which lack PowerTune and/or power limits altogether. It has worked well enough all this time that they never bothered to change it.

  • @semitope · 7 years ago · +1

    I'm not understanding the issue. The power management wouldn't really change anything, would it? They could still claim a 150W "TDP" because they know what it should be using on average. TDP in quotes because it really should be TBP if we want to talk about measuring the power consumption of the board, no? That information doesn't affect the GPU's clock speeds, so it wouldn't matter. I doubt Nvidia is throttling the GPU based on the consumption of the board. That would be strange.
    It sounds like a problem for people who want to measure power consumption, but they would be using PCIe + plug to measure. The info in GPU-Z for power is chip-only. I don't think AMD cares about monitoring the other stuff at all. At best they would care about VRM and RAM temps. They aren't going to sacrifice GPU performance over the other components.

  • @sviktor4 · 5 years ago

    Hi, I know it's an old video, but I just wanna share that there is no such power-limiting circuitry on the MSI GTX 1050 Ti Low Profile; it only monitors the Vcore voltage, and in Afterburner the power limit slider also just lowers the Vcore. All the power management is done via the power management IC and 4 resistors: 3 power-balancing and 1 power-limiting resistor (10k). Under heavy load the Vcore drops from 1050 mV to as low as 890-900 mV; if you unlock the voltage control in Afterburner it just adds +50 mV at 100%, which helps but not much. To increase the power limit I soldered a 200k resistor on top of the 10k power-limiting resistor, which gave me a 105% power limit. Later I changed it to 100k, which means a 110% power limit (the math is sketched just below this comment); I didn't go any further because I didn't want to blow up my PCIe connector. I still hit the power limit, but the voltage drop is much smaller: the Vcore stays at 1000-1020 mV, and it gives about a +7-8% performance boost.
    The memory VRM is a separate circuit, with 1.37 V and a 30 W power limit, which is plenty; I managed to apply the maximum OC possible from software, +1000 MHz (stable at +975 MHz), on Samsung RAM.
    My final OC:
    +155 GPU - meh, I know I lost the lottery here; I was able to go as far as +180 in benchmarks, but it wasn't stable in games.
    +975 MEM - I won the lottery here; +1000 was no problem in benchmarks, but in games it crashed like twice a week, so I lowered it a bit.
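
A quick sketch of the piggyback-resistor math from the comment above. Soldering a second resistor on top of the stock one puts the two in parallel, and the sketch assumes the power limit scales inversely with the limit resistor's effective value (typical for a limit-setting resistor, but an assumption here); the 10k/200k/100k values come from the comment:

```python
# Parallel-resistor math for the power-limit mod described above.
# Assumption: the limit scales inversely with the effective resistance.

def parallel(r1_ohms: float, r2_ohms: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1_ohms * r2_ohms / (r1_ohms + r2_ohms)

STOCK_OHMS = 10_000  # the card's 10k power-limiting resistor

for piggyback in (200_000, 100_000):  # the two piggyback values tried
    effective = parallel(STOCK_OHMS, piggyback)
    new_limit = STOCK_OHMS / effective  # inverse-scaling assumption
    print(f"{piggyback // 1000}k over 10k -> {effective:.0f} ohms -> {new_limit:.0%} power limit")
```

This reproduces the 105% (200k) and 110% (100k) figures from the comment; a smaller piggyback resistor lowers the effective resistance further and raises the limit more.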

  • @terrytibs3365 · 7 years ago

    Hey dude, really good video. If you were making a graphics card, how would you design it?

  • @unchained_0177 · 7 years ago

    Linus did a very long PCIe extension too, but not for power, just for a wall-mounted PC.

  • @ricochetVendetta · 7 years ago · +4

    Does this mean the GPU-Z wattage reading on Vega is not showing all the power?

    • @ActuallyHardcoreOverclocking · 7 years ago · +5

      No, it isn't. It shows Vcore and maybe VHBM. VAUX, the display drive, Vpp, the fan, and any other power losses are not measured.

  • @samghost13 · 5 years ago

    Hi,
    You are a GENIUS!
    I have a question: can I use a SilverStone (PCIe 8-pin to 6+2) cable on my be quiet! power supply? The cables are identical. Thank you.

  • @WhiteSkyMage · 7 years ago · +1

    Here is what I'd do - flash the BIOS, and "POWER BE DAMNED!" Let the card use as much as it wants. F*ck efficiency and shoot the power bill through the roof! Cuz power is "almost" free!

  • @oncearoundthemapleleaf9041 · 7 years ago

    So Buildzoid, from what you are saying: when I was doing a max overclock run, monitoring via HWiNFO64, the max current draw of 330 watts on my R9 Nano (I have lots of extra cooling) was likely underreported?

  • @amiltonfcjunior · 3 years ago

    I have an RX 590 that won't install drivers and shows unrealistically huge power consumption in the Sensors tab of GPU-Z. What could it be?

  • @ronald4life1 · 7 years ago · +2

    You know what's worse than not having a power regulator? Having one and your cards burning up... like the Nvidia cards... Why don't you look at AdoredTV's new video and realize how little praise Nvidia's design deserves.

    • @mOnocularJohn · 7 years ago

      Oh really? How many Nvidia cards have been burning up in recent history, relevant to this topic? It's 2017, not 2007. In fact, that happening 10 years ago might be why Nvidia has this kind of monitoring... I guess AMD will have to fry some cards before they do it. At least Nvidia learned something.

    • @Crunchbite_Daimyo · 7 years ago · +1

      AMD has over-current and temperature protection on their cards. Nvidia did not have that kind of protection back then. He's talking about power monitoring though, which is something most people don't really need to know. I'm sure the engineering cards had monitoring circuits, but once it's in the hands of consumers, who really cares.

  • @Zubkover · 7 years ago

    Question: my friend got the Zotac GTX 1070 AMP (8+6-pin), and the difference between the 100% and 50% power limits is like 1920 MHz vs 1800 MHz. Is it possible that with a 50% amperage drop the difference is this minor?

  • @s28278187 · 7 years ago

    Would you start doing PSU breakdowns, or will you just be sticking to the power components on GPUs?

  • @axelbostrom3606 · 7 years ago

    Hey, how did you break the 1070 BIOS or whatever to turn the OC curve into a line? Because that thing is freaking annoying on my 1080.

  • @TheJohdu · 7 years ago

    So that's the reason why the Vega core clock jumps around like a kangaroo when the power limit is exceeded?

  • @MrWebb-qw8gy · 7 years ago

    I have an Asus ROG RX 480 8 GB. What do I have to mod to make the card pull all its power through the rail?

  • @jonson856 · 7 years ago

    So all these power measurement tools also don't exist on the Vega cards? :/

  • @tommarnk · 7 years ago · +1

    AMD don't know how much they are pulling, Buildzoid!

  • @YouTubeDoxedMyRealName · 5 years ago

    The math for the unmeasurable variables could be baked into the firmware on AMD cards...?

  • @gr33nbits93 · 7 years ago

    I've got the Aorus RX 580 8 GB from Gigabyte and it needs even more power. So you're saying that AMD should monitor power like Nvidia does?

  • @SquintyGears · 7 years ago

    OK, but then how does this explain the inconsistencies people have been seeing on Nvidia cards? I was really looking forward to that part from the tease at the start of the video.

  • @NavJack27gaming · 7 years ago · +1

    I've been having *sarcasm* a ton of fun */sarcasm* overclocking my 1080 Ti FTW3 with voltage points. I wish there was more than just Afterburner and Precision XOC for editing them. I'd love a standalone program that shows the entire range of points, kind of like a multi-band EQ. Simple, but with load and save with naming.

  • @fandomkiller · 3 years ago

    The Nvidia CEO will be sitting at home on his smartphone controlling your VRM soon

  • @imnotusingmyrealname4566 · 7 years ago · +1

    Nvidia really has it easy.

  • @jackchills · 7 years ago

    No audio

    • @jackchills · 7 years ago

      Found the problem. Your audio stream was being sent to the rear audio channels.

  • @DoctorWho14615 · 7 years ago

    He said.. doody

  • @gillianseed4419 · 7 years ago

    Don't know why you would care about the parts of the card with a static power consumption.

  • @byron. · 5 years ago

    Is this also valid for 580s?

    • @92Cope · 4 years ago

      Yes

  • @analkarldervierte1759 · 7 years ago · +2

    The good thing about that terrible power monitoring on AMD is that I can kick my Vega really damn hard in the butt power wise.

  • @MinehowTech · 7 years ago

    Put Vega 64 power onto a 1080 Ti

  • @TheHighborn · 7 years ago

    How does Vega do?

  • @anthonypedersen1555 · 7 years ago

    Sir! It's pronounced "Jay's" TwoCents. He is NOT a multi-platinum rapper turned tech-tuber.

  • @tommihommi1 · 7 years ago

    So AMD cards pull less power at the same performance if you have good VRM cooling, because VRM efficiency increases at lower temps, while Nvidia cards get better performance at the same power draw? (See the sketch after this thread.)

    • @analkarldervierte1759 · 7 years ago

      That's assuming that the actual GPUs pull more or less the same power, which they don't.

    • @BeHappyTo · 5 years ago

      Yes; Nvidia is better at the high end, AMD is better overall.
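
A rough numeric sketch of tommihommi1's question above: if the measured/limited power sits on one side of the VRM, then VRM efficiency, which improves as the VRM runs cooler, changes how much the same GPU core load costs at the power connectors. The 150 W core draw and the efficiency figures are illustrative assumptions, not measured values:

```python
# Illustrative only: input power for a fixed core load at two VRM efficiencies.
CORE_W = 150.0  # assumed GPU core power (VRM output)

for label, efficiency in (("hot VRM, ~85%", 0.85), ("cool VRM, ~90%", 0.90)):
    input_w = CORE_W / efficiency  # power drawn from the 12 V inputs
    print(f"{label}: {input_w:.0f} W in, {input_w - CORE_W:.0f} W lost in the VRM")
```

With these made-up numbers, better VRM cooling saves roughly 10 W at the connectors for the same core power, which is the effect the question is pointing at.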

  • @antraxbeta23 · 7 years ago · +11

    They probably don't want the user to see the real power usage (RX 480 user here), since a 1070 eats less than a stock 480 :P

  • @Molo9000 · 7 years ago

    Nvidia only implemented this strict power monitoring because of the problems they had with Fermi.
    www.geeks3d.com/20101109/nvidia-geforce-gtx-580-the-anti-furmark-dx11-card/

    • @jakirmahmood3425 · 7 years ago · +1

      So? Fermi was way, way old. They've kept using proper measurement for 5 years, and we should compliment them for that.

    • @Molo9000 · 7 years ago

      You could also argue that saving board space and components is the better choice, because in the real world it doesn't make a difference if the card uses 10% more or less power than indicated. Maybe AMD will change their power monitoring because of the bad press over the RX 480, just like Nvidia did because of Fermi. It seems unlikely though, since it turned out not to be an issue in practice.

  • @AndriasLionel · 7 years ago · +1

    AMD should hire you to improve their cards

  • @fandomkiller · 3 years ago

    Love watching you kill Nvidia's DRM

  • @oldgamergene5712 · 7 years ago · +2

    I love the video - AMD has a card that does not do power monitoring - OK, so what? Comparing a 1080 Ti to an RX 480 is laughable given the price delta. Your assumptions fail given the fact that the RX 480 cards work. Cost is a major factor; as noted, it's not just the chips, it's all the engineering, software and manufacturing complexity. And you scoffing at a 2% delta - sorry, that is not critical on this low-end card. For this card, it simply does not matter. My first negative comment for you - but I still love the video and the channel.

  • @Dobermanator · 7 years ago · +2

    I'm not buying that. The AMD card you referenced did draw more power via PCIe as measured by PCPer; however, within a couple of days a driver release fixed that issue, also as measured by PCPer. So AMD must have a way of not only measuring specific areas but also controlling them with software, so what else do you want? I mean, besides the specific hardware you're saying is somehow a must (and clearly AMD does without it), and given that all cards exceed spec when OC'd, what is your point? You said it yourself that the way Nvidia does it would cost a few dollars; do you really think AMD are not smart enough to know or figure that out? Do you not think AMD reverse engineer exactly using the methods you do? And yet they've still not made the change?

    • @jdrok5026 · 6 years ago

      Dobermanator, AMD is also a smaller company though; a few dollars per card adds up when making everything work perfectly.

  • @zettle2345 · 7 years ago · +1

    Just to throw some cold water in the air. First you say they only measure a dodgy Iout, then you say all cards measure Vin. And you use a multimeter to make your measurements. There is a video from "you" about how accurate that thing is; I think it's about a Vega 64 FE?? lol. Anyway, I have not heard of the dodgy measurements burning any card down. But I have seen you burn cards down... I think I will trust the people who make the cards instead of the people who destroy them. And that's my opinion on the matter. Have a good day :)

  • @Simon74 · 7 years ago · +9

    I bet AMD just has to save engineering time everywhere they can because they are understaffed.

    • @maxfacts1 · 7 years ago · +5

      It's not about saving a few bucks; it's more like cred-dead AMD doesn't want to expose how inferior their watt-wasting uArch is and how inferior GloFo's FinFail watt-wasting process is.
      AMD's performance per watt is so bad that even HBM's efficiency advantage didn't help much; performance-per-watt champion Nvidia's Pascal still beats the watt-sucking, overhyped and late Vega like a piñata.

    • @KuraIthys · 7 years ago

      Well... efficiency is relative to clock speed. The real problem is that AMD's architecture is slow.
      Since the relationship between clock speed and power usage is far worse than linear (roughly cubic, once you count the extra voltage that higher clocks demand - see the sketch after this thread), it's not that AMD cards have terrible power consumption; it's that they are so far behind on various things that, to make the cards look even remotely competitive, they need to push the clock speeds well into absurd territory.
      I bet Vega is really only good for about 1000-1100 MHz or so.
      But... since it can't even compete when pushed to 1500+ MHz, well, you can see the problem.
      Power-hungry GPUs are generally large ones as well. Let's not forget the cringe-worthy disaster that was Fermi, which just goes to show that both of these idiotic companies are more than capable of making basically the same mistakes.
      You design your architecture, then choose your clock speeds afterwards.
      Hopefully you aren't pushed into running your chips at excessive clock speeds.
      But sometimes pushing the speeds way up is all you have.
      Vega is clearly a mess... for gaming anyway. (Its compute performance is far more consistent with what you'd expect a card of that size to be doing.)
      AMD actually seems to have that issue in general - its compute performance compared to the 'equivalent' Nvidia cards is off the charts, more often than not.
      Yet gaming performance is... quite honestly, pathetic.
      It either proves that compute and gaming workloads don't have that much in common, or it proves that AMD cards are rarely being used to their full potential.
      After all, the RX 480 is closer to a GTX 1070 than a 1060 in compute performance, yet in gaming it is either barely keeping up with, or even falling behind, a GTX 1060.
      We are talking about a card that in raw processing terms would appear to be something like 50% more powerful, yet in real-world gaming tests much of the time it isn't just failing to keep up, it's losing. (Yes, there are outliers, and taken all together the RX 480 and GTX 1060 seem about even, but the theoretical numbers vs real-world performance seem very out of proportion.)
      Clearly, AMD's design isn't optimised correctly for real-world loads.
      Or at least, not real-world gaming loads.
      So, if we assume they weren't complete and utter idiots, this raises the question: if these cards have SO much raw power that isn't doing any good for gaming (50% more theoretical performance is not a trivial amount), then what IS it designed for?
      Has AMD been focusing on stuff other than gaming this whole time, and just not telling anyone?
      Could be.
      Vega certainly seems to suggest so, given that it is a monster of a GPU with a bunch of features that are outright pointless for a gaming card.
      So... has AMD actually not been designing cards with gaming in mind this whole time?
      Is gaming performance an afterthought to other purposes for these cards?
      Hard to say. But something really doesn't add up somewhere...
      Again, an RX 480 in theoretical terms is closer to the brute-force performance of a 1070 than a 1060, yet in real-world terms it's struggling to keep up with the 1060...
      And this isn't an isolated thing; it's a trend with AMD cards that goes back many, MANY years.
      They are vastly overpowered on paper compared to their real-world gaming performance.
      And I mean SERIOUSLY overpowered - like anything from 40-70% more powerful than the Nvidia cards they aren't keeping up with.
      Which really doesn't add up.
      Poor design? Poor utilisation? Or... designed for entirely different purposes and shoehorned into gaming after the fact? Who can say? It's likely impossible to work this out without reverse engineering these cards down to the finest details...
      Still, the days of Fermi vs Evergreen/TeraScale are long gone. Quite why AMD has stuck with GCN derivatives for as long as it has is beyond me.
      Fermi was disastrous, but Nvidia dumped it after 2 generations at most.
      GCN has been around for I don't even know how many generations now? 4? 5?
      Way too long, clearly.
      Do they only keep going with it because they don't have the resources to replace it? Or is there something else going on that they aren't telling us gamers?

    • @jdrok5026 · 6 years ago

      KuraIthys, AMD designs their cards for compute performance more than anything, which is why they excel in workloads that can utilize all of the chip. But if you look at Vega 64 vs 56, the 56 can actually smash a 64 up until you're pushing LN2. GCN is probably just not as well utilized by gaming workloads as by compute workloads. I've never seen a GCN-based card really fall behind a CUDA-based card in workstation tasks; AMD just sells you stuff they most likely never gave certifications for.

    • @jdrok5026 · 6 years ago

      In gaming
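
On the clock-vs-power point in KuraIthys's comment above: dynamic power scales roughly as P ≈ C·V²·f, so if higher clocks also demand roughly proportionally more voltage, power grows approximately with the cube of frequency; "far worse than linear" is the right intuition. A sketch with invented baseline numbers (not real Vega figures):

```python
# Rough P ~ C*V^2*f scaling; the baseline values are invented for illustration.
BASE_F, BASE_V, BASE_P = 1100.0, 0.95, 150.0  # MHz, volts, watts (assumed)

for f in (1100.0, 1300.0, 1500.0, 1600.0):
    v = BASE_V * (f / BASE_F)                      # crude: V rises linearly with f
    p = BASE_P * (v / BASE_V) ** 2 * (f / BASE_F)  # P scales with V^2 * f
    print(f"{f:.0f} MHz @ {v:.2f} V -> ~{p:.0f} W")
```

Under this crude model, going from 1100 MHz to 1500 MHz (about +36% clock) roughly 2.5x-es the power, which is the shape of the trade-off the comment describes.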

  • @BrKnOblivion · 7 years ago

    I hope AMD take note...

  • @TiberiuLupescu · 7 years ago · +2

    If you make 20 million GPUs per year and that component costs $1, you're saving $20 million (see the sketch after this thread). If you're AMD, this sum is the difference between being profitable and making a loss.

    • @mmllmmll22 · 7 years ago · +1

      But they could increase the price by ~$5 and earn $80 million, so w/e.
      And with it they wouldn't have had the RX 480's "PCIe problems" of pulling more than 75 W.

    • @ActuallyHardcoreOverclocking · 7 years ago · +3

      they could cut $2 off the MOSFET budget to pay for real current-sensing circuitry

    • @TheWarTube · 7 years ago · +5

      that whole power limit "scandal" on the RX 480 was probably worth those $20 million to them...

    • @m_sedziwoj · 7 years ago

      I think if they don't want to add it for all the power inputs, they should at least add it for the PCIe slot power, because that is what causes problems. For me, whether the card uses 100/200/300 W is not so important; I'm not going to be mining.
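
The arithmetic behind this thread, using the commenters' own assumed inputs (20 million units per year, $1 of sensing hardware per card, a ~$5 price bump):

```python
# Back-of-the-envelope numbers from the thread above; all inputs are the
# commenters' assumptions, not actual AMD figures.
units = 20_000_000   # hypothetical GPUs shipped per year
sense_cost = 1.0     # assumed extra cost per card for real current sensing ($)
price_bump = 5.0     # the suggested price increase ($)

print(f"Saved by omitting sensing:        ${units * sense_cost:,.0f}")
print(f"Net gain if sensed and priced in: ${units * (price_bump - sense_cost):,.0f}")
```

which gives the $20 million and $80 million figures quoted above.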

  • @jacobfoster6773 · 7 years ago · +1

    Fist?

  • @maxfacts1 · 7 years ago · +4

    Great take on what's going on with Nvidia's accurate watt usage numbers vs AMD's nefariously inaccurate watt usage numbers. I feel vindicated, as I learned the Watt Facts, much thanks to you.
    The real reason cred-dead AMD plays these watt-hiding games is that AMD doesn't want people to know how many watts their inferior uArch and inferior GloFo FinFET Fail process really suck.
    This is what happens when cred-dead AMD combines being debt-dumb and R&D-poor with being a blatant watt-wasting, BSing, propaganda-pumping company desperately trying to compete with performance-per-watt champion Nvidia's Pascal.
    Fact: AMD's watt-wasting GPU core uArch + GloFo's watt-wasting FinFAIL process = watt-sucking GPUs that, despite using high-cost low-yield HBM, get beaten like a rented AMDonkey by performance-per-watt champion Nvidia/TSMC Pascal.

    • @whoruslupercal1891 · 7 years ago · +20

      This sounds like Nvidia propaganda.
      Jensen's Taiwanese trapboys (like you) need to go.

    • @maxfacts1 · 7 years ago · +1

      You sound like a triggered AMD advocate who got butt-hurt by having AMD's nefarious watt-wasting cover-up exposed.
      The fact is that there is absolutely NO reason for AMD to play these watt-sucking games other than to hide how many watts their GPUs really suck. Cred-dead AMD chose to make these nefarious watt-hiding designs, unlike Nvidia, which makes it easy to know the wattage facts.

    • @DarkLinkAD · 7 years ago · +4

      They can't be hiding much; there just isn't enough room. I've seen an RX 460 report 100 watts when massively overvolted without a 6-pin adapter, using only the PCI 2.0 revision from back in 2008. People crash at 75 W on the old PCI design. Edit: correction, PCIe of course.

    • @oncearoundthemapleleaf9041 · 7 years ago · +12

      Max Facts: nope, not triggered. You're just trolling for a reaction.

    • @Crunchbite_Daimyo · 7 years ago · +1

      I bet you really don't want to remember Fermi, do you? You sound like someone who has been drinking the Nvidia Kool-Aid for way too long.

  • @OneTimePainter · 7 years ago

    First!