Two GPUs in a Desktop Computer: x8 PCIe Performance.

  • Published Jan 7, 2025

Comments • 322

  • @vaibhavtailang
    @vaibhavtailang 3 years ago +16

    Just the video I wanted. Thank you very much, sir! You've helped a lot in clearing my doubts. I went with a 5900X setup instead of a Threadripper workstation purely because of the price factor, and also because, as an independent artist, I could afford the performance loss better than spending a thousand bucks extra on a Threadripper workstation, which I couldn't afford at this point anyway. I do get that at some point later I will need to shift to a workstation, and that apart from performance I also don't have much hardware flexibility or upgrade path with a desktop, but for now the 5900X setup is just the optimum sweet spot for me.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +2

      Hi vaibhav tailang, I understand your situation, as I myself have a 5900X for the same reasons. The system is great for most work. I edit all my videos on the 5900X and I really like the performance I get for the money.
      Good luck
      Thanks for watching

  • @tojiyev_cg
    @tojiyev_cg 2 years ago +7

    Great video. It is the ONLY video about this topic on TH-cam. Thank you guys.

    • @phasepanther4423
      @phasepanther4423 1 year ago

      But why, though?
      Just because SLI/NVLink and CrossFire aren't supported anymore doesn't mean this topic should be ignored.

    • @tojiyev_cg
      @tojiyev_cg 1 year ago

      @@phasepanther4423 I meant that it is the only video about using dual GPUs on consumer motherboards, which usually have 16 PCIe lanes; in that case you can use two GPUs with x8 lanes each. There is no other video about this topic. I didn't say anything about not supporting SLI or CrossFire.

    • @kevinerbs2778
      @kevinerbs2778 7 days ago

      @@phasepanther4423 All workstation boards still support both SLI and NVLink. Even TRX50 & WRX90 still have support.

    • @kevinerbs2778
      @kevinerbs2778 7 days ago

      @@tojiyev_cg Many AM4 boards only had x16/x4 hard-wired.

  • @ChrisDallasDualped
    @ChrisDallasDualped 3 years ago +2

    Amazing info you've given us and have subbed to your channel today, you have the best rendering info on the net that I've seen, thank you for this.

  • @OfficialHankIII
    @OfficialHankIII 2 years ago +1

    THANK YOU FOR YOUR DIRECT-TO-THE-POINT VIDEOS!!!!!

  • @rifatkabirsamrat5237
    @rifatkabirsamrat5237 1 year ago +4

    You guys are lifesavers. Literally the only video on this topic on TH-cam. Thank you so much!

  • @DerelictNacho
    @DerelictNacho 1 year ago +1

    I greatly appreciate the way you produced this video. To the point. No clickbait title. Straightforward. Thank you!

  • @laszloperesztegi
    @laszloperesztegi 8 months ago +1

    Thank you for the video. I'm here to compare the two methods for video transcoding performance.

  • @guillermocosio7362
    @guillermocosio7362 3 years ago +3

    Hey Mike! PCIe Gen4 x8 has the same bandwidth as PCIe Gen3 x16. A 3090 at full speed is only about 1% bottlenecked by PCIe 3 x16, so a 3070 shouldn't be bottlenecked at all with PCIe 4 x8. However, using an older system with only PCIe Gen3 (Intel 10th Gen or Ryzen 2000) and dual graphics cards may result in more of a performance difference.
    Also, I believe that during rendering, if the scene fits within the GPU VRAM, the PCIe interface is only used to feed the GPU during loading. So where PCIe speed could really hurt is when the scene doesn't fit in VRAM and the GPU needs the PCIe interface to access system RAM as a cache. So in huge scenes the performance difference may actually be huge as well.
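
Guillermo's bandwidth equivalence is easy to check against the published PCIe per-lane rates. A back-of-the-envelope sketch (theoretical numbers; real-world throughput is a bit lower due to protocol overhead):

```python
# Theoretical one-direction PCIe bandwidth, ignoring protocol overhead.
# Gen3 runs at 8 GT/s with 128b/130b encoding; each later gen doubles the rate.
def link_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate bandwidth in GB/s for a PCIe link of a given generation and width."""
    gt_per_s = 8.0 * 2 ** (gen - 3)          # 8, 16, 32 GT/s for Gen 3/4/5
    bytes_per_lane = gt_per_s * (128 / 130) / 8
    return bytes_per_lane * lanes

print(f"Gen3 x16: {link_bandwidth_gbps(3, 16):.2f} GB/s")  # ~15.75
print(f"Gen4 x8:  {link_bandwidth_gbps(4, 8):.2f} GB/s")   # ~15.75, identical
print(f"Gen4 x16: {link_bandwidth_gbps(4, 16):.2f} GB/s")  # ~31.51
```

Halving the lanes while doubling the generation cancels out exactly, which is why a Gen4 x8 slot is no worse than a full Gen3 x16 slot.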

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Hi Guillermo Cosio, I totally agree, thanks for sharing this with the channel.

  • @TheArtofSaul
    @TheArtofSaul 3 years ago

    Something to keep in mind, especially in Redshift's case, is how out-of-core rendering behaves with slower PCIe bandwidth. PCIe speed isn't much of an issue when everything is loaded in a single go and fits in VRAM. But since out-of-core is pretty robust in Redshift, you can work on much, much larger scenes, and then PCIe speed can make a large difference in the total bandwidth of data moving back and forth from system RAM / scratch disk (worst case, though not bad on M.2) versus a single load to VRAM.
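
Saul's point can be sketched numerically: a scene that fits in VRAM pays the bus cost once at load time, while an out-of-core scene keeps re-streaming the overflow. A rough model with hypothetical sizes and a deliberately pessimistic "re-stream every pass" assumption:

```python
def scene_transfer_seconds(scene_gb: float, vram_gb: float, bus_gbps: float,
                           passes: int = 1) -> float:
    """Rough PCIe transfer time for a render.

    If the scene fits in VRAM it is uploaded once; if it exceeds VRAM, the
    overflow is (pessimistically) assumed to be re-streamed on every pass.
    """
    if scene_gb <= vram_gb:
        return scene_gb / bus_gbps
    overflow = scene_gb - vram_gb
    return (vram_gb + overflow * passes) / bus_gbps

# 30 GB scene on a 24 GB card: Gen4 x16 (~31.5 GB/s) vs Gen4 x8 (~15.75 GB/s)
print(scene_transfer_seconds(30, 24, 31.5, passes=50))   # bus time at x16
print(scene_transfer_seconds(30, 24, 15.75, passes=50))  # exactly double at x8
```

In-core, the one-off upload difference is a second or two; out-of-core, the x8 link pays double on every re-stream, which is where the "huge" difference Saul describes comes from.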

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Hi Saul, I am not an expert on out-of-core VRAM usage, but if this video is any indication of how GPUs perform when they run out of memory, then I'd say more VRAM is always better. At the end of this video I render a very large scene with RS, and the RTX A6000 crushed the RTX 3090 due to its limited VRAM.
      th-cam.com/video/M2I69uK9WVU/w-d-xo.html
      Thanks for watching

  • @andreybagrichuk5365
    @andreybagrichuk5365 3 years ago +1

    Amazing content, thank U, Mike !!!

  • @kylecook4806
    @kylecook4806 2 years ago +2

    I've got a 5950X and was thinking of switching to a workstation with next-gen stuff. This vid, though, was exactly what I was looking for: I'm getting a second GPU and I'll be staying on my current system for a long while.

    • @xMaFiaKinGz
      @xMaFiaKinGz 2 years ago +1

      You can actually go as low as x4 with PCIe 4.0 without losing performance. I have an RTX 3070 on PCIe 3.0 x8 and it works flawlessly with no performance loss.

  • @rockyvicky3848
    @rockyvicky3848 2 years ago +2

    Sir, you are really making YouTube useful.

  • @fadegboye
    @fadegboye 3 years ago +2

    Hi Mike! Thanks for the video, always has lots of useful information. Just one thing, at 4:03 you mention VRay benchmark scene but that is a Redshift benchmark scene. Is that correct? Cheers.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Yup, my bad, but I can not change it now. TH-cam will not allow any edits after you submit the video; I would have to delete the video and repost it.
      Thanks for pointing that out

    • @fadegboye
      @fadegboye 3 ปีที่แล้ว

      @@MediamanStudioServices No problem. Good content.

  • @Gettutorials123
    @Gettutorials123 3 years ago +1

    Really informative!

  • @FuzzyImages
    @FuzzyImages 3 months ago

    Thanks for this. I recently upgraded my CPU to a 5900X and have been thinking of just plugging a second 3070 in to improve my render times and give me a second card for rendering while I stream (due to a complicated setup for post editing, I have to dual-record my streams: one for the split audio and a second just to get raw game footage at a higher bitrate), as well as maybe squeeze out two more years before needing to change to a new CPU socket. Gaming is secondary to me and I only run at 1080p anyway. This was super helpful for that decision, thank you.

  • @Anthrocarbon
    @Anthrocarbon 6 months ago

    Thanks for investigating, but boy was the Laughing Man name drop the GITS nostalgia hit I needed today!

  • @06RHDSUBIEWAGON
    @06RHDSUBIEWAGON 5 months ago +2

    I'd love to see a video on how to set up the two GPUs and how to run them together.

  • @mario-762
    @mario-762 2 months ago +1

    Very helpful as always!

  • @itsaman96
    @itsaman96 2 years ago

    Thank you for making this video. It's a really detailed look at a question that always comes to my mind.

  • @tay.0
    @tay.0 3 years ago +1

    I think most of the difference you are running into is caused by CPU differences between the two builds, since Light Cache and some global illumination still use the CPU even during GPU rendering.

  • @sanaksanandan
    @sanaksanandan 3 years ago +2

    Very useful test. I guess with older gen GPUs like RTX 2070 or 1660, the performance drop will be negligible.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      The performance difference should be the same across GPU generations.
      thanks for watching

  • @jinomian01
    @jinomian01 1 year ago

    Thank you for doing this. This helped me understand what the impact is if I go down the desktop dual-GPU route. I don't need two at the moment (and have never done that on any of my builds), but I would like the option in my next build.

    • @MediamanStudioServices
      @MediamanStudioServices  1 year ago +1

      Two GPUs do work in a desktop if you can fit the cards into the system and MOBO, and for 3D rendering the x8 PCIe speed does not affect the performance that much. It is only about 2-3% in final render time, due to the transfer of the scene to the GPU. So a desktop is a viable solution for dual-GPU setups. Thanks for watching

    • @jinomian01
      @jinomian01 1 year ago

      @@MediamanStudioServices Thank you Mike, that is great to know.

  • @SaladFX
    @SaladFX 1 year ago +2

    Really wanted to know how two different GPUs would work for production workloads, especially rendering in XPU.

  • @Afflictionability
    @Afflictionability 1 year ago +2

    Thank you, I needed this.

  • @lucapupulin6990
    @lucapupulin6990 2 years ago +1

    Very useful information!
    Thanks for the awesome benchmarks you do!
    It would be interesting to see how much simulation speed (Houdini Pyro or Vellum) is impacted using x16 vs x8 PCIe.

  • @garyc5245
    @garyc5245 10 months ago +1

    Well, even though this is two years old, it is still very helpful. My situation is using an NVLink bridge on two 3090 Tis for a client that wants the VRAM of an A6000 without the price. I was able to find an NVLink bridge, but it's difficult finding a motherboard with the 81mm spacing I need between the two 5.0 x16 slots.

    • @hareram4233
      @hareram4233 9 months ago

      Did you find one? I also need suggestions for this motherboard.

    • @garyc5245
      @garyc5245 9 months ago

      @@hareram4233 I ended up getting the TRX50 Aero D by Gigabyte, which has three-slot spacing, and found a three-slot NVLink bridge online. Still in the works, but it looks like Linux can support NVLink even if Windows does not... at least I hope so.

  • @hardwire666too
    @hardwire666too 2 years ago +1

    GPU prices are way down now, and the used market is starting to flood. I really want a 3090, mainly for the VRAM, but the prices on used GPUs are hard to ignore. I already have a 2070, and used ones are starting to pop up for $250-ish. It may not increase my VRAM or let me worry less about scene size, but if it can shave some significant time off my renders in Blender 3.xx I'd call it a win. Would be curious to see what you think.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago

      Well, I think the 3090 would shave off quite a bit of time vs. the 2070. If you can get the 3080 or 3090 at a good price, then you should get one.
      thanks for watching

  • @fenghanlin4284
    @fenghanlin4284 2 years ago +1

    Thanks for sharing. I had the same experience. I used a Supermicro 2024US-TRT with two A100 40GB cards for GPU-accelerated computing (simulation). I made a mistake by putting one of the cards into an x8 slot. I could see from nvidia-smi that both cards ran perfectly at 99%, until one day I noticed that one GPU was at x16 and the other at x8. I wondered whether it affected anything, so I loaded tasks individually onto the card at x16 and then onto the one at x8, separately. I didn't see any difference at all! Then I wondered if it changes with different loads, by which I mean different memory usage. To confirm that, I tested loads occupying from 10GB of memory up to the full 40GB. I didn't see any difference. Later I did the same tests on two 3090 cards (turbo fan) on a 10980XE workstation (a Dell 5820X). Sorry, I can't test the A100s in the workstation because of heat issues. I put one of the RTX 3090 cards in an x16 slot and later an x8 slot. This time I could see about a 20% difference in simulation speed. Note that the 5820X only provides PCIe 3.0. Then I wondered if the difference was caused by the difference between GeForce and Tesla, but I can't put the GeForce cards into the Supermicro chassis because the RTX 3090 cards are too long (MSI Aero). So my rough conclusion is that PCIe 3.0 x16 or PCIe 4.0 x8 may be good enough for my applications, but it may vary, subject to different use cases. Thanks a lot for your sharing.

  • @aceaquascapes
    @aceaquascapes 3 years ago

    @3:26 The screenshot score is different from what your audio said. Thanks for the great work.
    I finally got a workstation

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Ya, I rushed this video and there are a few mistakes. But thanks for watching.
      What workstation did you get?

  • @Atsolok
    @Atsolok 3 years ago

    I was looking for this answer and you're the only one that could tell me.... Subscribing!

  • @IamJeshua
    @IamJeshua 2 years ago +6

    Is there a problem if I use different cards? I have a 3070 Ti but I want to buy a 3080 Ti, all mounted on a Z690 motherboard with an i7-12700F.

    • @thienhofficial
      @thienhofficial 2 years ago

      Following!

    • @KazCanning
      @KazCanning 2 years ago +1

      There shouldn't be any issue... I have an RTX 2080 and a 3080 Ti, both running x8. I was actually shocked that it worked perfectly immediately; I literally didn't have to do anything, just plugged the card in, turned on my computer, and it just worked. Shocking.

    • @thienhofficial
      @thienhofficial 2 years ago

      @@KazCanning I have two 3070 Tis back to back connected to my mobo. Do you think they're gonna overheat once I put them under load? If so, what should I do to cool them down better? I have a Corsair 680X case. Thank you!

    • @KazCanning
      @KazCanning 2 years ago

      @@thienhofficial Hard to say. I know my 3080 Tis get pretty hot when working on renders. I also have the Founders Edition cards, so they have the pass-through heat fins, basically blowing hot air on the top card.

    • @ShawarmaBaby
      @ShawarmaBaby 2 years ago

      @@KazCanning Which PSU do you have? I have a 3080 and a 3090 with a 1300W, and when I render my monitors go off.

  • @eric6606
    @eric6606 3 years ago +1

    Would love to see a review of the new Mac Pro/Max vs RTX 3090/A6000. Great content btw.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi Eric, not that I am a Mac hater, but there is not much support for GPU rendering on an Apple system, so I have no desire to create a Mac video on 3D rendering performance.
      Thanks for watching

  • @GuidoGautsch
    @GuidoGautsch 3 years ago +1

    This was just the video I'd hoped you'd make! Super interesting, thank you Mike. It seems that an X570 board would do the trick with relatively minimal sacrifice. There's a pretty significant price difference in Australia between the 3060 Ti (~$1200) and the 3070 (~$1700), so for a dual-GPU rig for Blender, two 3060 Tis would be cheaper and faster than a single 3080 ($2300), 3080 Ti ($2500) and even a 3090 ($3300). Yes, GPUs are expensive down under. I was thinking of pairing them with an MSI Tomahawk X570, a Ryzen 5950X and potentially an 850W PSU?

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi Guido, I was looking at the ASRock Taichi X570 when I built my system, but it was not available when I purchased my equipment. The Tomahawk is also a nice board.
      Good luck on the GPU purchase.
      Thanks for watching.

    • @nawafalrawachy394
      @nawafalrawachy394 3 years ago

      Hi Guido,
      Just a word of advice: if you have triple-slot cards (air cooled), it is recommended to have at least one empty slot between them. This way the top card won't be choked of air. I personally recommend the MSI X570 Godlike. Another thing: for dual 3060 Tis, a 1000W Gold or higher PSU from a reliable maker like Seasonic or EVGA is recommended (EVGA GPUs are bad, but their PSUs are excellent). Generally, using dual GPUs in rasterization-based engines like EEVEE or Unreal Engine (by using multiple instances to speed up rendering) will draw more power than a path-traced engine like Cycles or Octane would, thus tripping the OCP on your PSU. Speaking from experience. Hope this helps.

    • @GuidoGautsch
      @GuidoGautsch 3 years ago

      @@nawafalrawachy394 Thanks, that's really good to know! That motherboard costs $1200 here 😵 More expensive than TRX40 boards; I could even get a WRX80 for that. So maybe I'll stick to a single GPU after all.

    • @GuidoGautsch
      @GuidoGautsch 3 years ago +1

      @@MediamanStudioServices thank you. The Taichi looks solid

    • @nawafalrawachy394
      @nawafalrawachy394 3 years ago

      @@GuidoGautsch It could still work with a regular-layout board if the cards are between 2.7 and 2.8 slots wide, though the top card will struggle a bit to get air, which will increase its temps. To combat this, I reduce the power limit on the top card to 95% and give it a custom fan curve to keep it from going above 80C.

  • @dulichauch
    @dulichauch 3 years ago +1

    That's the info! Thank you so much! I want to build this for rendering and modelling in Blender, with the PSU and case chosen with upgradeability in mind. The 5950X is out of budget, so I hope the 5900X will be able to handle the mobo and two future GPUs. My list: Ryzen 5900X, X570 Taichi, 32GB Vengeance LPX, RTX 3060 plus a second GPU in the future, Thermaltake Level 20 HT case, Deepcool 360EX, two NVMe WD SN750s, Seasonic Prime PX-1300. Thank you for any input.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +2

      Hi dulichauch, if you reduce the PSU to 1000W you could get some extra RAM; I would suggest 64GB. Even with two RTX 3080s at 350W each, that still leaves room for the 105W CPU.
      All other parts are a good choice. The Taichi is a good choice for two GPUs.

    • @dulichauch
      @dulichauch 3 years ago

      @@MediamanStudioServices Oh, that's surprising! With beQuiet's PSU calculator, adding the water cooler, two GPUs, three additional fans, two M.2 drives and one SSD, I got 1000W of usage, so I bought the big PSU; I still think I'll keep it. For RAM: would it be totally absurd to order two separate LPX kits of 2x16GB for 170€ each, or is it strongly recommended to go with one 2x32GB kit for 280€?

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      @@dulichauch It was just a suggestion to save money and get the much-needed RAM required for creative workloads. More PSU is always better if you can get it.
      And I would get the 2x32GB and leave room for an upgrade later.

  • @Frederic_walls
    @Frederic_walls 3 years ago

    Excellent! Thank you very much for the information, it is very helpful

  • @vinee.mp4
    @vinee.mp4 2 years ago

    Huge thanks for this informative video!

  • @eriuz
    @eriuz 3 years ago +3

    I think you don't see a big performance hit because on the X570 you have two x8 PCIe 4.0 slots, which is equal to two x16 PCIe 3.0 slots. The big hit is going to come when you use two GPUs that are PCIe 3.0, or a board that doesn't support PCIe 4.0.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Hi Eriuz, I just assumed that most people looking to get two GPUs would be using a current MOBO.
      But you are correct, the older MOBOs use PCIe v3 and have lower bandwidth.
      thanks for watching

  • @santiag0pictures
    @santiag0pictures 1 year ago +1

    Thank you so much! It's hard for many of us investing our savings in hardware, trying to get the best performance at the lowest cost. Have you ever considered power consumption? Having too many machines working 24/7 affects the electricity bill. So, is it cheaper to have many GPUs in one PC vs. many PCs with single GPUs?

  • @reallycome1
    @reallycome1 9 months ago +1

    Sir, you are an ABSOLUTE LEGEND! I appreciate this clarification. I am building my first-ever custom-loop PC and was very curious about the two important points you've made. Moving from Apple after 20 years is a little daunting, but I'm willing to try out the new and more powerful machines.
    My new PC will have the following specs:
    • Intel 14900K (water-cooled by the EK Direct Die block, metal thermal paste)
    • Z790 Master X motherboard
    • 2x Nvidia 4090 cards (water-cooled, metal thermal paste)
    • 4TB M.2 SSDs
    • 92GB RAM at 8600

    • @majus1334
      @majus1334 6 months ago

      What's your use case?

  • @rayornelas8459
    @rayornelas8459 1 year ago

    Really informative. It would be great to note your full specs in your videos.

  • @rudypieplenbosch6752
    @rudypieplenbosch6752 3 years ago

    Interesting information. So the bandwidth of PCIe 4.0 is hardly used at this moment, if an x16 slot and an x8 slot can stay within 10% of each other. And now we already have PCIe 5.0 while PCIe 4.0 is hardly fully used; maybe storage attached to the PCIe bridge makes better use of the available bandwidth.

  • @Andee...
    @Andee... 8 months ago +2

    I wish you had included card temps.

  • @TheGamingREZ
    @TheGamingREZ 2 years ago +2

    Guys, could someone help?
    I have a Gigabyte Z690 Aorus Elite DDR4 board with an RTX 3080 in there.
    Is it worth it to put another one in (I already have another 3080, and also a 1070 Ti)? I use the PC for gaming / video editing.

    • @mikebrown9826
      @mikebrown9826 2 years ago

      For gaming, it's not going to help you, as you can not SLI/NVLink the GPUs. For video editing, only Resolve can use multiple GPUs.

  • @KiraSlith
    @KiraSlith 1 year ago

    It sounds like a lot of that performance reduction comes from switching time loss. I was going to build a gaming system and just add a second GPU later, but I ended up chickening out and just throwing a pair of 2080 Tis into a T7820. $1k, and I got a rig that happily does both Blender with RT and AI production work.

  • @lejeffe1663
    @lejeffe1663 3 years ago

    Awesome, been eyeballing a Threadripper Pro / R9 dual-system build :)

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Hi Jeffe, I wish you luck finding a system in this messed up market.
      All the best.
      Thanks for watching

  • @JohntheNX
    @JohntheNX 1 year ago +2

    Can you make a new test with PCIe 5.0? I'm planning to build a new rig with a 7950X and an X670 mainboard. I read in the manual that if I use two PCIe 5.0 slots, each will run at x8, but I'm not sure if there's any reduction, considering it's faster.

    • @jinomian01
      @jinomian01 1 year ago

      I would love to see this as well. Similar build I am considering.

  • @NarekAvetisyan
    @NarekAvetisyan 3 years ago

    I'm using an RTX 3070 with a Ryzen 2700X on a B450 motherboard. It's running at x8 because I have a second GPU. I've measured about a 2-3% performance drop at x8. I also have it overclocked +200MHz on the core and +1300MHz on the memory with the power limit set to 100%, and I gain a 14% performance boost on my Blender renders.
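
Narek's two numbers can be composed to estimate the net effect, assuming the bus penalty and overclock gain act as independent multiplicative factors (an assumption, not a measurement):

```python
def net_render_time(base_s: float, pcie_penalty: float = 0.025,
                    oc_speedup: float = 0.14) -> float:
    """Baseline render time adjusted for an x8 bus penalty and an overclock
    speedup, treating the two effects as independent multiplicative factors."""
    return base_s * (1 + pcie_penalty) / (1 + oc_speedup)

# A 100 s render at x16 stock: ~2.5% slower from x8, ~14% faster from the OC.
print(round(net_render_time(100), 1))  # ~89.9 s, still faster than stock at x16
```

In other words, the overclock more than pays back the x8 penalty in this scenario.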

  • @WangJack-z7g
    @WangJack-z7g 10 months ago +2

    Thank you for this video. It helped explain a lot. I have a dual-GPU setup with two 4070 Supers. My current board has one PCIe 5.0 x16 slot and one PCIe 4.0 x16 slot. I am in the process of getting a new motherboard with two PCIe 5.0 x16 slots. Would this improve performance?

    • @wtfskilz
      @wtfskilz 9 months ago

      It would increase; however, it's still going to run at x8.

    • @majus1334
      @majus1334 6 months ago

      No. The 4070 Super runs at PCIe 4.0; the link will downgrade to accommodate your GPU, so PCIe 5.0 would be disabled/downgraded to PCIe 4.0.
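
The negotiation majus1334 describes applies per link: a PCIe connection trains down to the lowest generation and narrowest width supported by both the slot and the card. A small sketch, using the theoretical per-lane rates (8 GT/s per Gen3 lane with 128b/130b encoding, doubling each generation):

```python
def negotiated_link(slot_gen: int, card_gen: int,
                    slot_lanes: int, card_lanes: int):
    """A PCIe link trains to the lowest common generation and width.
    Returns (generation, lanes, approximate GB/s)."""
    gen = min(slot_gen, card_gen)
    lanes = min(slot_lanes, card_lanes)
    gbps = 8.0 * 2 ** (gen - 3) * (128 / 130) / 8 * lanes
    return gen, lanes, round(gbps, 2)

# A Gen4 card (e.g. a 4070 Super) in a Gen5 x16 slot still runs at Gen4 x16.
print(negotiated_link(5, 4, 16, 16))  # (4, 16, 31.51)
```

So a board with two Gen5 slots would not speed up Gen4 cards; the extra headroom only matters once the cards themselves support Gen5.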

  • @georgigazeto4650
    @georgigazeto4650 1 year ago +2

    Are you going to make a new dual-GPU video with a 4080 or 4070 Ti?

  • @TheMetalmachine467
    @TheMetalmachine467 2 years ago +3

    Try fitting two Nvidia 4090 FE cards in, lol.

  • @peterxyz3541
    @peterxyz3541 1 year ago +2

    THANKS! I'm less confused now. Question: does the MOBO auto-balance the workload? I'm planning to use two different Nvidia cards for AI text & image generation.

  • @DDesigner123
    @DDesigner123 3 years ago

    Love your videos. new subscriber..

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Hi Dewayne Dailey, for video editing, unless you use Resolve Studio (the paid version), most video editing software does not use the GPU for much acceleration. And rendering in the viewport is a single-GPU task; just final 3D rendering can use multi-GPU setups.

  • @ChrisDallasDualped
    @ChrisDallasDualped 3 years ago +1

    Hey Mike, I'm looking to build a PC for workflow mainly in Premiere Pro and Filmora. Is it wiser to go for an RTX 3090 or two RTX 3070s; which would benefit me the most? I don't care for gaming, this is strictly for my workflow. Also, what boards and which RAM would you recommend? Obviously faster RAM makes a difference? What about the hard drive? I would prefer ones with at least 3K write speeds. And which water-cooling system should I use? I've also narrowed it down to either the Ryzen 5900X or 5950X, which I have access to. I will await your reply, and thx a million for your hard work.

    • @GuidoGautsch
      @GuidoGautsch 3 years ago +1

      Premiere Pro isn't able to take advantage of GPUs efficiently and is more CPU-limited, as is my understanding, so even a single 3070 would be fine. Invest in the 5950X out of those two options. If you were to drop Premiere Pro for Resolve, however, two 3070s would beat out the single 3090 by ~30%.

    • @ChrisDallasDualped
      @ChrisDallasDualped 3 years ago +1

      @@GuidoGautsch Thx for the input. I hate Resolve, btw; Adobe is so much easier to work with, and so is Filmora. Yeah, the 5950X is what I am aiming for.

    • @GuidoGautsch
      @GuidoGautsch 3 years ago +1

      @@ChrisDallasDualped yeah, I hear you. I prefer PP over Resolve as well for editing - just wish it'd be able to take advantage of GPU grunt as effectively as Resolve does.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      I agree with PP as this is my choice for editing the Mediaman videos. Just wish Adobe would get better GPU support.

  • @surenetto1102
    @surenetto1102 4 months ago +1

    What if I put my video card in the second GPU slot of the motherboard? Will this affect performance?

  • @yasherok
    @yasherok 3 years ago +1

    Hi Mike,
    Could you please say which motherboard you use for the desktop machine in this video, and what cooling system? As I see it, you use blower versions of the GPUs in this rig, right?
    Also, what do you think about the GIGABYTE X570S AORUS MASTER motherboard for use with two fan-style GPUs and a Ryzen 5950X? It looks like it has enough room for airflow between two GPUs, and it has decent VRMs for this processor. Trying to figure out the best config for my situation.
    Thank you for your channel, there is a lot of useful and unique information here!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi Vlr, I am using the Gigabyte X570 Pro WiFi.
      www.gigabyte.com/Motherboard/X570-AORUS-PRO-WIFI-rev-10#kf
      For the cooling system, I did my own custom loop. Check out the videos I did here:
      th-cam.com/video/r_Rl30LhE2M/w-d-xo.html
      th-cam.com/video/dwfn5IQXnIE/w-d-xo.html
      The GIGABYTE X570S AORUS MASTER is a good choice.
      Thanks for watching

  • @exarkun8250
    @exarkun8250 1 year ago +2

    If you use one of the GPUs for gaming, will having 2 GPUs affect its performance?

  • @antoniosomera
    @antoniosomera 2 years ago +3

    Thanks! Great information and tests! I have one doubt: is it possible to render in V-Ray with RT cores and two RTX GPUs from different generations (e.g. RTX 2090 + RTX 3090) or different models (e.g. RTX 3090 + RTX 3080), obviously without NVLink or SLI? Well, thanks in advance!

    • @TILK3D
      @TILK3D 2 years ago +2

      @andrew I have the same question, maybe you already got an answer

    • @antoniosomera
      @antoniosomera 2 years ago

      @@TILK3D I haven't solved my specific doubt regarding two RTX GPUs from different generations (e.g. RTX 2090 + RTX 3090), but I have found this:
      1. You can render with several Nvidia GPUs using V-Ray, and V-Ray will manage to use the available CUDA cores in the GPUs. This should work with CUDA cores of the same GPU generation even if they are different model cards (e.g. RTX 3090 + RTX 3080), but I still don't know if different CUDA core generations can work together (e.g. RTX 2090 + RTX 3090). I have an RTX 3090 and a 980 Ti; I need to make some experiments.
      2. It is not necessary to use SLI or NVLink in order to use all the available CUDA cores of same-generation cards.
      3. NVLink will help with sharing GPU RAM, but the improvement in rendering time is small (+/- 10%).
      4. I'm deducing (I still don't have evidence) that you can render with V-Ray and the RT cores of several same-generation RTX GPUs just like the CUDA cores (e.g. RTX 3090 + RTX 3080), also without NVLink; V-Ray will manage to use the available RT cores.
      5. I don't know if you can render with V-Ray using different RT core generations (e.g. RTX 2090 + RTX 3090).
      Hope this is useful to you. Let me check the sources where I found this information and I'll share them here. Please let me know if you have found something else. Best regards.

    • @TILK3D
      @TILK3D 2 years ago +1

      @@antoniosomera Thank you for your information. I will be looking for more, and if I resolve my doubt I will share it too. Greetings, success!

    • @Outrider42
      @Outrider42 2 years ago +1

      You can use any combination of supported GPUs. The CUDA cores will stack; the VRAM will not. So if you use a 2070 with a 3090 and the scene exceeds the 2070's VRAM, performance will suffer. But aside from this, you can mix and match however you want, and with different generations.
      This should be the case with most CUDA-based renderers.
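
Outrider42's rule of thumb can be written down: compute across mismatched cards adds up, but without VRAM pooling the in-core scene size is capped by the smallest card. A sketch with the 2070 + 3090 pairing from the comment (published core counts, simplified model):

```python
from typing import NamedTuple

class Gpu(NamedTuple):
    name: str
    cuda_cores: int
    vram_gb: int

def pair_limits(a: Gpu, b: Gpu):
    """Combined compute stacks across the pair; the in-core scene limit
    is the smaller card's VRAM, since VRAM is not pooled."""
    return a.cuda_cores + b.cuda_cores, min(a.vram_gb, b.vram_gb)

cores, vram_cap = pair_limits(Gpu("RTX 2070", 2304, 8), Gpu("RTX 3090", 10496, 24))
print(cores, vram_cap)  # 12800 combined cores, but only 8 GB before out-of-core kicks in
```

This is why a mismatched pair renders fast on small scenes but degrades as soon as the scene outgrows the weaker card's memory.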

    • @antoniosomera
      @antoniosomera 2 years ago

      @@Outrider42 Thanks!

  • @TGameplay4k
    @TGameplay4k 6 months ago

    Great video, thanks!

  • @comodore3684
    @comodore3684 2 years ago +1

    Great channel. For the video, I was expecting a benchmark comparison between one desktop GPU vs. two GPUs. Is it worth adding a second GPU for rendering, or is it worse, or is it the same? Maybe you have another video that deals with this question; otherwise it would be nice to do one. Keep up the good work.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago

      Hi Comodore, I have a few dual-GPU videos on my channel. Check them out. Thanks for watching

  • @jeremiahMndy
    @jeremiahMndy 1 year ago +1

    I'm about to run two RTX 3090 FEs in an NVLink configuration for a total 48GB VRAM pool. I'll let you know how it goes; it would be cool to see this on your channel before I rig it up for Blender and iClone.

  • @MrozuProjekt2
    @MrozuProjekt2 8 days ago

    I currently have an X399 TR 2950X platform, where all the PCIe slots are the same; I used to run four graphics cards on it for rendering tests. I'm going to switch to an AMD 9950X and I have a problem choosing a board, whether the B650 Aorus v2 (it's similar on Aorus X670 and X870 boards): can I connect two graphics cards? I have concerns about the second PCIe connector, because the specification says that one is connected to the CPU and the rest to the chipset; ProArt boards are described as having two PCIe connectors connected to the CPU.

    • @kevinerbs2778
      @kevinerbs2778 7 days ago

      Most boards don't split the lanes from the CPU; the ones that do have a chip that splits them. Some motherboard manuals say which boards have it. But the CPU by itself cannot split its lanes — I only found this out recently.
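The lane allocation being discussed can be sketched as follows (illustrative numbers assumed for a typical consumer AM4/AM5 platform, with a hypothetical `slot_widths` helper — check your board manual for the real topology): the CPU exposes one x16 graphics link, which either feeds one slot at full width or is split x8/x8 by the board when a second CPU-attached slot is populated; otherwise the second card falls back to chipset lanes.

```python
# Sketch of consumer-platform lane allocation (illustrative only):
# the CPU exposes one x16 graphics link; boards that support
# bifurcation split it x8/x8 when the second CPU slot is populated,
# while boards without it route the second card through the chipset.

def slot_widths(gpus_in_cpu_slots, board_bifurcates):
    if gpus_in_cpu_slots <= 1:
        return [16]        # a single GPU gets the full x16 link
    if board_bifurcates:
        return [8, 8]      # the x16 link is split across two slots
    return [16, 4]         # second card falls back to chipset lanes

print(slot_widths(2, True))   # [8, 8]
print(slot_widths(2, False))  # [16, 4]
```

This is why two boards with the same chipset can behave very differently with two GPUs: the split is a board feature, not a chipset guarantee.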

  • @TheBlueVaron
    @TheBlueVaron 1 year ago +1

    Would DirectX 12 use both GPUs as one for gaming?

  • @si1entshotzdtom
    @si1entshotzdtom 11 months ago +1

    I run 2 GPUs on my gaming rig, non-SLI. It brings the x16 down to x8, and I barely notice any performance loss in FPS.
    6950 XT for gaming and a 3050 for streaming, on a 5950X and a Dark Hero VIII.
    Core count and a good mobo help a lot, I'm sure.
    But I game at 2K ultra and stream 2K ultra.

    • @itschops
      @itschops 10 months ago +1

      I have a question for you: did you have to make any BIOS changes, or were you able to just plug in and play (stream) with no issues? I noticed that when I plugged in my second GPU (a 1080) there was some lag on the monitors connected to it. I did a clean install of drivers for my 4090 and the 1080 and still noticed stutters on the monitors connected to the 1080. My monitors aren't all the same refresh rate — my gaming monitor is 170 Hz while the other 4 are just 60 Hz. I can get the stutters to stop by setting my gaming monitor to the same 60 Hz, but who's trying to game at 60 Hz when you have a 4090... lol. Any tips?

    • @si1entshotzdtom
      @si1entshotzdtom 10 months ago

      @@itschops I did not, but I also didn't plug any monitors into the streaming GPU — it's just for work. Give that a try; if not, you may have to do a fresh install of Windows.
      Because when I upgraded to a 4080S to replace my 6950 XT, out of the box it would only read the 3050, yet the 4080S would output to the displays, lol.
      I uninstalled all the drivers — same thing. Uninstalled them again and reseated the GPUs — same thing.
      A fresh install helped, but I couldn't fully test it, as the 4080S is huge and my 6950 XT was liquid-cooled via Alphacool.
      So I am just using the 4080S after a fresh install of Windows. Works flawlessly.

    • @itschops
      @itschops 9 months ago

      Thanks for letting me know. The main reason for adding the 1080 to my PC was the extra display connections, since I have 5 monitors, but the stutters were happening too often so I removed it for now. I didn't try a fresh install of Windows. Are you running 2 4080s now? Do you think it's running smoothly because they are the same series of card? @@si1entshotzdtom

    • @cannaroe1213
      @cannaroe1213 7 months ago

      I have the same setup, except I don't have a 3050. Is the 3050 even PCIe Gen 4? Because if it's not, I think the 6950 XT will have to downgrade to Gen 3 too. I'm still building my PC; I was thinking it would be cool to get an RTX card for the ray tracing (and the DLSS/FSR combo), but I don't really know if it's necessary. Why a second card for streaming?

  • @elysiumcore
    @elysiumcore 3 years ago

    I hope we get more lanes on AM5 boards

  • @videocruzer
    @videocruzer 20 days ago

    Not sure if it was propaganda, but I read that AMD desktop chips are designed to process a max of 12 PCIe lanes at once. 2x12*=4, whatever that means.

  • @junaidcg
    @junaidcg 3 years ago

    Hi Mike, another great video from you. Do you think that for heavy rendering there will be a significant difference between a desktop processor and a workstation processor — just like the test you did for the RTX A6000 vs. RTX 3090, where the difference was negligible until you tested with memory-intensive rendering?

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      I don't think so, as the CPUs are close in specs, so there should not be much difference in the test results.

  • @georgelevantovskij8593
    @georgelevantovskij8593 3 years ago +1

    This is extremely valuable info! Thank you for your content!
    I have the following setup: Ryzen 5950X, Asus Prime X570-Pro, 64 GB of RAM (DDR4 Kingston 3200 MHz), Asus TUF RTX 3060 (12 GB), 750 W PSU.
    Do you know if it is possible to add a second GPU (I have a leftover Asus RTX 3060 Ti), and how would it affect my workflow and render times in the following programs: Blender, After Effects, Premiere Pro, and also Unreal Engine 5? I was also thinking of getting a bigger PSU (1000 W).
    Thank you sir for your time and tips! This channel is pure gold!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +4

      Hi George, adding the second GPU will effectively limit the VRAM of the 3060 (12 GB) to the same amount as the 3060 Ti (8 GB), because 3D render software sends a copy of the scene to each GPU, and both copies need to be the same size. As for After Effects and Premiere Pro, you will not get much benefit, as both of these programs have very little multi-GPU support. Same with UE5.
      Thanks for watching.

    • @georgelevantovskij8593
      @georgelevantovskij8593 3 years ago +1

      @@MediamanStudioServices Thank you. Yes, I decided to build a second backup rig which will help me with Blender rendering.

    • @bobbywiederhold
      @bobbywiederhold 2 years ago

      @@MediamanStudioServices Is this the same for DaVinci Resolve? DaVinci being much more GPU-reliant.

  • @nawafalrawachy394
    @nawafalrawachy394 3 years ago

    Hi Mike,
    Thank you for taking the time to provide this valuable information.
    If I may ask, what is the keyboard you use in your videos? Thank you.

  • @TheMrJackzilla
    @TheMrJackzilla 3 years ago

    Great videos. Can you make one talking about PCs and workstations for architectural work?

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi Lucas, this is a great idea for a video. Let me look into the specs required — I am thinking it will mostly be the same as 3D and VFX computers, but I will do some research to confirm this.
      Thanks for watching.

    • @TheMrJackzilla
      @TheMrJackzilla 3 years ago +1

      @@MediamanStudioServices Thanks, it will be very helpful. I really miss content for small 3D artists. You have been helping me a lot!!!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      @@TheMrJackzilla Glad to be of assistance. Please share on your FB page to help grow the channel.

  • @andreasrichman5628
    @andreasrichman5628 1 year ago +1

    Does it affect southbridge temperature? Significantly, I mean.

  • @SubirajSinghFitness
    @SubirajSinghFitness 3 years ago

    Can you please tell me the system requirements for a plain RTX 3060 graphics card? I have a desktop that's quite old — say a 5th-gen Intel i5 with a motherboard just as old. Would this graphics card fit in it, or would I need to change my motherboard and processor as well to a 10th/11th gen?

    • @seanspliff
      @seanspliff 3 years ago +1

      I don't know if it helps, but I use a 3060 Ti with an Intel i7 and a 6-year-old motherboard.

    • @SubirajSinghFitness
      @SubirajSinghFitness 3 years ago

      @@seanspliff Is your i7 processor also 6 years old?

    • @seanspliff
      @seanspliff 3 years ago +1

      @@SubirajSinghFitness Yup, bought it in a prebuilt with the same motherboard. My motherboard is a Gigabyte X99-UD3.

    • @SubirajSinghFitness
      @SubirajSinghFitness 3 years ago +1

      @@seanspliff thanks bro 🍻

  • @ColinCharles3D
    @ColinCharles3D 3 years ago

    Hey Mike, thank you for your videos, they help me a lot concerning my future build.
    Indeed, as a 3D artist, I am looking to upgrade the build that has lasted me through my studies since 2016.
    As I'll work on a workstation at work, I am looking to build a personal desktop PC for 3D rendering (GPU-based Octane Render/C4D and Blender/Cycles), video editing (Adobe suite), and gaming (most likely 1440p, 34" ultrawide).
    My planned build would be:
    - 12th-gen i7 (liquid-cooled AIO)
    - 64 GB DDR4 RAM (DDR5 looks a little too expensive)
    - 2 TB M.2 NVMe + 500 GB SATA SSD
    - RTX 3080 10 GB (+ GTX 1060 6 GB from current build)
    - 1200 W PSU
    Which motherboard would you recommend that can handle 2 GPUs with at least 2x x16 PCIe 4.0 or 5.0?
    There doesn't look to be a lot of choice yet for the Z690 chipset that doesn't break the bank... maybe I should wait a bit?
    And do you think a 1200 W PSU is enough for 2 GPUs and the overall build?
    Thanks a lot Mike, just subscribed, your content is priceless!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +3

      Hi Colin, no desktop mobo will support 2 GPUs at x16; you would need to move up to an HEDT or Threadripper mobo. You can still get good performance from an X570 running the GPUs at x8, and I would recommend the ASRock X570 Taichi for running two GPUs. This board has great VRMs to manage the power, and the spacing to house the two GPUs.
      Good luck with the build.

  • @mohammedumar2348
    @mohammedumar2348 1 year ago +1

    Can you make a tutorial video on how to set up 2 different graphics cards (RTX 4090 and 4080) in one PC?

    • @MediamanStudioServices
      @MediamanStudioServices  1 year ago +2

      I will be making new content really soon. Thanks for the idea for a video!

  • @3dpixel583
    @3dpixel583 2 years ago

    Hi! Thanks for your comparison — this is exactly what I was looking for!
    I have a question, though: do you think the CPU difference actually contributes to the time difference?

  • @pranavsuthar5864
    @pranavsuthar5864 1 year ago +1

    Hey mate! I am a bit confused about whether to buy a gaming PC or a workstation PC for CAD work. I searched custom-build PC websites and found that the Z690 chipset supports both workstation (Nvidia RTX A2000) and gaming (Nvidia RTX 3080 Ti) graphics. My question is: can I put both of these cards in at the same time, so that while working in CAD software it uses the workstation card (for compatibility per the hardware certification), and while gaming it uses the gaming card?

    • @mikebrown9826
      @mikebrown9826 1 year ago +2

      Why use two GPUs? Just get the 3090 or 4090, as they also work well with CAD applications.

  • @DDesigner123
    @DDesigner123 3 years ago

    I'm looking to build a machine to help with real-time Cycles rendering in the viewport. I've seen the 3090 nearly render certain scenes in real time, and I want that level of performance, but getting a 3090 nowadays is hard. Would 2x 3080 outperform 1x 3090 in rendering, video editing, etc.?

  • @TILK3D
    @TILK3D 2 years ago

    Hello, good video. Sorry — is an SLI bridge needed to join the two graphics cards for 3D rendering, or can they be used simply by connecting both cards to the board? I have a 3090 Ti and a Zotac 3080.

    • @mikebrown9826
      @mikebrown9826 2 years ago +3

      No SLI or NVLink is required to render on both GPUs.

    • @TILK3D
      @TILK3D 2 years ago

      Thank you very much for answering and clearing up my doubt. Greetings and success!

    • @mockingbird1128
      @mockingbird1128 2 years ago

      @@TILK3D So I can't use 2x RTX 2060?
      It doesn't support SLI.

  • @Verlaine_FGC
    @Verlaine_FGC 1 year ago

    Does the generation of the PCIe slot factor into this as well?
    Will two GPUs that are meant to run in PCIe 4.0 x16 slots still take a performance hit if they're on PCIe 5.0 x8 lanes?
    So no matter what, the overall performance of the link is the highest MUTUAL rating between the slot and the GPU in it?
    So a GPU that is meant to run in a 4.0 x16 slot will be reduced to 4.0 x8 performance when mounted in a 5.0 x8 slot?

    • @omidnt2992
      @omidnt2992 1 year ago

      If your graphics card is PCIe Gen 4, then putting it in a Gen 5 PCIe slot will not increase performance, because your card only knows how to talk the "Gen 4" language. Now, if you put two PCIe Gen 4 graphics cards into 2 PCIe Gen 5 slots on a platform with only 20 or 24 PCIe lanes, you will have two links at PCIe Gen 4 x8.
      For rendering or AI, that speed reduction won't impact your performance much.
      For gaming, you will almost never want to use 2 graphics cards, because almost no game is coded to work with multiple GPUs. Maybe you can assign physics to one card and everything else to your main card, but as I said, don't do it for gaming — too much heat and electricity for little or no improvement (sometimes the heat will even reduce your gaming FPS).

  • @theWanderer521
    @theWanderer521 3 years ago

    Great content as always! I don't know if this is worth making a video about, but I would love to see one on NVLinked GPUs — performance, installation, configuring the settings.
    I was creating a scene in Blender the other day and ran out of VRAM on my 3090, lol... I might have used the wrong settings, not optimized the scene well, or used higher-res textures than I should have. My future plan is to build a workstation with multiple GPUs (currently I have my eyes on the H3 platform, but I might need more VRAM + GPU power to render). I'm using a 3970X with a 3090.
    Thanks!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Hi theWanderer521, I would love to do a video on two RTX 3090s; trouble is that I do not have two 3090s anymore. But if I can get my hands on two GPUs, I will do a video on setting up NVLink.
      Thanks for watching.

    • @rakshithr4230
      @rakshithr4230 2 years ago

      @@MediamanStudioServices Waiting to see this kind of video — please make one.
      I have planned to build a PC with an i9 12900K and 2x RTX 3090. I want to know what the performance would be with NVLink.

    • @rakshithr4230
      @rakshithr4230 2 years ago

      @@MediamanStudioServices In your experience, is that a good idea, or should I go with a workstation?

  • @jinron24
    @jinron24 2 years ago

    Here are two questions: first, if two x8 slots are only 5-9 percent slower, why is that? And second, is there a way to fit complex scenes on limited-memory cards?

  • @johnjr.8824
    @johnjr.8824 2 years ago

    Thanks — I was going insane not knowing why my 3080 was running at PCIe 3.0 x8 instead of x16.
    2 M.2 SSDs and an Intel i9-10900K. Can't get it to PCIe 4.0 since the chip doesn't support it.

  • @deadmikehun
    @deadmikehun 2 years ago

    Awesome video! However, I have a question. Do SATA drives use any PCIe lanes, or do they go through the dedicated x4 link that serves the chipset? I have an Asus B450-F Gaming motherboard that I currently use with a GTX 1070. I have ordered an RTX 3080 Ti and want to use them simultaneously in Blender, but I'm not sure whether I should upgrade my motherboard. The manual says that in a dual-GPU configuration GPU1 would get x8, GPU2 would get x4, and the NVMe SSD would get x4. I'm also using 4 SATA HDDs for storage; if they use any PCIe lanes beyond the 4 required for the chipset, I will lose a lot of GPU performance. Is it even worth using a secondary 1070 with a 3080 Ti? It's still CUDA cores, though... Thank you!

    • @mikebrown9826
      @mikebrown9826 2 years ago

      SATA goes through your chipset, not dedicated PCIe lanes. As for using the 1070: there are some advantages, but with only 4 lanes it may be better to just use the 3080 Ti at full speed with all 16 lanes.

  • @MrExo_3D
    @MrExo_3D 2 years ago

    I would like to see a test where one GPU is at x16 and the second one is only at x4.

  • @divyasjj1136
    @divyasjj1136 1 year ago +2

    Hey, I want to know: is it possible to use a 1660 Super and a 3060 Ti in the same PC? I want to use them for Blender rendering — can I render with one graphics card while working with the other one?

    • @8D2BFREE
      @8D2BFREE 1 year ago +1

      You can use two different models working together, but I never tried a 1660.

    • @swdw973
      @swdw973 1 year ago

      Blender will defer to the card with the least memory so it doesn't crash. My understanding is that if you have one GPU with 8 GB and one with 12 GB, it will treat both as 8 GB GPUs when assigning rendering. The Blender forums will be able to give you a definite answer.

    • @viniartee
      @viniartee 10 months ago

      @@swdw973 It appears that when you have a GPU with more VRAM, and the less powerful GPU runs out of memory while your system has ample RAM, Blender can utilize the system's RAM to support the less powerful GPU. Consequently, the more powerful GPU can continue to operate at full capacity using your system's RAM. For instance, if you possess both a 3090 and a 3060 Ti GPU, along with a system equipped with 64 GB of RAM, Blender might use the 64 GB of system RAM to match the performance of the 3090. The efficiency of this process is enhanced by faster RAM. Additionally, optimal performance would benefit from a well-configured RAID system with satisfactory HDD speeds. This setup should allow Blender to function smoothly without constraining the performance of the more powerful GPU, correct?

    • @swdw973
      @swdw973 10 months ago

      @@viniartee System RAM will never get anywhere near the speed of VRAM. My answer was based on info from a Blender developer on one of the Blender forums.
      I actually run 2 different GPUs, a 3060 and a 4070, but I made sure they are both the 12 GB versions because of what the developer told me.

  • @Slacker28
    @Slacker28 1 year ago

    I don't know if you can answer this, but I'm rendering on a high-end gaming PC using Stable Diffusion, and I just want to know if x8 PCIe Gen 4 would reduce performance, because I want to put more NVMe drives into the machine and I don't want to do it if it's going to compromise my AI rendering.

  • @DasunGunathilaka
    @DasunGunathilaka 3 years ago

    Hey, thanks for the video.
    I have a question:
    Is there a difference in video quality when you export with hardware acceleration enabled (Nvidia, AMD, QuickSync) compared to just using the CPU for video decoding and encoding?

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi Dasun Gunathilaka, there is always some difference when using a different encoder, be it hardware or software, but the difference should be very small and the quality should be about the same.
      Thanks for watching.

  • @jesusleguiza77
    @jesusleguiza77 3 months ago

    I need a setup with a 48 GB pool of GPU VRAM for LLMs. I thought about 2x RTX 3090 — could you recommend an economical configuration? B550 or X570?

  • @georgeluna6217
    @georgeluna6217 2 years ago +1

    I am using 2x 3090 Ti with a 5950X (so x8 PCIe each), but wondering what the benefits of NVLink are... if any.

    • @georgeluna6217
      @georgeluna6217 2 years ago

      @Claus Bohm I bought a 3-slot NVLink bridge from Nvidia for the A5000/A6000 that is supposed to be compatible with the 3090 Ti, but there is no way to make it work. I tried everything: fresh install of Windows 10, driver updates, BIOS update — nothing. Not sure what the problem is.

    • @kreimer1417
      @kreimer1417 2 years ago

      @@georgeluna6217 Maybe it's dependent on the motherboard (ones with SLI support are sure to be compatible with this kind of setup), so check the motherboard specs on the company's site. That NVLink bridge for the A5000/A6000 should work with the RTX 3090 / RTX 3090 Ti.

  • @leiluo588
    @leiluo588 2 years ago +2

    I am using a 5950X with 3x RTX 3090. The system is water-cooled.

    • @fsayyed
      @fsayyed 1 year ago

      Do you get a big reduction in performance by adding a third GPU — I mean, does it scale back? I'm sure it still renders faster than 2x 3090.

  • @ozztheforester
    @ozztheforester 3 years ago

    Thanks, this is very informative! Would it be possible for you to conduct a similar test using TB3 eGPU solutions daisy-chained?

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Hi ozztheforester, if I can get my hands on such equipment I will do a video. Here is a link to some reviews:
      egpu.io/best-egpu-buyers-guide/

  • @ahmedsouody1765
    @ahmedsouody1765 2 years ago

    Thanks for the great content. What is the power supply capacity for the desktop?

  • @Vysokovart
    @Vysokovart 3 years ago

    Hey, I have a problem connecting two 3090s through the NVLink bridge. I use Blender 3.0 stable to render my work.
    B550M Pro4 motherboard
    Ryzen 5900X processor
    I cannot find information online on how to correctly connect the bridge and speed up Blender. Please help.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago

      Your motherboard needs to support NVLink. But have you tried to run a render with the two GPUs separately? Connecting with NVLink is not necessarily going to speed up your rendering.

    • @djuansayjones6178
      @djuansayjones6178 2 ปีที่แล้ว

      @@MediamanStudioServices hey I saw that the 3080 or less only uses about 8x. But the 3090 uses more and would lose a lot of performance. Have you tested this out yet? Or do you know it's true?

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +1

      @@djuansayjones6178 I am not sure of the PCIe bandwidth each card actually uses, but making an educated guess, the RTX 3080 does use more than x8 of bandwidth at full performance — so no, that claim is untrue.

    • @djuansayjones6178
      @djuansayjones6178 2 years ago

      @@MediamanStudioServices ok thanks a lot!!!

  • @superstite21
    @superstite21 3 years ago

    Great video! How does rendering software handle two GPUs in a non-SLI/CrossFire/NVLink configuration vs. with SLI/CrossFire/NVLink?
    Thank you!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +4

      When I get some more GPUs I will test this out and do a video.
      Thanks for watching.

  • @arasdarmawan
    @arasdarmawan 2 years ago

    Hey Mike. Is there a way to stack 6 blower-style RTX 3080/3090 GPUs on a single workstation motherboard? I've also been looking at mining motherboards (which can fit 6-12 GPUs), but they reduce each PCIe link to x1. Your video here showed very little difference between x16 and x8, so I'm curious: is x1 pushing it way too far for a GPU render farm? Need your advice.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +2

      My question to you is: why would you want to do this? Just buy a GPU server for this task — Gigabyte makes a great server for exactly this style of workflow. Anything over 3-4 GPUs in one workstation requires a server chassis.

  • @drengot1066
    @drengot1066 3 years ago

    Really informative.

  • @cliffordcrawford5723
    @cliffordcrawford5723 2 months ago

    OK, how about a GPU and an accelerator — an Instinct MI100, maybe? x8/x8 🤔

  • @FelixTheAnimator
    @FelixTheAnimator 1 year ago +1

    Does the mobo HAVE to be CrossFire/SLI ready to run two GPU cards?

    • @mikebrown9826
      @mikebrown9826 1 year ago

      No, it just needs to supply x16 or x8 PCIe, preferably directly from the CPU.

  • @mridulkumar8574
    @mridulkumar8574 4 months ago

    Could you make a video on Quadro cards — can they handle multiple 4K monitors?

  • @perionation
    @perionation 2 months ago

    Thank You😊