so glad I watched this video! Thank you. I was just about to pull the trigger on an ultrabook with thunderbolt and get an eGPU for desktop use. Now I can see it's not worth it yet. A gaming desktop and a cheap laptop or tablet would be a better use of money.
A better comparison would be testing different eGPU setups to see which graphics cards are pointless or most efficient given the bottleneck. For example, does a 2070 or 3090, or even a 1070 eGPU get similar framerates given the Thunderbolt bottleneck? Also, you mention USB4 is faster, so does this mean the upcoming GPD Win 4 with USB4 will have better bandwidth and better performance with eGPUs?
@@XxGAMERxXPS3 100%. Who knows how long it'll be before we see PCIe 5.0 M.2 adapters and (more importantly) laptops with PCIe 5.0 M.2 slots though? With how cheap used GPUs are I decided to just build a gaming PC (5700 XT and i3-12100). Only cost about $600 and should beat the performance of most eGPU setups (assuming no RT).
@@vinyfiny There are laptops with PCIe 5.0 M.2 slots; I believe MSI has one. Give it 1-2 more years and we will see more laptops with PCIe Gen 5. Hopefully we get a USB5 or whatever TB with Gen 5 x4.
We desperately need Thunderbolt 4 docks to properly use these beefier cards. Even with the 10 series, TB3 was failing to supply a good chunk of the bandwidth the card could use. eGPUs are still an amazing concept and I hope someone steps up to give the standard even more improvements.
Hey Jarrod, would much rather a comparison between desktop and eGPU performance for a card like the 1650 or 3060 or something in that range because people who can afford a 4090 probably won't have any problems buying other PC components
Thunderbolt is far too limiting for an eGPU. I think the 2080 would be the most I would pair it with, and even then only if your laptop doesn't have a half decent video card. I wonder if the iGPU in the HX was interfering with it. The HX line uses the older Intel HD graphics. I would try disabling it via Device Manager.
Yep. eGPUs were always a terrible idea unless you wanna connect an ultrabook (like an LG Gram or something) to a mid-range GPU. But even then some bandwidth would be lost. Also, hi!
@@tabalugadragon3555 hey Sergei! It has actually reminded me. I have the Razer Core and Alienware Amplifier. I should sell those oversized paperweights, lol
I took the liberty of posting this to r/egpu. There's some discussion there already about 1 or 2 things in the analysis. First is about Halo Infinite behaving differently with bugged driver versions (522.25; this appears to have been discussed in Jarrod's Discord). Second is about the HX variant of the 12900 mobile CPU apparently having both the PCH and Thunderbolt controller separate from the CPU, unlike the 1260P and (apparently) the 12x00H parts (no X, such as the 12900H). The latter could explain the boatload of results where the 1260P had an edge over the 12900HX, and is a situation that dates back to Ice Lake, where the integrated PCH and controller brought a noticeable improvement for eGPU use.
I'm really interested to see how this plays out next year with Zen 4 and 13th gen laptops, ESPECIALLY Zen 4 Dragon Range which focuses on CPU performance and is made to pair with another GPU. I'm thinking a possible 50% jump in performance in a setup like this.
Dragon Range will not be available for at least 5-6 months though, so you're gonna have to wait quite a long time. At least the 13th gen laptop CPUs are supposed to launch in Jan 2023.
You don't really need a 4090 for the eGPU setup. If you get, for example, a Razer Book and pair it with a 3060Ti, you can get stellar performance and you still have an advantage that a desktop doesn't have - you can unplug and get great battery life on the go allowing you to get some work done, do presentations, etc, on one device.
Wonderful video, Jarrod. Can you also test the eGPU + laptop performance in Blender and Redshift rendering? I currently have the Legion 12700 + 3060, wondering whether to buy an eGPU to boost my laptop or to sell my laptop and buy a whole new 4090 setup. Thanks bro.
Hey Jarrod, the 12900HX was performing worse because 16 Gen 4 lanes were connected to the 3070 Ti, leaving the i9 with 4 to spare, of which 2 lanes go to storage. That leaves 2 lanes of Thunderbolt for the 4090, while the 1260P had 4 Gen 4 lanes on Thunderbolt.
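The lane accounting in the comment above, written out as a quick sketch; all figures are the commenter's claim about this particular laptop, not a verified general rule for HX boards:

```python
# Lane accounting as claimed in the comment above (hypothetical figures,
# not verified against the actual board layout).
cpu_lanes = 20        # claimed total CPU PCIe Gen 4 lanes on the 12900HX
dgpu_lanes = 16       # claimed: x16 wired to the internal 3070 Ti
storage_lanes = 2     # claimed: x2 to the SSD

tb_lanes_hx = cpu_lanes - dgpu_lanes - storage_lanes
print(f"lanes left for Thunderbolt on the HX machine: x{tb_lanes_hx}")
print("lanes claimed for Thunderbolt on the 1260P machine: x4")
```

If the claim held, the HX eGPU link would have half the lanes of the 1260P one, which would line up with the gap in the results.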
Is this really the explanation? I didn't see reduced performance with my eGPU 3070 on my Spectre 16 with RTX 3050 vs my Spectre 14 tbh, and if this were true the same explanation should have applied there.
Thanks for another good eGPU video Jarrod! I would really like to see a full eGPU setup guide video from you! I've been fighting with an eGPU to work properly for months now, but it's hard to find any good troubleshooting resources or guides out there, especially videos. I found out that there is an issue with Windows 11 and TB4 drivers with Intel 12th gen CPUs: the Thunderbolt controller isn't present in Device Manager. I had to install Win10 just to connect to my RTX 3090 / Mantiz eGPU. But even now, performance is better with the laptop's built-in 3060 card in almost every single case than it is with the 3090. External display with the laptop screen and GPU turned off doesn't make any noticeable difference either, which is weird as well. I imagine that I would know a lot more about TB issues and other eGPU errors if you had included some of that installation process in your videos, because I've watched all of your eGPU videos and will keep doing so! 🙂
Thank you for testing with the monitor directly connected to the eGPU card. Great update from the 2020 video. I have an RTX 3070 and decent gaming desktop components on the way, and already own a GTX 1080 in an eGPU setup. I plan on running my own quick & dirty relative tests for all 4 setup combos, and will be happy to post results here if I can remember to do so. My eGPU is limited to Thunderbolt 3 though.
For better performance you could hook up directly to the PCIe 4 SSD slot. It carries none of the bottlenecks of Thunderbolt and should get twice the bandwidth, assuming you can get it to work?
Would have been nice to see you use an M.2 slot for the eGPU as well, because the speed would be way faster. No one I have seen has done it this way while also using a 40 series GPU. Would be nice to see the results compared to the 7950X etc.
At this moment in time, for compact portable gaming, the way to go is an ITX sandwich case instead of a laptop. It's not like laptops are used to play while you're on the move in the plane or train or whatever, but to unpack, plug into mains power and use at the various destinations. The laptops used on your lap far away from a power plug are those ultrabook things you use to drop in loads of photos, do a quick Photoshop edit and upload to some blog. Not for gaming. In the future I hope case manufacturers will expand the compact ITX cases with a very small depth, meaning the GPU and the mobo sit on the same plane side by side, and develop a CPU cooling system that isn't a high tower but has a small height and a large surface area on the horizontal plane. This would result in a similar footprint to 17" laptops with a slight height increase, thus useful to carry around. With a portable 17" monitor you end up with a full gaming rig in a volume not much larger than a gaming 17" laptop.
If the 4090 directly outputs to an external monitor, that will minimise the side effects of this setup, and I guess you can then get a fairer look at the CPU bottleneck.
Very interesting video! Could you do a comparison between thunderbolt and m.2 to pcie x16 adapters? They also have only 4 lanes of course, but they don't have the other thunderbolt limitations.
Can you try this setup again with DLSS3 and Frame Gen enabled? Maybe even include comparisons with other 4000 series cards? I saw on some boards that frame gen compensates significantly for the drop with TB3/4 bandwidth.
Honestly I think manufacturers have forgotten about eGPU solutions. When was the last time we saw a new one come to market? The XG Mobile is sadly the best one currently, and it is so expensive that honestly you might as well grab a mid-range desktop PC and a budget gaming laptop and probably come out ahead. I really miss Alienware's eGPU. I know it was proprietary, but it had much better performance. I really wish Dell was putting all that money towards making a v2 and making it a no-brainer choice to buy their machines.
Oh my fudge, I was literally planning on doin this in the future since I just bought a Razer Blade 17 3070 Ti 150w TDP, and I'm planning on making this my full setup. I'm happy as hell you can try it out first.
I'd love to see: 1) high spec laptop with discrete graphics (eg. Xps17) vs egpu vs desktop. I'd like to see what sort of gap there is. 2) instead of games, I'd love to see content creation blender, unreal engine, davinci, premiere pro, after effects.
@6:36 Thanks for the clarification that eGPU with RTX 4090 is even worse than the RTX 3070Ti Laptop GPU, otherwise I was under the wrong impression that if 4090 Desktop is 4x > 4090 eGPU, then it must be at least 10x Better than my RTX 3070ti laptop GPU.
This egpu enclosure was released in tandem with the 12.5 inch razer blade stealth. When it debuted it featured soldered ram and 7th gen intel processors with 15w tdp. It was a fine laptop but it had massive bezels and the build quality left a bit to be desired. Really these enclosures only make sense with ultrabook type devices. If you already have a gaming laptop there really is no point. And even still, is it really worth spending all that money on a desktop class card to get such middling performance for how much better the card can do? Back in the day it made more sense because the cards weren’t as powerful and the thunderbolt bandwidth wasn’t as much an issue but nowadays they make no sense. The egpu enclosure over thunderbolt is a lost cause until the day there’s a new cable standard with enough bandwidth to take full advantage of the cards of that time and beyond.
Actually, the reason NVIDIA recommends such a high-output power supply is that the 3090/4090 is capable of power-draw spikes that can jump WELL over its TDP. Linus Tech Tips talked about this during one of their builds. If you cheap out on the PSU, these spikes can trip its circuit protection and that's a hard reboot.
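To make the commenter's point concrete, here's a toy headroom calculation; the spike multiplier and rest-of-system draw are illustrative assumptions for the sake of example, not NVIDIA's figures:

```python
# Illustrative transient-spike headroom check (all figures are assumptions,
# not official NVIDIA numbers).
gpu_board_power_w = 450   # RTX 4090 rated board power
spike_factor = 2.0        # assumed worst-case transient multiplier
rest_of_system_w = 250    # assumed CPU + platform draw

peak_draw_w = gpu_board_power_w * spike_factor + rest_of_system_w
print(f"assumed worst-case transient draw: ~{peak_draw_w:.0f} W")

# A PSU sized only for steady-state draw (450 + 250 = 700 W) could see its
# overcurrent protection trip on such a spike, hence the oversized recommendation.
for psu_w in (750, 850, 1000, 1200):
    verdict = "enough headroom" if psu_w >= peak_draw_w else "may trip OCP"
    print(f"{psu_w:>5} W PSU: {verdict}")
```

With these assumed numbers, only the 1200 W unit clears the transient; the exact threshold obviously depends on the real spike behavior and the PSU's protection tuning.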
Next gen Thunderbolt is fine and dandy, but given that the GPU enclosures have their own TB controller onboard, they will also have to be upgraded to suit. The market for them seems to be pretty much dead now.
Not true. Just connect an external monitor to your eGPU box and you will see that it works much faster. If the internal display is in use, TB3(4) sends data in both directions. Just imagine someone who has the eGPU but no external monitor.
It seems it's better to have a gaming desktop, and when we'd like to play on a laptop, just stream games from the gaming desktop PC to the laptop (e.g. using Moonlight). Thanks for the interesting video!
Great video. Now I'd be curious to see how an all-AMD Advantage laptop with the highest end specs, like the ROG Strix G15 Advantage Edition with Ryzen 9 5980HX / 6800M, would perform with this eGPU setup. This is a great setup for anyone who's a creative or engineer/coder on the go. You could switch back and forth depending on your workload and just have your laptop in a cradle and the eGPU next to it, ready to go with just a monitor and keyboard/mouse.
eGPU is only good for office work and video playback. Most older laptops can't do multiple 4K monitors without turning into a loud heater. The eGPU offloaded heat, ran silently, and allowed me to have more than 4 monitors (combined with the integrated GPU output). But newer GPUs are so efficient they have no issue driving four 4K monitors, so I got rid of my eGPU setup.
I still love this setup since it still offers flexibility when you want to play desktop games at high speed. However, I do think it's about time we see eGPU brands make a new enclosure with a larger size and Thunderbolt 4 ports. I checked that both the 4080 and 4090 have a 61mm slot width, and neither the Razer Core X Chroma nor the AKiTiO Node Titan can fit them.
Ever since I got a 4K external monitor to use with my laptop I started considering buying an eGPU, but this video is self-explanatory. The only thing I can think of is that obviously the interfaces can't keep up with the new top of the range, plus the fact that gaming laptops are evidently not designed to work with Thunderbolt the way a laptop without a dedicated graphics card is. The only solution I can think of is a new interface, similar to the one used by the ASUS ROG, which is completely dedicated to the GPU and fitted by default in gaming-oriented laptops, but that would only raise their prices.
So, if you are only able to get hold of laptops that are not super powerful, a cheap eGPU setup is your go-to option… though hope nothing goes wrong on the first try; you have to hope you won't need to troubleshoot or mess with software or hardware (common with eGPU setups), and also pray for no whitelisting blockades.
I can't believe in the year 2023 we still don't have a good way to use a dGPU with a laptop. Thunderbolt 5 seems like it'll finally be the one, but tbh a simple x8 OCuLink port would do just fine. The PCIe lanes are there and it doesn't have to be more expensive to add this port.
Hey, just bought a GPD Win 4 and am enjoying it, but thinking for the future about docked usage with an eGPU OCuLink setup. Do you think a 4090 or 4090 Ti would be worth it in that case? And on that note, if it is, would a 4090 Ti fit in that case?
It's really weird that the gaming laptop had a much bigger bottleneck than the Zenbook when it should have more internal PCIe lanes and the power to handle them. If you can, you should talk to Intel and ask for an explanation of the problem.
More lanes don't matter if you're limited to PCIe 3.0 x4 no matter what. Thunderbolt 3 is outdated; it was already shaky for 10 series GPUs, but it's pretty much unusable for 40 series. Even SSDs now offer more than double the speed TB3 is capable of supporting. We've now reached GPU speeds where we benefit from saturating PCIe 4.0 x16, which is equivalent to PCIe 3.0 x32; compare that to x4 and you can see why they perform suboptimally.
I've noticed some drivers just do not work with an eGPU... It's not well enough supported, and Win 11 made it a lot worse; these tests need to be done in Win 10.
But what if you're connecting external displays to the eGPU and not running through the laptop screen? That's what I'm keen on: can I run my laptop as the CPU/OS with everything else connected to a dock and eGPU via TB3/4? Anyway, maybe a test for next time?
That's what he misses in this video: the bandwidth seems low because half of it is going to feed the laptop screen, which also adds more load to the CPU. If he was using an external display he'd be getting the full 4GB/sec of bandwidth.
Find out how the ASUS ROG Ally runs with RTX 4090 next! th-cam.com/video/gNlgrxNlt7E/w-d-xo.html
DOES IT HAVE A USB VERSION
Do I have to use this GPU or can I find a cheaper one?
this is not a usb c but a thunderbolt cable
newer In Flames sucks bro
Would like to see in your analysis the OCuLink or PCIe x4 eGPU instead of Thunderbolt, as it has more bandwidth. 😊
I believe the main reason for the performance discrepancy between the two laptops is the way Thunderbolt 4 is implemented. The i7-1260P has TB4 built into its main CPU die, while the i9-12900HX does not have native TB4, so it requires another IC, the Intel Maple Ridge TB4 controller, which may be causing problems.
Here with the HX CPU, because of the discrete Thunderbolt controller, the data flow has to go through the motherboard chipset first and thus has to compete with other things like storage or WiFi. In-CPU Thunderbolt implementation is definitely the way to go for the best eGPU performance, as it means the eGPU is pretty much directly connected to the CPU's PCI Express lanes.
TB4 has not changed bandwidth with respect to TB3. It still has 4 lanes of PCIe 3.0, while the 4090 is a 16-lane PCIe 4.0 card, or 8 times more bandwidth. That's why it runs like crap.
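For anyone wanting to sanity-check the "8 times" figure, a quick back-of-the-envelope sketch using spec-sheet per-lane rates (protocol overhead ignored):

```python
# Approximate one-direction PCIe link bandwidth from per-lane spec rates.
GBPS_PER_LANE = {3: 0.985, 4: 1.969}  # GB/s per lane, after encoding overhead

def link_gbps(gen: int, lanes: int) -> float:
    """Nominal one-direction bandwidth of a PCIe link in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

tb_tunnel = link_gbps(3, 4)      # TB3/TB4 tunnel a PCIe 3.0 x4 link
desktop_slot = link_gbps(4, 16)  # a desktop 4090 gets PCIe 4.0 x16

print(f"TB3/TB4 eGPU link: {tb_tunnel:.2f} GB/s")
print(f"PCIe 4.0 x16 slot: {desktop_slot:.2f} GB/s")
print(f"ratio: {desktop_slot / tb_tunnel:.0f}x")
```

In practice the Thunderbolt figure is even lower, since only part of the 40Gb/s link is reserved for PCIe data, so the real-world gap is wider than 8x.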
Probably; I just assumed they were the same, as the Intel Ark page didn't really make it clear that there were Thunderbolt differences, so this seems to be a downside for HX.
@@JarrodsTech quite a shame you went through all that trouble with a problematic laptop instead of an ideal higher-TDP one, such as a 12700H, 12800H or 12900H
@@AdriMul Well said!
Welcome to the new episode of which I can't afford
I can afford that but the price/performance is so bad.
Come back on Black Friday, (better be quick when that day is there.)
@@alanharperchiropractor exactly, the 80 series is even worse: 75% more over the 3080 10GB, a 95% price increase in Europe, and for that you're getting 50-60% more performance with the 4080 16GB. Jensen is seriously taking the piss.
Even if you can afford it, it'll be not worth the bucks.
@@alanharperchiropractor yep, the 3060, which is the most cost-effective GPU btw
I can confirm that resizable BAR is an issue for Thunderbolt eGPUs. Under Linux I've had to outright disable it in order to increase eGPU performance. Fortunately, with Linux I can set kernel boot parameters that let me do so regardless of BIOS settings or anything. I have no idea if Windows has any similar options or functionality.
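For readers who haven't set kernel boot parameters before, this is roughly how it's done persistently on a GRUB-based distro. The comment doesn't name the exact parameter used to disable resizable BAR, so `<your-parameter>` below is a placeholder to replace, not a real flag:

```shell
# Add a kernel boot parameter via GRUB (generic mechanism, shown with a
# placeholder; substitute the actual parameter you need).

# 1. Append the parameter to the default kernel command line:
sudo nano /etc/default/grub
#    ...edit the line to read, e.g.:
#    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash <your-parameter>"

# 2. Regenerate the GRUB config:
sudo update-grub          # Debian/Ubuntu
# sudo grub2-mkconfig -o /boot/grub2/grub.cfg   # Fedora/openSUSE style

# 3. Reboot, then confirm the parameter is active:
cat /proc/cmdline
```

The same mechanism works for any boot-time kernel option; only step 1's parameter changes.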
Considering moving to Linux rn, since my company just ditched Office and is using Google Sheets. Like how many hours are needed to set it up? Does SteamOS have any major problems using an eGPU with USB4?
Is it possible to run the eGPU with the old drivers without resizable BAR?
@@alexamderhamiltom5238 Depends; any distribution with a graphical installer should take less than 30 min to install. If it's your first time trying Linux, I reckon Ubuntu would be a good option.
This is great, but I wanted to see a benchmark comparison before and after the eGPU on the same laptop as well.
It would be horrendous with the smaller laptop without a dedicated graphics card.
just search up intel hd graphics 770 or something like that, and you will get the results
I'd be most interested in testing the 3070 Ti with the i7-1260P compared to the 3070 Ti in the tower. I think this is likely true for 95% of others watching this video, as very few want to drop stupid money on a 4090. Something even more common, such as a GTX 1080 or 1070 eGPU i7-1260P / tower comparison, would be even better.
Agree. 4090 as eGPU is simply stupid.
Most people want an eGPU not because they want the ultimate performance, but because they don't want to buy both a laptop and a desktop.
I'm running a 1240P with a 3060 eGPU; it's pretty fine. But a laptop with a 3060 can outperform it, I guess.
@@bogdand.4987 Is it possible to connect an eGPU (RTX 3080) to my laptop, an MSI GP62 6QF, with Type-C 3.1?
USB-C ports look identical to Thunderbolt ports. If it's just USB-C 3.2 it won't work; it has to be a Thunderbolt port.
Exactly. A comparison of "eGPU vs dGPU" with the "same" card is the comparison I want. I don't need gaming on the go as much, but I would like to use the same laptop for everything and connect it to the eGPU for gaming and 3D modeling at home.
Jarrod is just wonderful. All that testing. Massive results.
Jarrod, I bought a Razer Core X several years ago as a gamble to boost my outclassed old i7-7700HQ Omen 15 1060 Max-Q, using an RTX 2060. It benchmarked at approx 95% of the average desktop 2060, and was amazing tbh. Maybe pushing the limits, it was upgraded with a 3060 Ti last year, and STILL benchmarked at 95% of the desktop average for that card, and (with CPU undervolting, GPU overclocking & 16GB RAM) I can still play competitively with my ancient 7700HQ laptop, e.g. 105fps in Battlefield 1 at ultra. But I do think this is the limit for TB3/4, and my next PC will have to be a desktop build; I can't see an eGPU being viable for an expensive RTX 40-series graphics card on any laptop. RIP eGPU beyond the RTX 3060 Ti.
Lower-tier GPUs use less bandwidth, that's why. TB3/TB4 is limited to 40Gb/s, which lower-end GPUs will typically cap out below anyway, even in a normal PCIe slot in most desktops; they just don't use the rest of the throughput. The 4090, though, DOES utilize more than 40Gb/s because of how fast and data-heavy it is, which is why it performs so poorly compared to a normal desktop setup. Sadly, Thunderbolt 5 with 80Gb/s will need to be released for eGPUs to become viable again, because of how bandwidth-dependent newer GPUs are compared to cards from 2015-2017 (when the Core/Core X was in its heyday).
Thanks for the amazing vid.
eGPUs have never stopped holding my interest, I just wish they got better with new tech.
Jarrod is a tech scientist at this point. Running various experiments to advance the gaming life
Basically, if you can afford a 4090, you can afford a PC.
Common sense bro
you can afford both a PC and a laptop
@@GewelReal honestly... the biggest issue with a 4090 is the cost because i can buy a pretty ballin gaming laptop for 1600
@@wnxdafriz yet the 4090 desktop will be twice as fast as a 3080ti laptop, if not more.
Affording a PC is not the issue; portability is. That's the reason people pay more for a laptop... If I chose to build a PC I could get so much more real bang for the buck. People don't buy gaming laptops to save money.
Yeah. I think it's the fact that the higher-end laptop has a dedicated GPU. I remember in the early years of Thunderbolt 2/3, it was easier to install and run an eGPU on an iGPU-only laptop vs a laptop with a dGPU.
Shocker: eGPUs are still a terrible buy if you put anything above a low-to-mid-range card in them. Not changing anytime soon either, seeing as TB4 has the same bandwidth as TB3.
If you put a low-end card in it, it's also not really worth it, because the eGPU enclosure costs as much as a low-end card (I believe?). Honestly, I think the best application of an eGPU is for workloads that depend less on PCIe bandwidth, such as compute, rendering or video editing. Sadly, it seems gamers predominantly use it instead.
Except with the Xg Mobile from Asus 🙃
eGPUs are a lifestyle where you throw out performance per dollar and will never be considered for budget builds
@@sparkz6381 Also throw out raw performance as well
Unless you use pcie 4 or 5 SSDs slots..
Or disable resizable bar.
I am surprised no one has yet made an eGPU that can combine 2 TB ports for double the bandwidth, as many laptops have two TB ports now?
Just FYI, Halo Infinite needed to have ReBAR turned off in Nvidia Profile Inspector for driver 522.25 for it to work right on my eGPU 3070. I assume that's the case here, alongside Watch Dogs: Legion. With my 3070 I get 70-90 FPS at 4K and 130-160 FPS at 720p, so more graphics horsepower like the 4090 has would definitely have made the eGPU shine, as it's not that bandwidth-limited.
Nice info, I will remember this when I get an eGPU setup in the future
@@naufalkusumah2192 yeah if you need instructions I posted them in the eGPU sub, just search it up as my comment kept getting deleted when I posted the link
@@omegamalkior1874 In Reddit?
@@omegamalkior1874 I’m curious about that, if there’s a link or a place to find that info let us know
@@killertruth186 yes
If there's a way to do it, it would have been interesting to see one of the worst-performing games for the laptops in a setup where the desktop is using the eGPU, making it possible, for example, to check the impact of resizable BAR, or maybe to use it as a best-case scenario.
Very informative video. I'm using an eGPU for work because I don't own a desktop. So it would be interesting to see how the rendering speed compares using Twinmotion or Lumion. To explain: I'm an architect by profession. My laptop goes around with me between the office, client presentations, job sites and home. And I have an eGPU for the office and home.
Similar here.
I got the impression an eGPU works best with laptops without dedicated graphics.
It would be interesting to test again with the eGPU driving an external monitor instead of sending video data back to the laptop, considering the eGPU would normally be used in a desktop role, and most people use an external monitor at home/work with the laptop anyway.
Isn't that what he said he did?
Good review. I'd thought about having an external GPU with an HX CPU, but not based on these results. Looks like an HX with a 3070 or 3080 is in my future. Depends on Black Friday pricing.
The question here is, why don't you perform the tests with USB 4 (40 Gbps)? That's the top tier in the eGPU market now.
I've looked all over for USB 4 eGPUs and I can't find any. Where are they?
Or with oculink egpu setups
You mean TB4
I use the eGPU (Dell XPS 9710 with an eGPU 4060 Ti 16GB) when I'm running batches of denoising in Lightroom.. it makes a world of difference. Of course my RTX 3090 with Intel's 13900K desktop is faster.
I heard that a PCIe M.2 connection is more direct and has less overhead than Thunderbolt. It would be interesting to compare with this connection as well, who knows. Of course with a 4-lane PCIe setup, as there are plenty of 1-lane setups on the market.
I think OCuLink does that. No USB4/Thunderbolt overhead. A very janky setup, but it has better compatibility with most games.
Maybe the best implementation of an eGPU is a first-party one, e.g. the ROG Flow; I recall that using Thunderbolt reduces its performance by much more than using the full XG Mobile connector.
Wondering how much of a gap there would be with something more like a 4070 Ti, 7900 XT (or even more so, a 7800 once available). Going all out on the top of the top with an eGPU clearly doesn't make sense at the moment, but depending on how much performance loss there is for the more attainable, upper-midrange products, this may still be quite interesting.
Context: considering waiting for the next generation of Frameworks (which should hopefully include the Thunderbolt bandwidth upgrades you mentioned) and looking at a viable eGPU for an above-midrange solution. Coming from an Omen 15 4800H/1660Ti.
Now try this test again with a modern laptop that has PCIe 4 M.2 x4 slots and a gen 4 eGPU dock.
so glad I watched this video! Thank you.
I was just about to pull the trigger on an ultrabook with thunderbolt and get an eGPU for desktop use.
Now I can see it's not worth it yet. A gaming desktop and a cheap laptop or tablet would be a better use of money.
A better comparison would be testing different eGPU setups to see which graphics cards are pointless or most efficient given the bottleneck. For example, does a 2070 or 3090, or even a 1070 eGPU get similar framerates given the Thunderbolt bottleneck? Also, you mention USB4 is faster, so does this mean that the upcoming GPD Win 4 with USB4 will have better bandwidth and better performance with eGPUs?
Would be interesting to compare against a M.2 4.0 PCIE adapter.
Yeah, then we can gauge the pure performance falloff without CPU interference from the desktop.
I believe a PCIe 5.0 x4 M.2 slot will benefit this type of GPU setup a lot, like quite a lot, since it's basically equivalent to PCIe 4.0 x8. That is more than enough.
@@XxGAMERxXPS3 100%. Who knows how long it'll be before we see PCIE 5.0 M.2 adapters and (more importantly) laptops with PCIE 5.0 M.2 slots though?
With how cheap used GPU's are I decided to just build a gaming PC (5700 XT and i3-12100). Only cost about $600 and should beat the performance of most EGPU setups (assuming no RT).
@@vinyfiny There are laptops with PCIe 5.0 M.2. I believe MSI has one. Give it 1-2 more years and we will see more laptops with PCIe gen 5. Hopefully we get USB5, or whatever TB version, with gen 5 x4.
We desperately need Thunderbolt 4 docks to properly use these beefier cards. Even with the 10 series, TB3 was failing to deliver a good chunk of the bandwidth the card could use. eGPUs are still an amazing concept and I hope someone steps up to give the standard even more improvements.
Hey Jarrod, would much rather a comparison between desktop and eGPU performance for a card like the 1650 or 3060 or something in that range because people who can afford a 4090 probably won't have any problems buying other PC components
Thunderbolt is far too limiting for an eGPU; I think the 2080 would be the most I would pair it with, and even then only if your laptop doesn't have a half-decent video card. I wonder if the iGPU in the HX was interfering with it. The HX line uses the older Intel HD graphics. I would try disabling it via Device Manager.
Yep. eGPUs were always a terrible idea unless you want to connect an ultrabook (like an LG Gram or something) to a mid-range GPU. But even then some bandwidth would be lost.
Also, hi!
@@tabalugadragon3555 hey Sergei! It has actually reminded me. I have the Razer Core and Alienware Amplifier. I should sell those oversized paperweights, lol
Please compare content creator workloads: octane render (octane benchmark), redshift render, blender… Egpu 4090 with laptop vs desktop and 4090
I took the liberty of posting this to r/egpu. There's some discussion there already about 1 or 2 things in the analysis. First is about Halo Infinite having different behavior with bugged driver versions (522.25; this appears to have been discussed in Jarrod's Discord). Second is about the HX variant of the 12900 mobile CPU apparently having both the PCH and Thunderbolt controller separate from the CPU, unlike the 1260P and (apparently) the 12x00H (no X, such as the 12900H). The latter could explain the boatload of results where the 1260P had the edge over the 12900HX, and is a situation that dates back to Ice Lake, where the integrated PCH and controller brought a noticeable improvement for eGPU use.
The price-to-performance ratio would be much better with a GTX 1080, I believe.
It would be a good idea to see lower-tier RTX 4000 series cards (4050, 4060 or 4070). The differences should be less significant (logically thinking).
Imagine if this is how future GPUs will look.
Hope they are not huge like this and use less wattage.
Hey Jarrod!
Can you try comparing egpu performance between midrange RTX cards and Arc cards?
Arc is pretty trash and no one is really going to be able to get one.
Actually, Arc dGPU laptops are in abundance, but no one is really going to buy one
Can you please test this with Blender benchmarks, and 3D workstations? Thanks
Would love to see the same test as I'm considering an eGPU for blender
I'm really interested to see how this plays out next year with Zen 4 and 13th gen laptops, ESPECIALLY Zen 4 Dragon Range which focuses on CPU performance and is made to pair with another GPU. I'm thinking a possible 50% jump in performance in a setup like this.
Dragon Range will not be available for at least 5-6 months though, so you're going to have to wait quite a long time. At least the 13th gen laptop CPUs are supposed to launch in Jan 2023
@@ucle9955 In fairness, most of the heavy lifting -- if not all of it -- is already done for Dragon Range.
You don't really need a 4090 for the eGPU setup. If you get, for example, a Razer Book and pair it with a 3060Ti, you can get stellar performance and you still have an advantage that a desktop doesn't have - you can unplug and get great battery life on the go allowing you to get some work done, do presentations, etc, on one device.
Wonderful video, Jarrod. Can you also test the eGPU + laptop performance in Blender and Redshift rendering? I currently have the Legion 12700+3060, and I'm wondering whether to buy an eGPU to boost my laptop or to sell my laptop and buy a whole new 4090 setup. Thanks bro.
Hey Jarrod, the 12900HX was performing worse because 16 gen 4 lanes were connected to the 3070 Ti, leaving the i9 with 4 to spare, of which 2 lanes are for storage. That leaves us with 2 lanes of Thunderbolt for the 4090, while the 1260P had 4 gen 4 lanes on Thunderbolt.
Is this really the explanation? I didn't see reduced performance on my Spectre 16 with RTX 3050 vs my Spectre 14 when using my eGPU 3070, tbh, which should have shown the same effect if this were true.
@@omegamalkior1874 the 3050 laptop only has a max of 8 gen4 lanes
@@omegamalkior1874 also the 3070 is a weak gpu compared to 4090
Thanks for another good eGPU video Jarrod! I would really like to see a full eGPU setup guide video from you! I've been fighting to get an eGPU to work properly for months now, but it's hard to find any good troubleshooting resources or guides out there, especially videos.
I found out that there is an issue with Windows 11 and TB4 drivers on Intel 12th gen CPUs. The Thunderbolt controller isn't present in Device Manager. I had to install Win 10 just to connect to my RTX 3090/Mantiz eGPU. But even now, performance is better with the laptop's built-in 3060 than with the 3090 in almost every single case. An external display with the laptop screen and GPU turned off doesn't make any noticeable difference either, which is weird as well. I imagine that I would know a lot more about TB issues and other eGPU errors if you had included some of that installation process in your videos, because I've watched all of your eGPU videos and will keep doing so! 🙂
Have you checked the processor's operating frequencies? Has there been a frequency drop of about 2 GHz?
Would love to see that with an AMD machine.
12th gen Intel is faster
AMD is vastly more efficient tho
Does AMD support Thunderbolt tho🤔
@@ezrapierce1233 Some of the laptops do now with USB 4.
@@ezrapierce1233 USB 4 don't discriminate, they just penetrate.
Love this Frankenstein experiment haha. Results are all over the place
Thank you for testing with the monitor directly connected to eGPU card. Great update from 2020 video.
I have an RTX 3070 and decent gaming desktop components on the way, and already own a GTX 1080 in an eGPU setup. I plan on running my own quick & dirty relative tests for all 4 setup combos, and will be happy to post results here if I can remember to do so. My eGPU is limited to Thunderbolt 3 though.
For better performance you could hook up directly to the PCIe 4 SSD slot..
It carries none of the bottlenecks of Thunderbolt and should get twice the bandwidth, assuming you can get it to work?
It would have been nice to see you using an M.2 slot for the eGPU as well, because the speed would be way faster. No one I have seen has done it this way while also using a 40 series GPU. It would be nice to see the results compared to the 7950X etc.
And connect eGPU to the external display to reduce overhead.
In Flames T-Shirt. Nice
Great video. Thanks for always doing these egpu tests. Would love to see it paired with a 6800u laptop to see how new handhelds might perform
At this moment in time, for compact portable gaming, the way to go is an ITX sandwich case instead of a laptop. It's not like laptops are used to play while you're on the move in the plane or train or whatever; rather, you unpack, plug into mains power and use them at the various destination locations. The laptops used on your lap far away from a power plug are those ultrabook things that you use to drop in loads of photos, do a quick Photoshop edit and upload to some blog. Not for gaming.
In the future I hope case manufacturers will expand the compact form factor ITX cases that have a very small depth, meaning the GPU and the mobo are installed on the same plane side by side, and develop a CPU cooling system that isn't a high tower but has a small height and large surface area on the horizontal plane. This would result in a similar footprint to 17" laptops with a slight height increase, thus useful to carry around. With a portable 17" monitor you end up with a full gaming rig with a volume not much larger than a 17" gaming laptop.
If the 4090 directly outputs to an external monitor, that will minimise the overhead of this setup, and I guess you can have a fairer comparison of the CPU bottleneck then.
Very interesting video! Could you do a comparison between thunderbolt and m.2 to pcie x16 adapters? They also have only 4 lanes of course, but they don't have the other thunderbolt limitations.
Can you try this setup again with DLSS3 and Frame Gen enabled? Maybe even include comparisons with other 4000 series cards? I saw on some boards that frame gen compensates significantly for the drop with TB3/4 bandwidth.
Time is the most valuable asset in world and you saved me 1 h time researching for limitations of Thunderbolt 3-4. Thank you.
Hi Jarrod, please can you test the eGPU with content workloads. 🙏🙏 looking at an eGPU for rendering.
Very welcomed video, many thanks for sharing.
I don't have a laptop or a PC, but I wanna see if ROG will be making a portable 4090 like they did with the 30 series that you can plug into their laptops
Honestly, I think manufacturers have forgotten about eGPU solutions. When was the last time we saw a new one come to market? The XG Mobile is sadly the best one currently, and it is so expensive that honestly you might as well grab a mid-range desktop PC and a budget gaming laptop and probably come out ahead. I really miss Alienware's eGPU. I know it was proprietary, but it had much better performance. I really wish Dell was putting all that money towards making a v2 and making it a no-brainer choice to buy their machines.
Oh my fudge, I was literally planning on doin this in the future since I just bought a Razer Blade 17 3070 Ti 150w TDP, and I'm planning on making this my full setup. I'm happy as hell you can try it out first.
Your setup of GPU, extended riser, and power supply unit is beautiful (idk what to say, but maybe I also mean it saves some desk/table space)
Man, I wish they would improve this tech so I can take my thin laptop everywhere, then plug it into the GPU at home and play whichever game I want
One of the most valuable eGPU vids I've seen, GREAT WORK!
thanks!
i don't find anyone else doing this. Thank you for the test
No problem, thanks for watching!
Interested in 4070 Ti performance on an eGPU! But I guess it's definitely worse than the 4090
I'd love to see: 1) a high spec laptop with discrete graphics (e.g. XPS 17) vs eGPU vs desktop. I'd like to see what sort of gap there is. 2) Instead of games, I'd love to see content creation: Blender, Unreal Engine, DaVinci, Premiere Pro, After Effects.
Thank you for the tests. I thought about buying an eGPU for 3D renders, but I will stick with my dedicated 3080 )) thank you for saving my money 🤝
I rarely comment on anything.
But this YouTuber is just my cup of tea 👌.
It would be good to see the 3070 Ti mobile included in all tests, to compare that scenario too!
@6:36 Thanks for the clarification that the eGPU with the RTX 4090 can be even worse than the RTX 3070 Ti laptop GPU; otherwise I was under the wrong impression that if the 4090 desktop is 4x the 4090 eGPU, then it must be at least 10x better than my RTX 3070 Ti laptop GPU.
If you ever find out why the i7 was outperforming the i9 later, please tell us!
Melting aluminium, memory differences, some sort of difference in how both laptops handle I/O, as TB will compete with other I/O traffic. Hard to tell.
This egpu enclosure was released in tandem with the 12.5 inch razer blade stealth. When it debuted it featured soldered ram and 7th gen intel processors with 15w tdp. It was a fine laptop but it had massive bezels and the build quality left a bit to be desired. Really these enclosures only make sense with ultrabook type devices. If you already have a gaming laptop there really is no point. And even still, is it really worth spending all that money on a desktop class card to get such middling performance for how much better the card can do? Back in the day it made more sense because the cards weren’t as powerful and the thunderbolt bandwidth wasn’t as much an issue but nowadays they make no sense. The egpu enclosure over thunderbolt is a lost cause until the day there’s a new cable standard with enough bandwidth to take full advantage of the cards of that time and beyond.
meanwhile me on my 9 year old shitty laptop trying to run Flappy Bird with no lag
Actually, the reason NVIDIA recommends such a high-output power supply is that the 3090/4090 is capable of power-draw spikes that can jump WELL over its TDP. Linus Tech Tips talked about this during one of their builds. If you cheap out on the PSU, these spikes can trip the circuit protection on your PSU, and that's a hard boot.
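The headroom reasoning in the comment above can be sketched in a few lines; note that the 2x spike multiplier below is an illustrative assumption for a high-end card, not NVIDIA's published figure:

```python
# Hypothetical PSU headroom check: millisecond-scale transient spikes can
# briefly exceed a GPU's rated power, so the PSU needs extra headroom.
def psu_has_headroom(psu_watts: float, gpu_tgp: float, rest_of_system: float,
                     spike_factor: float = 2.0) -> bool:
    """True if the PSU can absorb a worst-case transient spike.

    spike_factor is an assumption for illustration: how far above its
    rated TGP a short spike might reach on a high-end card.
    """
    worst_case = gpu_tgp * spike_factor + rest_of_system
    return psu_watts >= worst_case

# A 450 W card plus roughly 200 W for the CPU and the rest of the system:
print(psu_has_headroom(850, 450, 200))    # worst case: 450*2 + 200 = 1100 W
print(psu_has_headroom(1200, 450, 200))
```

Under these assumptions an 850 W unit could trip on a spike while a 1200 W unit rides it out, which matches why oversized PSUs get recommended for these cards.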
Would be fun to see this with the now-released mobile 40 series cards 🤔
Damn, this is the video I was looking for, thanks mate
Next gen Thunderbolt is fine and dandy, but given that the GPU enclosures have their own TB controller onboard, they will also have to be upgraded to suit.
Market for them seems to be pretty much dead now
best and professional presentation, thank you very much.
I knew Thunderbolt sucks for gaming, but it's good for file transfer and video out.
Thank you man for uploading
Thanks Jarrod. As usual, eGPU is not worth it due to bandwidth limitations of thunderbolt but boy is it interesting.
Not true. Just connect an external monitor to your eGPU box and you will see that it works much faster. If the internal display is in use, TB3(4) sends data in both directions. Just imagine some person who has the eGPU but no external monitor.
It seems it's better to have a gaming desktop, and when we'd like to play on a laptop, just stream games from the gaming desktop PC to the laptop (i.e. using Moonlight). Thanks for the interesting video!
Great video. Now I'd be curious to see how an all-AMD Advantage laptop with the highest-end specs, like the ROG Strix G15 Advantage Edition with Ryzen 9 5980HX / 6800M, would perform with this eGPU setup. This is a great setup for anyone who's a creative or an engineer/coder on the go. You could switch back and forth depending on your workload and just have your laptop in a cradle and the eGPU next to it, ready to go with just a monitor and keyboard/mouse.
this is exactly what I'd love to see. I've searched for it many times to see if it is compatible with my 2022 G15!
An eGPU is only good for office work and video playback. Most older laptops can't drive multiple 4K monitors without turning into a loud heater. The eGPU offloaded the heat, ran silently, and allowed me to have more than 4 monitors (combined with the integrated GPU output). But newer GPUs are so efficient they have no issue driving four 4K monitors, so I got rid of my eGPU setup
What about using a eGPU for laptops for rendering and other 3d applications?
Very solid review. thank you!
I love your videos, you always talk about interesting subjects !!
Glad you like them!
I still love this setup since it still offers flexibility when you want to play desktop games at high speed. However, I do think it's about time we see eGPU brands make a new enclosure with a larger size and Thunderbolt 4 ports. I checked that both the 4080 and 4090 have a 61mm slot width, and neither the Razer Core X Chroma nor the AKiTiO Node Titan can fit them.
He literally fit it in the video
@@thepropolys He said in the video that he had to do some mods.
@@FAT8893 He bent a small piece of metal. Sure, it's a modification but it's also undoable and simple.
Thanks for sharing. I think for now I'll have to buy a PC for gaming when I'm home and a separate laptop for when I'm at work
Ever since I got a 4K external monitor to use with my laptop, I started considering buying an eGPU, but this video is self-explanatory.
The only thing I can think of is that the interfaces obviously can't keep up with the new top of the range, plus the fact that gaming laptops are evidently not designed to work over Thunderbolt the way a laptop without a dedicated graphics card is.
The only solution I can think of is a new interface, similar to the one used by the Asus ROG, which is completely dedicated to the GPU and set by default in gaming-oriented laptops, but that would only raise their prices.
Do you have an M.2/PCIe eGPU connector? Would it perform better? Or is Thunderbolt superior on laptops?
To see if it's a problem with Thunderbolt, you should test again using an M.2 connector
Will be interesting to see a comparison with an AMD Ryzen 6000 series laptop with USB4
So, if you are only able to get hold of laptops that are not super powerful, a cheap eGPU setup is your go-to option… though, hope nothing goes wrong on the first try; you've got to hope you won't need to troubleshoot or mess with software or hardware (common with eGPU setups), and also pray for no whitelisting blockades.
I can't believe that in the year 2023 we still don't have a good way to use a dGPU with a laptop. Thunderbolt 5 seems like it'll finally be the one, but tbh a simple x8 OCuLink port would do just fine. The PCIe lanes are there, and it doesn't have to be more expensive to add this port.
Hey, I just bought a GPD Win 4 and I'm enjoying it, but I'm thinking about future docked usage with an eGPU OCuLink setup. Do you think a 4090 or 4090 Ti would be worth it in that case? And on that note, if it is, would a 4090 Ti fit in that case?
It's really weird that the gaming laptop had a much bigger bottleneck compared to the Zenbook, when it should have more internal PCIe lanes and the power to handle them. If you can, you should talk to Intel and ask for an explanation of the problem
More lanes don't matter if you're limited to PCIe 3.0 x4 no matter what. Thunderbolt 3 is outdated; it was already shaky for 10 series GPUs, but it's pretty much unusable for the 40 series. Even SSDs offer more than double the speed TB3 is capable of supporting.
We've now reached GPU speeds where there are benefits to saturating PCIe 4.0 x16, which is equivalent to PCIe 3.0 x32. Compared to x4, you can see why they perform suboptimally.
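The back-of-the-envelope numbers behind this can be sketched quickly; the per-lane figures below are approximate effective rates after link encoding overhead, not exact spec values:

```python
# Approximate effective PCIe bandwidth per lane, in GB/s per direction
# (ballpark figures after encoding overhead, not exact spec numbers).
GBPS_PER_LANE = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

tb_tunnel = link_bandwidth(3, 4)   # TB3/TB4 tunnels PCIe 3.0 x4
desktop = link_bandwidth(4, 16)    # desktop 4090 slot: PCIe 4.0 x16

print(f"TB3/TB4 eGPU link: ~{tb_tunnel:.1f} GB/s")
print(f"PCIe 4.0 x16 slot: ~{desktop:.1f} GB/s")
print(f"Ratio: ~{desktop / tb_tunnel:.0f}x")

# The PCIe 5.0 x4 == PCIe 4.0 x8 equivalence mentioned earlier in the
# thread falls out of the same table:
assert abs(link_bandwidth(5, 4) - link_bandwidth(4, 8)) < 1e-9
```

So the desktop slot has roughly 8x the GPU link bandwidth of a Thunderbolt 3/4 tunnel, which lines up with the "8 times more bandwidth" figure quoted above.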
I've noticed some drivers just do not work with an eGPU... It's not well enough supported, and Win 11 made it worse by a lot; these tests need to be done in Win 10
the fact that it connects with a usb cable is amazing
But what if you're connecting external displays to the eGPU and not running through the laptop screen? That's what I'm keen on: can I run my laptop as the CPU/OS with everything else connected to a dock and eGPU via TB3/4? Anyway, maybe a test for next time?
That's what he misses in this video: the bandwidth seems low because half the bandwidth is going to feed the laptop screen, which also adds more load to the CPU. If he was using an external display he'd be getting the full ~4 GB/s of bandwidth
I've heard some people complaining about their Thunderbolt port getting melted by powerful video cards.. I guess eGPU is not that safe yet.