► Check ProtoArc's ergonomic and portable tech solutions to upgrade your workspace at Prime Day sales from July 16th to 18th, with discounts starting at 20% off.
Shop on their official website at
www.protoarc.com/pages/summer-flash-sale?ref=UFDTECH
or Amazon store
www.amazon.com/stores/ProtoArc/page/51FDEA72-3BEE-4CD1-9157-933B4B9E3A84
#ProtoArc #ProtoArcPrimeDay2024
When I saw "5090D" I thought it was gonna run off of diesel, lol.
Being maidenless, im waiting for NVIDIA to release the 6969DD.
That sounds more like an AMD model number than Nvidia
When NVidia starts picking up the ultimate gamer names for their cards. The 69420DD 360 no-scope edition.
14” long and 2” thick. Never sags.
🌚
nice
Oh boy! I can't wait for the RTX 9090D Ti Super EVO!
You will get insane performance for a week before it burns the connectors.
I miss some of the old naming schemes of nvidia such as 7900GTO, 8600GTS, 9800GX2, all of them sound like car models 😂
@@MikoYotsuya292 Couldn't be more accurate. But I still love them naming schemes. Makes me feel like I own a smexy car
at only $99,999,999,999 😂😂😂😂😂😂😂😂
@@shad3slaythe connectors will come pre-burned.
Im actually digging this new AMD Gold & Black color scheme
Fun fact: The 890M is actually the same speed as the slowest 40 series GPU you can get, specifically the 30W RTX 4050 in the Dell XPS 14. It's real, iGPUs are now starting to reach the levels of current gen Nvidia.
The Apple M chips already reached that point though. It's just the game library being very small in Mac OS.
@RobloxianX The current 16 CU 890M is GTX 1660 speed, the 40 CU Strix Halo is RTX 4050 speed, and unreleased.
@@drewnewby There are some leaked benchmarks showing the 890M being very close to the performance of an RTX 4050 laptop on Geekbench.
@mathewelisha8797 That's where it's important to say RTX 4050 Mobile. The 16 CU 890M is close to the mobile chip; unfortunately, that's about the same as a GTX 1660, nearly half as fast as the RTX 4050.
@@drewnewby Wrong: the 4050 specifically in the Dell XPS 14. If you had read my comment, you would have known it runs slower, at around half the speed of other 4050s in games, making it around the speed of a 1070. You know what's also the speed of a 1070? The 890M.
My i5 2500K and GTX 980 Ti PC watching me say it's not time to upgrade yet 😂😂
You need to upgrade the CPU ASAP
@@Gamer-x8b According to whom? If the PC does what he needs it to do, it doesn't need an upgrade.
I'm still running my AMD FX 8-core monster with dual R9 290s, a Sound Blaster PRO, 32 GB of RAM, and a whole batch of disks.
It's easily 12 years old (the GPUs are newer; I had 7990s in this system before) and it still does all I need it to do, is far from slow, and still holds a 100 FPS frame lock in the few games I play.
Wouldn't dream of upgrading it, because it does the job.
Had that setup until March 2020. The new PC is already mid-to-low end just a few years later.
If you do upgrade, go high end so the next PC lasts just as long.
Haha nice, I’m still on GTX 1070 and i7-6700
There is never a good time to upgrade with how amd and nvidia are acting
Qualcomm: "We're gonna be the very best, like no one ever was. To catch the market is our real test, to dominate is our cause. I will travel across the land, in laptops far and wide. Teach gamers to understand, the power that's inside. Qualcomm! Gotta sell 'em all!"
Reality: *cough*
First mistake: working with Microsoft, incompetence at its finest.
Don't forget the other half of this: Microsoft also managed to run Nokia into the ground, and not because Windows Phone was really bad. It just had zero app support: it took 3 years for the most basic social networks to release on Windows Phone, and by that point everybody had left the platform because of the missing apps. That can easily repeat here too, since why should developers ignore the 98% of the Windows market just to make apps for the 2% of Windows users?
To be fair, most consumers won't buy the 1st generation, since it's exposed to bugs. Maybe in a couple of generations, once Qualcomm settles in, we may see it truly shine.
@@Deliveredmean42
Nokia : welcome to the party dude!
@@rinchu6051 Qualcomm can improve... Microsoft... well, it could be another Nokia situation if they don't fully commit to it. They might even undo the momentum... And they sure as heck did with the Copilot PCs fiasco.
I can’t wait for the 7090D evolved ti super to come out at only 99999 USD
With this crazy inflation, that's actually quite possible
As someone with a defective 14900K, I am absolutely for the AMD side. Switching as soon as I can afford it
5090 go " _omg omg shlp slph bu bu bu OooohhHhhHH THAT ALL KIDS GAME YAW_* "
Coming from owning a BMW, my first thought was Nvidia coming out with a Diesel? I need more coffee.
ima stick with my 7800x3d
AMD hawk tua chip is lookin good
Don't forget the "spit on it" thermal paste
@@derekjohnston1743 don't forget the "blow it" cooler
😂😂😂
@@Tygertec "blow it" cooler be looking nice
11:00 I agree with the commenters. A coin flip is never 50/50; there's the 0.0001 chance it balances on the edge.
Sorry to hear the "baddum tss" removal was an inside job, but I'm still waiting for your "I am sorry" video on the 50/50 matter.
Baddum TSS !!
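For the curious, the "never exactly 50/50" point can be sketched with a quick simulation. The 0.0001 edge probability is the commenter's illustrative figure, not a measured one:

```python
import random

def flip(p_edge=0.0001):
    """One flip of a three-outcome coin: heads, tails, or the rare edge landing."""
    r = random.random()
    if r < p_edge:
        return "edge"
    # Split the remaining probability evenly between heads and tails.
    return "heads" if r < p_edge + (1 - p_edge) / 2 else "tails"

def simulate(n=100_000, seed=1):
    """Count outcomes over n flips; seeded so the run is repeatable."""
    random.seed(seed)
    counts = {"heads": 0, "tails": 0, "edge": 0}
    for _ in range(n):
        counts[flip()] += 1
    return counts
```

With the edge outcome in play, heads is never exactly a 50% event; over 100,000 flips you'd expect roughly 10 edge landings.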
The 5090D is very likely just a renamed 5080 SKU. The 4090 didn't meet export requirements, which means the 5090D can't be faster than the 4090D. Leaks suggested, and I expected, that the 5080 would launch first so NVIDIA could cash in on the initial release hype in China, since the 5090 certainly can't be exported there.

Also, the US Government was pretty annoyed that NVIDIA cut down the 4090D to meet export restrictions but at the same time exported the same 4090 SKU, which defeated the entire point of the restrictions. It is therefore extremely unlikely that the 5090D will use the same SKU as the 5090, meaning it will use the SKU below it, the 5080's. Combine that with the fact that the 5090D will match the 4090D's performance, because that was the limit of the export restrictions, and you end up with the near-certain conclusion that the 5090D is just a renamed 5080.

The GPU now actually named 5080 will therefore either equal the 5090D in performance or, more likely, be a cut-down version of the original 5080 SKU. The 5090D might also have originally been intended as the 5080 Ti; maybe it will be sold as the 5080 Ti in the rest of the world. Either way, the 5090D is not actually a 5090 but a renamed 5080 SKU.
I'll wait for the 5090 DD
I'm waiting for the xD version
@@alnair228 How about X3D? Huh? HUH?!
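The core constraint in the renamed-5080 argument above, that nothing exported can outperform the 4090D, is easy to model. The relative performance numbers below are purely illustrative placeholders, not benchmarks:

```python
# Hypothetical relative performance scores; 100 = the 4090D export ceiling.
# These numbers are made up to illustrate the constraint, not real data.
EXPORT_CAP = 100

lineup = {
    "5090": 160,   # flagship, well over the cap
    "5080": 100,   # the SKU the comment argues the 5090D really is
    "5090D": 100,  # pinned to the 4090D's level by the cap
}

def exportable(perf, cap=EXPORT_CAP):
    """A card can ship to the restricted market only if it doesn't exceed the cap."""
    return perf <= cap

allowed = sorted(name for name, perf in lineup.items() if exportable(perf))
```

Under these toy numbers, only the two cap-level cards clear the check, which is exactly the situation that makes a "5090D" indistinguishable from a renamed 5080.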
1:28 I was listening without watching and thought you were gonna say Hawk Tuah there! lmao
It's really depressing that Intel's gone downhill so much this past year. Their next chips had better be the best on the market, and priced extremely competitively, if they want to make it. 😂 I'm glad the laptop I bought was the i7 version rather than the i9, though.
I mean, Steve from Gamer's Nexus called the 11xxx series a waste of sand. So not sure how they went down from that lol
@@chadbizeau5997 I kind of disagree with that take. Yeah, the performance increase was almost non-existent, but introducing PCIe Gen 4 and adding 4 dedicated lanes for the NVMe slot was a pretty good step up, even if a bit late.
I built 2 machines for people with the 11700K when Microcenter had the CPU + Z-series mobo combo for less than 250 bucks, which was pretty good when pandemic shortages were hitting hard. They're still going strong.
I've built a lot, and honestly it's been downhill since 9xxx, the Coffee Lake "refresh". 11xxx was a waste of sand. For my customers, I wish Intel had jumped straight from 8th gen to 12th gen, as Alder Lake was decent value towards the end.
Not me having a worse dedicated GPU than what's integrated in a mobile chip 😂
I got your coin flip joke Brett! I groaned then chuckled when I heard it.
Big D energy
Intel is so, so quiet because this has impacted large commercial/business customers, not just Joe Consumer or Gamer Chad. There is a lot more at stake this time around than just RMAs.
I thought it was funny how you showcased the "EMO" keyboards with the painted black nails and finger rings/bracelets. That's so goth lol
I got the joke last time; I like the edited-out version. It's like he did it on purpose.
If a 5090D comes, how does that work? Since the 4090D is the "strongest" GPU allowed for export to China, wouldn't that mean the 5090D has to be at the same level as the 4090D? That would mean the 5080 needs to be slightly weaker than the 4090D. Now I'm thinking we're gonna get a gimped 5090 to make this possible, with Nvidia holding a possible 5090 Super or 5090 Ti in store if AMD releases something that could compete with the gimped 5090.
Might be a pipe dream, but if APU graphics advances enough to match a desktop 1080 Ti in gaming performance, I might actually stop using a discrete graphics card for gaming. Not only because of power consumption, but also size: cards are getting rather large and ship with features I frankly don't use, as someone who still games at 1080p 60 without ray tracing, DLSS, XeSS, or FSR, except on my Steam Deck, where FSR makes several games playable on that little device.
That chip alone would cost more than a console that can do 120fps
It takes a 3060 to be on par with a PS5 / Series X, and with optimization taken into account, sometimes a 3070, but you really need the VRAM (16GB should have been standard).
A 1080Ti is roughly on par with a 2070, and the highest end 2080 Ti Super or whatever got trounced by the 3060.
Getting a GPU without enough VRAM - or an APU that has to use slower system RAM - will be a problem by 2028 when next gen consoles have been out for a bit.
Your system will be compatible with new games, but your system would be in the Xbox One X+ performance tier - slightly better GPU speed, slightly more memory I'd hope, but slower memory. The market is such that we will always be supporting as low end as reasonably possible - and as mobile becomes more and more viable, that lucrative market will keep your hypothetical gaming rig in business for a solid decade.
Bluntly, it's wildly underpowered and that bothers me because even my 3090 isn't running UE5 with Lumen and Nanite terribly well. I'm going to be baking everything and disabling both technologies in order to run on modern devices, but there's no doubt that Lumen's successor is the way forward - it's very beautiful but held back even by my 3090.

That said, in games that I've played on my Xbox One X, Series X, and 3090, I can confirm that there's a noticeable difference in image quality and visuals between each. The games look significantly better at every step up.

Nonetheless, I still use my Xbox One X while my partner uses the Xbox Series X because I don't want to replace my Series X this late into the console cycle. And also because I'm done buying consoles. My Series X died because it got jiggled by my cat being picked up off of it and the HDMI port got murdered in the process.

Xbox's direction seems pitiful. I'm not interested in paying these online subscriptions to access peer to peer / developer networks any longer.
@@OverbiteGames I kinda figured it was far-fetched. The only reason I'm targeting the 1080 Ti is one or two games. But hey, technology marches on, so maybe someday it'll happen.
Gotta sell another kidney just for the GPU XD
how much?
@@elalemanpaisa It depends on what condition your kidney is in. If it's healthy, then more money
Surprised he has long form content. (I only watch his shorts)
Welcome, he’s been making long form content for years lol
@@TheCompyshop I just joined
It's not TikTok 😂
@@OGruurd 😅
I wasn't even smart enough to notice the 5050 joke....
This is not the greatest GPU in the world, no. This is just a tribute.
😂 somebody been listening to Jack Black..
the 40 series would have been fire if all the cards were priced $200 less and each card was renamed to the tier below it
I have a Lenovo X Elite; it's a beast of a system. Best laptop I've ever had, even better than my M1 MacBook Pro. Better than any x86 laptop I've ever had, too
As an Amazon employee. I am not enjoying prime day. Just started my first 11 hour shift
Sorry man, I know that sucks. Still gonna order a few things I've had in my cart for a while...
Qualcomm had to offer their top-end X Elite for under $1000 to be competitive; they HAVE to sacrifice margins in the short term.
My 7600X feels pretty strong; I don't need a new CPU yet... but late adopters sure will have a blast with it. Just make sure to undervolt your CPU, AM5 runs super hot.
I might wait for rtx 6090
The Acer monitor is $299 in the US, and €600 in Europe. Nice
Creeping up on that million subs!
will the 5090 melt cables too?
Yummy yummy deals!
I got an i5-12400F with an RTX 3060 12GB and my brother got an i5-10400F with an RX 5600 XT 6GB. Both are great
AMD is doing the same naming scheme that TV models use.
I like how they plug a trackball mouse saying "I use them all the time" with a regular mouse visible on the desk. Top-notch advertising there, bud...
5090Discount version? 5090Dictatorship version? 5090DisregardForHumanRights version?
Thankfully no humans were harmed by America in Iraq. Human rights is such a stupid political argument to make when no accusing nation has the moral ground to stand on.
Snap Elite costs double!
D for Diesel Powered
Pika-chew through them polls
Damn. My 150W GTX 970 is now getting beaten by a 15W iGPU. Although this should be expected at this point.
I think the two statements point to the 9700X being basically the same performance as the 7800X3D, within ±3% depending on the workload
It's probably faster in non cache dependent tasks and slower in cache dependent tasks
@MiningdragonLP I'm betting it's significantly faster in the non-cache-reliant tasks.
The cache is for a few use cases like gaming
You forgot LTT LIME DAY :o
Back in my day, Nvidia was releasing like 20 GPUs per year; nowadays GTA 7 would probably release faster than the RTX 5090 Ti.
Companies then: We want to make good products for our gamers
Companies now: BEEP BOP BEEP BOP AI AI AI BEEP BOP BEEP BOP AI AI AI
I want the RX 8800 XT
Hopefully in ~2 years, when I need my company to upgrade my Intel 12th-gen XPS, Qualcomm or some other ARM processor will be mature enough to daily-drive. I don't need tons of performance; battery life is more important for working.
My MacBook Air M2 has been pretty great, but I can't use it for everything, and I try to avoid using it for work. But I have, when working in the field all day with no power available.
All I wanted from the 9700X is better performance than the 7700X at its lower wattage. I'm much more interested in lower power usage with better performance than in raw performance. My office is average-sized, and the higher-wattage CPUs just heat the room up. It's just a bit, but for me it's noticeable.
Use a liquid cooler for the CPU, and AC for the office.
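The room-heating complaint above is just physics: essentially every watt a CPU draws ends up as heat in the room, regardless of the cooler. A rough back-of-the-envelope conversion, using illustrative TDP figures rather than measured draws:

```python
def watts_to_btu_per_hour(watts):
    """Convert a steady electrical draw to heating power; 1 W ≈ 3.412 BTU/h."""
    return watts * 3.412

# Illustrative package-power figures for a higher- vs lower-wattage part.
hot_cpu, cool_cpu = 170, 65
savings_w = hot_cpu - cool_cpu
# ~358 BTU/h less heat dumped into the office by the lower-wattage part.
savings_btu = watts_to_btu_per_hour(savings_w)
```

This is also why a liquid cooler doesn't help the room: it moves the heat off the die faster, but the same wattage still ends up in the air.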
Isn't 2% what Intel was giving when they thought competition was a thing of the past??
11:42 damn... Rip editor
Huh
Im that quick today
Don’t get your hopes up…all rumors. The 5090 is probably going to be a 4090 with more memory. That’s it. They are NOT going to give you Blackwell for gaming.
A stable EXPO mode on the AMD 7800X3D... would be great...
Tbh, for the last couple of years the only things that excite me are Steam Deck or any non-Windows gaming console updates
RTX 5050 is 50/50 price to performance
I'm commenting to get a shoutout next vid, shoutout Tavaa.
Qualcomm's Snapdragon X is DOA IMO. Nobody wants to chance compatibility.
Oh man! I can't wait for AMD's RX AI 9 8080 HX XTX vs. NVIDIA's RTX 6090D Ti Super GTS SLI!!!!
The "D" is silent
Hello im here
amd improving performance, lowering wattage and temps = good
intel improving performance, increasing wattage and temps = bad and cpus dieing
nvidia improving performance increasing wattage and temps stay good = neutral
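The three cases above are really claims about performance per watt; a tiny efficiency calculator makes the distinction concrete. All numbers below are invented for illustration, not real benchmark or TDP data:

```python
def efficiency_change(old_perf, old_watts, new_perf, new_watts):
    """Percent change in performance-per-watt between two generations."""
    old_eff = old_perf / old_watts
    new_eff = new_perf / new_watts
    return (new_eff / old_eff - 1) * 100

# Hypothetical generational moves matching the comment's patterns:
amd_like = efficiency_change(100, 105, 110, 88)     # faster AND lower wattage
intel_like = efficiency_change(100, 125, 105, 150)  # faster but hotter
```

`amd_like` comes out strongly positive and `intel_like` negative: the same "more performance" headline, opposite efficiency stories.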
DAMMIT RIK!
I would buy the X Elite for $600, but not for over $1000...
I really dislike them calling it an RTX 5090D; that's going to open up all sorts of scams and problems in the used market down the track, with people not knowing exactly what they're buying, or listings being mislabeled. They should have called it something like an RTX 5085.
There probably won't be any confusion since it won't be sold outside of China.
@@ironicdivinemandatestan4262 Because Chinese stuff never hits eBay and other trading sites? That's why I said used market.
Not sure how you're getting $679.99 for that ASUS TUF Gaming A16; as I check Amazon 9 hours after your post, the price is listed at $899?
Nvidia: "Time to give them the 'D'."
They can't export something more powerful than the 4090D, so a 5090D makes no sense
I'm willing to bet the sanctions lineup will be different from the regular one.
I'm willing to bet this is how NVIDIA can still sell 5090s and 5080s without breaking the threshold, but at the same price 😂
Fun fact: Jensen Huang is of Taiwanese descent.
I had a bunch of things in my cart from third-party sellers that were actually cheaper before Prime Day, but now they've slapped a Prime Day deal on them smh 🤦♂🤦♂
Oh, why did I think this was a LTT review 🤔
Those are ASUS ProArt series colors AMD decided to go with.
New GDDR7, 16GB-32GB, for the RTX 5090 at 512-bit?
I got a 2TB PCIe 4.0 SSD for $100 CAD. Yay.
5090 is not coming. I'd be surprised if they announce it
it is tho
I'm guessing more like 2025 from what they originally said. I mean the 4000 series was so powerful it can still run literally everything well.
@@drunkhusband6257 Yeah, the 40 series seemed like a turning point: with DLSS, frame generation, and a decent number of ray tracing cores to actually do some ray tracing, it should give those chips some pretty good longevity, I assume. The laptop chips alone were a huge leap over the 30 series, and DLSS and frame generation are huge for laptop GPU performance.
@@WaterspoutsOfTheDeep Yep, pretty much. I thought the 1080 Ti was a good card; I think the 4090 will be amazing for a LONG time at 1440p/ultrawide 1440p
Why did AMD need to change the x700 series to x800, making us wait an additional month for a motherboard, just because?! What the heck, AMD!
Intel's Bartlett Lake rumours, with no E-cores, make me think Intel probably realised that having 2 different types of cores performing the same task isn't working well. If only it weren't LGA 1700 only.
Look, Intel has been quite open that E-cores are more power-efficient than P-cores, but slower and without 100% of the P-core instruction set.
For the majority of everyday computing tasks, E-cores are good enough.
However, there will always be situations where P-cores are better, like video encoding/decoding without using the GPU, or gaming. Which is why up to now they've had a mix of cores.
So if you plan to predominantly do work or play games that exclusively use P-cores, an all-P-core chip makes sense.
I'd hate to think what the power consumption and heat dissipation requirements will be for overclocking, though.
You also have to remember that they're planning to ditch hyperthreading, so the overall performance gain might not be as large as you'd expect.
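The last point, that dropping E-cores and hyperthreading eats into the gains, can be sanity-checked with a toy throughput model. The core counts, relative speeds, and the ~30% SMT uplift are assumptions for illustration, not Intel figures:

```python
def throughput(p_cores, e_cores, ht=True, p_speed=1.0, e_speed=0.6, ht_gain=0.3):
    """Toy multithreaded-throughput model for a hybrid CPU.

    Each P-core contributes p_speed, plus ht_gain more when hyperthreading
    is on; each E-core contributes e_speed. Units are arbitrary.
    """
    per_p = p_speed * (1 + ht_gain) if ht else p_speed
    return p_cores * per_p + e_cores * e_speed

hybrid = throughput(8, 16)                  # an 8P+16E hybrid layout
all_p_no_ht = throughput(12, 0, ht=False)   # a Bartlett-Lake-style all-P part
```

In this toy model the hybrid part still out-scores a 12-P-core chip without hyperthreading in pure throughput terms, which is exactly why the all-round gain might not be as large as you'd expect.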
BTW all Vbios from Nvidia are locked with cryptographic signature ...
It's a perfect storm: everyone wanting to replace crappy Intel 13900Ks and 14900Ks, and AMD coming out with a processor that blows them out of the water. I see AMD grabbing a big piece of the market-share pie chart; the only question is, how much will they get?
Beta test a laptop for 2k euro 😮
Still loving my 1080 Ti; it's run AAA games without issues for the last 6 years. Definitely got my money's worth not following the "sheep" and upgrading every six months
Next gen builds are gonna be nice. Can’t wait
@11:52, that was a typo, he meant since LUNCH. lol, nothing to see here says Intel...
2024 is the year AMD takes command and leads the microprocessor world. I love them for the open-source reasons; I'm a Linux user and have left Windows completely now. 😜👋 Join the Linux 🌊
Sorry, constructive criticism, but rese needs to look deeper into the deals imo
Nvidia just needs to make an all-in card and ecu. The cards are getting too big; they break from their weight, and the power draw fries connectors.
Will AMD abandon desktop GPUs for iGPUs?
AMD Hawk Thua 😂😂😂
Tom Hanks?
you are a wonderful father so your dad jokes are extra cheesy
I love the idea of the Snapdragon, but they need to put in the time and money to optimize it so that 90+% of people who use it have everything they need working. Games, emulators, streaming software, etc. aren't at 100%, or close enough. An ARM/x86 hybrid was always what I wanted: 6 of AMD's cores with like 4 ARM cores would've been perfect, offloading intensive/incompatible tasks to the x86 cores while Windows and other less intensive tasks run in the background on virtually no power. If a game didn't work, or worked poorly, you could just flag it in Windows to be allocated to the x86 cores exclusively. You might argue with me when I say it's that simple, but I say: Microsoft is worth trillions. It should be that simple to just throw some money at it.
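The per-app flagging idea described above is essentially a scheduling-affinity policy. A minimal sketch of that decision logic, where the flag names and core lists are hypothetical, not any real Windows API:

```python
def pick_cores(app, x86_cores, arm_cores):
    """Route x86-only or user-flagged apps to the x86 cores; default to ARM.

    `app` is a dict with hypothetical flags; the core lists are CPU indices.
    """
    if app.get("x86_only") or app.get("user_flagged"):
        return x86_cores
    return arm_cores

# Hypothetical core layout for the 6-x86 + 4-ARM chip the comment imagines.
X86, ARM = [0, 1, 2, 3, 4, 5], [6, 7, 8, 9]
```

A real scheduler would also have to handle load balancing and migration between the two core sets, which is where the "it should be that simple" part gets harder.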
Any news on other Nvidia cards?
When is the 9000 X3D launch date?
July 31st.
Did you know that if you type 5090D into Google and click the buy-now button, you'll have won a Makita circular saw? How do I know this? I have one. There's no way Makita allows NVIDIA exclusive rights to "5090".
They stopped Intel with some 5000-series CPUs.
They stopped AMD with their 5000-series CPUs.
They stopped a couple of others as well.
So good luck, NVIDIA; Makita will fight for their 5090.