Ryzen 9000 Desktop CPUs Are OFFICIAL!
- Published Jun 9, 2024
- ►FREE 30 day trial when you visit brilliant.org/GamerMeld/ to get started learning STEM! And the first 200 people will get 20% off their annual premium subscription.
AMD just confirmed its Ryzen 9000 CPUs, next-gen Battlemage's release, bad news for future AMD APUs, and AI hurting CPU and GPU performance! Stay tuned...
***Items featured in this video available here***
Newegg (Affiliate): geni.us/newegg1
Amazon US (Affiliate): amzn.to/3b9UjKB
Join The Discord: / discord
Twitter: / gamermeld
Facebook: / gamermeld
Gamer Meld Merch: teespring.com/stores/gamermeld
Gamer Meld Sponsors: www.gamermeld.com/sponsors
Support Me On Patreon: / gamermeld
SOURCES:
wccftech.com/ai-craze-may-hav...
www.tomshardware.com/pc-compo...
• AMD Zen 5 Strix Halo L...
videocardz.com/newz/amd-apus-...
videocardz.com/newz/intel-lau...
www.pcgamesn.com/amd/rdna-4-l...
www.computerbase.de/2024-04/i...
wccftech.com/nvidia-geforce-r...
/ 1777951108771746202
TIMESTAMPS
0:00 AI Is Hurting Performance
2:46 Bad News For AMD's Future APUs
4:04 Battlemage Release
5:28 Ryzen 9000 Is OFFICIAL
honestly, we don't need AI chips in gaming computers. chatgpt is fine, but having something like copilot take up performance is stupid imo
Not a fan of basic computing having "helpful" AI, but I'm not against dedicated hardware if it gets used by games to make better NPCs that are more lifelike.
Imagine playing GTA VI with ray tracing in 8K with no issues, and then you talk to some random NPC and then you crash 😂
@@ttomkins4867 There is something called GPU for doing NPC stuff.
@@AzusaSnowflake Aren't NPCs a CPU resource?
In the long run it will end up being very useful for anyone that Games and edits for instance. Having an NPU would help with upscaling, maybe assist with rendering as programs become optimised for new hardware features.
I'm saying this as someone that uses an upscaler with a non AVX 2 cpu and a 1060 6gb.
My motto for AI: overhyped in the short term (locked into compute; efficiency will come later), underhyped in the long term, as we haven't begun to grasp the use cases we can't imagine yet. Facebook etc. would have seemed impossible in the AOL days, and the current state of the Internet today would sound like crazy talk to many.
Absolutely no chance the silicon space for an AI accelerator is worth it. I really don't understand why an AI accelerator has to be part of the SoC. If there really needs to be an AI accelerator, can't you just have it somewhere else on the mainboard, connected with PCIe? So that it doesn't use up precious silicon space, cause thermal output and slow down other cores, and become wasted space when it soon gets obsolete as accelerator architecture changes over time?
It is a bit disappointing. I'm not gonna lie.
I bet Windows 12 or 13 will demand NPU as "basic" requirement, and people will complain like the TPM for Win11
PCIe 5 is so much slower than a direct bus connection; see it as the Apple Neural Engine part of the processor. It is however a risky step and should not be made by one operating system (Microsoft Windows) alone, but by the whole ecosystem. Maybe that is the Windows 12 variant everyone is talking about.
Lol no! Having an accelerator for AI is way more power efficient and performant vs trying to do it on CPUs or GPUs. You'll drastically increase your laptop battery life. Plus having it off-chip is so much slower, like the other comment pointed out, so it needs to be on the same chip for seamless communication with the CPU.
The old google accelerators were literally just add in m.2/pcie devices.
CPU, GPU, DPU, TPU, NPU ... normal PCs seem to be morphing into an Amiga on steroids
Exactly 💯
Moore’s law isn’t holding up like it did from the 90’s to the early 00’s so back to accelerator architectures we go!
Wth is a dpu and tpu
@@elvpse DPU = Data Processing Unit aka processor in/on a Network Card, NVIDIA BlueField and similar
TPU = Tensor Processing Unit which is related to but different than the Neural Processing Unit, Google's Coral is likely the best example of the TPU
i read that first as *Amigo* 😅
damn 😂😂
forcing AI on us was like Nvidia ramming Ray Tracing down our throats even if we never use it and charge us a premium.
I really don't understand why people are so hyped about Copilot AI - can anybody explain it to me pls?
It will be Jarvis IRL; eventually it could completely change how we interact with machines. No more file explorer: you ask for something, you get it, even with a typo. It could even remind you of the things you forgot about, etc. It could also bring true AI in games, games that are more a simulation, less scripted, but with AIs that react to your actions.
@@Raphy_Afk or it could fuck up all your important shit the way it sometimes does already.
@@Raphy_Afk how is that any better than just using the search bar
@@Raphy_Afk Thank you, nicely explained.
chat gpt + windows cuz they running out of ideas
AMD is playing it a little smarter. It's virtually impossible to compete against Nvidia, yet here they are. We see what they want us to see. Both companies most likely have surprises up their sleeves.
NVIDIA is doing better in software and firmware, but their hardware gap is not the massive one it used to be 4 or 5 years ago; it's basically only better by pumping in much more power (factory OC!). NVIDIA is only seen as much better, like Intel was, due to around 10 years with no real peers and thus near-blind brand loyalty, plus much more money to throw at advertising (influencers, ads, etc.) and OEMs.
I am not saying NVIDIA is not better, just not nearly what they used to be; at best it's marginal, at least for end consumers.
AMD does not want to compete with Nvidia and Nvidia the same with AMD. They have different markets. AMD works with CPUs and Consoles. Nvidia with PC gpus and AI. That's it. There is no real competition. They have a deal.
@@DarthAwar Doesn't make much sense you saying that they pump Nvidia cards with more power when they run at a lower TDP than AMD. Also I'd say Ryzen has pretty much overtaken Intel at this point; nobody really says Intel is better lol. They only have a much higher market share because most people with an Intel CPU bought it a few years ago when AMD was still playing catch-up.
They are rivals; it's just that NVIDIA is focusing more on data centres aka cloud computing as well as AI, and AMD is now beginning to focus on this area as well @@gamingtemplar9893
Bruh, what was the last AMD card on par with Nvidia **90 series? I hear this copium about AMD having something up their sleeve, for more than a decade.
Co pilot key = Bixby button 🗿
My bet: there are gonna be a lot of people, especially gamers, who are gonna find ways to make sure any AI copilot in Windows is disabled or completely removed.
Yeaaaaaah, because that AI might be an internal spy for CIA or even worse. Also imagine how hackers could hack AI and get literally all the info on you so easily because everything will be keylogged and even voice recorded.
Ryzen 9000 vs Denuvo will be a crazy battle
what I'm waiting for is for the lower end models to get NPU's. it'll be a while yet I'm sure.
Still waiting for 5800x3d price cut
You and me, bro.
You could get the cheaper 5700X3D .
@@fleurdewin7958 You might as well go for the 5800X at that rate.
Same here. Though at this point, the 7000 series full platform upgrade might end up cheaper sooner. The 5800X3D hasn't budged in price for 2 whole years :(
@@Mark-kr5go The whole system upgrade still costs a lot, and I have 32 GB of RAM, even if they are DDR4 3600.
@GamerMeld And people seem to be overlooking how, for games, once split-graphics capabilities are enabled, you can do upscaling and frame generation using NPUs instead of using GPU compute to do it. This will translate to more fps in the long run.
Title should say "Ryzen 9000 is Leaked" not official... gotta love click bait
It came from an official AMD chipset release. How is that not official?
@@GamerMeld Ignore him. He is just lost.
@@jeevan1198 Lol, lmao!!😂😂😂
Hes always baitin
Usually I'd be mad too but since he includes chapters I don't mind that much and appreciate the news
What a loser 😂
Is there an option not to include AI in your computer? Since i only use my pc for gaming
For now, get ryzen 7000 series and windows 10. Later - who knows
Hahahaha the rocketman clip got me, lol! One of my absolute favorite movies growing up, and still to this day.
Well, I am from Germany, and here there are A LOT of hardware channels, more or less the same information in English and German, but for me it is such weird fun to hear you talk, how you pronounce things... your dialect is fun for me ;) Thank you, love it!
Was waiting for something from Intel GPU front but already got an RTX card… excited to see AVX512 perf. in Ryzen 9000s though.
When is 9800X3D coming out?
6:04 Can a Ryzen 3 4c/8t 740M really reach 65W max TDP? Similar power to the Steam Deck...
The first sentence sound like SleepyU instead of CPU xD
i sold my 10850K build last week; atm I have an R5 7600 in the slot till next gen comes out, to lose as little money as possible :D i am excited
waittttt whyy????
@@eepyhead Since an R5 7600 will keep more value. I bought one of the better mini-ITX B650 motherboards and overkill DDR5 RAM, so I can sell the CPU separately and lose less money. An i9 10850K currently goes for some good value, but post this release the price will go down quite a bit more.
Dedicated gamers are prone to say, "What's price got to do with it?"
@outlet6989 The resell price of the 10850K will drop more when it releases, and memory might go up slightly. Due to the offset I calculated, if I resell an R5 7600 solo post-release I lose less than if I sell my i9 10850K and the rest then. So now I only have a CPU and GPU swap left this year.
Upgraded my PC early to AM5 when my motherboard started acting up... Been using a 7600 and I hate it. Waiting impatiently for a 9900
I love this channel it has been my goto for a longtime and he always has the info. He doesn’t waste a second it’s always straight to the point and I enjoy learning about all the upcoming launches and insider info.
Mad respect to using the screaming clip from “Rocketman” 😂
I’m good with my 7600x for now so I won’t be going for the 9000 series. My next upgrade will be when they announce the successor to AM5 and then I’ll get the best one at that time. Plus GPU upgrade to the latest then.
Dunno how anyone else feels, but this AI "gold rush" is as ridiculous as the Mining craze of years past...
Or the 3DTV craze ...
Mining hasn't stopped.
@@SwishaMane420 You're right, though it's not as stupid crazy as it was a few years back.
made a ton of cash mining ETH. it was great!
it's sad because it's going to give hardware manufactures (i.e NVIDIA, AMD, intel) the excuse to jack up their prices for something that the average person probably will not use.
As expected.
Known for over 250 predictions across various areas of science over the last six years.
BSc Applied Computing; nearly worked for Pivotal Games, GCHQ and the MoD, and almost went heavy into AI. I fear this will not be for our benefit and am concerned about this.
So much so that I am... worried about buying into any of this.
I am not that hopeful about Ryzen 9000. What I am actually excited for is Arm and Risc-V on desktop
I think that High Level Local AI is more useful than RayTracing, for not only gaming, but computing in general.
what is high end?
if what these days is called high end are GPUs priced around 4 to 8 times above a normal gamer enthusiast's high-end budget, would that mean that mid-low end should be called high end instead?
I actually think a TPU would be useful in games for any NPC AI behaviors or even improved random generation of content. I think there's something to be said for these things and more, though they'd have to manifest first.
Maybe - can only hope that's what pans out. But I suspect it's going to be more about data mining consumers...
Sooner or later I will move to Linux. No reason to use Windows anymore
Well, since I purchased a 7950X3D only a month ago, the 9000 news sucks.
Battlemage is gonna be pretty exciting if intel plays it the right way.
since Battlemage is currently already performing around 2.5 times better than Alchemist, based on the info they leaked during the Intel Vision 2024 event.
while they didn't directly say it, they actually shared one other statistic about Lunar Lake's GPU, which we know to be Battlemage (also, since it is already 2.5 times faster than Alchemist, it has to be Battlemage).
they gave away the full system TOPS, and when we subtract the TOPS dedicated to the NPU and the CPU we have the TOPS of the GPU. generally, due to how video games and GPUs work, the TOPS a GPU has and its game performance tend to scale quite linearly; as an example, when comparing the RTX 4060 Ti to the integrated GPU in the Meteor Lake APUs based on TOPS, the ratio is pretty much the same as the ratio in gaming.
the same is also true comparing Meteor Lake to the X Elite.
and pretty much any other current GPU architecture.
TOPS just turns out to also be a rather accurate estimate for gaming and such when compared to other GPUs.
in this case we compare it to Meteor Lake, which is based on Alchemist, and we see that Battlemage is literally 2.5 times faster for a similar size and power draw.
next to that, Intel also specifically said more than 100 TOPS total, which means they expect it to potentially still increase; the NPU won't increase and the CPU has little effect on total TOPS, so the GPU might become even more powerful.
they also stated they were pretty certain to have shipped 40 million Battlemage units before the end of the year in the form of Lunar Lake.
to put this into perspective, the Battlemage integrated GPU in Lunar Lake might already beat some of the lower-end dedicated Alchemist cards.
if the desktop cards get similar increases, or even in that direction (as long as there remain budget and mid-end options as well), then that will be great.
if Intel keeps the pricing similar to Alchemist, then Intel will almost certainly take the value crown and many people might get those GPUs.
if they add hardware memory compression (might sound bad, but in a GPU hardware memory compression actually gives more virtual VRAM as well as higher total bandwidth, since GPUs rely more on bandwidth and big messages rather than the rapid small messages CPUs use).
and perhaps if it also has an NPU next to it, that will also get those last people away from Nvidia.
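The TOPS arithmetic in the comment above can be sketched as a quick back-of-the-envelope calculation. The numbers below are made up for illustration (they are not official Intel figures); the only assumption carried over from the comment is that GPU game performance scales roughly linearly with GPU TOPS.

```python
def gpu_tops(total_tops: float, npu_tops: float, cpu_tops: float) -> float:
    """GPU TOPS = full-system TOPS minus the NPU and CPU contributions."""
    return total_tops - npu_tops - cpu_tops

def perf_ratio(gpu_tops_new: float, gpu_tops_baseline: float) -> float:
    """Rough performance ratio, assuming linear scaling with TOPS."""
    return gpu_tops_new / gpu_tops_baseline

# Hypothetical example: a 100-TOPS platform with a 45-TOPS NPU and a
# 5-TOPS CPU leaves 50 TOPS for the GPU; against a 20-TOPS baseline
# iGPU, that suggests a ~2.5x uplift.
new_gpu = gpu_tops(total_tops=100, npu_tops=45, cpu_tops=5)
print(new_gpu)                  # 50
print(perf_ratio(new_gpu, 20))  # 2.5
```

This is only as good as the linearity assumption; real games also depend on memory bandwidth, drivers, and cache.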
I hope there's a 9999wx for threadripper
You mean a 9995wx Zen 5 Threadripper, following the Zen 4 Threadripper 7995wx CPU.
AI is and will be one of the most devastating things to ever happen to humanity.
waiting for the 9800X3D is all I'm going for; atm I got a 5800X3D
How much?
Started back in Dec 23 trading out my 5900x for 5800x3D. Then realized I could sell the 3D and basic x570 board and barely spend any out of pocket for a 7600x/x670/32gb combo. Then found two local deals on brand new 7800x3D for $340 and $260. Then a 7600x/Asus b650e-e for $240. After selling off 5600G,5600,5600,5600x and four entry level x570 boards and some ram the out of pocket was very little.
Hopefully Ryzen 9000 doesn’t trigger any upgrade remorse. The deals were just too good. 😂
OVER 9000! don’t miss the opportunity AMD
Am I still wrong for wanting an AMD 5950x 3D?
AMD needs to increase their single-thread performance by at least 25%. Also, they should do something great and give us more than 16 cores / 32 threads. Something like 32/64 for the same standard consumer price. Reduce the temperatures; don't go over 80C. Increase the base memory clock speed support to 8000 MHz, not XMP/AMP.
Hope everyone is ready for the over 9000 memes to return.
No, I don't want to sacrifice performance, and have NO use for AI on my system.
That AMD APU news makes me rethink preparing to buy a new AMD-based computer❗
If you have new tech, use it or you lose overall.
Setting new standards without the buy-in of all computer manufacturers is scary. On the other hand, our interaction with the operating system will change completely. The engine that drives the AI readiness in these new computers is something that should be included on the SOC like a video card. Will it hurt performance? I do not think so, it better not cost TOP.
Let's just hope we get actual high-end ITX options for Ryzen 9000. The 7000/8000 series only has one crappy X670E board from Asus and nothing for the X670 non-Extreme(tm) edition. Considering that a lot of people want small PCs, which typically require 2 high-speed USB-C ports... well... options are really lacking, even for B650. Why on earth does the ATX giga-tower get a mobo with enough ports to be a hoe-board, while the actual IO-constrained form factor with far fewer connectivity options ends up with only a few USB ports? Even if they are 3.2 Gen 2? Why?! If any form factor needs USB 4... it's the super tiny ITX. I haven't checked out the ITX support on Intel yet, but I'm seriously getting pissed. While there are some attractive options for ATX, it just isn't worth the dream of a tiny minimalist powerhouse PC. Don't get me wrong, there are a lot of complaints about 7K Ryzen chips and EXPO... (BTW Asus... WTF, WHY IS THERE NO CLEAR CMOS BUTTON ON A PLATFORM WITH A NEW MEMORY OVERCLOCKING STANDARD!?!?!)
People should take control of their own brains before they even get to use AI
1:40 Well, it's not worth the trade-off, because I am a gamer and I prefer native over DLSS/ray tracing. They should focus on AI 10 years from now; it's too early to go all in on AI now.
Why not give CPUs FP8 capabilities instead of an NPU? FP8 should make them decent at AI workloads.
1:10 I bet AI flavours of products will start being released to try and maintain gaming performance, until they can combine them with a smaller trade-off.
HOPE YOU HAVE A GREAT DAY!
Call me stupid if you like, but I really don't get Copilot or how it could help me day to day, so I don't get the hype. Isn't it the Google Assistant AI? Pls explain 😂
AI PC will probably be the next 3D TV
1:36 Yes.
A: Because it's a temporary problem.
B: Because locally run uncensored AIs are preferable to cloud censored AIs.
i really don't want AI on my PC ... most folks are on cell phones, so who does this cater to? the business market, the gamer market? or the content creation market? the research market? the only group i see that would want AI is the research market.
Amd announced a new processor
"It's over 9000"
they will release 100 cpus before they release another decent series of GPUs.
since 4000 wasn't worth it over my 3080ti now I've been waiting. why do the companies make it so hard to spend my money
truly dont give a shit about AI. Everyone got excited before anybody got excited.
I think AMD's CPU numbering scheme is a brilliant bit of marketing. By using alternating number groupings for each release, with mobile on even thousands and desktop on odd, there's a bigger numerical jump, so people might be lured into upgrading sooner than they otherwise would! If the Ryzen CPU numbers only ever went up by 1000, people would be less inclined to upgrade, since they're only one gen behind. But bump it up by 2, and people who are unaware will feel like they've held off enough and it's time to 'jump' over the mobile part number in the sequence, not realizing that they're really just going up a single generational step.
I feel like AMD is concerned about their APU lineup.... ARM on the other hand has great iGPUs; maybe that's the reason why they don't play in high-end GPUs.
Maybe they will create an AI and a non-AI version of the CPU for those who don't need it.
AFMF is a blessing to have
I'd rather AMD max out efficiency and make RDNA 3.5 almost an RX 6600.
Give me everything AMD!
I would buy a cpu with no AI accelerator and more cache over one that has an AI accelerator. AI is not a purchase decision factor for me at the moment. Maybe on a discrete GPU, but not on mobile.
I would use it if there was more software maybe, but at the moment I have no compelling reason to buy it, and llvm still takes too long to compile. More cache please.
it's stupid that they ruined cache for "ai".
Just leave ai to GPUS. Nvidia is great at it.
Intel Battlemage coming around Rtx 5000 could mean either intel is confident enough to compete or too lazy to release it earlier like Arc came before 4000
I think they should keep making budget low end GPUs so that they can spend a few years developing GPU drivers
@@blamyy6310 Maybe they will make budget GPUs. Right when Nvidia drops their 1200$ cards and AMD releases their 600-800$, intel will launch their 300-500$ cards.
All I can say is AI/Copilot better be optional on future Windows releases or I'll be digging my heels in even harder as it's not something I'm even remotely interested in and I don't see why we, the end user, should pay extra (financially and performance-wise) for hardware and services that a lot of us will never use.
A lot of us are already paying for ray tracing hardware in GPU's which we are either not using out of choice (not interested), or cannot use because anything below a 7700xt / 4070ti is pretty much a waste of time with RT on anyway due to the poor performance
When will this enforced inclusion of "gimmick" technologies in mainstream hardware end, and why should we have to pay? Remember 3D TVs: for a time you could hardly find decent TVs that didn't have the 3D gimmick, which obviously cost extra, while the rest of the non-3D TVs at that time tended to be crappy entry-level panels. Who actually watched 3D content at home long-term?
amd refusing to build a halo product even in very limited numbers is going to cost them hard fought market share as the 7900 xtx fans are forced to upgrade to team green.
intel and amd be like: trade deal: i get to spy on you more efficiently with useless AI NPUs and you get worse performance
Release the Kraken!
I have a future prediction. Graphics cards will be obsolete as we know them. The power of these desktop computers is now so enormous that they already rival the graphics cards for compute power, not in all specialized areas, but in most use cases. So my future prediction is that there will be a transition back to software rendering; CPUs will eventually become so strong that they will bear the load, and the graphics card will mostly be basic, just displaying fast 2D rendering, while the 3D rendering and effects will mostly be done in software and utilize the CPU itself.
AMD actually screwed up their laptop CPU names.
AI? Hell no, it's not worth the trade-off!!! My new i9-14900K @ 5.8GHz PC boots up in 58 sec; that is 7.5 times as long as my old i7-6850K @ 3.6GHz's 8-sec boot time, and I even used the same Samsung SSD 980 Pro to test boot speed before placing in my newer, even faster drive, which still takes 48 to 58 sec to boot because of the stupid AI training the memory. And my old PC's RAM is DDR4 3200MHz vs the new PC's DDR5 6000MHz, with the same RTX 3080 10GB. And after 1 month of running, my RTX 3080 is no longer found by the BIOS. Thinking of just fixing it up and dumping it at a loss to get rid of the slow-AF-boot trash, because I do a ton of reboots and the time lost on the new AI CPU wastes a buttload of time.
nice shirt!! like!!
2024 July 20th
Ryzen 5 9600X
Ryzen 7 9700X
Ryzen 9 9900X
Ryzen 9 9950X
And 2025 February
Ryzen 7 9800X3D
Ryzen 9 9900X3D
Ryzen 9 9950X3D
My question is: I already have a 7600X; should I wait for the X3D variants, or just purchase a Ryzen 7 9700X day one and not look back?
Is this AMD's "hold my beer" moment? Why not a more affordable dedicated card for AI? They already have an X3D version over the base model; what's another column in the graph? Just have Base, X3D and oh-my-AI. Those who want every bit of performance possible shouldn't have to feel hosed because AMD bought into the Great Tulip Craze of the 2020s.
Since I don't trust any of these companies, nor do I trust AI, no. Not at all worth it.
I hate AI.
i want an APU like the PS5 Pro's CPU
AI is a double-edged sword 🗡️ It's cool to have, but because it exists games are once again forgotten, like how the mining ⛏️ rush fucked us
if games like minecraft are anything to go by, anything above CGA is a waste
LFG 9800X3D
AI is worse than raytracing. Lovely to have, way too expensive to be worthwhile.
Am I the only person who consistently skips to the last story because it's always the headline story?
I'm well fed now; I was starving for next-gen AMD CPUs and GPUs, especially since all the news from Green and Blue has bored me... I'm now just waiting for the R7 9800X3D and RDNA5 RX 9800XTX, which is why Nvidia is releasing a 5090 in the first place, and Nvidia high-maintenance fanboys should be thankful for that :D
edit: also, as long as I can uninstall Copilot with something like Revo Uninstaller without it messing with the system files and making my computer unusable without reinstalling Windows, it should be fine.
Things I don't want to see on youtube tech videos anymore:
1. RX 570/580 reviews.
2. "Brilliant" ads.
Delete youtube then
I hate AI
Zen 1 -> 1000
Zen 2 -> 3000
Zen 3 -> 5000
Zen 4 -> 7000
Zen 5 -> 9000
It's pretty consistent, but doesn't make sense to do it this way. What comes after 9000?
Zen 6 -> b000 ?
next is surely A000, and Zen 6 -> B000
Zen 6 10000
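The pattern in the list above (desktop Ryzen series on the odd thousands) can be written as a one-line formula. A quick sketch; the Zen 6 extrapolation is just the commenters' speculation, not anything AMD has announced:

```python
def zen_to_series(zen_gen: int) -> int:
    """Desktop Ryzen series number for a Zen generation (odd thousands):
    Zen 1 -> 1000, Zen 2 -> 3000, ..., Zen 5 -> 9000."""
    return (2 * zen_gen - 1) * 1000

print([zen_to_series(g) for g in range(1, 6)])  # [1000, 3000, 5000, 7000, 9000]
print(zen_to_series(6))                         # 11000, if the pattern simply continued
```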
I don't need another key for co-pilot. I don't need a windows key either. - my IBM model M has 102 keys, all i've ever needed.
I have deleted my previous comment due to the insinuation of calling someone stupid. So... basically put, considering someone wants to compare me to a Neanderthal: with today's modern CPU technology, NO user should have to use software to dictate which cores do what. Yes, I am not ignorant that 3D V-Cache memory is heat sensitive. Yes, I am not ignorant that CPU cores with 3D V-Cache have to be clocked lower.
Now, what I was referring to as a whole was the latency that has been experienced and documented when dealing with dual-CCD CPUs (regardless of 3D V-Cache configuration). My original comment was that I hope AMD will have this addressed or improved. The hostility was not warranted, needed, or appreciated. The name-calling could have been done without.
Won't have anything to do with me. I'll continue to debloat Windows 11 using PowerShell, Command Prompt, regedit, making custom firewall rules, modding my hosts file, and bypassing all the useless requirements for an internet connection and a Microsoft account to install it, and also the ridiculous hardware requirements, so I can install it on my perfectly fine but older CPUs and hardware. After debloating it, I run sfc and dism, disable the stupid fast-boot hibernation state, and also disable account sign-in assistance, because when bypassing the stupid Microsoft account requirements it can cause it to act up, so always disable that too. After everything I did, it almost feels like Windows 7 in that it leaves me alone: no tips and tricks, no stupid AI Copilot or Bing AI, no default-app crap. It also uses less RAM now too. Most people with a brain couldn't care less about that NPU crap; it's a waste of space on a chip that could be dedicated to more cache or more GPU cores, something actually useful instead...
I actually use AI rendering on my GPU way more than I game on it.
When I upgrade my GPU, it's going to be to double my AI rendering power and halve my AI rendering time.
Gaming is so... yesterday. I mean literally, who games in 2024? Millennials hugging their obsolete Xboxes, way late and lagging far behind the times.
Bored of all this AI
Gamer Meld, you said Alchemist might release too late a couple of years ago, and Intel did fine with the Arc release. Now you are saying Battlemage may release too late. I know you are still pissed off about the AMD snake oil ads, but AMD has a history of reverse-engineering Intel CPU designs; that's how they got to where they are now. Know your history, do some research.
Also, Nvidia will likely increase GPU prices across the board for the 5000 series, making Alchemist more appealing to mid-range budget gamers (the biggest segment in the market). Aren't you being disingenuous to your audience, or is there something else going on?
I need to say it
IT'S OVER NINE THOUSAAAAAND!!!
The title and thumbnail keep changing, each more clickbaity than the last
GG man
I don't want Woke AI in my PC, soon I will go to Linux if this keeps up or MS tries to make it so I have to pay for their OS every month. AI hates whitey, no thanks.
NO I DO NOT WANT AI ON MY PC.
Without the software it will do nothing
@@vitaobatera Yea with Microsh1ts. It'll be loaded on your PC as a "Feature Update" and "Surprise!"
AI slowing down CPUs will get fixed. Also, it will likely need more power, and I don't want to hear from climate weirdos saying it is bad for the environment. We need to fuel AI so we can fix the environment; foobars want to go back to the stone age. Such emptiness in the world today.
There is nothing to 'fix'