► Thanks to ProtoArc for sponsoring! Grab the ProtoArc XKM01 CaseUp travel solution from their official website and use code “UFD25” to enjoy 25% off early for BFCM at
www.protoarc.com/products/xkm01-caseup-combo?ref=UFDTECH
#ProtoArc #ProtoArcXKM01CaseUp #ProtoArcFoldableKeyboard
🍓
To be fair, AMD kinda messed up with gaming. They should have kept the 5800XT/7800X3D in stock at a lower price and recalled the 9000 series. I'm still laughing on a 7000 series.
I will not.
WTF ARE YOU WEARING DAWG 😭
Hot nacho cheese wizard
Crawfishes? lol
humanz skin suit
He's a Dorito 🤔
RED ROCKET😂❤
If AI meant Absolute Idiot, then nearly all work done at my workplace is being done by AI... Unfortunately I am the prime contributor
PCMR is one hell of an echo chamber
Most people don't mind AI as long as it isn't AI art.
Are you automated AI?
@DeepThinker193 being an idiot is pretty much automatic for me
It can also mean Another Indian
@@dqskatt AI is freaking awesome ! Once AI can replace our braindead corrupt government, we will ALL be better in our society.
if AMD is a Massive Collapse, then what the hell is intel then
😂😂😂😂 Exactly, I don't know what they're even talking about, because literally AMD is on fire 🔥🔥🔥
Could be more on the GPU side and why they might be dropping high-end GPUs. That would be my guess.
I got so effed by AMD. Literally at 4:10 PM I bought $50 in stock. The crash happened at 4:18 PM XD. I lost a whole pizza worth of money! I'm ruined!
Weird time to buy isn't it? Unless you were expecting the announcement and hoping it would go up instead because of it?
Time for some vegetables
😶
It will come back. Let it ride.
@@MoonBunnyLovers weird time to buy? The quarterly earnings were the reason to buy; EPS was positive and earnings were positive. Just a shame there wasn't more, I suppose.
So the whole industry except Nvidia is losing? Just great...
Maybe as far as the gaming market goes, but other than that AMD just reported a quarter with record revenues and a very healthy profit.
@@swdev245 AMD is still far from Nvidia
if the pc market goes down, everyone goes down, except nvidia, because of ai stuff
Well i helped AMD this quarter by buying an Rx 7900xt
I did my part and got 7800 XT
I did my part and bought a 7700 XT
@@veltriixguys. Let's not shill for a big company. I'm on an AMD GPU too, but I don't really give a fuck beyond the better value.
@ I agree, I had been nvidia since 2010 but this year I couldn’t do it, I had to buy the better value and I don’t care about RT
Cringe amd fanboys jesus christ
Did they expect sales to not crater when they didn't release any new gaming products and didn't drastically lower prices during the quarter?
The term "AI" has become such a source of fatigue now
Just like Crypto and NFT, it will die down. Most of it is just Hype, and only a few companies like Nvidia will benefit from the craze until something revolutionary comes along.
@@GurksTheGamer nope, it isn't nearly the same as NFTs or crypto, because it has genuine use cases. Still annoying though, terribly annoying.
@@yassir-5605 we shall see
@@GurksTheGamer there is no "we shall see"; it has real use cases which millions of companies are incorporating now and which all their users want. There is nothing "we shall see" about this.
@@onthegrid6933 give examples of such use cases
It's funny hearing about Apple's memory bandwidth "improvements" this gen, considering they cut the M3 Pro's bandwidth down 25% from the M2 Pro. Now, they're touting a 75% gain over M3. Sounds to me like they never needed to cut in the first place.
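A rough back-of-the-envelope sketch in Python of just the percentages cited above (normalized numbers, not Apple's actual GB/s figures, and reading the 75% as being over the M3 Pro): a 25% cut followed by a 75% gain only nets out to roughly 31% across the two generations.

# Back-of-the-envelope math using only the percentages cited above,
# with M2 Pro bandwidth normalized to 1.0 (not Apple's real GB/s numbers).
m2_pro = 1.0
m3_pro = m2_pro * (1 - 0.25)    # 25% cut from M2 Pro -> 0.75
new_pro = m3_pro * (1 + 0.75)   # touted 75% gain on top of that -> ~1.31
print(f"M3 Pro vs M2 Pro: {m3_pro:.2f}x")
print(f"New Pro vs M2 Pro: {new_pro:.2f}x (~{(new_pro - 1) * 100:.0f}% over two generations)")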
Next to 'enshittification', there is the 'encheapenization'.....
The problem with not having USB-A is that you can still buy external storage off the shelf that's USB-A for a fraction of the price of what's USB-C out of the box. Even cameras like the GoPro ship with a USB-C-to-A cable. Products need to support USB-C computers, and computers need more USB-C.
This. There are so many options that presume USB-A by default that it is not even funny. I have a recently built system; it only comes with 2 USB-C ports, while it has tons of USB-A. Same with products to plug into those ports.
I think Reece just lives there now. In his Elmo outfit. Eating a giant flaming Cheeto.
AMD will have a big jump in consumption if they finally get a native CUDA style system. Like ZLUDA, but built into the real drivers.
AMD unfortunately has great hardware and lackluster software
AMD already has that; it is called ROCm. In a way, the existence of ZLUDA is kind of mocking what AMD has been doing with ROCm.
🙏
@@arenzricodexd4409 ROCm still isn't implemented well with other software. I used Blender, and the only way to get it to render with my GPU was with ZLUDA. Until a normal consumer can buy an AMD GPU, download the normal drivers, and run all the software an NVIDIA card can, they will stay behind.
7:57 Maybe reconsider picking the monitor up for HDR, since the Q&A on the product page says it only has 384 dimming zones, which might not be enough for decent HDR
I have an i9-12900F. I am not planning to fork out 500-600 euros for either of the new Intel or AMD CPUs; not worth it, definitely.
I agree. My 5900x has served me well for the last 4 years, and will continue to do so for several more years to come. No point in upgrading every year, especially when it’s very obvious we’re in the middle of a transition period right now. Chips in 2-3 years will be very different than they are now with everything becoming AI
You can literally sleep on that for like 10 years or more.
wtf is a 12900F? some sort of fisherprice "premium" cpu? can't even oc that crap to the 14 gen level
540$ for 12900F? jesus, this thing stinks
@ MrIdiot The 12900F is the OEM SKU, 100 MHz less than the K on Turbo/Boost. The Alder Lake SKUs do just as well as Raptor Lake after you take the steps necessary to minimize the Raptor Lake defects. Also, no idea where you found that price; even a 12900K can be found for around $200 USD new.
Brett: checks watch
Me: he's actually wearing a watch!
Brett Looking like a Hot Cheeto Pharaoh @13:20
I also hope Intel gets some sales; Intel needs to survive for us to get good CPUs!
I'm an AMD fanboy and I agree.
Intel will be fine. They will still have a massive amount of computers sold to businesses and prebuilts just like they always do
The likely reason those CPUs didn't sell is because the retailers weren't actually selling them. They have the listing, but no stock. Same in my country, I can see the listings for a 285K for 635 or 645 euros at different sellers and none of them have stock.
From a sales perspective, Intel actually generates more revenue than AMD.
No wonder Google eats so much RAM; AI tends to make really unoptimized code
I love the concept of USB-C, but the implementation has been so terrible. And it's the kind of terrible that was easily avoidable.
I use Copilot at work for code and I do find it quite useful.
It depends a little bit on what you mean by writing the code though... It's more like a really good autocomplete.
Like, I need to write an implementation of an interface that I've already written different versions of like four times, so it'll make that way faster.
Maybe it'll help me write the tests.
It's definitely not like I'm telling the AI to go do something and it just goes and does it. It would do a terrible job at that.
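For illustration, a minimal Python sketch (hypothetical names, not the commenter's actual code) of the repetitive "implement the same interface yet again" work described above, which is exactly where autocomplete-style assistants tend to shine:

# Hypothetical example of repetitive interface-implementation boilerplate;
# the shape is dictated by earlier versions, so an AI autocomplete fills it in quickly.
from abc import ABC, abstractmethod

class Storage(ABC):
    @abstractmethod
    def save(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def load(self, key: str) -> bytes: ...

class InMemoryStorage(Storage):
    # One of several near-identical implementations of the same interface.
    def __init__(self) -> None:
        self._items: dict[str, bytes] = {}

    def save(self, key: str, data: bytes) -> None:
        self._items[key] = data

    def load(self, key: str) -> bytes:
        return self._items[key]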
I think he meant they are generating code that then has to be revised and fixed by humans, so it's probably a lie. I remember them saying all the audio was transcribed by software, and then people figured out they had a lot of people working cheaply on that somewhere around the world. You don't even have to explain what you are wearing, that's right, I don't care, but Reese looks so sweet actually.
My ex-girlfriend replaced me with a 14 inch vibebrator...does that count as A.I.? 😂
Well... if the machine was trained, yes, it's A.I.; if it was not trained, then it's a normal machine.
In any case, the machine did a better job than you, since she decided to keep the machine instead of you.
I work in telecoms and I cannot use ChatGPT or other LLMs as a pricing manager. Sometimes I would throw a complicated formula at it, but generally speaking I use it more privately.
The amount of memory and the memory bandwidth are key enablers of the NPU.
I think that AMD's gaming sector is just a reflection of the general consumer's wallet. I feel like most people that buy AMD are mid to low income, which is the demographic that's hurting the worst right now. In the US anyway.
11:00 Only in gaming... When you can 10x your profits by moving from gaming to AI customers... you kinda have to. AMD's biggest problem is that CUDA rules and the open-source framework alternative is just nowhere near as good. It will take a few more years to catch up and a few more to get people to use it by preference.
AMD next needs to get in on ARM CPUs/GPUs... Qualcomm is having licensing issues... Some AMD magic on some ARM IP could make for some real contenders. MS is in the way though, with their lacklustre minimum-effort ARM offerings of Windows. Windows is borked. A global system registry? lolz. Self-contained apps? lolz. Could all be done so much better with a new approach. How about UI consistency? How about just one way to do configs? Windows needs to split: Classic & New. Classic for people who 100% must have LTS; New for customers who need cheaper-to-own/run/administer machines.
Dude. AMD's video cards have never been better.
I bought a 6950xt and all my homies switched this year from team green. The drivers are awesome and the performance is great.
The only thing that keeps me from going Apple atm is not having a 2-in-1. As an artist I need a laptop and a tablet, and I do not want to carry multiple devices (be it a tablet and laptop, or a laptop and a pen tablet). Also, I really am not a fan of iOS. Until then I will enjoy my ThinkPad.
If people weren't excited enough to buy RDNA3, I don't see RDNA4 being much different since they're going to be missing out on the high end. I have to wonder if they're even going to have an 800 series or if this is going to be like RDNA1 and the best we get is the 8700 XT. I think they're going to have to have some impressive increase in perf to get people excited to try to shift that number the other direction. And it's not like the PS5 Pro is going to sell a ton of units since that's only going to appeal to a niche segment of console gamers.
Gamers in general are not excited to buy anything. Those that already own something like an RX 6600 or RTX 3060 will be good for a few years.
@@arenzricodexd4409 Sure, but it's not really slowing NVIDIA down. That doesn't really explain the lack of AMD GPU sales.
@@TheGameBench because Nvidia gaming GPUs are much more useful than just for gaming. Don't be surprised if 70% of GeForce buyers are not gamers at all. Now even companies are buying Nvidia GPUs for their official workstation solutions, not just semi-pros. In China, for example, cloud compute providers ended up buying 4080s and 4090s and modding them with more memory instead of buying the significantly more expensive H20. That's why there was a GDDR6X shortage, causing Nvidia to release the 4070 with regular GDDR6. AMD saw Nvidia's success with this, hence why UDNA happened.
@@arenzricodexd4409 I'm only speaking to their gaming GPUs. I'm not even referring to their workstation or AI cards. That being said, AMD GPUs are useful outside of gaming as well. They trade blows with NVIDIA in productivity workloads. It really depends on what you're doing.
😢
Intel: Nobody is buying chips for DIY Gaming
AMD: Gaming Revenue is down
Is anyone buying anything gaming related?
I love my base MacBook Air M1 with only an upgrade to 16 GB RAM. It's the perfect light work machine. Now that all of the Macs get a minimum of 16 GB, I'm likely to actually buy a mini or poke the people around me who have Intel MacBooks to finally upgrade.
I just hope AMD remembers that a lot of their current growth comes indirectly from gaming innovation and that Nvidia would be nowhere without it.
Well, considering they make more from data centers, most likely they will give that sector priority.
Maybe for Nvidia, but not for AMD. In fact, buying ATI is considered one of AMD's biggest mistakes in their history, one that eventually led to a decade-long financial woe for the company. If anything, gaming almost ruined AMD.
@arenzricodexd4409 but now it's one of their biggest strengths. Without AMD buying ATI, we would not have great APUs, especially in laptops. The mobile market would continue to stagnate; instead, it's evolving fast. That may not be true for desktop CPUs, but given desktop PCs are a niche between servers, workstations, and laptops, I don't think they have to worry.
@@arenzricodexd4409 AMD iGPUs are still an important differentiating factor vs Intel CPUs; that's why they got the console market, and why they successfully entered the handheld market, and in the era of AI, that GPU expertise might turn into a cash cow if they play it right.
@@grandsome1 yes, they got that market, but my point is that no matter how good AMD's GPUs are, the console and handheld markets can't really generate big revenue. And the last few quarters, AMD's gaming revenue has been getting worse. Usually AMD is able to get something like 1.5 billion on average every quarter; at worst maybe it was at 1.2 billion. But Q1 this year the revenue dropped to only 922 million. People thought that was bad, and yet the revenue continued to drop in Q2 (648m) and Q3 (462m).
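Taking the figures cited in that comment at face value, a quick Python sketch of the quarter-over-quarter decline (the 922/648/462 values are the commenter's numbers in millions of USD, not independently verified):

# Quarter-over-quarter decline computed from the gaming-revenue figures
# quoted in the comment above (millions of USD, taken at face value).
quarters = [("Q1", 922), ("Q2", 648), ("Q3", 462)]
prev = None
for name, revenue in quarters:
    if prev is None:
        print(f"{name}: ${revenue}M")
    else:
        drop = (prev - revenue) / prev * 100
        print(f"{name}: ${revenue}M ({drop:.0f}% down vs previous quarter)")
    prev = revenue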
Hell Mo and sesamean street is just a great set of opening jokes. Happy to have a break from Brettfast for some Hot Cheetos.
I think Brett said sesameme street 😅
@@jimbodee4043Just take the context and you will understand the Sesamean Street. You heard Reese say Hell Mo, right?
15:42 Correction... that's not capitalism, that is consumerism.
Their gaming is down because they spent money on CPU and GPU development and they feel those are worth their weight in gold and diamonds.
Where I work, this quarter we actively stopped most AI usage. I think some server stuff is still using it but only for highly specific tasks.
I work as a mental health professional so I don't want to speak too soon but I don't think they can replace us with AI. Unless people who struggle with mental health related problems don't care if they are talking to a real person. If so, I may be in trouble.
happy halloween to the cheeto and larry david's op
I'm a janitor/handyman. So... I'm pretty safe unless companies want to buy an automated zamboni thing and then neglect everything else... Oh god I'm screwed.
Reese are you a chili pepper for Halloween 😂😭 Best Tech News yet, Happy Halloween UFD Crew! 🎃
That 4k monitor seems pretty good for watching video. I really want to have a 27" 4k above my primary display for videos.
Good morning and Thank you for the information Elmo and Mr. Boneless buffalo chicken wing.
I wonder if we're also seeing the failed 9000 series launch reflected in those sales figures?
The way Reese yelled out "AI" 😂
Running a 12600KF; was running an R7 5700G (changed due to audio issues, thinking it was the APU causing them). Honestly, performance-wise, I hardly notice. As always, go for the best price for performance.
Won't be upgrading the CPU for another 3-5 years.
Do want my 3060 swapped out for a 5060/5070 tho.
My dad is a computer programmer, and they use AI to generate code; then all they need to do is proofread it, and it speeds up the coding process by a huge margin.
I really hope it wasn't a paper launch, however likely, and instead there are very sad scalpers crying over piles of 285K boxes they can't sell.
Sorry Brett, you get a 👎 for saying Apple more than three times. It's like Beetlejuice, you'll have Tim Skeletor Cook showing up, and that's just trouble.
75% building design, yard design, data distillation, MTG deck building, system optimization.
The Cheetos man, turned into a Spanish inquisition nun man at 13:19
A single USB-A port (and SD slot) is all the Mac Mini needs.
I have 3 things I need to plug into my Mac:
My network
My display
And my cheap $50 dock so I can connect my keyboard, mouse, and card reader, and no, I refuse to use Bluetooth.
Honestly, this is more ridiculous than marketing this for photographers and not including a fast SD slot like the Mac Studio.
Apple is all about taking the research out of the buying experience, and now you expect your users to research which external card readers are good enough for their needs? Honestly been mad about the removal of this slot; it's so easy to have 2 chassis, one with and one without, if it's an aesthetics thing.
iMac though, why wasn't the SD card reader on the bottom of the screen on those models?
At least they moved some ports to the front, absolutely love that, but now that the power button is on the bottom, why isn't it on the front?
Heck Apple, you're so keen on putting important things on the bottom, why not put an SD card reader there, so that when there isn't a card, you can't see it, but when you plug in the card, it sits flush with the front of the aluminium shell.
I never leave any part of my programs or scripts to AI. I find doing it myself easier than telling AI a series of actions and conditions it has to fulfill using code.
The only time I use AI is when I want to understand a concept I don't know how to work with, in which case I'll learn the process thanks to the AI and then do it myself, knowing what I didn't know before.
I still want more dongles. More dongles for all! 🔌👈😎👉🔌
When I see your "hotdog" costume, I'm just imagining it with Egyptian Pharaoh headdress textures.
10900k, should I upgrade or hold another couple of years?
Lmao ya no, we don't want the War of the Five Kings with Valve as the battleground.
_'Full USB'_ will mean USB A to a lot of people for a long time
Yeah, pretty sure that's what the commenter meant. And I was just being a jackass with the dongle comment 😉 I'm satisfied with lots of USB-C ports honestly
Much of the nvidia "consumer" segment is still related to smaller data centers buying consumer gaming GPUs. Remove that aspect and nvidia is doing worse than AMD.
I play around with AI code generation, but have NEVER straight copy-pasted from ChatGPT to a project.
It is great for drumming up ideas or alternate approaches though.
Next quarter will be much better for AMD. A lot of people held off buying anything whilst waiting for Zen 5 and Arrow Lake (myself included) and now that both flopped, last gen AMD CPUs and AMD parts in general are flying off the shelves. I finally built my PC last month after seeing Zen 5% due to the 6 free games bundle from AMD and cashback from ASUS.
16:44 A good Thunderbolt dock costs over $100 where I'm from, and I need to buy one to use most of my accessories, even a mouse or keyboard. It's not people being stupid and not accepting USB-C; it's not wanting to make a separate purchase just to use a Mac mini I would already have to spend a minimum of $1000 on.
Ryzen is Client, Gaming is GPUs and Consoles.
Doesn't surprise me; data centers have unlimited money to pay whatever price AMD puts out.
The rest of us plebs do not have unlimited cash and can't justify paying $500+ on a CPU.
You know there are CPUs below $500, right? Like... MOST of the CPUs from red or blue are under $500...
And they work great for gaming. This makes no sense.
Imagine that some people still think AMD makes the same crappy CPUs from the Bulldozer era.
Datacenters don't have this problem and are usually led by well-informed people who appreciate efficiency.
@@Hardcore_Remixer I've met people like that. And yeah, they will never change. They even said, "The CPU kills itself by overheating," like in one of the videos on Tom's Hardware.
😭
Love this CHEETOS 🤣🤣🤣
Sesa-mean Street 💀
Lol... Data center is where the money comes from. So I'm absolutely happy for AMD. And you're also right, we need AMD to sell more GPU stock so next gen can come sooner. Lol
People are looking for value.
Too much selection. Like cars. Tesla builds 5 SEXY, Cybertruck. That is all. Everyone knows what they need/want. AMD needs to do the same. Just a few choices depending on what you need.
That’s probably why AMD is hiking the price of the 9800X3D by $30
So big tech is laying off in droves while expanding their data centers. Explains why gaming software/hardware sales have dropped. We broke as hell dawg....
Sea Sam mean Street 😂😂😂
Nah man, not a single sale at Germany's largest retailer is embarrassing
If I remember, Xbox console sales are down. I know that's not the whole picture, since GPU sales are down as well. I own an AMD video card, but I don't think AMD will catch up anytime soon, as they got a lot of bad flak over their GPU drivers crashing; people I know just wouldn't go to AMD or touch it because of that. As a Linux user, AMD is what I picked for my video card since it works better there.
It would seem that nobody is buying an MS or Sony console by the looks of things, which really isn't surprising considering their price. AMD needs to drop prices considerably instead of trying to match Nvidia on price, and they also need to get their power usage down.
4k 60Hz MiniLED for 250 is insane, despite low refresh rate
14:40 Looking like a hot Cheeto Pharaoh!
I mean most of the code is basically just boilerplate in most scenarios.
AI-generated code is great for unit tests. That can easily account for 25%.
And boilerplate code. Even before AI, a lot of code was already done by code gen.
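As a hypothetical illustration (made-up function and tests in Python, not from any real project), this is the kind of mechanical unit-test boilerplate the comments above describe AI handling well:

# Hypothetical example of boilerplate-style unit tests: mechanical setup,
# obvious cases, no tricky logic -- the sort of code AI tools generate well.
import unittest

def apply_discount(price: float, percent: float) -> float:
    # Toy function under test, invented purely for this example.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    def test_no_discount(self):
        self.assertEqual(apply_discount(50.0, 0), 50.0)

    def test_half_off(self):
        self.assertEqual(apply_discount(50.0, 50), 25.0)

    def test_invalid_percent_raises(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)

if __name__ == "__main__":
    unittest.main()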
Well, if they expect us to buy a new GPU every year, they better start selling them 75% cheaper.
It's funny, I thought the new Intel CPUs were sold out in the US because of the whole thing about them being the first desktop CPUs to have an integrated NPU, but I guess it was because of the very low stock in the shops 😂
AMD gaming CPUs are down because everyone is waiting on 9800X3D processors.
no one is going to buy anything until black friday super sales, especially when we are all waiting for new stuff at ces next year which is... 2 months away
I don't think it has anything to do with the tech itself. AMD's chips are amazing right now and Intel has made some real improvements. But in this garbage economy, no one is upgrading unless it's absolutely necessary. That 2% inflation claim is complete BS; food alone is up 200-300%, which cuts into people's disposable income. It doesn't matter if AMD dropped a 10 GHz chip right now, most people wouldn't upgrade. Compare that to a few years ago, when people would have happily upgraded for a 10% performance boost.
9:00 Why didn't Intel take the opportunity to shake things up and release an iGPU that soundly beats a 780m? If they knew their CPU was nothing 'special', why not give people a reason to hop on the train?
CD Projekt RED doesn't own GOG - RED is the developer, and CD Projekt is the bigger publishing company owning RED
People will be buying AMD CPUs, increasing competition and consumer benefits.
People won't buy AMD GPUs because "thEy WoNT RuN Rt gRaPhIcS".
This gives Nvidia the chance to release anything at any price, with AMD just giving up. Sad.
1/4 AI; my bio teacher let us make up to 1/4 of our paper with the help of AI
16:29 can we keep saying dongle? Just dongle all day guys
I had to stop 5 seconds in so I could get in a laugh session. That Elmo hoodie or onesie is hilarious.
AMD was just selling out of supply at the end of last year; I wonder if this is just a slowdown from everyone buying up the X3D last year.
I mean, I already upgraded from an Intel i7-8700K to an AMD 7800X3D. Can't help them more than that for a couple of years.
I'd love to buy a new AMD GPU, but I'm waiting for the next generation. If they want to push the release date out to next year, they're just delaying when I'm going to buy another one of their products.
I think AMD is down because all of us that had to wait during the pandemic for new GPUs bought when the crypto crash happened and so don't need a new graphics card, especially in a bad economy. I know after waiting 3 years stuck on an RX 580 I grabbed a 6650XT when they dropped below $250 and since I game at 1080P this is more than enough for my needs. Maybe when the economy picks up I'll move to 1440p and grab an 8000 series but right now there are too many other things that could use my $$$ and if it isn't broke? 🤷♂
I am a fairly high end copywriter, I'd say I am about 50% using AI these days.
Would you recommend getting into copywriting as a side gig?
@@paytonfritz6913 I hate to say it, but sure (if you're already pretty naturally good, like already B+ or stronger). AI can outwrite most writers to a "B+" level, but above that, one needs to know what good copy looks like...
I hereby nominate UFD Tech for the award of Tech tube channel with the Clickbaitiest video titles. 🤣🤣🤣
Is it legal to only have two companies be the only choice in a international market?
0% AI Engineering baby, FRICK’EM! 💪
I'll boost their sales when they finally release the 9950X3D
I'm both wary about AI being just thrown onto Mac and iOS with abandon
and
really pleased that base model MacBook Pro, Air, and Mac mini are the best bang for buck for productivity and general use.
16 GB of RAM across the board is great.
I know there's the walled garden Apple has wrapped around AI models to run them on-device, but even then, image gen and agent models just make me uncomfortable top to bottom.
Macs used to have games back before Apple went with integrated Intel graphics in their $3500 systems... smh. The graphics are fine now, but so many game publishers won't touch Apple now. If Apple can get these companies back, I bet Apple could sell a lot of Macs to gamers.
AMD is crushing intel
Can't wait till the 8000 and 50 series release, so I can pick up a 7900 XTX or 7900 XT. I think they have a good product, it's just at a bad price. Maybe product cycles should be 4 years instead of 2? I think too many picked up 30 series cards and the current gen isn't a big enough leap for the price to justify it.