Yea, it's good enough as long as you don't wanna run insanely demanding games, or games at 4K or something insane like that. Just be realistic and an APU may do fine for you.
@@rickytorres9089 that's pretty flawed logic though, to be honest... who is going to have a 4K-capable monitor (hundreds of dollars more than a simple 1080p monitor) but can't afford a discrete graphics card? Nobody that plans on running a 4K gaming setup would ever buy an APU. The APU market is for people who are building a cheap, bare-bones gaming setup. For instance, my sister plays World of Warcraft casually, along with old titles like Skyrim and Fallout. I built her a system with a Ryzen 2400G and it runs everything on ultra settings just fine at 1080p, which is the resolution most people are playing at. You can't expect a budget CPU to give enthusiast-level performance; nobody complains that a 1030 doesn't give 200 FPS at 4K resolution because they understand that it's a budget card...
I would love it if there were another socket on motherboards to install a GPU, instead of using a card for it :D and then you could just put on whatever cooler you like
Then your motherboard would be covered by two enormous coolers (unless you use water/liquid cooling), and that takes a lot of space. PCIe is more practical since the card is mounted vertically, so it doesn't cover much board space.
@@thelazt16 On the other hand, we could also go the reverse way, back to the Slot A / Slot 1 days when Intel and AMD released Athlons and Pentiums in slot form. It looked exactly like a graphics card does nowadays, but was a processor (for anyone unfamiliar with it, google it! It's at least interesting)
@@Kalvinjj They could make that happen, but why innovate on something that would only cut performance, when it doesn't actually need that kind of innovation? It's about as ridiculous as the Kingston Fury RGB SSD, which runs slower than a MicroSD card because it overheats from its own 75 RGB lights.
I got one of the latest Intel iGPUs and meh. No wonder I prefer to use my REALLY BAD 940MX. Not only does it give me 5 times the performance, I can even turn some settings from low to medium. And the 940MX is one of the worst graphics cards ever: it can play games from 2005-2012 max, and even 2012 is pushing it if you want 1080p. Sure, some games past 2012 still work, but meh. Even my old GTX 750 Ti/760 cards do a better job than a brand-new integrated GPU from 2019.
Weird thing is there are 2 versions of the 940MX, one with DDR3 and another with GDDR5 and more shaders. My previous Acer laptop with the 940MX (GDDR5) was decent in games, while my friend's Dell with the DDR3 version was quite a bit slower with the exact same processor and RAM.
Can't wait for Qualcomm, Samsung and Apple to make their chips compatible with Windows/macOS, especially if the rumored Samsung investment into their chips is real. 116 BILLION???? OMG, I want to dual-boot my tablet as my main computer, and then my phone soon after
Windows on mobile architectures can't run most software. There is an ARM build of Windows, but most software isn't compatible with it, which is a huge downside. AFAIK, Android is more capable than ARM Windows.
My experience with gaming on integrated graphics is that you need a decent CPU and a LOT of RAM. Last night I doubled the RAM on a laptop I have and decided to see how well it handles a heavily modified Minecraft with the extra RAM. It did a lot better with more RAM available, but it still sucked compared to my other laptop that amazingly has dedicated graphics. The FPS in that heavily modified Minecraft (Forge with about 60 mods, playing singleplayer lol) on the integrated graphics laptop averaged about 20-30 FPS with 8GB of RAM and about 10-15 FPS with 4GB. The dedicated graphics laptop averages 40-60 FPS with 4GB of RAM; I ordered RAM for it that should be here next week to bring it up to 8GB. That said, a plain ol' Minecraft install in multiplayer does well on both with OptiFine and reasonable settings.

The killer of game performance, in my opinion, is Windows 10, which seems to love to eat through RAM and spare CPU cycles. I run a lightweight but decent Linux distro, so my game selection may be limited, but Windows 10 pissed me off so much that I ditched it. Windows 7 was alright with a lot of tweaks; it seems Windows 10 can't be slimmed down as well as 7, and the updates are a nightmare. On the laptop I just doubled the RAM on, it had Windows 10 and a few updates failed, and those updates continued to fail. Hours of talking to customer service, trying everything they could think of, and those updates just would not install. You would start the computer and it would take half an hour to get to the desktop because it would try to install those updates, fail, then revert, and there seems to be no actual way to ignore them. One workaround held for about two days before it reverted to trying to force me to install those updates. I finally had enough and went (back) to Linux and have had zero major issues since.
APUs are actually a great option for people new to building PCs, and offer great value for esports titles rather than things like GTA or RDR. In games like LoL, CS, and Valorant, getting 75+ FPS is easy (I say 75 since anyone getting an APU probably isn't getting a 120Hz+ monitor), and you don't even have to reduce the settings! The only real downsides are that you can't play more demanding games like GTA, and it will make your PC look incomplete and empty
APUs are a decent compromise between a weak iGPU and a dedicated GPU: Kaby Lake G on Intel's side and the G-series CPUs on the Ryzen side of things. You just need 3000 MHz or higher clocked RAM in dual channel to make the most of the APUs on the market.
My youngest only wanted to play Sims 4, so I spent a long time pricing a laptop with a dGPU, until I found several YT videos showing people playing Sims 4 at 1366x768 on the iGPU of an i5-6200U, so that's what we bought instead. Also, as suggested by YT, I bought an extra 4GB of RAM to make it dual channel and give the iGPU access to more RAM, and squeezed out another 10% in benchmarks with that configuration. Three years down the line she's still very happy with it and is still playing Sims 4 (with just about every expansion pack available).
Additionally, back in the day I could've made an argument about backwards compatibility. Say, in '02-'06, Intel embedded graphics were fully backwards compatible with the VGA standard. That meant that even on Windows XP (via NTVDM) you were able to just launch titles such as X-COM 3 or Blood natively, no DOSBox required. Though for sound you needed a virtual DOS sound driver such as VDMSound.
AMD's iGPUs are the most cost-effective GPUs for a budget build: the Ryzen 3 and 5 iGPUs, Vega 8 and Vega 11 respectively, can even run many recent titles at up to medium settings (AC Odyssey, BF5), and games like SOTTR and FC5 at up to high settings. So *it doesn't suck*
The Tegra X1 is less powerful than modern Intel integrated graphics... even though I hate doing it, if we compare GFLOPS: the X1 is 500, my 3-year-old mid-range laptop is about 420, and the high-end ones are leaps above the X1
@@rhysjones8506 I think the normal Tegra X1 is 1 TFLOP, but the version in the Switch is downclocked to ~500 GFLOPs, which yeah is pretty weak. Slower than an RX 550 or GT 1030 and all current-gen consoles, about twice as fast as an Xbox 360.
@@nathanmead140 True, still weaker though. Top of my head I wanna say it's like 350MHz portable and 760MHz docked, while the regular X1 runs at like 1.2GHz.
@@Malus1531 No, it's 500 GFLOPS... the Switch is 400 docked. The whole "1 TFLOP" thing is Nvidia trying to be clever: that's with FP16 (half-precision) calculations, whereas the norm for everything is FP32, which is what all APIs use... with FP32 calculations it's only half a teraflop
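The FP16-vs-FP32 marketing math in this thread can be sanity-checked with the usual peak-throughput formula (peak FLOPS ≈ shader cores × clock × ops per cycle, counting a fused multiply-add as two ops). The core count and clock figures below are assumptions taken from commonly cited Tegra X1 spec sheets, not official confirmation:

```python
# Back-of-envelope peak FLOPS: cores * clock * ops_per_cycle.
# An FMA counts as 2 FP32 ops per core per cycle; Maxwell-era Tegra
# can do double-rate FP16, hence 4 ops/cycle for half precision.
def peak_gflops(cores, clock_mhz, ops_per_cycle=2):
    return cores * clock_mhz * 1e6 * ops_per_cycle / 1e9

# Assumed figures: 256 CUDA cores, ~1 GHz for the full-fat Tegra X1,
# ~768 MHz for the Switch's docked GPU clock.
x1_fp32 = peak_gflops(256, 1000)       # ~512 GFLOPS FP32, the "half a tflop"
x1_fp16 = peak_gflops(256, 1000, 4)    # ~1024 GFLOPS FP16, the "1 TFLOP" marketing number
switch_docked = peak_gflops(256, 768)  # ~393 GFLOPS FP32, roughly the "400 docked" above
```

Under those assumptions the thread's numbers line up: the "1 TFLOP" headline is the FP16 figure, while in FP32 terms the full-clock X1 sits around 500 GFLOPS and the docked Switch around 400.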
This video basically covers the various reasons why companies don't attempt to combine CPUs with high-performance GPUs: it's the increased die size and increased TDP that keep them from combining the two. Thanks for sharing this one. Love your work, keep it up. 😁😁✌
@@ananth.a If it doesn't have enough RAM, it won't work well... I know, since my older gaming laptop came with low RAM; I put some more RAM in it and it ran smooth...
I'm still using my AMD A10, which came with an integrated GPU, and for the first two years I was able to play the latest and greatest games at 720p (my monitor was 720p back then) and it was good enough. Over time graphics have become very demanding, and now it can only run Windows well with very light gaming titles. These days it's pointless to go for an integrated GPU simply because it won't be powerful enough for today's graphics-intense games.
The size, power and transistor count are really not that much of an issue. A modern CPU uses about as much power for interfacing as for processing, so integrating a GPU, while adding power, also removes a bit. This hasn't always been true to the degree it is today, but from 14nm on, it's almost as efficient to integrate the GPU with the CPU as to make a PCIe link between them (depending on the CPU). Once binning became available, transistor count was also less of an issue; now with Zen 2's three degrees of binning it's hardly an issue at all.

The real issue is memory. DDR4 needs to be socketed, GDDR6 doesn't, so GDDR6 is a loooot faster. That pretty much makes it impossible to build a fast APU today, because APUs use the memory infrastructure of the CPU, not the GPU. This is also why the PS4 (Pro) and Xbox One do have APUs: they don't have that issue. But I have a feeling AMD has cracked the problem
My Ryzen 5 Pro 4650G also does quite a good job for an iGPU, with Vega 7 graphics. I installed 16GB so there's plenty available for both CPU and GPU. F1 2020, medium settings: 55 fps. GTA V, ultra high settings: 37 fps. CS:GO must be 150+, but my monitor is 60Hz 😅
*Do integrated GPUs support higher resolutions like 1440p at a 165Hz refresh rate?* ★ All I want to do is connect an external monitor (1440p/165Hz/1ms) to a laptop (with HDMI 2.0) and do CLOUD GAMING. 👍
@@samuelislam9927 The top-end ones (Vega 10 and 11) perform similarly to a dedicated GT 1030 (with GDDR5, not the stupid gimped version), which is mighty impressive for a GPU that uses DDR4
@@samuelislam9927 Bad for high-end gaming, yes. But as integrated graphics, Vega is currently the most powerful and is perfect for esports gaming or an HTPC
It's all fun and games if you have either only an iGPU or a dedicated GPU. When you have both, there are problems. TL;DR: my PC uses the iGPU instead of the dedicated GPU, causing things to run slow, plus some AMD update problems.

So, my PC is a Samsung Series 3 NP370R4E. I upgraded its RAM to 6 GB so it could run faster, and boi was that an upgrade; just 2 GB of extra RAM helped everything run better. I have two GPUs: an Intel HD Graphics 4000 iGPU and an AMD Radeon HD 8600/8700M (discrete/hybrid) GPU. The latter always shows a weird name whenever I update the driver (7400/7500M, 5600/5700M, 4300/4400M over the time I've used it), and none of those models are on AMD's website, which annoys me. I've recently tried to update the GPU driver to the latest Adrenalin 19.x.x version, but whenever I do, my PC crashes. It gets stuck on one screen and I can't move anything. When I try to restart by pressing the power button, it never starts and says the hard drive has an error. I had to reset my PC several times before arriving at the conclusion that I'll never be able to update past Adrenalin 17.x.x. Some may say I should give it enough time to load, but that doesn't work: I left it updating overnight and during the day while I was at school, but nope, it just stays stuck on that screen.

The other problem is that my PC uses the Intel iGPU most of the time, while games such as Minecraft use the AMD GPU one day and the Intel iGPU the next, dropping my FPS from a capped, stable 60 to 15. I can't find a way to make my laptop use the AMD GPU for specific apps like you can with NVIDIA GPUs, or so I heard. My hardware is capable of running games at decent frames at medium to low settings, but it's the stupid iGPU getting in the way. If only there were a way.
Me when writing an essay for my finals: Hi. This is an essay of 10 words. Me when writing something stupid like this comment: Here’s a whole novel’s worth of random shit.
Spent nearly five minutes saying everything other than "integrated graphics aren't as great as dedicated graphics, but most people don't need a high-performing graphics card for their daily workload, as opposed to video editors, streamers, etc., and leaving one out helps keep the cost down for resources they wouldn't use much anyway."
But the APUs we see are NOT focused on providing a YouTube / Netflix / Excel screen; they market themselves on the best 3D they can pull off, which is not what is needed by the alleged target market of soccer moms and servers. VGA used to run with LESS than ONE MB of VRAM, which is far less than the onboard cache of most GPUs these days. VGA-level output for an APU costs pennies
@@rwbimbie5854 I'm not sure what you're saying. Is it about the video topic or...? The title/topic of the video is to highlight why integrated graphics suck when compared (see thumbnail) to dedicated graphics. Not...whatever you're talking about.
YOU said most people don't need high end. I pointed out that the folks YOU BROUGHT UP don't need the midgrade 3D we see APUs bragging about either. YOUR referenced low-end market only needs a 1998-tier GPU. So why are APUs bragging about their cutting-edge abilities, if their market doesn't need that?
@@hikari_no_yume There is. But Techquickie is all about explaining something in a concise manner to a general consumer audience, so it was weird that they didn't include a statement like the one I posted early in the video.
I love reading these comments from 4 years ago. The 5700G now overclocks the iGPU to over 2000 MHz quite easily. If you have 32 GB of RAM it uses 4 GB automatically. Of course, you have to have 4000 MHz RAM and overclock the Infinity Fabric as well. That gets you 1080p low settings at 50-60 fps in most games. As my monitor is only 1080p at 60 Hz, that's good enough for me. No expensive dGPU required.
@Transistor Jump The i7-8809G has ~205GB/s of bandwidth, but it has dedicated memory for the AMD GPU, so it can hardly be called an iGPU, especially since the Intel CPU retains its own iGPU. At 3200 MHz DDR4, you're looking at ~50GB/s, shared between CPU and GPU.
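The ~50GB/s figure above falls out of the standard theoretical-bandwidth formula (transfers per second × bus width × channels). A quick sketch, where the GT 1030's 6000 MT/s effective GDDR5 data rate is an assumed round number for comparison:

```python
# Theoretical DRAM bandwidth in GB/s:
# mega-transfers/s * bus width in bytes * number of channels.
def bandwidth_gbs(mt_per_s, bus_bits=64, channels=1):
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

ddr4_3200_dual = bandwidth_gbs(3200, channels=2)  # ~51.2 GB/s, shared by CPU and iGPU
gt1030_gddr5 = bandwidth_gbs(6000)                # ~48 GB/s on a 64-bit bus, GPU-exclusive
```

Which is the thread's point in one line: a dual-channel DDR4 APU has roughly the raw bandwidth of a low-end GDDR5 card, but has to share it with the CPU.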
I have an 8th-gen i7 NUC with Iris 650 graphics and dual-channel RAM. It handles most casual games in 1080p at 30 fps, but I prefer emulating consoles such as the GameCube instead, and the NUC is perfect at that. Plus, if I ever want to beef things up, I can use the Thunderbolt port and hook up an eGPU. So I have a tiny computer that is energy efficient and has more than enough power for an extremely casual gamer. The problem is with budget gamers who want to push their ancient AMD APUs and Intel HD 4000 graphics and expect to see playable results. It just ain't happening.
Maybe in the future, Thunderbolt 3 ports should become ubiquitous on all types of laptops, so all users have the option to buy a cheaper eGPU unit instead of buying a new gaming laptop or building a gaming desktop.
It would have been perfect if you could have done a sponsor that did the "two things at once" segue. Also, been rockin' an AMD APU since Kaveri. Yea. Kaveri. Trying to build a slim, low-power, low-profile SFF around one of those may not have been my wisest decision.
Consoles have dedicated graphics; their graphics processors are not as small as Intel's. The small graphics processors found in the Ryzen 2200G and 2400G and the Nintendo Switch are really capable, but they're no match for dedicated graphics cards
@@Overload151 Well, the ones in the PS4 Pro and Xbox One X are still APUs, and they're better than any integrated graphics or low-end dedicated cards like the GT 730, RX 250, etc.
There's also the problem of power delivery. Motherboards would become very expensive in order to fit all the other parts of a GPU PCB; it might even be impossible for high-end GPUs on anything smaller than an EATX-sized board.
I was mentioning this in a reply to another comment. There's also the issue of sockets, which would undoubtedly limit the user to pairing specific GPUs with specific CPUs.
Explain Hades Canyon then. The dies can be made; Intel and AMD (to a lesser extent) just want more money. It's greed over improvements in technology. It's completely possible to stack a PCB or die with HBM2 or GDDR6 as well as standard DDR4; smartphones do a very similar thing. The biggest reasons are cost and damage to profits: AMD loses, AIBs lose, Intel loses and Nvidia loses. They want to sell you cheap crappy graphics cards, mid-tier cards and super expensive ones. They'll argue against all-in-one integrated PCBs with both GPU and CPU because of course then you can't mod it or just buy another one, lol. I'd rather have a Hades Canyon that I change every 2-3 years than a constant cycle of "I need this because the stuff I currently have is now completely irrelevant" (Intel, here's looking at you, quad core to hex core; more warning next time).

If you can only afford low-end stuff, you're never going to be completely satisfied and may as well get a console. If you're high end, then you can afford to swap and change at leisure. If you're mid-tier, the market's always telling you you're good enough, BUT you won't be able to run games in a year or two the way you do now. Where's the value in this PC gaming economy? There is none. Why the eeeffff should anyone have to pay over a grand for a fkn graphics card to play a game's highest settings? That's not value, that's dumb.
I use Genymotion to emulate Android and play Android games. My secondary PC uses an i5-7300 @ 3GHz with an HD 630 and Linux, and it works MUCH better than my primary, an i5-4690K @ 4GHz with a GTX 1050 Ti EXOC (with an extra 150MHz GPU OC) on Windows. My opinion: if your game works on Linux and you only have an integrated GPU, try Linux to see if you get better performance.
Funny thing though, future desktop systems might become much like an SoC, with 3D-stacked CPU+GPU+HBM all packed into one giant chip. That would mean bottlenecks accessing memory would be gone and bandwidth would be through the roof... not to mention power and space savings.
One thing worth mentioning: higher temperature around the CPU socket (because of the integrated GPU), which is often surrounded by electrolytic capacitors, tends to reduce the lifetime of the motherboard by drying out the capacitors sooner.
Integrated graphics don't really suck from a content creator's standpoint. An iGPU is a really helpful additional rendering and encoding engine if you do multiple video projects. I have both the iGPU and an Nvidia GPU enabled; if your Nvidia or AMD GPU is busy and you want additional headroom, the integrated GPU comes into play to get the job done...
A spork is actually a perfect metaphor: it's not the best at being a fork, nor is it the best at being a spoon, but it's usable in many general use cases.
To be fair, I beat half of COD4 on my sister's Intel 4500MHD, and half of Mass Effect 2 on a media PC's Intel HD 3000, so things are slooooowly improving.
Why not simply merge the CPU and GPU, and also merge system RAM and graphics RAM together? The socket on the motherboard would be bigger, probably twice as big, but so would the heatspreader; there would be many more pins on the socket for connections to GDDR, and the memory you buy would have two advertised capacities, a system memory and a graphics memory.
IGPs are awesome. You won't realize how awesome they are until you've been following GPU releases for over a decade and you begin to realize: oh shit, this IGP is more powerful than XYZ, a card I had to remove a drive cage to fit in my full tower!
I read in a PC magazine about 12 years ago that with the advent of multicore CPUs we shouldn't need GPUs in the future. It went on to say that quad-core CPUs, or even more cores, would replace the need for a graphics card. I think the magazine was printed just when the Xbox 360 was in its infancy.
Both should actually be used. The IGPU should perform certain calculations such as floating point calculations based on the "conversation" with a separate graphics card.
CPUs could always be made larger to accommodate graphics processors, and larger heatsinks could be used, as well as more and faster RAM. The two don't even need to be on the same die; they could be separate chiplets. AMD could probably do it if they really wanted to.
3D stacking will allow APUs to get x80-class GPU performance, so this video will be out of date in 2-6 years, in case anyone is wondering. Also I'm pretty sure CPU coolers are more effective than GPU coolers, based on some mods I saw where a guy put a CPU cooler on a GPU. I dunno about the cooler, but definitely expect APUs to overtake everything in 5 years.
Why do integrated GPUs suck?
They don't. It's Intel's iGPUs that suck; the integrated GPU in Ryzen APUs is a decent low-to-mid-tier GPU.
i see you are a man of culture as well
@@dz_apollo shut up weeb. btw, i'm a weeb as well, so i'll shut up too
Even if the price is good, a laptop with a 1050 will add a lot more power than an APU ever can... the price difference won't be that much anyhow...
@@jedi22300 lol
@@campkira I spent $380 for a laptop for my wife with a Ryzen 5 2500u, my laptop with an i7 7700hq and 1050 cost over 800. Price is a massive difference. Obviously my pc is better but any game also available on console (95% of games) run playable on it.
Just download a better graphics card. Problem solved.
ooh i see
L
Ahh, i see you're a man of culture as well
That doesn't work lol, everybody knows you have to download PCIe slots first.
Turn half your onboard memory into a virtual drive and then wonder why you are getting out of memory errors...
0:09 *why try again?* i thought you nailed it the first time!
Yeah,what he said first was actually true.
Rayyan Sumaguina lol
Miguel Sanchez
Sleeper PCs: :(
*Vega iGPU have entered the chat*
*Intel UHD have left the chat*
Lol
Hey, is it bad if I have Vega 8 (integrated graphics), 8GB RAM, and a Ryzen 3200G? I saw a lot of videos of tests with these specs and they can run top games with solid FPS and solid graphics. Here's a link:
pevex.hr/pc-konfiguracija-feniks-info-hyper-x-2061-amd-ryzen-3200g-8gb-ddr4-ssd-480gb-326172.html
@@honly872 At that price? Yeah, I think it's pretty worth it. Not that bad.
In the future, however, I suggest you ask in Reddit or forums such as the LTT forums. I may not be able to respond to such queries. Please do not make your purchasing decisions based on one stranger's comment either; seek other people's opinion too.
Cheers!
@@honly872 It'll be better if you have one more 8GB stick and run the two in dual channel. It'll also be better to buy the 3400G with Vega 11 if you're not planning to purchase a graphics card in the near future. And yeah, your RAM clock should be >= 3000MHz to get the best performance
@@honly872 how many euros or dollars?
I'm into a game that punishes hardware.
Minecraft
In DigitalFoundry's video about ray tracing it actually does....
An APU would have a hard time, since that game requires more than just computing power...
Java has been known for being very stuttery too.
Bitch, that is demanding as fuck.
Ever tried 4K, max render, mountain or forest biome and 144fps?
That is without mods. Your dang PC would be crying.
Now add ray tracing to it or simply just one of those realistic modpacks. A 1080Ti can't even run those things at medium settings 4K30fps.
@@cunnyman We all know that... Em, can you be a bit smarter than this before you reply to a comment, smartass? And yes, I know you're a kid.
"I want more dedicated graphics chips on laptops!"
Translation: "I love thermals!!!"
The cooling on my laptop is notoriously meh. After I undervolted & under-amped the CPU, both CPU & GPU temperatures dropped yet performance didn't. Why Acer thought putting them both on the same heatsink was a good idea amounts to "this is cheaper." I'm just glad it could actually be improved to acceptable levels without a friggin' cooling mat.
@@InfernosReaper I bought an AMD FX chipset laptop a while ago and it ran stupid hot. Cramming an assload of cores into a small body wasn't going to be a good idea. Lol
I'm lucky in that my laptop doesn't get that hot.
Though in exchange it weighs a ton.
SLI gtx1080
@@InfernosReaper You didn't watch reviews before buying?
@@aravindr4986 No one I trusted had reviewed it. It was a bit of a gamble, but overall, one that paid off for *me*. I'm not sure it'd be a good fit for everyone, since not everyone is willing to trial and error on power adjustments, or buy a cooling pad.
Ryzen APUs: Am i joke to you
Edit: My 2 cents: good as a stopgap in budget builds but they become irrelevant once a GPU is available; they may also be helpful in emulation PCs, since the graphics aren't stressed much there
Edit 2: 2400G is a fucking joke though
APUs don't have dedicated video memory; they use your system RAM, and that hurts performance
Yes, they're a joke compared to a budget GPU ;) Unless we're talking about consoles, which do both in a small package.
@@DeadPhoenix86DP Vega 11 (R5 2400G) with dual-channel fast RAM can beat the GDDR5 GT 1030 and run all new games at at least 720p low, and most of them higher
Also Intel paired with Vega, AKA the Hades Canyon NUC, or those systems based on Kaby Lake G CPUs.
@@DeadPhoenix86DP Though they aren't as good as high-end GPUs, it's impressive how well they can run after seeing crappy Intel integrated graphics for years
Lot of heat, heat is bad
Saved you guys a lot of time
Heat is only bad if you do not have the proper cooling solution. Really, this could be taken care of by having support for eGPUs by default... but that would mean Thunderbolt on everything!
@@christopherkidwell9817 I mean, while that's true, CPU temps are measured internally rather than externally because the heat can damage components if it isn't carried away fast enough. A CPU puts out heat (watts), not a temperature; the temperature is the result of how fast that heat is dissipated. So even if the cooler dissipates the heat fast enough to hold the die at a constant 80°C, that sustained high temperature will still degrade the CPU over time, just more slowly than if it ran cooler by default
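The point being argued here (a chip produces heat in watts, and the temperature it settles at depends on how fast the cooler removes that heat) can be sketched with a lumped Newton's-cooling model. The wattage and thermal-resistance numbers below are made up purely for illustration:

```python
# Toy lumped thermal model: steady-state die temperature is
# ambient + power * thermal resistance (in degrees C per watt).
# The chip outputs *power* (W); where the temperature settles
# depends entirely on how well that power is removed.
def steady_temp_c(ambient_c, power_w, r_c_per_w):
    return ambient_c + power_w * r_c_per_w

# Same hypothetical 45 W CPU+GPU package, two coolers:
shared_heatsink = steady_temp_c(25, 45, 1.8)  # 106 degrees C: throttling, long-term wear
better_cooling  = steady_temp_c(25, 45, 1.2)  # 79 degrees C: sustainable
```

Lowering the thermal resistance (a bigger heatsink, better paste, a cooling pad) moves the steady-state temperature down without touching performance, which is why the undervolt-plus-cooling approach discussed in this thread works.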
APU heat?
Are you unfamiliar with the AMD FX series space heaters?
Pairing a 65-90w cpu with a 75-125w gpu is still less electricity->heat than classic AMD toasters
@Alfred Cain if only it was that easy
@@rwbimbie5854 you reminded me of that "factory 5GHz" 200w TDP meme they actually sold
Dunno my 2400g is pretty nice
That's just your opinion, fool
It would be WAY nicer with 1 or 2 GB of GDDR5 VRAM chiplets on the package
@@rainbowisticfarts, it performs similarly to a GT 1030, so if you are on a very tight budget it is worth it.
It is indeed; APUs are hella good. You get pretty much a free entry-level GPU with them. I'm waiting for Navi 20
Hello there
When I was 13 i tried to download Nvidia GeForce on my Intel hd laptop
@AThePetrov ok
Same
Hahaha made my day dude
Because there's limited space in a CPU, and the memory bandwidth of normal DDRx RAM is limited
Oh hey
"and memory bandwidth of normal DDRx RAM is limited" not only limited, also not optimized for heavy parallel work.
The dies could be made larger at a profit cost to the silicon fabrication owner. Oh no, did I just imply that profits might be too high in the tech industry! Sarcasm, in case you're not Canadian or British.
@@_aullik Well, as much as that's true, I'd say it would probably be possible to give the iGPU a set of its own memory slots you could expand yourself, no?
Yeah. Integrated graphics with actually decent memory attached (like the Intel 8809G) has proper mid-range performance.
Should be, "why can't Intel make awesome integrated graphics?"
Clearly you haven't heard of their Iris Pro graphics.
Or why AMD can't make good high-end processors
@@AlfaPro1337 Intel's Iris Pro iGPUs aren't even close to AMD's RX Vega iGPUs though.
@@MTMguy Dude, it was before RX Flop came out. The RX Flop gets decimated by the Titan V/RTX. Why else would they name it so differently from the RX 4/500 series? Obviously to beat the Titan V/RTX.
They're doing it atm
Hehe, I just bought an AMD A8-9600 because it has integrated graphics. :D
it arrives tomorrow.
It's for a server so it doesn't need good graphics.
How much did it cost? Could've got the Athlon 200GE?
@@raiyantajwar1564 The Athlon 200GE is 2c/2t, but the A8-9600 has 4 cores
@@hankhuevos3336 The 200GE is actually 2c/4t, but even then its multicore performance is much better than the A8-9600's, and single core is miles better. The graphics is also the much newer and faster Vega 3. Even the whole chip itself is much more efficient as well.
@@solarstrike33 So? The A8-9600 is an AM4 chip, just like the 200GE.
@@hankhuevos3336 The Athlon 200GE is faster at any workload and more efficient, and the newer architecture will give the system a much much much MUCH smoother experience. 14nm vs 28nm is no contest. 14nm ftw.
Ryzen APUs actually give very respectable performance
To think that RDNA 3 APUs are almost at 1060 performance
@@eee1200 Might be even more if FSR 4 comes out. Still crazy that they can run Forza Horizon at 4K 100fps
Because they don't have RGB.
Just download RGB.
@@itizjuan what
@@itizjuan lmao
Have you not seen AMD APUs?
They've done integrated graphics that can handle decent gaming for a while
Yeah, it's good enough as long as you don't wanna run insanely demanding games, or games at 4K or anything insane like that. Just be realistic and an APU may do for you.
@@rickytorres9089 That's kind of ridiculous logic though, to be honest... who is going to have a 4K-capable monitor (hundreds of dollars more than a simple 1080p monitor) but can't afford a discrete graphics card? Nobody that plans on running a 4K gaming setup would ever buy an APU. The APU market is for people who are building a cheap, bare-bones gaming setup. For instance, my sister plays World of Warcraft casually, along with old titles like Skyrim and Fallout. I built her a system with a Ryzen 2400G and it runs everything on ultra settings just fine at 1080p, which is the resolution most people are playing on.
You can't expect a budget CPU to give enthusiast level performance, nobody complains that a 1030 doesn't give 200 FPS at 4k resolution because they understand that it's a budget card...
Hey, is it bad if I have a Vega 8 (integrated graphics card), 8GB RAM, and a Ryzen 3200G? I saw a lot of videos of tests with these specs and they can run the best games with solid fps and solid graphics. Here's a link:
pevex.hr/pc-konfiguracija-feniks-info-hyper-x-2061-amd-ryzen-3200g-8gb-ddr4-ssd-480gb-326172.html
@@honly872 I have a 3200G and it surprises me each day
Finally a 4K option on Techquickie. Thank you for this gift
And I am out here playing it at 360p😂😂
@@hollow8194 Same here🤣🤣🤣🤣
This episode of Techquickie is brought to you by *sporkspace!*
I thought the spork joke was gonna be the sponsor plug... Linus I'm very disappointed.
Snow Peak sponsorship when?
I would love it if there was another socket on motherboards to install a GPU, instead of using a card for it :D and then just put on whatever cooler you like
Then your motherboard would be covered by 2 enormous coolers (unless you use water/liquid cooling), and that takes a lot of space. PCIe is more practical since it's mounted vertically, so not much space is covered.
Febri Eka Setyawan also no vram
@@thelazt16 On the other hand, we could also go the reverse way and go full Slot A/Slot 1, back to the days when Intel and AMD released Athlons and Pentiums in slot form. It looked exactly like a graphics card does nowadays, but was a processor (for anyone unfamiliar with it, google it! It's at least interesting)
@@Kalvinjj They could make that happen, but why would you innovate on something that will only cut down performance, when it doesn't actually need that kind of innovation? It's just as ridiculous as the Kingston Fury RGB SSD, which has speeds slower than a MicroSD card because it gets overheated by its own 75 RGB lights.
Using the same GDDR5/HBM2 memory after a GPU upgrade would be nice..
At least it's still good for emergency situations, like if your GPU suddenly stops working and you've got no spare
Or if you fuck up and accidentally overclocked your gpu too far
The spork is an abomination comparable only to CPUs without SMT (aka HT), iGPUs, and Apple's business practices.
For some things, depths of hell aren't enough.
My 2200G has no SMT, and I actually don't miss it. 4 cores with no SMT is fine for most things, gaming included.
Boy: Nvidia
Man: AMD
Legend: Intel HD graphics
Steve Irwin: no gpu at all
I got the last Intel cards and meh. No wonder I prefer to use my REALLY BAD 940MX. Not only does it give me 5 times more performance, but I can even turn some settings from low to medium.
Oh, and the 940MX is the worst gfx card ever. It can play games from 2005-2012 max, and even 2012 is pushing it if you wanna use 1080p. Sure, some games past 2012 still work, but meh. Even my old GTX 750/760 Ti cards do a better job than a brand-new integrated card from 2019.
Weird thing is there are 2 versions of the 940MX, one with DDR3 and another with GDDR5 and more shaders. My previous Acer laptop with the 940MX (GDDR5) was decent in games, while my friend's Dell with the DDR3 version was quite a bit slower with the exact same processor and RAM.
cant wait for Qualcomm, Samsung and Apple to make their chips compatible with windows/mac
especially if the rumored Samsung investment into their chips is real, 116 BILLION???? omg
i want to dual boot my tablet as my main computer, and then my phone soon later
There are already laptops running Windows S on ARM-based Qualcomm CPUs
Windows on mobile architectures can't run most software. There is an ARM build of Windows, but most software isn't compatible. That's a huge downside. Afaik, android is more capable than ARM Windows.
Linux has been running on Arm for a long time
@@rwbimbie5854 That's what I would run on an ARM laptop, especially if I could virtualize Windows for ARM on it for work.
This may be stupid, but why can't we use a laptop's GPU in a normal computer?
0:16 "even DROPPING"
I see what you did there.
I’d love to see a comparison between GPU & iGPU performance for graphics design software and pen display tech!
I need a video detailing spork technologies and advantages/disadvantages of different spork configurations @3:44
My experience with gaming on integrated graphics is that you need a decent CPU and a LOT of RAM. Last night I doubled the RAM on a laptop I have and decided to see how well it handles a heavily modified Minecraft with the extra RAM. It did a lot better with more RAM available, but it still sucked compared to my other laptop that amazingly has dedicated graphics. The FPS with that heavily modified Minecraft (Forge with about 60 mods, playing singleplayer lol) on the integrated graphics laptop averaged about 20-30 FPS with 8GB of RAM and about 10-15 FPS with 4GB of RAM. The dedicated graphics laptop averages between 40-60 FPS with 4GB of RAM. I ordered RAM for it that should be here next week to bring it up to 8GB.
That being said, a plain ol' Minecraft install in multiplayer does well on both with OptiFine and reasonable settings. The killer of game performance, in my opinion, is Windows 10. Windows 10 seems to love to eat through RAM and spare CPU cycles. I run a lightweight but decent Linux distro, so my game selection may be limited, but Windows 10 pissed me off so much that I ditched it. Windows 7 was alright with a lot of tweaks. It seems Windows 10 can't be slimmed down as well as 7, and the updates are a nightmare. On the laptop that I just doubled the RAM on, it had Windows 10 but a few updates failed. Those updates continued to fail. Hours of talking to customer service and trying everything they could think of, and those updates just would not install. You would start the computer and it would take a half-hour to get to the desktop because it would try to install those updates, then fail, then revert. There seems to be no actual way to ignore those updates either. One trick to ignore them worked for like two days before it reverted to the habit of trying to force me to install them. I finally had enough and went (back) to Linux and have had zero major issues since.
This episode of Techquickie was brought to you by THE SPORK
I don't suck!
RGB IS ALL YOU NEED!!!
Also APUs are getting better!
APUs are actually a great option for people new to building PCs, and offer great value when looking at esports titles rather than things like GTA or RDR. In games like LoL, CS, and Valorant, getting 75fps+ is easy (I say 75 since anyone getting an APU prob isn't getting a 120Hz+ monitor), and you don't even have to reduce the settings! The only real downsides are that you can't play more demanding games like GTA, and it will make your PC look incomplete and empty
No wonder why i cant play 3D hentai games this is just a sad story of mine :(
Try older ones.
Trust me, I *know* they work on iGPUs.
@@solarstrike33 how?? My integrated graphics card is 4600
@@ejmuerterisefromthedead8924 I had the same iGPU as you.
@@solarstrike33 Ok... but that's weird then, how my laptop can't run them in 3D
@@ejmuerterisefromthedead8924 that is your personal problem
ive enjoyed watching your channel and now have decided to go to college for CIS - cyber security. thank you for the inspiration.
Oh, just you wait till the Ryzen 3000 APUs come out; considering how solid the R5 2400G was, I am expecting great improvements.
Navi APUs would be sick next month
APUs are a decent compromise between the weak iGP and a dedicated GPU: Kaby Lake G on Intel's side and AMD's G CPUs for the Ryzen side of things. You just need 3000MHz or higher clocked RAM in dual channel to make the most of the APUs on the market.
Video suggestion:
Can you please make a comparison between 2 mid-level GPUs vs. 1 expensive GPU?
I wanted to do that with RX 480s, but mobos that support Crossfire can be expensive
My youngest only wanted to play Sims 4, so I spent a long time pricing a laptop with a dGPU until I found several YT videos showing people playing Sims 4 at 1366x768 on the iGPU of an i5-6200U, so that's what we bought instead.
Also, as suggested by YT, I bought an extra 4GB of RAM to make it dual channel and to give the iGPU access to more RAM, and squeezed out another 10% in benchmarks with that configuration.
3 years down the line and she's still very happy with it and is still playing Sims 4 (with just about every expansion pack available).
The spork?
Did you mean: *the fpoon?*
Additionally, back in the day I could've made an argument about backwards compatibility. Say, in '02-'06, Intel embedded graphics were fully backwards compatible with the VGA standard.
That meant that even on Windows XP (via NTVDM) you were able to just launch titles such as X-COM 3 or Blood natively, no DOSBox required. Tho for sound you needed a virtual DOS sound driver such as VDMSound.
The spork IS awesome, but can it run Crysis?
AMD's iGPU is the most cost-effective GPU for a budget build:
The Ryzen 3 and 5 iGPUs, Vega 8 and Vega 11 respectively, can even run many recent titles up to medium settings (AC Odyssey, BF5), and games like SOTTR and FC5 up to high settings
So
*it doesn't suck*
1:57 *That's a lotta damage
Spork: In Germany this piece of cutlery is called a "Göffel" 'G' from Gabel (fork) and 'öffel' from Löffel (spoon).
There's the Nvidia Tegra X1, and AMD APUs; both have powerful integrated GPUs. It's just Intel being lazy as usual
Tegra X1 is less powerful than modern Intel integrated graphics... Even though I hate doing it, if we compare GFLOPS... the X1 is 500, my 3-year-old mid-range laptop is about 420... The high-range ones are leaps above the X1
@@rhysjones8506 I think the normal Tegra X1 is 1 TFLOP, but the version in the Switch is downclocked to ~500 GFLOPs, which yeah is pretty weak. Slower than an RX 550 or GT 1030 and all current-gen consoles, about twice as fast as an Xbox 360.
@@Malus1531 they did that to save battery while using the switch as a portable system but when it is in the dock it is higher
@@nathanmead140 True, still weaker though. Top of my head I wanna say it's like 350MHz portable and 760MHz docked, while the regular X1 runs at like 1.2GHz.
@@Malus1531 No, it's 500 GFLOPS... the Switch is 400 docked. The whole "1 TFLOP" is Nvidia trying to be smart... that's with fp16 (half-precision floating point) calculations, whereas the norm for everything is fp32. That's what all APIs use... with fp32 calculations it's only half a teraflop
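The figures being argued about here follow the usual back-of-the-envelope estimate: GFLOPS ≈ shader cores × clock (GHz) × 2, since one fused multiply-add counts as 2 FLOPs. A rough sketch, using the 256 CUDA cores of the Tegra X1 and the commonly reported Switch clocks (not official spec-sheet values):

```python
# Rough fp32 GPU throughput estimate: cores * clock (GHz) * 2 FLOPs per FMA per cycle.
# Clock figures below are the commonly reported Tegra X1 / Switch values, used as assumptions.
def gflops(cores, clock_ghz):
    return cores * clock_ghz * 2

print(gflops(256, 1.0))    # ~512 GFLOPS fp32 at the X1's full ~1 GHz (the "1 TFLOP" is fp16)
print(gflops(256, 0.768))  # ~393 GFLOPS fp32 at the reported docked Switch clock
print(gflops(256, 0.307))  # ~157 GFLOPS fp32 at the reported portable Switch clock
```

This matches the "500 vs 400 docked" numbers thrown around above, and explains the discrepancy: the fp16 rate on this chip is double the fp32 rate, which is where the 1 TFLOP marketing figure comes from.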
This video basically covers the main reasons why companies do not attempt to combine CPUs with high-performance GPUs: the increased die size and increased TDP keep them from combining the two. Thanks for sharing this one. Love your work. Keep it up. 😁😁✌
They don't suck! They let you play Bookworm Adventures Deluxe, Minecraft, and Roblox.
Checkmate 😎
Ew roblox
I mean, if you have reasonable expectations, the 3200G is OK. It's no RTX 3080 (18x weaker), but for the price and TDP it's really good.
My laptop's 8th Gen i5 with integrated graphics can't even play Bendy and the Ink Machine.
@Ben dy 1 or 2? Watchdogs 1 isn't that demanding right?
Playing it on ultra and full res with a 1070, if I recall correctly.
@Ben dy You need more RAM....
@@ananth.a If he doesn't have enough RAM... it won't work... I know, since my older gaming laptop came with low RAM... put some more RAM in it and it runs smooth...
I am still using my AMD A10, which came with an integrated GPU, and for the first two years I was able to play the latest and greatest games at 720p (my monitor was 720p back then) and it was good enough. With time, graphics have become very demanding. Now it can only run Windows well, with very light gaming titles.
In today's time it's pointless to go for an integrated GPU, simply because it won't be powerful enough to give the best for today's graphics-intense games.
Jayanand Supali well said!
Okay so, 4K low graphics on a pc with the Ryzen 5 2400G?
Change my mind.
The size, power and transistor count are really not that much of an issue. A modern CPU uses just as much power for interfacing as for processing, so integrating a GPU, while adding power draw, also removes some. This hasn't always been true to the degree it is today, but from 14nm on, it's almost as efficient to integrate the GPU into the CPU as to make a PCIe link to it. (Depending on the CPU.)
When binning became available, transistor count also became less of an issue. Now with Zen 2 and 3 degrees of binning, it's hardly an issue at all.
What is the issue is memory. DDR4 needs to be socketed. GDDR6 doesn't. So GDDR6 is a loooot faster.
This pretty much makes it impossible to make a fast APU today... because APUs use the memory infrastructure of the CPU, not the GPU. This is also why the PS4 (Pro) and Xbox One do have APUs. They don't have that issue.
But I have a feeling that AMD has cracked the problem
Apple: hold my M1 chip
My Ryzen 5 Pro 4650G also does quite a good job for an iGPU with Vega 7 graphics
I installed 16GB so there's plenty available for both CPU and GPU
F1 2020 medium settings 55 fps
Gta v ultra high settings 37 fps
CS:GO must be 150+, but my monitor is 60Hz 😅
The video:why do iGPU's suck
Intel:"nervous sweating"
AMD:*laughs in APU*
*Does integrated GPU support higher resolution like 1440p at 165Hz refresh rate?*
★ All I want to do is connect external monitor (1440p/165Hz/1ms) with laptop (having HDMI v2.0) & do CLOUD GAMING. 👍
Well there is AMD Vega graphics
Still pretty bad, bad>shit
Still Sucks Balls.
@Manok AMD Ryzen 2000-series chips have embedded Vega graphics, but they run underclocked and share slower DDR4 RAM (compared to GDDR5), hence much slower
@@samuelislam9927 The top-end ones (Vega 10 and 11) perform similarly to a dedicated GT 1030 (with GDDR5, not the stupid gimped version), which is mighty impressive for a GPU that uses DDR4
@@samuelislam9927 Bad for high-end gaming, yes. But as integrated graphics, Vega is currently the most powerful and is perfect for e-sports gaming or an HTPC
It's all fun and games if you have either only an iGPU or a dedicated GPU.
When you have both, there are problems.
TL;DR
My PC uses the iGPU instead of the dedicated GPU, causing things to run slow. Also, some AMD updating problems.
So, my PC is a Samsung Series 3 NP370R4E. I upgraded its RAM to 6GB so that it could run faster, and boi was that an upgrade. Just 2GB of extra RAM helped everything run better.
I have two GPUs: an Intel HD Graphics 4000 iGPU and an AMD Radeon HD 8600/8700M (discrete/hybrid) GPU. The latter always has a weird name whenever I upgrade the driver.
For example: 7400/7500M, 5600/5700M, 4300/4400M throughout the time I’ve used it and none of these models are there in AMD’s website which annoys me.
I've recently tried to update the GPU driver to the latest Adrenalin 19.x.x version, but whenever I do, my PC crashes. It's stuck on one screen. I can't move anything. When I try to restart it by pressing the power button, it never starts and says that the hard drive has an error. I had to reset my PC several times before arriving at the conclusion that I'll never be able to update it past Adrenalin 17.x.x.
Some may say that I should give it enough time to load but that doesn’t work. I left it to update overnight and during the day when I was at school but nope. It’s just stuck on that screen.
Another problem with the GPUs is that my PC uses the Intel iGPU most of the time, while other times games such as Minecraft use my AMD GPU one day and the Intel iGPU the next, dropping my FPS to 15 from a capped and stable 60.
I can’t find a way to make my laptop use the AMD GPU instead of my intel IGPU for different apps like you can do with NVIDIA GPUs or so I heard.
My hardware is capable of running games at decent frames at medium to low settings, but it's the stupid iGPU getting in the way. If only there was a way.
Me when writing an essay for my finals: Hi. This is an essay of 10 words.
Me when writing something stupid like this comment: Here’s a whole novel’s worth of random shit.
Spent nearly five minutes saying everything else other than "Integrated Graphics aren't as great as Dedicated Graphics because most people don't need a high-performing graphics card for their daily workload as opposed to video editors, streamers, etc, and thus helps keep the cost down for something they don't need too much resources from."
But the APUs we see are NOT focused on providing a youtube / netflix / excel screen
but market themselves with the best 3d they can pull off
... which is not what is needed by the alleged Target Market of soccermoms & servers
VGA used to run with LESS than ONE MB of VRAM.. that is far less than the onboard cache of most GPUs these days.. VGA for an APU is pennies
@@rwbimbie5854 I'm not sure what you're saying. Is it about the video topic or...? The title/topic of the video is to highlight why integrated graphics suck when compared (see thumbnail) to dedicated graphics. Not...whatever you're talking about.
YOU said most people dont need high end.
I pointed out that the folks YOU BROUGHT UP don't need the midgrade 3D we see APUs bragging about. YOUR referenced low-end market only needs a 1998-tier GPU.
So why are APUs bragging about their cutting-edge abilities, if their market doesn't need that
There's much more to it than that, though.
@@hikari_no_yume There is. But techquickie is initially about explaining something in a concise manner to a general consumer audience. So it was weird that they didn't include a statement like what I posted early in the video.
I love reading these comments from 4 years ago. The 5700G now overclocks the iGPU to over 2000 MHz quite easily. If you have 32 GB of ram it uses 4 GB automatically. Of course, you have to have 4000 MHz ram and overclock the infinity fabric as well. Gets you 1080 low quality at 50-60 fps in most games. As my monitor is only a 1080 at 60 hz, that’s good enough for me. No expensive dGPU required.
By far, memory bandwidth is the biggest reason integrated graphics can't be more powerful than what they are now.
@Transistor Jump The i7-8809G has ~205GB/s bandwidth, but it has dedicated memory for the AMD GPU, so it can hardly be called an iGPU, especially since the Intel CPU retains its own iGPU. At 3200MHz DDR4, you're looking at ~50GB/s, shared between CPU and GPU.
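For reference, that ~50GB/s figure is straightforward to derive: each 64-bit DDR channel moves 8 bytes per transfer, so peak bandwidth is transfer rate × 8 bytes × channel count. A quick sketch (the GT 1030 comparison uses its typical published spec, 6 GT/s GDDR5 on a 64-bit bus, added here for context):

```python
# Peak theoretical memory bandwidth in GB/s:
# transfers per second (MT/s) * 8 bytes per 64-bit channel * number of channels.
def bandwidth_gbs(mts, channels, bytes_per_transfer=8):
    return mts * bytes_per_transfer * channels / 1000

print(bandwidth_gbs(3200, 2))  # dual-channel DDR4-3200: ~51.2 GB/s, shared by CPU and iGPU
print(bandwidth_gbs(6000, 1))  # GT 1030 GDDR5 (64-bit bus at 6 GT/s): ~48 GB/s, all for the GPU
```

So even a lowly GT 1030 gets roughly the same raw bandwidth as dual-channel DDR4-3200, but dedicated entirely to graphics, while mid-range and high-end cards with 256-bit+ buses are several times beyond what any socketed DRAM setup can feed an iGPU.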
I have an 8th gen i7 NUC with Iris 650 graphics and dual-channel RAM. It handles most casual games in 1080p at 30 fps, but I prefer emulating consoles such as the GameCube instead, in which case the NUC is perfect. Plus, if I ever want to beef things up, I can use the Thunderbolt port and hook up an eGPU. So I have a tiny computer that is energy efficient and has more than enough power for an extremely casual gamer. The problem is with budget gamers who want to push their ancient AMD APUs and Intel HD 4000 graphics and expect to see playable results. It just ain't happening.
Nonsense!!!! They can easily put a CPU, GPU, DDR, GDDR, SSD, and RGB on one DIE. 😁
include the power supply, ports and display and basically that’s a fuckin laptop by this point
4:02 award winning 24/7 customer support is the main deal here
More accurate thumbnail,
Intel integrated graphics SUCK
Use AMD APU
Maybe in the future, Thunderbolt 3 ports should be more ubiquitous on all types of laptops, so all users have the option to buy a cheaper eGPU unit instead of buying a new gaming laptop or building a gaming desktop.
nVidia 2080 with Intel i9 9900K. Yes, I have noticed my electricity bills rise...
My crush: My laptop has integrated graphics.
Me: Hoooot!!!!
Motherboards should have one more socket for the GPU, like the CPU's, with GDDR5 as shared memory...
James, top man, another great video. What about factory-fitted GPU cards from the word go?
That intro was unnecessary. But kinda funny :)
It would have been perfect if you could have done a sponsor that did the "two things at once" segue.
Also, been rockin' an AMD APU since Kaveri.
Yea. Kaveri.
Trying to build a slim, low-power, low-profile SFF around one of those may not have been my wisest decision.
8 views, 13 comments, 42 likes...
No sugar? Damn. Y'all ain't never got two things that match. Either y'all got Kool-aid, no sugar. Peanut butter, no jelly. Ham, no burger. Daaamn.
The older the berry the sweeter the juice...
Endgame was pretty good
I really enjoy the style of this channel & Linus Media Group as a whole! Quite succinct.
Well, for consoles these iGPUs seem to be working.
AMD apus are good as well
And also consoles suck
Consoles have dedicated graphics; their graphics processors are not as small as Intel's.
Small graphics processors found in the Ryzen 2200G and 2400G, and the Nintendo Switch, are really capable, but they are no match for dedicated graphics cards
@@Overload151 Well, the ones in the PS4 Pro and Xbox One X are still APUs, and they are better than any integrated card or low-end dedicated cards like the GT 730, R7 250, etc.
@@akhil2024 those are shitty cards. Anything with GT in the front fucking sucks
There's also the problem of power delivery. Motherboards would become very expensive in order to fit all the other parts of a GPU PCB, might even be impossible for high end GPUs on anything smaller than EATX sized boards.
I was mentioning this in a reply to another comment. There's also the issue of sockets, which would undoubtedly limit the user to using specific GPUs with specific CPUs.
Explain Hades Canyon then. Dies can be made; Intel and AMD (to a lesser extent) just want more money. It's greed over improvements in technology. It's completely possible to stack a PCB or die with HBM2 or GDDR6 as well as standard DDR4; smartphones do a very similar thing. The biggest reasons are cost and profit damage: AMD loses, AIBs lose, Intel loses and Nvidia loses. They want to sell you cheap crappy graphics cards, mid-tier cards and super expensive ones. They'll argue against all-in-one integrated PCBs with both GPU and CPU because then you can't mod them or just buy another one, lol. I'd rather have a Hades Canyon that I change every 2-3 years than a constant cycle of "I need this because the stuff I currently have is now completely irrelevant" (Intel, here's looking at you, quad core to hex core, you dicks; more warning next time). If you can only afford low-end stuff, you're never going to be completely satisfied and may as well get a console. If you're high-end, then you can afford to swap and change at leisure. If you're mid-tier, the market's always trying to tell you you're good enough, BUT you won't be able to run games in a year or two the same as now. Where's the value in this PC gaming economy? There is none. Why the eeeffff should anyone have to pay over a grand for a fkn graphics card to play a game's highest settings? That's not value, that's dumb.
I use Genymotion to emulate Android and play Android games. My secondary PC uses an i5-7300@3GHz with an HD 630 and Linux; it works MUCH better than my primary with an i5-4690K@4GHz with a GTX 1050 Ti EXOC (with an extra 150MHz GPU OC) on Windows. My opinion: if your game works on Linux and you only have an integrated GPU, try Linux to see if you get better performance.
Funny thing though, future desktop systems might become much like a SOC with 3D CPU+GPU+HBM all packed into a giant chip. This would mean bottlenecks to access memory would be gone and bandwidth would be through the roof... Not to mention power and space savings.
Back in the 90's you had dedicated hard drive controllers. Everything goes onboard/into the CPU at some point. APU is the future.
One thing worth mentioning: the higher temperature around the CPU socket (because of the integrated GPU), which is often surrounded by electrolytic capacitors, tends to reduce the lifetime of the motherboard by drying out the capacitors sooner.
Integrated graphics don't really suck from a content creator standpoint. They're a really helpful additional rendering and encoding engine if you do multiple video projects. I have both the iGPU and Nvidia GPU enabled. If your Nvidia or AMD GPU is busy and you want additional headroom, the integrated GPU comes into play to get the job done...
The spork was a really bad idea, it sucks at both being a fork and a spoon.
A spork is actually a perfect metaphor: it's not the best at being a fork, and neither is it the best at being a spoon. But it's usable in many general use cases.
To be fair, I beat half of COD4 on my sister's Intel 4500MHD, and half of Mass Effect 2 on a media PC's Intel HD 3000, so things are slooooowly improving.
Why not simply merge the CPU and GPU, and also merge RAM and graphics RAM together? The socket on the motherboard would be bigger, probably twice as big, but so would the heatspreader; there'd be many more pins on the socket for connections to GDDR, and the memory you buy would have two advertised capacities, a system memory and a graphics memory.
IGPs are awesome. You won't realize how awesome they are until you've been following GPU releases for over a decade, and you begin to realize, oh shit, this IGP is more powerful than XYZ, a card I had to remove a drive cage to fit in my full tower!
I read in a PC magazine about 12 years ago that with the advent of multicore CPUs we shouldn't need GPUs in the future. It went on to say that quad-core CPUs, or even more cores, would replace the need for a graphics card. I think the magazine was printed just when the Xbox 360 was in its infancy.
Both should actually be used. The iGPU could perform certain calculations, such as floating-point work, based on the "conversation" with a separate graphics card.
CPUs could always be made larger to accommodate graphics processors, and larger heatsinks could be used, as well as more and faster RAM. Both do not need to be on the same chip either; they can be on separate chiplets. AMD could probably do it if they really wanted to.
This is why my laptop with an old-ass i3 sucks when trying to play even online games. My next computer must have an actual GPU.
All our workstations use IGPs and it has improved reliability. I do not miss the days of having a separate graphics card.
I have an i7 5960X. There is no integrated GPU, and I’m very happy with that more space for raw computing power and cache.
3D stacking will allow APUs to get x80-class GPU performance, so this video will be out of date in 2-6 years, in case anyone is wondering. Also, I'm pretty sure CPU coolers are more effective than GPU coolers, based on some mods I saw where a guy put a CPU cooler on a GPU.
I dunno about the cooler, but definitely expect APUs to overtake everything in 5 years.
Should you download integrated graphics if you have a graphics card????
nah dont need
OK, idea for a video: how to optimize your setup when you're hung up on performance