My boy is doing tech tips ASMR 💀
😆 ‘…Yeau gaming muniturrr’
Kinda sounds like Bill Lumbergh from Office Space in a way lol
Like hmmm yeah thatd be great
real
People who still use VGA: 👁️ 👄 👁️
Idk, VGA literally looks the same to me. I have two identical 1080p monitors side by side, one connected with VGA and the other with DVI-D. They look exactly the same; I just need to auto-sync the monitor once a year so it stays sharp
Me who uses rca 🌚
@@Brandon-uy1u VGA can't do above 60Hz and struggles with higher resolutions.
@@mofstar8683 my laptop won't go to 60 fps anyway
People who still use composite: 🪦
HDMI used to be limited to 60Hz, and I've been telling people for years DisplayPort is the only way to go if you want 144Hz+ (back when anything above 144Hz was outrageously expensive). Can't forget that advanced display setting either! I wish I hadn't procrastinated and had made these videos when I started 10+ years ago 😅
what about 1200 ghz?
Hdmi 2.1 exists my little bro
@@rajyavardhansingh4491 if you reread it says used to be limited little bro. thanks though.
@@Somni17 ok
Neither my PC nor my monitor supports DisplayPort, and my HDMI 2.1 is not giving me more than 60Hz. Will a DVI cable be better?
People who plug in there hdmi to there cpu with a 4090 👁 👄 👁
I do, but I plug in my pen display to the motherboard instead, and my four monitors go to my 4090.
@@Tunnelsnakeswhat is a pen display?
@@Tyrant02 drawing tablet screen
There, their, they're. God, elementary English.
@Antboi4653 Okay, proffessor English. No need to get your panties in a twist.
I used HDMI for a long time not knowing why I couldn't get the full 165Hz my monitor said I would have. Well, I didn't realize there's something called a DP cable that's much better lmao. Console to PC struggles
Don’t monitors come with them?
Mine didn't, would've saved me a lot of confusion if it did 😂
@@skarfacegaming243 u sure u didn't just throw it away or toss it aside not knowing what it was? Because I've never seen a monitor that does 165Hz that doesn't come with one, unless yours was somehow used. What brand was this?
@@GodisGood941 Even worse: the included cable doesn't even support the full bandwidth for 165 Hz. Always look out for that possibility.
Does your GPU work harder at higher hertz? I got the DP cable but never used it, cause my HDMI is at 100Hz and I figured higher is only going to make my card work harder for no reason, really, other than to say I got 165Hz. Or is the card going to work the same? Let me know, cause now I am curious.
So we aint gonna talk about how smooth he slipped that hdmi cable into the gpu mhm yeah
It was probably reversed footage
I thought it looked weird.. kind of like timelapse footage.
also keep in mind if you have one of those old chunky DVI cables, these can also support up to a 144Hz refresh rate, unlike HDMI, which as mentioned in the video only supports up to 100Hz. This might only be effective for older cards that have DVI ports; I have a GTX 1070, which could be a tad dated by today's standards but not uncommon
HDMI is absolutely not limited to 100Hz. IIRC, HDMI 1.4 can do 1080p 144Hz just fine.
Furthermore, DVI is just HDMI and VGA wearing a trench coat, and both HDMI and VGA are perfectly capable of going past 100Hz.
Is DVI-D better, worse, or the same?
HDMI 2.1 can do up to 144hz at 4k.
@@user-mh7we5ik4g From a signal perspective, HDMI is the exact same signal as DVI (explicitly _not_ VGA), except it can also carry audio and supports encryption.
Then they added a few things.
If you just want to know whether the image is better or worse: no difference, except that if an older TV has DVI it *will* default to 4:1:1 and *never* give you 4:4:4, at best 4:2:2, and full 4:4:4 output may not even be supported.
Thank you so much !!!!!! I had a bug with my monitor because it didn't find the source, it kept blinking, but after that tip, it fixed it !!!
Thanks! Monitor wasn't recognizing display port, connected to GPU directly and worked!
My monitor only has S-video. That’s a great cable right? No need to upgrade
😂
😂😂
Yes, the S stands for Super. You can get any resolution and hz you want.
@@hermanwooster8944💀💀
I'm so glad that, being a programmer, I already understand and know all this myself
Left out the part that some motherboards won't let you access BIOS unless you plug the monitor back into the motherboard's DP or HDMI port. It's not well known, but if someone is having trouble getting into BIOS, this is why.
You have saved my life!! My uncle gave me his old Dell monitor and it just wouldn't come out of power saving mode! Moved the HDMI cord to the GPU and it worked, THANK YOU!!! Saved me hundreds. I plan on getting a newer monitor but this will do until then :DDDDD So excited
He missed this. So I'll give you a hand.
If you get a monitor that supports DisplayPort and HDMI, ALWAYS, and I'll say it again, AL-WAYS go for the DisplayPort. DisplayPort is just better, since it sends more data and can support more high-profile settings.
@@Undersnow243 Nope, you're comparing old DisplayPort tech to new HDMI tech, while also not pointing out that that HDMI version won't exist on monitors made before 2021 except in rare cases. So let's compare new HDMI to new DisplayPort: HDMI 2.1 data rate: 42 Gb/s, DisplayPort 2.1 data rate: 77.37 Gb/s... you definitely lost this argument.
@@theendoftheline the vast majority of people don't have the latest variation of DisplayPort available that their monitor can take advantage of
@@MrVidification nor the latest hdmi, that was the point I made, thanks.
@@MrVidification to fully clarify, hopefully enough: DisplayPort 1.2 has been out since 2010, while HDMI 2.0 has been out since 2013. But even counting the 2016 revision, DisplayPort is still a better choice, so most consumer devices will do better if they use DisplayPort, even if one end has HDMI 2.1, since you need HDMI 2.1 on both ends to make it work.
@@Undersnow243 I believe he was saying it like... "have older hardware, ie 2010 to 2020? Go DisplayPort. Have 2022/23 hardware? Go whichever/HDMI." Seemed to me he was saying more people have hardware from 2010 to 2020 than the "latest", and therefore DP, if available, is the better/safer choice for more people...
I don't pretend to remember all the details, but it was my understanding DP was the better choice most times in the past. And it seems gen to gen it still is.
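The data rates being argued over in this thread are easy to sanity-check with a little arithmetic. A rough Python sketch, using commonly cited maximum payload rates (after line-code overhead) for each link version; the 20% blanking allowance is an assumption for illustration, real timings vary:

```python
# Rough uncompressed video bandwidth check: does a given mode fit a given link?
# Link payload rates (Gbit/s, after line-code overhead) - commonly cited figures.
LINK_GBPS = {
    "HDMI 2.0": 14.4,         # 18 Gbit/s TMDS, 8b/10b encoding
    "HDMI 2.1": 42.67,        # 48 Gbit/s FRL, 16b/18b encoding
    "DP 1.4 (HBR3)": 25.92,   # 32.4 Gbit/s, 8b/10b encoding
    "DP 2.1 (UHBR20)": 77.37, # 80 Gbit/s, 128b/132b encoding
}

def mode_gbps(width, height, hz, bits_per_pixel=24, blanking=1.2):
    """Raw pixel data rate in Gbit/s; blanking=1.2 is a rough 20% allowance."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

def links_that_fit(width, height, hz, bpp=24):
    need = mode_gbps(width, height, hz, bpp)
    return [name for name, rate in LINK_GBPS.items() if rate >= need]

print(round(mode_gbps(2560, 1440, 144), 2))  # 1440p 144Hz 8-bit: ~15.29 Gbit/s
print(links_that_fit(2560, 1440, 360))       # 1440p 360Hz needs a 2.1-class link
```

Whichever side of the argument you take, it's the payload rate of the specific version on both ends that decides, not the connector name.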
(1) Onboard Video only works if the CPU has integrated Graphics
(2) The DP interface also provides color accuracy that the HDMI standard does not support.
(3) Onboard Video can be used for additional monitors where all the graphics card interfaces are already connected to monitors and the computer is used for other non-gaming/rendering functions like web browsing, office apps, etc...
Make sure to select a PC resolution in the Nvidia Control Panel, and select the correct one, because sometimes the max refresh rate doesn't show up under the other settings.
Just switched to DP last night for 165 instead of 144.
😅 It made a smooth and noticeable difference in Destiny 2 and Ghostwire: Tokyo
Also depends on the monitor. On some monitors HDMI is better, depending on the HDMI and DisplayPort versions.
Thank you, like legit saved my day
some motherboards disable the onboard GPU when a dedicated GPU is inserted
very nice and easy tip that lots of people who are not used to PCs just don't know about ;)
My monitor went from a 144Hz maximum refresh rate with HDMI to 165Hz with DisplayPort, definitely would recommend!
And get a good cable. So many people complaining about graphics glitches and their graphics cards running hot when the problem was just the cable.
That's only if your processor has integrated graphics as well… if it doesn't, you won't get any image at all
if you have a high end monitor from the last few years, it's worth checking what it can run over HDMI.
HDMI 2.1 added adaptive sync and has some versions that are faster than the very common DisplayPort 1.4, and only AMD 7000-series and Intel Arc GPUs support DP 2.0, and few monitors do too.
Meanwhile, HDMI 2.1 was added to AMD 6000-series and Nvidia 3000-series cards.
Something called Display Stream Compression (DSC) is used when the raw link rate isn't enough.
(HDMI 2.1 is a mess, because they rewrote the whole HDMI standard to be packet based, which is the whole reason adaptive sync works now or something, and they also folded the slower HDMI 2.0 speeds into it while still calling that 2.1.)
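The Display Stream Compression mentioned above is what makes tight modes workable: it compresses the stream (typically quoted at up to about 3:1, visually lossless) so a mode can fit a link it wouldn't fit raw. A minimal sketch; the payload rate and ratio here are illustrative assumptions:

```python
# Does a mode fit a link once Display Stream Compression (DSC) is applied?
# DSC is typically quoted at up to ~3:1 visually lossless compression.
def raw_gbps(w, h, hz, bpp):
    return w * h * hz * bpp / 1e9   # pixel data only, blanking ignored here

def fits(link_gbps, w, h, hz, bpp, dsc_ratio=1.0):
    return raw_gbps(w, h, hz, bpp) / dsc_ratio <= link_gbps

# 4K 144Hz 10-bit (30 bpp) vs DP 1.4's ~25.92 Gbit/s payload:
print(fits(25.92, 3840, 2160, 144, 30))               # uncompressed: does not fit
print(fits(25.92, 3840, 2160, 144, 30, dsc_ratio=3))  # with 3:1 DSC: fits
```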
You just saved my day thank you so much❤❤❤❤
I need help with DisplayPort, but great video.
Forgot about our favorite dvi and vga 💪💪
Thx to this I used the cable that came in the monitor box, now I get the full 180Hz instead of the 120Hz I got over HDMI 😊
Me chilling with my core 2 duo igpu🗿🚬
That was lit at the time.
me who doesn't even have a gpu 😎
me chilling with my RX 5700 XT 🍺🗿
@@AusaradioSoftware renderer?
@@marioprawirosudiro7301 I think I was just talking about not having a dedicated GPU (even though that's what the main comment was talking about), my apologies
BRO THANK U SO MUCH, U SOLVED MY PROBLEM
Glad to help
I see some here are arguing about which cable is better, but my BenQ HDR monitor has both ports, and with DisplayPort the G-Sync/FreeSync gets activated.
So I am using a DP cable.
It looks like DP is made for PC use, while HDMI is made for the home theatre environment, where the cable feeds 7.1 Dolby Atmos with Dolby Vision, HDR+ etc...
It's fine to plug into the mobo if you just need to check if it's working
generally if display port is available go with that
Also, not having the proper version of the cable for certain resolutions and refresh rates. If you got a DisplayPort 1.4 cable but got a 1440p 360Hz monitor, ya need a DisplayPort 2.1 cable for maximum performance... Made this mistake 😊
Thanks bro i love you
HDMI 2.1 is better than Displayport unless you are able to use DP 2.1
Thank you so much
You're most welcome
And G-Sync and FreeSync are only supported over DisplayPort or HDMI 2.1
Yeah because sometimes there’s no output if you put the connector on the motherboard.
Huge shoutout to this video because I forgot to change my monitor settings back to 144hz after messing with it lmao
This could be misleading. I've got boards with hdmi, hdmi and dp, and then I've got boards with no display out.
Or use a DisplayPort if you have 240Hz or anything, cause DisplayPort handles more Hz
I was using 59 the whole time and now I'm using 75Hz, thanks
👍🏼
Be careful, if you have a ryzen CPU that does not end in G or an intel core CPU that ends in F, the CPU does not have integrated graphics and the motherboard display connectors will not work.
This is what I think I'm running into. I have an i5-12400F. Bought my DP cables today and got nothing but a black screen. HDMI works just fine and is allowing a 144Hz refresh rate with a high speed HDMI cable. Not sure how worried I should be about the DP
Always use the Nvidia control panel to change resolutions and refresh rates.
Using Windows settings resets the refresh rate if you change resolutions, which is annoying.
I have DP plugged into the mb directly, using an RTX 3060 as the dedicated GPU, and I still get the performance.
The i5-12600K is not getting too hot.
Reason is, if I plug the cable into the 3060, it results in games crashing to desktop (CTD)
My GPU maxes out my VRAM when both monitors are plugged into it for some reason, so I have the gaming monitor plugged into the GPU and the secondary plugged into the motherboard🤷🏼♂️
I knew about plugging directly into gpu but not about the display cable. Making the switch!😂😂😂😂😂
Even my HDMI 2.1 can't do above 60Hz on PC, but on console it's fine
But you can get 144Hz G-Sync compatible over HDMI no trouble, now you know, on a 4K Samsung
DP is always the port to go for when it comes to computers. Leave HDMI for the home theater, aka the idiot box...
Doesn’t matter what your refresh rate is if your potato can’t push more than 18 FPS
How does that go for display port to HDMI adapters?
You shouldn't run your monitor at the highest refresh rate, it shortens the life of the monitor. Knock it down to slightly less than that.
With gaming, most people will still want the highest refresh rate. Even with a monitor being high refresh rate, it won't shorten the lifespan enough to where it will become a problem. High refresh monitors will last years on end no issues :)
@@JustLinuxMan not always true, I've run an ASUS ROG monitor as well as an HP Omen high refresh rate monitor, both having issues after a few years of use at the highest rate on 1440p. I've had a computer tech come out to check my system, which was fine, and these were his words.
@@DertyD26 Sorry about that then. I've had my ASUS ROG monitor for 3 years at 300Hz with no issues whatsoever, and my cousin has had a 144Hz monitor for around 5, again with no issues. Didn't know it was common for monitors to do that!
@@DertyD26 Good job spreading false info.
@@Brooo0-x6y ok…I’m just telling you what a computer professional told me. 🤷♂️
He did get a higher frame rate, but what did he sacrifice? 10-bit to 8-bit color? Full RGB to 4:2:2 chroma subsampling? Wish he told us what he lost by doing this
🤔🤔🤔
he improved the bandwidth, allowing him to gain whichever he chose. facts.
for a 4K 144Hz 10-bit color depth monitor there's no other option except HDMI 2.1
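Both trade-offs asked about above, bit depth and chroma subsampling, reduce to bits per pixel, which you can tabulate directly. A small sketch using the standard J:a:b sample factors:

```python
# Bits per pixel for common bit depth / chroma subsampling combinations.
# The factor is the average number of color components carried per pixel.
CHROMA_SAMPLES = {
    "4:4:4": 3.0,  # full chroma: 1 luma + 2 chroma per pixel
    "4:2:2": 2.0,  # chroma halved horizontally: 1 + 0.5 + 0.5
    "4:2:0": 1.5,  # chroma halved both ways: 1 + 0.25 + 0.25
}

def bits_per_pixel(depth, subsampling):
    return depth * CHROMA_SAMPLES[subsampling]

print(bits_per_pixel(10, "4:4:4"))  # 30.0 - full 10-bit RGB/YCbCr
print(bits_per_pixel(8, "4:4:4"))   # 24.0 - full 8-bit
print(bits_per_pixel(10, "4:2:2"))  # 20.0 - 10-bit with subsampled chroma
```

Dropping from 10-bit 4:4:4 (30 bpp) to 8-bit 4:2:2 (16 bpp) roughly halves the bandwidth, which is exactly the kind of silent downgrade the comment is asking about.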
It's sad that some PCs don't have a DisplayPort, so we need a DP to HDMI converter for it to connect to the monitor
Not all are like that
Life saver
I have that hdmi cable
Should I plug in both hdmi and display port cables? How does it work?
I have my DisplayPort plugged into my main monitor, which is 165Hz, and my HDMI for my second monitor, which is my mom's old 10-year-old monitor
I'm wanting to use a dual monitor setup. I have a 4080 Super and a 7800X3D, should I use the integrated graphics and the GPU, or just the GPU? One monitor is 240Hz and the other is 165Hz, both 1440p. Thanks
It should be self-evident that the video output of a graphics card far surpasses that of a motherboard, but people tend to take the easy way and then complain about their own errors
Does it work if I plug HDMI into my monitor but into the PC with an adapter, like HDMI to DisplayPort?
What happens if my case only has 1 HDMI?
In your control panel I noticed your native resolution was 1440p in the PC list, any idea why my native is in the Ultra HD, HD, SD list???
UHD is 4K, QHD is 1440p (2K gets used loosely for it, though DCI 2K is really closer to 1080p), FHD is 1080p, HD is 720p, and I assume SD is below that
I haven't ever used displayport, my monitor has it but I've just used HDMI cable
It's 4k 60hz anyway and works well
HDMI 2.1 is 4K 120Hz
@@maxkruppa3716 I didn't know 4K existed at 120Hz. The monitor probably has to be *very* expensive.
@Gameplayer55055 depends on the other specs, I saw a 32 inch 4k 120hz monitor although it had a 5ms response time 😂
@@maxkruppa3716 Akshually HDMI 2.1 can do up to 144hz at 4k.
@@Gameplayer55055 4K 120Hz is common now. Even the 50" HDTV I'm using as a big screen monitor supports 4K@120Hz and VRR/FreeSync, and that was only $500 (the model is 50QNED80URA just in case you were wondering; it's a VA panel, but it has an unusually wide viewing angle for VA panels and makes a pretty good SDR monitor, even if it's only an "average" HDR TV)
bro you saved my life, tf, me and my buddy have been trying to get my monitor to connect for the past 30 min to an hour
And what's the monitor's backlight LED refresh rate? Does it also switch to 120 Hz 🤔
"You won't get the best performance"
True, but I think not enough people actually know why that is.
Say you have two GPUs, GPU A and GPU B, and your monitor is plugged directly into GPU A. GPU A can run at its fullest.
But when you want to use GPU B, it has to send its images through the PCIe slot, through the CPU, and through GPU A to get to the plug, cable, and monitor.
That's a much longer and more complicated path for your images to travel, which adds latency and costs GPU B performance. GPU A might also have to work a little to make this happen!
When you have an iGPU and a dGPU, like in this video, it's exactly the same, except that your iGPU isn't on a graphics card with its own plugs; it's on your CPU chip, which is on your motherboard.
And usually a dGPU is much more powerful than an iGPU, so the performance loss from routing the dGPU through the iGPU is more noticeable than the loss from routing the iGPU through the dGPU.
That is why the latter is preferred, and why you should usually plug your monitor into your dGPU.
Btw, on some motherboards you need to enable that feature in the firmware settings.
for some reason my dp cable never works when i plug it in. had it happen on my last pc and my new one
The video connector on the mobo shouldn't work when a discrete GPU is installed. If you plug the cable into the motherboard, you won't get a picture.
quick conundrum: PC works fine, but I get the VGA LED on the mobo cause my screen is slower to boot than my GPU. Any fix without pushing the monitor button every time I start up?
I was running 60hz for like 6 months before I realized I could go to 144
Internal graphics vs dedicated card. I thought most knew the difference 😮
What is that software to check cable?
But somehow my pc doesn’t have a display port but my monitor does.
Wreckfest is an awesome game by the way.
Yup
What HDMI to DisplayPort cable do I need to overclock my monitor connected to a gaming laptop?
But where does this VGA cable go?
If I'm using a laptop (Lenovo Legion 5 Pro), should I use HDMI to DisplayPort or HDMI to HDMI? Because there is only one HDMI port, so I'm quite confused. Please advise, thanks
what if my CPU is much better than my GPU, should I still plug my DisplayPort into my GPU?
I've got a 34 inch 165Hz monitor using DP, but if I enable 10-bit, it drops to 144Hz. I'm still happy though, I can't really notice the 20Hz less lol. A DP connection is always the best, unless you have an HDMI 2.1 device lol
Also keep in mind you need TWO HDMI 2.1 devices, not just the 1, or either the graphics card or the monitor is going to keep you at the lower speed. DP 1.4 has been around a lot longer than HDMI 2.1 and is still faster than HDMI 2.0
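The "you need TWO HDMI 2.1 devices" point generalizes: the negotiated link is only as fast as the slowest element in the chain. A toy sketch with assumed payload rates (Gbit/s):

```python
# The negotiated link is only as fast as the slowest of source, cable, sink.
RATE_GBPS = {"HDMI 2.0": 14.4, "HDMI 2.1": 42.67, "DP 1.4": 25.92}

def effective_link(gpu, cable, monitor):
    # Pick whichever element in the chain supports the lowest data rate.
    return min(gpu, cable, monitor, key=RATE_GBPS.get)

# HDMI 2.1 GPU into an HDMI 2.0 monitor still negotiates HDMI 2.0 speeds:
print(effective_link("HDMI 2.1", "HDMI 2.1", "HDMI 2.0"))  # "HDMI 2.0"
```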
I have a question for you! Which one is the best between the UltraGear 45 OLED and the AW3423DWF?? I can't seem to make a choice, I love the bigger 45" panel but the QD is just so much better... Help!
And how about DVI? I have heard that DVI gives even more resolution quality
There's black bars on my GPU, do I have to take them off?
What if I'm out of DP ports and the only other DP is on my motherboard?
Does DisplayPort work with monitor speakers like HDMI can?
Does anyone know why, when I use a DisplayPort cable, the picture quality is noticeably worse than with my HDMI cable?
I get a better refresh rate but the picture quality looks MEH. LG 27GL83A-B with an RX 6800
i have integrated graphics.. now what
If I don't split it, it won't show a screen…
*PRESS ENTER TO START*
I'm getting a black screen on my monitor when connected via DP but not HDMI, any idea what could be the issue?
That happened to me before, can't remember what I did... are both plugged in at the same time? Maybe try one at a time
So can I use an HDMI, put an adapter on it so that I can connect it to the DisplayPort for 175Hz? I have an Xbox Series X but my monitor is on DisplayPort
Xbox doesn't go above 120hz, no matter the cable
My gpu is so bad I have to plug the hdmi into the cpu hdmi slot
Hi,
I have a 4K ultrawide 175Hz monitor and my PC has an RTX 4070 Ti Super. When I plug in HDMI I get a 120Hz refresh rate without any problems. Now, there are 3 DisplayPorts on my GPU, and two of them cause my screen to flicker and blink very often. The center DP port is much better and only flickers sometimes.
As soon as I drop the refresh rate to 60Hz in my Windows display settings, everything works fine.
Now, the reason why 2 DPs are blinking and the other one is fine is really confusing, probably a GPU issue?
Why is the screen working fine at 60Hz and not 175Hz?
I am really confused and have tried every solution.
Did u update driver?
Is there any HDMI which supports 170Hz? Cus mine only supports 144Hz
My motherboard does not have video outs, as it is an HEDT platform, so there is no integrated GPU.
I've set my display to max settings, plugged into the GPU, and now I get a black screen on startup and have to restart the video driver every single time.