Monitor performance question

Discussion in 'Hardware' started by Bull Shark, Feb 23, 2022.

  1. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
    Hi all,

As you can see in my sig, I have 3x 27-inch screens at 2560*1440, or 7680*1440 combined. With some games I need to switch to 1080p: 1920*1080, or 5760*1080 across the three.
Not a problem to do so, but I read somewhere that switching back from 1440p to non-native 1080p does not give the same FPS as gaming in native 1080p.
Is this correct? Does this mean I'd be better off buying a new triple-screen setup to gain FPS? The non-native 1080p resolution is still displayed on the 1440p screens, so my GPU does not have fewer pixels to calculate. There is only a lower DPI, if I understand correctly.
That said, my screens are made for DTP; they are not gaming screens. So they can only display 60Hz max. In 1440p that is not a problem, because my system can't produce a higher FPS at that resolution anyway. But when I game at the non-native 1080p, I could see more FPS if my screens could handle more than the current max of 60.
My screens still cost €300+ each, and I can find a lot of good screens made for gaming in that price range.

    Can anybody shine a light on this?
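For reference, the GPU's rendering load scales with the number of pixels actually rendered per frame, whatever the panel's native resolution is. A quick comparison of the two triple-screen modes in question:

```python
# Rendered pixels per frame for the two triple-screen modes discussed above
pixels_1440p = 7680 * 1440   # 3x 2560*1440
pixels_1080p = 5760 * 1080   # 3x 1920*1080

ratio = pixels_1440p / pixels_1080p
# 1440p triples render exactly 16/9 (~1.78x) the pixels of 1080p triples
print(f"1440p renders {ratio:.2f}x the pixels of 1080p")
```

So rendering at 1080p should noticeably lighten the GPU load even though the image ends up on the same 1440p panels.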
     
  2. Maskerader

    Maskerader Well-Known Member

    Joined:
    Oct 6, 2019
    Ratings:
    +355 / 0 / -0
If I understood it correctly, in your case game frames are rendered at 1080p and then upscaled to 1440p. Upscaling can happen either on your GPU or on the monitor itself, depending on what you choose in your GPU drivers. Either way, upscaling requires so little computing power (compared to what the GPU is capable of) that any FPS drop, if there is one at all, shouldn't be noticeable.

    The choice between higher resolution and higher refresh rate is purely a matter of personal preferences...
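To illustrate why the scaling step is so cheap: a scaler just maps each output pixel back to a source pixel (or blends a few neighbours), a constant, tiny amount of work per pixel compared to actually rendering the frame. A minimal nearest-neighbour sketch (illustrative only; real scalers typically use bilinear filtering or better):

```python
# Nearest-neighbour upscale: one read + one write per output pixel --
# trivial work compared to shading each rendered pixel.
def upscale_nearest(img, src_w, src_h, dst_w, dst_h):
    """img is a flat row-major list of pixels of size src_w * src_h."""
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h          # map output row to source row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # map output column to source column
            out.append(img[sy * src_w + sx])
    return out

# A 3x2 image scaled to 6x4: each source pixel is repeated 2x2
src = [1, 2, 3,
       4, 5, 6]
dst = upscale_nearest(src, 3, 2, 6, 4)
print(dst[:6])  # first output row: [1, 1, 2, 2, 3, 3]
```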
     
    • Like x 1
  3. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
    Thanks, will do some more tests.
Yes, 1080p must be upscaled to the 1440p panel of course; the screen is native 1440p.
That said, it looks like AMS, for example, runs smoother at 1440p with some settings toned down than at 1080p with all settings maxed out. :D
     
  4. DonaldD

    DonaldD Member

    Joined:
    Jan 6, 2022
    Ratings:
    +13 / 0 / -0
I'm not trying to confuse things, but as I understand your description: if you set the graphics card to output a 1080p picture (1920*1080), it is still shown filling the whole 1440p screen (no black borders).
But that doesn't (necessarily) mean the picture is upscaled to 1440p.
It can also mean the 1080p picture is just stretched to fill the whole screen, even though it is still just 1080p.
That's at least how I understand it.
Any monitor is able to stretch a lower-resolution picture to fill the whole screen.
     
  5. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
Indeed, no black borders, just a full screen. Although I had a 1080p monitor next to my 1440p monitor and compared the two. Both screens were running at 1080p, but I could not see a big difference. The 1440p was a tiny bit less sharp at 1080p than the native 1080p screen.
     
  6. ravey1981

    ravey1981 Well-Known Member Beta tester

    Joined:
    Apr 15, 2018
    Ratings:
    +873 / 0 / -0
You should get more FPS by rendering at 1080p than at 1440p. The only caveat is that running a lower resolution is more likely to expose a CPU bottleneck (if your CPU isn't up to the task), since the CPU no longer spends time waiting for the GPU, if that makes sense. I find it's always useful to cap your FPS at whatever you deem acceptable rather than let your system push itself to its hardware limit.
     
  7. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
I try to get at least a steady 60 FPS; that is acceptable for me.
The strange thing with FPS on my system is that when the game drops from 60 to 58 or so, I see the frame loss as a tiny hiccup. If it were a smooth drop in frames it would not bother me, but I can see the FPS drop and that makes me dislike it.
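One likely explanation for why a small drop reads as a hiccup rather than general roughness (assuming Vsync is on): with Vsync, every frame is shown for a whole number of refresh intervals, so a frame that misses its deadline is held for two intervals instead of one.

```python
# With Vsync at 60 Hz, each frame is displayed for a multiple of the
# refresh interval. A frame that misses its deadline is held for two
# intervals -- that single double-length frame is the visible stutter,
# even though "58 FPS" sounds close to 60.
refresh_hz = 60
interval_ms = 1000 / refresh_hz

print(f"on-time frame shown for: {interval_ms:.1f} ms")       # ~16.7 ms
print(f"missed frame shown for:  {2 * interval_ms:.1f} ms")   # ~33.3 ms
```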
     
  8. DonaldD

    DonaldD Member

    Joined:
    Jan 6, 2022
    Ratings:
    +13 / 0 / -0
It's probably this function in the driver that does what I called "stretching".
The correct word is "scaling", as can be seen.
That's how a 1080p resolution can fill a 1440p screen.
It has nothing to do with upsampling.

    scaling.png
     
    • Informative x 1
  9. Maskerader

    Maskerader Well-Known Member

    Joined:
    Oct 6, 2019
    Ratings:
    +355 / 0 / -0
Looks like you would benefit from monitors with a variable refresh rate (GSync or FreeSync). And it looks like your monitors already support FreeSync. They might not work well with your nVidia card, but it's worth trying. Another thing worth looking at: your monitors support up to a 75Hz refresh rate, although reaching it can sometimes be a bit complicated.

At the very least, if you're using regular Vsync, try switching to what nVidia calls "Adaptive" (set it in the nVidia control panel; in the game, set Vsync to "off"). Maybe the FPS drops will be less noticeable then.
     
    • Informative x 1
    Last edited: Feb 24, 2022
  10. Maskerader

    Maskerader Well-Known Member

    Joined:
    Oct 6, 2019
    Ratings:
    +355 / 0 / -0
    In the context of this thread it's effectively the same...
     
    • Like x 1
  11. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
According to Nvidia's list of supported monitor brands, Iiyama is not listed. And my GFX card does not have 3 DisplayPorts: 2x DisplayPort, DVI-D, 2x HDMI. As far as I understand, you have to connect the screens through DisplayPort if you want the benefits of FreeSync.
70 or 75Hz is only available when FreeSync is activated.

Edit: just read on a site that it should work with Iiyama. Will take a look at this later today. Thanks for the heads up. :D
     
    Last edited: Feb 24, 2022
  12. Maskerader

    Maskerader Well-Known Member

    Joined:
    Oct 6, 2019
    Ratings:
    +355 / 0 / -0
This doesn't mean it won't work. It only means that they either didn't test it, or tested it and the performance wasn't up to their standards. Since you already have your monitors, it won't hurt to try.
     
  13. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
Tested. The left and centre screens now run at 70Hz via DisplayPort with FreeSync active. The right screen is connected via HDMI and somehow it won't go above 60Hz. Strange. As you can see on the screenshot, it does not show the Nvidia logo either. I have tested 2 different HDMI cables. I really don't know what is going on here.
     

    Attached Files:

    Last edited: Feb 24, 2022
  14. Maskerader

    Maskerader Well-Known Member

    Joined:
    Oct 6, 2019
    Ratings:
    +355 / 0 / -0
    Looks like you do need DisplayPort for Freesync if you have an nVidia GPU...
     
  15. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
Yeah. I hooked it up to a DisplayPort and it gave 70Hz as well. Oh well. I have tested all kinds of settings. The best and smoothest is all screens at 60Hz with Vsync on. I'll keep it as it is now. Perhaps when I have some money to spend I'll look for other screens. For now it is OK for me. Not perfect, but what is perfect? :D Thanks for thinking along with me. ;)
     
    • Like x 1
  16. Maskerader

    Maskerader Well-Known Member

    Joined:
    Oct 6, 2019
    Ratings:
    +355 / 0 / -0
Did you cap your FPS for FreeSync (at 69 FPS, say)? In theory this should give you a smooth picture when FPS is at the cap, and when FPS drops below that, it should look better than with Vsync.
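The arithmetic behind capping just below the VRR ceiling: as long as each frame takes slightly longer than the panel's fastest refresh interval, the monitor keeps following the GPU and no Vsync wait is ever added.

```python
# Why cap at 69 FPS on a 70 Hz FreeSync panel: capped frame times stay
# just above the panel's minimum refresh interval, so frames remain
# inside the variable-refresh window and Vsync never kicks in.
max_hz = 70
cap_fps = 69  # suggested cap, just under the ceiling

min_interval_ms = 1000 / max_hz    # ~14.29 ms, fastest the panel can refresh
capped_frame_ms = 1000 / cap_fps   # ~14.49 ms, what the capped GPU delivers

assert capped_frame_ms > min_interval_ms  # always inside the VRR window
print(f"panel min interval: {min_interval_ms:.2f} ms, "
      f"capped frame: {capped_frame_ms:.2f} ms")
```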
     
  17. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
Tried everything, including an FPS cap, but for me (or my system, that is) it is best to run at 60 FPS with Vsync on. Smoothest experience. :D
     
    • Agree x 1
  18. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
My GFX card only supports HDMI 2.0; that is the problem, if I understand it correctly. HDMI 2.1 supports higher refresh rates at 1440p. So I don't have to look further: if I want all 3 screens at 70 or 75Hz, I need a newer GFX card. :(
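As a side note, a rough bandwidth estimate (assuming 8-bit RGB and ~10% blanking overhead; actual timings vary) suggests HDMI 2.0 itself has plenty of headroom for 1440p at 75Hz. What HDMI 2.1 adds that matters here is more likely its VRR support, since FreeSync over plain HDMI wasn't supported on these nVidia cards:

```python
# Rough bandwidth sanity check for 1440p @ 75 Hz over HDMI 2.0.
# Assumptions: 8 bits per channel RGB, ~10% blanking overhead.
# HDMI 2.0 carries roughly 14.4 Gbit/s of video payload.
width, height, hz = 2560, 1440, 75
bits_per_pixel = 24          # 8-bit RGB
blanking_overhead = 1.10     # assumed, real timings vary

gbps = width * height * hz * bits_per_pixel * blanking_overhead / 1e9
print(f"~{gbps:.1f} Gbit/s needed vs ~14.4 Gbit/s available")  # ~7.3 Gbit/s
```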
     
    • Agree x 1
  19. DonaldD

    DonaldD Member

    Joined:
    Jan 6, 2022
    Ratings:
    +13 / 0 / -0
You probably already know that it doesn't help to use some kind of HDMI-to-DP converter to try to get an extra DP port.
I have seen a lot of people try this but have never seen a success. ;)
     
    • Like x 1
  20. Bull Shark

    Bull Shark Well-Known Member

    Joined:
    Mar 14, 2019
    Ratings:
    +162 / 0 / -0
A converter does not turn HDMI 2.0 into 2.1 ;)