Poll - Who is running SLI/CF

Discussion in 'General Discussion' started by FunkyChicken, Jun 20, 2015.


Are you running a multi-GPU system?

Poll closed Jul 18, 2015.
  1. Yes, Nvidia SLI with two or more GPUs

    9 vote(s)
    19.1%
  2. Yes, AMD CrossFire with two or more GPUs

    4 vote(s)
    8.5%
  3. Not yet but planning to go for SLI

    2 vote(s)
    4.3%
  4. Not yet but planning to go for CrossFire

    2 vote(s)
    4.3%
  5. No, one card is fine for me

    30 vote(s)
    63.8%
  1. FunkyChicken

    FunkyChicken Well-Known Member

    Joined:
    Jan 30, 2015
    Ratings:
    +50 / 0 / -0
    As more and more people out there are running multi-GPU systems, this poll is intended to get an overview of the actual user base. Are multi-GPU systems already a common hardware base among R3E drivers?
     
  2. garytc78

    garytc78 Member

    Joined:
    Jun 12, 2015
    Ratings:
    +6 / 0 / -0
    My experience with SLI?
    It's disappointing, especially in racing games and sims. Poor support, in other words. It very often produces stuttering and sudden framerate drops.

    My advice?
    Use one powerful GPU and forget about SLI.
     
    • Agree x 5
  3. The_Grunt

    The_Grunt Well-Known Member

    Joined:
    Jun 19, 2015
    Ratings:
    +168 / 0 / -0
    I have also had a couple of SLI setups (most recently dual GTX 670s) and generally, I haven't been pleased with them. Micro-stuttering is still evident, and because of that you need higher FPS with SLI to get the same smoothness as comparable single-card performance. I think that R3E will get its SLI/CF profiles at some point, but as long as development is ongoing and the rendering engine keeps changing along with it, there is no point putting a big effort into the multi-GPU option, IMO. It is probably one of the last steps of development.
     
  4. Robert Gerke

    Robert Gerke Member

    Joined:
    Jun 2, 2015
    Ratings:
    +9 / 0 / -0
    SLI since 2008 ..... the best thing for enthusiasts and triple-screen users ;)

    Sorry, but it's a shame that S3 doesn't support SLI; all other sims
    have done that. For now I use a custom SLI profile bit (0x00400405).
    +75% more power. Not perfect, but good and very smooth at
    5760 x 1080 with 8x MSAA + 8x SSAA.

    Hope that S3 wakes up here and gives us +90% ;-) Thanks

    PS: Never without an SLI system... never!
     
  5. Cosmic

    Cosmic Well-Known Member

    Joined:
    Feb 19, 2015
    Ratings:
    +74 / 0 / -0
    The only good SLI experience I ever had was back in the 3DFX days; every time I've tried it since then has been a pain in the a$$.
     
  6. FunkyChicken

    FunkyChicken Well-Known Member

    Joined:
    Jan 30, 2015
    Ratings:
    +50 / 0 / -0
    The intention of the poll was not to list the pros and cons of SLI/CF but to see how big the user base is nowadays. Besides this, with a well-optimized graphics engine, a dual-card system can bring the experience to another level. There are enough good examples out there.
    The latest graphics cards are extremely powerful, indeed, but even these are not enough for the highest visual quality at high resolutions (triples or 4K). So there is no solution other than dual-GPU configs.
    I would be very pleased if we could get well-optimized SLI/CF support for R3E; on the other side, I do understand it would be nonsense if nobody is using it. That's what the poll is for ... to see if it would be worth it ;)
     
    • Like x 1
  7. Arthur Spooner

    Arthur Spooner Well-Known Member

    Joined:
    Feb 5, 2015
    Ratings:
    +432 / 0 / -0
    I'm curious. I don't use any multi-GPU solution so far, but I always read that there are issues with so-called micro-stuttering. From what I read, it comes from problems when trying to sync the output of both GPUs. Is this true? How bad is it? Or is this something some people are annoyed by a lot while others don't even notice it? Like how some people get nausea when playing certain 3D shooters while others have no problem.
     
  8. Sean Kenney

    Sean Kenney Well-Known Member

    Joined:
    Feb 9, 2015
    Ratings:
    +316 / 0 / -0
    Seems like SLI/CrossFire peeps are always having issues... to me.
     
    • Agree x 1
  9. D.Boon

    D.Boon Well-Known Member

    Joined:
    Feb 19, 2015
    Ratings:
    +386 / 0 / -0
    I would love to run SLI or CF, but life hasn't quite worked out how I expected it to, so I can't afford more than one (now old) card. Unfortunately, kids and the missus come first. :confused:
     
  10. ElNino

    ElNino Well-Known Member

    Joined:
    Feb 7, 2015
    Ratings:
    +475 / 0 / -0
    I agree about the single card being a better choice; however, once you get into triple monitors and have an obsession with getting games to run on higher settings, even a powerful card (290X in my case) needs help, especially with tracks like Sonoma that seem to hog resources.

    This is why I would love to see more games supporting CrossFire/SLI in the driving genre, where triple screens are more common. Maybe we'll hit the jackpot and they will upgrade to DX12! :D
     
    • Like x 1
  11. D.Boon

    D.Boon Well-Known Member

    Joined:
    Feb 19, 2015
    Ratings:
    +386 / 0 / -0
    They can do that when I get an R9 300 series/Fury GPU; until then, I'd be happy if they just upgraded to DX10.1 lol.

    It's worth noting that the minimum system requirements listed for R3E require a card that actually supports up to DX11... Just saying.
    Upgrading to a different DX version, though, may require a massive overhaul as well as the purchase of that particular version's developer tools. I imagine it would be costly and time-consuming, but the benefits may be worth it, particularly for shadows and draw distance.
    Again... Just saying ;)
     
  12. ElNino

    ElNino Well-Known Member

    Joined:
    Feb 7, 2015
    Ratings:
    +475 / 0 / -0
    ^^^ Yeah, agreed, I think DX12 is a pipe dream for R3E. But given what they are saying about multi-GPU support with it, which is that it will basically use any GPUs available without needing CrossFire/SLI, it just sounds like it will be a godsend for multi-GPU owners, or even APU/GPU users.

    That is an interesting point about the DX11 requirement! Obviously, we should expect the DX11 upgrade in the next hotfix patch lol :D
     
  13. FunkyChicken

    FunkyChicken Well-Known Member

    Joined:
    Jan 30, 2015
    Ratings:
    +50 / 0 / -0
    Compared to other sims, and given the fact that R3E uses DX9, the engine is "brilliant". The visual quality at high settings is just great, and even then it performs very well. With just a few settings lowered a bit, I'm able to run it in 4K with a single 970. Something that's impossible with e.g. AC or PCars.
    But with the latest additions like Sonoma or Shanghai, single cards start to struggle at high resolutions. I'm sure the engine is capable of even more ... night/day transitions had been mentioned ages ago, and weather as well?!? (I'm not sure). Once more graphical highlights are added, it will be impossible to run high settings with a single card. So the easiest "fix", without changing the architecture of the engine, is to implement proper SLI/CF support. That way, you can boost the visual quality even when using cheaper cards...

    Another option is to switch the architecture to, let's say, DX11. But like Ronin said, it is a huge amount of work to do this properly, and I'm not sure it would solve all the upcoming problems. On the other hand, it would be a good base for future developments: more up-to-date and capable of adding some nice "special effects" where a DX9 engine might fail.

    Regarding DX12 ... it's the right way forward for graphics and the potential is great, but for now, IMHO, it's totally over-hyped. All the demos and presentations make it look like you just need to swap the DX11 render path for DX12 and you will get a drastic performance increase ;) There might be a slight increase on low-end systems due to the reduced draw-call overhead. But to get a proper implementation and convert the whole potential into real performance, new engines need to be developed. For the first year or so I only see this happening in playable tech demos (like Crysis) or games from the major studios with enough manpower and money. And that's despite the fact that you need DX12 hardware...
     
    Last edited: Jun 22, 2015
  14. Oliver Augst

    Oliver Augst Well-Known Member

    Joined:
    Jan 29, 2015
    Ratings:
    +69 / 0 / -0
    I will probably upgrade to SLI later this year, so I am for SLI and triple-screen support.
     
  15. nate

    nate Well-Known Member

    Joined:
    Jan 31, 2015
    Ratings:
    +875 / 0 / -0
    Since this is turning into a bit of a discussion regarding multi-GPU setups and their effectiveness, I might as well share some of my thoughts on the matter, coming from someone who has been using 2 GPUs for about 6-7 months now.

    I was always quite aware of all the criticism mentioned in threads where people ask for advice about SLI/CrossFire: the micro-stuttering, the lack of performance scaling, the high cost compared to a single card of equivalent performance, the lack of VRAM scaling... Then on the opposite side, there were equally as many people mentioning how adding a 2nd GPU can extend the life of your system by adding more performance at a 'reasonable' cost, as well as provide that extra 'grunt' for those who need it with triple monitors or high resolutions.

    Anyway, I initially had an Nvidia GTX 660 for about a year before adding a 2nd. I only play at 1080p, so anything more powerful than that was never a desire of mine. The overall performance is about as good as I could ask for without splurging on a single high-end card.

    However, as I have come to learn through using 2 GPUs... The performance in "most" games that support the feature is usually close to double what my single card delivers. Which makes sense, right? You are adding a 2nd GPU, so you would expect the performance to double. In the games that almost require it, like a few racing sims, I get much better frame rates at the same settings. More fps is always better, right? :p I run without v-sync in every racing game I play. Even on a 60 Hz monitor I never experience any screen tearing. Maybe I'm fortunate in this regard, as many people complain about tearing and use v-sync, but it has never been an issue for me. Along with the better frame rates, I can crank up some of the graphics settings, which is nice of course.

    The issues arise with something I alluded to before... Some games just don't support multi-GPU systems. And when you come across a game like that, it takes the wind out of your sails a bit when you realize you have a monster of a system and can't utilize it as intended. There are many examples of this out there; many, many games don't support multi-GPU configs. Perhaps some of it is driver related, perhaps some of it is game-developer related, I'm not sure. I do know, though, that some games that don't have official AMD/Nvidia multi-GPU support do in fact work through the developer's own code, without the help of AMD/Nvidia.

    I'm not at all well versed in this, but apparently it is possible to "force" SLI, and I would assume CrossFire, even in games that don't support it officially or through drivers. Although this seems to be a YMMV type of thing: sometimes you can hack it to work well, other times it breaks the graphics somehow and gives you flickering, artifacting, or worse.

    Beyond that, I have only ever experienced the micro-stuttering very seldom. Rarely enough for me to completely write it off as a non-issue. Stuttering and lag happen. It goes with the territory of trying to push a game to its graphical limit, or at least what your system will allow for. ;)

    The VRAM concern isn't an issue for me, as no game I play saturates the frame buffer at 1080p. And for the games that do... They are either horribly optimized (I'm looking at you, Watch_Dogs), or my GTX 660s aren't powerful enough to produce a good enough framerate when the frame buffer is completely saturated. Basically, my cards are too weak to fully utilize their VRAM. If I played at high resolutions, they would be gasping for air long before they hit their VRAM limit.

    Regarding DX12... I understand the praise and eagerness for it. Going off Microsoft's claims, many people are expecting it to be a magic bullet of sorts, providing performance leaps and bounds better than what's currently available. Well, call me pessimistic, but Microsoft says this every time they push out a new DX version. It's how they sell people on it. It's just marketing. It's alright to get hyped up about something that is proven to be effective and available, but I see this transition going differently.

    DX12 is still a long way off. Games that are already out aren't going to get an upgrade to DX12; it would make zero financial sense from a developer's perspective to completely redo the graphics engine of a game that has already been made. Adoption of DX12 by game developers will also likely be slow. Disregarding that many people won't even be able to use DX12 because their GPU is too old... Just look at the adoption of DX11, which is already out. Many games still aren't even being made with it :p

    That said, DX12 looks promising, and I look forward to it eventually. And as for multi-GPU configs... The results aren't far from what I expected. Sure, there have been headaches and troubleshooting trying to figure things out. However, most of the commonly discussed issues are things of the past.

    I guess the final question is... Am I happy with my multi-GPU setup, and if I had it to do over again, would I? Hell yes :cool:

    Cheers
     
    • Agree x 1
  16. pixeljetstream

    pixeljetstream Well-Known Member Beta tester

    Joined:
    Jan 29, 2015
    Ratings:
    +412 / 0 / -0
    Some clarifications on DX12: The API exposes multiple devices explicitly, which means you can do task A on GPU 0 and task B on GPU 1. This is somewhat possible today in OpenGL through vendor-specific extensions, but mostly in the field of workstation applications. It also allows using devices from different vendors, say an Intel integrated GPU doing some work as well; however, it of course means that data sharing across vendors is going to be slowish (piped through CPU RAM).
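
    To make that concrete, here is a minimal sketch (my own illustration, not from any shipping engine) of what "exposing multiple devices explicitly" looks like on the app side: enumerate the adapters through DXGI and create one D3D12 device per physical GPU. The function name and structure are made up for this post; the API calls themselves (CreateDXGIFactory1, EnumAdapters1, D3D12CreateDevice) are the real entry points.

    Code:
    // Sketch: one D3D12 device per physical GPU, enumerated via DXGI.
    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
    {
        ComPtr<IDXGIFactory4> factory;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));

        std::vector<ComPtr<ID3D12Device>> devices;
        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
             ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                continue; // skip the WARP software rasterizer

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                            D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device))))
                devices.push_back(device); // task A -> devices[0], task B -> devices[1]
        }
        return devices;
    }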

    As @nate mentions, it is not a magic bullet; it simply means the developer can now code their own SLI, or put differently, they sort of have to ;) That's the general gist of DX12/Vulkan: the APIs are lower-level and expose a bit more than before, but that also means the developers have to do more of the work that was traditionally done in the driver. This is less of an issue for those who already have engines targeting lots of APIs (consoles...), but it will take a bit more effort for the rest (I am working on Vulkan at work myself).

    As for SLI support, it is indeed a mix of driver and developer work. For the top titles, the IHVs will work closely with the developers to find the best driver-internal settings/data flows. There is a set of strategies to choose from... Most of this is done by the hardware companies themselves. Essentially, most SLI titles use what is called AFR, alternate frame rendering: every other frame is rendered on a different GPU. This improves your performance if the CPU time to process a frame is <= half of the GPU time. More details: http://developer.download.nvidia.com/whitepapers/2011/SLI_Best_Practices_2011_Feb.pdf

    Code:
    one gpu renders 4 frames:
    |...|...|...|...|
    two gpus render 4 frames, alternating:
    |...|...|
      |...|...|
    
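    As a back-of-the-envelope model of why that CPU condition matters (my own illustration, not from the whitepaper): with AFR the GPUs overlap across frames, but the CPU still has to prepare every frame serially, so a new frame can start roughly every max(cpuTime, gpuTime / gpuCount) milliseconds.

    Code:
    // Toy model of AFR scaling (illustration only).
    #include <algorithm>
    #include <cstdio>

    double AfrFrameTimeMs(double cpuTimeMs, double gpuTimeMs, int gpuCount)
    {
        // GPUs overlap across frames; the CPU feeds them one frame at a time.
        return std::max(cpuTimeMs, gpuTimeMs / gpuCount);
    }

    int main()
    {
        // GPU needs 20 ms per frame (50 fps on a single card).
        // CPU at 8 ms (<= half the GPU time): 10 ms/frame -> 100 fps, full 2x scaling.
        std::printf("2 GPUs, 8 ms CPU:  %.0f ms/frame\n", AfrFrameTimeMs(8.0, 20.0, 2));
        // CPU at 14 ms (> half the GPU time): 14 ms/frame -> ~71 fps, full 2x scaling lost.
        std::printf("2 GPUs, 14 ms CPU: %.0f ms/frame\n", AfrFrameTimeMs(14.0, 20.0, 2));
        return 0;
    }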
    Now this sounds kind of straightforward and easy, but some things make it not so easy. For example, the driver has to find out which resources need to be synchronized between the two GPUs (which would suck if one has to wait for the other), and it may have to send more data from CPU to GPU because it needs to make sure both GPUs have all the data. Maybe some data is only used in a single frame, while other data is used across many frames. Stuff like that. And because in current DX11 there is no way the developer can really tell the driver what to do, it's always a mix of the developer doing things in a way that maps well to driver strategies, or the hardware driver guys looking at the command stream and trying to find a good way to classify dependencies...

    And that's where the benefit of DX12 kicks in: by doing it himself, the developer can express what he wants to do; he knows what data is needed on either GPU. He can also use the GPUs in different ways, each doing a different type of work... That flexibility ideally improves the situation, IF the developer puts the time into it. Right now it's mostly driver people doing that work, which is why you see official support only on select titles. A toy sketch of that classification follows below.
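
    (Purely hypothetical types, not the DX12 API; the point is just to show the kind of knowledge the developer has up front and a DX11 driver has to guess:)

    Code:
    // Toy illustration: the developer knows which bucket each resource
    // falls into; a DX11 driver has to infer this from the command stream.
    #include <string>
    #include <vector>

    enum class Residency {
        PerGpuCopy,  // static data (meshes, textures): upload once to every GPU
        PerFrame,    // transient targets (e.g. shadow maps): live on the GPU owning the frame
        Shared       // written by one GPU, read by the other: needs sync, the expensive case
    };

    struct Resource { std::string name; Residency residency; };

    void DistributeToGpus(const std::vector<Resource>& resources, int gpuCount)
    {
        for (const Resource& r : resources) {
            switch (r.residency) {
            case Residency::PerGpuCopy: /* broadcast once to all gpuCount GPUs */ break;
            case Residency::PerFrame:   /* allocate per frame, no cross-GPU copy */ break;
            case Residency::Shared:     /* schedule a cross-GPU transfer + fence */ break;
            }
        }
    }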
     
    • Informative x 3
  17. FunkyChicken

    FunkyChicken Well-Known Member

    Joined:
    Jan 30, 2015
    Ratings:
    +50 / 0 / -0
    A good summary, Pixel!
    What you pointed out about DX12 development is exactly what I tell people who are hyping the performance increase of the new API! Without effort from the devs, we will not see much improvement. Unfortunately, there are so many games on the market that are not at all well developed and optimized, even for the existing APIs. So it might be even worse in the future. And it will be a challenge for the smaller studios with less budget ...

    Let's cross fingers ;)

    Coming back to R3E, it would be nice to know if anything is planned. We can already force SLI modes, but these create massive artifacts, in particular on the menus and loading screens. Maybe the new web-based menu is the reason. Once on the track, there are only some artifacts on the position banner. On the other hand, some SLI bits force both cards to full load, but the overall framerate drops compared to a single card. Other bits give some performance increase while not utilizing both cards to the max, and they create further artifacts (e.g. on the shadows) or cause huge variations in fps.
    I'm not a programmer, so I don't know how complicated it is to tweak the engine and render path...
     
    Last edited: Jun 23, 2015