DirectX APIs, feature levels, hardware support

Discussion in 'Hardware' started by pixeljetstream, Aug 11, 2015.

  1. pixeljetstream

    pixeljetstream Well-Known Member Beta tester

    Jan 29, 2015
    Seeing some misconceptions about DirectX hardware and software in the feature request thread, I wanted to move the topic to a dedicated thread.

    On desktop PCs, hardware supports various feature sets; Microsoft standardized some of those and called them "feature levels".

    Different hardware supports different feature levels; most people nowadays are on 11_0 or higher, as can be seen on Valve's hardware survey website.

    For NVIDIA, a DirectX 11_0 feature level class GPU is Fermi (GeForce GT 4xx) and above, available since early 2010. For AMD it's the Radeon HD 5xxx series, available since late 2009.
    Wikipedia has a nice overview of all NVIDIA chips and AMD/ATI chips.

    An application/game developer uses the DirectX runtime to talk to the GPU driver. The driver implements a low-level API designed by Microsoft called WDDM (Windows Display Driver Model). Windows Vista introduced WDDM 1 and Windows 10 introduced WDDM 2; you need the latter to use DirectX 12. However, WDDM 1.x is sufficient to run Windows 10, hence you will see a lot of old hardware (DirectX feature level 10) still supporting Windows 10. That doesn't mean it supports DirectX 12, though. WDDM 2 maps better to modern hardware, so if you have a WDDM 2 driver, even older titles not using DirectX 12 should benefit.

    The DirectX runtime is a "convenience" layer between the application programmer and the hardware driver as well as operating system internals, and it is written by Microsoft. This is a bit different from OpenGL or the new Vulkan, whose implementations are written by the hardware vendors themselves. The latter approach allows hardware vendors to ship features independent of what Microsoft defines as standard.

    Prior to DirectX 10, one pretty much used the version of the DirectX runtime that matched the hardware: on DirectX 9 (or better) hardware, one used DirectX 9. With DirectX 10, Microsoft allowed using the same API version (what the game developer codes against) with different hardware feature levels. This can be confusing, because a driver for certain hardware may advertise DirectX 11 support while the actual hardware only provides feature level 10_0.
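    The negotiation between the API version and the hardware feature level can be sketched in code. The enum values below match the documented D3D_FEATURE_LEVEL constants, but the enum and helper are an illustration of the runtime's documented behavior only, not the real D3D11CreateDevice call: the application passes the levels it accepts in descending order and gets back the first one the hardware can provide.

```cpp
#include <cstdint>
#include <vector>

// Stand-in values matching the documented D3D_FEATURE_LEVEL constants.
// This enum is illustration only, not the real DirectX header.
enum FeatureLevel : std::uint32_t {
    FL_9_1  = 0x9100,
    FL_9_3  = 0x9300,
    FL_10_0 = 0xa000,
    FL_11_0 = 0xb000,
    FL_11_1 = 0xb100,
};

// The application lists acceptable levels in descending order; the runtime
// returns the first one the installed hardware/driver can provide. Feature
// levels are strict supersets, so hardware at level X supports everything
// at or below X. Returns 0 if none match (device creation would fail).
std::uint32_t negotiateFeatureLevel(const std::vector<FeatureLevel>& requested,
                                    FeatureLevel hardwareMax) {
    for (FeatureLevel fl : requested)
        if (fl <= hardwareMax)
            return fl;
    return 0;
}
```

    For example, a game written against the DirectX 11 API asking for {11_0, 10_0, 9_1} on feature-level-10_0 hardware gets a 10_0 device and must simply skip the 11-only features.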

    But the great thing is that you could program a game using the DirectX 10 API and still run it on DirectX 9 hardware; you only have to live with certain restrictions, like not being able to use certain shaders or texture formats. The API changed quite a bit from DirectX 9 to DirectX 10, and DirectX 10 hardware, while nice, wasn't that big of a leap, so most developers didn't bother with DirectX 10 and switched to DirectX 11 instead.

    You can use DirectX 11 on hardware from feature level 9_1 and higher. For old hardware it may not be as fast as using the DirectX 9 runtime directly, but it works.
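    To give a concrete idea of what those per-level restrictions look like, here is a small sketch (a helper written for this thread, not part of the DirectX API) of one documented limit: the guaranteed maximum Texture2D dimension at each feature level.

```cpp
#include <cstdint>

// Guaranteed maximum Texture2D width/height per feature level, following
// the documented D3D_FL*_REQ_TEXTURE2D_U_OR_V_DIMENSION limits.
// The numeric thresholds are the documented D3D_FEATURE_LEVEL values.
std::uint32_t maxTexture2DDimension(std::uint32_t featureLevel) {
    if (featureLevel >= 0xb000) return 16384; // 11_0 and up
    if (featureLevel >= 0xa000) return 8192;  // 10_0, 10_1
    if (featureLevel >= 0x9300) return 4096;  // 9_3
    return 2048;                              // 9_1, 9_2
}
```

    So a title running on a 9_1 device has to keep its textures at 2048x2048 or below, while an 11_0 device guarantees 16384x16384.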

    DirectX 12 requires hardware at feature level 11_0 or higher.

    Microsoft provides a "reference" software implementation (ungodly slow) for hardware vendors and application developers to debug and compare their images and results against. They also provide another software device, WARP, that is tuned for speed and allows running DirectX 10-12 applications on the CPU.

    As mentioned before, the DirectX runtime is a convenience layer; without it, application developers would have to code a lot more vendor/GPU-specific things. However, such a layer comes at a certain performance price. The recent cross-vendor graphics APIs (DirectX 12, Vulkan and Metal) have less of that convenience and map better to recent GPU designs, so there is more low-level access to rendering and compute on the hardware. "With great power comes great responsibility": it is not a matter of plugging DirectX 12 in and everything is fast ;) The whole rendering architecture has to be designed to leverage the new way of doing things and make good use of it.
    Some companies have more experience with this because they also do console development, which traditionally had the lowest-level access possible, since the whole operating system and hardware were fixed. So it will take a while until we see the DirectX 12/Vulkan benefits in the wild.
  2. GooseCreature

    GooseCreature Well-Known Member

    May 30, 2015
    If everything goes as mind-numbingly slow as usual, DirectX 14 will be out before any developer has exploited DirectX 12 to its limits. The video card industry has moved on much further than software development (sims, games), though at least the movie industry's CGI teams utilize some of the features available to them. That said, I'm pretty sure Microsoft is guilty of dragging its heels with regard to opening up code to developers. The latest NVIDIA/AMD cards have a huge array of power and features that will take software developers some time to catch up with. All this being said, I still think S3 have it right (to a point): get the physics and core of the sim right, then lay on the eye candy!!
    I also still believe that anyone with more than a passing interest in driving sims knows full well that money has to be spent on hardware, and that complaining they can't play on their five-year-old laptop is not a reason to hold back development for the majority.