DirectX 9 CPU Benchmark Thread

Discussion in 'General Discussion' started by Thomas Jansen, Jul 22, 2019.

  1. Domingo

    Domingo New Member

    Joined:
    Jul 13, 2015
    Ratings:
    +4 / 0 / -0
    My test:
    • CPU: Intel i5 8600K
    • CPU OC: 4.9 GHz all core
    • Memory: 8GB DDR4-2400 MHz Dual
    • Memory timings: CL13-15-15-34
    • GPU: Nvidia Geforce GTX 1070

    FPS: 349.8
    52°C. I didn't expect these results with this RAM. Still room for OC on the CPU and RAM, though. It's a daily-use PC and I don't want to push it too hard.

    edit: (done in windowed mode)
     

    Attached Files:

    • Like x 1
    Last edited: Mar 28, 2020
  2. mmmmbeer

    mmmmbeer Member

    Joined:
    Jun 21, 2015
    Ratings:
    +6 / 0 / -0
    Just stumbled across this thread in my vain search for more VR performance from this game. So here's my setup and score
    • CPU: Intel i7 6700K
    • CPU OC: 4.6 two cores, 4.4 two cores
    • Memory: 16GB DDR4-2400 MHz Dual
    • Memory timings: CL15-15-15-35
    • GPU: Nvidia Geforce RTX 2080ti
    Max FPS was 474.4, although Unigine seems to be reporting my clock speed as 4.0 GHz; it should be 4.4 minimum and 4.6 max.

    Unigine.JPG mem.JPG Cpu.JPG
     
    • Like x 1
  3. Larry Foster

    Larry Foster Member

    Joined:
    Apr 14, 2019
    Ratings:
    +13 / 0 / -0
    @mmmmbeer Oh my, that version of Windows?!
    Also, I think the benchmark reports base clocks.
    Have you used the XMP profile for your RAM?
     
  4. Balrog

    Balrog Well-Known Member

    Joined:
    Apr 10, 2015
    Ratings:
    +466 / 0 / -0
    The benchmark reports the wrong Windows version for everyone, judging by the posted pictures. But believe it or not, modern Windows editions are still based on NT to some extent; NT 6.2 is Windows 8, for example.
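    For anyone curious, here is the NT-to-product-name mapping as a quick lookup (a partial list, not exhaustive):

    ```python
    # Partial mapping of Windows NT kernel versions to product names.
    nt_versions = {
        "6.1": "Windows 7",
        "6.2": "Windows 8",
        "6.3": "Windows 8.1",
        "10.0": "Windows 10",
    }
    print(nt_versions["6.2"])  # Windows 8
    ```

    So a benchmark that prints the raw NT version string can look "wrong" even when it is technically accurate.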
     
  5. mmmmbeer

    mmmmbeer Member

    Joined:
    Jun 21, 2015
    Ratings:
    +6 / 0 / -0
    Yes it does. I'm running the latest version of Windows 10. XMP is enabled in the BIOS and an OC is applied to the CPU.
     
  6. Larry Foster

    Larry Foster Member

    Joined:
    Apr 14, 2019
    Ratings:
    +13 / 0 / -0
    @mmmmbeer Your screenshot of CPU-Z on the Memory tab is showing only 1200 MHz, and that doubled is only 2400 MHz. Wouldn't an XMP profile show a higher clock speed? Screenshot_20200421-075446_Firefox.jpg
     
  7. mmmmbeer

    mmmmbeer Member

    Joined:
    Jun 21, 2015
    Ratings:
    +6 / 0 / -0
    I'm not following you. Each stick is 2400 MHz RAM. DDR stands for double data rate, so the effective rate, which is what CPU-Z shows you, is doubled.
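    To put numbers on it (using the 1200 MHz from the screenshot as an illustrative value):

    ```python
    # DDR transfers data on both the rising and falling clock edge,
    # so the effective ("DDR") rate is twice the real memory clock
    # that CPU-Z's Memory tab reports.
    real_clock_mhz = 1200            # what CPU-Z shows
    effective_rate = 2 * real_clock_mhz
    print(effective_rate)            # 2400, i.e. DDR4-2400
    ```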
     
  8. Larry Foster

    Larry Foster Member

    Joined:
    Apr 14, 2019
    Ratings:
    +13 / 0 / -0
    @mmmmbeer
    Yes, the sticker says 2400 MHz, but it can run faster if the XMP profile is turned on in your BIOS.
    XMP is the RAM being overclocked, if it can be. Yours might go as high as 3000 MHz.
     
    Last edited: Jul 17, 2020
  9. mmmmbeer

    mmmmbeer Member

    Joined:
    Jun 21, 2015
    Ratings:
    +6 / 0 / -0
    Only one XMP profile is available and it's selected; it runs the RAM at 2400 MHz. Without it, it runs slower. I could manually tweak it, I suppose, but I've never tried. From the table on the first page, it's clear that if I get a better CPU first I'll get some extra performance. Have the upgrade itch anyway. Waiting for Zen 4000, though!
     
    Last edited: Apr 23, 2020
  10. SamuTnT

    SamuTnT New Member

    Joined:
    Apr 17, 2020
    Ratings:
    +4 / 0 / -0
    Hi all, I'm with the others struggling with VR performance. I use an HP Reverb and it's a pain to get a stable 90 FPS; even with low settings, some tracks simply cannot hold 90. Anyway, here are my specs and my results:
    • CPU: Intel i7 6700K
    • CPU OC: 4.6 GHz on 4 cores
    • Memory: 16GB DDR4-3000 MHz Dual
    • Memory timings: CL15-16-16-35
    • GPU: Nvidia Geforce RTX 2080 Super
    Annotazione 2020-05-29 101016.JPG Annotazione 2020-05-29 101336.JPG Annotazione 2020-05-31 125642.JPG

    EDIT: I've noticed a huge bump in score after disabling Nvidia's ShadowPlay! With that on I wasn't able to hit a 7000+ score.
     
    • Like x 1
  11. Thxave

    Thxave Member

    Joined:
    Nov 13, 2019
    Ratings:
    +22 / 0 / -0
    Finally got around to this..
    • CPU: Ryzen 5 2600
    • CPU OC: stock
    • Memory: 16GB DDR4-3200MHz
    • Memory timings: CL16-18-18-36-2T
    • GPU: Radeon RX570
     

    Attached Files:

    • Like x 1
  12. fl0wf1r3

    fl0wf1r3 Well-Known Member

    Joined:
    Apr 24, 2016
    Ratings:
    +235 / 0 / -0
    here we go :)
     

    Attached Files:

    • Like x 1
  13. SamuTnT

    SamuTnT New Member

    Joined:
    Apr 17, 2020
    Ratings:
    +4 / 0 / -0
    Finally got the new i7 10700K, and these are the results:

    • CPU: Intel i7 10700K
    • CPU OC: 5.0 GHz on all cores
    • Memory: 16GB DDR4-3000 MHz Dual
    • Memory timings: CL15-16-16-35
    • GPU: Nvidia Geforce RTX 2080 Super
    Annotazione 2020-06-09 223400.JPG Annotazione 2020-06-09 223334.JPG Annotazione 2020-06-09 223346.JPG

    Quite impressed by the result; gonna check how much it boosts the FPS in VR soon!
     
    • Like x 2
  14. Maarten

    Maarten Member

    Joined:
    Apr 15, 2019
    Ratings:
    +23 / 0 / -0
    I might have found a way to improve performance. As we know, R3E is heavily dependent on the CPU, so a higher clock means a direct improvement. To get the highest stable/cool frequency possible you need to set a large AVX offset, but you don't want the AVX extensions to be used at all, because that drops the frequency of the CPU. To disable AVX in Windows you run "bcdedit /set xsavedisable 1". After a reboot you will see that your frequency no longer drops. There is one problem, though: RaceRoom does not start with AVX off. I think this is because it is compiled with AVX enabled. I don't think RaceRoom actually uses AVX, so it would be very nice to test a build compiled without AVX support. Other games/benchmarks run fine without AVX enabled.

    [edit: RaceRoom itself does not use AVX; the VR DLL did, so you can run RaceRoom without AVX]
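    For reference, a sketch of the commands involved, run from an elevated Command Prompt (the `xsavedisable` boot option turns off XSAVE processor-state saving, which disables AVX for user-mode code; both changes need a reboot to take effect):

    ```shell
    REM Disable XSAVE (and with it AVX) at the OS level
    bcdedit /set xsavedisable 1

    REM Revert to the default (AVX re-enabled after reboot)
    bcdedit /deletevalue xsavedisable
    ```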
     
    Last edited: Jul 10, 2020
  15. fl0wf1r3

    fl0wf1r3 Well-Known Member

    Joined:
    Apr 24, 2016
    Ratings:
    +235 / 0 / -0
    or get a new graphics engine, seriously
     
    • Agree x 1
  16. Maarten

    Maarten Member

    Joined:
    Apr 15, 2019
    Ratings:
    +23 / 0 / -0
    @fl0wf1r3 100% agree, but I don't see that happening any time soon. My suggestion might help with the current situation, and it's just a compile parameter.
     
    • Like x 1
  17. fl0wf1r3

    fl0wf1r3 Well-Known Member

    Joined:
    Apr 24, 2016
    Ratings:
    +235 / 0 / -0
    unfortunately you are right
     
  18. mikeymr

    mikeymr New Member

    Joined:
    May 3, 2020
    Ratings:
    +2 / 0 / -0
    • CPU: AMD Ryzen 9 3900X
    • CPU OC: Stock
    • Memory: 16GB DDR4-3600 MHz Dual
    • Memory timings: CL16-16-16-36
    • GPU: Nvidia Geforce RTX 2080 Ti
    Heaven Benchmark 3900X and 2080ti.PNG

    CPU Z Memory 3900x and 2080ti.PNG
     
    • Like x 1
  19. Maarten

    Maarten Member

    Joined:
    Apr 15, 2019
    Ratings:
    +23 / 0 / -0
    This is my current setup.
    • Windows 10, 2004
    • CPU: Intel Core i5 9600K
    • CPU OC: 5.1 GHz
    • Memory: 16GB DDR4-3000 MHz Dual
    • Memory timings: CL15-17-17-35
    • GPU: Nvidia Geforce GTX 1060
    Disabled AVX in the OS with: "bcdedit /set xsavedisable 1"
    AVX offset in the BIOS set to 20, but that doesn't do anything if you disable it in Windows.

    heaven.png
    cpuz-cpu.png
    cpuz-memory.png
     
    • Like x 1
  20. fl0wf1r3

    fl0wf1r3 Well-Known Member

    Joined:
    Apr 24, 2016
    Ratings:
    +235 / 0 / -0
    what is this AVX doing?