DirectX 9 CPU Benchmark Thread

Discussion in 'General Discussion' started by Thomas Jansen, Jul 22, 2019.

  1. Balrog

    Balrog Well-Known Member

    Joined:
    Apr 10, 2015
    Ratings:
    +390 / 0 / -0
    Man, these Ryzens can behave so weirdly. I discovered I had a faulty memory module, sent the kit back, and had to buy a replacement to be able to use my PC in the meantime.

    And I was just experimenting with the C14 kit: with PBO enabled I get roughly the same performance as before. But if I also enable AutoOC with +75 MHz:
    - I get slightly higher scores in Cinebench
    - I get slightly lower scores in the CPU-Z bench
    - I lose ~12% performance in the frickin' Heaven Benchmark and score ~8000, what the heck (and I was able to reproduce it three times in a row)? :D
     
  2. Sebastien Brunier

    Sebastien Brunier Active Member Beta tester

    Joined:
    Dec 17, 2017
    Ratings:
    +38 / 0 / -0
    Ho ho ho, it took me a bit of tinkering but... I got it!
    [attached: three result screenshots]
     
  3. Sebastien Brunier

    Sebastien Brunier Active Member Beta tester

    Joined:
    Dec 17, 2017
    Ratings:
    +38 / 0 / -0
    PBO is a bit hard on the CPU: it brings a lot of heat and the clocks are less stable, so in this DX9 benchmark, which relies exclusively on CPU/RAM stability and speed, you get penalized more.
    My score was done with AutoOC +50, stock PBO limits, and the Curve Optimizer undervolt dialed in.
     
  4. Nir Tal

    Nir Tal Well-Known Member

    Joined:
    Aug 29, 2016
    Ratings:
    +48 / 0 / -0
    Tried the benchmark yesterday with my new system and it ran for ~20 min but didn't finish, so I quit it.
    How long is it supposed to run before it gives a benchmark result?
     
  5. Thomas Jansen

    Thomas Jansen Sector3 Developer Beta tester

    Joined:
    Apr 5, 2018
    Ratings:
    +434 / 0 / -0
    you have to actually start the benchmark with the button in the top left; most likely you just left it running in the default looping mode, which is just for stress testing
     
  6. Nir Tal

    Nir Tal Well-Known Member

    Joined:
    Aug 29, 2016
    Ratings:
    +48 / 0 / -0
    Strangely, I don't have a button in the top left... will try downloading it again
     
  7. Wheely

    Wheely New Member

    Joined:
    Jul 9, 2015
    Ratings:
    +0 / 0 / -0
    These are my results

    • CPU: Intel i9-9900K
    • CPU OC: 4.9 GHz
    • Memory: 2x8GB DDR4-4000 MHz
    • Memory timings: CL16-16-16-36-2T
    • GPU: Nvidia GeForce RTX 2080 Ti
    [attached screenshots: benchmark, memory, mainboard, CPU]
     
  8. Nir Tal

    Nir Tal Well-Known Member

    Joined:
    Aug 29, 2016
    Ratings:
    +48 / 0 / -0
    • CPU: AMD Ryzen 5 5600X
    • CPU OC: Stock
    • Memory: 16GB DDR4-3600 MHz
    • Memory timings: CL18
    • GPU: Nvidia Geforce GTX 1060
     

    Attached Files:

  9. mmmmbeer

    mmmmbeer Member

    Joined:
    Jun 21, 2015
    Ratings:
    +6 / 0 / -0
    • CPU: AMD Ryzen 9 5900X
    • CPU OC: Stock
    • Memory: 32GB DDR4-3600 MHz
    • Memory timings: CL18
    • GPU: Nvidia GeForce RTX 2080 Ti
    Just got my new 5900X; everything is at stock bar the XMP profile enabled on the RAM. A lot more playing around to go eventually, but here are the first runs of the normal benchmark and the newer one we've been running. Interestingly, the CPU never went near its advertised single-core boost speed of 4.8 GHz. It only ever hit 4.6 very briefly and spent most of its time between 4.0 and 4.4. Ryzen Master says it's capable of 4.9 GHz.

    [attached: Unigine.jpg, CPU.jpg, Mem.jpg, Capx.jpg]
     
  10. Thomas Jansen

    Thomas Jansen Sector3 Developer Beta tester

    Joined:
    Apr 5, 2018
    Ratings:
    +434 / 0 / -0
    That sounds odd about the boost frequency; basically every chip I've seen goes well past the advertised boost speed, usually about 0.15 GHz higher (4.85 on my 5800X, for example). Is PBO off, maybe?
     
  11. mmmmbeer

    mmmmbeer Member

    Joined:
    Jun 21, 2015
    Ratings:
    +6 / 0 / -0
    Yeah, it was on. I've had it up to 4.98 GHz in the Cinebench single-core benchmark, but Unigine Heaven won't go past 4.6 at all after several attempts and different settings. I've been playing around in the BIOS for days, testing all sorts of tweaks, but it seems I've got a shitty sample. Can't get past PBO +75 MHz without it becoming very unstable, and I can't use the Curve Optimizer either as it's too unstable. Even +75 by itself is hit and miss: it will work for hours then just BSOD at random.

    Going to try a manual OC and see how I get on. It has gone to 4.6 all-core by itself in some workloads, so it may not be all bad; it just isn't taking the crown in this thread :grinning:
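    If you want hard numbers on where the chip actually sits during a run, here's a quick pure-Python sketch for summarizing a logged clock trace (e.g. per-second samples exported from a monitoring tool like HWiNFO; the sample values below are made up for illustration):

```python
import statistics

def summarize(samples_mhz):
    """Min/mean/p95/max of a list of per-sample core clocks in MHz."""
    ordered = sorted(samples_mhz)
    return {
        "min": ordered[0],
        "mean": round(statistics.mean(ordered), 1),
        "p95": ordered[int(0.95 * (len(ordered) - 1))],
        "max": ordered[-1],
    }

# A run that mostly sat between 4.0 and 4.4 GHz with one brief 4.6 spike:
print(summarize([4050, 4200, 4400, 4350, 4600, 4100, 4250]))
# → {'min': 4050, 'mean': 4278.6, 'p95': 4400, 'max': 4600}
```

    The p95 line is the useful one here: a single 4.6 spike barely moves it, which matches the "only hit 4.6 very briefly" behavior described above.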
     
    Last edited: Jan 1, 2021
  12. nolive721

    nolive721 Active Member

    Joined:
    Dec 2, 2018
    Ratings:
    +37 / 0 / -0
    I haven't run the benchmark yet, but will do soon with my 6-core Ryzen.

    Having said that, I am quite impressed by the number of people with 8-core and above CPUs playing RRE.

    Is the game actually taking advantage of multithreading, or is it pure single-core IPC that makes the performance difference in the game?
     
  13. Ablaze

    Ablaze Well-Known Member

    Joined:
    Mar 16, 2018
    Ratings:
    +102 / 0 / -0
    The curve optimizer can give a nice performance boost. From 397.5 to 413.8 fps:
    • CPU: AMD Ryzen 7 5800X
    • CPU OC: Curve Optimizer set to -10 on all cores, rest is stock
    • Memory: 64GB DDR4-3200 MHz Dual
    • Memory timings: CL16-18-18-38
    • GPU: Nvidia Geforce RTX 3090
    [attached: two result screenshots]


    I can only speak for myself, but I don't build a computer to play just one game on it. :) There are a lot of games out there that benefit from more CPU cores, and those cores accelerate multitasking, streaming, video editing and rendering too.

    It's the single core IPC that makes the difference here.
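    To put the IPC point in rough numbers, here's an Amdahl's-law sketch. The 20% parallel fraction is purely an illustrative assumption, not a measurement of R3E:

```python
def speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of the work scales with cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# If only ~20% of the frame loop ran in parallel, core count barely matters:
for cores in (2, 4, 8, 16):
    print(f"{cores:2d} cores -> {speedup(0.2, cores):.2f}x")
```

    Under that assumption even 16 cores only give about 1.23x, which is why single-core IPC and memory speed dominate a benchmark like this one.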
     
    Last edited: Jan 3, 2021
  14. nolive721

    nolive721 Active Member

    Joined:
    Dec 2, 2018
    Ratings:
    +37 / 0 / -0
    You misunderstood me. I was impressed that PC hardware enthusiasts are also sim racers playing RRE, that's all.

    Anyway, here are my results and actual CPU/RAM specs.

    [attached: three screenshots]
     
  15. mmmmbeer

    mmmmbeer Member

    Joined:
    Jun 21, 2015
    Ratings:
    +6 / 0 / -0
    Turns out this was rubbish. It seems to be a widespread problem that the current AGESA versions don't like high FCLK settings; anything above 1600 seems to be problematic with the 5000 series. Mine was at 1800 with my 3600 RAM.

    I was never even able to boot with PBO +200 MHz in the BIOS, but I reduced FCLK to 1600 and it worked straight away! More testing to do now to see what this thing is capable of.
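    For anyone following along, the 3600/1800 relationship is just DDR arithmetic: the DDR4 rating is a transfer rate in MT/s, the actual memory clock is half of it, and Ryzen runs best with FCLK matching that 1:1. A quick sketch:

```python
def mclk_for(ddr_rating_mts):
    """DDR is double data rate, so MCLK (MHz) is half the MT/s rating."""
    return ddr_rating_mts // 2

for kit in (3200, 3600, 4000):
    mclk = mclk_for(kit)
    flag = "above" if mclk > 1600 else "at/below"
    print(f"DDR4-{kit}: 1:1 FCLK = {mclk} MHz ({flag} the ~1600 MHz trouble spot)")
```

    So a DDR4-3600 kit wants FCLK 1800, right in the range the post above found problematic on current AGESA, while DDR4-3200 sits exactly at the 1600 limit.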
     
  16. Schmelge

    Schmelge Member Beta tester

    Joined:
    Jun 10, 2020
    Ratings:
    +12 / 0 / -0
    Naah, I'm not so sure about that :p I switched out my 980 Ti for a 6800 XT and now the game is unplayable :p Looking at the forums, there's at least one other guy with the same setup (5900X CPU and 6800 XT GPU) who has the same problem. I have no issues whatsoever with other games, for instance DiRT Rally 2.0, where I went from 100 SS with everything low/off on the 980 Ti to all ultra/on at 150 SS with the 6800 XT.

    I'm not sure what's going on, but it feels like something weird is up with RR on at least the AMD side; I'm not sure about Nvidia though.
     
  17. AlleyViper

    AlleyViper Active Member

    Joined:
    Dec 10, 2016
    Ratings:
    +30 / 0 / -0
    IIRC there have been issues in the past with RDNA cards like the 5700 XT in DX9 games, where GPU load is so low that they drop clocks too far to maintain proper performance. I presume your card could be suffering from something similar, being RDNA2, so try monitoring it to check whether the VRAM is also running slower, etc.

    In that scenario, you could try to force more GPU load by maximizing all settings in-game, removing any sort of frame limiting, and, in the Radeon control panel, forcing the highest supersampling AA you can, or even a combination of that plus VSR.
    More reliable workarounds could revolve around temporarily forcing high clocks on the lower power states. If it comes to that, you could create a thread on the Guru3D forum about it. Sometimes it also helps to report directly to AMD with their bug submission form (I've had good results a few times).

    On the ACC forums there was a user having trouble with a new AMD 5000-series system (heavy stuttering in-game, link) because installing AMD Ryzen Master forced the system to use HPET all the time (as that software requires it). You might check whether it's hurting you in R3E too, if the Windows 10 default behavior was changed. GL
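    On that HPET point: if something like Ryzen Master has forced HPET on, Windows stores it as a boot flag, which you can inspect with the stock `bcdedit` tool. A sketch (run from an elevated command prompt; reboot after changing anything):

```shell
:: List the current boot entry; if "useplatformclock  Yes" appears,
:: HPET has been forced on system-wide.
bcdedit /enum {current}

:: Remove the override to restore the default Windows timer behavior:
bcdedit /deletevalue useplatformclock
```

    Note this only removes the override; it doesn't disable HPET in the BIOS, and the default (no `useplatformclock` entry at all) is what most systems ship with.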
     
    Last edited: Jan 16, 2021
  18. sgauge

    sgauge New Member

    Joined:
    Jan 18, 2021
    Ratings:
    +1 / 0 / -0
    Hi there, here's mine :
    • CPU: AMD Ryzen 9 3900x
    • CPU OC: 4.4 GHz
    • Memory: 16GB DDR4-3733 MHz Dual
    • Memory timings: CL16-18-18-38
    • GPU: Nvidia GeForce RTX 2070 SUPER
    [attached: two result screenshots]
     
  19. BeefMcQueen

    BeefMcQueen Well-Known Member Beta tester

    Joined:
    Jun 26, 2018
    Ratings:
    +150 / 0 / -0
    • CPU: Intel i7 10700
    • CPU OC: Stock
    • Memory: 32GB DDR4-2933 MHz Dual
    • Memory timings: CL21-21-21-47
    • GPU: Nvidia Geforce RTX 3070
    [attached: three result screenshots]

    Oli
     


  20. sgauge

    sgauge New Member

    Joined:
    Jan 18, 2021
    Ratings:
    +1 / 0 / -0
    @BeefMcQueen: Watch out, you ran your test at 640x480 resolution instead of 640x360.

    Also, if I may suggest, review your memory timings; they are way too high for 2933 MHz, and your system should perform quite a bit better. Did you enable an XMP profile in the BIOS?
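    To put numbers on why those timings matter: first-word latency in nanoseconds works out to `2000 * CL / transfer_rate`, since CL is counted in memory-clock cycles and the memory clock is half the MT/s rating. A quick comparison of the CL21 kit above against a typical CL16 profile:

```python
def cas_ns(cl, rate_mts):
    """First-word latency in ns: CL cycles at half the DDR transfer rate."""
    return 2000.0 * cl / rate_mts

print(f"CL21 @ 2933 MT/s: {cas_ns(21, 2933):.2f} ns")  # ~14.32 ns
print(f"CL16 @ 3200 MT/s: {cas_ns(16, 3200):.2f} ns")  # 10.00 ns
```

    That's over 40% more first-word latency, which is exactly the kind of gap a CPU/RAM-bound DX9 benchmark will show.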

     