Massive FPS gains in rF2 using PCI-e 3.0 x16 with higher-end cards!

Discussion in 'General Discussion' started by DrR1pper, Sep 30, 2014.

  1. TechAde

    TechAde Registered

    Joined:
    Oct 13, 2010
    Messages:
    606
    Likes Received:
    38
    That's easy to explain - you have an AMD GPU, only NVIDIA GPUs need the patch.

    Sent from my SM-G900F using Tapatalk
     
  2. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    780 Ti and X79 here as well. Everything perfect. In fact I run 2 780 Tis + a PCI-E soundcard and both my GPUs still run at full PCI-E 3.0 @ x16 (the 40 PCI-E 3.0 lanes, as opposed to 16, are the main reason I went with LGA 2011 over the "mainstream" sockets like LGA 1150 (Haswell) and LGA 1155 (Ivy Bridge and Sandy Bridge)).

    X79 is 100% PCI-E 3.0 compliant "out of the box", and so is the Ivy Bridge-E line of CPUs (4960X, 4930K, 4820K). Sandy Bridge-E (3960X, 3930K, 3820K) CPUs, on the other hand, need the registry fix to get PCI-E 3.0 working.
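    For reference, here's a rough sketch of the theoretical one-way bandwidth behind those generation/lane numbers (published spec figures, not measurements; the little helper below is purely illustrative):

    [CODE]
    # Theoretical per-direction PCI-e bandwidth (published spec figures).
    # PCI-e 1.1/2.0 use 8b/10b encoding; 3.0 uses 128b/130b, which is why
    # usable bandwidth nearly doubles even though the line rate only rises 60%.
    GT_PER_LANE = {"1.1": 2.5, "2.0": 5.0, "3.0": 8.0}   # gigatransfers/s
    ENCODING    = {"1.1": 8 / 10, "2.0": 8 / 10, "3.0": 128 / 130}

    def bandwidth_gbps(gen: str, lanes: int) -> float:
        """One-way bandwidth in GB/s for a given generation and link width."""
        return GT_PER_LANE[gen] * ENCODING[gen] * lanes / 8

    for gen in ("2.0", "3.0"):
        print(f"PCI-e {gen} x16: {bandwidth_gbps(gen, 16):.2f} GB/s")
    # PCI-e 2.0 x16 = 8.00 GB/s, PCI-e 3.0 x16 = 15.75 GB/s
    [/CODE]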
     
    Last edited by a moderator: Oct 1, 2014
  3. Coanda

    Coanda Registered

    Joined:
    Jun 9, 2013
    Messages:
    689
    Likes Received:
    3
    You're right mate. 3930K for me was the issue. My memory sometimes :confused:
     
  4. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    Well, something immediately sounds off if your motherboard was genuinely running in PCI-e 2.0 x16 mode before, and then you switched to PCI-e 3.0 x16 and didn't notice any change at all (though of course it could be true, because you maybe use lower quality settings than the live benchmark method does, which could make the PCI-e difference less significant to performance; we'd really need a test to confirm whether that's the case).

    Secondly, whilst the results from any benchmark are indeed synthetic if they are not the settings you would otherwise use when playing the game, a benchmark does allow you to check that everything in your system is working as well as it should (i.e. that there are no hidden bottlenecks you would otherwise be oblivious to without some reference for how your system is expected to perform). In this way, the live benchmark is simply a means of checking that everything is good (or not) with your system, and of seeing how your card performs relative to other graphics cards (in percentage terms). No one buys a graphics card, puts it in their system and expects it to underperform..... but how would you truly know without a solid checking method?

    The reason I said your results immediately sound off is that other 980 users have reported at least a 16% fps difference (maybe more) between PCI-e 3.0 x16 and PCI-e 2.0 x16. That observation was made with the in-game graphics at their highest, plus 4xMSAA added in the Nvidia/AMD control panel. We do know that lowering the overall gfx settings reduces the demand on PCI-e bandwidth and thus lowers its potential impact on the card's overall performance. So I suppose if you game at gfx settings lower than the ones in the live benchmark, at a certain point there should be zero benefit from PCI-e 3.0 vs 2.0.

    An example would be TechAde testing a GTX Titan at the live benchmark settings with PCI-e 2.0 x16 vs 3.0 x16 and observing (at least) a 25% fps loss. Turning all the gfx settings to their lowest and repeating the live benchmark yielded a much smaller 4% difference, but still a difference.
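    (Worth noting how those percentages relate: a 25% fps loss going 3.0 -> 2.0 is the same gap as a ~33% gain going 2.0 -> 3.0, since each is measured against a different baseline. A quick sketch with made-up fps numbers, purely to illustrate the arithmetic:)

    [CODE]
    # Hypothetical fps numbers, purely to illustrate the percentage arithmetic.
    fps_gen3 = 100.0   # say the card manages 100 fps at PCI-e 3.0 x16
    fps_gen2 = 75.0    # and 75 fps when dropped to PCI-e 2.0 x16

    loss = (fps_gen3 - fps_gen2) / fps_gen3 * 100   # relative to the 3.0 figure
    gain = (fps_gen3 - fps_gen2) / fps_gen2 * 100   # relative to the 2.0 figure
    print(f"3.0 -> 2.0: {loss:.0f}% loss")   # 25% loss
    print(f"2.0 -> 3.0: {gain:.0f}% gain")   # 33% gain
    [/CODE]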

    Point being, you really need to do the test (and please report your results) so that you can be 100% sure. :)
     
  5. speed1

    speed1 Banned

    Joined:
    Jul 26, 2012
    Messages:
    1,858
    Likes Received:
    0
    Wasn't somebody of the opinion that a GFX card runs by itself and the rest of the system wouldn't matter, and wouldn't believe that it's always a combination? Nice that the NV trip at least taught something different, and that the laws of physics don't correspond to that imaginary logic.
     
  6. rogue22

    rogue22 Registered

    Joined:
    Jan 14, 2012
    Messages:
    261
    Likes Received:
    18
    But I thought 3rd-gen and up i7s support 3.0?

    I'm on a first-gen i7 920 and an X58 UD5, unless the F12 BIOS update supports PCI-E 3.0 and I missed that part.
     
  7. TechAde

    TechAde Registered

    Joined:
    Oct 13, 2010
    Messages:
    606
    Likes Received:
    38
    I think you'll find that the opinion "a GFX card would run by itself and the rest of a system wouldn't matter" also included "assuming that nothing in the system is causing a bottleneck".

    It was the fact that the same GPU was performing differently in different systems that indicated there was some sort of problem. By your logic it would never have been considered a problem at all, and people could have been throwing away a 25% performance increase for absolutely no reason.

    So, whose logic helped us solve the problem? It sure wasn't yours.
     
  8. TechAde

    TechAde Registered

    Joined:
    Oct 13, 2010
    Messages:
    606
    Likes Received:
    38
    Oh I see what you mean. According to the Gigabyte website that board doesn't support PCIe 3.0 at all, only PCIe 2.0. None of the BIOS updates add PCIe 3.0 support.

    I have no idea why you're showing as running PCIe 3.0 x16, it simply shouldn't be possible on that board with the X58 Express chipset. :eek:
     
  9. speed1

    speed1 Banned

    Joined:
    Jul 26, 2012
    Messages:
    1,858
    Likes Received:
    0
    What....... it's actually exactly the opposite; it was someone else who thought it wouldn't matter what base a GPU is running on (not you), and you even supported my opinion that it would matter in another thread. You're also the one who found the issue using the same logic I shared all along, so how could I be the one saying otherwise, when I shared the same logic as you: that it's always a combination of things in a PC environment, and a GPU doesn't run by itself without the base it's driven on. Thank you for proving me right.
     
  10. TechAde

    TechAde Registered

    Joined:
    Oct 13, 2010
    Messages:
    606
    Likes Received:
    38
    Oh boy, here we go again. No, I'm not even going to bother, sorry. You carry on thinking I've proven you right, makes no odds to me.
     
  11. rogue22

    rogue22 Registered

    Joined:
    Jan 14, 2012
    Messages:
    261
    Likes Received:
    18
    I'm on the fence with the PCI-E 2.0 vs 3.0 thing. This is rFactor 2; it hardly utilizes anything compared to a few other simulations I have that really show CPU and PCI-E bandwidth bottlenecking.

    Don't get me wrong, a faster video card will give you more headroom, but not as much by comparison.

    And if my system is reporting 3.0 when it isn't, what's to say other systems aren't doing the same thing?
     
  12. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    Sorry for the late reply, I missed this comment.

    What do you want me to check at the bottom of my screen?
     
  13. rogue22

    rogue22 Registered

    Joined:
    Jan 14, 2012
    Messages:
    261
    Likes Received:
    18
    Open up GPU-Z; next to where it tells you your PCI-E speed there is a question mark. Click on it and it will give you the option to benchmark your PCI-E speed. Run it, look at the bottom, and tell me what your speed is. It only takes a second. It's not a timed test, so hit Esc when you feel it has stabilized.
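    If you want to cross-check outside GPU-Z, nvidia-smi can report the live link state on NVIDIA cards (it ships with the driver; this is just a minimal sketch assuming it's on your PATH):

    [CODE]
    # Minimal cross-check of the live PCI-e link state via nvidia-smi
    # (NVIDIA cards only; the tool installs with the driver). Run it while
    # a 3D load is active, since the link downshifts at idle to save power.
    import subprocess

    fields = ("pcie.link.gen.current,pcie.link.gen.max,"
              "pcie.link.width.current,pcie.link.width.max")
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    gen_cur, gen_max, width_cur, width_max = (v.strip() for v in out.split(","))
    print(f"PCI-e link: gen {gen_cur} of {gen_max}, x{width_cur} of x{width_max}")
    [/CODE]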
     
  14. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    Ok, done.

    The clock speed readings don't change; they seem to be just the predicted values. However, looking at Afterburner, the GPU and memory clocks are in fact higher.

    [image]
     
  15. rogue22

    rogue22 Registered

    Joined:
    Jan 14, 2012
    Messages:
    261
    Likes Received:
    18
    So yours reported running at 2.0 still?

    Do the full screen test.
     
  16. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    No, I'm running the render app in GPU-Z using that ? button. It puts it into 2.0; otherwise it says 1.1 at the end.
     
  17. rogue22

    rogue22 Registered

    Joined:
    Jan 14, 2012
    Messages:
    261
    Likes Received:
    18
    So why would mine report 3.0, when GPU-Z and AIDA64 both report the same?
     
  18. speed1

    speed1 Banned

    Joined:
    Jul 26, 2012
    Messages:
    1,858
    Likes Received:
    0
    Yeah I see, but I don't swing back and forth like a flag waving in the wind.
     
  19. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    Because it's reporting what the PCI-e lane on my motherboard is running at.

    If you have a PCI-e 3.0 motherboard and you have 3.0 enabled in the BIOS, GPU-Z should come back saying "PCI-E 3.0 x16 @ x16 3.0" under graphical load.

    Edit: ok, I've just realised what you're getting at, thanks to TechAde. You have a 1st-gen Intel Core CPU and yet GPU-Z says you have PCI-e 3.0.

    Bit of a head scratcher, that.... :confused:

    Does this reading change from graphical idle to load state?

    Edit 2: I wonder if there is just something wrong with GPU-Z and tools like it reading the correct live PCI-e state of your card, because I can also see that it doesn't show your memory size.
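    On the idle-vs-load point: the link routinely drops to 1.1 at desktop idle and only negotiates back up under load (which matches the 1.1 reading you see without the render test), so a one-shot snapshot can easily catch the wrong state. A hypothetical watcher sketch, assuming an NVIDIA card with nvidia-smi available, lets you see the transition live:

    [CODE]
    # Hypothetical watcher: poll the link generation once a second so you
    # can see it downshift at idle and jump back up when a 3D load starts.
    # Assumes an NVIDIA card with nvidia-smi on the PATH.
    import subprocess
    import time

    while True:
        gen = subprocess.run(
            ["nvidia-smi", "--query-gpu=pcie.link.gen.current",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(f"current PCI-e link generation: {gen}")
        time.sleep(1)
    [/CODE]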
     
    Last edited by a moderator: Oct 1, 2014
  20. rogue22

    rogue22 Registered

    Joined:
    Jan 14, 2012
    Messages:
    261
    Likes Received:
    18
    Nope. Could be due to my overclock; I turn off all power-saving features. I'll reboot back to stock and see what it says.

    AIDA64 reports the same as well.
     
