Massive FPS gains in rf2 using PCI-e 3.0 x16 with higher end cards!

Discussion in 'General Discussion' started by DrR1pper, Sep 30, 2014.

  1. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    Gotcha, thanks! :)
     
  2. Cracheur

    Cracheur Registered

    Joined:
    Jan 3, 2012
    Messages:
    315
    Likes Received:
    8
    Hi,

    As you mentioned, I cannot test PCIe 3.0, as my motherboard/CPU does not support it.
    I just compared my system to public benchmarks (same GPU config but different motherboard/CPU).
    There I can see that my score is generally lower by 20-30%, BUT not in games where the FPS is heavily GPU limited.
    To explain it a bit better: a game that runs at 120 FPS on my system will run around 20% faster on newer CPUs. A game that runs at 45 FPS on my system will not get this 20-30% increase (it's GPU limited).
    Even though these are a bit older:
    http://www.guru3d.com/articles_pages/radeon_hd_7970_cpu_scaling_performance_review,8.html

    "The moment you plug in two cards, each of the two PCI-e 3.0 lanes becomes x8 only (which would effectively be PCI-e 2.0 x16 each, right?)"
    Yes, but as mentioned, I've not found a single source indicating that you would bottleneck a one- or even two-GPU system by running PCIe 3.0 @ x8.
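
    For reference, the raw numbers behind the "x8 3.0 is effectively x16 2.0" equivalence can be sketched from the nominal spec figures (these are theoretical per-lane rates after line encoding, not measured throughput):

```python
# Approximate usable PCIe bandwidth per lane, in GB/s, after line encoding
# overhead (1.1/2.0 use 8b/10b, 3.0 uses 128b/130b). Nominal spec figures.
PER_LANE_GBPS = {
    "1.1": 2.5 * 8 / 10 / 8,     # 0.25 GB/s per lane
    "2.0": 5.0 * 8 / 10 / 8,     # 0.50 GB/s per lane
    "3.0": 8.0 * 128 / 130 / 8,  # ~0.985 GB/s per lane
}

def bandwidth(gen: str, lanes: int) -> float:
    """Nominal one-direction bandwidth in GB/s for a generation/lane width."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("2.0", 16), ("3.0", 8), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {bandwidth(gen, lanes):.2f} GB/s")
```

    So 3.0 x8 (~7.9 GB/s) is close to, but slightly under, 2.0 x16 (8.0 GB/s), and 3.0 x16 roughly doubles both.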

    Hmmm... however, facts are facts... some people here are seeing a quite massive FPS increase... I'll have a closer look at the results... (I'd be happy if it's true, as I'm about to change my system ;))
     
    Last edited by a moderator: Oct 2, 2014
  3. Cracheur

    Cracheur Registered

    Joined:
    Jan 3, 2012
    Messages:
    315
    Likes Received:
    8
    hmmm... another idea to validate the PCIe 2 vs 3 (real-world) performance increase:

    Do we have anybody running a GPU on PCIe 3 at a very high resolution (triple screen)?
    I suspect that the improvement from PCIe 2 to 3 will get less noticeable with high quality settings and resolution (which is where it would matter most in my eyes).
    Unless I missed some benchmarks, I could only see the massive increase on systems that run at least 100 FPS anyway...

    > I would be interested to see if we still get that increase when the game starts to be more GPU limited.

    Please don't get me wrong... I'd be very happy to see such an increase...
    Could anybody with such a config try setting every GPU setting to max at a high resolution?

    cheers
     
    Last edited by a moderator: Oct 2, 2014
  4. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    I have triples, I have SLI 780 Ti, and I have an Ivy Bridge-E CPU (true dual 16x 3.0).

    What combo would you like me to test? Triple screen, single GPU, 16x 3.0 vs 16x 2.0?
     
  5. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    This is a very good condition to test for those who use triple screens. Perhaps their performance loss from 3.0 x16 to 3.0 x8 / 2.0 x16 would be less noticeable.

    But you'll need to confer with those who have your card and ask them if they may test this for you (that is provided they can test that resolution as well).

    Did you mean CPU limited?
     
  6. Cracheur

    Cracheur Registered

    Joined:
    Jan 3, 2012
    Messages:
    315
    Likes Received:
    8
    Triple screen resolution + very high GFX settings... then compare PCIe 2 versus 3. (Single and dual-card would be nice to see... but I don't want to ask too much ;))
    > For instance: what happens if you have around 40-50 FPS and change from PCIe 3 to 2?

    thx
     
    Last edited by a moderator: Oct 2, 2014
  7. Cracheur

    Cracheur Registered

    Joined:
    Jan 3, 2012
    Messages:
    315
    Likes Received:
    8
    "Did you mean CPU limited?" No, I meant GPU limited. If you increase resolution + GFX settings, FPS will be more and more limited by the GPU (the CPU matters less in that case).
    As mentioned, if you have 120 FPS on a 965BE, you will probably get 160 on a decent i5. If you have 35 FPS... you will barely notice a difference between the 965BE and an i5/i7. Obviously there are exceptions... it depends a bit on the game, but generally speaking this is true.
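
    The bottleneck argument above can be sketched with a toy frame-time model (the millisecond figures below are made up purely for illustration): each frame waits on whichever of the CPU or GPU is slower, so a faster CPU only helps while the CPU is the longer pole.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: frame rate is limited by the slower of CPU and GPU work."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-limited case: CPU needs 8.3 ms/frame, GPU only 5 ms -> ~120 FPS.
# A 25% faster CPU lifts the frame rate almost proportionally.
print(fps(8.3, 5.0), "->", fps(8.3 / 1.25, 5.0))

# GPU-limited case: GPU needs 22 ms/frame -> ~45 FPS.
# The same CPU upgrade changes nothing, because the GPU still sets the pace.
print(fps(8.3, 22.0), "->", fps(8.3 / 1.25, 22.0))
```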
     
    Last edited by a moderator: Oct 2, 2014
  8. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    But since the GPU data goes through the PCI-E lanes, wouldn't it make sense to believe that a GPU limited situation would further exaggerate the framerate differences between different PCI-E setups/versions?

    EDIT: Actually I think I see what you mean. The GPU demand from the game/gfx settings might be so high that the pure GPU power/speed itself is sort of like the bottleneck before PCI-E versions and lanes even come into play. I'm guessing that's how you're looking at it, correct?
     
    Last edited by a moderator: Oct 2, 2014
  9. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    Yeah, I think you're right.

    Plus TechAde tested a Titan in 2.0 vs 3.0 with all max vs all lowest gfx settings and on max it was a 25% difference vs only 4% on lowest.
     
  10. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    Sure.

    I'll test SLI in 3D mode - since 3D gives rF2 just about perfect SLI scaling - and single GPU in 2D mode.

    Settings will be exactly the same as DrR's benchmark and then I'll also do them with most of DrR's settings but with 8xMSAA, 12 AM, and rain for even more graphical demand.

    Multiview on or off (I'm assuming on; off looks terrible)?
     
  11. Cracheur

    Cracheur Registered

    Joined:
    Jan 3, 2012
    Messages:
    315
    Likes Received:
    8
    Not sure how 3D or multiview will affect the test... looking forward to the results.
     
  12. Satangoss

    Satangoss Registered

    Joined:
    Jun 2, 2011
    Messages:
    1,123
    Likes Received:
    7
    Makes absolutely no sense. The difference in game performance between PCIe x16/x8, x16/x16 and x8/x8 is about zero. I did this test several times, tried everything in terms of PCI slot combinations: nada, nothing, niente.

    If you don't believe me, search for this sort of benchmark on the Internet.

    Now I've got 2 x GTX 980 and have decided not to waste my time tweaking rF2 graphics anymore: I turn SLI off when I fire up rF2, play all maxed + 120 fps any time on any track, and just turn SLI back on when I play more "optimized" (to be polite) games.
     
    Last edited by a moderator: Oct 2, 2014
  13. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    Regarding SLI working in rf2, can those who have it working please confirm so?
     
  14. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    I understand that it's a large, convoluted thread, so I don't expect you to have read it all; I'll clarify for you...

    In 99% of games, there is only a tiny difference between PCI-E 3.0 @ 16x and PCI-E 3.0 @ 8x / 2.0 @ 16x. As you've said - and I fully agree - there are many tests and reviews about this subject. Time after time, game after game, PC after PC, the test results come out the same with 16x 3.0 only having around a 1-5% lead (usually only 1-3%) over 8x 3.0 / 16x 2.0.

    However, if you search throughout this thread, you will see that many different people with different PC setups have shown large framerate differences between the two PCI-E speeds. These rF2 framerate differences between the two PCI-E speeds are highly unusual; while almost every game from every test "around" shows very minimal gains, rF2 mysteriously shows gains of up to 32%, and usually in the 10-20% range.

    Whether that has something to do with rF2 itself (e.g. inefficient/"bad" coding making the game unnecessarily saturate the PCI-E bus), I do not know. All I know is that we have carried out multiple tests across multiple people and PC systems, and the results are always the same: unusually large framerate differences in rF2 between the two different PCI-E speeds.
     
    Last edited by a moderator: Oct 4, 2014
  15. Ari Antero

    Ari Antero Registered

    Joined:
    Jul 27, 2012
    Messages:
    1,882
    Likes Received:
    829
    Just ignore Satangoss, he's lazy and doesn't know better. The gain from x8 vs x16 is 5-10%.
     
  16. Maug

    Maug Registered

    Joined:
    Mar 16, 2012
    Messages:
    290
    Likes Received:
    4
    After having a quick look at this thread I downloaded GPU-Z and had a look at my PCI settings. It currently shows "PCI-E 3.0 x16 @ x16 1.1", and when I hover over it with the mouse it tells me the graphics card supports 3.0 but is currently set to 1.1. My mobo is an Asus Z87-A, my CPU an i3 4130 and my video card a GTX 750 Ti OC, and all of them are supposed to handle PCI Express 3.0 from what's written on the boxes/manuals and internet product descriptions.

    It's been years since I followed hardware development and I don't want to end up messing everything up just to test something, so if somebody kind and more knowledgeable than myself could tell me whether I'm going the right way, it'd be appreciated.

    My guess is that I should be able to run PCI-E 3.0, so I'd need to activate it. When I click on the question mark in GPU-Z (bus interface information) I get a window popping up with a TL;DR text (I read it, but it's just long) offering to start a render test. If I click on it, will it automatically set the bus interface to the highest PCI-E version? If not, what do I need to do?
     
    Last edited by a moderator: Oct 2, 2014
  17. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    typo. ;)

    Also the largest gain we've seen so far is 33%.
     
  18. DrR1pper

    DrR1pper Registered

    Joined:
    Apr 29, 2012
    Messages:
    3,294
    Likes Received:
    36
    1.1 is just the card idling to save power. Hit the "?" to the right of it and run the little render tool; it should put the PCI-e link at full speed, and the reading should change to "3.0" at the end.
     
  19. Maug

    Maug Registered

    Joined:
    Mar 16, 2012
    Messages:
    290
    Likes Received:
    4
    Nvm i found out.
     
    Last edited by a moderator: Oct 2, 2014
  20. Ricknau

    Ricknau Registered

    Joined:
    Nov 12, 2011
    Messages:
    778
    Likes Received:
    39
    My head is swimming trying to grasp all the research you guys are doing in multiple threads! You that are digging deep surely must be retirees. :D It's lucky I haven't been fired just reading the posts let alone firing up all the benchmarks and posting multi-paragraph dissertations. But thanks a lot. This is really good info.

    Question: Has the research gone far enough for someone to predict what kind of FPS gain I might see going from PCI-e 2.0 x16 to PCI-e 3.0 x16 with a single 780ti card pushing 3 monitors? My mobo supports 3.0 but, from what I've learned here, my i5-2500k needs upgrading. Looks like a i5-3570k would get me there at around $230. I'd have to see a pretty good gain to spend that kind of money. (I wonder if there would be any buyers for a used 2500k?)
     
    Last edited by a moderator: Oct 3, 2014
