Anybody with 6 core CPUs? (Especially Intel)

Discussion in 'General Discussion' started by Spinelli, Jan 5, 2014.

  1. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    If so, have any of you run tests (with any games) comparing the exact same CPU on 4 cores vs. 6? You can use Task Manager (Ctrl-Alt-Del) to change the affinity between 4/8 and 6/12 cores/threads. Also, have any of you run the same tests comparing Hyper-Threading on vs. Hyper-Threading off?
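    (For anyone who wants to script this instead of clicking through Task Manager every run: below is a minimal sketch of the same affinity trick using Python's Linux-only `os.sched_setaffinity` API. On Windows you'd use Task Manager or `start /affinity` instead; the function name here is my own invention.)

```python
import os

def limit_to_n_cores(pid: int, n_cores: int) -> set:
    """Restrict `pid` (0 = the current process) to the first n_cores of the
    CPUs it is currently allowed on, and return the new affinity mask."""
    allowed = sorted(os.sched_getaffinity(pid))   # CPUs we may run on now
    chosen = set(allowed[:n_cores])               # keep only the first n
    os.sched_setaffinity(pid, chosen)             # apply the restriction
    return os.sched_getaffinity(pid)
```

    Running the game once limited to 4 cores and once limited to 6, at the same clocks, is exactly the comparison described above.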

    I got a new CPU, but my build isn't set up yet, and I'm really curious about this, especially after reading the following review:

    HT on VS HT off http://chipreviews.com/main-feature...-limit-6-core-performance-in-battlefield-3/3/

    4 cores VS 6 cores http://chipreviews.com/main-feature...-limit-6-core-performance-in-battlefield-3/4/

    Very nice difference between the 4-core and 6-core frames per second with a 3930K in Battlefield 3 64-player multiplayer. HT off also brings a much bigger improvement than HT on. VERY nice minimum FPS using 6 cores combined with HT off.
     
    Last edited by a moderator: Jan 5, 2014
  2. Nuno Lourenço

    Nuno Lourenço Registered

    Joined:
    Oct 19, 2010
    Messages:
    593
    Likes Received:
    65
  3. petersmith

    petersmith Registered

    Joined:
    Sep 26, 2013
    Messages:
    7
    Likes Received:
    0
    As far as I know, rFactor 2 only uses two cores. But you can gain some extra performance with 4 or 6 cores, since the extra cores handle background stuff like networking and all the basic things Windows needs. Adding more and more cores helps less and less.

    In rF2 the main principle is: a high-performance dual-core CPU is better than a crappy quad-core.

    A higher-clocked 4-core CPU will outperform a lower-clocked 6-core of the same architecture.
     
    Last edited by a moderator: Jan 5, 2014
  4. kosmo1982

    kosmo1982 Registered

    Joined:
    Jan 10, 2012
    Messages:
    222
    Likes Received:
    9
    I upgraded from a dual-core Intel to a 6-core AMD with a slightly higher frequency. I didn't notice any difference in rF2.
     
  5. Ivan Baldo

    Ivan Baldo Registered

    Joined:
    Oct 7, 2010
    Messages:
    57
    Likes Received:
    3
    I guess that once you have enough CPU power for rFactor 2, adding more doesn't change anything.
    So, if you only use rFactor 2 (and some other sims, which I think also use just 2 cores), maybe it's better to buy a Core i3-4330 Haswell and spend the rest of the money on the best GPU you can buy...
    Just a guess...
     
  6. Jamie Shorting

    Jamie Shorting Registered

    Joined:
    Sep 11, 2013
    Messages:
    2,628
    Likes Received:
    3
    Core frequency is way more important than the number of cores. Give me 2 cores running at 6 GHz over 6 cores any day. Unfortunately that speed isn't quite possible yet.
     
  7. Daytona 675

    Daytona 675 Registered

    Joined:
    Jan 1, 2014
    Messages:
    210
    Likes Received:
    30
    I upgraded from a Q6600 to an FX-6350, and for me the difference is noticeable. I spent a little less on the CPU and picked up a better GPU (760 Hawk).
     
  8. baristabrian

    baristabrian Registered

    Joined:
    Dec 23, 2010
    Messages:
    141
    Likes Received:
    0
    I have the Intel i7 3930K Sandy Bridge-E 3.8 GHz 6-core, and I've not been blown away by its performance in rF2 or AC. In fact, I believe I saw some benchmarks where a much less expensive 4.3 GHz quad-core blew mine away. Maybe the 6 cores pay off with some types of application workloads, but they do nothing for gaming that I can see. I have mine paired with fairly decent components, so I doubt I have any bottlenecks, and I'm sure I'm getting all this CPU has to offer.
     
  9. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    The problem with rFactor 2 is that it is heavily GPU-bottlenecked at the moment, in most cases. I don't think this tells the full story.

    GT Legends is made for 1 core, not even dual, but once I'm no longer GPU-limited I get a nice improvement in FPS when I set it to 4 cores.

    The only way to truly test this is to run the games yourself in 4-core mode and 6-core mode, at the same frequency obviously; I would also repeat the tests with HT off. And if your graphics settings are too high and/or you're bottlenecked by your GPU, then the GPU is the weakest link in the chain and won't show the CPU difference.

    That's the main reason the BF3 test was done with two SLI'd GTX 680s on only one monitor: to eliminate the graphics bottleneck and extract as much of the CPU's potential as possible.
     
    Last edited by a moderator: Jan 6, 2014
  10. Javik

    Javik Registered

    Joined:
    Mar 16, 2013
    Messages:
    68
    Likes Received:
    0
    I did a small test: I was getting freezes with my i7 (4 cores/8 threads), but with HT disabled the game loads much faster.
     
  11. F2kSel

    F2kSel Registered

    Joined:
    May 28, 2011
    Messages:
    139
    Likes Received:
    4
    I never got any increase in GTL going from 1 core to 4; in rF I did get a 1 FPS improvement.

    Obviously there is a major problem with multicore coding, so developers opt for the easy option of two cores and force the user to upgrade to a faster CPU, which is the more expensive route for the end user.

    A3 is a prime example: it's extremely CPU-limited and only uses two cores. I upgraded my GPU from a 5850 to an NV 680 and, using the same settings, gained 0 FPS.
    I could increase some graphics settings a bit without reducing FPS.

    Most games don't even use 64-bit code yet, which is another issue.
     
  12. Gearjammer

    Gearjammer Registered

    Joined:
    Jun 11, 2012
    Messages:
    1,823
    Likes Received:
    24
    Almost all games will see the same result, as almost all games are designed to use one or at most two cores, so the additional cores won't do you any favors. As far as gaming is concerned, you don't need anything better than an i5 running the fastest clock you can buy; going to the i7 is a waste of money if your machine is just for gaming. If, however, you do other things on your system like 3D modeling and animation, movie creation, or Photoshop, then the more cores and the faster the clock, the better those will perform.

    Another thing to consider is that up to a certain point the CPU can be a bottleneck, but past that point a faster CPU won't show much improvement in performance. Upgrading your GPU past that point is the only way you are going to see further gains in gaming performance. Upgrading RAM will also have limited effect, except in those memory-hungry areas I mentioned earlier, which will benefit from all the RAM you can throw at them.
     
  14. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    Oh yeah, I had huge gains in GTL from 1 to 4 cores. On 1 core I'll have instances where my FPS is, for example, 190; with 4 cores it will be around 260. When the GPU isn't the limiting factor I get very large increases.

    Apparently Crysis 3, BF3 and 4, Far Cry 3, and a few other games are specifically coded to take advantage of as many cores as you have; the problem is most people won't notice a difference because they are almost always GPU-limited.

    That's just what I read. Anyway, my mega benchmark will begin soon, and I will post all the results for you guys.
     
  15. Nuno Lourenço

    Nuno Lourenço Registered

    Joined:
    Oct 19, 2010
    Messages:
    593
    Likes Received:
    65
    There's no need for so much work for that.

    It's very easy to see whether your CPU is the bottleneck or not. In my case, I bought a 780 Ti and tried to play with a Q6600. Even at 3.6 GHz I had the same FPS at 1680x1050, 1920x1080, and 5040x1050: 66-68 FPS to be precise. Then I got a 3770K and tried the same settings, and with the GPU overclocked I got 115/110/80 FPS at the three resolutions.

    With the CPU overclocked to 4.5 GHz the results are exactly the same at all resolutions, so if I want more FPS I need to buy another Ti :p
     
  16. Addict

    Addict Registered

    Joined:
    Feb 7, 2012
    Messages:
    59
    Likes Received:
    2
    Did you have a bottleneck with the Q6600 @ 3.6 GHz? I play with a Q6600 @ 3.2 GHz and I have two cores at 60-70% load. I don't know if upgrading the CPU would increase FPS, or if the problem is that rFactor 2 isn't very well optimized.
     
  17. Lazza

    Lazza Registered

    Joined:
    Oct 5, 2010
    Messages:
    12,393
    Likes Received:
    6,609
    He proved he had a bottleneck because his FPS didn't change when he changed his resolution. If the CPU wasn't limiting the framerate, his graphics card would have pumped out higher FPS with fewer pixels to draw, and lower FPS when it had more work to do. This is assuming all other settings remained the same, and he said they did.

    So try doing a similar test yourself, running at your full resolution and then 1 or 2 settings lower, and see what change it makes to your framerate.
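    (The rule of thumb in this test can be written down explicitly. Below is a minimal sketch, with a made-up function name and an assumed 5% tolerance: if FPS barely moves when you change the rendering resolution, the GPU isn't the limiter, so the CPU probably is.)

```python
def looks_cpu_bound(fps_at_low_res: float, fps_at_high_res: float,
                    tolerance: float = 0.05) -> bool:
    """True if FPS is roughly the same at two very different resolutions,
    i.e. the GPU workload changed but the framerate didn't."""
    spread = abs(fps_at_low_res - fps_at_high_res)
    return spread / max(fps_at_low_res, fps_at_high_res) <= tolerance
```

    With the Q6600 numbers from earlier in the thread, `looks_cpu_bound(68, 66)` comes out True (CPU-limited), while the 3770K's `looks_cpu_bound(115, 80)` comes out False (the GPU is doing the limiting again).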
     
  18. Nuno Lourenço

    Nuno Lourenço Registered

    Joined:
    Oct 19, 2010
    Messages:
    593
    Likes Received:
    65
    Mine wasn't at 100% load either. rFactor 2 is programmed to use 2 cores; even if we have 4 cores, Windows will try to split the work across every core, but that's not very efficient. In my opinion, even split up it will only use power similar to 2 cores.

    BTW, what GPU do you have? I think the limit for a Q6600 @ 3.6 GHz is a GTX 670 or an HD 7950. With a better GPU you will have a huge bottleneck for sure.
     
  19. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    With the single-core-programmed GT Legends and my old 2500K @ 4.5 GHz, I got improvements when setting it to 4 cores from its native 1-core operation.

    I agree: the power available for the software to take advantage of with no hit in FPS, or the FPS itself on the exact same piece of software, would increase by a huge margin if the game were coded specifically for multiple cores.

    I still think setting most single-core games to use multiple physical cores is theoretically better than nothing. Does that make a difference in the real world when most people are only aiming for 60 FPS? Probably not. However, if your CPU hits moments of bottlenecking even while within 60 FPS, then setting the game to use more cores may avoid or lessen those moments. The chances of this happening also get higher the higher the FPS you're aiming for. For example, in my GT Legends benchmarking there are points where my FPS would max out at around 180, but that exact same spot would max out at around 260 FPS once set to use 4 cores. Clearly the game takes advantage of the added power of the 4 cores in moments where the GPU is not the limiting factor, which lets the CPU "breathe" and use more of its power.
     
    Last edited by a moderator: Jan 8, 2014
  20. Spinelli

    Spinelli Banned

    Joined:
    Jan 28, 2012
    Messages:
    5,290
    Likes Received:
    32
    Also, I believe running against lots of opponents raises the amount of work asked of the CPU much more than of the GPU (it's not linear). Why do I think this?

    Well, in my rFactor 1 benchmark testing with, I think, 35 or 40 opponents, I would, as expected, get a large difference in FPS depending on whether I used no AA, 2x supersampling, 4x supersampling, or 8x supersampling. That makes sense, because supersampling asks a huge amount more from the GPU. HOWEVER, sitting in last place on the start line gave me almost no FPS difference between AA off and "FPS-destroying" 8x supersampling: maybe 7 FPS or so, say from 28 to 35 FPS going from 8x supersampling to no AA at all. That doesn't seem right at all, considering how large the differences were everywhere apart from the start and the first few corners. This leads me to believe that at the start the main bottleneck was one, or possibly both, of the following:

    A. The CPU itself. Unfortunately I wasn't thinking and didn't test this by over/underclocking the CPU or by changing the number of physical cores in use, as I assumed it was purely GPU bottlenecking at the time.

    B. The PCI-E speed. I was running my GPU in PCI-E 2.0 8x mode, which is equivalent to PCI-E 3.0 4x.

    On top of that, I had also changed a setting in every single AI's physics file to place more load on my CPU. I believe it's called "passespertick" or "AIpassespertick"? It's roughly how many times per cycle the AI's physics get recalculated. Stock rFactor, and I believe GT Legends (and most ISI-based games), have it set to only 2; I changed it to 10 for every single AI talent/physics file. I believe Game Stock Car also has it set to 10 by default. 20 is apparently to be avoided, as it asks way too much of your CPU. This further leads me to believe that the seemingly small start-line FPS differences between no AA and 8x supersampling were the result of CPU bottlenecking (i5-2500K @ 4.5 GHz with rFactor 1's affinity set to all 4 cores).
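    (If anyone wants to make that same edit across a whole field of AI files, here is a rough sketch. The key name `AIPassesPerTick`, the `.rcd` talent-file extension, and the `key=value` layout are all assumptions on my part; check one of your own files first and back the folder up before running anything like this.)

```python
import re
from pathlib import Path

def raise_passes_per_tick(folder: str, new_value: int = 10) -> int:
    """Rewrite 'AIPassesPerTick = <n>' lines in every .rcd file under
    `folder`, returning how many files were changed."""
    pattern = re.compile(r"(?im)^(\s*AIPassesPerTick\s*=\s*)\d+")
    changed = 0
    for path in Path(folder).rglob("*.rcd"):
        text = path.read_text(errors="ignore")
        updated, hits = pattern.subn(rf"\g<1>{new_value}", text)
        if hits:
            path.write_text(updated)
            changed += 1
    return changed
```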
     
    Last edited by a moderator: Jan 8, 2014
