I will look at some of the settings tomorrow and see if I can come up with something that might help, but you are already running a resolution comparable to many people who run triples. Don't forget that the fps will also depend on the track you are using. I am running an RTX 2070 and get a solid 60fps with most settings turned up high at 5160x1080 on Le Mans 2018; if I run at 7680x1440 it drops to about 40fps at certain points on the circuit. I would say that what you are seeing is about right if you are running one of the more demanding circuits.
Thank you, I appreciate it. Maybe I was expecting too much, but I just want to be sure there isn't something out of whack somewhere.
The framerates you are quoting: are they in-car (cockpit), and on which track? I average 80-82fps high to 59-62fps low (per RivaTuner Statistics Server; the monitor limits the displayed framerate to 60) in-car (Ferrari 488 at Spa or the Nürburgring). Asus Strix 1080 Ti, no OC, 2560x1440 @ 60Hz, i7-8700K, 32GB, with TrackIR 5, CrewChief, JoyToKey, Stream Deck and SimHub (trackmap, temps, radar, etc.) running in the background. Outside the car, i.e. in replay playback, it's 52fps low to 68fps high depending on camera view, number of cars and track location; there is more to render than in-car. So I think you are doing quite well... I'm just jealous, because I'm still waiting for an RTX 3080 Ti to become available, probably late summer at this rate. See attached screens for my settings:
Why is the refresh rate 50Hz, and why is FXAA enabled when FSAA is at level 4? I don't think FXAA does anything when FSAA is selected. I am on my Vega 56 with everything on full other than opponent detail on high (because I race with 40-45 AI and that eats lots of VRAM), shadows on high and shadow blur on optimal. 15 visible AI; in player.JSON, rearview cull set to false and rearview distance to 500m (see the sketch below); 1080p with max framerate set to 63 (I have a 75Hz FreeSync monitor and 63 seems smooth, other values give small stutters). While racing it's perfectly stable at 63fps on the most demanding tracks: Sebring, Spa, the Nordschleife; the most demanding of all is Lilski's Road America with these mirror settings. On replays the fps dips to 50, but hey, replays are choppy on the new UI anyway until S397 fix it.
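For anyone who wants to try the same tweaks, here is a minimal sketch of the player.JSON entries being described, with key names from the "Graphic Options" section as I remember them (double-check the exact spelling against your own file; the "#" keys follow the file's own comment convention, but the comment text here is mine):

    "Graphic Options": {
        "Max Framerate": 63,
        "Max Framerate#": "FPS cap; 0 disables the cap (63 here to sit just under a 75Hz FreeSync ceiling)",
        "Rearview Cull": false,
        "Rearview Cull#": "false = don't cull objects from the mirrors; costs performance but shows everything",
        "Rearview_Back_Clip": 500,
        "Rearview_Back_Clip#": "mirror draw distance in metres (0 = use the track default)"
    }

Edit the file with the game closed, otherwise rF2 will overwrite your changes on exit.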
RTX 2070 + Ryzen 2600X + 1440p: the PBR shaders on new tracks destroy FPS, but it's usually 60fps with occasional dips. Day-to-night transitions drag the FPS down badly. I think I need to buy a Ryzen 5600X, as the Nürburgring with 43 AI is causing problems at the start.
If your refresh rate is set to 50Hz, your maximum FPS will be 50. Try to match the in-game refresh rate to your monitor's maximum refresh rate; the difference is significant.
50Hz refresh is the way rF2 interprets my Dell monitor's 60Hz setting, which is actually 59.973Hz, so Windows (Win10 64-bit shows a 60Hz setting) passes it on as not a true 60Hz. I have forced rF2 to 60Hz (it didn't change my fps any), but every rF2 update usually resets it to 50Hz. I have checked the actual frame rate in game and it shows 60fps max; it cannot do more with this monitor (which does not OC either). The video card is rendering more, hence the 80fps+, but the monitor will only display 60fps.

See below from MS Support: "Certain monitors report a TV-compatibility timing of 59.94Hz. Therefore, Windows 7 and newer versions of Windows expose two frequencies, 59Hz and 60Hz, for every resolution that is supported at that timing. The 59Hz setting makes sure that a TV-compatible timing is always available for an application such as Windows Media Center. The 60Hz setting maintains compatibility for applications that expect 60Hz. In Windows 7 and newer versions of Windows, when a user selects 60Hz, the OS stores a value of 59.94Hz. However, 59Hz is shown in the Screen refresh rate in Control Panel, even though the user selected 60Hz."

P.S. A little history: way back in the day when the CRT telly was a new and shiny thing, capacitors were expensive and voltage regulators were very expensive. Locking the field rate of the telly cameras to the mains frequency meant that ripple on the power supply, due to poor regulation and undersized filter caps in the receiver, would cause brightness variations, but that they would be more or less static over time; doing anything else caused very annoying rolling brightness on the display as the supply voltage and scan position beat together. This of course gave rise to 60Hz frame rates in the US and 50Hz just about everywhere else. Then colour arrived on the scene and we got another weird split, with the US slightly breaking things by going to 59.94Hz (due to the way they decided to encode the chroma subcarrier), while everyone else stayed on 50Hz. When computer monitors moved from expensive vector displays to what was basically repurposed TV technology, 50 and 60Hz stuck around (only now we run them non-interlaced, as a frame rate rather than a field rate). More than we ever wanted to know!
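For anyone wondering where the 59.94 actually comes from: when NTSC added colour, the field rate was slowed by a factor of 1000/1001 so the chroma subcarrier wouldn't beat against the sound carrier, giving

    60 Hz / 1.001 = 60000/1001 Hz ≈ 59.94 Hz

which is exactly the "TV-compatibility timing" in the MS Support note above; Windows just rounds it to 59Hz or 60Hz in the UI.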