No prob, I know a ton of players are having the same issues as we are; it's not random at all. It's not just a small group with the same or similar config, it's all of us… We can also say that NW being stuck in DX11 because of the graphics engine they are using is the cause, or at least a contributing factor.
Back in the beta, DX12 was super buggy and caused a ton of issues for players of all configurations, which is why we're still stuck on DX11. Sadly there is no word on when we will get DX12 back. That could help the players on high-end GPUs, but it doesn't address the limitations of the graphics engine they are using.
We like NW; it's superb-looking but very poorly optimized. I hope they do something about this soon.
As for 517.40 working with the 4090: it came out at just about the same time as the reference FEs. It's got some hiccups, but it's usable, and when all the 520.xx drivers just cause issues, I'm willing to make that sacrifice until Nvidia gets their $hit together…
Both of the PCs above are never under 90 FPS while in NW, whether in wars or in towns… But dips of about 70 FPS are insane with FPS locked to the refresh rate and G-Sync off; that alone will cause stutter no matter the configuration. It's just the reality of New World. Good ol' AGS, working as intended.
It is well known that the 5800X3D, with its enormous cache, performs well in CPU-heavy games, and that holds in New World too.
Nevertheless, something urgently needs to be done about the performance.
Top-end CPUs (12900K/13900K, Ryzen 5900X/7900X) create such a big bottleneck in certain game situations that the FPS drops drastically and the image becomes super flickery because of it.
If you release a game that includes modes like 50v50, you should be able to expect high-end PCs in 2022 to achieve 60 FPS (detail and resolution don't matter, since the CPU bottlenecks even at 4K ultra or FHD low).
Agreed, we can't see it anyway, but the problem for most people isn't a framerate of 60, it's the monitor's refresh rate, which plays a huge role in image quality and smoothness… If our 4K monitors could drop to a 60 Hz refresh (like we used in 1990) we would do it and lock the FPS to it, but sadly no high-end IPS 4K monitor will; the lowest refresh is normally 120 Hz to 240 Hz… If you cannot deliver FPS that matches the refresh rate, you get stutter and other issues… Just an FYI… Now, G-Sync and FreeSync are there to bridge that gap between FPS and refresh. For most people this is sufficient, but it introduces input latency when the two are far out of sync, which is why some people get rubber-banding effects even when their network latency and connection are good, giving the impression of lag or a bad connection.
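To make the FPS-vs-refresh point concrete, here's a small sketch (generic vsync timing, not anything specific to New World or any driver) showing why a framerate below a fixed refresh rate produces uneven frame pacing with VRR off: each rendered frame is held on screen for a whole number of refresh intervals, so a mismatched rate means some frames display twice as long as others.

```python
from fractions import Fraction  # exact arithmetic, avoids float rounding
import math

def hold_times_ms(fps, refresh_hz, frames=8):
    """Simulate vsync on a fixed-refresh panel: each frame is shown at the
    first refresh tick at or after it finishes rendering. Returns how long
    each frame stays on screen, in milliseconds."""
    refresh = Fraction(1000, refresh_hz)   # refresh interval (ms)
    render = Fraction(1000, fps)           # render interval (ms)
    ticks = [math.ceil(i * render / refresh) * refresh
             for i in range(1, frames + 1)]
    return [float(b - a) for a, b in zip(ticks, ticks[1:])]

# 70 FPS on a 120 Hz panel: hold times alternate unevenly between
# ~16.7 ms (2 refreshes) and ~8.3 ms (1 refresh) -- that cadence is stutter.
print(hold_times_ms(70, 120))

# 60 FPS on a 120 Hz panel divides evenly: every frame holds ~16.7 ms.
print(hold_times_ms(60, 120))
```

This is exactly the gap VRR (G-Sync/FreeSync) closes: the panel waits for the frame instead of the frame waiting for the panel.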
Running on a 13700K now, and I can confirm I get 150+ FPS in towns on max settings at 3440x1440, compared to 50-70 FPS previously at 2560x1440 with the 8700K.
My monitor's above average; on the AW2721D, input lag is 4x worse at 60 FPS than at its native 240 Hz refresh. Anything more than 10 ms is felt, and it's felt. Even though my monitor supports VRR down to 20 Hz, there's still tearing in 50-60 FPS wars since the game/client hangs and hitches.
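The 4x figure lines up with simple refresh-interval arithmetic (this covers only the scan-out interval, one floor under total input lag, not the game or driver latency on top of it):

```python
# One refresh interval at each rate: the minimum wait before the panel
# can even begin showing a new frame.
for hz in (240, 60):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per refresh")

# 240 Hz refreshes every ~4.2 ms; 60 Hz every ~16.7 ms -- about 4x longer
# per refresh, before any client-side hitching is added on top.
```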
Enable G-Sync for less tearing and a smoother experience? Then you suffer from noticeable desync, missing hits all over when players are within sneezing distance.
It's a lose-lose situation.
Client performance and bug fixes need to be the #1 priority. I don't want to harp on it, but if you were here at launch and experienced rubber-band OPR and unplayable player-teleporting wars, you know we've come a long way, but we're not quite there… yet.
You have a very powerful GPU paired with an outdated CPU that can't keep up with it. There are a lot of videos on YouTube showing CPU bottlenecking of the 4090, with differences of up to 50% in some comparisons.
Yeah, I'm still getting 30% utilization and 45 FPS with the RTX 4090 in OPR. It's a bit worse than the 3090. So tired of this. Not swapping back to the 3090 just to get 10 FPS more in NW when the 4090 plays other games beautifully. Frustrated.
Ultra settings might be a bit overrated. Seeing as you're moving around most of the time, you won't see the detail. If you're standing still taking a good hard look at something, sure, Ultra will make it look better. This is the same reason low-priced big-screen TVs usually don't work well as computer monitors: they look great during a movie, but pause it and the picture looks like poo poo.
I go by how the game looks during gameplay rather than trying to hit some arbitrary FPS number.