The game still runs on DX11 and doesn't support or optimize for the RTX series, and you expect a 4090 to run this game well? They can't even optimize the game to reduce CPU bottlenecks and throttling, no matter what high-end CPU you're using. Sadly, only one core sits at 100% in this game, which keeps total CPU usage at about 40-50% while GPU usage drops to 50-60%.
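To see why one maxed-out core matters even though the total CPU figure looks modest, here's a toy calculation (all numbers are hypothetical, not measured from the game):

```python
# Sketch: why one saturated core caps FPS while average CPU usage stays low.

def aggregate_usage(per_core):
    """Average usage across cores, like Task Manager's total CPU figure."""
    return sum(per_core) / len(per_core)

# 16 logical cores: one pinned at 100% by the game's main thread,
# the rest doing light work.
per_core = [100] + [40] * 15
print(aggregate_usage(per_core))  # -> 43.75 (% total), yet the game is CPU-bound

# The bottleneck thread sets the frame-time floor:
main_thread_ms = 20.0             # hypothetical per-frame work on the hot thread
max_fps = 1000.0 / main_thread_ms
print(max_fps)                    # -> 50.0 FPS, no matter how fast the GPU is
```

So a "40-50% total" reading is perfectly consistent with the game being hard CPU-limited on one thread.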
Sure, I’m running an i9-12900K and I don’t get many more frames than @cattboy during wars. I still sit between 50-60 FPS at 1080p with the i9, a 3080 Ti, and 32 GB of RAM, all settings on low.
If the CPU really made that much of a difference, my game would be running WAY smoother.
Yes, it is a fact that New World is extremely CPU-intensive, especially when many players come together in a small space and the load is not distributed among the available cores.
AGS must finally optimise the code here; nowadays all mid/high-end PCs have more than 4 cores.
I don’t understand why AGS closes tickets opened directly by email with a simple comment saying the information will be passed on, and then absolutely nothing happens.
If you offer game modes like 50v50, then I don’t think it’s too much to ask that the engine also delivers 60 FPS on high-end hardware.
Exactly! Look at WoW for example: admittedly the graphics are far worse than New World's, but you can easily play WoW even on low-end PCs and still get good frames.
Tested disabling HT/SMT and HPET, and that didn't help.
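For anyone who wants to repeat the HPET part of this test: on Windows, forcing or un-forcing HPET is controlled through the boot configuration with `bcdedit` (run from an elevated command prompt; these are standard boot-configuration commands, shown as a sketch, and a reboot is needed for changes to take effect):

```shell
:: Show the current boot entry; look for a "useplatformclock" line
bcdedit /enum {current}

:: Force HPET as the platform clock (what "enabling HPET" usually means)
bcdedit /set useplatformclock true

:: Remove the override so Windows picks its default timer ("HPET disabled")
bcdedit /deletevalue useplatformclock
```

HT/SMT, by contrast, is toggled in the BIOS/UEFI, so there's no equivalent one-liner for it.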
The load on a single CPU core is so high that only the R7 5800X3D can output reasonably good FPS because of its significantly larger cache.
I’m sporting an i9-12900K, and its performance exceeds the R7 5800X3D. Will it work for my CPU at least?
I don't think that disabling HT will help with your 12900K.
Have you tried an R7 5800X3D in wars with HT/SMT and HPET disabled? If so, how many FPS at 1080p with all settings on low?
I don't have an R7 5800X3D; I'm rocking a 5900X, and my FPS in wars is ~40-50 atm.
OK, so we can't know until it's tested. Gotcha.
Yes, unfortunately, but it is engine- or CPU-limited.
I have tested FHD all low and 4K all max; both times, 40-50 FPS in wars.
There definitely need to be MAJOR optimization updates.
@Luxendra, are there any plans for the New World team to further optimize the game now that the player population is rising sharply?
I believe everyone should be able to enjoy New World at its best in terms of game performance.
I should have stopped at 42"; I went with a 65-inch and feel like I'm looking at the sky sometimes, lol.
It's not about the CPU… even high-end chips still bottleneck because of DX11… the game only uses one core, lol.
Well, you can power through some of the poor optimization with the latest and greatest CPU/RAM combos, although at very poor performance per dollar.
Cap it at 90. 60-90 FPS is the range where you can actually notice a difference.
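For context on what a frame cap actually does: the limiter just sleeps out whatever is left of each frame's time budget, which keeps the hot thread from spinning flat out. A minimal sketch (the `render` placeholder is hypothetical, not anything from the game):

```python
import time

TARGET_FPS = 90
FRAME_TIME = 1.0 / TARGET_FPS  # ~11.1 ms budget per frame


def run_capped(n_frames, render=lambda: None):
    """Render n_frames, sleeping so we never exceed TARGET_FPS."""
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        render()  # placeholder for the actual per-frame work
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)  # give the budget's remainder back
    return time.perf_counter() - start


# 45 capped frames must take at least half a second (45 frames / 90 FPS).
total = run_capped(45)
print(f"{45 / total:.0f} FPS effective")
```

In practice you'd use the in-game limiter or the driver's cap rather than rolling your own, but the principle is the same: spare headroom under the cap means fewer dips when a heavy scene hits.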
MSI Afterburner hooks into DirectX and reduces performance as well.
My wife and I both have some serious hardware and never have issues with other games, even MW2 at the same resolutions and ultra settings. New World, on the other hand, is another story: the "Azoth Engine" is just not optimized for any serious hardware, and the CPU offload is horrendous in every situation.

Both PCs below can sit at the locked FPS/refresh in both 2K and 4K with maxed settings (except "Terrain Details") with little issue, except when the GPU offloads onto the CPU; then the FPS dips for a second or two, causing a stutter. On my wife's PC, when this happens, her CPU usage jumps to 70% or more for no reason (normal is 20-30% CPU and 70% GPU). My new PC with the 13900K does the exact same thing but jumps to 50-60% (normal is 10-20% CPU and 80% GPU). Something is definitely off there. It seems the graphics engine in NW is offloading onto the CPUs, or it's very CPU-driven at the code level and not designed to utilize hardware resources efficiently at all.
Wife's build: 12900K (stock), 32 GB DDR4-4600, RTX 3090 Ti, custom water-cooled loop (2x 360mm rads, CPU and GPU blocks), 850W PSU, pump @ 2300 (max temps: 70°C GPU, 73°C CPU at full load).
120/144/165 Hz, FPS locked to refresh, in 2K and 4K.
My new build: 13900K (@ 5800), 64 GB DDR5-5600, RTX 4090, custom water-cooled loop (2x 430mm rads, CPU and GPU blocks), 1200W PSU, pump @ 2850 (max temps: 75°C GPU, 85°C CPU at full load).
120/144/165 Hz, FPS locked to refresh, in 2K and 4K.
Since the temps are below the thermal-throttle marks, there is zero throttling going on. That's what I initially thought was causing the FPS dips and CPU offload, but after intensive testing, it turned out not to be the case.
Both PCs are de-crapified W10 Pro 22hx (updated), Nvidia 517.48 (newer drivers are causing issues with NW). Both have a Samsung 500 GB (C:) and a Samsung 2 TB (D:) M.2 Pro SSD, fresh Windows installs; the only things running in the background are drivers, basic Windows services, and Steam (for NW) (avg 3 GB used after boot)… This also tells me the OS and the condition of the services are not the issue either…
Both PCs have the Nvidia drivers & Control Panel and Windows tuned for max performance…
Update: both PCs had the Nvidia 526.98 driver installed to run Warzone 2.0, and the game ran flawlessly at a 144 FPS cap matching the monitors' refresh, in 4K at ultra settings; zero dips or issues… My PC sat at 60-78% GPU and 4-10% CPU the entire time playing… But this driver was causing a ton of issues with New World, so I DDU'd the PC and installed 517.40 again… Craziness…
Thanks for sharing such detailed info! Very informative indeed!
Thanks for the detailed post. What surprises me a bit is that your 4090 works reasonably well with such an old driver.