xiantom, on Oct 26 2012 - 00:29, said:
It depends on what kind of bottleneck you have. Most people have GPU bottlenecks, so multi-core support doesn't help them much, although it does help a little.
Qumefox, on Oct 26 2012 - 00:34, said:
Uh.. No. This will only be the case if whatever encoder you're using is single-threaded. Handbrake on my machine, for instance, has zero problems eating every core I have. ffSplit will also use all four when I'm streaming.
A multi-core CPU will yield no benefit over a single core when running single-threaded applications, beyond the ability to run other things at the same time like you said. But for applications that are multi-threaded, multiple cores help A LOT.
Personally, WoT doesn't even use 100% of a single core on my machine... though it doesn't use 100% of the GPU either, so I'm not really sure what my bottleneck is. FPS for me ranges from 25 to 70 depending on map and location.
packet_loss, on Oct 26 2012 - 00:47, said:
I suspect most of you don't know what you're talking about and are only inferring things from experience with single- and multi-threaded applications.
Whether or not multi-threading will improve performance for WoT is entirely dependent on the implementation of BigWorld's render engine.
Classical multi-threading in games would not improve render performance unless the GPU was starved by the CPU; that approach uses extra threads primarily for things like loading resources. DirectX 11 changes this a bit and makes multi-threading more useful; if you want to read about it, look here: http://msdn.microsof...1(v=vs.85).aspx
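Roughly, that "classical" setup looks like the sketch below: every draw call still comes from one render thread, and the extra core only streams assets in the background. This is a generic illustration with invented names, not BigWorld's actual code:

```cpp
// "Classical" game multi-threading: the render thread issues every draw call;
// a worker thread only streams assets so the render loop never blocks on disk.
#include <atomic>
#include <chrono>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

std::queue<std::string> g_loadQueue;   // assets waiting to be loaded
std::mutex              g_queueMutex;
std::atomic<bool>       g_running{true};

void assetLoaderThread() {
    while (g_running) {
        std::string path;
        {
            std::lock_guard<std::mutex> lock(g_queueMutex);
            if (!g_loadQueue.empty()) {
                path = g_loadQueue.front();
                g_loadQueue.pop();
            }
        }
        if (path.empty()) {
            std::this_thread::sleep_for(std::chrono::milliseconds(5));
            continue;
        }
        // loadAssetFromDisk(path);  // slow disk I/O happens off the render thread
    }
}

int main() {
    std::thread loader(assetLoaderThread);
    for (int frame = 0; frame < 1000; ++frame) {
        // updateSimulation();
        // submitDrawCalls();  // still single-threaded: this is the part that
        //                     // DX11 deferred contexts let you split across cores
    }
    g_running = false;
    loader.join();
    return 0;
}
```

Notice the render loop itself is untouched; that is why this kind of threading smooths out loading stutter but does not raise FPS by itself.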
However, from what I can tell, BigWorld does not support DirectX 11 (or at least it didn't prior to WG buying it).
What this means is that without a major overhaul of the render engine, multi-threading will not make any difference if your GPU is already saturated. Even if you are CPU-bound, if the engine is inefficient in the way it sends data to the GPU, there will be no increase in FPS simply from multi-threading.
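A toy example of that submission-overhead point (these functions are stand-ins, not real D3D or BigWorld APIs): every draw call costs the CPU a fixed amount, so cutting the number of calls helps in a way that adding threads would not.

```cpp
#include <vector>

struct Mesh  { int material; };
struct Batch { int material; int meshCount; };

void setMaterial(int)        { /* driver state change: fixed CPU cost */ }
void drawMesh(const Mesh&)   { /* one draw call per object: fixed CPU cost */ }
void drawBatch(const Batch&) { /* one draw call covering many objects */ }

// CPU-heavy no matter how fast the GPU is: thousands of calls per frame.
void renderNaive(const std::vector<Mesh>& scene) {
    for (const Mesh& m : scene) { setMaterial(m.material); drawMesh(m); }
}

// Same scene, far fewer calls: sorted by material and merged into batches.
void renderBatched(const std::vector<Batch>& batches) {
    for (const Batch& b : batches) { setMaterial(b.material); drawBatch(b); }
}

int main() {
    std::vector<Mesh> scene(10000, Mesh{1});
    renderNaive(scene);                      // ~20,000 API calls of CPU work
    std::vector<Batch> batches{ {1, 10000} };
    renderBatched(batches);                  // 2 calls for the same content
    return 0;
}
```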
Rather than blindly asking for multi-threading without understanding how it impacts the game, you should just wait patiently and hope that WG does a lot of work (perhaps including multi-threading) to improve performance.
packet_loss, on Oct 26 2012 - 01:02, said:
That may or may not come to fruition... I have a quad-core CPU and a mid-to-high-end GPU, and I still do not get even 60 FPS on medium-ish settings.
There are much bigger fish to fry than multi-threading.
Gryphon_, on Oct 29 2012 - 06:25, said:
If the game were rewritten for multiple threads, then those with multiple cores would see some benefit. I agree that the GPU is doing the most work, but the CPU has to prepare each frame for the GPU to render. On top of that, there is also physics, and management of the two-way datastream to and from the server. Putting either physics or the datastream on a separate thread should pay significant dividends (a rough sketch of that idea follows below).
Nevertheless, rewriting the code so that it's multi-threaded is a lot of work, as making code thread-safe isn't easy.
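A minimal sketch of that "datastream on a separate thread" idea: a network thread receives server updates and hands them to the game thread through a mutex-guarded queue. The types and function names are invented for illustration, and the locking is exactly the thread-safety work that makes retrofitting hard:

```cpp
#include <atomic>
#include <chrono>
#include <mutex>
#include <queue>
#include <thread>

struct ServerUpdate { int tankId; float x, y, heading; };

std::queue<ServerUpdate> g_inbox;      // updates waiting for the game thread
std::mutex               g_inboxMutex;
std::atomic<bool>        g_running{true};

void networkThread() {
    while (g_running) {
        ServerUpdate u{};  // receiveFromServer(&u);  // blocking socket read in a real client
        {
            std::lock_guard<std::mutex> lock(g_inboxMutex);
            g_inbox.push(u);
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(100));  // stand-in for the server tick
    }
}

void gameLoopIteration() {
    std::queue<ServerUpdate> batch;
    {
        std::lock_guard<std::mutex> lock(g_inboxMutex);
        std::swap(batch, g_inbox);     // grab everything; hold the lock only briefly
    }
    while (!batch.empty()) {
        // applyUpdate(batch.front()); // move tanks, register hits, etc.
        batch.pop();
    }
    // simulatePhysics(); renderFrame();
}

int main() {
    std::thread net(networkThread);
    for (int i = 0; i < 100; ++i) gameLoopIteration();
    g_running = false;
    net.join();
    return 0;
}
```

The payoff is that a slow or bursty connection never stalls the render loop; the cost is that every piece of data the two threads share needs this kind of guarding.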
Mohoao, on Oct 29 2012 - 07:18, said:
You probably need a new video card more than a threaded game engine.
Packet_loss and I had this discussion before in another thread. A threaded game engine is only really beneficial when the game resides ONLY on your PC; when it is a "thin client" like WoT, most of the actual processing is done on the server, not your machine.
My recent experience suggests that there may be more people that are CPU-bound than we might think. I play on an old desktop with an Nvidia GT 430 graphics card (a very modest GPU by today's standards) and 2 GB RAM. However, until recently, it had a Pentium E2140 CPU (dual-core, 1.6 GHz, 800 MHz FSB, 1 MB L2 cache) - a severely underpowered CPU for running WoT. I could get 20-25 fps with every graphics setting turned to "Low" or "Off" - playable, but not a great visual experience. The core running WoT stayed pegged at 100% and the CPU ran HOT (like put-an-ice-pack-on-top-of-the-case hot) while the graphics card mostly idled along.
A couple weeks ago, I replaced the Pentium with a Core 2 Duo E6600 (dual-core, 2.4 GHz, 1066 MHz FSB, and 4 MB L2 cache), essentially swapping 6-year-old technology for slightly better 6-year-old technology (it's the best the motherboard could handle). My setup now exceeds the minimum specifications for WoT by just a little bit. Now I can play with graphics settings on "Low" to "Medium" (and draw distance and LOD at max) and get 30-35 fps. The game experience is much more interesting - I can see muzzle flashes, explosions and smoke, and terrain detail that I had to forego before. However, the core running WoT STILL stays pegged at 100% (but doesn't run nearly as hot, although a good cleaning of the cooler fins and fans probably has something to do with that) while the graphics card barely breaks a sweat. I'm definitely still CPU-bound, and a $1,000 graphics card wouldn't do a thing to boost my frame rate until I get a CPU that can handle the load without maxing out.
I'd be curious to hear from people who have older and/or slower CPUs to find out what percentage of CPU cycles WoT eats up on various graphics settings. I'm wondering about the break point where the processors become powerful and fast enough to run WoT on medium-high graphics settings without pegging the needle at 100%. I hope it won't be too much longer before I can do a more substantial upgrade, but even then, I won't be able to spend enough to get me into Core i7 land. I'd like to know how far I have to go to run WoT comfortably.
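If anyone wants to put hard numbers on this, below is a quick Windows-only monitor I sketched out (rough and unpolished - mine, not WG's). Feed it WoT's process ID from Task Manager and it prints how much of one core, and of all cores, the game is using each second:

```cpp
// cpumon.cpp - print a process's CPU usage once per second (Windows).
// Build with: cl cpumon.cpp    Run with: cpumon <pid>
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

static ULONGLONG toTicks(const FILETIME& ft) {
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;                 // 100-nanosecond units
}

int main(int argc, char** argv) {
    if (argc < 2) { printf("usage: cpumon <pid>\n"); return 1; }
    HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, (DWORD)atoi(argv[1]));
    if (!proc) { printf("could not open process\n"); return 1; }

    SYSTEM_INFO si;
    GetSystemInfo(&si);                // gives us the logical core count

    FILETIME creationT, exitT, kPrev, uPrev, kNow, uNow;
    GetProcessTimes(proc, &creationT, &exitT, &kPrev, &uPrev);
    ULONGLONG wallPrev = GetTickCount64();

    for (;;) {
        Sleep(1000);
        GetProcessTimes(proc, &creationT, &exitT, &kNow, &uNow);
        ULONGLONG wallNow = GetTickCount64();

        // CPU time consumed this interval, converted from 100 ns units to ms
        double cpuMs  = (double)((toTicks(kNow) - toTicks(kPrev)) +
                                 (toTicks(uNow) - toTicks(uPrev))) / 10000.0;
        double wallMs = (double)(wallNow - wallPrev);
        double oneCore = 100.0 * cpuMs / wallMs;   // 100% = one core saturated

        printf("%5.1f%% of one core, %5.1f%% of all %lu cores\n",
               oneCore, oneCore / si.dwNumberOfProcessors, si.dwNumberOfProcessors);

        kPrev = kNow; uPrev = uNow; wallPrev = wallNow;
    }
}
```

A reading that sits near 100% of one core is the "pegged" situation I described above - extra cores and a faster graphics card won't help until that number comes down.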
Edited by snobot, Oct 29 2012 - 20:01.