Killing Floor 2?
ฅ(=^・ェ・^=)ฅ Vintage_Neko @vampire_neko
commented on
Killing Floor 2?
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/1.html
They trade punches depending on the game and resolution (GTX 970 SLI vs. R9 295X2). I might consider it.
Deleted User @__removed_2febdcff2cGILeMdar
This account has been suspended.
ฅ(=^・ェ・^=)ฅ Vintage_Neko @vampire_neko
I will do 4K eventually; everything is moving that way anyway, whenever I can afford a 4K monitor. For under $1000, best performance for the price (depending on the game and resolution) is, in rough order:
1. R9 295X2 $650
2. R9 290X CrossFire $600
about equal to:
3. GTX 970 SLI $660
4. R9 290 CrossFire $540
5. GTX 960 (4GB version) SLI $480
(cheaper and about equal to):
6. GTX 980 single $550
But GPU performance varies a lot by game! So look at the games you want to play before making your GPU decision.
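For reference, the same six setups from the list above can be re-sorted by price rather than performance (a quick sketch using only the prices as posted; no benchmark numbers are assumed):

```python
# The price list above as data (prices in USD, as posted).
configs = [
    ("R9 295X2", 650),
    ("R9 290X CrossFire", 600),
    ("GTX 970 SLI", 660),
    ("R9 290 CrossFire", 540),
    ("GTX 960 4GB SLI", 480),
    ("GTX 980 single", 550),
]

# Sort cheapest-first to see the budget ordering.
for name, price in sorted(configs, key=lambda c: c[1]):
    print(f"${price}: {name}")
```

Cheapest-first this puts the GTX 960 4GB SLI setup at the top and GTX 970 SLI at the bottom, which is almost the reverse of the performance ranking.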
ฅ(=^・ェ・^=)ฅ Vintage_Neko @vampire_neko
You might find this interesting too: gaming at 4K with 980 SLI, AMD vs. Intel, with AMD often beating the much more expensive Intel setup:
http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html
Deleted User @__removed_2febdcff2cGILeMdar
This account has been suspended.
Old_Reaper @elder_reaper
AMD is not as good for servers, actually. AMD is better when you're budget-minded and focused on clock-centric, single-role systems. Overclocking is useless though: your cache-to-system-clock timings desynchronize, and eventually the CPU cache simply can't function as effectively, corrupting data, reducing system stability, increasing thermal footprint and power consumption, etc.
Intel is better for virtualization, multi-role systems, and parallel computing.
Deleted User @__removed_2febdcff2cGILeMdar
This account has been suspended.
Old_Reaper @elder_reaper
For single-role, yes. Servers these days are typically jacks of all trades. You misremembered.
ฅ(=^・ェ・^=)ฅ Vintage_Neko @vampire_neko
"Overclocking is useless though as your cache to system clock timings desynchronize and eventually the CPU cache can simply not function as effectively - corrupting data, reducing system stability, increasing thermal footprint and power consumption etc."
I'm curious about this, can you link any online articles?
Old_Reaper @elder_reaper
> I'm curious about this, can you link any online articles?
Not really, this is something I learned from electrical engineering. Your computer uses several clocks, and many of them run at fixed ratios to each other: if the CPU has a base clock of 800MHz, it normally runs only at multiples of that, such as 1.6GHz, 2.4GHz, 3.2GHz, etc.

The cache, however, has a much narrower range. Think of the cache as a small quantity of very fast RAM; e.g. my HP C8000's PA-8800 has 32MB of L2 cache shared between its 900MHz dual-core CPU. That cache is designed to operate only within a certain set of clocks, often narrower than the CPU's own range. On your laptop, for example, CPU speed may vary between whatever is official and a multiple down of your system's bus clock. If your cache expects a CPU clock between 1.6GHz and 2.4GHz but gets 3.2GHz, it's going to increase something called the 'miss rate', which is basically a failed read of the cache, requiring the CPU to reload the data from RAM into the cache, which wastes cycles. If your RAM isn't ECC and feeds bad data back to your disk, you get corrupted files.

So yeah, don't do it. Intel's Turbo mode is fine, but actually going in and overclocking to insane levels carries stupid risks, not to mention it shortens CPU life.
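The cycle-wasting effect of a higher miss rate can be sketched with the standard average-memory-access-time (AMAT) formula. The hit time, miss penalty, and miss rates below are made-up illustration numbers, not measurements from any real CPU:

```python
# Sketch: average memory access time (AMAT) in CPU cycles.
# AMAT = hit_time + miss_rate * miss_penalty
def amat(hit_time_cycles, miss_rate, miss_penalty_cycles):
    """Average cycles per memory access given a cache miss rate."""
    return hit_time_cycles + miss_rate * miss_penalty_cycles

# Assumed: a 4-cycle cache hit and a 200-cycle reload from RAM.
in_spec = amat(4, 0.02, 200)      # 2% miss rate  -> 8.0 cycles
out_of_spec = amat(4, 0.10, 200)  # 10% miss rate -> 24.0 cycles

print(in_spec, out_of_spec)  # prints: 8.0 24.0
```

With these assumed numbers, a miss rate jumping from 2% to 10% triples the average cost of a memory access, which is the "wasted cycles" point above in miniature.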
As for clock speed itself, it's mathematically impossible for two CPUs, one running at 3.2GHz and one at 2.4GHz, to consume the same amount of power at the same workload. The 3.2GHz version can dissipate up to 50% more power than the 2.4GHz one: if a CPU dissipates 100W at 2.4GHz, it could consume up to 150W at 3.2GHz, though typically less. Higher power consumption translates to a bigger thermal footprint.
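The frequency-power relationship above can be illustrated with the textbook dynamic-power model P = C·V²·f. The capacitance and voltage values below are arbitrary placeholders, not specs for any real chip:

```python
# Sketch: idealized dynamic power model P = C * V^2 * f.
# C (switched capacitance) and V (core voltage) are assumed values.
def dynamic_power(capacitance, voltage, freq_hz):
    """Dynamic power in watts under the C*V^2*f model."""
    return capacitance * voltage**2 * freq_hz

base = dynamic_power(1e-9, 1.2, 2.4e9)  # 2.4 GHz at 1.2 V
fast = dynamic_power(1e-9, 1.2, 3.2e9)  # 3.2 GHz, same voltage

# Frequency alone scales power linearly: 3.2/2.4 = ~1.33x.
print(fast / base)

# Higher clocks usually also need more voltage, and the V^2 term
# widens the gap further:
fast_v = dynamic_power(1e-9, 1.3, 3.2e9)
print(fast_v / base)
```

Under this model, frequency alone accounts for about a 33% increase from 2.4GHz to 3.2GHz; the voltage bump needed to hold the higher clock stable is what can push the total toward the 50% figure mentioned above.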