Synthetic tests are never a better indicator than real-world benchmarks. And in real benchmarks the octo-core performs really well in multithreaded tasks. As for gaming, in the coming years it will prove to be one of the better deals, because even the new consoles are using an AMD octo-core, so most games will be coded to support more cores. It may not be faster than a lot of 3rd- and 4th-generation Intel CPUs, but it could be better bang for the buck in the long run.
Erm, no. Synthetic tests are where AMD performance is arguably comparable, while real-world benchmarks show it lagging behind, because real workloads don't scale perfectly to as many cores as you want.
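That scaling limit is basically Amdahl's law. A minimal sketch of the idea (the serial fractions below are made-up illustration values, not measurements of any real game):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on speedup when serial_fraction of the work can't be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even a modest serial portion caps what extra cores can buy you:
for serial in (0.2, 0.4):
    for cores in (2, 4, 8):
        print(f"serial={serial:.0%}, cores={cores}: {amdahl_speedup(serial, cores):.2f}x")
# With 40% serial work, 8 cores give only ~2.1x, nowhere near 8x.
```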
For gaming, the CPU is irrelevant nowadays; it's almost never the bottleneck. And no, games won't be coded to support more CPU cores, because:
1. Even using as few as 2-4 cores, games are already hitting the limits of the console's GPU power, which means there is no point in scaling the CPU side upwards, since you wouldn't be able to run it anyway.
2. Developers hate parallel programming. They hate it so much that they've moved from CPU-centric to GPU-centric programming, pushing things like physics onto the GPU. That's because the GPU is still, for the most part, programmed as a single unit, and they had a very big struggle going dual-GPU (SLI, Crossfire), with games not supporting it, etc. So if there is any way to use as few cores as possible, developers will take it. And there is a way: the Intel way of higher IPC.
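For a taste of why developers dread shared-state parallelism, here's a minimal sketch of a classic race condition (plain Python threads, nothing game-specific, just to illustrate the pain):

```python
import threading

counter = 0  # shared state, intentionally unprotected

def work(iterations: int) -> None:
    global counter
    for _ in range(iterations):
        # Read-modify-write is not atomic: the interpreter can switch
        # threads between the read and the write, silently losing updates.
        value = counter
        counter = value + 1

threads = [threading.Thread(target=work, args=(500_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 2,000,000; without a lock it usually prints less.
print(counter)
```

The fix, a lock around the hot path, serializes it again, and that kind of trade-off is exactly why many developers would rather have fewer, faster cores.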
Like I stated previously, the AMD comparison gets worse in the long run due to the power consumption difference: to reach even similar performance, AMD CPUs consume TWICE the power, which makes your power bill soar.
GPU is the most important thing in the PC anyway, even you acknowledge that. So you can get away with a cheaper, slower CPU if that means you can get a better GPU. That's how budget builds are made. Enthusiasts will always go for Intel, of course.
Yep, for gaming you certainly can. Gaming isn't the only industry that needs CPUs, though, just so you know. And even for gaming, AMD is the worse choice, because the game industry is generally bad at multithreading, so fewer, higher-IPC cores are preferable.
That depends on how expensive electricity is where you live. Usually the bill will be around $30 more per year on an AMD chip. That's not a lot on a monthly basis.
Well, if you've got free electricity, sure.
Let's do some math.
Take the i5-3570K, which costs $230 on Newegg. Its full-load power consumption is 77W.
Take the FX-8350, which costs $180 on Newegg. Overclocked to match the chosen i5's performance, it should consume around 150W at full load.
That's a power difference of 73W and a price difference of $50.
Now, say you run your computer for a regular 8-hour-per-day job, so 8 hours of consumption. You also game on weekends, so for simplicity let's assume weekends see 8 hours of consumption too. Obviously these numbers will differ for each person.
73W * 8h = 584Wh per day.
584Wh * 365 = 213,160Wh per year, i.e. 213.16 kWh. Now, I don't know how costly electricity is where you live, but here it costs around 50 cents per kWh (rounding for simplicity). That totals $106.58 per year. Remember, our price difference was $50, which means the AMD CPU becomes the more expensive option as soon as ~6 months in!
Now, that will of course differ based on usage and electricity price, but I think you can grasp the concept now.
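To plug in your own numbers, here's the same arithmetic as a small sketch (the defaults are the figures from the post above; swap in your own rate and usage):

```python
def break_even_months(price_diff: float, watt_diff: float,
                      hours_per_day: float, rate_per_kwh: float) -> float:
    """Months until the extra power cost eats up the purchase savings."""
    kwh_per_year = watt_diff * hours_per_day * 365 / 1000
    extra_cost_per_year = kwh_per_year * rate_per_kwh
    return 12 * price_diff / extra_cost_per_year

# Figures from above: $50 cheaper, 73W more, 8 h/day, $0.50/kWh
print(f"{break_even_months(50, 73, 8, 0.50):.1f} months")  # ~5.6
# At a cheaper rate, say $0.12/kWh, the break-even stretches way out:
print(f"{break_even_months(50, 73, 8, 0.12):.1f} months")  # ~23.5
```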