Intel Core 2 on 45nm: Performance, Overclocking, Power Usage

CPU by piotke @ 2007-10-29

Intel is launching the successor to the popular Conroe CPU. Built on a 45nm manufacturing process, it boasts reduced power consumption and 50% more L2 cache. The first product out the door is a quad core beast dubbed QX9650. We take this new creation through its paces, comparing performance and power consumption, and venturing into overclocking land, where sub-zero cooling is the norm.


Game Benchmarks


Futuremark releases a new synthetic gaming benchmark every few years; in the chart below we've included every 3DMark since 2001SE. None of these benchmarks supported multiple cores until 3DMark06, so results should be close for most of them:

[Chart: 3DMark 2001SE through 3DMark06 results - Madshrimps (c)]


As expected, once you no longer run CPU-specific benchmarks, or tasks that depend on more factors than just raw CPU power, the difference diminishes. Only 3DMark06 shows a noticeable gap between the two processors: its CPU test shows close to 100% performance scaling when the number of cores is doubled, while the overall score goes up by ~12%.
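To see why a near-doubled CPU subscore nudges the overall number up by only ~12%, here is a quick toy calculation in Python. The harmonic weighting and the subscores are assumptions picked for illustration, not Futuremark's actual 3DMark06 formula:

Code:
# Toy model of an overall score that blends a graphics and a CPU subscore.
# The 0.88/0.12 harmonic weighting and the subscores are assumptions chosen
# to roughly reproduce the ~12% jump; NOT Futuremark's published formula.

def overall(graphics, cpu, w_gfx=0.88, w_cpu=0.12):
    """Weighted harmonic mean of the two subscores."""
    return 1.0 / (w_gfx / graphics + w_cpu / cpu)

# Same video card for both CPUs; CPU subscore almost doubles going from
# two to four cores (the ~100% scaling seen in the 3DMark06 CPU test).
dual = overall(graphics=5000, cpu=2500)
quad = overall(graphics=5000, cpu=2500 * 1.95)

print(f"dual core overall: {dual:.0f}")
print(f"quad core overall: {quad:.0f}")
print(f"gain: {100 * (quad / dual - 1):.1f}%")   # ~12%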

Our test setup is equipped with an ATI HD 2900 XT, which can be considered a "gamer" video card with enough pixel-pushing power to run the latest games. To find out whether the extra 2MB of L2 cache and the 45nm manufacturing process yield a noticeable improvement in games, we chose two FPS titles. First up is FEAR, an older game which doesn't support multiple cores:

[Chart: FEAR 1280x960 High Quality results - Madshrimps (c)]


At 1280x960 High Quality the video card is not the bottleneck, as the 170+ average FPS results show, yet the CPU hardly makes a difference here; the extra L2 cache accounts for a difference of only ~1%...

On to a brand new game: Crysis from Crytek is the long-awaited spiritual successor to Far Cry, and it is known to bring the latest and greatest hardware to its knees. We picked up this intriguing tidbit about the game from Shacknews:

Shack: What is the main limiter for Crysis in terms of GPU, CPU, or RAM? If users are near the low end of the requirements, which should they upgrade first?
Cevat Yerli: We would say first CPU, then GPU, then memory. But it must be in balance. If you are balanced, we are more CPU bound then GPU, but at the same time at higher CPU configurations we scale very well for GPUs.

Shack: Is there dedicated support for 64-bit and dual- and quad-core processors, and if so how does the game distribute its tasks? Do you suggest a higher-clocked dual-core over a quad-core, or is quad-core performance enough to give it the edge?
Cevat Yerli: We support both 64-bit and multi-cores. Multi-core will be beneficial in the experience, particularly in faster but also smoother framerates. 64-bit and higher memory will yield quicker loading times. We recommend quad core over higher clock.

Crytek claims that you should upgrade your CPU before you buy a new VGA card if you want better performance in Crysis. Also, according to Cevat Yerli, instead of increasing the speed of your Core 2 Duo CPU, you are better off with a lower-clocked quad core processor. Let's put these claims to the test, shall we?
[Chart: Crysis 1280x1024 results - Madshrimps (c)]


At 1280x1024 (a rather modest resolution) we are happy to get close to 20 FPS with the HD 2900 XT. Moving from the Core 2 Duo to a Core 2 Quad CPU with more L2 is an underwhelming experience: we get a boost of ~2%. Worth the extra cost?

For those playing the Crysis demo right now on a Windows XP machine, we found a nice little tweak which allows you to enable the "Very High" quality settings under XP/DX9, making the game look very close to what you get in Vista/DX10.

Let’s forget about multi cores for a moment and compare single core performance ->
Comment from Sidney @ 2007/10/29
Reading other reviews, it would seem the engineering sample tested at [M] requires more vcore than others. A 4GHz quad would now be a common speed; no bragging rights unless you see 5GHz.
Comment from jmke @ 2007/10/29
yes, that seems to be the case, but there's no faulting engineering samples; they are supposed to run without fault at rated speeds with default vcore, which this QX9650 did
Comment from Kougar @ 2007/10/29
2.0v for 4.9GHz? Would have loved to see what that did to the power consumption figures for that CPU!

Regarding the discrepancy with your power chart, I think that has something to do with the physical properties of the chip design. I suspect that at very high frequencies there is a thermal threshold; once it is neared, the leakage increases dramatically. After that the chip quickly reaches the point where it ceases to function, or to function stably, since the increased leakage raises the heat, and the heat only further increases the transistor leakage in a self-reinforcing cycle. I don't have any real proof other than my own experiences with my Q6600...
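A back-of-the-envelope loop makes that runaway idea concrete; every constant below is invented purely for illustration and not measured on any real chip:

Code:
# Toy model of the leakage/heat feedback loop: power heats the die, a hotter
# die leaks more, which adds power, and so on. All constants are made up.

P_DYNAMIC   = 120.0   # W, switching power at the chosen clock/voltage (assumed)
LEAK_20C    = 10.0    # W, leakage at 20 C (assumed)
LEAK_DOUBLE = 11.0    # C, leakage assumed to double every 11 C
T_AMBIENT   = 20.0    # C

def leakage(temp_c):
    return LEAK_20C * 2 ** ((temp_c - 20.0) / LEAK_DOUBLE)

def settle(r_thermal, max_temp=110.0, steps=200):
    """Iterate power -> temperature -> leakage until stable or runaway."""
    temp = T_AMBIENT
    for _ in range(steps):
        power = P_DYNAMIC + leakage(temp)
        new_temp = T_AMBIENT + r_thermal * power
        if new_temp > max_temp:
            return None, None              # thermal runaway
        if abs(new_temp - temp) < 0.01:
            return new_temp, power         # stable equilibrium
        temp = new_temp
    return temp, power

for r in (0.15, 0.30):                     # C/W: better vs worse cooler (assumed)
    temp, power = settle(r)
    if temp is None:
        print(f"{r} C/W cooler: no stable point, temperature runs away")
    else:
        print(f"{r} C/W cooler: settles near {temp:.0f} C at ~{power:.0f} W")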

I am curious: I notice from that CPUZ image the ES QX9650 uses 1.20v at 3GHz. My own Q6600 does the same... so how far can you drop the voltage and have the QX9650 remain stable at 3GHz? I got a Q6600 down to 1.5v, but somewhere below that point my Q6600 will show errors. Gigabyte unfortunately lacks most of the FSB voltage tuning ASUS boards offer, while some members on the XS forums claim to have reached 1.10v at 2.5-2.8GHz speeds for Kentsfields. It would be interesting to see what effect the smaller process size and change in transistor materials have on this for Penryn.
Comment from jmke @ 2007/10/29
Quote:
Originally Posted by Kougar View Post
2.0v for 4.9GHz? Would have loved to see what that did to the power consumption figures for that CPU!
You can see the last chart on this page: http://www.madshrimps.be/?action=get...&articID=636
386W vs 210W stock
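As a quick sanity check on those two readings, here's the usual f*V^2 rule of thumb for switching power, using the 3.0GHz @ 1.20v and 4.9GHz @ 2.0v figures mentioned in this thread; how the wall numbers split between the CPU and the rest of the system is not something we measured:

Code:
# f*V^2 rule of thumb for CPU switching power vs the measured wall power.
# Clocks/voltages are the ones quoted in this thread; the wall figures are
# full-system measurements, so the two ratios are not directly comparable.

f_stock, v_stock = 3.0, 1.20
f_oc,    v_oc    = 4.9, 2.00

cpu_scale  = (f_oc / f_stock) * (v_oc / v_stock) ** 2
wall_scale = 386.0 / 210.0          # measured at the wall, whole system

print(f"CPU switching power should scale by ~{cpu_scale:.1f}x")   # ~4.5x
print(f"wall power actually scaled by ~{wall_scale:.1f}x")        # ~1.8x
# The gap is expected: the wall reading also covers board, RAM, drives and
# PSU losses, which barely change, and the sub-zero cooling cuts leakage.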
Comment from Kougar @ 2007/10/30
Ah, nice! I had completely missed that, staring me in the face. I guess I need to take more power measurements since the data I have on my Q6600 includes my video card...

Why not extend that same chart a bit more to the right though, and undervolt that puppy?
Comment from CFKane @ 2007/10/30
I'm a little surprised that you're talking about a discrepancy in the chart while you're mentioning the changed cooling solution in the same sentence. The die temperature is one of the most important factors for the CPU power consumption and if you switch to a solution which removes the heat more efficiently, you should expect reduced power draw even with a higher clock and voltage.

That's also the reason why the maximum current in the electrical specifications for CPUs significantly exceeds what you would get from dividing the TDP by the core voltage. It's given for the maximum die temperature, which you will (hopefully) never reach in a real world situation.

Bear that in mind when testing or comparing CPU power consumption: the room/case temperature and cooling solution have a major influence, and the die temperature at a given load is an interesting figure to report alongside the power draw (sadly missing in most reviews).
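To put rough numbers on that point, a sketch with placeholder datasheet-style values (these are not the QX9650's actual electrical specs):

Code:
# Why Icc_max in the datasheet exceeds TDP / Vcore: the current limit is
# specified at maximum die temperature, where leakage is at its worst.
# All values below are placeholders, not taken from an Intel datasheet.

tdp      = 130.0    # W, TDP-class figure for a quad core extreme part
vcore    = 1.20     # V, nominal core voltage (assumed)
icc_max  = 130.0    # A, hypothetical datasheet maximum current

naive_amps = tdp / vcore
print(f"TDP / Vcore   = {naive_amps:.0f} A")                    # ~108 A
print(f"spec Icc_max  = {icc_max:.0f} A (rated at max die temp)")
print(f"headroom left for worst-case leakage: ~{icc_max - naive_amps:.0f} A")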
Comment from jmke @ 2007/10/30
Thank you CFKane for your post, and welcome to the forums.
Since the temperature was the only large difference between the two settings, we were not doubting that it was indeed the lower temperature causing the lower power consumption; but I hadn't seen any article on the web discussing this aspect of power consumption... hence we were a bit hesitant to include that statement.
Comment from jmke @ 2007/12/07
we also got word back from Intel explaining the power usage at different temperatures:

Quote:
Matty @ Intel:

Yes, the power consumption is reduced when the temperature of the processor is lowered.

There are many things that happen in a CPU when the temperature is changed and to elaborate further on the processor specific causes we have to look at the origin of the power consumption. We can divide the total consumed power into two main parts, static power (Ps) and dynamic power (Pd).

The static power consumption is what we usually call the leakage. In an ideal transistor, it should completely shut off the channel between the source-drain, gate-source and gate-drain. Transistors are far from ideal, and the current leaks between these parts and the substrate of the processor, and this is heavily dependent on the temperature.
For example, going from room temperature to 85C (~60C difference) increases the leakage power by a factor of more than 50. Thus, reducing the temperature with the same amount will make a huge impact on Ps.

Dynamic power consumption is emitted during the short amount of time that the transistor switches. Lower temperature reduces the resistance in the processor which results in shorter delay/faster switching of the transistors. Shorter delays and less noisy signals also reduce Pd.

I hope this explanation give you some clarity to the relation between power consumption and temperature. This can even be seen with air cooling: The power consumption is lower just after a load is applied compared to after a while when the temperature has levelled out, even though the load is the same.
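To put those two components into numbers, here's a minimal sketch; the factor-of-50 per 60 C figure comes from the quote above, while the baseline leakage, effective capacitance, voltage and clock are assumed purely for illustration:

Code:
# Static (leakage) + dynamic (switching) power vs die temperature.
# The 50x-per-60C leakage growth is the figure quoted above; the baseline
# leakage, capacitance, voltage and clock are assumed for illustration.

P_STATIC_25C = 0.5          # W of leakage at 25 C (assumed)
C_EFF        = 25e-9        # F, effective switched capacitance (assumed)
VCORE        = 1.2          # V (assumed)
FREQ         = 3.0e9        # Hz

def static_power(temp_c):
    # Factor-of-50 growth per 60 C rise, per the quoted figure.
    return P_STATIC_25C * 50 ** ((temp_c - 25.0) / 60.0)

def dynamic_power(vcore=VCORE, freq=FREQ, activity=1.0):
    # Classic alpha * C * V^2 * f; held temperature-independent for simplicity.
    return activity * C_EFF * vcore ** 2 * freq

for t in (-40, 0, 25, 55, 85):
    ps, pd = static_power(t), dynamic_power()
    print(f"{t:>4} C: static {ps:6.1f} W + dynamic {pd:5.1f} W = {ps + pd:6.1f} W")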

 
