"Am I The Only One?"

@ 2006/12/15
In case you've missed it: at age 37, I've grown up and no longer need the fastest. I just need the rig that is "fast enough". Do I still desire the latest and greatest? Sure. Do I lust after it? No. Am I stupid enough to spend my extra money on this stuff? No.

Comment from jmke @ 2006/12/28
New game... 46 FPS at 1600x1200
http://www.anandtech.com/video/showdoc.aspx?i=2895&p=1

bottleneck? not the CPU
Comment from jmke @ 2006/12/16
a 20" LCD is not expensive; at its native 1600x1200 resolution, turn the AA/AF levels up to decent settings (4xAA/8xAF) and all new games will be GPU limited, not CPU limited.

this has been discussed over and over and over again: we don't need physics cards to make games better, we need people to use the CPU power to deliver better gameplay, cf. the Source engine, which will get a new particle system that can tax quad cores and actually take advantage of a more powerful CPU.

but until that goes mainstream, any old CPU will do fine: a P4 at ~3GHz is fast enough to play anything released today without a glitch, an A64 at ~2.4GHz the same, a Core 2 at 2.2GHz also; you don't notice any benefit from a faster CPU in FPS/RPG/99% of the games out there;

there are exceptions like RTS games, which do make good use of the extra CPU power, but in those games even an older, previous-gen VGA card is overkill

but hey, I know why it's hard to believe, don't worry; it seems cool & dreamy to think you can make a game no longer GPU limited thanks to this uber-fast VGA card, but it's just not true.

if you play at 1280x1024 you shouldn't be looking at an 8800GTX for gaming anyway; invest in a higher-resolution monitor if you want higher quality, or stick with a cheaper VGA card.

now let's just assume you DO buy that 8800GTX to game on your 19" LCD, and you have an older CPU. yes, games will be CPU limited, in the sense that you can get higher FPS by swapping CPUs. but here's the kicker: with the slower CPU the in-game FPS is already beyond playable levels at 60+ FPS; swapping to the new CPU will increase it to 100+ FPS... woop-ti-doo, talk about redundancy
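
to put rough numbers on that "beyond playable" point, here's a quick back-of-the-envelope sketch; the FPS figures and the 60Hz refresh rate are illustrative assumptions, not measurements from any review:

# rough sketch: why going from 60+ to 100+ FPS barely matters on a typical LCD
# (the numbers below are illustrative assumptions, not benchmark results)

def frame_time_ms(fps):
    # time spent rendering one frame, in milliseconds
    return 1000.0 / fps

slow_cpu_fps = 60   # assumed FPS with the older CPU
fast_cpu_fps = 100  # assumed FPS after the CPU swap
refresh_hz = 60     # typical LCD refresh rate

for label, fps in [("older CPU", slow_cpu_fps), ("new CPU", fast_cpu_fps)]:
    shown = min(fps, refresh_hz)  # the panel can't display more than its refresh rate
    print(f"{label}: {fps} FPS rendered ({frame_time_ms(fps):.1f} ms/frame), "
          f"~{shown} FPS actually shown on a {refresh_hz}Hz LCD")

the extra frames mostly end up as numbers in a benchmark chart rather than anything you actually see on screen.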

but like you say, I could google; I don't need to right now though, since I know where the articles are that prove the G80 is not CPU limited

check the charts below, they come from http://www.firingsquad.com/hardware/...d_cpu_scaling/

1280x1024 Oblivion HDR 4xAA/16xAF, FX-62 vs 3800+, the two ends of the high/low performance scale... difference in-game ~13 FPS... but the 3800+ is already at 90+ FPS...

increase the resolution to 1600x1200 (20/21/22/24" LCDs available for as low as $300)... difference ~5 FPS, both at 80+ FPS...

Call of Duty 1600x1200: difference ~2 FPS
Quake 4 1600x1200: difference ~27 FPS, but the lowest is already at 90+
BF2142 1600x1200: difference ~9 FPS, lowest at 86+

the list goes on... games are not CPU limited; the G80 cannot escape the rule of thumb that when you increase resolution & IQ settings, the bottleneck will be the GPU, not the CPU.
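
if it helps, that rule of thumb boils down to a simple bottleneck model: the frame rate is set by whichever of the CPU or GPU takes longer per frame, and only the GPU's share grows with resolution & IQ. here's a toy sketch of that idea; the per-frame costs are made-up assumptions, not numbers taken from the FiringSquad charts:

# toy bottleneck model: the slowest component sets the frame rate;
# GPU work grows with pixel count, CPU work per frame does not.
# all costs below are made-up assumptions, for illustration only.

def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    mpixels = width * height / 1e6
    gpu_ms = gpu_ms_per_mpixel * mpixels  # GPU cost scales with resolution/IQ
    frame_ms = max(cpu_ms, gpu_ms)        # whichever takes longer limits the frame
    return 1000.0 / frame_ms, ("CPU" if cpu_ms >= gpu_ms else "GPU")

slow_cpu_ms, fast_cpu_ms = 11.0, 8.0  # assumed ms of CPU work per frame
gpu_cost = 6.5                        # assumed ms of GPU work per megapixel

for w, h in [(1280, 1024), (1600, 1200), (2560, 1600)]:
    for name, cpu_ms in [("slow CPU", slow_cpu_ms), ("fast CPU", fast_cpu_ms)]:
        rate, limiter = fps(cpu_ms, gpu_cost, w, h)
        print(f"{w}x{h} + {name}: ~{rate:.0f} FPS ({limiter} limited)")

with made-up numbers like these, the CPU only shows up as the limit at the lowest resolution; bump the pixel count and both CPUs land at the same GPU-limited frame rate, which is the same pattern as the FiringSquad numbers above.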

and then there's Crysis and Alan Wake to think of, two next-gen games for 2007 which will easily stress the G80/R600 even at modest resolutions
Comment from SuAside @ 2006/12/16
if by decent screen you mean one of those uber-widescreen things that resemble a plasma tv more than a pc screen, sure. but when looking at an average 19", i'd doubt it (and yes, a 19" TFT is a decent monitor to me). it's often claimed an 8800GTX is pretty useless unless you're using a screen of 24" or more and an uber-high-end kickass CPU.

just google "nvidia CPU limited" & you'll find ample quotes to that effect from nvidia itself. you'll even find a few nvidia presentations confirming it.

still, those same presentations usually also call for more physics to be calculated on the gfx card or on a separate dedicated card rather than on the CPU, thereby offsetting the claimed CPU limitation.

now, i'm not saying you're wrong jmke (in essence i don't care, because i'll never own an 8800GTX), but it is what nvidia and others have been stating.
Comment from GIBSON @ 2006/12/16
Quote:
Originally Posted by SuAside
3000W mainstream PSUs in 5 years? that's a weeeee bit exaggerated

from an ecological point of view that's simply downright impossible unless we perfect cold fusion in 5 years or find some other comparable power source.

but yes, i do understand what he's talking about.

i went from your average user to enthusiast when i bought my Monster 3DII 12MB graphics card for 13000 BEF back in the old days. today, i no longer look to splurge like that. nor do i wish i could/would.

if you look at reality, we're using bigger & bigger and hotter & hotter graphics cards. but why? nvidia itself claims that their GPUs are now CPU limited (and hence the average Joe simply can't see the difference between the higher-end cards because he can't afford the 1000€ CPUs needed to bleed 'em dry). we have totally insane dual-core CPUs. quad cores are within grasp. yet none of our software is really taking advantage of the possibilities, not even our operating systems. nearly no standard user software has been written with multicore in mind.

yet we keep investing in the unrelenting waves of new hardware. is it just me, or are these waves actually speeding up? everything goes faster, better, hotter and more power-hungry.

i slowly cease to care, how about you?
yup, i agree on that one!
Comment from jmke @ 2006/12/16
Quote:
Originally Posted by SuAside
nvidia itself claims that their GPUs are now CPU limited
sorry, but that's a load of bullsh*t; anyone with a decent monitor can stress the 8800GTX enough to make it the limiting factor, not the CPU. in any decent FPS/RPG game at high detail the bottleneck is still the GPU, not the CPU.
Comment from SuAside @ 2006/12/15
3000W mainstream PSUs in 5 years? that's a weeeee bit exaggerated

from an ecological point of view that's simply downright impossible unless we perfect cold fusion in 5 years or find some other comparable power source.

but yes, i do understand what he's talking about.

i went from your average user to enthusiast when i bought my Monster 3DII 12MB graphics card for 13000 BEF back in the old days. today, i no longer look to splurge like that. nor do i wish i could/would.

if you look at reality, we're using bigger & bigger and hotter & hotter graphics cards. but why? nvidia itself claims that their GPUs are now CPU limited (and hence the average Joe simply can't see the difference between the higher-end cards because he can't afford the 1000€ CPUs needed to bleed 'em dry). we have totally insane dual-core CPUs. quad cores are within grasp. yet none of our software is really taking advantage of the possibilities, not even our operating systems. nearly no standard user software has been written with multicore in mind.

yet we keep investing in the unrelenting waves of new hardware. is it just me, or are these waves actually speeding up? everything goes faster, better, hotter and more power-hungry.

i slowly cease to care, how about you?