Geforce GTX280 & Radeon HD4870X2 AA Scaling with XP & Vista
4th September 2008, 14:54 | #1 | jmke (Madshrimp)

In this in-depth article we look at the performance of the NVIDIA GeForce GTX 280 and ATI Radeon HD 4870 X2 with anti-aliasing enabled. We test eight games at several AA levels under both Windows XP and Windows Vista. How does performance scale when you move from XP to Vista, and how much impact does enabling AA have? Read on to find out!

http://www.madshrimps.be/gotoartik.php?articID=869
6th September 2008, 16:48 | #2 | Kaotik

You forgot to mention that:
- 8xAA on ATI should be compared to 8xQAA on NVIDIA, not to 8xAA, which is a 4xMSAA-based CSAA mode;
- 16xAA on ATI effectively turns the card into a single-chip card that can do 16xMSAA, since both chips render the same frame with different AA sample patterns;
- 16xAA on NVIDIA is a 4xMSAA-based CSAA mode, and 16xQAA is an 8xMSAA-based CSAA mode.

So the 16x and 8x comparisons in your graphs are far from 'fair' or 'apples-to-apples': at 8xAA you should pit ATI 8xAA against NVIDIA 8xQAA (8xQAA = 8xMSAA), and the 16xAA comparison shouldn't exist at all, since the GTX 280 can't do 16xMSAA, which is (practically) what the HD 4870 X2 does by blending the same frame rendered twice with different AA patterns.
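
To lay the modes out side by side: this is a summary of the above as I understand the hardware (the coverage-sample counts are my reading of the vendors' documentation, so worth double-checking):

Code:
Control panel setting     Colour/Z samples   Coverage samples
NVIDIA 8xAA (CSAA)        4                  8
NVIDIA 8xQAA (MSAA)       8                  8
NVIDIA 16xAA (CSAA)       4                  16
NVIDIA 16xQAA (CSAA)      8                  16
ATI 8xAA (MSAA)           8                  8
ATI 16xAA (X2 Super-AA)   2 x 8, blended     -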
7th September 2008, 00:04 | #3 | jmke (Madshrimp)

Thank you for your input, much appreciated. 16xAA on the ATI does indeed cut performance roughly in half: a really heavy hit, but ultimate image quality.

While the performance of the two cards at different AA levels can be compared head-to-head in the charts, their scaling going from XP to Vista was definitely my main focus.

I don't know if you are the same person who emailed me about this issue, but I agree that NVIDIA and ATI each take their own approach to the AA levels defined in their control panels, which makes comparing them less than straightforward.

On the topic of AA levels, I was hard pressed to find a major improvement on NVIDIA going from 4xAA to 16xQAA (through the NV control panel); the same goes for ATI, where going from 4xAA to 8xAA did little to visibly improve image quality. To spot the differences you have to compare screenshots of static scenes side by side and zoom in on them.

On the upside, with ATI we finally get almost free 4xAA under Vista, which will be a key feature for full DX11 compliance if I remember correctly.

Again, thank you for your input and the educational message!


-----

I received the following mail regarding Crysis performance with the HD 4870 X2 (and CrossFire):

Quote:
Originally Posted by Felipe by email
I saw your article about scaling with XP & Vista, comparing two of the fastest cards on the market, and I just want to tell you that there is a list of CVars in Crysis that increases FPS with a CrossFire (CF) setup; this worked with the 3870 X2 and the 4870 X2, running the 32-bit executable.

I can say that Crysis is somehow bottlenecking CF systems.

One of the CVars is r_TexturesStreaming.

Default settings:
Run #1 - DX9 1280x1024 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 46.97
Run #2 - DX9 1280x1024 AA=4x, 32 bit test, Quality: High ~~ Overall Average FPS: 37.12
Run #3 - DX9 1280x1024 AA=8x, 32 bit test, Quality: High ~~ Overall Average FPS: 36.33

With texture streaming off (r_TexturesStreaming = 0):
Run #1 - DX9 1280x1024 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 56.11
Run #2 - DX9 1280x1024 AA=4x, 32 bit test, Quality: High ~~ Overall Average FPS: 42.41
Run #3 - DX9 1280x1024 AA=8x, 32 bit test, Quality: High ~~ Overall Average FPS: 43.00

And using this combination (values based on my system):
sys_budget_numdrawcalls = 6000
sys_budget_videomem = 512
sys_budget_sysmem = 6144
sys_budget_frametime = 80
r_TexturesStreaming = 0

Using only texture streaming off:
Run #1 - DX9 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 46.35
Using texture streaming off plus the budget config:
Run #1 - DX9 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 48.69

This has the same effect on the 4870 X2 and the 4870 series; for some reason it doesn't help NVIDIA cards.
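
For readers who want to try this themselves: a minimal sketch of how the CVars from the mail could go into an autoexec.cfg (assuming the standard mechanism where Crysis reads that file from its install folder at startup). The budget values are Felipe's, tuned for his system, so treat them as a starting point rather than gospel; untested on our end:

Code:
-- autoexec.cfg: CrossFire tweaks from Felipe's mail
r_TexturesStreaming = 0         -- disable texture streaming
sys_budget_numdrawcalls = 6000  -- per-frame draw-call budget
sys_budget_videomem = 512      -- video memory budget
sys_budget_sysmem = 6144       -- system memory budget
sys_budget_frametime = 80      -- frame-time budget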
7th September 2008, 01:33 | #4 | Kaotik

Nah, I'm not the same person; as far as I know it was another Beyond3D user who mailed you.

I just thought those differences should be mentioned, even though the comparison between Vista & XP was the main concern of the great article.

Regarding the 'free 4xAA' on the HD4870X2 you mentioned: I think you're mixing it up with the DX10.1 requirement of 4xAA, which has nothing to do with performance; you simply have to support the mode to be compliant (as far as I know, DX10.1 was the first DirectX version to actually require support for a specific AA mode).
7th September 2008, 12:53 | #5 | jmke (Madshrimp)

I've added an addendum to the article regarding the difference in AA levels between ATI and NVIDIA.

I thought DX10.1 (and DX11) allowed 4xAA without a performance hit, much like Assassin's Creed on the ATI cards before they released the patch.
7th September 2008, 19:00 | #6 | Kaotik

Quote:
Originally Posted by jmke
I've added an addendum to the article regarding the difference in AA levels between ATI and NVIDIA.

I thought DX10.1 (and DX11) allowed 4xAA without a performance hit, much like Assassin's Creed on the ATI cards before they released the patch.
Nope, there's really no such thing as 'free AA', even though the performance drops are relatively small today. DX10.1 (or 11) changes nothing in this regard; 10.1 just introduced the requirement to support 4xMSAA in order to be 10.1 compliant, and gives better access to the individual AA samples (no idea how to describe it better, really).

In the case of Assassin's Creed, they took advantage of the fact that DX10.1 lets them access the AA samples later in the rendering process as well: with DX10 they had to do two rendering passes for some effects, one with and one without AA, while with DX10.1 they could simply re-use the buffers from the AA rendering pass, dropping one rendering pass completely.
