Geforce GTX280 & Radeon HD4870X2 AA Scaling with XP & Vista

Videocards/VGA Reviews by jmke @ 2008-09-04

In this in-depth article we take a look at the performance of the NVIDIA Geforce GTX 280 and ATI Radeon HD 4870 X2 when anti-aliasing is enabled. We test 8 different games at several AA levels under Windows XP as well as under Windows Vista. How does performance scale when you go from XP to Vista, and how much impact does enabling AA have? Read on to find out!


Introduction & Test Setup

Introduction

In this article we'll take a closer look at the performance of ATI's latest high-end card compared to NVIDIA's top card. Both cards offer plenty of headroom in the latest games. In our first review of the ATI HD 4870 X2 vs NVIDIA Geforce GTX 280 we found that these products are not worth the investment unless you own a high-end CPU and a high-resolution monitor.

If your gaming setup is up to the challenge you'll find this review interesting, as we'll be using a multitude of anti-aliasing settings to see how each card handles the extra rendering load. The HD 4870 X2 can access its 2GB of onboard GDDR5, which should give it an edge once the resolution and AA levels are increased. By how much, you'll find out on the following pages.

The second factor affecting performance we wanted to investigate was the OS. Our previous review was done with Windows XP SP3. While the majority of users out there are still on XP, those into gaming and multi-GPU high-end configurations are more likely to use Vista, and, to be able to use more than 3GB of system memory, 64-bit Vista.

So we’ll investigate AA performance in XP SP3 (32-bit) and Vista SP1 (64-bit).



Which OS will offer the best gaming performance?

Test Setup & Benchmarks

We built our test setup with the help of Tones.be (Belgium's largest hardware shop!), who helped us with the hard drives, CPUs and monitors, MSI for the motherboards, OCZ for the memory, Coolermaster for the cases and power supplies and, last but not least, Scythe for the silent CPU coolers.

We would like to thank Sapphire for providing the HD 4870 X2 for testing and Leadtek for their Winfast GTX 280. Without their support this article would not have been possible.



Intel Test Setup
CPU: Intel Core 2 E8200 @ 3.52GHz
Cooling: Scythe Ninja 2
Mainboard: MSI P45 Platinum
Memory: 2 * OCZ 1GB PC2-6400 & 2 * OCZ 2GB PC2-8500
Other:
  • Coolermaster CM690 Enclosure (3*120mm case fans)
  • Coolermaster UCP 900W Power Supply
  • Western Digital 80GB HDD (system)
  • Samsung 640GB HDD (data)


  • At the time of writing, the system we built would cost you approximately €1200 without the VGA card. While it's not a budget system, it's also far from high end, as we're using a DDR2 motherboard and a mid-range Wolfdale CPU. Combining it with a €300+ VGA card does place it in the more expensive bracket when it comes to building a gaming machine.

    One of the bigger costs of any system is the monitor; the system price mentioned above includes the screen, a Samsung SyncMaster 2493HM 24-inch with a native resolution of 1920x1200 and a reasonably low 5ms response time. Again, this screen is mid-range, as more expensive models are available, but most 26"~27" screens stick to the same 1920x1200 resolution. You need to step up to a 30" model to go higher, to 2560x1600, at which point you will be spending a pretty hefty sum.

    Software config:

  • OS: Windows XP SP3 (32-bit) and Windows Vista SP1 (64-bit)
  • NVIDIA Drivers: Forceware 177.41
  • ATI Drivers: Catalyst 8.8 (8.52.2)


  • These are the games we tested:

  • Devil May Cry 4
  • Unreal Tournament 3
  • The Elder Scrolls IV: Oblivion
  • S.T.A.L.K.E.R.
  • Trackmania Nations
  • Tomb Raider Legend
  • Mass Effect
  • Crysis


  • All tests were done at 1920x1200; the test setup had 2GB of RAM unless otherwise noted.
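
As a side note on how the two questions from the introduction can be expressed as numbers (the FPS hit of enabling AA, and the XP-to-Vista change at the same setting), here is a minimal sketch of the arithmetic; the function names and sample FPS values below are illustrative only and are not taken from the benchmark charts on the following pages.

# scaling_sketch.py - illustrative only; sample numbers are not from the charts
def aa_hit(fps_no_aa, fps_with_aa):
    """Percentage of performance lost when AA is enabled."""
    return (1 - fps_with_aa / fps_no_aa) * 100

def os_scaling(fps_xp, fps_vista):
    """Percentage change going from XP to Vista (negative = slower in Vista)."""
    return (fps_vista / fps_xp - 1) * 100

if __name__ == "__main__":
    # hypothetical averages for one game at 1920x1200
    xp_no_aa, xp_4x = 60.0, 48.0
    vista_no_aa, vista_4x = 57.0, 47.0
    print("4xAA hit under XP:    %.1f%%" % aa_hit(xp_no_aa, xp_4x))
    print("4xAA hit under Vista: %.1f%%" % aa_hit(vista_no_aa, vista_4x))
    print("XP -> Vista at 4xAA:  %+.1f%%" % os_scaling(xp_4x, vista_4x))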
    Comment from Kaotik @ 2008/09/06
    You forgot to mention that
    ...8xAA on ATI should be compared to 8xQAA on nV, not the 8xAA which is a 4xMSAA-based CSAA mode
    ...16xAA on ATI effectively turns the card into a single-chip card which can do 16xMSAA, since both chips render the same frame with different AA patterns
    ...16xAA on nV is a 4xMSAA-based CSAA mode and 16xQAA on nV is an 8xMSAA-based CSAA mode

    So the 16x and 8x comparisons in your graphs are far from being 'fair' or 'apples-to-apples': the 8xAA comparison should pit ATI's 8xAA against nV's 8xQAA (8xQAA = 8xMSAA), and the 16xAA comparison shouldn't even exist, since the GTX280 can't do 16xMSAA, which is (practically) what the HD4870X2 is doing by blending the same frame rendered twice with different AA patterns.
    Comment from jmke @ 2008/09/07
    Thank you for your input, much appreciated; 16xAA on the ATI does indeed split performance in two, a really heavy hit, but it delivers the ultimate image quality.

    While the performance of the two cards at different AA levels can be compared head-to-head in the charts, their scaling going from XP to Vista was definitely my main focus.

    I don't know if you were the same person who emailed me about this same issue, but I agree with you that NVIDIA and ATI each have their own approach to the AA levels defined in their control panel, making it less than straightforward.

    On the topic of AA levels, I was hard pressed to find a major improvement on NVIDIA going from 4xAA to 16xQAA (through the NV control panel); the same goes for ATI, where 4xAA or 8xAA did little to improve image quality visually, meaning that to be able to spot the difference you'll have to do screenshot comparisons of static scenes and zoom in to find them.

    On the upside, we finally almost get free 4xAA under Vista with ATI, which will be a key feature for full DX11 compliance if I remember correctly.

    Again, thank you for your input and the educational message!


    -----

    I received the following mail regarding Crysis performance with the HD 4870 X2 (and Crossfire):

    Quote:
    Originally Posted by Felipe by email
    I saw your article about scaling with XP & Vista comparing two of the fastest cards on the market and I just want to tell you that there is a list of CVars in Crysis that increase FPS when used with a CF setup; this worked with the 3870X2 and 4870X2, running the 32-bit executable.

    I can say that Crysis is somehow bottlenecking CF systems.

    One of the Cvars is R_TEXTURESTREAMING.
    Run #1- DX9 1280x1024 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 46,97
    Run #2- DX9 1280x1024 AA=4x, 32 bit test, Quality: High ~~ Overall Average FPS: 37,12
    Run #3- DX9 1280x1024 AA=8x, 32 bit test, Quality: High ~~ Overall Average FPS: 36,33
    TEXTURE STREAMING = 0
    Run #1- DX9 1280x1024 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 56,11
    Run #2- DX9 1280x1024 AA=4x, 32 bit test, Quality: High ~~ Overall Average FPS: 42,41
    Run #3- DX9 1280x1024 AA=8x, 32 bit test, Quality: High ~~ Overall Average FPS: 43,00
    And using this combination here (based on my system)
    sys_budget_numdrawcalls = 6000
    sys_budget_videomem = 512
    sys_budget_sysmem = 6144
    sys_budget_frametime = 80
    r_TexturesStreaming = 0
    Using only texture streaming
    Run #1- DX9 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 46,35
    Using texture streaming and the budget config
    Run #1- DX9 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 48,69

    This has the same effect on the 4870X2 and 4870 series; for some reason it doesn't help NVIDIA cards.
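
    For anyone who wants to experiment with these settings, the sketch below shows one way to apply them; it is not something we tested for this article. It assumes Crysis picks up an autoexec.cfg placed in its install folder (a common tweaking convention), and the install path and budget values are placeholders taken from the mail that you should adjust to your own system.

# write_crysis_cvars.py - hedged sketch; the cfg name and install path are assumptions
from pathlib import Path

CRYSIS_DIR = Path(r"C:\Program Files\Electronic Arts\Crytek\Crysis")  # adjust to your install

cvars = {
    "r_TexturesStreaming": 0,        # the main tweak from the mail: disable texture streaming
    "sys_budget_numdrawcalls": 6000, # budget values quoted in the mail; tune for your hardware
    "sys_budget_videomem": 512,
    "sys_budget_sysmem": 6144,
    "sys_budget_frametime": 80,
}

lines = ["%s = %s" % (name, value) for name, value in cvars.items()]
(CRYSIS_DIR / "autoexec.cfg").write_text("\n".join(lines) + "\n")
print("Wrote", CRYSIS_DIR / "autoexec.cfg")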
    Comment from Kaotik @ 2008/09/07
    Nah, I'm not the same person; it was another Beyond3D user who mailed you, as far as I know.

    I just thought that those differences should be mentioned even though the comparison between Vista & XP was the main concern of the great article.

    Regarding the 'free 4xAA' on the HD4870X2 you mentioned, I think you're mixing it up with the DX10.1 requirement of 4xAA, which has nothing to do with performance, just that you have to have support for it (as far as I know, DX10.1 was the first DX ever to actually require support for a specific AA mode).
    Comment from jmke @ 2008/09/07
    I've added an addendum to the article regarding the difference in AA levels between ATI and NVIDIA

    I thought DX10.1 (and DX11) allowed 4xAA without a performance hit, much like Assassin's Creed on the ATI cards before they released the patch
    Comment from Kaotik @ 2008/09/07
    Quote:
    Originally Posted by jmke View Post
    I've added an addendum to the article regarding the difference in AA levels between ATI and NVIDIA

    I thought DX10.1 (and DX11) allowed 4xAA without a performance hit, much like Assassin's Creed on the ATI cards before they released the patch
    Nope, there's no such thing as "free AA" really, even though the performance drops are relatively small today. DX10.1 (or 11) changes nothing in this regard; 10.1 just introduced the requirement to support 4xMSAA to be 10.1 compliant, and allows "better" access to the samples (no idea how to describe it better, really).
    In the case of Assassin's Creed, they took advantage of how DX10.1 lets them access the AA samples later in the rendering as well: with DX10 they had to do two rendering passes for some effects, one with and one without AA, while with DX10.1 they could just re-use the buffers from the AA rendering pass, completely dropping one rendering pass.

     
