
ATI Radeon 32MB DDR Video Card

by Drew Dunn
2001/01/21

Let me start out by saying that I began this review with a certain amount of bias. Actually, with a lot of bias. You see, I’m one of the many people who were saddled with an ATI Rage Fury card. In fact, “rage” and “fury” pretty well describe my reaction to both the card and ATI’s support of it. It was one card that I was happy to get rid of.

That being said, I was intrigued by ATI’s claims about the new Radeon series of video cards. They positioned them as direct competitors to Nvidia’s GeForce line, a tall order. Since my pockets weren’t deep enough to spring for the top of the line Radeon 64MB DDR card, I picked its smaller brother, the Radeon 32MB DDR.

The card is fairly nondescript, except for the cooling fan, which seems to be the item du jour on video cards these days. In fact, after using Abit’s Siluro GeForce2 GTS card, I’d say that they are becoming a necessity…these cards get very hot. The Radeon DDR is available only as an AGP card. There are no PCI versions, nor plans for them, and for good reason: the new generation of graphics accelerators simply demands more bandwidth than the PCI bus can adequately supply. The Radeon has only one output for the monitor. There is no TV out on this card.
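To put numbers on that claim: peak bus bandwidth is just clock rate times transfers per clock times bus width. A quick back-of-the-envelope comparison in C (standard 32-bit/33MHz PCI against AGP 4X) shows why nobody bothers with PCI versions anymore:

    /* Peak bus bandwidth in MB/s: clock (MHz) x transfers/clock x bytes/transfer. */
    double bus_mbs(double clock_mhz, double transfers, double bytes)
    {
        return clock_mhz * transfers * bytes;
    }
    /* bus_mbs(33.0, 1.0, 4.0) =  132 MB/s -- plain PCI                    */
    /* bus_mbs(66.0, 4.0, 4.0) = 1056 MB/s -- AGP 4X, eight times as much  */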

I bought the retail version of the card. It included a single CD and an 8-page installation manual. The CD included drivers and the online manual. Compared to what came in it, the box was very large; most of the space was taken up by a huge cardboard spacer that kept everything from rattling around.

According to ATI’s information, the Radeon DDR is powered by the Radeon graphics processor, which includes the “Charisma Engine”, “Pixel Tapestry” and “Video Immersion” technologies. The card supports hardware Transformation, Clipping and Lighting (T&L). ATI also claims that the card is the industry’s leading video playback device, particularly for DVD video. ATI talks a lot about a technology called “Hyper Z”, a method for speeding up use of the Z-buffer, a memory area that holds “depth” information, that is, the data that determines which objects appear in front of which in a 3-D rendered scene.
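For anyone wondering what “hardware T&L” actually buys you: these are the per-vertex calculations that the CPU used to grind through and that the graphics chip now handles itself. A minimal C sketch of the two big pieces (the names and the one-light diffuse model are my own illustration, not ATI’s hardware):

    typedef struct { float x, y, z; } vec3;

    /* Transform: apply a 4x4 matrix (row-major, vertex treated as
       (x, y, z, 1)) to move a vertex from model space toward screen space. */
    vec3 transform(const float m[16], vec3 v)
    {
        vec3 r;
        r.x = m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3];
        r.y = m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7];
        r.z = m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11];
        return r;
    }

    /* Lighting: simple diffuse brightness, max(0, N.L), assuming both
       vectors are unit length. Clipping, the third job, just discards
       geometry that falls outside the viewing volume. */
    float diffuse(vec3 n, vec3 to_light)
    {
        float d = n.x*to_light.x + n.y*to_light.y + n.z*to_light.z;
        return d > 0.0f ? d : 0.0f;
    }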

The way this normally works is that after the scene is drawn, the Z-buffer gets cleared, usually by writing all zeroes to the buffer. The problem with this is that it takes just as much memory bandwidth to write a bunch of zeroes as it does to actually write real data. ATI claims to have developed a way of clearing the buffer without writing the zeroes to each memory location. They call it Fast Z Clear.
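ATI doesn’t publish exactly how Fast Z Clear works, but the principle is easy to sketch in C. The tile size and flag array below are my own invention for illustration; the point is that a clear touches one flag per tile instead of thousands of depth values:

    #include <stdint.h>
    #include <string.h>

    #define W    1024
    #define H    768
    #define TILE 64                      /* hypothetical tile granularity */

    static uint16_t zbuf[W * H];
    static uint8_t  tile_cleared[(W / TILE) * (H / TILE)];

    /* Conventional clear: writes every depth value, costing as much
       memory bandwidth as writing a full frame of real Z data. */
    void slow_z_clear(void) { memset(zbuf, 0, sizeof zbuf); }

    /* Fast clear: set one flag per tile and write nothing to zbuf. */
    void fast_z_clear(void) { memset(tile_cleared, 1, sizeof tile_cleared); }

    /* Reads from a still-flagged tile return the clear value without
       touching the buffer; the first real write to a tile would reset
       its flag and fill the tile in. */
    uint16_t z_read(int x, int y)
    {
        int t = (y / TILE) * (W / TILE) + (x / TILE);
        return tile_cleared[t] ? 0 : zbuf[y * W + x];
    }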

Another aspect of the Hyper Z technology is what ATI calls “Hierarchical Z”, a technology that determines whether a rendered pixel is located “behind” an object in the scene. Hierarchical Z also tests blocks of pixels in the same way, and if it determines that the pixels aren’t going to be visible in the scene, it essentially throws them away. Only the pixels that are visible are actually written to the frame buffer. Those excess or “overdrawn” pixels are never transferred, so precious memory bandwidth is conserved. Hyper Z also features a lossless compression method for Z-buffer reads and writes. Toss all of that together with Fast Z Clear and memory bandwidth is used much more efficiently, so performance improves by a large margin.
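The details of the hardware are ATI’s secret, but the coarse test itself is simple to express. In this C sketch (the block size and struct are assumptions for illustration), each block of the Z-buffer remembers the farthest depth stored in it; if even the nearest pixel of an incoming block lies behind that, nothing in the block can be visible and it is rejected without a single per-pixel Z read:

    /* Depth convention: smaller z = closer to the viewer. */
    typedef struct {
        float zmax;   /* farthest depth currently stored in this block */
    } zblock;

    /* If the nearest incoming depth is still behind the farthest depth
       already stored, every incoming pixel is hidden: reject the block. */
    int block_hidden(const zblock *b, float incoming_zmin)
    {
        return incoming_zmin > b->zmax;
    }

Only when this coarse test fails does the hardware fall back to per-pixel depth comparisons, which is where the bandwidth savings on heavily overdrawn scenes come from.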

Since Hierarchical Z is always on in hardware, it can’t be disabled. But Fast Z Clear and Z compression can be turned off in the driver. By default, they are turned on, and when we disabled them, performance dropped (at higher resolutions) by about 30%. It definitely works.

The card fully supports OpenGL, and ATI claims broader DirectX support than any other graphics board. The Radeon comes with 32MB of DDR memory, fixed, with no upgrade possible. It supports AGP 4X and 2X. The DAC runs at 350MHz, with the processor and memory at 166MHz (compared to 183MHz for its big brother, the 64MB DDR).
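Those clock figures translate directly into the memory bandwidth that Hyper Z is trying to conserve. Assuming the Radeon DDR’s 128-bit memory bus (16 bytes per transfer, two transfers per clock for DDR), the rough arithmetic looks like this:

    /* Theoretical peak memory bandwidth in GB/s for a 128-bit DDR bus. */
    double mem_gbs(double clock_mhz)
    {
        return clock_mhz * 1e6 * 16.0 /* bytes per transfer */
                         * 2.0        /* transfers per clock (DDR) */
                         / 1e9;
    }
    /* mem_gbs(166.0) is about 5.3 GB/s; the 64MB card's 183MHz gives ~5.9 */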

A lot of noise has been made about the Radeon’s poor 16 bit display quality. Nobody makes any bones about this card being designed for 32 bit color, and the benchmarks bear that out, but when I tested it at 16 bits, I honestly couldn’t find anything to complain about.

Speaking of benchmarks: If you’ve read my reviews before, you know how I feel about them. Benchmarks, to me, are a test of one machine and may be just about impossible to duplicate on any other. But I can’t just say that the card performs well or performs poorly…something needs to back that conclusion up. So, I ran some tests using Quake III Arena and timedemo 1.
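For anyone who wants to reproduce the numbers, the whole procedure in Quake III is two console commands. (The demo name varies with the game version; demo001 is one of the demos that shipped with early releases, so treat it as an example rather than gospel.)

    ]timedemo 1
    ]demo demo001

When the demo finishes, the game prints the frame count, elapsed time and average frames per second to the console.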

The test system:

  • Abit SE-6 Motherboard
  • Pentium III 933MHz
  • 256MB Crucial PC133 CAS2 SDRAM
  • Abit HotRod Pro 100 IDE RAID Controller
  • (2) Maxtor DiamondMax Pro 30.7GB IDE Hard Drives (Striped)
  • Sound Blaster Live! Platinum Sound Card
  • 8X DVD-ROM Drive
  • Acer 4x4x32 CD-R Drive
  • Windows 98SE with latest updates from Microsoft
  • DirectX 8.0
  • ATI 4.12.3056 Drivers
  • Nvidia Detonator 3 (6.31) Drivers

Installation of the video card was easy. It popped in, Windows detected it and installed the drivers. I used a desktop resolution of 1600x1200x16 bits on a 21” Hitachi monitor. The display was crisp and clear, in fact, just as crisp and clear as the Abit Siluro GeForce2 GTS card that I had been using.

Now, I didn’t expect that a $170 video card was going to outperform a $330 model, but since the GeForce2 GTS is the hot card on the market, I decided to compare the two, just for fun. Quite frankly, I was surprised.

I performed the tests with Vsync and anti-aliasing off. Otherwise, I simply accepted the OpenGL defaults that each manufacturer’s drivers presented.

The tables below show how the two cards performed in Quake III. The 16 bit test was done with everything set to default, changing only the video resolution. The first 32 bit test used the “High Quality” setting in the graphics setup menu, again changing only the video resolution. The second 32 bit test was identical to the first, except that Texture was set to maximum.

At 800x600, the Siluro was clearly the runaway winner, as it should be. The framerates were:

                          Siluro    Radeon
  16 Bits                  115.4     101.1
  32 Bits                  111.6      98.3
  32 Bits (Max Texture)    107.2      97.0

I honestly couldn’t complain about the Radeon’s performance. Sure, it wasn’t as fast as the Siluro, but, at half the price, it wasn’t half the speed either. Then I bumped up the resolution to 1024x768.

                          Siluro    Radeon
  16 Bits                  112.3      76.0
  32 Bits                   90.1      71.1
  32 Bits (Max Texture)     84.9      70.1

Here’s where I started to see the 32-bit bias of the Radeon. The Siluro’s frame rate drops dramatically from 16 bits to 32 bits, but the Radeon’s barely moves: the Radeon loses less than 10% (76.0 to 71.1, about 6%), while the Siluro loses almost 20% (112.3 to 90.1). Of course, the GeForce2 card maintains its lead…as I expected.

Next, I looked at the cards at 1280x1024.

                          Siluro    Radeon
  16 Bits                   91.7      48.5
  32 Bits                   56.1      46.2
  32 Bits (Max Texture)     52.7      45.4

The Radeon clearly doesn’t compete at all with the Siluro at 16 bits. But at 32 bits, it holds its own. And, again, the 32 bit bias shows in the table as the Radeon’s frame rate decreases by only about 5% from 16 bits to 32 bits (48.5 to 46.2).

Overclocking is a big thing today, from CPUs to video cards. I decided to take a look at the performance of the Radeon with its processor overclocked. I didn’t overclock the Siluro, primarily because I knew that Bob wanted it back and if I trashed it by overclocking it, there would surely be hell to pay.

I used PowerStrip to turn up the speed on the card. It overclocked all the way up to 195MHz. That’s right, 195MHz from a retail card, right out of the box. And the results were, well, better than I expected.

Now, just to be sure that everybody is clear, I’m not advocating cranking up the speed on your Radeon by 29MHz (a roughly 17% jump over the stock 166MHz core), but if your card lets you, there’s obviously a significant performance gain to be had. As a word of warning, though, make sure that you have some very effective cooling on your card, because it will get very, very hot. I question whether the supplied fan will do the trick with the card running that fast. And, just in case anybody’s keeping score, the Siluro has a 200MHz core clock and a 333MHz DDR RAM clock.

So, the card shows very good 32-bit performance, at some cost to 16-bit performance. The video acceleration also appears to be extraordinarily fast. Its bang for the buck is definitely up there. On the negative side, although I didn’t provide a benchmark for it, full screen anti-aliasing takes a heavy toll on the card’s performance.

So obviously, the Radeon doesn’t beat the Siluro. And, for the price difference, it shouldn’t. But it does come close, at least in 32 bit performance at higher resolutions. If you have the money, a GeForce2 GTS card is definitely the performance winner, but I’d have a hard time spending my cash on something that expensive with the solid performance of the Radeon 32MB DDR available for a lot less. I guess the analogy is the difference between a Ferrari and a Corvette. Both of them perform pretty darn well. But, at least to me, one seems to be a better value than the other. I think that after being dogged by the Rage, ATI has definitely found a real winner in the Radeon 32MB DDR.

Drew Dunn

 


Copyright The NOSPIN Group, Inc. 1991-2006.  All rights reserved.