
Thursday, April 1, 2010

ATI Radeon HD 5870 Eyefinity6

source: guru3d.com
Go play some games .. on six LCD screens

Think about this for a second: it has been only six months since ATI released its first DX11-class products ... six months (!) Hardly a month went by without ATI releasing a new product in the Radeon HD 5000 series range.

The cycle is complete though; ATI has filled every little and foreseeable gap in the DX11-class graphics card market. As such, they will now focus on respins, future products and more niche gear, at a somewhat lower priority.

Niche is the word that I like to label today's test product with. Extravagant, extraordinary: yes, ATI has finally released its Eyefinity6 SKU officially. Though we believe this will be one tough puppy to find on the market, one fact remains .. there will be a small market for this product. And sure, not so much the gamer, but the professional end user and corporate entities can save a lot of dough with a product like the one shown today. Imagine presentations spread over, say, a monitor or four, an instruction video on another, and so on. What am I talking about? Eyefinity6 .. a high-end Radeon HD 5870 based graphics card with no less than six DisplayPort output connectors that can drive six monitors simultaneously, for you to either work, present or play games at an insane and nearly ridiculous monitor resolution.

And gaming, yep .. it is exactly that last option we'll be looking at today. We know .. we understand, there's a 0.01% chance that any of you would purchase and replicate a setup similar to the one shown in this article, but it certainly is fun to read up on. It's x-factor hardware, Top Gear stuff for computer geeks and gamers.

So yes, we'll look at Eyefinity6, we'll build a nice frame that can hold the six Dell monitors we are using in this review, and then we'll get our groove on. We'll also show some performance numbers, as we'll use not one but two Radeon HD 5870 Eyefinity6 cards, which we'll set up in CrossfireX. But above all, I'd like this article to be a showcase. As such we'll record some high-definition footage and show you videos of gaming at a MASSIVE monitor resolution of 5040x2100.

Hot damn .. this is going to be fun! But first, meet the product empowering it all ...

ATI Radeon HD 5870 Eyefinity6 edition

So, ATI introduced Eyefinity technology on their Radeon HD 5000 series graphics cards six months ago. This literally boils down to multi-monitor desktop and gaming nirvana. You will have no problem connecting, say, three 30" monitors at 2560x1600 each -- combined into 7680x1600.

The ATI Radeon HD 5870 Eyefinity6 card however is a little weird; it's the freak nephew, as it's capable of driving a total of six monitors from one card. And sure, it's clocked a little faster as well, at 850 MHz on both the core and its 1600 shader processors.

Memory-wise the card is clocked at 4800 MHz (GDDR5), much like the reference 5870, but here's one thing that's different. The ATI Radeon HD 5870 Eyefinity6 edition, at least the model we received, comes with a nice phat 2GB of graphics memory. And that will help pretty decently at uber-high resolutions with AA. The TDP is slightly higher than the standard 5870's, at 228W under load and 34W in idle.

Resolution-wise, you can drive up to six monitors, each capable of 2560x1600 (!)

Eyefinity6

So the ATI Radeon HD 5870 Eyefinity6 edition is able to drive up to six monitors per graphics card. We'll test this live in action, and yes, it works very nicely.

You can combine monitors and get your groove on at up to 7680x3200 pixels spread over several monitors -- multiple monitors used as a single display. I think the limit is even 8000x8000 pixels, but don't hold me to that.

Some examples of what you can do:


  • Single-monitor setup at 2560x1600

  • Dual-monitor setup at 2560x1600 per monitor

  • Three-monitor setup at 2560x1600 per monitor

  • Six-monitor setup at 1920x1080 per monitor

Have a look at some examples:

ATI Radeon HD 5870 Eyefinity


ATI Radeon HD 5870 Eyefinity

You can arrange the monitors next to each other, above each other, and really in any combination you see fit.

And sure, we also understand that 99% of you guys will never use more than two monitors, and perhaps not even 0.01% would ever use six. Personally, I like to game on three screens. It's really immersive, and especially in flight simulators, racing games and strategy games (large playing field) you can enhance your visual experience; and obviously, with more information thrown at you, you can game more precisely as well.

Keep in mind that for six-monitor support, the special edition (Eyefinity6) card with its six DisplayPort connectors is mandatory.

Ehm dude, CrossfireX?

For those who are wondering whether it is possible to use two ATI Radeon HD 5870 Eyefinity6 edition cards to drive six monitors: I can answer that with a yes. CrossfireX is now supported (though with limited game support), and it's actually something we'll try out today. Especially at beastly resolutions like the one tested today, you'll need some serious rendering power. Now, it is actually surprising to see how good performance with one card is. But two is obviously much, much better.

Guru3D will be creating a test setup consisting of six Dell 22" monitors at 1680x1050 in 3x2 landscape display mode, for a staggering total resolution of 5040x2100.

Considering the average enthusiast gamer has a 1920x1200 monitor, that's 2.3 million pixels to fill, preferably 40 times per second.

With our six monitors in 3x2 landscape display mode at 5040x2100 we are pushing the limit much further ... that's over 10 MPixels. You understand the dynamics now, right? You'll need a lot of RAW performance, and that's something two R5870 cards can definitely help out with.
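
To put those numbers in perspective, here's a quick back-of-the-envelope sketch of the pixel math (our own arithmetic, nothing more):

```python
# Back-of-the-envelope pixel math for the resolutions discussed above.
def megapixels(width, height):
    return width * height / 1_000_000

single = megapixels(1920, 1200)          # ~2.3 MPixels for one enthusiast monitor
grid   = megapixels(3 * 1680, 2 * 1050)  # 5040x2100 -> ~10.6 MPixels for the 3x2 grid

print(single, grid, grid / single)       # the grid is ~4.6x the single-monitor load
print(grid * 40)                         # ~423 MPixels/s shaded at a 40 FPS target
```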

It's now time to have a peek at some photos. First we'll show you the Eyefinity6 card up-close and personal, then the hardware setup, then the software setup and obviously we'll have a chat about the new functions like Bezel management that ATI introduced with the latest Catalyst 10.3 release.

First, a photo shoot of the Radeon HD 5870 Eyefinity6, armed with 2GB of memory.

Product Gallery Radeon HD 5870 Eyefinity6

Eyefinity6

Meet the Radeon HD 5870 Eyefinity6. In essence this is a regular Radeon HD 5870 with some slight changes. First up, we spot a small core frequency increase to give the product a slightly more aggressive bite; it's only 50 MHz more though. What is interesting is that this model comes packed with 2GB of memory. Especially at the very high resolutions used today, that will help out a lot with AA.

Next to that we obviously spot the six DisplayPorts. Let's zoom in on those.

Eyefinity6

Here we have the six Mini DisplayPort connectors and, below them, the cooling exhaust. Had regular-sized DisplayPort connectors been used, ATI would have had to sacrifice exhaust area. This is a perfectly fine solution, and from what I've heard, ATI will deliver the proper cables or converters with the card for your convenience.

Eyefinity6

The rear side -- only the air intake is to be seen. The cooler itself and the shell are 100% identical to the original Radeon HD 5870's; really, nothing has changed on that end.

Eyefinity6

The card has a slightly higher TDP, and as such ATI needs to make sure you can deliver enough current to the product. You'll need both a 6-pin and an 8-pin PEG connector coming from your power supply. The card can be fed 75 Watts through the PCIe slot and 75 Watts over the 6-pin PEG, and then another 150 Watts can be drawn from the 8-pin PEG connector. That's 300W available, roughly 75W more than needed.
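
As a quick sanity check, here's that power budget spelled out (a simple sketch using the PCI Express spec limits quoted above):

```python
# Power budget for the Eyefinity6 card, using the PCI Express spec limits.
PCIE_SLOT = 75   # Watts through the PCIe x16 slot
PEG_6PIN  = 75   # Watts over the 6-pin PEG connector
PEG_8PIN  = 150  # Watts over the 8-pin PEG connector

available = PCIE_SLOT + PEG_6PIN + PEG_8PIN  # 300 W total
tdp       = 228                              # board TDP quoted earlier

print(available, available - tdp)  # 300 W available, ~72 W of headroom
```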

But let's assemble everything and create ourselves a sweet multi-monitor setup now, shall we?

Eyefinity Hardware Setup

Alright, let's get started. To set up six monitors you'll need a little more than a graphics card alone.

Eyefinity6

Now let me assure you that FedEx will not be happy when they deliver 'the goods'. It's quite a lot of boxes alright.

So the ingredients for our little experiment are:


  • Radeon HD 5870 Eyefinity6 (x2 for CrossfireX)

  • Six Dell 1680x1050 monitors

  • Monitor mounting frame (EdBak)

  • Six sets of power cables

  • Six sets of DisplayPort to Mini DisplayPort cables

  • A lot of time ...

To be able to create an Eyefinity cluster of monitors, all you need to do is hook them up to your graphics card. But sure ... how do you stack the monitors if you have six of them?

Eyefinity6


Well, that can be answered real quickly: you can make a frame, or preferably order a frame with VESA mounts for your monitors. Ours comes from EdBak, a company in Poland. Not cheap, but definitely more handy and better looking. Then again, nothing about this project can be considered cheap anyway; you might as well do it right.

Eyefinity6

I already figured this would be a painstaking, time-consuming process; and since there's nothing easier than ticking off the girlfriend, I left the office and brought the entire kit home, to set it up and play games on in my spare time in the evenings.

Once we assembled and secured the frame, we could already see where this was heading. Yes indeed .. a very ticked-off GF. Okay, it's time to start mounting the monitors; to do this we need to remove four screws for the VESA mounting.

Eyefinity6

You see the square? In each of the four corners is a screw for the VESA mount; we remove these to be able to attach the monitors to the mounts on the monitor frame.


Eyefinity6

Now, this is a painstakingly slow process and really a two-man job. Once all monitors are mounted and roughly aligned, it's time to think about cable management. Six monitors, each with one DisplayPort cable and one power cable. Yep -- that's twelve cables, and a GF looking even more irritated (you know the look, with both eyebrows raised).

Eyefinity6

Line up your monitors and things start to take shape. Now, I expect that 99% of you would go with three monitors positioned horizontally (landscape), as six is a little extravagant. But heck, we like extravagant. Excessive? Bring it on!

Make sure the bezels of the monitors are very close to each other.

With the monitors lined up we can move on to the next phase: it's time to plug in and power up the gear. We insert the six DisplayPort connectors; don't worry about which cable goes to which connector, as we can properly align the monitors in the Catalyst software very easily.

Eyefinity6

We now install Catalyst 10.3 or higher and bingo .. we get a nice desktop wallpaper greeting us -- phew -- at least one monitor is giving a proper image. It's now time for the software configuration, as all the monitors need to be enabled and, obviously, aligned and positioned.

Eyefinity Software Setup

Eyefinity6

Once you boot into Windows, install the latest Catalyst drivers (10.3 or higher is mandatory), after which you'll need to restart. Once you are back in the OS, start up the Catalyst Control Center -- ours, I'm afraid, is in Dutch.

You'll now spot a plethora of monitors -- six of them. Click on the active monitor, then left-click on the monitors in the middle and select "Make Group".

Eyefinity6

You'll be asked to specify your group preference; we have six monitors and select the 3x2 landscape display mode. Once your monitors go active, you'll notice that all the screens are scrambled like a jigsaw puzzle. We need to sort that out, and luckily ATI solved this in a really easy-to-work-with manner.


Eyefinity6

We are going to arrange the screens to correspond with the image we see above. ATI implemented this really cleverly.

To arrange the displays in your group there is no need to physically move or re-wire your displays. A wizard is provided to arrange the display surfaces included in your Display Group.

The wizard outputs a blue screen on one of the six monitors in the group. In the screenshot above, you simply indicate which screen is blue by clicking on the cell in the CCC UI corresponding to the highlighted monitor. After you have finished the sequence, you'll have a somewhat bigger resolution alright.

But we are not quite there yet.

Eyefinity6

We now have six monitors running, yet we need to think about something else: the monitor bezels can create distortion. As of Catalyst 10.3, there is a bezel correction feature.

Eyefinity6

Especially if your monitors have thick bezels, or displays of differing bezel thickness, Catalyst Control Center now lets you compensate through a fairly easy-to-use tool/sequence. Let's take a look at that.

Eyefinity6

Here we are compensating across the six displays; as you can see, the bezel divides a yellow triangle between two monitors. By clicking the controls on the right, you adjust to compensate for the bezel thickness.


Eyefinity6

Here you can see what that looks like on six displays; the bezel divides the yellow triangle. Using the controls on the right you adjust to compensate for bezel thickness.
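
To illustrate the principle: bezel correction makes the desktop logically larger than the sum of the panels, so the pixels that would sit behind a bezel are simply never shown and straight lines stay straight across screens. A rough sketch with illustrative numbers (a 22" 1680x1050 panel has a pixel pitch of roughly 0.282 mm; the bezel gap below is purely our assumption):

```python
# Rough sketch of a bezel-corrected group resolution; all numbers illustrative.
PANEL_W, PANEL_H = 1680, 1050
PIXEL_PITCH_MM   = 0.282   # ~0.282 mm pixel pitch on a 22" 1680x1050 panel
BEZEL_GAP_MM     = 18.0    # assumed combined bezel width between two panels

# Pixels "hidden" behind each seam so the image stays geometrically continuous:
gap_px = round(BEZEL_GAP_MM / PIXEL_PITCH_MM)   # ~64 pixels per seam

cols, rows  = 3, 2
corrected_w = cols * PANEL_W + (cols - 1) * gap_px
corrected_h = rows * PANEL_H + (rows - 1) * gap_px

print(corrected_w, corrected_h)   # ~5168x2164 instead of the raw 5040x2100
```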

One last note: ATI's Catalyst 10.3 also supports multiple groups and fast switching between Eyefinity modes.

Right, we have the screens all set up and ready to be used.

Performance and test setup

Before we begin actual visual testing, we obviously need to address performance. If you create a multi-monitor setup you need to be aware of the fact that you'll need a beefy graphics card as well.

Let's say you are connecting three 24" monitors with a screen resolution of 1920x1200 each and place them horizontally next to each other. You will end up with a 5760x1200 screen resolution. That, my friends, is 6,912,000 pixels -- nearly 7 MPixels that need to be refreshed at least 35 times per second.

With our six monitors in 3x2 landscape display mode we are pushing the limit even further ... over 10.5 MPixels. You understand the dynamics now, right?

With modern game titles this immediately poses a threat to the 5870, as we have increased the load on our graphics card three to five times over. So yes, in the end you should consider opting for CrossfireX if you plan a project like this for gaming with modern titles.

The 10.3 Catalyst drivers and higher have pretty good support for CrossfireX. A lot of games (but not all) are now supported. We'll obviously put that to the test today.

To help you grasp what we are doing, we recorded videos showing Eyefinity with six monitors set up in landscape mode.

Since we are using such a high resolution, we should not be too CPU-bound at all, and to demonstrate this very point I've been evangelizing, I will use a mid-range system (with the exception of the graphics cards) made out of the following items:

Hardware:


  • Radeon HD 5870 Eyefinity6 2048MB x2

  • MSI AMD 790FX motherboard

  • Phenom II 955BE Processor @ 3400 MHz

  • 2x2GB DDR3 memory 1066 MHz

  • Samsung 500 GB HDD

  • Dell P2210 (x6)

Software:


  • Windows Vista 64-bit SP2

  • DirectX 9/10 End User Runtime

  • ATI Catalyst 10.3 beta

So yes, really dog ... that's all we'll be using. A very affordable PC; remember, we are GPU-bound at a massive resolution of 5040x2100, not CPU- or system-bound.

So how does that relate to graphics performance, you might wonder? Well, let's have a look at the frame rates of some of the titles we are testing today, on both single-GPU and multi-GPU setups.


Now, we could of course measure the card's performance at normal resolutions like 1600x1200 and compare it with other cards. But really, it's the six-monitor double-whammy performance everybody is after, so we'll focus only on that.

Results are a bit of a mixed bag, but with two cards you'll definitely have enough horsepower under the hood to play most of your games at decent frame rates and very good image quality settings.

In all the games above we maximized image quality, except for AA levels, which we selected and altered manually to balance the overall framerate. Though at 5040x2100 you really have no need for anti-aliasing whatsoever, my man.

All games scaled with the two R5870 Eyefinity6 cards set up in CrossfireX, except that Call of Duty: Modern Warfare 2 was for whatever reason jinxed; we could not get it active on two GPUs, even after renaming the executable. Ah well. With one card we still get nearly 40 FPS with 4xAA, which is perfectly fine.

As you can see, with some AA activated a single R5870 Eyefinity6 quickly runs out of juice, unless you fiddle around with image quality of course. Anyway, the numbers are a little dry to look at, but they already show that we should get very decent performance. Let's watch some videos!

Eyefinity6 Video examples

In the following videos we'll be showing you high-def recordings of a couple of applications and games. This method, we feel, will give you a better idea of what you can expect, what's hot and what's not.

In the examples below we opted to show you some games with, and also without, bezel management. Bear in mind that you can choose your preference (if the game allows a custom, bezel-managed resolution).

Ehm .. Productivity mode?

Now, you could also use the screens in work environments, for presentations, for information boards like you see at the airport; really, a multitude of choices and options come to mind.

But you and I, as couch potatoes, know that the RV870 GPUs powering the Radeon HD 5870 cards have plenty of raw video processing power to do some wicked stuff. Check out the chill video.

So here we have four 1080p video streams playing while I'm browsing the web and doing a little Photoshop; I could do some actual work as well, of course. Mind you, we are still using that mid-range Phenom II system, okay? Now obviously, if you reproduce this at home for reasons similar to those shown in the video .. then you are spoiled, my man. But this is fun, really a lot of fun.

Okay, what you guys want to see are games I guess ...

Resident Evil 5


Here we have Resident Evil 5 -- we show the benchmark, as it's a nice way to demonstrate performance with an actual readable framerate on screen. For RE5 we have not enabled bezel management, which immediately looks distorted. We are playing at the native 5040x2100 resolution. As you can see, at roughly 60-ish frames per second we have nothing to worry about; the game is absolutely playable.

Batman Arkham Asylum



In Batman: Arkham Asylum we show a part of the intro sequence. The video camera was not properly configured though, hence the blown-out overexposure. Still, a good and very playable game on six screens. We have bezel correction activated, meaning we work at a customized resolution. Batman: Arkham Asylum is a nice title for Eyefinity with six monitors.

Unigine Heaven Benchmark


The Heaven benchmark actually is a DX11 title. For whatever reason it would not run in DX11 mode under Vista, despite DirectX being updated to the latest build, so you are looking at the 3D engine rendering 5Kx2K in DX9 for a change. Overall performance was above 20 FPS, but as you can see .. the benchmark is struggling; it is just sooo heavy on the GPUs (which it is designed to be). Great to look at with Eyefinity though. Very impressive. AA is disabled; bezel correction is applied, and that certainly looks good.

Half Life 2 EP1


Well, we couldn't leave it out now, could we? HL2 is once again a nice example of where Eyefinity with six monitors works really well. With the 3D engine aging, feel free to flick on hefty AA levels and the game will still produce awesome numbers; with 8xMSAA we still averaged 80 frames per second. Have a look at how rusty I have gotten -- ol' Hilbert used to be the l33t king Gordon!

Anno 1404


One of the games I played till I dropped was Anno 1404. RTS games work really well with so many monitors; you are not hindered by the bezels that much (bezel correction was off, btw). Overall, framerates were quite okay. This title was, however, limited by the Phenom II X4 955 BE processor: more screens equals more components and objects to deal with in this RTS, and that requires a lot of horsepower. Still, as you'll notice, the frame rates are fairly okay, even with big cities. An excellent title for Eyefinity with six monitors. Check it out.

Battlefield Bad Company 2

What a great game title this is. Check out the video. We have totally fine framerates and the game is very playable. Bezel correction is active here, as you can see. Now, here's where Eyefinity with six monitors disappoints: in the center of the screen, where the crosshair and thus your focus is located, the bezel creates a gap, and that is just annoying.

Your only option then is to disable bezel management, but then the crosshair is split over two screens.

Call of Duty Modern Warfare 2

VERY impressive; Call of Duty: Modern Warfare 2 looks great on Eyefinity. But much like Battlefield, the crosshair in the middle lands smack-dab bull's-eye under the bezel. Absolutely annoying. The game itself obviously runs great, even at such an uber resolution. This was the one title that would not work properly in CrossfireX, but a single card at the highest settings with 4xAA still manages to produce an average framerate of roughly 40 FPS at this resolution.

Final words and conclusion

Man ... good times, that was fun gear to play around with! Well, it's time to wrap things up and have a look at the good and the not so good. Eyefinity6 works. You know what? Rephrase -- it actually works really well. The recent driver changes allow you to combine six-monitor output with the sheer power of multiple GPUs (CrossfireX) and, very importantly, bezel correction.

When we touch on the topic of CrossfireX, we have to say this: you are probably going to need two cards if you want to play modern games at a 5Kx2K resolution. As yes .. we are gaming at a freakish 10.5 MPixels, roughly five times more than your average 1920x1200 monitor. Overall performance with the help of CrossfireX was not super in the most modern titles, but good enough. With older games you can certainly flick on nice AA levels as well.


Now, I have to admit, from the ground up, Eyefinity with six monitors has been thought through well. You create your preferred setup, install the Catalyst drivers, sort your screens, and optionally apply bezel correction. I really have to give props here to ATI's driver team for doing a superb job, as it all works well; a very proper implementation.

Overall, from a productivity or entertainment point of view, the 3x2 monitor setup functions really well; it's certainly the biggest desktop I have ever worked with. It's kind of like having your own command center at home, with each screen showing relevant information.

What about gaming then? Well, the truth is that gaming-wise the setup is a heck of a lot of fun, but the reality also remains that with most first-person shooters, the bezel in the middle poses a cruel issue. Exactly at the location of the crosshair there are two bezels, from the upper and lower monitors, and I just could not get over that or even slightly get used to it. This, however, is mostly a limitation for first-person shooters.

Your solution: find monitors with very thin bezels, use three monitors instead of six, or alternatively, perhaps ATI will one day come up with a nine-monitor setup, which would solve that issue as well, hehe ...

Non-first-person shooters will mostly work the best. Take for example Anno 1404: what an incredible field of view you get in return. I foresee that Eyefinity6 could be very interesting for flight simulator fans as well, and racing fans would definitely love all this too. In the end though, as good as the technology really is, the biggest annoyance will remain the monitor bezels; you either dislike them or can live with them. There's no middle way. So if you even remotely consider this kind of setup, make sure your monitors have the thinnest bezels possible. And for you as a gamer, I'd seriously recommend three monitors over six, really.

Bezels aside, make no mistake: you'll play your games far more immersively, as you'll have so much screen resolution to work with. The experience of playing games, in our case on three to six screens, is simply put fantastic. In the first minutes you'll feel a little confusion, as your brain needs to process so much information it can hardly keep up. Once you get used to it (a few minutes, really), the 'wow' factor kicks in and the experience is just lovely.

On the productivity and corporate side of things, Eyefinity6 is the best thing ever; relatively cheap to purchase, with endless amounts of options and configurations. For an end consumer, however, the frame, six monitors, R5870 Eyefinity6 card(s) and the power bill alone are, at the very least, far from trivial to even consider.

So there you have it. The Radeon HD 5870 Eyefinity6 is obviously not for everybody, and honestly .. I'd prefer to have you look into a phat DLP or projector if you need to game on a very big screen. You could also opt for three bigger monitors at, say, 1920x1200 each. Either that, or seek out monitors with an extremely thin bezel. Regardless, Eyefinity6 does guarantee a lot of e-peen, great x-factor and a whole lot of fun at unprecedented resolutions; it is a pleasure to play around with. Ohh oooh ... what about six projectors on one huge screen?

Should you still have questions, ATI created an information page on the Radeon HD 5870 Eyefinity6 right here. The Radeon HD 5870 Eyefinity6 edition cards will sell at an MSRP of 475 USD, which really is a fair price for something this exclusive.



Wednesday, January 13, 2010

ATI Radeon HD 5670 Rule the Mainstream

source: hardwarezone.com
Capturing the Mainstream Market

Things are looking good for ATI lately. Not only were they first to get DirectX 11 compatible graphics cards into the market, but NVIDIA's inability to release their own DirectX 11 solutions in the same time span also meant that ATI has had the entire DirectX 11 market to themselves for almost a full four months! To rub salt into NVIDIA's wounds, in the past four months they have also reclaimed the title of fastest single GPU (the Cypress XT found on the Radeon HD 5870) and the title of fastest single graphics card (Radeon HD 5970) from NVIDIA.

Not content with their dominant display of late, ATI is following up on its recent successes with a brand new graphics card for the more budget-minded: the Radeon HD 5670. The new Radeon HD 5670 cards are powered by the Redwood XT GPU, which has 400 stream processing units, 20 texture mapping units and 8 raster operator units. Its core runs at 775MHz, with the memory at 4000MHz DDR. The memory bus width is kept at 128 bits (similar to the Radeon HD 5770 and Radeon HD 5750), and the new cards will come in either 512MB or 1GB flavors.
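
From those quoted specs, the card's theoretical throughput works out as follows (a quick sketch of our own, using the usual 2 FLOPs per stream processor per clock that ATI's marketing figures assume):

```python
# Theoretical throughput of the Radeon HD 5670, derived from the quoted specs.
sps, core_mhz = 400, 775
bus_bits      = 128
mem_eff_mhz   = 4000   # effective GDDR5 data rate

gflops    = sps * 2 * core_mhz / 1000           # ~620 GFLOPS single precision
bandwidth = bus_bits / 8 * mem_eff_mhz / 1000   # ~64 GB/s peak memory bandwidth
print(gflops, bandwidth)
```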

This is the reference ATI Radeon HD 5670 that we received from ATI. As you can see, it does not hide its budget and mainstream intentions for there's nothing fancy about it.

Admittedly, when compared to the current 'base' model of the Evergreen series - the Radeon HD 5750, which runs the 720-stream-processor Juniper LE GPU - the Redwood XT does look kind of lightweight. However, ATI has said that the Radeon HD 5670 will be considerably more affordable - under US$100 (S$138) to be exact - making it the first truly affordable DirectX 11 graphics card. And despite being an affordable part without as much horsepower as its other Evergreen brothers, the Radeon HD 5670 still supports Eyefinity, which means it can power up to three displays simultaneously.

According to statistics from ATI, over 15 million gamers have Steam accounts and there are over 11 million World of Warcraft players. Of them, a mere 10% have a display resolution of 1920 x 1200 or more, and around 66% are using a sub-US$100 graphics card. As such, the budget-to-mainstream segment is a huge one, and this is the exact target audience ATI is aiming at with the new Radeon HD 5670.

We are sure you are eager to find out how ATI's latest offering fares, but first here's a look at the specifications of the Radeon HD 5670 and how it stacks up against other competitive SKUs.


There's a slight glitch there as the GPU clock is supposed to read 775MHz.

The Radeon HD 5670 & Competitive Comparison SKUs
| Model | ATI Radeon HD 5670 | NVIDIA GeForce GT 240 | ATI Radeon HD 4670 | ATI Radeon HD 4770 | NVIDIA GeForce 9800 GT | NVIDIA GeForce 9600 GT |
|---|---|---|---|---|---|---|
| Core Code | Redwood XT | GT215 | RV730 | RV740 | G92 | G94 |
| Transistor Count | 627 million | Unknown | 514 million | 826 million | 754 million | 505 million |
| Manufacturing Process | 40nm | 40nm | 55nm | 40nm | 65/55nm | 65nm |
| Core Clock | 775MHz | 550MHz | 750MHz | 750MHz | 600MHz | 650MHz |
| Stream Processors | 400 | 96 | 320 | 640 | 112 | 64 |
| Stream Processor Clock | 775MHz | 1340MHz | 750MHz | 750MHz | 1800MHz | 1625MHz |
| Texture Mapping (TMU) / Texture Filtering (TF) Units | 20 | 32 | 32 | 32 | 56 | 32 |
| Raster Operator Units (ROP) | 8 | 16 | 8 | 16 | 16 | 16 |
| Memory Clock | 4000MHz DDR | 1800MHz DDR (GDDR3) / 3600MHz DDR (GDDR5) | 2000MHz DDR | 3200MHz DDR | 1800MHz DDR | 1800MHz DDR |
| Memory Bus | 128-bit | 128-bit | 128-bit | 128-bit | 256-bit | 256-bit |
| PCI Express Interface | PCIe x16 ver 2.0 | PCIe x16 ver 2.0 | PCIe x16 ver 2.0 | PCIe x16 ver 2.0 | PCIe x16 ver 2.0 | PCIe x16 ver 2.0 |
| PCIe Power Connectors | None | None | None | 1 x 6-pin | 1 x 6-pin | 1 x 6-pin |
| Multi-GPU Technology | CrossFireX (vendor dependent) | None | CrossFireX | CrossFireX | SLI | SLI |
| DVI Output Support | Yes | Yes | Yes | Yes | Yes | Yes |
| HDCP Output Support | Yes | Yes | Yes | Yes | Yes | Yes |
| Street Price | Launch price: <US$100 | ~US$99 | ~US$75 | ~US$120 | ~US$110 | ~US$100 |


The ATI Radeon HD 5670

The reference card that ATI sent us is a decidedly simple affair. It comes in a single slot form factor and has a very compact cooler, which runs fairly quietly during operation. Clock speeds are reference, which means 775MHz at the core and 4000MHz DDR, and it comes with 512MB of GDDR5 memory that interfaces with a 128-bit wide memory bus.

Here are some shots of the card.

Note the ATI Radeon HD 5670's compact single-slot cooler and its lack of PCIe power connectors, making it an ideal weapon of choice for HTPC and probably even LAN party enthusiasts.


The Radeon HD 5670's main competition comes in the form of NVIDIA's GeForce GT 240, and here it is. It looks bigger because this card has a dual-slot cooler.


The Radeon HD 5670 has DisplayPort, HDMI and DVI outputs, and is capable of driving three displays at once.


Eagle-eyed readers might have noticed that our card doesn't come with a CrossFireX connector. We've been told by ATI that while CrossFireX is supported by the new Radeon HD 5670, the choice to implement it is ultimately up to vendors. Obviously, one can expect cards without CrossFireX capability to cost less.


Test Setup

The new ATI Radeon HD 5670 will be evaluated using our Vista system, which has the following specifications:

Windows Vista SP1 Test System

* Intel Core 2 Extreme QX6850 (3.00GHz)
* Gigabyte X38T-DQ6 motherboard
* 2 x 1GB DDR3-1333 Aeneon memory in dual channel mode
* Seagate 7200.10 200GB SATA hard drive
* Windows Vista Ultimate with SP1


We'll be pitting the Radeon HD 5670 against NVIDIA's latest sub US$100 card, the GeForce GT 240. It should be interesting to see how the two cards will fare given their competitive specifications. We have also included the Radeon HD 4670 and HD 4770 to see how the new card matches up against its predecessors, and also NVIDIA's mainstream stalwarts, the GeForce 9600 GT and 9800 GT.

Here's the complete list of cards tested and their driver versions:

* ATI Radeon HD 5670 512MB GDDR5 (Beta 8.69 RC)
* NVIDIA GeForce GT 240 512MB GDDR5 (ForceWare 195.62)
* ATI Radeon HD 4670 512MB GDDR3 (Catalyst 9.12)
* ATI Radeon HD 4770 512MB GDDR5 (Catalyst 9.12)
* GeForce 9600 GT 512MB GDDR3 (ForceWare 195.62)
* GeForce 9800 GT 512MB GDDR3 (ForceWare 195.62)


Also, the cards were tested using the following benchmarks:

* Futuremark 3DMark Vantage
* Crysis Warhead
* Far Cry 2
* Dawn of War 2

3DMark Vantage Results

The ATI Radeon HD 5670 scored first blood on 3DMark Vantage as it outscored the NVIDIA GeForce GT 240 by around 10%. Compared to the older generation Radeon HD 4670, its scores were over 50% greater.

Elsewhere, we noted that the Radeon HD 5670 was more than a handful for the GeForce 9600 GT, and was just about a match for the popular GeForce 9800 GT.



Crysis Warhead & Far Cry 2 Results

The Radeon HD 5670 continued its fine showing in Crysis Warhead, easily overpowering the GeForce GT 240 and 9600 GT. The Radeon HD 5670 wasn't quite as quick as the more powerful Radeon HD 4770 or the GeForce 9800 GT, but it did post tremendous improvements over the Radeon HD 4670.





It was the same story with Far Cry 2, as the Radeon HD 5670 proved too quick for the GeForce GT 240. However, the Radeon HD 5670 didn't seem too adept at handling anti-aliasing at high resolutions, no thanks to its low ROP count. With anti-aliasing enabled at 1920 x 1440, the Radeon HD 5670 actually posted the poorest results of the lot.





Dawn of War 2 Results

The new ATI card excelled in Dawn of War 2, posting extremely playable frame rates right up to 1920 x 1440, and it completely whipped the GeForce GT 240, being about 40% quicker overall. As it was, only the Radeon HD 4770 and GeForce 9800 GT were faster.



Temperature

Despite its low power draw and 40nm manufacturing process, the Radeon HD 5670 was, quite surprisingly, the hottest card of the bunch, measuring in at 71 degrees Celsius. Hopefully, with driver updates in the future, or as vendors start plonking custom coolers on it, we will start seeing temperatures head south.



Power Consumption

Class-leading power efficiency is the hallmark of an Evergreen graphics card, and the Radeon HD 5670 continues this fine tradition. Idle readings were on a par with NVIDIA's GeForce GT 240, but readings at load pointed to an easy Radeon HD 5670 victory.

Also note that the Radeon HD 5670's power consumption readings were not that far off from the Radeon HD 4670's. With this in mind, and considering how much faster the new Radeon HD 5670 is, we think it's no exaggeration to say that ATI has done wonders with the new Evergreen cards as far as power efficiency is concerned.



Overclocking

The Radeon HD 5670 seemed to be an eager overclocker, as we could easily push it to the maximum allowable clock speeds using ATI Overdrive: 850MHz at the core and 4200MHz DDR on the memory. This gave us 2471 3DMarks, a bump of about 5% on the Extreme preset of 3DMark Vantage.




The Affordable Evergreen

Casual and mainstream users and gamers who have been holding back on upgrading their graphics cards can rejoice, for the new Radeon HD 5670 is a solid mainstream gaming solution.

Seeing that the Radeon HD 5670 was remarkably faster than its predecessor - the Radeon HD 4670 - and that it also comprehensively beat NVIDIA's competing GeForce GT 240, we think it's fair to say that the new Radeon HD 5670 performs very well and up to our expectations. More impressive was the fact that it could (at times) match the GeForce 9800 GT, a very popular choice amongst mainstream users. The only downside to the new Redwood XT chip is that it seemed to exhibit some difficulty tackling anti-aliasing at very high resolutions, but that's not much of a problem considering that this series of cards wasn't meant for such high-resolution gaming.

Performance aside, we were also glad that despite its mainstream and budget billing, the Radeon HD 5670 gets useful features such as Eyefinity and support for multi-GPU via CrossFireX, making it possible to tack on an additional Radeon HD 5670 as a means of upgrading. However, do take note that while the Radeon HD 5670 is technically capable of CrossFireX support, vendors might choose to omit that option on their cards to keep costs down.

With so much going for it, and with a launch price of under US$100 (S$138), the Radeon HD 5670 is a solid, bang-for-buck graphics card.

Overall, the Radeon HD 5670 is a decent and solid offering from ATI. It offers better performance and efficiency than NVIDIA's competing GeForce GT 240, and the ability to upgrade via CrossFireX (depending on vendor) is an added advantage over the GeForce GT 240.


Looking forward, there's more to come from ATI, as there are plans to roll out more mainstream and budget variants of the Evergreen series in the form of the Radeon HD 5500 and 5400 series.

From the green camp, rumors are that Fermi cards will go into production in the third week of February and consumers can expect to find them for retail as soon as mid-March. However, it is said that the cards will be extremely limited as yields will be on the low side. Nevertheless, the arrival of Fermi can only be a good thing, seeing that developments in the graphics card scene have been increasingly one-sided in the past few months.


Friday, January 8, 2010

AMD Introduces ATI Mobility Radeon 5000 Series at CES 2010

source: anandtech.com
We are right in the middle of CES 2010, which naturally means it's a great time for tons of product announcements. Intel and their partners are busy launching the new i3/i5 processors, and AMD is taking this opportunity to announce the next version of their Mobility Radeon lineup. Yesterday, we discussed NVIDIA's quiet launch of the GT 240, and while we weren't impressed with the price/performance offered, at least we were able to get some hardware to test. As with most mobile GPU "launches", all we have right now are specifications and features. We will have to wait to get some laptops before we can provide concrete performance numbers. With or without independent performance, today ATI is announcing their new 5000 series DX11 mobile GPUs.


Performance with ATI's mobile graphics chips has generally been competitive with NVIDIA products in recent years. In fact, ATI is keen to point out that their market share for discrete mGPUs has increased to over 60% in 2009, with a whopping 13% increase in 2Q09. Since NVIDIA is the only other discrete mobile graphics solution, ATI's win is NVIDIA's loss.


That of course doesn't tell the full story, and in speaking with ATI we were informed that 70% of sales have been 4300/4500 parts - the lowest performing, least expensive offerings. 15% of sales are from the 4600 series, which is where reasonable gaming can finally enter the picture. As for the 4800 series, it's still sitting at less than 5% (with the remainder of ATI's sales apparently coming from older 3000 series parts). NVIDIA also sells far more low-end and midrange mobile GPUs than high-end parts, but as far as high-end laptop graphics is concerned, NVIDIA has had a clear lead for a while, for a couple of reasons.

First, NVIDIA has been very good about OEM/ODM design wins for their fastest GPUs. We're still running G92b cores in the GTX 280M, and the G92 first became available almost 2 years ago with the 8800M series. With only minor tweaks to the design, it has been easy for manufacturers to update existing lines with support for new NVIDIA graphics solutions. We've seen Gateway, ASUS, Dell, Clevo, and others go from 8800M to 9800M to GTX 260M/280M with very few updates to accommodate the new GPUs. That's good for the manufacturers, but we're not particularly happy that laptops are still less than half the performance of desktop parts. Then again, it's difficult to deal with laptop power and cooling constraints.

The other big reason that NVIDIA has ruled the high-end of mobile solutions is driver support. We tested the ASUS W90Vp and found that it can offer compelling performance… when the drivers worked properly. This is especially important for CrossFire and SLI solutions, and that's the one area where ATI's Mobility Radeon drivers have been severely lacking. (And as a side note, we still haven't seen much in the way of driver updates for that platform!) We asked ATI about this and were told that an improved mobile driver plan should be coming "very soon". Frankly, if the Mobility Radeon 5800 series is going to have any chance at winning gamer mindshare, those plans can't come soon enough. And of course, NVIDIA also has CUDA and PhysX to talk about. ATI in turn has Stream, and DirectCompute and OpenCL are industry standards that will hopefully supersede CUDA and Stream, but for the time being we have to give the GPGPU crown to NVIDIA.

In short, while ATI has had very compelling performance on the desktop - in fact they have been our recommended solution with the 5000 series, and the 4000 series was very good from a price/performance standpoint - on laptops ATI has been more of a multimedia solution than a gaming solution. That's fine and ATI is clearly getting design wins, but don't try to convince us that low-end discrete graphics (HD 4330 and GeForce 9300M/G110M, we're looking at you!) are anywhere near being good gaming solutions. At best, they are barely adequate, capable of running the majority of games at 1366x768 and low detail settings. If that's all you need, or if you don't even need gaming performance and are just interested in multimedia solutions, forget about discrete graphics and go with an IGP.

Okay, enough talk about the past; let's look at what AMD is announcing today. What we really want to know is if performance and compatibility are going to be where we want. Unfortunately, all we have are some figures from ATI regarding gaming performance, and until we see updated driver plans we won't know what to expect in terms of compatibility. But then drivers have only really been important if you're running high-end solutions, which is less than 5% of the laptop market.

ATI 5000 Series Specifications

With our introduction out of the way, let's take a look at ATI's new mobile parts. The big news of course is that these are the first DirectX 11 mobile graphics chips, just as their desktop counterparts were the first DirectX 11 GPUs. Despite the similarity in naming, all of the mobile parts are essentially one step back from the desktop parts, not just in clock speeds but also in stream processors. Here's the quick rundown of specifications and features, with NVIDIA's GTX 280M tossed in for comparison.



Overall, the new 5000 models should boost performance by around 30% relative to the last generation of ATI mGPUs, at least in theory. They have more stream processors, higher clock speeds in most cases, plus other enhancements. One of the enhancements is something that's going to be up to individual manufacturers to utilize: GDDR5. GDDR5 sends four bits of data per clock cycle compared to two bits for GDDR3/2/1, and it also runs at lower voltages. The problem is that GDDR5 costs more, and with support for GDDR3 it's likely that many manufacturers will stick with GDDR3 in order to save money in the short term. ATI informed us that they expect pricing to become more or less equal towards the end of the year, at which point we should see a strong move towards GDDR5. Without the newer memory, the performance gains in some titles are going to be lower due to memory bottlenecks.
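
As a rough sketch of why that matters (the clock below is purely illustrative, not any specific SKU's): at the same command clock and bus width, GDDR5's four bits per pin per cycle doubles peak bandwidth over GDDR3.

```python
# Peak memory bandwidth = bus width (in bytes) x effective transfer rate.
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000  # GB/s

mem_clock = 1000  # MHz command clock, an illustrative figure

print(bandwidth_gb_s(128, mem_clock * 2))  # GDDR3: 2 bits/pin/clock -> 32.0 GB/s
print(bandwidth_gb_s(128, mem_clock * 4))  # GDDR5: 4 bits/pin/clock -> 64.0 GB/s
```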

Also worth pointing out is the HD 4860, a product we can find on AMD's website complete with specifications, and a product that was announced back in March 2009. What makes the HD 4860 "interesting"? To our knowledge, it never shipped in any laptops, and with the 5000 series it probably never will. The reason we point this out is that today's announcement does not necessarily guarantee hardware anytime soon. Hopefully we will see laptops with the new graphics chips appear in the next few weeks, but particularly on the high-end GPUs it can take months before anyone picks them up. As another example, the mobile HD 4870 only appeared in a couple laptops: the ASUS W90 and the Alienware M17x. And now it's apparently EOL. So we have to raise the question of how long it will take before we start seeing laptops with 5000 series parts (particularly the 5800 series), and only time will give us an answer.

ATI touts several features as noteworthy, including GDDR5 and DirectX 11. One feature seems rather out of place, however: Eyefinity. While it can be pretty cool to run multiple monitors on a desktop system, we really have our doubts that many people will use up to six displays with any laptop. Nevertheless, that's exactly what the 5800 and 5700/5600 series support (in theory). Naturally, it would be up to the laptop manufacturer to provide six display outputs, which seems highly unlikely. (Three digital outputs is more than sufficient, we think.) That still leaves GDDR5, which we've discussed, and DirectX 11. We'll talk more about DX11 on the next page, but there's one other item to mention: power saving features.

TDP for the various products hasn't really changed, but that doesn't mean power characteristics are the same as the 4000 series. In the above chart, we've listed the maximum TDP of the GPU followed by the TDP of the entire GPU subsystem (i.e. with RAM and voltage regulators). ATI informed us that they improved engine and memory clock scaling as well as clock gating, bringing about significant improvements in power requirements. At full load these improvements aren't going to be as noticeable, but idle power draw should be significantly lower than on previous discrete mGPUs. ATI claims a 4X performance-per-Watt improvement when comparing the HD 5750 to the HD 3650. One word of caution, however: optimal idle power apparently requires GDDR5, which allows better control over changing the memory clocks. As mentioned above, many of the midrange parts are likely to stick with GDDR3 until later this year.

ATI also let us know that they have worked on IGP to discrete GPU switching times, which should be 30~40% faster (depending on manufacturer implementation). Ever since we first saw a hybrid GPU solution in the ASUS N10JC, we have hailed the technology as a feature that every gaming laptop needs to have. We also discussed the feature on the Alienware m15x, and it helped make the ASUS UL80Vt our favorite current laptop. So far, uptake has been slow, but things appear to be improving with the launch of Windows 7. Lenovo, Sony, and others are pursuing hybrid solutions, and with Win7 supporting multiple GPU drivers (Vista and XP could only enable one display driver at a time) the way is paved for the future. We still haven't tested an ATI-based laptop with hybrid graphics, however; so far the only hybrid solutions we've tested use NVIDIA GPUs, but maybe the 5000 series can change that.

Mobile DirectX 11 Arrives… Where Are the Games?

Just as ATI was the first out of the gate with DirectX 11 hardware on the desktop, they are now the first mobile DX11 solution. That's good, but we still need gaming support, and that is at best a work in progress. So far there are three games that have shipped with DX11 features: Battleforge, DiRT 2, and STALKER: Call of Pripyat (well, it's available if you speak Ukrainian at least; we're still waiting for the English version, although a public benchmark is available). Here's a list of some other titles that are on the way:


Not convinced that DirectX 11 will arrive anytime soon? Perhaps this slide will change your opinion:


Okay, we remain skeptical as well, no offense to Mercury Research. In truth, it's only in the past year that DirectX 10 has really arrived, with most new games supporting DX10 features, but there were a few early DX10 titles not long after Vista launched. Unless adding DX11 features is very easy, or someone is footing the bill for developers (i.e. ATI), DX11 isn't likely to become dominant until the majority of graphics cards have DX11 support. Considering the installed user base for DX10 hardware, that could be several years away.

Speaking of DX11, here are some more slides showing what is possible with the new hardware:





The big one in that list is obviously tessellation… but haven't we heard that before? Why yes we have! And it was only just under three years ago. Okay, you can read more about DX11 and tessellation in our DX11 article. The good news is that DX11 finally makes tessellation a required element, so we're more likely to see it utilized. Unigine is used in the above images, and you can see the complete list of Unigine projects for potential early tessellation candidates. We do like the idea, but until it becomes reality it's just a checkbox on your GPU hardware, which is what we've had since the HD 2900.

Performance Preview

Since we don't have hardware, we are left with the charts that ATI provided. Obviously, you need to take these benchmark results with a huge grain of salt, but for now it's all we have to go on. ATI provided results comparing performance of their new and old Performance (5650 vs. 4650) and Enthusiast (5870 vs. 4870) solutions, an additional chart looking at WUXGA single vs. CrossFire performance, and two more charts comparing performance of high-end and midrange ATI vs. NVIDIA. Here's what we have to look forward to, based on their testing.






The performance is about what we would expect based on the specifications. Other than adding DX11 support, there's nothing truly revolutionary going on. The latest 5870 part has a higher core clock and more memory bandwidth - 27% more processing power and 11% more bandwidth compared to the standard HD 4870, to be exact. The average performance improvement appears to be 20~25%, which is right in line with those figures. Crysis appears to hit some memory bandwidth constraints, which is why the performance increase isn't as high as in other titles (see the 5650 slide for a better example of this).

On the midrange ("Performance") parts, the waters are a little murky. The highest clocked 5600 part (5750 or 5757) can run at 650MHz and provides a whopping 62% boost in core performance relative to the 4650, but that's only 10% more than the HD 4670. With GDDR5, the 5750 also offers a potential 100% increase in memory bandwidth. That said, in this slide we're not looking at the 5750 or 4670; instead we have the 5650 clocked at 550MHz with 800MHz DDR3 and ATI compares it to the 4650 with unknown clocks (550MHz/800MHz DDR3 are typical). That makes the 5650 25% faster in theoretical core performance with the same memory bandwidth. The slide shows us a performance improvement of 20~25% once more, which is expected, except Crysis clearly hits a memory bottleneck this time. On a side note, running a midrange GPU at 1920x1200 is going to result in very poor performance in any of these titles, so while the 5650 is 20% faster, we might be looking at 12 FPS vs. 10 FPS.

Moving to the ATI vs. NVIDIA slides, it's generally pointless to compare theoretical GFLOPS between the ATI and NVIDIA architectures as they're not the same. The 5870 has a theoretical ~100% GFLOPS advantage over the GTX 280M but only 5% more bandwidth. The average performance advantage of the 5870 is around 25%, with BattleForge showing a ~55% improvement. Obviously, the theoretical 100% GFLOPS advantage isn't showing up here. The midrange showdown between the 5650 and GT 240M shows closer to a ~30% performance increase, with BattleForge, L4D, and UT3 all showing >50% performance improvements. The theoretical performance of the 5650 is 144% higher than the GT 240M, but memory bandwidth is essentially the same (the GT 240M has a 1% advantage). There may be cases where ATI can get better use of their higher theoretical performance, but these results suggest that NVIDIA "GFLOPS" are around 70% more effective than ATI "GFLOPS".
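
To show how those theoretical numbers are usually derived, here's a small sketch. The shader counts and clocks below are the commonly quoted specs for these parts, and the per-clock FLOP counts are the vendors' own marketing math, so treat all of it as assumptions rather than measured fact:

```python
# How the marketing GFLOPS figures of this era are typically derived.
def gflops(shaders, flops_per_shader_per_clock, clock_mhz):
    return shaders * flops_per_shader_per_clock * clock_mhz / 1000

# ATI counts 2 FLOPs (multiply-add) per stream processor per clock;
# NVIDIA counted 3 (MAD + MUL) per CUDA core per shader clock.
ati_5870m  = gflops(800, 2, 700)    # Mobility HD 5870: ~1120 GFLOPS
nv_gtx280m = gflops(128, 3, 1463)   # GeForce GTX 280M: ~562 GFLOPS

print(ati_5870m / nv_gtx280m)  # ~2.0, i.e. the ~100% theoretical advantage cited
```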

For reference, ATI uses a desktop system for the ATI vs. ATI charts and some form of Core 2 Duo 2.5GHz setup for the ATI vs. NVIDIA charts. This isn't really something fishy, considering there are as yet no laptops with the new ATI hardware and no one offers an identical laptop with support for both ATI and NVIDIA GPUs (well, Alienware has the m17x, but that's about as close as we get). Here are the test details:



The first slide states a clock speed of 450MHz on the 5650 but the second says 550MHz. Given the figures in the ATI performance comparison, we think the 550MHz clock is correct.

Initial Thoughts

On paper, the launch of ATI's latest 5000 series mobile GPUs looks very good. They won't provide a huge boost to gaming performance, but they should handle video processing at least as well as the previous generation, and we are very interested in the claims of dramatically improved idle power consumption. Unfortunately, until we get a laptop for testing we can't say for certain how well everything will turn out. There are a few flies in the ointment as we see it.


First, there's the question of drivers. To be honest, ATI's drivers on mobile GPUs (outside of CrossFire solutions) have worked well for us. We have an older laptop with an HD 3650, and updating the drivers from a 2008 release to a 2009 release (Catalyst 9.5) didn't improve performance, nor did we experience any compatibility issues. But that was with HD 3000 series hardware, which is now a couple of years old. With new hardware, we expect driver updates to accomplish more in terms of improving performance and compatibility. The real catch is that compatibility is almost certain to cause problems once we start seeing more DirectX 11 games. Why is this likely? Simply look at the past.

The first DirectX 10 hardware came out in November 2006. We didn't see many DX10 enabled games for a while, but nearly one full year after the launch we got Bioshock. Bioshock supported DX10, and guess what needed driver updates? That's right: all of the 8800 series NVIDIA cards (and laptops). We see a similar story with NVIDIA's CUDA and PhysX where driver updates are critical if you want to use those features. We are in the infancy of DX11, and we haven't even begun to scratch the surface of DirectCompute or OpenCL. When we start to see applications using these APIs, we can pretty much guarantee both ATI and NVIDIA will need to provide regularly updated drivers. We have no concerns about ATI's ability to do so on the desktop, but right now they don't have any mobile driver plan as far as we can tell, referring you instead to your notebook manufacturer. Most manufacturers stop updating drivers after a few months, and some models don't even get that!

We mentioned earlier that ATI informed us they will announce plans for an improved mobile driver program "very soon". Let's take a moment to make sure ATI knows exactly what we expect. We need, at the minimum, new drivers for all current mobile GPUs released on a regular schedule. NVIDIA has committed to quarterly releases in the past, and we would suggest that is a good starting point. We don't necessarily need complete integration with the desktop driver releases, but that would be the ideal end goal. NVIDIA hasn't managed to pull that one off yet, but they're getting closer. And just to be clear: we know ATI has tried to do drivers like this in the past, and the OEMs said no. Well, OEMs, there's not a chance you can make a good gaming laptop unless you let users get regularly updated graphics drivers from the GPU manufacturer. So get with the program!

And let's not even discuss ATI drivers for alternative operating systems like Linux. Ugh. I'll leave that to Christopher.

The second concern is the availability of these new graphics chips. We don't mean being able to go out and buy the chips themselves; we mean the ability to purchase laptops that use the latest and greatest ATI hardware. Ideally, we would really like to see some of the new Arrandale laptops with 5800 series hardware (or even 5600/5700 hardware). We know several such laptops are in the works, and we can discuss them tomorrow, but we still need to see how well they work. Considering Arrandale has a built-in IGP, these laptops had better support hybrid graphics as well. The idle power draw of ATI's 5000 series may be significantly lower than previous ATI mobile solutions, but nothing beats the ability to shut your discrete GPU down completely and run off a low-power IGP when you don't need the extra graphics performance. ATI supports this, so again it is up to laptop manufacturers to make sure they implement the feature.

Finally, we've talked a little bit about NVIDIA, and we know NVIDIA is working on updated mobile hardware as well. Will they support DirectX 11? We don't know. How fast will they run? We don't know, but it's a safe bet they will be at least 20% faster than their previous generation hardware, which means they could easily match ATI's performance. How soon will this hardware be available? Probably at least a month or two after ATI's hardware, given NVIDIA hasn't announced details yet. That's a lot of questions and very few answers, but we may have more information by next month so stay tuned.

Wrapping things up, what we have today is your typical notebook GPU launch: it's all on paper with very little hardware out there for review. We know there are laptops at CES using some of these new GPUs, but CES is full of products that won't be available at retail for another month or two at least. As soon as we can get hardware in our labs, we will be able to provide an actual review of ATI's hardware. It sounds good, but drivers in particular are still a major concern for us, especially on the 5600 and above. Perhaps the most impressive aspect of this launch is that ATI is doing a top to bottom mobile GPU update, rather than the more common high-end followed by midrange and low-end (or vice versa) launches we frequently see. We'd also like to see a new IGP solution that can double the performance of the HD 4200, but that's just us being greedy.

For those who are interested in the complete presentation, below is a gallery of all 34 slides. You can read additional information about Blu-ray support (it works fine on our old HD 3650 laptop, so we're not exactly sure what has changed), ATI Stream, and other details we didn't feel needed a lot of discussion. Enjoy!


