As you might have read in my earlier post, I received an ATI 5870 video card and I had to nickname it “the beast”, because this sucker is big, and that’s just how I like it. I’m upgrading from an ATI 4850, which is another fabulous video card, but I’m certainly not going to deny myself the opportunity to try out this latest one.
If you are interested in all the fancy numbers, you can view the technical specs here, but here are the main features from the ATI site.
As you can see in the images above, this is obviously a dual-slot card, meaning it’s going to take up two spots on the back of your PC. This wasn’t a problem for me as I don’t have any other PCI cards installed, but it’s something you need to take into consideration. Installation was quick: just make sure both slots are free, snap in the card, and hook up the power. After making sure it was securely seated, it was ready to go. The next step was booting up and installing the latest Catalyst drivers, which is a straightforward installation; I didn’t run into any issues whatsoever.
The card has two DVI outputs, an HDMI output, and a DisplayPort output. In my current setup I have two 22” widescreen monitors hooked up to the DVI ports. I’m still contemplating what to do with the HDMI port. Anyway, after the driver installation I went into the display properties and easily configured Windows 7 to set up the dual monitors the way I want them: the main monitor on the right and the second on the left.
I just installed this a day ago, so I’m going to break it in for a bit, and then follow up with a review on the performance of the card from a user’s point of view, not the super-technical number crunching you often see.
Your beast is tiny, sorry to say. That picture only has it going past the RAM sockets... My stupidly large GTX 260 goes another 2-3 inches... It's more of a beast in performance (and the 5xxx series rocks) than in size.
Btw, you should try some Eyefinity. My 5770 plays Fallout 3/Left 4 Dead 2/TF2/BioShock 2 on highest settings (except no AA) at 3600x1600 on a Pentium D 805; I'm betting that 5870 could run Crysis at an amazing resolution.
ID... Google the Resident Evil 5 benchmark 'demo'... and run that. It'll give a 'real' gaming performance benchmark...
[Mine gets an average of 106fps at 1680x1050, 32-bit, DX10]
106 fps... your eyes can't see any difference much past 50, can they?
Oh no....
Your eyes can see into the hundreds of fps. Action-wise you can see up to 100 fps. With very fast action (e.g., fast cars) it's easy to see up to 100-200 fps. When you take into account flashing lights that flash many times per second, your eyes can detect 500 fps or more.
Please, PLEASE don't turn this into another of the internet's billion "how many FPS can your eyes discern" topics.... So many disagreements, SO much misinformation...
On topic: That's a sweet card. My GTX 285 still does me proud, but I'm interested in what the next gen can grind out, for sure.
I looked it up, and the answer is no.
Correct; however, the flicker is 'just' discernible on a 50 Hz television... where 100 Hz is 'clean'... though more often than not 60 Hz is enough.
Games are not televisions; they are generating images in 'real time', so IF/WHEN there is a particularly graphics-intensive render/scene the framerate will lag... and if there is insufficient 'headroom' you will definitely see it as stutter if/when you slow to 25fps or thereabouts.
If you can AVERAGE around 100fps in gameplay, there is a very good chance you will not at some stage or other be waiting for the screen to draw while your enemy caps yo ass... or you understeer into the armco...
You can't do anything with the HDMI port if you are using the DVI ports.
For three-monitor mode you would need to use a DisplayPort monitor as one of the monitors. I have an HP LP2475w for this purpose. Ideally all three monitors would have the same native resolution too, so you could use them in Eyefinity mode.
Oh god, it is another "how many fps can you see" thread...
As for you, ok, so YOU can only see, let's say, 50 fps.
Me, I can tell the difference between 50 and 80-100 fps, not in the middle of my sight but at the edges, where the discrepancy between the changing positions of an object becomes visible. In films you have mostly fluid motion (mostly, because many people agree 48 fps would be better, let alone 96) because you have the effect of ghost images called motion blur, so movement seems fluid even though it is jumpy. In computer games, you lack the ability to do serious motion blur, hence many gamers consider 60 fps the minimum speed at which to play an FPS. The minimum!
I probably can't tell any difference if the speed is greater than 80-100 fps, but that will change when you use a much bigger screen with a much higher resolution (think 4x4 meters and 8192 horizontal pixels).
Why do you think IMAX HD and 3D use 48 fps even though they have motion blur as they are film?
Here are some concepts, phrases, and fragments that anyone wanting to speak authoritatively about perception thresholds of the Mk II Human Eyeball really should know about:
* analog
* digital
* individual threshold of perception
* illusion of motion
* Application/game frames per second
* Monitor vertical refresh rate
* CRT monitors
* progressive and interlaced scanlines
* For the US audience (and perhaps a handful of other countries): 60 Hz AC, 120 Hz fluorescent lighting flicker, 60 Hz default Windows v-rate for CRTs, and sympathetic waves.
On topic: I bought a 5770 a few weeks ago. I like it. This really says quite a bit more than otherwise indicated - it's my first ATI card in bleems - never wanting to use them after a poor time with an under-performing 9250 (or something like that) and a giant, fat, overweight driver configuration utility named Catalyst. But nVidia can't really compete on features or specs at a given price point right now for new card purchases.
LOL. My Nvidia GT 275 is about 10-12 inches long. I had to move a hard drive to get that mofo in.
Ewww, stock cooling. I hope it's better than the last dozen or so generations... I prefer things that don't sound like a small jet.
This only applies if your monitor updates at more than 60Hz though, which is unlikely with an LCD or LED monitor. As far as I know, even 100Hz+ TVs will still update a computer signal at 60Hz. This means that if you can tell the difference between 60 and 100FPS, it's definitely not because you see "more" frames in the game.
187.6 @ 1600x900
Just added my overclock - I usually have that off because the PC acts like a mini fan heater... - i7 920 @ 3.67GHz, mem @ 1451MHz, with 2 x 4890s in CrossFire, and...
... 228.4 fps
Adequate
This lovely topic takes 'e-peen' to a whole new level...
And with that said... my GPU is longer than all of yours!
I, on the other hand, don't mind if I get a steady 30fps. As long as it's steady. I'm not a user who cares if I get 100 more fps than what my eye can ever see. lol
That 106fps average had a minimum around 70fps so there was/is sufficient headroom to make staggering 'unlikely'.
Same benchmark test on supposedly one of the fastest/best AGP cards released showed an average of 21.9fps at 1280x1024 32bit DX10 with a minimum around 10fps maybe. [both resolutions were native to their respective monitors].
I was running BioShock with FRAPS up, DX10, everything maxed at my monitor's native 1440x900, clamped at 60FPS due to v-sync. At one point it dropped down to the mid 40s for a few seconds and I would never have known if I hadn't seen the counter. Maybe my eyes are just getting old, but unless I get a huge dip in something it just doesn't seem noticeable. I think I'd rather have a game using enough of my stuff to keep a smooth 45-60FPS rather than have a whole bunch of unused potential I paid for just sitting there. Of course, as time goes on that potential will get used on those cards that can do 100+FPS now, but they will also be a lot cheaper by then.
FRAPS used to chew up my framerate like baby food until I got my new laptop.
I have the 5750 model and found that you can only use the HDMI slot in certain configs, which does not include dual DVI. In a dual DVI config, you'll need to have an active DP adapter (or a monitor with a native DP connection). The adapter runs about $100 and can be found on Amazon.com. (Found all of this out the hard way after purchase.)
Grab any game that runs above 60FPS at all times, play it with vsync on and off. If you can actually SEE the difference, you'll probably note that vsync off seems more jerky (screen tearing), even if it's a higher framerate.
With a refresh rate of 60Hz on the monitors, you still have only 60FPS visible regardless of how fast the graphics card renders.
In games where reaction time is significant you do get slightly better response time with framerates higher than 60, though, simply because the images actually displayed will be at most 1/120th of a second old at 120FPS, while at 60FPS it'll be 1/60th. It may seem like a very short time, but it makes a difference (it's basically an 8ms reduction in latency (ping)). You can't actually see this difference, but it'll make the game feel more responsive.
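To make that arithmetic concrete, here's a minimal sketch (plain Python with illustrative numbers only; nothing here is measured from the card or drivers):

```python
# Worst-case age of the frame on screen: at most one full render interval.
def frame_age_ms(fps: float) -> float:
    """Upper bound on how old the displayed frame is, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> frame up to {frame_age_ms(fps):.1f} ms old")

# 30 FPS -> frame up to 33.3 ms old
# 60 FPS -> frame up to 16.7 ms old
# 120 FPS -> frame up to 8.3 ms old (roughly 8 ms less than at 60 FPS)
```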
The 5870 is overkill right now. I've got a 4870 and run the latest games (Mass Effect 2, BioShock 2, Demigod) at 1680x1050 with full details (AA, depth of field, etc. all maxed) without dropping below 40 fps. The only problem I had was Anno: Dawn of Discovery.
Unless you want to run 2 games at the same time, there's really no point in buying it now (when the price is so freakin' high). Wait a few months till Nvidia releases a counter product, and when the price war starts you can get this card for 40% of its original price.
Lol at 100+ fps when your eye can only catch 26-30 fps.
On another note, it's funny that you need to spend over $2500 on PC parts to get the same gaming experience as you would get from a $500-700 PS3. Not to mention you don't need to worry about the latest drivers, DirectX updates, or security software like TAGES or SecuROM that can cause your computer to crash or make it impossible to use copies of your discs (which you have a right to make!!!).
You just simply put the Blu-ray disc in and play at the best possible settings. Not to mention that most new games are designed for either the PS3 or the Xbox 360 and PC users get lousy ports (GTA 4 is the best example here; the new AvP looks like the next example of it).
The only reason I stick with PC games is because I like RTS games and shooters and I just can't imagine playing them on a joystick. Also, thanks to pirates and torrents, prices of PC titles are more competitive than PS3 games (many developers decide to drop the prices of their products to fight pirated copies).
Playing any online first-person shooter at 30 FPS will give you a significant disadvantage against a player who plays at 120FPS. It doesn't matter if you can actually see the difference or not. At 30fps it takes 33.3ms to render a screen, and during that time a lot happens regarding the position of the enemies. Let's say you have a 100ms ping to the server, which is fairly good; that makes you end up with roughly 30% more lag just because of your graphics card. It may not sound like much, but at "pro" level it makes a HUGE difference. At 100FPS there's only a 10ms lag caused by the rendering, which should be quite close to negligible.
That should end this discussion, because there's really no opposing this basic fact. Higher FPS will affect your gameplay. Seeing the difference or not is completely irrelevant.
I seriously doubt that anyone would gain anything from going higher than 120FPS though (120FPS seems ideal to me, because it would be nicely synced with a 60Hz screen refresh rate).
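For anyone who wants to check those numbers, here's a rough sketch of the same calculation (Python, using the 100ms ping quoted above as an illustrative figure, not anything measured from a real game):

```python
# How much the render interval adds on top of network ping.
PING_MS = 100.0  # illustrative ping, same figure used in the comment above

def render_lag_ms(fps: float) -> float:
    """Time to produce one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 100, 120):
    lag = render_lag_ms(fps)
    print(f"{fps:>3} FPS: {lag:5.1f} ms render lag, "
          f"about {lag / PING_MS:.0%} on top of the ping")

# 30 FPS adds ~33 ms (~33%), 100 FPS ~10 ms (~10%), 120 FPS ~8 ms (~8%).
```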