Computers & Electronics

Nvidia: Poor 2D graphics quality?

Last Updated: Jul 7th, 2011 11:37 am
Deal Fanatic
Jan 5, 2002
5287 posts
3345 upvotes

Nvidia: Poor 2D graphics quality?

So here's the story: I've been a LONG-time ATI/AMD graphics card user. My last (and only) Nvidia graphics card for a desktop PC was the TNT2, so yeah, it's been a long time. Since then I've had a variety of ATI cards, mostly their top-model AIW cards until they stopped making them, and my main rig currently runs 4870s in CrossFire. Before the TNT2 I used Matrox and ATI cards, mostly Matrox. I only used the TNT2 for a short time because I found its image quality terrible compared to the Matrox/ATI cards I was used to, so I switched to an ATI AIW Radeon and found it much better.

So I decided I wanted to upgrade my recently built HTPC, a Sandy Bridge 2500K on an ASUS P8H67-M Evo motherboard. I was using the Intel HD 3000 onboard graphics and the quality was fine; even light gaming was pretty good, but I wanted a little more gaming power. I run this through my Yamaha RX-V3900 receiver over HDMI, which connects to my Epson 8700UB projector projecting onto a 100" Dragonfly screen. I was very happy with the desktop graphics. I use this PC for watching videos from my hard drive, Netflix, and other online services like NHL Gamecenter and UFC.tv. Surfing the web was fine and everything looked great, but like I said, I wanted more gaming power.

I'm considering upgrading my main PC to either a Radeon 6970 or a GTX 580, and since it has been so long since I tried an Nvidia card, I figured I'd give a GTX 460 a shot to see how it compared to what I'm used to. That way I'd have some idea before spending almost $600 all in on a GTX 580.

I purchased an EVGA GTX 460 FTW EE, and I have to say it looked terrible: the text was blurred and the images on the desktop looked fuzzy and bad overall. I did a quick switch back to the Intel HD 3000 and it was significantly better. So I decided to try it on my main PC's monitors, a Dell UltraSharp 2407 24" and a Dell UltraSharp U2311H 23". I'm very used to the image on these monitors from my ATI 4870s. The GTX 460 looked like crap compared to the 4870; I then tried the Intel HD 3000 and it looked much better than the GTX but slightly less clear than the ATI. This was using the same DVI cable I've always used.

So I figured I'd gotten a bad card and went to the local store to buy a GTX card I could easily return; I picked up a Zotac GTX 560 and installed it. It was equally bad. I did a full driver clean with Driver Sweeper, removing everything, and guess what: same results. I'm baffled at how this can be. I ran some movies and video games, and there the image quality is much more comparable to the ATI card, maybe a little less sharp, but hardly noticeable unless you are standing still and really looking at non-moving items.

So here's my question: does Nvidia still have poor 2D quality? How can it be that this isn't really mentioned in reviews? I use my PC for 2D stuff more than I play games. The difference is so big that I couldn't see myself using the card like this for day-to-day stuff. I'm really disappointed with this.

I have two laptops with Nvidia chips in them. One is an Alienware m11x, and its quality isn't terrible: text is readable and things look good. I've passed it off since the 11.6" screen on the m11x isn't that great overall, but I love the laptop; it uses a 335M. My other laptop is a Sony Vaio F series with a 310M graphics chip. It looks better than the Alienware, but that's because it uses a nicer LCD. I'll admit that neither looks as good as my desktop, but I figured that was normal for a laptop anyway.

So what gives? Has anyone else noticed this? I'll probably keep the GTX 460 because I got a pretty good price, and I wanted to try PhysX too. However, I sort of wish I'd spent the extra $30 or so and gotten a 6850 instead.
That's 30 minutes away, I'll be there in 10.
Beer: The cause of, and solution to, all of life's problems.
6 replies
Deal Addict
Jan 21, 2010
3373 posts
129 upvotes
Scarborough, ON
I have 5 AMD video cards and 7 Nvidia cards and swap them out constantly, and I've never seen any noticeable image differences, except in AMD-sponsored games like Stalker: Call of Pripyat and Medal of Honor when run on Nvidia cards.

Check the Nvidia Control Panel to see if you have interlacing turned on or some sort of non-native scaling.
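
If you'd rather check from a script than click through the control panel, the current output mode (including the interlaced flag) can be read straight from Windows. Here's a minimal Python sketch using ctypes against the Win32 EnumDisplaySettingsW API; nothing in it is Nvidia-specific, it just reports the mode the desktop is actually running in:

    import ctypes
    from ctypes import wintypes

    ENUM_CURRENT_SETTINGS = -1  # magic index meaning "the mode in use right now"
    DM_INTERLACED = 0x00000002  # bit in dmDisplayFlags

    class DEVMODEW(ctypes.Structure):
        # Field order and types follow the DEVMODEW layout in wingdi.h.
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmUnion", ctypes.c_byte * 16),  # printer/position union, unused here
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
            ("dmICMMethod", wintypes.DWORD),
            ("dmICMIntent", wintypes.DWORD),
            ("dmMediaType", wintypes.DWORD),
            ("dmDitherType", wintypes.DWORD),
            ("dmReserved1", wintypes.DWORD),
            ("dmReserved2", wintypes.DWORD),
            ("dmPanningWidth", wintypes.DWORD),
            ("dmPanningHeight", wintypes.DWORD),
        ]

    user32 = ctypes.windll.user32
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    if user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm)):
        interlaced = bool(dm.dmDisplayFlags & DM_INTERLACED)
        print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz, "
              f"interlaced={interlaced}")

If the width/height it prints don't match your panel's native resolution, something in the chain is scaling before the signal even leaves the card.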
Deal Fanatic
Oct 7, 2007
7282 posts
1820 upvotes
Mississauga, ON
This is an extremely weird issue, and I'm pretty sure it's not the fault of Nvidia cards in general. More people run Nvidia cards than AMD, so if this issue were real, you would think A LOT more people would have raised a stink. Like you said, the reviews would have been all over this if it were true.

I am running GTX 460's in SLI and have been swapping cards in and out over the years. I have NEVER seen the issues that you are experiencing.

I recommend downloading the latest Nvidia driver and making sure you are running at the monitor's native resolution. Swap in another DVI cable, or try VGA, and see if there's a difference.
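
One more quick check: dump every mode the driver actually advertises and make sure the native one is in the list. This is a sketch that reuses the DEVMODEW and user32 setup from the snippet earlier in the thread; passing an increasing index to EnumDisplaySettingsW walks the driver's mode table:

    # Reuses DEVMODEW and user32 from the earlier sketch in this thread.
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    modes = set()
    i = 0
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
        i += 1
    for w, h, hz in sorted(modes):
        print(f"{w}x{h} @ {hz} Hz")

If the native mode (for example, 1920x1200 at 60 Hz on the 2407) isn't offered, the driver is misreading the monitor's EDID, and that would point at the cable or the card rather than 2D quality in general.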
There's a sucker born every minute.
Deal Fanatic
Jan 5, 2002
5287 posts
3345 upvotes
I've been doing a bit more research online, and in a lot of forums it appears to be common for people going from ATI to Nvidia. That said, it isn't that it's unreadable, and it's much more noticeable on a 100" projection screen than on my 24" LCD. I don't like bright, over-saturated images; I'm using settings on my 24" that people often call dark. My Epson 8700UB is running in THX mode, and like I said, the Intel HD 3000 looks fantastic, as do all of my other input devices (PS3, 360, etc.). After posting this I connected my Alienware m11x over HDMI to the same input I was using for my HTPC, and the quality wasn't as good as the HD 3000 but was very close. Then I realized I was using the integrated Intel video, not the Nvidia 335M chipset, on my m11x, so I switched to the 335M and noticed an immediate drop in text quality. It wasn't as bad as the GTX 460, but it was certainly worse than anything else I tried on my projector.

I should also add that I always run dccw in Windows 7 to adjust the basic settings and enable ClearType when changing video cards, and then I'll adjust my monitor accordingly if required.
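
For anyone else doing the same dance, whether ClearType actually survived a card swap can be confirmed from the registry instead of by eye. A small Python sketch reading the standard per-user values under HKCU\Control Panel\Desktop (these exist once font smoothing has been configured, e.g. by running the tuner):

    import winreg

    # Standard per-user font smoothing settings in Windows 7.
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
        smoothing, _ = winreg.QueryValueEx(key, "FontSmoothing")        # "2" = enabled
        smooth_type, _ = winreg.QueryValueEx(key, "FontSmoothingType")  # 2 = ClearType
    print("Font smoothing enabled:", smoothing == "2")
    print("Using ClearType (not standard AA):", smooth_type == 2)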

Like I said, the GTX 460 looks blurry, especially on my 100" screen from about 15' away. It makes text much more difficult to read, but you can still read it. I'm sure the difference I noticed on my 24" (which was a lot smaller) would fade away with long-term use, meaning I'd get used to it, but I'm certain that if I switched back I'd notice an improvement. When I use Windows Media Center on the 100" screen the difference is very minimal; it's almost the same. The problem is most noticeable in Steam, where the text on install screens is very difficult to read, so I'm sure the colour scheme has something to do with that.

Here is my opinion of the 2D quality of the cards I tested, using the Radeon 4870 as the baseline:

4870 - 100%
HD 3000 - 95%
335M - 85%
GTX 460 - 75%

So basically the GTX 460 is 25% worse, IMHO.

For 3D, the HD 4870 would be 100% and the GTX 460 about 95%, as it only looks slightly dull/fuzzy if you stop and look for it. In motion they look almost the same.

Someone else on another forum suggested it might be an issue of most people not being used to one manufacturer or another, and that a lot of people use TN monitors, which are typically faster but lower quality. So I connected the 4870 and the GTX 460 to an older 17" Acer monitor I have, and the difference was negligible. Like I said, it's mostly noticeable on my 100" screen, and I'm beginning to think the minor difference isn't that much by itself, but the 100" screen amplifies it.

I've looked around on many online forums (AVS, HardOCP, Tom's, etc.) and many people agree this is an issue; there are a ton of posts describing the same problem. I found a lot of people speculating that it's a QA issue with Nvidia and their RAMDACs.

I can certainly deal with it, but it was a little disappointing.
That's 30 minutes away, I'll be there in 10.
Beer: The cause of, and solution to, all of life's problems.
Deal Addict
Dec 22, 2006
2865 posts
296 upvotes
Toronto
I have noticed this while setting up multiple systems on a TN-panel LCD. In my experience it depends on the specific card, with Intel onboard GMA/HD 2000/3000 generally being the worst. It's most noticeable around text, like you mentioned, where text seems fuzzy/blurry but is still legible.

Normally it's easily fixed by adjusting the pixel clock and phase (sharpness) on the LCD; those controls only apply on an analog (VGA) connection, though. Windows 7 also has a ClearType Text Tuner to adjust text sharpness.
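
If you're re-running those tools after every card swap, both can be launched from a script. This assumes a stock Windows 7 install, where dccw.exe and cttune.exe live in System32 and are on the PATH:

    import subprocess

    # Stock Windows 7 utilities in %SystemRoot%\System32:
    #   dccw.exe   - Display Color Calibration wizard (finishes with a ClearType step)
    #   cttune.exe - ClearType Text Tuner
    for tool in ("dccw.exe", "cttune.exe"):
        subprocess.run([tool], check=False)  # check=False: the user may cancel the wizard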
Deal Fanatic
Apr 24, 2006
7000 posts
1122 upvotes
Toronto
Shot in the dark...

I've found that blurriness on an LCD can occur when the refresh rate is set too high, essentially at what a CRT is typically set to (i.e., telling it to use 75 Hz instead of 60 Hz).

Typically the video card detects the LCD and adjusts automatically, but I've seen some that do not.
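
If you want to test that theory without hunting through driver menus, the refresh rate can be read and forced back to 60 Hz programmatically. A sketch reusing the DEVMODEW/user32/ENUM_CURRENT_SETTINGS setup from earlier in the thread; DM_DISPLAYFREQUENCY is the dmFields bit that tells ChangeDisplaySettingsW which member to apply:

    # Reuses DEVMODEW, user32 and ENUM_CURRENT_SETTINGS from earlier in the thread.
    DM_DISPLAYFREQUENCY = 0x00400000  # dmFields bit: apply dmDisplayFrequency

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
    print("Current refresh:", dm.dmDisplayFrequency, "Hz")

    if dm.dmDisplayFrequency != 60:
        dm.dmDisplayFrequency = 60
        dm.dmFields = DM_DISPLAYFREQUENCY
        ok = user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0) == 0  # 0 == DISP_CHANGE_SUCCESSFUL
        print("Forced 60 Hz:", ok)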
I Declare - The official guide to your Customs exemptions and item restrictions when returning to Canada from abroad.
Deal Fanatic
Jan 5, 2002
5287 posts
3345 upvotes
cwb27 wrote: Shot in the dark...

I've found that blurriness on an LCD can occur when the refresh rate is set too high, essentially at what a CRT is typically set to (i.e., telling it to use 75 Hz instead of 60 Hz).

Typically the video card detects the LCD and adjusts automatically, but I've seen some that do not.

I can't even select anything other than 59 Hz or 60 Hz, and 60 Hz is selected.
That's 30 minutes away, I'll be there in 10.
Beer: The cause of, and solution to, all of life's problems.
