I recently bought a Samsung widescreen monitor. It looks great and certainly takes up a lot less space than my old 21 inch CRT, but the one thing that sucks is that the only widescreen resolution it supports is its native 1680x1050. My desktop is now almost 5 years old, and while it still runs games remarkably well for such an old machine, newer games like BioShock and Oblivion definitely have some frame rate issues. Since 1680x1050 is the only real widescreen res it supports, some games let you do a kind of "faux" widescreen by setting a standard resolution (which you set the monitor to stretch) and then doing some FOV tricks; unfortunately, the methodology varies from game to game.
Is anyone else hitting these issues? I imagine they must be pretty common in this day and age of widescreen LCD gaming. Anyone have any advice?
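To make the "faux widescreen plus FOV tricks" part concrete: most of those per-game tweaks boil down to the standard hor+ conversion, widening the horizontal FOV so the vertical FOV stays what it was at 4:3. A quick sketch of the math (Python; the function name is just for illustration):

```python
import math

def widescreen_hfov(hfov_4_3_deg, target_aspect=16 / 10):
    """Widen a horizontal FOV tuned for 4:3 so the vertical FOV
    stays the same at a wider aspect ratio (the usual hor+ math)."""
    half = math.radians(hfov_4_3_deg) / 2
    scale = target_aspect / (4 / 3)
    return math.degrees(2 * math.atan(math.tan(half) * scale))

print(round(widescreen_hfov(90), 1))   # 100.4 -- a 90 degree 4:3 FOV at 16:10
print(round(widescreen_hfov(75), 1))   # 85.3
```

So a game whose console takes a horizontal FOV wants roughly 100 instead of 90 when you stretch a 4:3 mode across a 16:10 panel.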
A couple of years ago I went from a 21 inch CRT to a 21 inch LCD with a native resolution of 1600x1200. Worst mistake I ever made. I'm a stickler for native resolutions, so I had to play my games at that resolution or else everything would look really weird (weird scan lines and ghosting around objects). After a few weeks of agonizing over it, I upgraded my video card. I love it when one purchase warrants more purchases.
Hate to say it, but upgrading your video card is pretty much the only solution if you want to play at your native resolution.
I'm on AGP running a 7600GT, so my next video upgrade is going to be a whole new box. I don't mind playing at less than native; it looks a little funky, but that's better than having crap performance.
I've got a Dell 24" monitor (1920x1200), but I'm still running it on an AGP GeForce 6600GT. It hurts, but I can't justify building a new system just yet.
1280x800 looks pretty good on a 1680x1050 monitor. nVidia doesn't list it by default so you have to add it manually, but it will show up in your games after you do.
I've got a MacBook Pro, and so far every game I've tried has had some nice 16:10 ratio resolutions below the native 1440x900.
You've got 1280x800, 1152x720, 1024x640, and 800x500, all of which have shown up in games no problem: Psychonauts, Guitar Hero III, Half-Life 2, etc. Maybe it's Boot Camp doing automatically what Fats suggested.
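Those modes all keep the panel's 16:10 shape, which you can sanity-check with nothing more than width divided by height; a trivial sketch:

```python
# Every mode below is an exact 16:10 ratio, i.e. the same shape as a
# 1680x1050 or 1440x900 panel, just smaller -- so no stretching needed.
for w, h in [(1680, 1050), (1440, 900), (1280, 800),
             (1152, 720), (1024, 640), (800, 500)]:
    print(f"{w}x{h}: {w / h:.3f}")   # prints 1.600 for each
```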
I'm not sure I understand what you mean. My monitor has a native resolution of 1680x1050, and from what I've tried (mainly with Crysis), using lower widescreen resolutions doesn't cause any issues. I think I once had some minor glitching around the HUD in F.E.A.R. when I accidentally dropped to a non-16:10 resolution, but that was fixed when I put the resolution back, and it later went back up to 1680x1050 fine.
Prior to using Fats' tip, I wouldn't get an option in games to use any widescreen resolution lower than 1680x1050. Apparently that process makes the nVidia driver fake a lower resolution that your monitor doesn't necessarily support. 1280x800 now shows up for me, but if I look at my monitor's info display it says 1280x960.
The one issue I am having is that vsync never seems to be enabled at this new lower resolution, which results in a lot of tearing. I've double-checked that it's enabled in the games I'm running, and I've also tried the nVidia driver's force-vsync option. Neither has worked; is anyone else familiar with this issue?
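For anyone poking at this later: vsync is ultimately a request the application makes and the driver is free to ignore, which is consistent with what's described above. A minimal sketch of asking for it, using pygame 2.x (the 1280x800 mode is just the example from this thread; whether the driver honors vsync at a non-native mode is exactly the open question here):

```python
import pygame

pygame.init()
# Ask for a vsynced display. In pygame 2 the vsync argument is only a
# hint; the driver can silently refuse it, so the only reliable check
# is whether tearing actually goes away.
screen = pygame.display.set_mode((1280, 800),
                                 pygame.FULLSCREEN | pygame.SCALED,
                                 vsync=1)
clock = pygame.time.Clock()
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))
    pygame.display.flip()   # if vsync is honored, this waits for the next refresh
    clock.tick(60)
pygame.quit()
```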
That is a seriously awesome tip. Thanks a lot.