I'm hoping this thread is in the right place. If not, feel free to do whatever you see fit, mods.
I recently picked up a 26" Samsung LCD to use as a computer monitor. My intention was to use a DVI-to-HDMI cable for a big, high-res picture. However, when I use HDMI, the picture looks like crap. It's really hard to read things, as everything is extremely pixelated. It looks almost as if the sharpness on the TV is turned up way too high.
I know that the TV supports 1366 x 768 res through the VGA port on the back. However, I was under the impression that I could get something much nicer by using DVI to HDMI. In fact, I can get the resolution to go much higher than 1366 x 768, but it makes everything look absolutely terrible. Is there something else that I need to adjust? Or am I just stuck with a shitty picture?
Video Card, BTW:
GeForce 7600 GTS
- Have you used the DVI port on your video card before with other displays? If you've only started using it now for this, perhaps it's a faulty port/connection and you simply never knew.
- Google your TV's model number to see if there are known problems with displaying certain resolutions over HDMI.
- Try a different cable, if you can get your hands on one.
Um, question for OP... did you try setting your computer at the same resolution as the TV?
I picked up my cable for $12 total at newegg.com. It's a pretty nice site for cheap tech stuff.
Yeah, I've tried multiple resolutions. The only one that even looked decent was 1280 x 700 (I think?). But that's lower res than I get with the VGA connection, so it seems a bit backwards. I was led to believe that I could get much higher resolutions from DVI -> HDMI.
I haven't used my DVI port before now, so it's possible that it could just be bad. However, my video card is relatively new (about 6 months) and I've never botched an installation before. It's a strong possibility though. I'd just like to make sure that I'm not missing anything before I go do something drastic, like buying a new vid card.
Yeah, I'm leaning towards that explanation. But it seems a bit odd that I'd get a better resolution out of VGA than HDMI. I thought HDMI was able to do something around 1080 or so, at max.
http://www.monoprice.com for all your cabling needs.
Oh, I didn't even think about that. Yeah, I would look up what the native resolution of your LCD is and try that. If the difference is extremely noticeable, it could very well be an issue of your monitor having a low-quality scaler.
You might want to try a program called PowerStrip to add custom resolutions to your registry. The one I used was 1366x768 (PDP), which adds the resolution 1368x768 to my registry.
Now my desktop runs at 1368x768, and one pixel on either side simply doesn't get displayed.
As for gaming, the nearest supported resolution for video cards would be 1280x768. You might want to see if your TV has that as a selectable display mode.
Mine has:
1366x768 at Pixel for Pixel
1280x768 at Pixel for Pixel, and Stretch
1024x768 at Pixel for Pixel, Stretch and Zoom.
However, it can also display 800x600 and 640x480.
Any resolution with 768 as the second number, displayed pixel for pixel (meaning you'll have black bars on either side), will give you the clearest picture.
Edit: Your TV's native resolution is 1366x768.
The type of cable you use (VGA, DVI, HDMI) does not affect the resolution of the screen.
Technically, any computer resolution that is an integer multiple or divisor of the TV's native resolution (horizontally or vertically) will prevent blurring: 680x384, 2720x1536, 680x768, 1360x384, 2720x768, 1360x1536, etc. But the smaller resolutions will make it look blocky, and your video card probably can't support the higher ones.
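If it helps, here's a rough way to sanity-check that. This is just a Python sketch with made-up numbers; it assumes a 1360x768 panel (to match the examples above) and assumes the TV stretches whatever it's fed to fill the panel rather than showing it pixel for pixel.

```python
# Rough sketch of the "integer multiples" idea above: a desktop resolution
# only stays sharp if each panel pixel maps to a whole number of source
# pixels (or vice versa). The 1360x768 panel size is an assumption here,
# chosen to match the example resolutions listed above.

PANEL_W, PANEL_H = 1360, 768

def clean_ratio(source, panel):
    """Return True if source and panel sizes are in an exact integer ratio."""
    return panel % source == 0 or source % panel == 0

for w, h in [(1360, 768), (680, 384), (2720, 1536), (1280, 720), (1024, 768)]:
    ok = clean_ratio(w, PANEL_W) and clean_ratio(h, PANEL_H)
    print(f"{w}x{h}: {'clean mapping' if ok else 'fractional scaling -> blur'}")
```

Anything that comes back as a fractional ratio is where the TV's scaler has to interpolate, which is exactly the fuzziness you're seeing over HDMI.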
I think that the television does some image processing with HDMI input that it does not do with VGA input.
Again, this was an older model, but it had the exact same problem with HDMI that you're having.
Anyone have any experience with this, or am I just going to find out when I get the cable?
If you try to run a 1366x768 display at anything other than that resolution it will always look like ass. The method you use to get the signal into the display makes no difference.
My question is this: Is there a way to convert either VGA or DVI to component and have it look nice, or am I stuck with s-video cables (and if so, is s-video really that bad?)
The thing about S-Video is that it only supports 480i (and, I suppose, even lower resolutions like 240p).
Any idea about converting VGA or DVI to component? Bueller? Buuuueller?
I'm curious what kind of setting you tweaked with ClearType. I run my desktop at 1360x768 over DVI-to-HDMI, and I notice this weird effect where black text on a white background looks bold in spots and not in others. It's actually in a vertical pattern that seems to get bold in four waves across the screen.
Well, considering the problem isn't really with what hook-up you use, but rather the resolution of the TV itself, you probably won't see an improvement.
I'm a bit disappointed with how games look on it. I mean, there's no blur or anything silly like that. It's just that I am close enough (about 2-3' away) that I can see every pixel. It's a bit annoying. On the bright side, I still have a nice TV.
It's a 1080i set, so if 480 is the best I can get, it's the hook-up that's the problem.
Also, the idea was to play Eve (older game anyway) and possibly LOTRO, as well as run Front Row through it. Front Row will be fine at that res (I hope), but not being able to play games at a decent res is a bummer.
The line between televisions and monitors seems to be blurring. An LCD display is still an LCD display, whether it's called a television or a monitor. That being said, you need to look at what inputs a device has, plus its dot pitch, resolution, and response time. In general, an LCD display that's more often used as a computer monitor will have a higher resolution, finer dot pitch, and faster response time, but there are a lot of television/monitor devices out there now that are big enough to use as a television but also just fine when used as a monitor. I have a 32" Sharp Aquos that I use as both, and I'm quite happy with it (the problem I mentioned earlier being only a minor one and hopefully fixable).
http://www.newegg.com/Product/Product.aspx?Item=N82E16814999205
I have an ATI card, but the reviews aren't very promising :?
Well, what kind of hook-up are you using? And has your video card correctly identified the monitor? When I first hooked my Samsung up, it took a couple of tries before my video card actually recognized that I had disconnected my old monitor and hooked up a new one. But yeah, if you're getting a resolution that low, you may want to try a different hookup.
I could be completely off base, but I thought I had read that 1080i = 720p. That's a result of the TV mashing up lines near the sides of the screen. Thus, you don't truly have 1080 lines in an "i" configuration. But in a "p" configuration you actually do have the number of lines that are advertised (electronics geeks will have fun with my horrid analogy).
Bigger is better.
That said, 540p will not look the same as 1080i to your eyes, although 1080p would be sharper than 1080i. Personally, I can't really tell the difference.
Since only 540 lines are being drawn at any one time, a TV that has a vertical resolution higher than 540 can do 1080i.
This is why. The pic is bad, but you get the idea.
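To put some rough numbers on that, here's a back-of-the-envelope Python sketch; the 60 Hz field/frame rates are my assumption, not something from the specs above.

```python
# Rough comparison of how much picture actually arrives per second.
# 1080i delivers two interlaced 540-line fields per frame, so only 540
# lines are drawn at any one instant; 720p and 1080p send whole frames.

formats = {
    # name: (width, lines delivered at a time, fields or frames per second)
    "1080i": (1920, 540, 60),
    "720p":  (1280, 720, 60),
    "1080p": (1920, 1080, 60),
}

for name, (width, lines, rate) in formats.items():
    mpix_per_sec = width * lines * rate / 1e6
    print(f"{name}: {lines} lines at a time, ~{mpix_per_sec:.0f} megapixels/s")
```

That's roughly why people lump 1080i and 720p together: over a second they carry a similar amount of detail, even though a 1080i frame is nominally 1920x1080.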
1920x1080 is barely bigger than 1600x1200, but at 37" 4 feet from your face it's totally awesome.
If your TV does 1080p, then you can use 1920x1080 and get really good resolution.
I use a Sceptre Naga X37 (1080p) for my computer monitor (DVI to HDMI), and the picture is amazing.
Regarding the post above, is this going to look like ass even with the native resolution (if I'm able to set it to that)?
well for "normal" computing use, yeah, its going to look like ass.
for video, gaming, tv, etc, it'll look great.
but your fonts and icons will look weird. it's probably not set in stone, i'm no expert, but i don't think you can set a 1080i display above 1,280x720, just like 720p
1080p sets usually have a native res of 1920*1080, which makes them desirable for cpu purposes
as a general rule of thumb then i'd say if your tv doesn't support 1080p then don't plan on using as your primary computer monitor
If your TV is not a CRT, then it does not do 1080i. It takes your 1080i signal and downscales it to whatever its actual resolution is (or, in the case of 1080p sets, deinterlaces it to 1080p).
The ONLY kind of display that actually displays an interlaced image, without deinterlacing or scaling, is a CRT.
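For what it's worth, here's a tiny sketch of the simplest kind of deinterlacing ("weave"), just in Python with toy data, so you can see what a fixed-pixel set has to do with a 1080i signal before it can light up its panel. Real TVs use fancier methods to hide motion artifacts; this is only the basic idea.

```python
# "Weave" deinterlacing: combine two 540-line fields (odd lines, then even
# lines) back into one progressive frame before the panel can display it.

def weave(top_field, bottom_field):
    """Interleave two fields (lists of scanlines) into one progressive frame."""
    frame = []
    for odd_line, even_line in zip(top_field, bottom_field):
        frame.append(odd_line)   # original lines 1, 3, 5, ...
        frame.append(even_line)  # original lines 2, 4, 6, ...
    return frame

# Toy example with 4-line fields instead of 540-line ones.
top = ["line1", "line3", "line5", "line7"]
bottom = ["line2", "line4", "line6", "line8"]
print(weave(top, bottom))  # ['line1', 'line2', ..., 'line8']
```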
I just got my DVI->HDMI cable today and hooked it up from my PC to the TV (Sharp 37" Aquos). The TV doesn't even try to display an image, at all.
I have an nVidia GeForce 6800 GT, it's got 1 DVI port and 1 VGA port, and I've been running dual monitors on it with no problem. I also have UltraMon. Both it and the Windows display properties seem to sort of know a second monitor is hooked up (the TV in place of my LCD monitor), but the monitor is grayed out in the properties.
I Googled a bit and couldn't find anything about drivers, or any help really. Any ideas?
You can easily get 24'' computer monitors that output at least 1920x1200.