
I need a guide/FAQ about pixels, resolution, etc.

Organichu (poops pees) Registered User regular
edited September 2007 in Games and Technology
I'm a moderate-heavy gamer and film buff, so it's annoyed me for a while now that I don't have a real understanding of modern concepts of video quality/compatibility, etc. If any of you would be kind enough to a.) answer my questions piecemeal or b.) direct me to a guide that answers most of my questions, I'd be eternally grateful and would be inclined to offer recompense in sexual currency... or just appreciation.

[strike]a.) a pixel is essentially a visual character, correct? The smallest denomination of visual artifacts that are visible on a screen, right?

b.) if so, are resolutions literal representations of pixel quantity? i.e., is 1024x768 equal to 786,432 pixels on the screen?[/strike]

[strike]c.) if this is correct, does quality worsen dramatically with percentile mismatches? For example, a 20" screen is 133% the size of a 15" screen. So will a resolution need to be 133% the pixels (however the proportions of the rows and columns might change) to have identical quality? Will a greater resolution look significantly better on a screen of a given size? Is there a ceiling of advantage? Is there a golden ratio at which the quality cannot be improved upon?[/strike]

[strike]d.) What's the difference (functionally) between interlaced and progressive lines?

e.) Is a typical monitor/computer display 'high definition'? To what extent? 480/720/1080?[/strike]

[strike]f.) In what circumstances would one need an HDMI cable?[/strike]

[strike]g.) VGA/S-video? What are these?[/strike]

h.) What are component cables?


If I think of any additional questions I will add them to the OP. Also, I'll cross out sufficiently answered questions. I apologize in advance for my stunted knowledge. Thank you all so much. :):):)




edit: a-g answered... clarification needed on remaining questions.

Organichu on

Posts

  • romanlevin Registered User regular
    edited August 2007
    a) yes
    b) yes
    c) depends
    d) interlaced renders only half the lines.
    e) yes
    f) to display DRMed content (Blu Ray, HDDVD)
    g) VGA - connector for computer screens, S-Video - connector for TVs

    romanlevin on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    Updatez.

    Organichu on
  • UreshiiAkuma Registered User regular
    edited August 2007
    d) Each time a display renders (draws) the screen (each scan, in other words), a progressive scan display renders every line --- all 480, 720, whatever. An interlaced display only draws half of the lines (every other line). On the next scan it'll draw the other half of the lines.

    So, a 1080p display draws all 1080 lines each scan. A 1080i display only draws 540 each scan.
    At the same resolution (for example, 1080), progressive is better to have than interlaced.

    I dare not enter the debate about whether a lower resolution progressive is better than higher rez interlace (e.g. 1080i v 720p)
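
    If a toy example helps, here's the same idea in Python (a made-up six-line frame, purely illustrative):

    [code]
    # A made-up 6-line "frame" (real sets use 480/720/1080 lines).
    frame = ["line %d" % n for n in range(6)]

    progressive_scan = frame      # every line, every refresh
    odd_field = frame[0::2]       # interlaced pass 1: lines 0, 2, 4
    even_field = frame[1::2]      # interlaced pass 2: lines 1, 3, 5

    # The two fields together cover the whole frame, but an interlaced
    # set only draws half of it on any single scan.
    assert sorted(odd_field + even_field) == sorted(frame)
    print(len(progressive_scan), "lines per scan (progressive)")
    print(len(odd_field), "lines per scan (interlaced)")
    [/code]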

    UreshiiAkuma on
  • Glal (Airedale) Registered User regular
    edited August 2007
    e) SD/HD are just resolution sets. Since monitors come in a much broader variety of resolutions, whether or not they're "HD" depends on the monitor itself, but yes, you can find monitors that top all those resolutions.
    Actually, it'd be pretty hard to find screens that don't top the first two.

    Glal on
  • mntorankusu (I'm not sure how to use this thing....) Registered User regular
    edited August 2007
    C) It really just depends. There is a point at which adding more pixels doesn't change anything, but that depends on screen size, the distance the viewer is from the screen, and how good the viewer's eyesight is. There's no magical golden ratio because of how many different variables there are.

    E) Computer monitors vary in resolution more than TVs do, but these days just about any monitor will be at least 1280 pixels wide and at least 720 (720p is 1280x720) pixels tall (720, 800, 960, or 1024 depending on the aspect). However, 1920x1200 (the same width as 1080p and slightly taller) is very common nowadays, too, for larger monitors.
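
    To put rough numbers on C): a common rule of thumb is that 20/20 vision resolves detail down to about 1 arcminute, so you can estimate whether a pixel is even distinguishable from where you sit. A quick Python sketch (the numbers are illustrative, and the 1-arcminute figure is a rule of thumb, not a hard limit):

    [code]
    import math

    def pixel_arcminutes(pixel_pitch_in, distance_in):
        # angular size of one pixel at a given viewing distance
        radians = 2 * math.atan(pixel_pitch_in / (2.0 * distance_in))
        return math.degrees(radians) * 60

    # 0.04" pixels (roughly a 60" 720p set) from 6 feet away:
    print(pixel_arcminutes(0.04, 72))   # ~1.9 arcmin: pixels resolvable
    # The same set from 12 feet:
    print(pixel_arcminutes(0.04, 144))  # ~0.95 arcmin: at the acuity limit
    [/code]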

    mntorankusu on
  • LewieP Registered User regular
    edited August 2007
    I dare not enter the debate about whether a lower resolution progressive is better than higher rez interlace (e.g. 1080i v 720p)

    (at OP) I am far from an expert, but the simple version of this argument is that it depends on the content being displayed.

    Fast-moving pictures benefit more from progressive scan than from higher resolution, so the majority of games would look better in 720p than 1080i.

    However, displaying information (word processing, for example) and slow-moving video benefit more from higher resolution, so 1080i would be preferable to 720p.

    However, in all cases 1080p is superior to either 720p or 1080i.

    LewieP on
  • Zxerol (for the smaller pieces, my shovel wouldn't do so i took off my boot and used my shoe) Registered User regular
    edited August 2007
    Interlacing is an artifact from way back in the early days of television. TVs all used cathode-ray tubes (CRTs), which produced a picture by guiding an electron beam to stimulate phosphors on the screen. The beam would trace a zig-zag-like path across the screen rapidly to produce a picture. In order to save bandwidth when transmitting the video over the air, an interlacing scheme was used. When the beam worked its way from the top to the bottom of the screen, it would only draw every other line to complete one full picture. The beam would then be brought back up to fill in the remaining lines to produce another full picture. Thus, when talking about interlaced content, the set would draw fields instead of full frames, with each field having half the vertical resolution of a full frame. In effect, each frame of the signal holds two fields, effectively doubling the amount of data you can send. This has been used as a television standard for decades.

    The analogue is progressive scan, where each pass of the beam will go through each row without skipping. Computer monitors are progressive scan devices.

    Assuming an NTSC 60Hz signal, there would be 60 updates per second, or 60 fields per second. Because each field has picture data for every other row on the screen, it'll take two to produce a full progressive frame (30 frames per second). Technically, it's actually 59.94 fields per second and 29.97 progressive frames per second, but whatever.

    Nowadays, with new screen technology (like LCD screens), there's no physical electron beam anymore, but the concept of progressive vs. interlaced display still applies. The set would take an interlaced field, deinterlace it to fill in the gaps (good deinterlacers would use sophisticated motion compensation methods to derive the missing rows instead of just doubling them), and display it as a full progressive frame (doubling the framerate too).
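
    For the curious, the two naive deinterlacing strategies ("weave" two fields together, or "bob" one field by line-doubling) look something like this in Python -- real deinterlacers are far smarter:

    [code]
    def weave(odd_field, even_field):
        # interleave two half-height fields into one full frame
        frame = []
        for odd_line, even_line in zip(odd_field, even_field):
            frame += [odd_line, even_line]
        return frame

    def bob(field):
        # fake the missing rows by repeating each line (half the real detail)
        return [line for line in field for _ in (0, 1)]

    odd = ["row0", "row2", "row4"]
    even = ["row1", "row3", "row5"]
    print(weave(odd, even))   # full frame in the right order
    print(bob(odd))           # line-doubled single field
    [/code]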


    edit: beat'd like a mofo

    Zxerol on
  • mntorankusu (I'm not sure how to use this thing....) Registered User regular
    edited August 2007
    Zxerol wrote: »
    Nowadays, with new screen technology (like LCD screens), there's no physical electron beam anymore, but the concept of progressive vs. interlaced display still applies. The set would take an interlaced field, deinterlace it to fill in the gaps (good deinterlacers would use sophisticated motion compensation methods to derive the missing rows instead of just doubling them), and display it as a full progressive frame (doubling the framerate too).
    To add to this, a lot of content can actually be mostly put back together without making up too much data. Since there are 60 fields per second, and most content is between 24 and 30 frames per second, there are usually two or more fields for each frame, and therefore a full frame exists across two fields. Recreating the original image like this is called an inverse telecine (because the process of turning 24fps content into a 60-fields-per-second NTSC signal is called a telecine).

    With 30fps content, it's easy! Since each frame has two fields, there are only two different ways the fields can be synced -- Odd-Even or Even-Odd. All it has to do is wait for two fields to be received and put them together in one of these two ways, and if it detects combing (interlacing artifacts), it tries the other way.

    With 24fps content, the concept is the same, but it's more complicated because every other frame uses three fields instead of two.

    By doing this, it's exactly the same as progressive scan, except for (at least) a 1/60th of a second delay.

    Because of the way inverse telecines work, 1080i with 30fps or less, and a good deinterlacer (which there are woefully few of), can be just as good as 1080p.
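
    Here's the 24fps cadence as a toy Python sketch (each letter stands for a whole film frame; real fields are half-frames with odd/even parity, so this shows only the 2:3 pattern, not the weaving):

    [code]
    def telecine(frames):
        # 3:2 pulldown: 4 film frames -> 10 fields (24 fps -> 60 fields/s)
        fields = []
        for i, frame in enumerate(frames):
            fields += [frame] * (2 if i % 2 == 0 else 3)
        return fields

    def inverse_telecine(fields):
        # recover the unique frames; a real IVT must detect the cadence
        # and survive edits that break the 2:3 pattern
        frames, last = [], None
        for f in fields:
            if f != last:
                frames.append(f)
                last = f
        return frames

    film = ["A", "B", "C", "D"]
    print(telecine(film))                    # A A B B B C C D D D
    print(inverse_telecine(telecine(film)))  # back to A B C D
    [/code]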

    Edit: Also, bad editing or frame drops (in video games) can cause the field sync to change or some frames to have only one field, so this method can't be used by itself unless the interlacing is certainly, 100% consistent.

    mntorankusu on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    d) Each time a display renders (draws) the screen (each scan, in other words), a progressive scan display renders every line --- all 480, 720, whatever. An interlaced display only draws half of the lines (every other line). On the next scan it'll draw the other half of the lines.

    So, a 1080p display draws all 1080 lines each scan. A 1080i display only draws 540 each scan.
    At the same resolution (for example, 1080), progressive is better to have than interlaced.

    I dare not enter the debate about whether a lower resolution progressive is better than higher rez interlace (e.g. 1080i v 720p)

    Excellent.

    So, if I have a 42" TV that is capable of going to, say, 1080p, that means my TV can (when provided with an appropriate signal) display 1080 horizontal lines?

    Taking, then, a 42" television (42" between opposing corners, right?), and considering that an HDTV is usually (always?) 16:9, I could do some math.

    x squared + (1.778x) squared = 42 squared (1.778 being 16/9)

    x squared + 3.161x squared = 1764

    4.161x squared = 1764

    x squared = 423.9

    x = 20.59

    So the height of the screen is about 20.6 inches and the width is about 36.6 inches. (Check: 20.6 squared plus 36.6 squared is roughly 1764, which is 42 squared, so it adds up.)

    Anyway, my point was to determine the exact height of a 42" screen so that I could figure out how 'tall' a 'line' would be. My reasoning was: if a 42" TV has 1080 horizontal lines being displayed every scan and a 52" TV has 1080 horizontal lines being displayed every scan... how can they have similar quality? Wouldn't the 'upscaling' of the line thickness lead to distorted quality?

    I guess I'm missing this somehow.
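
    For reference, here's the same math as a quick Python sketch (16:9 assumed):

    [code]
    import math

    def dimensions(diagonal_in, aspect=16 / 9.0):
        # width and height of a screen from its diagonal and aspect ratio
        height = diagonal_in / math.sqrt(1 + aspect ** 2)
        return aspect * height, height

    for diag in (42, 52):
        w, h = dimensions(diag)
        print('%d": %.1f x %.1f in, one of 1080 lines is %.4f in tall'
              % (diag, w, h, h / 1080))
    [/code]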

    Organichu on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    Zxerol wrote: »
    Interlacing is an artifact from way back in the early days of television. TVs all used cathode-ray tubes (CRTs), which produced a picture by guiding an electron beam to stimulate phosphors on the screen. The beam would trace a zig-zag-like path across the screen rapidly to produce a picture. In order to save bandwidth when transmitting the video over the air, an interlacing scheme was used. When the beam worked its way from the top to the bottom of the screen, it would only draw every other line to complete one full picture. The beam would then be brought back up to fill in the remaining lines to produce another full picture. Thus, when talking about interlaced content, the set would draw fields instead of full frames, with each field having half the vertical resolution of a full frame. In effect, each frame of the signal holds two fields, effectively doubling the amount of data you can send. This has been used as a television standard for decades.

    The analogue is progressive scan, where each pass of the beam will go through each row without skipping. Computer monitors are progressive scan devices.

    Assuming an NTSC 60Hz signal, there would be 60 updates per second, or 60 fields per second. Because each field has picture data for every other row on the screen, it'll take two to produce a full progressive frame (30 frames per second). Technically, it's actually 59.94 fields per second and 29.97 progressive frames per second, but whatever.

    Nowadays, with new screen technology (like LCD screens), there's no physical electron beam anymore, but the concept of progressive vs. interlaced display still applies. The set would take an interlaced field, deinterlace it to fill in the gaps (good deinterlacers would use sophisticated motion compensation methods to derive the missing rows instead of just doubling them), and display it as a full progressive frame (doubling the framerate too).


    edit: beat'd like a mofo

    This is very comprehensive and informative w.r.t. the technology of my inquiry. Thanks a bunch.

    Organichu on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    E) Computer monitors vary in resolution more than TVs do, but these days just about any monitor will be at least 1280 pixels wide and at least 720 (720p is 1280x720) pixels tall (720, 800, 960, or 1024 depending on the aspect). However, 1920x1200 (the same width as 1080p and slightly taller) is very common nowadays, too, for larger monitors.

    What's the typical resolution of an SD television?

    Organichu on
  • mntorankusu (I'm not sure how to use this thing....) Registered User regular
    edited August 2007
    Organichu wrote: »
    E) Computer monitors vary in resolution more than TVs do, but these days just about any monitor will be at least 1280 pixels wide and at least 720 (720p is 1280x720) pixels tall (720, 800, 960, or 1024 depending on the aspect). However, 1920x1200 (the same width as 1080p and slightly taller) is very common nowadays, too, for larger monitors.

    What's the typical resolution of an SD television?
    640x480 for a full frame, or 640x240 for a single field. 480p is also 640x480.

    mntorankusu on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    Actually, I'm not so sure that I understand... please bear with me.

    If I have a 30 inch television that is 1280x720, that means there are 921,600 dots on the screen.

    If I have a 60 inch television that is 1280x720, that means there are 921,600 dots on the screen.

    Shouldn't one look absolutely unfathomably horrible?

    Organichu on
  • mntorankusu (I'm not sure how to use this thing....) Registered User regular
    edited August 2007
    Organichu wrote: »
    Actually, I'm not so sure that I understand... please bear with me.

    If I have a 30 inch television that is 1280x720, that means there are 921,600 dots on the screen.

    If I have a 60 inch television that is 1280x720, that means there are 921,600 dots on the screen.

    Shouldn't one look absolutely unfathomably horrible?
    Assuming you're viewing them from the same distance, that's the gist of it.

    It wouldn't look unfathomably horrible unless you were really close, though. It would only look "bad" when compared to something with higher pixel density.

    mntorankusu on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    Ok then, thanks. :)

    I think my biggest issue was that I was unable to come to terms with the disparity in quality with regards to pixel density. I've pretty much got it now. Thanks all. :D

    Organichu on
  • LewieP Registered User regular
    edited August 2007
    Organichu wrote: »
    E) Computer monitors vary in resolution more than TVs do, but these days just about any monitor will be at least 1280 pixels wide and at least 720 (720p is 1280x720) pixels tall (720, 800, 960, or 1024 depending on the aspect). However, 1920x1200 (the same width as 1080p and slightly taller) is very common nowadays, too, for larger monitors.

    What's the typical resolution of an SD television?
    640x480 for a full frame, or 640x240 for a single field. 480p is also 640x480.

    PAL is 720x576

    http://img180.imageshack.us/img180/1189/700pxcommonvideoresolutsx9.png
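
    In case that image link ever dies, here's the gist as a quick sketch (rounded, square-pixel values from memory; analog NTSC is also often given as 704x480 or 720x480):

    [code]
    # Common video modes and their pixel counts (rounded, from memory)
    modes = [
        ("480i/480p (NTSC)", 640, 480),
        ("576i (PAL)", 720, 576),
        ("720p", 1280, 720),
        ("1080i/1080p", 1920, 1080),
    ]
    for name, w, h in modes:
        print("%-17s %4dx%-4d = %8d pixels" % (name, w, h, w * h))
    [/code]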

    LewieP on
  • Zombiemambo Registered User regular
    edited August 2007
    Organichu wrote: »
    Actually, I'm not so sure that I understand... please bear with me.

    If I have a 30 inch television that is 1280x720, that means there are 921,600 dots on the screen.

    If I have a 60 inch television that is 1280x720, that means there are 921,600 dots on the screen.

    Shouldn't one look absolutely unfathomably horrible?

    Ever been close to a big-screen TV? They look pretty bad, as opposed to a smaller TV that displays all of the pixels on a smaller screen, which makes each pixel smaller and, in turn, makes the image look finer.

    Zombiemambo on
  • JoahW Registered User regular
    edited August 2007
    On a 30 inch TV at 720p, each pixel would be 0.02x0.02 inches.
    On a 60 inch TV at 720p, each pixel would be 0.04x0.04 inches.

    On a 30 inch TV at 640x480, each pixel would be 0.0375x0.0375 inches.

    So yeah, they're pretty small.

    Edit: I'm in a mathy mood right now. On my 19 inch monitor, at 1280x960 resolution, each pixel is 0.012x0.012 inches. 8)

    JoahW on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    Ok. I'd definitely feel confident to cross out all questions, but now I have one (last, hopefully) new one.

    xyzwhatever.) I've seen people say "blah blah bad quality" and people respond "get component cables".

    ???

    Organichu on
  • JoahW Registered User regular
    edited August 2007
    With composite cables (red-white-yellow), the video is transmitted through only one cable (the yellow), so the brightness and color information share a single signal and detail is lost, because one cable can only carry so much.

    Component cables split the video across three separate cables, so there's far less interference and enough bandwidth for higher resolutions.

    HDMI is basically DVI with audio in the same cable.

    JoahW on
    Jamada.gif
  • mntorankusu (I'm not sure how to use this thing....) Registered User regular
    edited August 2007
    Organichu wrote: »
    Ok. I'd definitely feel confident to cross out all questions, but now I have one (last, hopefully) new one.

    xyzwhatever.) I've seen people say "blah blah bad quality" and people respond "get component cables".

    ???
    Component cables are capable of carrying, in addition to standard-definition 480i: 480p, 720p, 1080i, and in some cases 1080p. They also send the signal in three parts, through three separate cables, so the image quality is better than Composite or S-Video even on a standard definition signal.

    Composite is a single signal, and S-Video is one cable with the signal split into two parts.

    So, in every case, Component is better than Composite or S-Video.

    HDMI is better than Component, but people will argue about by how much.

    mntorankusu on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    JoahW wrote: »
    With composite cables (red-white-yellow), the video is transmitted through only one cable (the yellow), so the brightness and color information share a single signal and detail is lost, because one cable can only carry so much.

    Component cables split the video across three separate cables, so there's far less interference and enough bandwidth for higher resolutions.

    HDMI is basically DVI with audio in the same cable.

    1.) Is the quality increase drastic?

    2.) I 'followed' the first part of your post, but I'm not sure what the connection is to HDMI. Was that answering the early question I posed, or is component video related to HDMI?

    Organichu on
  • JoahW Registered User regular
    edited August 2007
    HDMI is becoming more popular than component cables as of late, I think. HDMI is digital, whereas component cables are analog.

    And I've never really experienced the quality increase firsthand. I'm still using an ancient SDTV with composite cables, heh. If you're running 480p or better I think it's pretty much necessary, though.

    JoahW on
  • Zxerol (for the smaller pieces, my shovel wouldn't do so i took off my boot and used my shoe) Registered User regular
    edited August 2007
    Organichu wrote: »
    JoahW wrote: »
    With composite cables (red-white-yellow), the video is transmitted through only one cable (the yellow), so the brightness and color information share a single signal and detail is lost, because one cable can only carry so much.

    Component cables split the video across three separate cables, so there's far less interference and enough bandwidth for higher resolutions.

    HDMI is basically DVI with audio in the same cable.

    1.) Is the quality increase drastic?

    2.) I 'followed' the first part of your post, but I'm not sure what the connection is to HDMI. Was that answering the early question I posed, or is component video related to HDMI?

    The quality difference is mainly in the fact that a component connection allows for high-definition signals with progressive scan -- composite, s-video, etc. are strictly standard definition.

    If you have an HD set, yeah the difference is great.


    HDMI is a purely digital interface, as opposed to component (which is analog). It's closely related to the DVI ports you often find on computers and computer monitors, and the video signal is directly compatible (a simple passive adapter converts between them). The difference is that HDMI carries audio too along the same wire, and has a bunch of other "features" that the MPAA loves but you probably won't.

    Zxerol on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    Zxerol wrote: »
    Organichu wrote: »
    JoahW wrote: »
    With composite cables (red-white-yellow), the video is transmitted through only one cable (the yellow), so the brightness and color information share a single signal and detail is lost, because one cable can only carry so much.

    Component cables split the video across three separate cables, so there's far less interference and enough bandwidth for higher resolutions.

    HDMI is basically DVI with audio in the same cable.

    1.) Is the quality increase drastic?

    2.) I 'followed' the first part of your post, but I'm not sure what the connection is to HDMI. Was that answering the early question I posed, or is component video related to HDMI?

    The quality difference is mainly in the fact that a component connection allows for high-definition signals with progressive scan -- composite, s-video, etc. are strictly standard definition.

    If you have an HD set, yeah the difference is great.


    HDMI is a purely digital interface, as opposed to component (which is analog). It's closely related to the DVI ports you often find on computers and computer monitors, and the video signal is directly compatible (a simple passive adapter converts between them). The difference is that HDMI carries audio too along the same wire, and has a bunch of other "features" that the MPAA loves but you probably won't.

    :)

    So, assuming I wanted to connect my 360 to, say, my laptop (a Lenovo w/ a VGA port), I would have the following options:

    Get a 360 VGA-RCA adapter and plug it into my laptop, giving me the 'basic' cables on a quasi-HD resolution.

    Get a 360 VGA-component adapter and plug it into my laptop, giving me the 'jacked' cables on a quasi-HD resolution.

    Is that right, or does the 360 only sell the component adapter?

    Also, I assume there's no way to usefully 'adapt' HDMI to my laptop as 'converting' the HDMI signal would sort of defeat the point of it, right?



    Further, I am getting rid of this laptop soon and replacing it with a Macbook. The Macbook has a mini-DVI port... what would be my options for connecting the 360 directly to that?


    My reason for asking all this is that my end goal is to (until I can get a spot for a nice HDTV) make my 360 'setup' as portable as possible. Until I can get a worthy TV, I'd prefer to eliminate TV completely from the picture.

    Organichu on
  • mntorankusu (I'm not sure how to use this thing....) Registered User regular
    edited August 2007
    Have you actually used the VGA port on your laptop before? Usually they're output for connecting a second/external monitor, not input.

    mntorankusu on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    Have you actually used the VGA port on your laptop before? Usually they're output for connecting a second/external monitor, not input.

    Never.

    I assumed that it would be capable of doing both... d'oh.

    Organichu on
  • peterdevore Registered User regular
    edited August 2007
    I have a question that is related to the topic of this thread.

    You can now buy some newfangled computer displays that support HDCP over DVI, like this one. This is to enable higher resolution HDDVD/BluRay playback on Vista (yes, you need a videocard that supports HDCP to watch them at their full resolution, otherwise it downscales).

    Now, this may be the most horrible consumer ripoff imaginable (having to buy a new display AND a new videocard in order to watch content that could be displayed with your old card and monitor just fine), but my question is:

    Can you use a PS3 with these computer monitors on full resolution (with HDCP that is)? Are there HDMI->DVI cables available for this that will make it work?

    I'm sure the answer is 'out there' somewhere on the internet. I just thought it would be nice to let you in on the 'secret' if it is possible, since 1080p-capable computer monitors are a hell of a lot cheaper than 1080p-capable TVs.

    Seems that this would be the easiest choice (since it has HDMI in), but whatever.

    peterdevore on
  • TheSonicRetard Registered User regular
    edited August 2007
    I have a question that is related to the topic of this thread.

    You can now buy some newfangled computer displays that support HDCP over DVI, like this one. This is to enable higher resolution HDDVD/BluRay playback on Vista (yes, you need a videocard that supports HDCP to watch them at their full resolution, otherwise it downscales).

    Now, this may be the most horrible consumer ripoff imaginable (having to buy a new display AND a new videocard in order to watch content that could be displayed with your old card and monitor just fine), but my question is:

    Can you use a PS3 with these computer monitors on full resolution (with HDCP that is)? Are there HDMI->DVI cables available for this that will make it work?

    I'm sure the answer is 'out there' somewhere on the internet. I just thought it would be nice to let you in on the 'secret' if it is possible, since 1080p-capable computer monitors are a hell of a lot cheaper than 1080p-capable TVs.

    Seems that this would be the easiest choice (since it has HDMI in), but whatever.

    You're in luck

    EDIT: Maybe not...
    The display has to be HD compatible meaning it should have HDCP processing capability (in other words a regular computer monitor may not work).

    TheSonicRetard on
  • Stormwatcher (Blegh Blugh) Registered User regular
    edited August 2007
    I have to add, from personal experience, that replacing normal composite cables with component cables makes a HUGE difference, even if the TV can't do higher resolutions.

    I have a 20" CRT SD TV that used to be my only set. It has component in, but only does 480i (the "normal" resolution). I used it to play gamecube and ps2 games. I'll say that, without any doubt, the component cable image is absurdly superior to the composite one on that same tv.

    Why?
    Because the colors don't "bleed". The lines "shimmer" a lot less. Less moire effect too.

    Even if the resolution stays the same, the signal is so much better, that the end result is a lot better too.

    Stormwatcher on
  • LewieP Registered User regular
    edited August 2007
    Does anyone know of anywhere in the UK, online or B&M store, that will sell CRT HDTVs?

    I have a massive front room, space is not an issue, and as far as I know, in terms of quality/size:price ratio CRTs beat all other types of telly hands down, but I can't even find a single retailer that sells them here.

    LewieP on
  • Daedalus Registered User regular
    edited August 2007
    I have to add, from personal experience, that replacing normal composite cables with component cables makes a HUGE difference, even if the TV can't do higher resolutions.

    I have a 20" CRT SD TV that used to be my only set. It has component in, but only does 480i (the "normal" resolution). I used it to play gamecube and ps2 games. I'll say that, without any doubt, the component cable image is absurdly superior to the composite one on that same tv.

    Why?
    Because the colors don't "bleed". The lines "shimmer" a lot less. Less moire effect too.

    Even if the resolution stays the same, the signal is so much better, that the end result is a lot better too.

    At 480i, though, component and s-video look pretty much the same (and yes, miles and miles above composite).

    Daedalus on
  • Stormwatcher (Blegh Blugh) Registered User regular
    edited August 2007
    I have to add, from personal experience, that replacing normal composite cables with component cables makes a HUGE difference, even if the TV can't do higher resolutions.

    I have a 20" CRT SD TV that used to be my only set. It has component in, but only does 480i (the "normal" resolution). I used it to play gamecube and ps2 games. I'll say that, without any doubt, the component cable image is absurdly superior to the composite one on that same tv.

    Why?
    Because the colors don't "bleed". The lines "shimmer" a lot less. Less moire effect too.

    Even if the resolution stays the same, the signal is so much better, that the end result is a lot better too.

    At 480i, though, component and s-video look pretty much the same (and yes, miles and miles above composite).

    Oh, yeah, no doubt about it. But Component is great on any TV, be it SD, ED or HD. S-Video is kinda dead.

    Stormwatcher on
  • CZroe Registered User regular
    edited August 2007
    Organichu wrote: »
    I'm a moderate-heavy gamer and film buff, so it's annoyed me for a while now that I don't have a real understanding of modern concepts of video quality/compatibility, etc. If any of you would be kind enough to a.) answer my questions piecemeal or b.) direct me to a guide that answers most of my questions, I'd be eternally grateful and would be inclined to offer recompense in sexual currency... or just appreciation.

    [strike]a.) a pixel is essentially a visual character, correct? The smallest denomination of visual artifacts that are visible on a screen, right?

    b.) if so, are resolutions literal representations of pixel quantity? i.e., is 1024x768 equal to 786,432 pixels on the screen?[/strike]

    [strike]c.) if this is correct, does quality worsen dramatically with percentile mismatches? For example, a 20" screen is 133% the size of a 15" screen. So will a resolution need to be 133% the pixels (however the proportions of the rows and columns might change) to have identical quality? Will a greater resolution look significantly better on a screen of a given size? Is there a ceiling of advantage? Is there a golden ratio at which the quality cannot be improved upon?[/strike]

    [strike]d.) What's the difference (functionally) between interlaced and progressive lines?

    e.) Is a typical monitor/computer display 'high definition'? To what extent? 480/720/1080?[/strike]

    [strike]f.) In what circumstances would one need an HDMI cable?[/strike]

    [strike]g.) VGA/S-video? What are these?[/strike]

    h.) What are component cables?


    If I think of any additional questions I will add them to the OP. Also, I'll cross out sufficiently answered questions. I apologize in advance for my stunted knowledge. Thank you all so much. :):):)




    edit: a-g answered... clarification needed on remaining questions.

    a) A pixel is the smallest element of a picture (Picture Element), not a single color. This is important because many texts and people will tell you that a pixel is a red, green, or blue phosphor on the screen, when in fact it takes many phosphors to make a pixel, and a pixel can be any color that the display supports. Thankfully, this misinformed garbage is starting to go away now that we have more fixed-pixel displays like LCDs and DLP. The individual holes in a CRT's (Cathode Ray Tube's) shadowmask are fixed, with arrangements of phosphors that do not move, even when the resolution (amount of pixels) changes. As pixels grow and shrink, they will use more and fewer phosphors. Because, similarly, the fixed pixels in an LCD do not change with the scaled resolution, you cannot count your LCD's pixels unless you are running at the native resolution (something games rarely do, but your desktop should).

    b) Yes, the [number]x[number] nomenclature represents how many columns by how many rows of pixels there are. 1024x768 indicates that there are 1,024 pixels across the top of the display forming a line, and therefore 1,024 pixels in every line. 768 indicates that there are 768 pixels going vertically in each column, one for each of a row's 1,024 pixels. It's literally a grid and, as with any grid, you can determine the total amount by multiplying the number of rows by the number of columns.

    c) Obviously, there is a point of diminishing returns for increasing resolution without increasing display size, which is why almost all LCDs have a certain fixed resolution for their display size. The thing is, that point is still subjective. For instance, a television will be viewed from farther away, so the details would be smaller than on an LCD sitting on a desk. It's why a 30-inch 1080p television makes no sense, but a 30-inch Dell UltraSharp 3007WFP (higher than 1080p) is super awesome.

    d) Contrary to most people, I put forth that interlacing was actually done to maintain a higher resolution with the slower refresh rates of old CRTs. It only saves bandwidth versus sending twice as much detail every 60th of a second, which is somewhat insane (movies are only 24 frames a second). You see, "60" wasn't chosen arbitrarily; it was chosen so that broadcasting equipment could operate in sync with AC electrical line transmission at 60Hz (for simplicity and less power conversion). 60 full-resolution frames per second was insane considering that movies are tolerable at less than half that, so 30 full frames per second was more than enough. Rather than wait for your display to go blank every other frame, they would split every frame into two interlaced fields and paint them across the screen one after the other in that same 30th of a second. It just so happens that the phosphors from the last scanned field will still be illuminated to form a full image, so the CRT didn't have the performance it needed for 60Hz full-frame resolution anyway. Also, to do 60 progressive frames per second at 60Hz, they would have only been able to broadcast half the resolution; this is where bandwidth comes in. Obviously, this wasn't the influencing factor, because even a 320x240 standard would have been excellent for early television.

    e) A typical PC display is HD in the sense that it can display 720 lines or more. Anything over 480 lines is called "EDTV." Though, technically, any HD picture is supposed to be 16x9 aspect ratio, I have seen 4x3 Wal-Mart TVs claiming "HD" just because they are 1024x768 native resolution and their ATSC digital tuners can display the extra detail per line. To display the HD image, the sides either need to be cut off or detail needs to be thrown out to squish the image into the display area. Shrinking the image to maintain aspect ratio would throw out enough detail that the picture is no longer made up of 720+ lines. The first method is not a problem for older 4x3 TV content somehow available in HD without a source widescreen version, but the second method is only still HD because the standard only counts lines (4x3 content and the first scenario are why it only counts lines). Now, to add insult to injury, the sets I was looking at did not have aspect control for chopping the pillar boxes off and just watching the HD 4x3 image (Wal-Mart TV OLOL). There's a quick sketch of the crop-vs-squish tradeoff at the end of this post.

    f) Your device and display determine the need for an HDMI cable. HDMI and DVI are 100% compatible as far as HDCP and video go, and you can convert freely between them with the appropriate adapters or cables. HDMI adds more than just video, as it provides a single connector for digital audio and video, so be prepared to lose the audio in conversion if you need DVI. Most HDMI devices will also carry other audio outputs for this scenario (and, well, the "I don't have HDMI" scenario, considering that most have other video outputs too ;)). If you have an HDCP compliant LCD for your PC, you can run the HDMI output from your Playstation 3 to the DVI connector, but you will need to use the old multi-out cable (analog, stereo) or the digital optical output for audio.

    g) VGA is the 15-pin analog connector PCs have used for the last 15 years. Technically, it is the name for a resolution, but the cable for it came to be known as "VGA" instead. S-Video is an old analog connector that remained more popular in Japan due to Laser Disc players, even though we desperately needed it for our video games throughout the Nineties (too small a need for electronics companies to care). The only TVs that had S-Video ports had them as vestigial components from the Japanese model, and it's easy to see why: other than game consoles with a cable ordered from the back of the manual, where were the S-Video devices? I've spent years looking for an S-Video native VCR and they are all DVD players (combos)!

    With the advent of DVD, where S-Video's huge improvements over composite video (the yellow video cable usually bundled with red and white audio wires) are actually needed, S-Video gained popularity... it just did so after a high-def analog cable already existed: component. To get enough bandwidth for HD, component needs three separate wires for video alone. DVDs use this for EDTV resolution (16x9 progressive scan SDTV), and therefore a lot of analog SDTVs have component and S-Video... as do DVD players.

    Though it was "around" for a long, long time, S-Video really only arrived just as it was being replaced with component, but it remains the best way to connect, say, a GameCube without a digital output or an N64 without an RGB modification (in the US... we don't have RGB/SCART). Heck, even the SNES uses it, so if you want crisp text in your RPGs and haven't RGB modded it yet, at least the cables are finally available without special-ordering them. The difference between composite video and S-Video on a TV that can show it is more dramatic than the difference between S-Video and component at the same resolution. Component's advantages come when you scale up to higher, HD, resolutions. Just remember that S-Video is "good enough" for SD and that there is rarely a reason to pay for component cables with an SD device (GameCube, I'm looking at you).

    h) Couldn't discuss S-Video without touching on it. :)
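
    Since e) is hard to picture, here's a quick sketch of the crop-vs-squish tradeoff, assuming a 1280x720 source on a 4x3 1024x768 panel:

    [code]
    def letterbox(src_w, src_h, dst_w, dst_h):
        # scale to fit inside the panel; bars fill the leftover space
        scale = min(dst_w / float(src_w), dst_h / float(src_h))
        return int(src_w * scale), int(src_h * scale)

    def crop_to_fill(src_w, src_h, dst_w, dst_h):
        # scale to cover the panel; anything past its edges is cut off
        scale = max(dst_w / float(src_w), dst_h / float(src_h))
        return int(src_w * scale), int(src_h * scale)

    print(letterbox(1280, 720, 1024, 768))     # (1024, 576): only 576 lines survive
    print(crop_to_fill(1280, 720, 1024, 768))  # (1365, 768): ~341 columns lost
    [/code]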

    CZroe on
  • Hexocet Registered User regular
    edited August 2007
    Here's a nice little component vs. composite comparison Gamespot had up a while ago for the Wii.

    http://www.gamespot.com/features/6162297/p-2.html

    You can definitely see the differences in the Twilight Princess shots.

    Hexocet on
  • Organichu (poops pees) Registered User regular
    edited August 2007
    CZroe wrote:
    ...[Gospel of the lord]...

    God made you on the 8th day.

    You are a very nice man. Thank you. :D

    Organichu on
  • peterdevore Registered User regular
    edited August 2007
    CZroe wrote: »
    f) Your device and display determine the need for an HDMI cable. HDMI and DVI are 100% compatible as far as HDCP and video go, and you can convert freely between them with the appropriate adapters or cables. HDMI adds more than just video, as it provides a single connector for digital audio and video, so be prepared to lose the audio in conversion if you need DVI. Most HDMI devices will also carry other audio outputs for this scenario (and, well, the "I don't have HDMI" scenario, considering that most have other video outputs too ;)). If you have an HDCP compliant LCD for your PC, you can run the HDMI output from your Playstation 3 to the DVI connector, but you will need to use the old multi-out cable (analog, stereo) or the digital optical output for audio.

    I wouldn't say an HDMI->DVI setup from a PS3 to an HDCP-compliant monitor is guaranteed to work. It should, but in this thread different people mention how it doesn't always work out of the box, especially if you throw a switcher into the mix. It seems like the only solution is finding someone on the internet who uses that setup with the exact monitor you're planning to buy. Returning those things is a pain, so better safe than sorry.

    peterdevore on
  • LewieP Registered User regular
    edited August 2007
    CZroe? Do you work in video in some technical capacity, or are you just interested in it as a hobby? Coz damn, that's not only a good explanation and an interesting read, it's also worded in a very easy-to-read manner.

    <3

    LewieP on
  • peterdevore Registered User regular
    edited August 2007
    I thought I would add, regarding the whole HDCP Vista conundrum, that NOT A SINGLE VIDEO CARD OUT TODAY SUPPORTS HDCP. There are some on the way, but it seems the 'HDCP enabled over DVI' feature advertised for current video cards was all a big scam. Not that I know anyone who bought a Blu-ray or HDDVD drive for his computer, but that last site also warns about different audio problems that might pop up.

    (This information kindly pointed out by MechMantis here.)

    peterdevore on
  • Daedalus Registered User regular
    edited August 2007
    I thought I would add, regarding the whole HDCP Vista conundrum, that NOT A SINGLE VIDEO CARD OUT TODAY SUPPORTS HDCP. There are some on the way, but it seems the 'HDCP enabled over DVI' feature advertised for current video cards was all a big scam. Not that I know anyone who bought a Blu-ray or HDDVD drive for his computer, but that last site also warns about different audio problems that might pop up.

    (This information kindly pointed out by MechMantis here.)

    Those articles are like a year old. Every GeForce 8 series and Radeon HD 2000 series card supports HDCP (as in actually supports it, as in it's part of the manufacturer's license agreement with nVidia/AMD), as well as a few of the later GeForce 7 and Radeon X1000 series cards.

    Daedalus on