I don't know much about TVs or whatever, so excuse me if I use layman's terms or seem to have no idea what I'm talking about.
So my father recently acquired a new LED TV, and it's really nice. The picture quality looks great, it's great for playing games on, and so forth. The problem for me comes when watching movies on this TV, and on other LEDs in general. I feel like the FPS on these TVs is way too high compared to what I'm normally used to, and it creates a weird, almost jittery effect when I'm watching movies on them. Everything just seems... off, for some reason; I don't know how to explain it. Basically it makes them unwatchable for me, because I can't really enjoy anything.
Am I crazy, or is this a common issue people have with these kinds of TVs? And what can I do to circumvent it for myself? I'm sorry if this seems too vague, but it's the only way I can describe it.
I know what you are describing but I've only seen it on LCD TVs that were really big and were full 1080p. Everyone in the room noticed it and I believe it is because we were sitting too close. Could be totally wrong about that, but I know what you're describing.
It's not the LED technology; it's just that these are usually top-of-the-line TVs, and they include a frame-interpolation system. I think Sony calls it MotionFlow and Samsung calls it Auto Motion Plus.
Look around in the menus; there should be an option that either enables/disables it, or one that lists 240hz/120hz/60hz. Set it to 60hz for normal behavior.
This was it exactly. Turning Auto Motion Plus off fixed it immediately. Many thanks.
Yeah, it ends up making things look like a soap opera. It adds frames by interpolating between the ones already shown.
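If it helps to picture what the TV is doing, here's a rough sketch in Python of the core idea: inventing in-between frames from the two real frames on either side. Real sets use motion-compensated interpolation that tracks moving objects; this naive cross-fade (and every name in it) is made up purely to show the concept.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, steps):
    """Invent `steps` in-between frames by linearly blending two real
    frames. Actual TVs estimate motion instead of just cross-fading,
    but the goal is the same: fill the gaps between source frames."""
    weights = np.linspace(0.0, 1.0, steps + 2)[1:-1]  # interior points only
    return [(1 - t) * frame_a + t * frame_b for t in weights]

# 24fps film on a 120hz panel: each real frame gets 4 invented followers.
frame_a = np.zeros((1080, 1920))  # stand-ins for two consecutive frames
frame_b = np.ones((1080, 1920))
print(len(interpolate_frames(frame_a, frame_b, steps=4)))  # -> 4
```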
Slightly related: why does having more frames like this make things look "too fluid"? I mean, we're still looking at the TV with the same eyes we use to look at everything else. Why does it feel like things on the screen are more fluid than reality?
Don't human eyes see television at 29.97 FPS? If the frame count were increased, that would have the effect of appearing to smooth things out.
Most movies are recorded and projected at 24 frames per second. The standard for a normal tube TV is interlaced 60 fields per second, which works out to 29.97 full frames per second. Modern TVs run at 60fps (120hz and 240hz TVs run at 60hz afaik; it's just the interpolating true-motion stuff that makes it seem more fluid, but the actual TV refreshes itself 60 times per second).
However, your question is something I've always asked myself as well: why does reality, which should be the max "fps" our eyes can see, seem less fluid than watching a movie or game at 60fps+?
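For what it's worth, the trick for squeezing 24fps film into that ~60-fields-per-second interlaced signal is called 3:2 pulldown, and the arithmetic fits in a few lines (numbers only, no real video here):

```python
# 3:2 pulldown: alternate film frames are held for 3 fields, then 2 fields.
FILM_FPS = 24
pulldown = [3, 2] * (FILM_FPS // 2)  # fields spent on each film frame
print(sum(pulldown))                 # 60 fields -> one second of NTSC video

# The uneven 3-2-3-2 cadence is itself one source of judder on tube TVs.
```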
There's a film term called judder, and it's due to limitations of video recording. When the camera pans faster than the recording device can register images to its frames, it obviously skips some of the motion.
What you're seeing are those missing images being filled in by the TV. Some people like it, some hate it. I think it's great for video games; panning in an FPS is silky smooth.
Unfortunately, because standard TV and movie framerates sit in the 20s and 30s, anything smoother now looks alien to us, despite the fact that it ought to actually look better.
I tried watching The Dark Knight with smoothing enabled, but ugh, it really killed the mood. It felt like I was watching an '80s straight-to-video action flick.
We're also stuck with blurry, juddery, slow-panning 24fps movies forever because (thanks to 60fps home video) people associate high framerates with camcorders and cheap sitcoms, and thus think good framerates look 'fake'.
-xkcd
I'm still undecided on whether I like it for movies. The tech has really advanced, so it's more consistent (before, it was only noticeable in very select scenes), but it's still not perfect, and I see a lot of tearing. Some scenes look great with the fluid motion, but the tearing just gets to me!
Personally, when I'm watching a movie on a TV, I want it to look as much like film as possible.
I mean, yeah, technically my eyes aren't getting quite as much visual information as they could be, but if "better" quality in the form of a higher frame rate looks weird, it just looks weird.
I remember when 120Hz TVs first came out. I was watching Spider-Man 3 in the store, and the action scenes looked really fake. Like, you could tell they were filmed in front of a blue/green screen, etc. It was kinda distracting, but kinda cool at the same time.
I imagine it's something that we as TV watchers will need to get used to going forward.
Side note: I wonder if the first Colour TV adopters felt the same way...
The claim that 120hz and 240hz TVs run at 60hz is false. 120hz sets are capable of presenting 120 different images every second, not 60.
To answer your max fps question: it's because of motion blur. Your retina works like a magic piece of film that is constantly exposed and developed. It takes a certain amount of time for your eyes to stop seeing what they were seeing an instant before. Think of a bright flash bulb going off: it takes a few moments for your retina to lose that image. This is how motion blur works on your eye.
For the purposes of explanation, I'm going to break up time into moments similar to FPS.
If something jumps across your field of view quickly, it will streak across and blend into the background. This is because you only have one retina per eye, and what your eye saw in the first moment, it is still seeing during the second moment, only more faintly. In the third moment, the image is the accumulation of all three moments' worth of images. By the time the object has moved from one side to the other, 30 moments' worth of images have piled up and you have one big blurry mess.
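Here's a toy model of that pile-up, if you want to watch the streaking happen in numbers. The decay constant is completely invented; it's only there to illustrate the accumulation, not to model a real eye.

```python
import numpy as np

def what_the_eye_sees(stimuli, decay=0.7):
    """Toy persistence-of-vision model: each moment the eye registers the
    new stimulus plus a fading copy of everything it saw before. The
    decay value is made up for illustration, not a measured property."""
    seen = np.zeros_like(stimuli[0], dtype=float)
    for frame in stimuli:
        seen = frame + decay * seen
        yield seen

# A bright dot hopping across an 8-pixel field of view, one step per moment.
moments = [np.eye(1, 8, k).ravel() for k in range(8)]
for seen in what_the_eye_sees(moments):
    print(np.round(seen, 2))  # the dot drags a fading streak behind it
```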
A motion picture camera uses a virgin piece of film (retina) for every moment (frame) it captures. The important thing to remember is that a frame isn't a single moment of time, but rather a single SPAN of time. Film needs to accumulate light over time until an image is formed. What happens is the shutter opens up, the film accumulates light for the entire span of time it is open, then shuts. Then it does it again, but it can't do it right away. Some time passes before it can open up again, but the object is still moving while the camera has its "eyes closed." So when the shutter opens up again to expose a new frame of film, the beginning of this second moment doesn't correspond to the end of the first, and there is a gap in the object's trail. Cinematographers adjust the shutter angle to control how much motion blur there is. A narrow shutter angle means the film is exposed to less light, so narrower shutter angles need more light to compensate. BTW, pretty much all the action scenes in Gladiator were shot with very narrow shutters.
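The shutter-angle arithmetic is simple enough to write down. A rotary shutter is a spinning disc with a wedge cut out, and the wedge's angle sets the fraction of each frame's time slot during which the film actually sees light:

```python
def exposure_time(shutter_angle_deg, fps):
    """Seconds of light per frame for a rotary shutter."""
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(180, 24))  # classic film look: 1/48 s of blur per frame
print(exposure_time(45, 24))   # narrow shutter: 1/192 s, crisp and strobe-like
```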
Look at this video to see gaps in the motion blur that your eye wouldn't normally see. When the guy moves fast, the fire trail turns into discrete blocks; it looks like this _ _ _ _ _ _ _ _ instead of this _____________.
If you've ever messed with a strobe light, you can simulate what cameras are doing. By using a very bright, very brief flash, you are exposing your retinas with a clear image that doesn't move. By the time the next flash comes, reality has moved but the flash is so bright it appears to be still. The result is something that looks surreal because the motion blur is missing.
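Continuing the toy model from above: a strobe only lets brief instants through, with nothing for the eye to accumulate in between, which is exactly the missing-motion-blur look.

```python
import numpy as np

# Same hopping dot as before, but lit by a strobe that flashes every
# third moment; the moments between flashes are dark.
moments = [np.eye(1, 8, k).ravel() for k in range(8)]
strobed = [m if i % 3 == 0 else np.zeros(8) for i, m in enumerate(moments)]
for seen in strobed:
    print(seen)  # isolated crisp dots (_ _ _) instead of a smeared trail
```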
Auto Motion Plus does two things: it interpolates (invents) new frames between existing frames, and it removes motion blur by analyzing the frames. So what you have is a cross between the two extremes, where you are presented with these very clear, blur-free frames, and they come so quickly that it doesn't look like normal TV to your eyes, but instead looks like something less fake. The disconnect is that this less-fake image doesn't have the normal motion blur that your brain associates with TV or movies, since the TV is essentially strobing images at your eyes.
Auto Motion Plus means that your TV is taking the role of cinematographer, ruining the original program with its own vision of what the image should look like. All the Automotion-style features were invented for one reason: to sell TVs. 120/240hz TVs are a very good thing, because they let content be displayed unmodified. That makes a set worthwhile to enthusiasts and purists, but not to the general consumer, who can't tell the difference between 24fps content displayed on a 30hz screen and a 120hz screen, so interpolation was brought in to change the image so they'll say "of course I can tell the difference between this 240hz set and my crap box set at home."
Is this why Date Night's action scenes looked so bad in the theater (at least when I saw it)? It literally looked like they'd filmed it for TV and decided to release it in theaters at the last second.
NTSC works out to 29.97 FPS. PAL is 25 FPS. The framerates come from a trick used by the engineers of the time: since they were dealing with interlaced fields, they decided to sync the rate to the power frequency of their local grids, 60 Hz in North America and 50 Hz in Europe.
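The odd 29.97 (instead of a flat 30) came later, with color: the field rate was slowed by a factor of 1000/1001 so the new color subcarrier wouldn't beat against the audio carrier. The arithmetic:

```python
# Black-and-white NTSC: 60 fields/s, locked to the 60 Hz power grid.
# Color NTSC slowed everything by 1000/1001 to keep the color subcarrier
# from interfering with the audio carrier.
field_rate = 60 * 1000 / 1001  # ~59.94 fields per second
frame_rate = field_rate / 2    # two interlaced fields per full frame
print(round(frame_rate, 2))    # 29.97

print(50 / 2)                  # PAL: 50 Hz grid -> 25.0 frames per second
```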
TrueMotion / Auto Motion Plus looks like crap to me... and I'm glad we're "stuck" at 24 fps for movies. It just looks like what cinema should look like, imo.
But for games, it must be great!
That's just me, though.
Unfortunately, due to technological limitations in the past, we have grown accustomed to low framerates with motion blur as what looks "good."
Check out http://kimpix.net/2006/12/03/60fps-vs-24fps/ to see a video comparison of low framerates + motion blur vs unblurred high frame rates.