Actually, MPEG = JPEG for moving images. The advanced codecs do a much better job with constrained bit rates than the older ones, that's for sure. With MPEG2, macroblocking was the worst problem. With MPEG4 (h.264/AVC and VC-1) it just gets kind of... soft.
Well, yes, but VC1 != MPEG.
Yes it is, VC-1 is Microsoft's implementation of MPEG4 AVC/H.264.
I've seen full-screen 1080p videos encoded at ~10 mbps. Many of the videos you can download from the PS store are like that (using AVC). They look... ok... not bad, just kind of "soft". Less detail, and the colors are a little... I dunno, "flat". Versus the full bit rate (18+ mbps) BD movies, there's really no comparison. Not on a good, large-screen HDTV set, not to my eyes...
Given that the new techniques that we're discussing haven't actually been used in anything made available to the public, I don't think you've seen the 10mbps transfers that Amir was lauding.
Yes, an 18mbps HD-DVD or BD movie is going to be prettier than a hastily cobbled together 10mbps downloadable trailer. That's a given. The assertion isn't that current movie transfers at 10mbps are just as good as the upper echelon releases, like King Kong. It's that new techniques have made it such that we can get beautiful perceptually lossless transfers in the near future at that bit rate.
No, I sure don't know what new techniques you're talking about. I've seen VC-1, it's nothing new. Corpse Bride in particular was F'N amazing, totally pristine. Of course the bit rate was a lot closer to 20 (and well over that in many scenes) than to 10 mbps. The trailers I'm referring to were all AVC, which is practically identical to VC-1.
Can I play an HD-DVD in 1080p, on my xbox 360 with an HD-DVD addon? Without HDMI. Just normal component.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
As for scratching:
The disc on the left is a standard DVD, the one on the right has a Durabis 2 coating.
There...are...FOUR...lights!!!
Drez on
Switch: SW-7690-2320-9238 | Steam/PSN/Xbox: Drezdar
Viscountalpha | The pen is mightier than the sword | http://youtu.be/G_sBOsh-vyI | Registered User, regular
edited January 2007
--Viscountalpha: "Sony & the Blu-ray consortium needs to give the fuck up and just quit, or go bankrupt."
Jwalk
:roll: aaand then we return to the real world.
The ps3 isn't doing so well right now. Sony's quality has sucked much more lately. To think they are invincible is pure foolishness. Sony sure does act like they are invincible though.
Hard-coating technology
Verbatim announced in July 2006 that their Blu-ray Disc recordable and rewritable discs would incorporate their hard-coat ScratchGuard technology which protects against scratches, abrasion, fingerprints, and traces of grease.[6][7]
So, yes.
You're quoting Wikipedia without actual real-world information? Insta-fail right there. If you wanna rebuff my comments, use real information, not just quotes from Wikipedia.
I'm disqualifying that information unless you have better proof, real out-in-the-wild information. You can quote Wikipedia all you want, but without real information you have no argument.
My whole point of attacking Durabis is that it is unnecessary and that a lower data density is just fine for the time being. It's also much more expensive per disc. The movie studios hate this extra cost as much as I do. One of the porn industry people spoke about Blu-ray being too expensive when HD-DVD was significantly less expensive to make.
Overpriced technology fails when another technology is almost as good but less expensive. HD-DVD will win because more studios can produce the media and incur less expense through the whole process. There will be more HD-DVD players out as well because, again, they are less expensive.
So in closing, I predict HD-DVD will become the next media choice.
I'm disqualifying that information unless you have better proof, real out-in-the-wild information. You can quote Wikipedia all you want, but without real information you have no argument.
why is wikipedia not real information?
bongi on
AbsoluteZero | The new film by Quentin Koopantino | Registered User, regular
edited January 2007
Right now I have to say I am still leaning towards Blu-Ray... but if this triple layer HD-DVD gets standardized, I will gladly jump ship.
AbsoluteZero on
Viscountalpha | The pen is mightier than the sword | http://youtu.be/G_sBOsh-vyI | Registered User, regular
I'm disqualifying that information unless you have better proof, real out-in-the-wild information. You can quote Wikipedia all you want, but without real information you have no argument.
why is wikipedia not real information?
It's not reliable when corporate shills can immediately alter/modify the information to whatever they please. The quote he used was old, too.
I'm disqualifying that information unless you have better proof, real out-in-the-wild information. You can quote Wikipedia all you want, but without real information you have no argument.
why is wikipedia not real information?
It's not reliable when corporate shills can immediately alter/modify the information to whatever they please. The quote he used was old, too.
ok, so don't you kind of have to show that it's edited by corporate shills? discounting something because it might be seems fairly ridiculous
besides which, what is "real world information"? you and your friends managed to scratch a disc with durabis? that's anecdotal evidence and little more. something you read in a post on some forum? a scientific paper?
Actually, MPEG = JPEG for moving images. The advanced codecs do a much better job with constrained bit rates than the older ones, that's for sure. With MPEG2, macroblocking was the worst problem. With MPEG4 (h.264/AVC and VC-1) it just gets kind of... soft.
Well, yes, but VC1 != MPEG.
Yes it is, VC-1 is Microsoft's implementation of MPEG4 AVC/H.264.
My bad, that should've read "VC1 != MPEG2". Because, as you said, the codecs are very different.
No, I sure don't know what new techniques you're talking about. I've seen VC-1, it's nothing new. Corpse Bride in particular was F'N amazing, totally pristine. Of course the bit rate was a lot closer to 20 (and well over that in many scenes) than to 10 mbps. The trailers I'm referring to were all AVC, which is practically identical to VC-1.
While I'm not a compression expert, I suspect it has to do with things like selecting the quantization matrices (to use a jpeg analogy) and such. Perhaps they've come up with algorithms that can be used to perform a lot of fine-tuning type things automatically.
Really, you just seem to be willfully ignoring the fact that Amir has claimed that the techniques used in VC1 encoding have been greatly improved, such that we can now get by with lower bit rates. I don't mind if you want to adopt an "I'll believe it when I see it" attitude; that's cool. But this whole, "Nope, what you're talking about doesn't exist, anything less than 25mbps must necessarily suck because stuff that I've seen from once upon a time that was at 10mbps sucked, LALALALALA I CAN'T HEAR YOU" thing is really starting to grate.
Can I play an HD-DVD in 1080p, on my xbox 360 with an HD-DVD addon? Without HDMI. Just normal component.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
It's actually not quite that simple. Yes, the 360 can output a 1080p signal via component. However, not all TVs capable of 1080p can receive an analog 1080p signal; many of them only accept 1080p via HDMI (or DVI), and will downconvert the component signal to 1080i. As of very recently, the number of TVs that could do so was pretty small.
ElJeffe on
Maddie: "I named my feet. The left one is flip and the right one is flop. Oh, and also I named my flip-flops."
I'm disqualifying that information unless you have better proof, real out-in-the-wild information. You can quote Wikipedia all you want, but without real information you have no argument.
why is wikipedia not real information?
It's not reliable when corporate shills can immediately alter/modify the information to whatever they please. The quote he used was old, too.
ok, so don't you kind of have to show that it's edited by corporate shills? discounting something because it might be seems fairly ridiculous
besides which, what is "real world information"? you and your friends managed to scratch a disc with durabis? that's anecdotal evidence and little more. something you read in a post on some forum? a scientific paper?
The Wikipedia article also has sources for that statement.
Couscous on
AbsoluteZero | The new film by Quentin Koopantino | Registered User, regular
Right now I have to say I am still leaning towards Blu-Ray... but if this triple layer HD-DVD gets standardized, I will gladly jump ship.
why, exactly, had you been leaning toward blu-ray?
(i think i'm like most of the population on this matter: i have no dog in the fight. i don't want to have a dog in the fight. i tend to pick losers.)
Superior storage space, and that's the only reason.
If HD-DVD gets up to 51GB standard, I would choose HD-DVD. Why? Because I think it will be a cold day in Hell when Blu-Ray gets a 3 layer or higher standard... and I think we can thank the Playstation 3 for that.
I don't think I said "anything less than 25 mbps will suck". But yes, I am skeptical that your average movie can be compressed at 10mbps and still look "transparent" to the original source - or even to the same material compressed at 18+ mbps. Even that guy admits it requires very compressible source material to achieve good results at that low of a bit rate.
I'd love to be proven wrong though.
The problem is that any new "encoding techniques" have to be 100% compatible with the existing DEcoding hardware in current players. The codecs are locked - were locked, a year or two ago actually. They have to be, for compatibility. Any disc you buy today has to be playable in any future player, and vice-versa.
Same reason you won't see 3-layer discs - not for home video/movies anyway. For computer data storage maybe. But then as I pointed out pages ago, TDK showed off prototypes of 4-layer (100GB) and 6-layer (200GB, using 33.3GB layers) Blu-ray recordable discs at CES.
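For reference, here's the rough math behind the bit rates being argued about (a back-of-the-envelope sketch only: constant video bit rate, audio and disc overhead ignored):

```python
# Back-of-the-envelope: how big a movie is at a given constant bit rate.
# Decimal gigabytes, the same units disc capacities are quoted in.

def movie_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Stream size at a constant bit rate, in decimal GB."""
    bits = bitrate_mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000_000

two_hours = 120
print(round(movie_size_gb(10, two_hours), 1))  # 9.0  GB: fits on one 15 GB HD-DVD layer
print(round(movie_size_gb(18, two_hours), 1))  # 16.2 GB: wants a 25 GB Blu-ray layer
```

Which is why the 10mbps-vs-18mbps fight matters so much for the capacity argument: at 10 mbps a two-hour film fits comfortably on a single layer of either format.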
While I'm not a compression expert, I suspect it has to do with things like selecting the quantization matrices (to use a jpeg analogy) and such. Perhaps they've come up with algorithms that can be used to perform a lot of fine-tuning type things automatically.
I would think that they'd want to have control over the results, and only give the encoder a certain amount of leeway to make decisions. The x264 implementation of H.264 offers options to do this, such as allowing you to specify minimum and maximum quantizer, and the amount the quantizer is allowed to change between frames.
You may also specify custom luma and chroma quantization matrices.
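x264's actual rate control is far more involved than this, but here's a toy Python sketch (invented function name, not x264 code) of what a minimum/maximum quantizer and a max per-frame quantizer step conceptually do:

```python
# Toy illustration (not x264 source): the encoder proposes a quantizer per
# frame; the options clamp it to a hard floor/ceiling and limit how far it
# can jump from one frame to the next.

def clamp_quantizers(proposed, qp_min=10, qp_max=51, qp_step=4):
    out = []
    prev = None
    for qp in proposed:
        qp = max(qp_min, min(qp_max, qp))            # hard floor/ceiling
        if prev is not None:                          # limit frame-to-frame jumps
            qp = max(prev - qp_step, min(prev + qp_step, qp))
        out.append(qp)
        prev = qp
    return out

print(clamp_quantizers([20, 35, 8, 60]))  # [20, 24, 20, 24]
```

The point being: quality swings get smoothed out, at the cost of the encoder occasionally not spending bits exactly where it wanted to.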
The problem is that any new "encoding techniques" have to be 100% compatible with the existing DEcoding hardware in current players. The codecs are locked - were locked, a year or two ago actually.
That's not exactly a problem. So long as they don't fuck around with the bitstream format and introduce a new feature they are free to tinker with the encoder all they like.
Same reason you won't see 3-layer discs - not for home video/movies anyway. For computer data storage maybe. But then as I pointed out pages ago, TDK showed off prototypes of 4-layer (100GB) and 6-layer (200GB, using 33.3GB layers) Blu-ray recordable discs at CES.
I doubt we'll see those any time soon. Late last year the fine folks over at AVSforums were reporting that the success rate for dual-layer Blu-ray discs was something like 1 in 400. That's not exactly encouraging.
Barrakketh on
Rollers are red, chargers are blue....omae wa mou shindeiru
It's hard to tell whether you're being deliberately antagonistic, or just really stupid. He never makes the claim to universality. . .
Yes he does. Saying that content is "not random, special case, easy to encode content" is the exact same thing as saying "all content can be encoded this way". Not a special case = a general case.
I don't think there was any sinister motive here. I think that Amir got a little too enthusiastic about the results seen, and realized he had to back down a bit. In other words, he's really talking about films that are particularly easy to encode.
That doesn't mean the achievement isn't still impressive.
Liabe Brave on
My name is Christian Smith.
"I just want people to see my action heart."
I don't think I said "anything less than 25 mbps will suck". But yes, I am skeptical that your average movie can be compressed at 10mbps and still look "transparent" to the original source - or even to the same material compressed at 18+ mbps. Even that guy admits it requires very compressible source material to achieve good results at that low of a bit rate.
Dude, no he doesn't. Are you not reading it, or are you just being willfully obtuse? He says this:
Yes, you read this right. We are now able to go below the 10 mbit/sec barrier. And this is not some random, special case, easy to encode content. One of the titles above is a major motion picture you would recognize in an instant with a ton of action.
[bolding mine]
You seem to be taking this:
Of course, this is not to say all movies and all content will go this low. But that there is a significant saving here across all.
And interpreting it as, "There are, like, 2 movies that we can encode at 10mbps without having them look like monkey wang," which is a ridiculous reading thereof. What he's saying is, "Not all movies are equally compressible. Yes, some movies will require a higher-than-10mbps bitrate in order to achieve maximum quality. However, 10mbps will be the rule rather than the exception."
Again, you can be skeptical, and that's cool, but chill with the fucking misinformation campaign.
ElJeffe on
syndalis | Getting Classy | On the Wall | Registered User, Loves Apple Products, regular
edited January 2007
jwalk.
A LAME-encoded MP3 sounds worlds better than a WMP10-encoded MP3; it uses the same codec, same bitrate... only does a better job of packaging it. In fact, a 96kbps LAME MP3 sounds comparable to a 128kbps WMP10 MP3.
Hardware plays little to no part in this. If the compressionists can find a better algorithm to handle the data within the constraints of the codec, then they can.
You seem to be talking a lot about a subject you don't know much about.
syndalis on
SW-4158-3990-6116
Let's play Mario Kart or something...
I thought I would share here that I bought the 360 HD-DVD drive today and it looks amazingly good. I am really impressed. I don't know if the difference is worth the money versus normal DVD, but if you aren't hurting for cash, can afford it and have an hdtv...it is not a bad deal. I haven't seen bluray yet, but I am hoping I will be able to rent a player somewhere soon.
You're quoting Wikipedia without actual real-world information? Insta-fail right there. If you wanna rebuff my comments, use real information, not just quotes from Wikipedia.
I'm disqualifying that information unless you have better proof, real out-in-the-wild information. You can quote Wikipedia all you want, but without real information you have no argument.
If you'd bother to actually think instead of going "olo wikipedia" you'd realize that the article actually has 45 sources, two of which alone confirm the very sentence you derided.
Actually, MPEG = JPEG for moving images. The advanced codecs do a much better job with constrained bit rates than the older ones, that's for sure. With MPEG2, macroblocking was the worst problem. With MPEG4 (h.264/AVC and VC-1) it just gets kind of... soft.
Well, yes, but VC1 != MPEG.
Yes it is, VC-1 is Microsoft's implementation of MPEG4 AVC/H.264.
I've seen full-screen 1080p videos encoded at ~10 mbps. Many of the videos you can download from the PS store are like that (using AVC). They look... ok... not bad, just kind of "soft". Less detail, and the colors are a little... I dunno, "flat". Versus the full bit rate (18+ mbps) BD movies, there's really no comparison. Not on a good, large-screen HDTV set, not to my eyes...
Given that the new techniques that we're discussing haven't actually been used in anything made available to the public, I don't think you've seen the 10mbps transfers that Amir was lauding.
Yes, an 18mbps HD-DVD or BD movie is going to be prettier than a hastily cobbled together 10mbps downloadable trailer. That's a given. The assertion isn't that current movie transfers at 10mbps are just as good as the upper echelon releases, like King Kong. It's that new techniques have made it such that we can get beautiful perceptually lossless transfers in the near future at that bit rate.
No, I sure don't know what new techniques you're talking about. I've seen VC-1, it's nothing new. Corpse Bride in particular was F'N amazing, totally pristine. Of course the bit rate was a lot closer to 20 (and well over that in many scenes) than to 10 mbps. The trailers I'm referring to were all AVC, which is practically identical to VC-1.
Can I play an HD-DVD in 1080p, on my xbox 360 with an HD-DVD addon? Without HDMI. Just normal component.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
As for scratching:
The disc on the left is a standard DVD, the one on the right has a Durabis 2 coating.
While impressive, did the Blu-Ray disc actually still play after that? Coz you know, that's kind of the litmus test, not "clearly we've destroyed a regular DVD".
Can I play an HD-DVD in 1080p, on my xbox 360 with an HD-DVD addon? Without HDMI. Just normal component.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
It's actually not quite that simple. Yes, the 360 can output a 1080p signal via component. However, not all TVs capable of 1080p can receive an analog 1080p signal; many of them only accept 1080p via HDMI (or DVI), and will downconvert the component signal to 1080i. As of very recently, the number of TVs that could do so was pretty small.
Actually, it's not even that simple. The Xbox 360 outputs 1080p over component for games and games alone. 1080i over component for HD-DVDs, and 480p over component for regular DVDs.
If you want to view them all in 1080p, you will need a way to view them with the VGA adaptor.
As for scratching:
The disc on the left is a standard DVD, the one on the right has a Durabis 2 coating.
What's funny about this is that you have in fact just hurt your case. Durabis is being used for 200GB discs. And... what else?
What I take from this is that so far Durabis isn't being used on all of those existing PS3 games and Blu-ray movies, which also have the data .1mm from the surface.
When will we see these on commercial releases? If ever?
It actually kind of seems to me (since 200GB discs can never ever ever ever ever be commercial releases, and have to be exclusively for personal storage, a-la CD+R/RW) that these Durabis-protected discs will be the high-end "professional grade back-up" storage solution. So all my movies and games are still royally fucked. All of the content that I actually paid for and want protected isn't going to be. Or at the very least, isn't yet.
Can I play an HD-DVD in 1080p, on my xbox 360 with an HD-DVD addon? Without HDMI. Just normal component.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
It's actually not quite that simple. Yes, the 360 can output a 1080p signal via component. However, not all TVs capable of 1080p can receive an analog 1080p signal; many of them only accept 1080p via HDMI (or DVI), and will downconvert the component signal to 1080i. As of very recently, the number of TVs that could do so was pretty small.
Actually, it's not even that simple. The Xbox 360 outputs 1080p over component for games and games alone. 1080i over component for HD-DVDs, and 480p over component for regular DVDs.
If you want to view them all in 1080p, you will need a way to view them with the VGA adaptor.
Interesting, I have my 360 display settings set to 1080p, and I have the HD-DVD drive now as well. Watching movies they look really good, but I am not seeing them in 1080p?
Interesting, I have my 360 display settings set to 1080p, and I have the HD-DVD drive now as well. Watching movies they look really good, but I am not seeing them in 1080p?
If you take a real quick look at the first page or so of this thread (it gets dull really quickly), then technically no, 1080p isn't in the HD DVD spec; they're looking at adding it sometime in the future.
Regarding the overall 1080i or 1080p issue, the HD-DVD specifications only support 1080i at this point in time. There are discussions about adding support for 1080p for higher performance profile players, but this has not been decided yet.
What it will do, where the movie has set it in the metadata, is the fabled reverse 3:2 pulldown of the interlaced source which means the output on the TV will be identical to a 1080p source.
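For anyone lost in the pulldown talk, here's a simplified frame-level sketch of 3:2 pulldown and its reversal (field parity and the stream flags a real player uses are ignored; this is just the cadence):

```python
# Simplified model of 3:2 pulldown: 24 fps film frames are held for
# alternately 2 and 3 fields each to fill 60 fields/sec (4 frames -> 10
# fields). Reversing it ("pullup") drops the repeats and recovers the
# original progressive frames, which is why the output can match a 1080p
# source. Toy sketch only: assumes adjacent frames are distinguishable.

def pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)   # 2,3,2,3,... cadence
    return fields

def pullup(fields):
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:              # drop the repeats
            frames.append(f)
    return frames

fields = pulldown(list("ABCD"))
print(fields)                           # ['A','A','B','B','B','C','C','D','D','D']
print(pullup(fields) == list("ABCD"))   # True
```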
Can I play an HD-DVD in 1080p, on my xbox 360 with an HD-DVD addon? Without HDMI. Just normal component.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
It's actually not quite that simple. Yes, the 360 can output a 1080p signal via component. However, not all TVs capable of 1080p can receive an analog 1080p signal; many of them only accept 1080p via HDMI (or DVI), and will downconvert the component signal to 1080i. As of very recently, the number of TVs that could do so was pretty small.
Actually, it's not even that simple. The Xbox 360 outputs 1080p over component for games and games alone. 1080i over component for HD-DVDs, and 480p over component for regular DVDs.
If you want to view them all in 1080p, you will need a way to view them with the VGA adaptor.
This is why "High Def" sucks balls. Seriously, this is just stupid.
With pretty much any TV doing 1080p, you won't be able to tell a difference between an HD-DVD at 1080p and one at 1080i, as the set will just do a 3:2 pulldown and upconvert the i to a p.
Can I play an HD-DVD in 1080p, on my xbox 360 with an HD-DVD addon? Without HDMI. Just normal component.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
It's actually not quite that simple. Yes, the 360 can output a 1080p signal via component. However, not all TVs capable of 1080p can receive an analog 1080p signal; many of them only accept 1080p via HDMI (or DVI), and will downconvert the component signal to 1080i. As of very recently, the number of TVs that could do so was pretty small.
Actually, it's not even that simple. The Xbox 360 outputs 1080p over component for games and games alone. 1080i over component for HD-DVDs, and 480p over component for regular DVDs.
If you want to view them all in 1080p, you will need a way to view them with the VGA adaptor.
This is why "High Def" sucks balls. Seriously, this is just stupid.
Thank the movie studios for all that bullshit.
victor_c26 on
It's been so long since I've posted here, I've removed my signature since most of what I had here were broken links. Shows over, you can carry on to the next post.
With pretty much any TV doing 1080p, you won't be able to tell a difference between an HD-DVD at 1080p and one at 1080i, as the set will just do a 3:2 pulldown and upconvert the i to a p.
3:2 pulldown is used for telecine content. To remove the telecine, you'd need to pullup the content. I know that's how you'd remove the telecine when encoding video with mencoder. "Converting" I to P would properly be called deinterlacing.
Barrakketh on
With pretty much any TV doing 1080p, you won't be able to tell a difference between an HD-DVD at 1080p and one at 1080i, as the set will just do a 3:2 pulldown and upconvert the i to a p.
3:2 pulldown is used for telecine content. To remove the telecine, you'd need to pullup the content. I know that's how you'd remove the telecine when encoding video with mencoder. "Converting" I to P would properly be called deinterlacing.
Yeah there we go.
All these stupid words make my head spin.
But a 1080i hd-dvd will look no different on your 1080p tv, than a 1080p hd-dvd.
With pretty much any TV doing 1080p, you won't be able to tell a difference between an HD-DVD at 1080p and one at 1080i, as the set will just do a 3:2 pulldown and upconvert the i to a p.
3:2 pulldown is used for telecine content. To remove the telecine, you'd need to pullup the content. I know that's how you'd remove the telecine when encoding video with mencoder. "Converting" I to P would properly be called deinterlacing.
Yeah there we go.
All these stupid words make my head spin.
But a 1080i hd-dvd will look no different on your 1080p tv, than a 1080p hd-dvd.
Just a quick clarification, but all HD-DVDs are 1080p, it's just the output that is 1080i. I think... *boggle*.
Rook on
AbsoluteZero | The new film by Quentin Koopantino | Registered User, regular
edited January 2007
Is there that much of a visual difference between 1080i and 1080p? I assume the motion is just smoother, there isn't more detail or anything like that.
AbsoluteZero on
syndalis | Getting Classy | On the Wall | Registered User, Loves Apple Products, regular
Is there that much of a visual difference between 1080i and 1080p? I assume the motion is just smoother, there isn't more detail or anything like that.
Bad Jaggies are BAD.
syndalis on
AbsoluteZero | The new film by Quentin Koopantino | Registered User, regular
Is there that much of a visual difference between 1080i and 1080p? I assume the motion is just smoother, there isn't more detail or anything like that.
With pretty much any TV doing 1080p, you won't be able to tell a difference between an HD-DVD at 1080p and one at 1080i, as the set will just do a 3:2 pulldown and upconvert the i to a p.
3:2 pulldown is used for telecine content. To remove the telecine, you'd need to pullup the content. I know that's how you'd remove the telecine when encoding video with mencoder. "Converting" I to P would properly be called deinterlacing.
Yeah there we go.
All these stupid words make my head spin.
But a 1080i hd-dvd will look no different on your 1080p tv, than a 1080p hd-dvd.
Just a quick clarification, but all HD-DVDs are 1080p, it's just the output that is 1080i. I think... *boggle*.
Yes. There is absolutely no such thing as a "1080i HD-DVD." If you have an HD-DVD - all of which are in 1080p - and play it on a 1080i TV or through a device that only sends signals up to 1080i, you will be watching it in 1080i. But all HD-DVDs have 1080p media on them.
It's similar to 7.1/5.1 sound. If you play 5.1 sound through two speakers, you'll get stereo. *shrug*
Is there that much of a visual difference between 1080i and 1080p? I assume the motion is just smoother, there isn't more detail or anything like that.
Bad Jaggies are BAD.
It reduces jaggies? How?
Okay.
Take an interlaced frame... you are getting one half of the lines each frame, and it shifts back and forth between the lines. This creates a certain kind of choppiness that only exhibits itself in motion.
A curved line, or a set of stairs at an angle on the screen where the camera is panning from left to right, will LOOK jaggy on an interlaced input because the frame of reference continues to move while the second half of the frame sets in.
1080i makes fast-paced action scenes not so hot compared to 1080p. Of course, there are a great many people who won't notice or care, and 1080i is PERFECTLY sufficient for slower filmed things, like nature shows and documentaries.
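A toy illustration of that combing effect (pure ASCII sketch, not real video processing): weave together two fields captured while a vertical edge was moving, and the edge comes out serrated because even and odd lines come from different moments in time:

```python
# Toy model of interlace "combing": a vertical edge moves right between the
# two field captures. Weaving the fields into one frame leaves the edge
# serrated, since even lines come from field 1 and odd lines from field 2.

def edge_line(edge_x, width=8):
    return "#" * edge_x + "." * (width - edge_x)

def weave(edge_even, edge_odd, lines=4):
    # even-numbered lines from field 1, odd-numbered lines from field 2
    return [edge_line(edge_even if y % 2 == 0 else edge_odd) for y in range(lines)]

for row in weave(3, 5):
    print(row)
# ###.....
# #####...
# ###.....
# #####...
```

Deinterlacing is the business of guessing that serrated frame back into a clean one.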
syndalis on
AbsoluteZero | The new film by Quentin Koopantino | Registered User, regular
Is there that much of a visual difference between 1080i and 1080p? I assume the motion is just smoother, there isn't more detail or anything like that.
Bad Jaggies are BAD.
It reduces jaggies? How?
Okay.
Take an interlaced frame... you are getting one half of the lines each frame, and it shifts back and forth between the lines. This creates a certain kind of choppiness that only exhibits itself in motion.
A curved line, or a set of stairs at an angle on the screen where the camera is panning from left to right, will LOOK jaggy on an interlaced input because the frame of reference continues to move while the second half of the frame sets in.
1080i makes fast-paced action scenes not so hot compared to 1080p. Of course, there are a great many people who won't notice or care, and 1080i is PERFECTLY sufficient for slower filmed things, like nature shows and documentaries.
Well I think 1080p is a ways off from being the standard... I don't know what I'd rather have. Smooth motion with no jaggies (720p) or increased detail (1080i).
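Some rough pixel-rate arithmetic behind that 720p-vs-1080i trade-off (ignoring blanking and overscan): 1080i only delivers half the lines per pass, so per second it moves roughly the same number of pixels as 720p, just traded differently between spatial detail and temporal smoothness.

```python
# Rough pixel throughput per second for the common HD modes. An interlaced
# pass (a field) carries only half the lines of the full frame.

def pixels_per_second(width, height, rate, interlaced=False):
    per_pass = width * (height // 2 if interlaced else height)
    return per_pass * rate

print(pixels_per_second(1280, 720, 60))         # 55296000  (720p60)
print(pixels_per_second(1920, 1080, 60, True))  # 62208000  (1080i60: 60 fields/sec)
print(pixels_per_second(1920, 1080, 60))        # 124416000 (1080p60)
```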
Posts
Yes it is, VC-1 is Microsoft's implementaion of MPEG4 AVC/H.264.
No, I sure don't know what new techniques you're talking about. I've seen VC-1, it's nothing new. Corpse Bride in particular was F'N amazing, totally pristine. Of course the bit rate was a lot closer to 20 (and well over that in many scenes) than to 10 mpbs. The trailers I'm referring to were all AVC, which is practically identical to VC-1.
With current disc releases, yes. Once Hollywood enables "ICT" - Image Constraint Token - on future releases, no you won't be able to watch anything higher than 480p on analog component. That's not guaranteed to happen though.
As for scratching:
The disc on the left is a standard DVD, the one on the right has a Durabis 2 coating.
There...are...FOUR...lights!!!
Granted, I'm an obsessive compulsive grammar nazi, but still.
Its not reliable when corporate shills can immediatly alter/modify the information to what ever they please. The quote he used specifically was old too.
besides which, what is "real world information"? you and your friends managed to scratch a disc with durabis? that's anecdotal evidence and little more; you read in a post on some forum? a scientific paper?
why, exactly, had you been leaning toward blu-ray?
(i think i'm like most of the population on this matter: i have no dog in the fight. i don't want to have a dog in the fight. i tend to pick losers.)
My bad, that should've read "VC1 != MPEG2". Because, as you said, the codecs are very different.
While I'm not a compression expert, I suspect it has to do with things like selecting the quantization matrices (to use a jpeg analogy) and such. Perhaps they've come up with algorithms that can be used to perform a lot of fine-tuning type things automatically.
Really, you just seem to be willfully ignoring the fact that Amir has claimed that the techniques used in VC1 encoding have been greatly improved, such that we can now get by with lower bit rates. I don't mind if you want to adapt an "I'll believe it when I see it" attitude; that's cool. But this whole, "Nope, what you're talking about doesn't exist, anything less than 25mbps must necessarily suck because stuff that I've seen from once upon a time that was at 10mbps sucked, LALALALALA I CAN'T HEAR YOU" thing is really starting to grate.
It's actually not quite that simple. Yes, the 360 can output a 1080p signal via component. However, not all TVs capable of 1080p can receive an analog 1080p signal; many of them only accept 1080p via HDMI (or DVI), and will downconvert the component signal to 1080i. As of very recently, the number of TVs that could do so was pretty small.
Superior storage space, and that's the only reason.
If HD-DVD gets up to 51GB standard, I would choose HD-DVD. Why? Because I think it will be a cold day in Hell when Blu-Ray gets a 3 layer or higher standard... and I think we can thank the Playstation 3 for that.
I don't think I said "anything less than 25mbps will suck". But yes, I am skeptical that your average movie can be compressed at 10mbps and still look "transparent" to the original source - or even to the same material compressed at 18+mbps. Even that guy admits it requires very compressible source material to achieve good results at that low a bit rate.
I'd love to be proven wrong though.
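To put that skepticism in numbers: at 1080p/24fps, 10 Mbps works out to roughly 0.2 bits per pixel before the codec does any of its magic. A quick back-of-the-envelope sketch (illustrative arithmetic only, not a claim about any particular encoder):

```python
# Rough bits-per-pixel math for 1080p24 video at a few bitrates.
# Real codecs spend bits very unevenly across frames, so this is
# only a coarse way to compare bitrate budgets.

def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

for mbps in (10, 18, 25):
    bpp = bits_per_pixel(mbps * 1_000_000, 1920, 1080, 24)
    print(f"{mbps} Mbps -> {bpp:.3f} bits/pixel")
```

So the 10mbps camp is claiming transparency at a little over half the bit budget of an 18mbps transfer, which is exactly why the "new techniques" claim matters.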
The problem is that any new "encoding techniques" have to be 100% compatible with the existing DEcoding hardware in current players. The codecs are locked - were locked a year or two ago, actually. They have to be, for compatibility. Any disc you buy today has to be playable in any future player, and vice-versa.
Same reason you won't see 3-layer discs - not for home video/movies anyway. For computer data storage maybe. But then as I pointed out pages ago, TDK showed off prototypes of 4-layer (100GB) and 6-layer (200GB, using 33.3GB layers) Blu-ray recordable discs at CES.
I would think that they'd want to have control over the results, and only give the encoder a certain amount of leeway to make decisions. The x264 implementation of H.264 offers options to do this, such as allowing you to specify minimum and maximum quantizer, and the amount the quantizer is allowed to change between frames.
You may also specify custom luma and chroma quantization matrices. That's not exactly a problem. So long as they don't fuck around with the bitstream format and introduce a new feature they are free to tinker with the encoder all they like.
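For anyone curious what that kind of encoder-side tuning looks like, here's a toy sketch of quantizer clamping in the spirit of x264's --qpmin/--qpmax/--qpstep options. This is hypothetical simplified logic for illustration, not x264's actual rate-control code:

```python
# Toy sketch: constrain an encoder's quantizer (QP) decisions the way
# options like x264's --qpmin, --qpmax, and --qpstep do.
# (Hypothetical simplified logic, not a real encoder's implementation.)

def clamp_qp(requested_qp, prev_qp, qp_min=10, qp_max=51, qp_step=4):
    # First limit how far the quantizer may move from the previous
    # frame's value, then keep it inside the allowed global range.
    qp = max(prev_qp - qp_step, min(prev_qp + qp_step, requested_qp))
    return max(qp_min, min(qp_max, qp))

# A sudden scene change asks for QP 40, but the previous frame was 22:
print(clamp_qp(40, 22))  # -> 26, moved by at most qp_step
```

The point being: all of this tuning happens on the encoding side, so the resulting bitstream is still perfectly legal for existing decoders.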
EDIT: I doubt we'll see those any time soon. Late last year the fine folks over at AVSforums were reporting that the success rate for dual-layer Blu-ray discs was something like 1 in 400. That's not exactly encouraging.
Yes he does. Saying that content is "not random, special case, easy to encode content" is the exact same thing as saying "all content can be encoded this way". Not a special case = a general case.
I don't think there was any sinister motive here. I think that Amir got a little too enthusiastic about the results seen, and realized he had to back down a bit. In other words, he's really talking about films that are particularly easy to encode.
That doesn't mean the achievement isn't still impressive.
"I just want people to see my action heart."
Dude, no he doesn't. Are you not reading it, or are you just being willfully obtuse? He says this:
[bolding mine]
You seem to be taking this:
And interpreting it as, "There are, like, 2 movies that we can encode at 10mbps without having them look like monkey wang," which is a ridiculous reading thereof. What he's saying is, "Not all movies are equally compressible. Yes, some movies will require a higher-than-10mbps bitrate in order to achieve maximum quality. However, 10mbps will be the rule rather than the exception."
Again, you can be skeptical, and that's cool, but chill with the fucking misinformation campaign.
A LAME-encoded MP3 sounds worlds better than a WMP10-encoded MP3 at the same bitrate - same codec, the encoder just does a better job of spending the bits. In fact, a 96kbps LAME MP3 sounds comparable to a 128kbps WMP10 MP3.
Hardware plays little to no part in this. If the compressionists can find a better algorithm to handle the data within the constraints of the codec, then they're free to use it.
You seem to be talking a lot about a subject you don't know much about.
Let's play Mario Kart or something...
I wasn't expecting to see these for some time.
Blu-Ray burners and media are easy to find, however. A $499 Samsung Blu-Ray burner is supposed to come out sometime Q2 this year...
I wonder what the deal is... I also wonder why there are no dual layer Blu-Ray discs or burners out there....
:roll:
http://www.cdr-zone.com/news/verbatim_introduces_blu_ray_in_q3.html
http://www.ubergizmo.com/15/archives/2006/07/verbatim_to_release_bdr_bdre_media.html
If you'd bother to actually think instead of going "olo wikipedia", you'd realize that the article actually has 45 sources, two of which confirm the very sentence you derided.
If you want to view them all in 1080p, you'll need to use the VGA adaptor.
What I take from this is that so far Durabis isn't being used on all of those existing PS3 games and Blu-ray movies, which also have the data .1mm from the surface.
When will we see these on commercial releases? If ever?
It actually kind of seems to me (since 200GB discs can never ever ever ever ever be commercial releases, and have to be exclusively for personal storage, a la CD-R/RW) that these Durabis-protected discs will be the high-end "professional grade back-up" storage solution. So all my movies and games are still royally fucked. All of the content that I actually paid for and want protected isn't going to be. Or at the very least, isn't yet.
Interesting, I have my 360 display settings set to 1080p, and I have the HD-DVD drive now as well. Watching movies, they look really good, but am I not seeing them in 1080p?
If you take a real quick look at the first page or so of this thread
http://www.avsforum.com/avs-vb/showthread.php?t=789237
(it gets dull really quickly) then technically no, 1080p isn't in the HD DVD spec, they're looking at adding it sometime in the future.
What it will do, where the movie has set it in the metadata, is the fabled reverse 3:2 pulldown of the interlaced source, which means the output on the TV will be identical to a 1080p source.
So no biggie really. No real world difference.
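For the curious, the 3:2 pulldown cadence and its reversal can be sketched like this. It's a toy field-label model (no actual pixel data), just to show why the player can recover the original film frames from the interlaced stream:

```python
# Sketch of 3:2 pulldown and its reversal: 4 film frames become
# 10 interlaced fields, and the player can reassemble the originals.
# (Simplified label-based model, not real video processing.)

def pulldown_32(frames):
    """Spread 24 fps film frames across 60i fields in a 2-3-2-3 cadence."""
    fields, repeats = [], [2, 3, 2, 3]
    for frame, n in zip(frames, repeats * (len(frames) // 4)):
        fields.extend([frame] * n)
    return fields

def inverse_pulldown(fields):
    """Recover the unique film frames by dropping the repeated fields."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

film = ["A", "B", "C", "D"]
fields = pulldown_32(film)
print(fields)                    # 10 fields for 4 frames
print(inverse_pulldown(fields))  # ['A', 'B', 'C', 'D']
```

Since no field data is thrown away, reversing the cadence really does give you back the full progressive frames, which is why the 1080i-vs-1080p distinction is moot for film sources.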
Click me for Sin City Breakfast Tacos! | Come discuss CG with us!
Thank the movie studios for all that bullshit.
All these stupid words make my head spin.
But a 1080i signal from an HD-DVD will look no different on your 1080p TV than a 1080p signal would.
Just a quick clarification, but all HD-DVDs are 1080p, it's just the output that is 1080i. I think... *boggle*.
It reduces jaggies? How?
Yes. There is absolutely no such thing as a "1080i HD-DVD." If you have an HD-DVD - all of which are in 1080p - and play it on a 1080i TV or through a device that only sends signals up to 1080i, you will be watching it in 1080i. But all HD-DVDs have 1080p media on them.
It's similar to 7.1/5.1 sound. If you play 5.1 sound through two speakers, you'll get stereo. *shrug*
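That downmix analogy can be made concrete. A common ITU-style 5.1-to-stereo fold-down attenuates the center and surround channels by about 3 dB and drops the LFE; the coefficients below are the conventional ones, but exact values vary by decoder:

```python
# Minimal sketch of a common 5.1 -> stereo downmix (ITU-style
# coefficients: center and surrounds attenuated ~3 dB, LFE dropped).
import math

def downmix_51_to_stereo(fl, fr, c, lfe, sl, sr):
    a = 1 / math.sqrt(2)  # ~0.707, a -3 dB attenuation
    left = fl + a * c + a * sl
    right = fr + a * c + a * sr
    return left, right

# Center-channel dialogue ends up split evenly between both speakers:
print(downmix_51_to_stereo(0, 0, 1, 0, 0, 0))
```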
*note to self: lern 2 quote nub
Take an interlaced frame... you are getting one half of the lines each field, and it alternates back and forth between the odd and even lines. This creates a certain kind of choppiness that only exhibits itself in motion.
A curved line, or a set of stairs at an angle, on a screen where the camera is panning from left to right will LOOK jaggy on an interlaced input, because the frame of reference continues to move while the second half of the frame fills in.
1080i makes fast-paced action scenes not so hot compared to 1080p. Of course, there are a great many people who won't notice or care, and 1080i is PERFECTLY sufficient for slower-paced material, like nature shows and documentaries.
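Here's a toy model of why motion plus interlacing gives you those jaggies: the odd and even lines of one interlaced frame are sampled 1/60th of a second apart, so a moving edge sits at two different positions within the same frame (illustrative character-art only):

```python
# Toy model of interlace "combing": a moving vertical edge is sampled
# at two different times for the even and odd lines of one frame.

def interlaced_frame(edge_pos_field1, edge_pos_field2, width=8, height=6):
    rows = []
    for y in range(height):
        # Even lines come from field 1, odd lines from field 2,
        # captured 1/60 s apart while the edge moved rightward.
        pos = edge_pos_field1 if y % 2 == 0 else edge_pos_field2
        rows.append("#" * pos + "." * (width - pos))
    return rows

for row in interlaced_frame(3, 5):
    print(row)
```

The alternating row lengths are the comb-tooth pattern you see on moving edges; on a static scene both fields agree and the artifact disappears, which is why 1080i looks fine on slow material.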
Well I think 1080p is a ways off from being the standard... I don't know what I'd rather have. Smooth motion with no jaggies (720p) or increased detail (1080i).