Shitty Technology That's Still Around
I'm more in the "does work or doesn't work" camp, but I'm sorry, this argument is too easily destroyed.
If an interference-type issue counts as part of the "doesn't work," then what separates a digital cable from an analog cable, at least in terms of signal degradation?
What people are referring to is that with a digital signal, any kind of failure of the signal is going to be catastrophic. Yes, you can get tiling, wrong colors, or lines in the picture; you can get errors that aren't "absolutely no picture." But it's never going to be like an analog signal, where everything from the time of day to whether or not the neighbor is using the microwave determines a certain level of grain in the image.
grainy/fuzzy/static dies with analog video transmission.
all herald the era of "one moment please" "buffering" and "digital artifacting"
Ultimately there isn't much of a difference. The closest you'll get to a true digital transmission is via fiber. As long as the signal is moving over copper it's going to become an electrical signal at some point. The principles of interference don't care if the data is encoded as a digital signal or as an analog one. The only difference is that it's a lot easier to tell what is causing your interference with an analog signal rather than a digital one. Beyond that, there is no difference.
That's not even close to true. There is no error checking at all in HDMI or DVI. So if I put in interference that changes a string of zeros to a string of ones, then something in the display may or may not change. It could be I flip the color of a pixel, or I cause the pixel to become complete garbage. Maybe my interference is strong enough to get 1 pixel or 10% of your pixels or 100% of your pixels. Streaming via wi-fi will be a little different because it does have error checking, but if you're using your microwave you'll note that your frame rate goes to shit. Or you buffer.
I haven't read the current revision, but previous revisions of the HDMI spec strongly suggest that processing be used at any sink to prevent the display of errors. Strongly suggested isn't quite the same as "use this method," however, and you can find people who are having interference issues. And unless they've changed the error checking, the error checking would only prevent HDCP or the timing signals from being fucked up. The video itself isn't using any error checking.
What is a minor HDMI image mal-render?
Every error I have seen or read about has been major and would ruin my enjoyment.
If my component video signal is only 99.9% good I would be very happy. If my HDMI video signal is only 99.9% good I would have 0.1% bad, which equals roughly 2,000 white dots flickering randomly on every frame.
This might be a minor inconvenience for somebody else, but I would classify this as "Not working".
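For scale, a quick sketch of the arithmetic behind that figure, assuming a 1920x1080 frame (the resolution is my assumption; it isn't stated above):

    # rough arithmetic behind the "roughly 2,000 white dots" figure
    width, height = 1920, 1080
    pixels_per_frame = width * height            # 2,073,600 pixels
    error_rate = 0.001                           # "0.1% bad"
    bad_pixels = pixels_per_frame * error_rate   # ~2,074 flickering pixels per frame
    print(f"{bad_pixels:,.0f} bad pixels out of {pixels_per_frame:,}")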
Guess how many Monster cables we had?
Digital audio is a classic example of this.
At 0 dB, digital audio is at its peak; push past it and it stops working, resulting in nothing but distortion.
At 0 dB, an analog signal is at its peak too, but push past it and you can squeeze a bit more out of it, sometimes getting a warmer sound out of it (i.e. tape saturation).
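A minimal sketch of that difference, using NumPy and tanh as a rough stand-in for tape saturation (real tape behaves more subtly than this):

    import numpy as np

    # A 440 Hz tone pushed 50% past full scale ("0 dB")
    t = np.linspace(0, 1, 48000)
    signal = 1.5 * np.sin(2 * np.pi * 440 * t)

    digital = np.clip(signal, -1.0, 1.0)   # digital path: flat-tops the peaks, harsh distortion
    analog_ish = np.tanh(signal)           # tape-style saturation: rounds the peaks off gradually

    print(digital.max(), analog_ish.max()) # 1.0 vs ~0.9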
It's called recording hot. It's a physical way to achieve louder, fatter sound by causing the tape to expand on recording and contract back to normal size for playback. You can do the same thing with digital but it's called compression. It sounds pretty much the same when done well, but it's grossly, grossly overused most of the time. There were limits to how hot a tape could go before it saturated. With digital you can keep processing and keep processing until the track is so hot and fat that you feel like your ears are being assaulted.
Mostly the compression. Leaves no room for any of the instruments to have their own space so everything sounds like it's right between your eyes and hitting you with a nail gun.
I'm messing around with running foobar through the freeware floorfish VST plugin right now. I've also been considering getting a hardware unit to plug into my downstairs listening setup as well. EDIT: Although I'm also interested in finding a hardware unit with more comprehensive expanding features than that particular model....
The whole digital thing negates all this Monster cable stuff: not only do you either get the data or you don't, but in some bits of technology you also have the capability of requesting the data again. It'd be pretty stupid if the HDMI standard didn't have this, but I'm guessing it probably does. A parity bit.
Say it's sending data and some of that data, for some reason, is wrong. How do you tell that it is?
So, say I have some data that I've sent and I want a quick and easy way to make sure, with decent certainty, that the data is correct. Well, I use the parity bit. The parity bit is a single bit (0 or 1) to indicate that the number of the data sent (say 7 bits) is either an even or an odd number. If that data has been corrupted, chances are it may have changed the data in such a way that it is no longer even or odd. If, say, the parity bit indicates that the number is meant to be an even number and the data is in fact an odd number, then you know to request the data again.
It's not infallible, but it will catch a good 99% of data errors. Ever heard of parity memory? Same principle. That kind of memory is typically used in servers where the lack of data errors is pretty damn important.
HDMI does not have any sort of parity check in the video transmission. Nor does it specify any way to re-transmit broken video data. The interface is just pushing too much raw data at too low of a latency to be able to stop, ask for part of the last frame again, then move on. In addition, the displays aren't sophisticated enough to check. They fill a buffer, get the sync, and swap buffers; fill the buffer, get a sync, and swap buffers...
The actual video transmission is done with Transition-Minimized Differential Signaling. TMDS codes every 8 bits of data into a 10-bit value. The first 8 bits are the data. The 9th bit says whether the data has been XOR or XNOR transformed. The 10th bit says whether the data was inverted after the XOR/XNOR step. The data is sometimes inverted to maintain the same average voltage on the line.
The XOR/XNOR and inverts are applied to the data in such a way that no matter what the images look like, the signal going down the line has the minimum number of bit transitions (a transition occurs when a 1 is transmitted after a 0, or vice versa) to make sure the receiver's clock stays on time. Certain combos are disallowed because they would disrupt the transmission by building up a charge on the lines or making the receiver lose track of time. A few 8-bit values would produce satisfactory data for transmission no matter what transforms are done on them.
In these cases, one transform type will mean the pixel data while another type indicates a sync signal. Since there aren't that many sync or other signals needed, nearly half the 10 bit combinations you can send over HDMI are invalid. In that sense, HDMI video receivers will notice a bit out of place about half the time. Other times, it'll just draw a pixel with an incorrect color component.
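A simplified sketch of that transition-minimizing stage (my own illustration, not code from the spec; the DC-balance bookkeeping that picks the 10th bit is left out):

    def tmds_minimize_transitions(d: int) -> list[int]:
        # Simplified sketch of the TMDS transition-minimizing stage for one 8-bit value.
        # The DC-balance stage (running disparity, possible inversion, 10th bit) is omitted.
        bits = [(d >> i) & 1 for i in range(8)]   # bit 0 is processed first
        ones = sum(bits)
        q = [bits[0]]
        if ones > 4 or (ones == 4 and bits[0] == 0):
            for i in range(1, 8):
                q.append(1 - (q[i - 1] ^ bits[i]))   # XNOR chain
            q.append(0)                              # 9th bit: 0 flags XNOR
        else:
            for i in range(1, 8):
                q.append(q[i - 1] ^ bits[i])         # XOR chain
            q.append(1)                              # 9th bit: 1 flags XOR
        return q  # 9 bits; the real encoder adds the 10th (inversion) bit afterwards

    # A byte with lots of 1s gets the XNOR treatment to cut down transitions on the wire
    print(tmds_minimize_transitions(0b11101110))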
That seems like an odd way to do it. Only one of the bits in a number (2^0) determines whether the number is even or odd, so that would be the only bit that your parity bit would check. The whole rest of the number could be mangled and you wouldn't know it.
The parity bit indicates whether the number of high bits (ones) is even or odd. So the 8-bit number "01101000" is odd parity because it contains 3 ones.
That way, if any single bit is changed, the parity is no longer valid. It's not foolproof, though. If an even number of bits happen to flip, the parity check will still pass. The likelihood of this happening is pretty low, but it's enough that you shouldn't rely on parity alone if you need to guarantee 100% data integrity.
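A small sketch of that, counting one-bits the way the post above describes, including the blind spot where an even number of flips slips through:

    def parity_bit(data: int) -> int:
        # Parity bit for a byte: 1 if the count of one-bits is odd, 0 if it is even
        return bin(data).count("1") % 2

    original = 0b01101000                 # three ones -> parity bit is 1
    one_flip = original ^ 0b00000001      # a single flipped bit changes the parity: caught
    two_flips = original ^ 0b00000011     # two flipped bits leave the parity unchanged: missed

    print(parity_bit(original), parity_bit(one_flip), parity_bit(two_flips))  # 1 0 1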
Finally a relevant post. Thread was too zzzzzzzzzzzzzzzz
Look how dangerous they are now. Add a few million inexperienced pilots and traffic.
It's hilarious when shows and movies set in the future have flying cars in traffic jams, with a bunch of airborne cars hovering in a line as though people wouldn't try to get around that shit on the x and y axes.
I can see air traffic becoming viable if the majority of the car's control (if not all of it) is performed by the computer, and the car runs in computer-generated lanes that the government downloads to it. That's ignoring the technology needed to keep a bunch of metal objects perfectly airborne even in high winds and rain.
Didn't Futurama parody that in multiple episodes?
yep.
Biiig badda boom.
Let's play Mario Kart or something...
On the other hand, "road widening" means changing a few parameters in the traffic control system, so you can almost do it on the fly, certainly overnight.
It certainly makes more sense than Minority Report's ridiculous highways.
murderous central intelligence robot looked perfect to you?
ROBOT.
There is an HDMI->VGA dongle around that decrypts the HDCP signal and outputs to VGA. I wouldn't be surprised if there was something that could convert VGA->S-Video, etc.