I found myself on this forum while researching a plumbing product -- a search for reviews on the "IO Digital" shower valve brought up this thread. I own an A/V custom integration company. I've been in the industry since I was first out of engineering school, and I spent 12 years on the manufacturer's side before starting my current business, which is really a fancy way of saying I install A/V, automation, lighting, and remote controls. I just had to comment.
Since I'm not trying to sell anyone here any actual products, I'll be as up-front as possible. Take it for what it's worth... feel free to comment, complain, or tell me why you think I'm wrong.
Here are a few general points:
- Where you buy cables makes no difference: online or from a retailer. Most brands are available to anyone with a resale certificate.
- Brand of cable alone makes little difference: as with any product, most brands offer a good, better, and best tier.
- Cables are overpriced. Well, that's subjective. Yes, there is typically more margin in any accessory.
- "It's digital, 1's and 0's - it doesn't matter" = WRONG! Very wrong; ignorance abounds. See the following diatribe:
I certainly can't explain the detailed principles -- but I can explain the essentials in layman's terms. There are three primary concerns with digital signal transmission as it pertains to HDMI (though they apply to any type of transmission, really).
1. Voltage drop: Most signals are 5 Volt. The receiving device has a tolerance spec / threshold -- typically something like 4.6V
-- This means that the "1's" must arrive at 4.6V or better to be accepted... if your inexpensive cable uses 28AWG conductors (or smaller) to save $$ on copper, or to stay thin (like the Apple iOS device cables), the result is a very short maximum cable length. A 6' cable should be fine regardless.
-- If the transmitting device has an acceptable output tolerance of 4.8V - 5.2V, the worst case leaves you only 0.2V of acceptable attenuation in the cable... That's not much! This is not the same as comparing "lamp cord" to speaker cable with an analog signal at much higher voltages.
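To make that 0.2V budget concrete, here's a back-of-envelope sketch. The wire resistances are standard AWG copper figures; the ~10mA drive current is my own illustrative assumption, not an HDMI spec number. In reality, high-speed links lose signal mostly to high-frequency effects (skin effect, dielectric loss), so real usable lengths are far shorter than this simple Ohm's-law model suggests -- but it shows why thicker conductors buy you headroom.

```python
# Back-of-envelope sketch of the 0.2 V attenuation budget.
# Illustrative assumptions only; real HDMI losses are dominated by
# high-frequency effects, not DC resistance.

OHMS_PER_FOOT = {28: 0.0649, 24: 0.0257, 22: 0.0161}  # standard AWG copper values
DRIVE_CURRENT_A = 0.010   # assumed ~10 mA signal current (illustrative)
BUDGET_V = 0.2            # worst case from above: 4.8 V out, 4.6 V threshold

def max_length_ft(awg):
    """Longest run before resistive (DC) loss alone eats the budget.
    The factor of 2 accounts for the round trip on the signal pair."""
    drop_per_foot = DRIVE_CURRENT_A * OHMS_PER_FOOT[awg] * 2
    return BUDGET_V / drop_per_foot

for awg in (28, 24, 22):
    print(f"{awg} AWG: ~{max_length_ft(awg):.0f} ft before the 0.2 V budget is gone")
```

The trend is the useful part: stepping from 28 AWG down to 22 AWG cuts resistance roughly fourfold, so the same budget stretches about four times as far.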
2. Signal skew: This is complicated -- there are many factors at play: timing, amplitude, bit error rate.
-- If the signals on the multiple pairs within the cable are not time-aligned, of the proper amplitude, and carrying a clean signal-to-noise ratio -- you don't get a picture, or you get artifacts.
-- Shielding / Isolation: This is where you see the differences in the good, better, best products. This is also where you could begin to assert that there's no difference between better and best. It is simply about the environment. Are there EMI issues? Are you willing to hot-plug your components if the HDCP handshake misses the first time? A better quality cable has countermeasures built in... such as a small "equalizer" on the receiving end that actually takes power from the receiving device and balances the eye pattern before the signal gets to the display.
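To put numbers on the timing point: a small sketch of how a physical length mismatch between pairs turns into skew. The ~1.5 ns/ft propagation delay (typical for cable with a ~0.66 velocity factor) and the 3.4 Gbps per-lane rate for "High Speed" HDMI are my assumed figures, not anything measured from a specific cable.

```python
# Rough skew estimate from pair-length mismatch.
# Assumes ~1.5 ns/ft propagation delay (a typical cable figure,
# not an HDMI spec value) and a 3.4 Gbps per-lane bit rate.

PROP_DELAY_NS_PER_FT = 1.5
BIT_RATE_GBPS = 3.4  # per-lane rate for HDMI 1.3/1.4 "High Speed"

def skew_ps(mismatch_inches):
    """Timing skew caused by one pair being physically longer than another."""
    return PROP_DELAY_NS_PER_FT * 1000 * (mismatch_inches / 12.0)

bit_period_ps = 1000 / BIT_RATE_GBPS
print(f"Bit period at {BIT_RATE_GBPS} Gbps: {bit_period_ps:.0f} ps")
for mismatch in (0.1, 0.5, 1.0):
    print(f'{mismatch}" length mismatch -> ~{skew_ps(mismatch):.0f} ps of skew')
```

Under these assumptions, a one-inch mismatch costs about 125 ps -- nearly half of a ~294 ps bit period -- which is why pair-length matching inside the cable matters at all.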
To summarize, a poorly constructed cable can pass at short lengths, without adapters, in-wall plugs, or extensions, and with hardware that is within spec to begin with... but as the cable gets longer, it must use lower-gauge wire pairs (24 or 22 AWG), better quality shielding, and connectors that fit snugly and maintain the EM isolation.
3. HDMI Spec 1.3 / 1.4 "High Speed" -- there have been several revisions to the standard. Newer cables can carry Ethernet, an audio return channel, and higher resolutions (a.k.a. "4K"); these are all contributing factors. A pre-"1.3" spec cable might work fine with your cable box -- but not with your new Blu-ray player.
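The cable-box-vs-Blu-ray point comes down to bandwidth. A rough comparison, using total TMDS bit rates (3 data lanes x 10 bits per clock); the cable ratings are the usual certification points and the source rates are common video modes -- treat all the figures as approximate:

```python
# Rough bandwidth check: why an older "Standard" cable can handle a
# 1080i cable box but choke on a newer source. All figures approximate.

CABLE_RATING_GBPS = {
    "Standard (pre-1.3 era)": 2.23,  # certified around 1080i (74.25 MHz clock)
    "High Speed (1.3/1.4)":  10.2,   # certified at 340 MHz clock
}

SOURCE_NEEDS_GBPS = {
    "cable box, 1080i":   2.23,
    "Blu-ray, 1080p/60":  4.46,
    "4K (2160p/30)":      8.91,
}

for cable, rating in CABLE_RATING_GBPS.items():
    for source, need in SOURCE_NEEDS_GBPS.items():
        verdict = "works" if rating >= need else "likely fails"
        print(f"{cable:>22} + {source:<18}: {verdict}")
```

Same connector on both cables, very different headroom -- which is exactly why "it's just 1's and 0's" falls apart once the source outruns what the cable was built for.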
Cable debates are subjective, like arguing over wine. There is science, though -- there is an objective way to compare them. Unfortunately, there is also marketing, and there are people selling overpriced garbage simply because higher-priced alternatives make the crap look like a good deal. Buyer beware -- but please, don't be ignorant if you're going to share opinions.