# Sticky: Video Calibration Q&A



## Bear5K

I thought I might open up a calibration Q&A thread for people considering calibrating their video chains. Fair warning: I do sell software that does this, but our own forum is the place for tech support for our product. I also don't do tech support on specific models; sorry, but I have not "seen them all". Far from it, actually. What I can answer are questions on meters, techniques, and the theory/practice of some key aspects of video calibration that seem to trip people up.

Bill


----------



## Sonnie

Thanks Bill... we appreciate your knowledge.


----------



## lcaillo

That would be GREAT! Just refer the hardware specific questions to the Manufacturer Service and Support Forum and I will see what I can come up with for help from that perspective. We will try to keep the discussion focused on general questions about calibration, not support for your product nor for specific hardware. 

I have some reference links at

http://www.hometheatershack.com/for...rmation/5573-video-calibration.html#post43531

that you might want to review and suggest changes to. Your expertise and willingness to share your experience is much appreciated.


----------



## Trekari

I, for one, greatly appreciate a general Q&A thread about calibration.

Over the past week, I borrowed a colorimeter from a local graphics shop and did a newbish effort at calibrating my display. I managed to get my grayscale rather accurate, but despite the picture looking much better than uncalibrated, there was a lot left to be desired.

So I'm going to do a factory reset and start over.

Questions:

1) What is the first thing that should be adjusted? Contrast level to ~35 fL for a plasma, then brightness?

2) Should ALL user controls be centered to 0 in order to set the above values in a service menu? (Contrast and Sub-Brightness) 

3) Is the "smart method" as I've heard it mentioned the best way to do brightness? i.e. Dropping red/blue bias/cut/offset/etc and only adjusting the green value, then bringing red/blue back up to proper levels? (similarly, using red gain as your 'max' level and adjusting green/blue on the top end once you determine red clipping levels)

4) If green is 71% of the brightness of a display and you set the overall Brightness control to a proper level, what should be done once you bring blue/red back up and the brightness has changed? Drop red/blue back down and lower green little by little until Brightness is proper with all cuts set appropriately?

5) How often during a calibration session should you "recalibrate" the sensor, assuming the use of an inexpensive colorimeter such as Eye-One Display2/LT?
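(As an aside on question 4: the "71%" figure lines up with the Rec. 709 (ITU-R BT.709) luma coefficients, which weight green at 0.7152 of total luminance on an HD display. A quick sanity check in plain Python, no calibration software assumed:)

```python
# Rec. 709 (HD) luma coefficients: the share of total luminance that
# each primary contributes on a correctly calibrated display.
R_COEFF, G_COEFF, B_COEFF = 0.2126, 0.7152, 0.0722

for name, coeff in (("red", R_COEFF), ("green", G_COEFF), ("blue", B_COEFF)):
    print(f"{name:>5}: {coeff:.2%} of luminance")

# The three coefficients sum to 1.0; green alone carries ~71.5%,
# which is where the "green is 71% of the brightness" rule of thumb
# comes from.
```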

I'm surprised nobody else has asked questions in this thread; hopefully for me that means I'll get all the attention. 

Thank you to anyone with the knowledge to answer!
-Jason


----------



## fibreKid

Short of spending $2-3K for ISF training, what's a more modestly priced way to learn about the procedure and theory of calibration? I read through the CalMAN online info a while back and it filled in some holes, but I would really like to understand this stuff. I've had my TV calibrated, so I won't be playing with mine in the near future, but I would still like to learn. 

Thanks much
-john

I'm checking out the links posted by lcaillo.


----------



## Bear5K

Trekari said:


> I, for one, greatly appreciate a general Q&A thread about calibration.
> 
> Over the past week, I borrowed a colorimeter from a local graphics shop and did a newbish effort at calibrating my display. I managed to get my grayscale rather accurate, but despite the picture looking much better than uncalibrated, there was a lot left to be desired.
> 
> So I'm going to do a factory reset and start over.
> 
> Questions:
> 
> 1) What is the first thing that should be adjusted? Contrast level to ~35 fL for a plasma, then brightness?


Brightness first. It has a bigger effect on the top end than a proper contrast control has on the low end.



> 2) Should ALL user controls be centered to 0 in order to set the above values in a service menu? (Contrast and Sub-Brightness)


There can be a significant interaction effect between SM controls and User Menu controls. Generally, try to stay out of the SM if at all possible. Once you are in, you are going to have to figure out the interaction.



> 3) Is the "smart method" as I've heard it mentioned the best way to do brightness? i.e. Dropping red/blue bias/cut/offset/etc and only adjusting the green value, then bringing red/blue back up to proper levels? (similarly, using red gain as your 'max' level and adjusting green/blue on the top end once you determine red clipping levels)


I prefer to use the brightness control and a PLUGE pattern. I come back to check to make sure it hasn't been elevated or lowered too much after doing grayscale.



> 4) If green is 71% of the brightness of a display and you set the overall Brightness control to a proper level, what should be done once you bring blue/red back up and the brightness has changed? Drop red/blue back down and lower green little by little until Brightness is proper with all cuts set appropriately?


See my answer to #3, above.



> 5) How often during a calibration session should you "recalibrate" the sensor, assuming the use of an inexpensive colorimeter such as Eye-One Display2/LT?


D2/LT? Once at the beginning.



> I'm surprised nobody else has asked questions in this thread; hopefully for me that means I'll get all the attention.
> 
> Thank you to anyone with the knowledge to answer!
> -Jason


Looks like you got the ball rolling.


----------



## Bear5K

fibreKid said:


> Short of spending $2-3K for ISF training, what's a more modestly priced way to learn about the procedure and theory of calibration? I read through the CalMAN online info a while back and it filled in some holes, but I would really like to understand this stuff. I've had my TV calibrated, so I won't be playing with mine in the near future, but I would still like to learn.
> 
> Thanks much
> -john
> 
> I'm checking out the links posted by lcaillo.


We have now made CalMAN v3 downloadable by the general public, and not to be too biased, but the help system in it is probably the best resource for learning about calibration. We took the information in the original v1 document and added to it extensively.

You can download the application from here:
www.calman.tv/downloads/calmanv3_beta.msi

If this doesn't help, then the next step is either ISF or spending a few hundred on some video engineering texts and learning how to do it inductively. 

Bill


----------



## <^..^>Smokey Joe

Hi. From my point of view, when I first started calibrating it was the CalMAN V1 help which was, for me, the most informative and understandable information I found while trawling the web. I have found the V3 (beta) information and help an excellent, progressive step forward. I am not saying this because I use the software and get any reward from CalMAN; the rewards I get are from the excellent results I achieve with the software. I have to pay for it like everyone else (more $, actually, as my NZ dollar sucks).


ISF is more about buying into a support system of combined experience. You would only do this if you were looking at being a calibrator of displays and wished to be more professional about your image (in more ways than one). Many who do the ISF seminar are in the AV game and just want to understand more about what they do. It does cost, though, and it is up to the individual to weigh up or justify the outlay and return from the investment.

One never stops learning; painfully, the more you know, the more you know you don't know.
On the other hand, you don't need to learn electronics to calibrate displays. In all reality, you just need an understanding of what is happening as a net result. 

Order of calibration:
Just remember that all adjustments are interactive, and it is this key element of total balance which possibly classes the calibrator in the professional sense.

I don't think there are any hard-and-fast rules on sequence; however, brightness and contrast are the starting point for most if not all.
(Again, the CalMAN software helps you through this; other packages struggle here, free or otherwise.)
Personally, I clump areas of the calibration process into groups. My reasoning is that looking at one point of view (one graph alone) gets you into trouble.

Calibration order is a lesser concern than the reference material you use and how accurately it can be processed. Your measurements are firstly only as accurate as your reference material; then add the error from your measurement device. This is why video generators hold such high value: they are the most accurate reference for the calibrator. They also allow the calibrator to isolate the video chain, which in turn can solve problems or issues elsewhere.
Calibration DVDs are only as good as the device being used to play them. Electronics have a general tolerance of 10% in their common components, so the average output can vary (my personal SD DVD player chopped off 10% of the lower-end signal, for example).
On the other hand, some gear can be good too, but it is an unknown quantity unless checked against reference equipment.

This is how I rank the tools of the trade.

1. Reference material. A video generator 1st, a referenced PC output 2nd, and calibration DVD material 3rd. I use a video generator and calibration DVDs.

Reference material in general is ranked 1st.

2. Measurement probe. There are many arguments about what is the best probe. Personally, I have built a number of the HCFR probes, used a Spyder2, and now use an X-Rite i1 Pro. I found the net results of my work better with the i1 Pro; it is currently the best all-rounder, and it is also the only affordable (for the everyman) NIST-traceable referenced probe. If you are serious, I recommend the i1 Pro. CalMAN (SpectraCal) also rate the D2 probe very highly; I may get one of these some day as a secondary backup.

Measurement probe is ranked 2nd.

3. Software. Without the first two being accurate, the software is rendered useless; that is why I place it third. I currently use the CalMAN V3 product and will be moving to the Pro version when V3 moves from beta to general release. 
Remember, the software is really only a reporter of information. Like any software, we can have our favourites, which is usually based on the product working the way your head works; some people call that intuitive. Whatever the reason, for me the software needs to be flexible and do the job, which is to help you calibrate the display. Some products require a lot of prior knowledge, some are flashy, one is currently free. I have used the free one with the DIY probes, Sencore software briefly, Datacolor software, and CalMAN software, both V2 (Excel-based) and V3 beta. For me, CalMAN is excellent hitting power for the buck. The free software is excellent for learning, though, when one is in the early stages of the learning curve.

Software I rank third.

The fourth element is the calibrator him- or herself. Here there is no rank per se; however, one must have an understanding of all of the above and of what he or she is trying to achieve.

Regards
:wave:


----------



## fibreKid

Thank you Bear5K, I'll check it out.
-john


----------



## Trekari

Thank you very much for the replies! When my new meter arrives Wednesday I'll have to drag my PC into the living room and play for a few hours.

Hopefully the new meter arrives long before the new credit card bill.


----------



## Guest

OK, so, I run a business installing home theaters. Been doing it for a long time, but I want to take the next step.

I've looked into the ISF course, and was wondering if the CalMAN Pro software with an Eye-One Pro will be fine for LCD, plasma AND projection (I can never seem to get a straight answer when I search on probes).

And, secondly, is it to "ISF standards"? I'm sure the ColorFacts Pro software is wonderful, but geez... it's pricey.

And finally, how does a Spyder3 compare to the Eye-One Pro?

Thanks for the help...

Z...


----------



## Bear5K

Zee said:


> OK, so, I run a business installing home theaters. Been doing it for a long time, but I want to take the next step.
> 
> I've looked into the ISF course, and was wondering if the CalMAN Pro software with an Eye-One Pro will be fine for LCD, plasma AND projection (I can never seem to get a straight answer when I search on probes).


Yes, but with a caveat. If you have to use only one meter as a Pro due to budget constraints, then the i1 Pro is it. After it, you start to get into meters costing several thousand dollars (US Pesos, really; the last time I was in Australia, the Aussie dollar was at US$0.67). The caveat is that this leaves you with a single point of failure for your business (they ought to be recalibrated every year, and they do break occasionally). Many of our professionals like to use a meter like a Chroma 5 or DisplayLT as a lower cost "back-up", that also helps with setting the grayscale controls on the new, higher-contrast direct-view displays (e.g., Pioneer Kuro). We sell kits at a lower cost for either of these options (i1 Pro + Pro software; i1 Pro + Chroma 5 + Software) on our website.



> And, secondly, is it to "ISF standards"? I'm sure the Colorfacts pro software is wonderful, but geez.. it's pricey.


This is essentially why we got into business. With all due respect to Joel Silver, the ISF does not actually set the standards. The standards are set by organizations like the Society of Motion Picture and Television Engineers (SMPTE; I'm a member), the International Telecommunication Union (ITU) and the European Broadcasting Union. In fact, any time you see "Rec. 709" (aka HD), this is really shorthand for ITU-R Recommendation BT.709-XX (with XX being the current revision number).

We are trying to get onto the ISF "approved" list, but they had a six month backlog of paid evaluation work they were trying to complete. Expect to see this happen eventually.



> And finally, how does a Spyder3 compare to the Eye-One Pro?


It's a decent option for LCDs or DLPs, but we can't recommend it for CRT and plasma use yet. I also would not use it with any type of "spiky" light source (e.g., LED-based light engines). We also tend to steer professionals away from meters that are available at the local computer shop. There's a bit of a branding issue here (fee justification).

HTH!

Bill


----------



## Guest

Excellent - that pretty much covers all I need to know.
Also, thanks for enlightening me on the ISF/SMPTE and standards issue.

I did notice the package price for the pro software and i1pro was particularly good. Local Aussie pricing is insane, in particular considering how well placed our currency is right now.

I expect to be placing an order soon. Thanks for your help.
Cheers,

Z...


----------



## Bear5K

Download the evaluation software, as well. The documentation should give you a jumpstart on all of this.


----------



## Guest

Good idea - thanks for the suggestion.

Cheers 

Z...


----------



## Rambo4

I am unsure if my question here is out of place as I have no real knowledge of a lot of what has been said above. 

I do know that I have a display issue with my projector, and I am unsure if it is something mechanical or just my setup. I have an IN76 projector viewed on a 92" screen; distance to the screen is approx. 12 feet. Everything looks as it should during video calibration tests, but I have this slight dip in the lower right-hand corner of the image. As far as I can tell there is no adjustment in the menu for fixing one corner of the image. I do see it there when I run test images. The screen is flat and bordered (Screen Innovations), so that shouldn't be the culprit. It's not really noticeable when watching a movie; as I said, it is slight and gets worse toward the corner.

Should I get out my level? Or should I call the tech guys at Infocus for their opinion on a repair? :sweat:


----------



## lcaillo

Please start a new thread for your problem. 

I'd start by checking centering and level. Look very carefully at the other side. You will likely see a similar problem.


----------



## mechman

When calibrating a projector, how far away should I place the meter? This is for calibration with the meter facing the pj. Is it better to be closer to the screen or closer to the pj? Or does it even matter?


----------



## glaufman

I'm in the process of acquiring materials to build an HCFR probe, because it's cheaper than the i1 Pro and I've been told it has better performance (especially at low light) than the i1D2/LT... I understand I'll need another probe to calibrate it against; realistically, that means it'll only be as accurate as the probe I compare it to. I have very limited access to a DTP94 (the limited nature of this access is why I need another probe), but I digress...

Can anyone testify to the performance (both in general and at low light) of the DTP94 vs. the i1 Pro vs. the i1D2/LT? Not much point in building a probe that's good at low light when I have to calibrate it with one that isn't!

Also, if anyone wants to challenge the assertion that the HCFR can be superior at low light to either the Pro/D2 I'm willing to listen...


----------



## lcaillo

With all due respect to Bill, who started the thread, there is a new kid on the block with respect to video calibration software. Tom Huffman, whom some of you may know for his contributions to discussions on other forums, has a new product called ChromaPure. I have not used it, but you can see demos of it on his site, and there is an FAQ that is well written and concise on the site as well. 

http://www.hometheatershack.com/forums/parts-test-equipment-suppliers-database/20910-chromapure.html

Like his competitors at SpectraCal (CalMAN), Tom is one of the guys that is pushing the calibration business forward, trying to make better sense of it all both at the level of enthusiast and professional. The software seems targeted to enthusiasts, but I suspect that it will not be long until a more advanced version is available.


----------



## Bear5K

We look forward to competing against him.


----------



## lcaillo

Competition keeps the bar high. Keeps you on your toes. Good for the soul.


----------



## glaufman

So I've been trying to find the time to play around with some of my TV's settings and my colorimeter, to try to answer some of my own questions about exactly how some of the "classic" calibration graphs are affected by each of the different controls. Suffice it to say my life just keeps getting busier and I can't find the time, so I thought I'd begin picking the brains of those who already know. Not as much fun, but certainly easier, and then others can see and learn as well.

So this might be the first of many, but I'll space them out to give you guys a break...

Let's take the normalized luminance curve (% luminance vs. % stimulus) and how it's affected by changing the gamma of a display... I've often thought of this curve as a rope strung between two points, those being black (0% stim) and white (100% stim)... The analogy goes that as you increase gamma, it's like paying out more rope, so the curve hangs lower; as you take in rope, you decrease gamma and the curve hangs higher...

Taken by itself, is there any particular flaw in thinking about it in this way?


----------



## lcaillo

That would be a good rough analogy. The important thing to get about gamma, however, is that there are many variables at play that can be very different with different sets. Gamma needs to be thought of as a complete curve, one that can be complex and may not be described by a single average value. The whole rope is important, and the stiffness of the rope may vary along its length. Gamma may interact with other adjustments, and the exact nature of those interactions may be very different on your set than on others.
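The "complete curve" point can be made concrete by computing a "point gamma" at each stimulus step rather than one average value. A minimal sketch; the readings below are invented for illustration (roughly a 2.2 display with slightly lifted shadows), so substitute your own normalized meter data:

```python
import math

# Normalized luminance readings (peak white = 1.0) at 10%..90% stimulus.
# These values are made up for illustration only.
stim = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
lum  = [0.008, 0.031, 0.073, 0.134, 0.217, 0.324, 0.455, 0.612, 0.795]

# Point gamma at each step: the exponent g satisfying lum = stim ** g,
# i.e. g = log(lum) / log(stim).
for s, y in zip(stim, lum):
    g = math.log(y) / math.log(s)
    print(f"{s:4.0%} stimulus: point gamma = {g:.2f}")

# A single "average gamma" number would hide the droop at the low end
# of this curve, which is exactly the kind of detail that matters.
```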


----------



## Bear5K

As lcaillo points out, it's a good rough analogy. However, depending upon where you are in the stimulus range, you may or may not want to make different types of adjustments. In some cases, you may want to lower the "average" gamma value to increase shadow detail; other people may want to elevate it to improve black level. The interplay of shadow detail and black level may differ entirely from what you want to do in the mid-tones and highlights. It's far better to get a reasonably granular view of gamma to correlate what you are seeing, what you want to see, and the effects of specific changes to converge the two.


----------



## glaufman

OK, so your responses made me think so hard it took way longer to ask the next question than I had intended... Thanks!
I appreciate the point about not all machines operating the same way, some being "stiff" at certain levels, and the like; I hadn't thought about that. It's my hope to better understand how things should work in the theoretical world, as well as how often (or not) that applies in the real world.
To that end, I think you both said, yes, that's a good theoretical model even though it holds up only to some degree in the real world.
So the next question: I believe I've read somewhere that on some displays that don't have gamma adjustments per se, you can sometimes slightly tweak gamma by adjusting brightness and contrast. So I ask, using the rope analogy: looking at a luminance curve (not yet normalized), as you increase contrast you would expect the high end of the curve to rise...
The question is, as it rises, what happens to the length of the rope? Does it stay the same, giving less contour to the curve? Does it elongate, but not proportionately, so there's less contour to the curve? Or does it elongate to the point where the contour stays the same?
I'm sure you can guess the next question, which is what then happens when looking at the normalized luminance curve, but I think I've answered that one already... I think the answer is that in the ideal world, since you're not adjusting it specifically, the gamma would stay the same and the before/after curves would overlap, assuming you weren't driving one into run-out or the like...?


----------



## Bear5K

glaufman said:


> OK, so your responses made me think so hard it took way longer to ask the next question than I had intended... Thanks!
> I appreciate the point about not all machines operating the same way, some being "stiff" at certain levels, and the like; I hadn't thought about that. It's my hope to better understand how things should work in the theoretical world, as well as how often (or not) that applies in the real world.
> To that end, I think you both said, yes, that's a good theoretical model even though it holds up only to some degree in the real world.
> So the next question: I believe I've read somewhere that on some displays that don't have gamma adjustments per se, you can sometimes slightly tweak gamma by adjusting brightness and contrast. So I ask, using the rope analogy: looking at a luminance curve (not yet normalized), as you increase contrast you would expect the high end of the curve to rise...
> The question is, as it rises, what happens to the length of the rope? Does it stay the same, giving less contour to the curve? Does it elongate, but not proportionately, so there's less contour to the curve? Or does it elongate to the point where the contour stays the same?
> I'm sure you can guess the next question, which is what then happens when looking at the normalized luminance curve, but I think I've answered that one already... I think the answer is that in the ideal world, since you're not adjusting it specifically, the gamma would stay the same and the before/after curves would overlap, assuming you weren't driving one into run-out or the like...?


Since in theory changing either the brightness or the contrast control ought not to change the gamma, for reasonable values of all three, there really isn't a hard-and-fast rule. A properly engineered display will, if you compress the dynamic range too much, lower the effective gamma to maintain evenness between black and white. In other words, the gamma will fall as black and white get closer together (e.g., brightness too high, contrast too low). That being said, with current technologies, a well-engineered display ought never to get that far, short of "stupid human tricks".

Ideally, you would want the converse to be true as well: as the dynamic range increases, the gamma value increases, at least up to a point. If you have a "bat cave", then you can appreciate and use an average gamma that is quite a bit higher than one meant for a living room with a fair bit of ambient light. Ambient light requires an elevated black level to maintain a visually desirable amount of shadow detail, and given a fixed maximum brightness, this means you end up losing dynamic range as a result.
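To put a rough number on that compression effect, here is a small sketch. The luminance model and the 2.4 "native" gamma are assumptions chosen for illustration, not measurements from any real display; the point is only that elevating the black level while holding peak white fixed pulls the measured average gamma down:

```python
import math

STEPS = [i / 10 for i in range(1, 10)]  # 10%..90% stimulus
NATIVE_GAMMA = 2.4                      # assumed panel response (illustrative)

def apparent_gamma(black: float, white: float = 1.0) -> float:
    """Average measured gamma when the black level is elevated.

    Model: Y = black + (white - black) * stim ** NATIVE_GAMMA.
    A meter sees Y relative to peak white, so the apparent per-step
    gamma is log(Y / white) / log(stim).
    """
    gammas = [
        math.log((black + (white - black) * s ** NATIVE_GAMMA) / white)
        / math.log(s)
        for s in STEPS
    ]
    return sum(gammas) / len(gammas)

# Black level as a fraction of peak white: 0% (ideal) up to 5%.
for black in (0.0, 0.005, 0.02, 0.05):
    print(f"black = {black:5.3f} -> apparent gamma ~ {apparent_gamma(black):.2f}")
```

With black at zero the apparent gamma equals the native 2.4; as the black floor rises, the measured average slides downward, which matches the "gamma falls as black and white get closer together" behavior described above.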


----------



## glaufman

So then, on a properly engineered display, for "reasonable" values of contrast and brightness, you might expect increasing contrast or decreasing brightness to increase the gamma slightly, but probably by very little? If so:
Would both affect the whole range equally, or would contrast affect the gamma more at the high end and brightness more at the low end?
And how many modern displays are "properly engineered" in this way? As in, how high-end do you have to go to find this behavior, versus how low-end do you have to go to find these controls affecting gamma more than they should, or in ways they shouldn't?


----------



## lcaillo

I would consider a properly engineered display to be one that accounts for non-linear luma in the contrast and brightness controls so that they have no effect on gamma. In practice, that relationship is hard to get right, as there are multiple variables at play: not just the video relationships, but also the display technology and its implementation.

In practice, it just depends on the display. It is very hard to generalize: the effects vary considerably from one manufacturer to another, and even within brands.


----------



## glaufman

Is that one of the contributions that make the Pio Elites look so good?


----------



## Bear5K

glaufman said:


> And how many modern displays are "properly engineered" in this way? As in, how high-end do you have to go to find this behavior, versus how low-end do you have to go to find these controls affecting gamma more than they should, or in ways they shouldn't?


Good video engineering isn't necessarily cost-prohibitive on the BOM, AFAIK. What might push it higher into the food chain is the cost of the people and giving them the time to do it right.


----------



## glaufman

Perhaps, but I always assume that increased time = increased development costs = increased production price... ROI and all... no?


----------



## Bear5K

Selling price is whatever the market agrees to pay, and time-to-market is a critical piece of that puzzle. As a result, many manufacturers may not be able to spend the incremental cycles trying to get the engineering right while they are busy pushing next year's model through the development pipe. 

As I said, the increased cost usually isn't reflected in the BOM (i.e., the components and assembly costs), but in the cost of development itself.


----------



## glaufman

Bear5K said:


> Selling price is whatever the market agrees to pay, and time-to-market is a critical piece of that puzzle. As a result, many manufacturers may not be able to spend the incremental cycles trying to get the engineering right while they are busy pushing next year's model through the development pipe.
> 
> As I said, the increased cost usually isn't reflected in the BOM (i.e., the components and assembly costs), but in the cost of development itself.


Well, this is getting off-topic, but if MFRs can't get the volume they need at the price they need to recoup the cost of the development itself, the business model isn't viable. The return must recoup the investment. Whether it's reflected in the BOM is immaterial. The BOM is just the beginning of costing a product.


----------



## Bear5K

glaufman said:


> Well, this is getting off-topic, but if MFRs can't get the volume they need at the price they need to recoup the cost of the development itself, the business model isn't viable. The return must recoup the investment. Whether it's reflected in the BOM is immaterial. The BOM is just the beginning of costing a product.


It's very much material, pardon the pun. BOM costs are roughly equivalent to variable costs, whereas development engineering costs are fixed (insensitive to volume). Given that a lot of display development is done as assembly and integration, especially at the low-end of the market, if you can trace the components back to their source, you can probably find a vendor who is "not doing it right" and then extrapolate to other, similar displays using the same chipsets.

After all, this tangent was started as a theoretical question that kept trying to get back into the specifics of what's available in the market.


----------



## glaufman

Very punny. You do make some good points. But even so, I was asking more about the higher end of the market. Also, wouldn't the chipsets from companies "doing it right" cost more? As in, why else would chipsets "doing it wrong" get bought? Unless the lower-cost MFRs don't bother testing whether it's right or wrong, but that again implies the ones who do would have higher development costs...

I suppose at some point, to avoid this going further OT, I'll have to simply accept what you're saying even though it doesn't make sense to me... :huh:

Maybe I should start a new thread to discuss this further. I really don't want to cloud the otherwise-very-useful "calibration" thread with this talk...


----------



## Bear5K

glaufman said:


> Very punny. You do make some good points. But even so, I was asking more about the higher end of the market. Also, would the chipsets from companies "doing it right" not cost more, as in why else would chipsets "doing it wrong" get bought? Unless the lower cost MFRs don't bother testing whether it's right or wrong, but that again implies the ones who do would have higher development costs...
> 
> I suppose at some point, to avoid this going further OT, I'll have to simply accept what you're saying even though it doesn't make sense to me... :huh:
> 
> Maybe I should start a new thread to discuss this further. I really don't want to cloud the otherwise-very-useful "calibration" thread with this talk...


This thread has been relatively lightly traveled, so I don't mind going into a bit more detail. The higher-end companies have often done their own in-house engineering work, while the lower-end companies basically buy "kits" and assemble them (this is a gross oversimplification, but work with me...). If I'm a Sanyo or Westinghouse or Vizio, then I buy a package from someone like Sigma, who then gets to spread its own engineering talent around a lot of different models from a lot of different OEMs. Conversely, if I'm someone like Sharp or Panasonic, then I (might) use internally-developed sub-assemblies (e.g., Uniphier) and spread those over large swathes of my product range. At that point, developing a specific model is about setting options in a firmware platform and tying it to the hardware.

The companies that got caught out by the above business/development models were ones like Fujitsu or Pioneer, which had low volume AND did a lot of custom development. Those manufacturers are basically all gone now.


----------



## glaufman

Ah. So what makes the difference (sometimes, at least) is the MFR making a conscious decision to disable or not use certain things they've developed in order to offer a lower-grade product at a lower price point; basically, positioning. I have no problem buying that (pardon my own pun). But don't they tend to roll out improved processing/features in the higher-end sets first, and after a while (a year or more) bring those down to the lower-end models once they have something better for their high-end models, in effect charging more for the best or most up-to-date stuff? And isn't that equivalent to the "early adopters" of said improvements paying for the proprietary development costs?


----------



## Bear5K

glaufman said:


> Ah. So what makes the difference (sometimes at least) is the MFR making a conscious decision to disable or not use certain things they've developed in order to offer a lower-grade product at a lower price point -- basically positioning. I have no problem buying that (pardon my own pun). But don't they tend to roll out improved processing/features in the higher-end sets first, then bring those down to the lower-end models after a while (a year or more) once they have something better for the high end? In effect they charge more for the best or most up-to-date stuff, so isn't that equivalent to the "early adopters" of said improvements paying for the proprietary costs?


Now we really do start going off the reservation. Don't necessarily mistake a generalized firmware process as either a) the only way to develop a TV set, or b) necessarily being solely a software process. In many cases, you not only "turn off" something in software, but entire sub-assemblies are then not included in hardware. If you pull the boards out of a TV, you can often find "blank" soldering pads and even connectors where something might have gone to create a higher-end TV.

As for early adopters paying higher prices, that's basically a given. The question is how much for what features. I am distinctly not a fan of the 240Hz LCDs, but for other people they are a must-have item. The way semiconductor manufacturing works, those panels might undergo exactly the same manufacturing steps to create "the glass" as a panel that only targets a 60Hz refresh rate, but since it passes a tighter quality check at the end, it gets "binned" into the 240Hz stack. The economic cost is the same, but because of the miracle of the bell curve and supply/demand, people pay more for it. The company producing it may then load more of the "costs" of production onto that panel, rather than another one, but the economic processing through a certain point would be similar.

The above is a GROSS oversimplification of semiconductor (and LCD panel) production processes, but you get the idea. The biggest misconception that many people have of "digital" electronics is that binary math rules everything -- that all things electronic are exactly the same (1 or 0) when they are built. Such a concept doesn't hold up anywhere along the production chain, which is why you have "binning" and why, wait for it, copying settings from one display to another is more likely to be wrong than it is to be right.

How's that for getting us back on topic? :help: :sneeky: :paddle: :T


----------



## glaufman

Bear5K said:


> In many cases, you not only "turn off" something in software, but entire sub-assemblies are then not included in hardware. If you pull the boards out of a TV, you can often find "blank" soldering pads and even connectors where something might have gone to create a higher-end TV.


But that would bring us back to a real BOM difference. :bigsmile:


> As for early adopters paying higher prices, that's basically a given. The question is how much for what features. I am distinctly not a fan of the 240Hz LCDs, but for other people they are a must-have item. The way semiconductor manufacturing works, those panels might undergo exactly the same manufacturing steps to create "the glass" as a panel that only targets a 60Hz refresh rate, but since it passes a tighter quality check at the end, it gets "binned" into the 240Hz stack. The economic cost is the same, but because of the miracle of the bell curve and supply/demand, people pay more for it. The company producing it may then load more of the "costs" of production onto that panel, rather than another one, but the economic processing through a certain point would be similar.


Ah, similar to the precision resistors I try to avoid designing in where I can avoid it...


> Such a concept doesn't hold up anywhere along the production chain, which is why you have "binning" and why, wait for it, copying settings from one display to another is more likely to be wrong than it is to be right.
> 
> How's that for getting us back on topic? :help: :sneeky: :paddle: :T


Bravo, Maestro! :T I actually saw that coming just a few seconds early, but kudos nonetheless!


----------



## Bear5K

glaufman said:


> But that would bring us back to a real BOM difference. :bigsmile:


I didn't say that there wasn't a cost difference between more fully-featured sets vs. more basic sets. I only said that having a properly-engineered color decoder didn't necessarily require more expensive components, merely more expensive engineers. Given the economies of scale, I doubt that there are too many companies cranking out cheap sets using proprietary solutions, but I don't claim to know the entire market. :spend:


----------



## lcaillo

Pretty good steering of a topic there Bill. I even caught a hint of some statistics about to creep in. I wonder why?


----------



## glaufman

I see... I misinterpreted what you were saying from the get-go.... typical :ponder: :rolleyes: :no: :doh: :whistling:
So one last thought.... are "features" like CMS controls or gamma controls a matter of different firmware? Or more expensive chips? (Let's not differentiate between good/bad full 3D/2D CMS or 10-point gamma sliders...)

BTW, thanks for indulging me here:T


----------



## Bear5K

glaufman said:


> I see... I misinterpreted what you were saying from the get-go.... typical :ponder: :rolleyes: :no: :doh: :whistling:


No worries. It's a good thing I keep this thread subscribed, otherwise it might be months before I remember to check back in on it (seems like a flurry of activity, and then dormant...).



> So one last thought.... are "features" like CMS controls or gamma controls a matter of different firmware? Or more expensive chips? (Let's not differentiate between good/bad full 3D/2D CMS or 10-point gamma sliders...)


Definitely more expensive chips and more of them. The math isn't hard, but it is more math. Where it gets maddeningly complex is translating the matrix real-number math into integer math. In theory, if you put the CMS right at the point where you convert the inbound signal into linear RGB (i.e., at the RGB LUT), then you just need to do a relatively simple transform. People who talk about a CMS not being fully-featured because it doesn't include independent adjustments for the secondaries, separate from the primaries, among other things, don't actually know how color math works. When you move the CMS from this "sweet spot", you end up adding complexity and cost into the design.
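To make the "simple transform" concrete, here is a minimal sketch in Python (the matrix values are invented for illustration, and `apply_cms` is a hypothetical helper, not any real firmware):

```python
import numpy as np

def apply_cms(rgb_linear, matrix):
    """Apply a 3x3 correction matrix to a linear RGB triple."""
    return matrix @ np.asarray(rgb_linear, dtype=float)

# Invented matrix: identity plus a small red-into-green crosstalk fix.
cms = np.array([[1.00, 0.00, 0.00],
                [0.05, 0.95, 0.00],
                [0.00, 0.00, 1.00]])

red    = np.array([1.0, 0.0, 0.0])
green  = np.array([0.0, 1.0, 0.0])
yellow = red + green  # a secondary is just the sum of two primaries

# Because the transform is linear, fixing the primaries fixes the
# secondaries for free -- no independent secondary controls needed.
assert np.allclose(apply_cms(yellow, cms),
                   apply_cms(red, cms) + apply_cms(green, cms))
```

That additivity is exactly why, at the linear-RGB sweet spot, separate secondary adjustments are redundant; move the CMS elsewhere in the chain and the linearity (and the simplicity) goes away.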


----------



## lcaillo

The sweetness of that spot also depends on assumptions about what it takes to get "linear" RGB. Add to that characteristics of the display that may require some additional consideration for spectral densities that are not consistent with the approximations on the camera and encoding side, or primary limitations, or extra color wheel segments, or having to modulate panel response in various ways, or frame rate interactions with display latencies or a dozen other variables and we end up with non-trivial challenges. But then, that is why we need tools like CalMAN, right?


----------



## Bear5K

lcaillo said:


> The sweetness of that spot also depends on assumptions about what it takes to get "linear" RGB. Add to that characteristics of the display that may require some additional consideration for spectral densities that are not consistent with the approximations on the camera and encoding side, or primary limitations, or extra color wheel segments, or having to modulate panel response in various ways, or frame rate interactions with display latencies or a dozen other variables and we end up with non-trivial challenges. But then, that is why we need tools like CalMAN, right?


So that we are clear, the conversion from R'G'B' or YCbCr to linear RGB is very straight-forward. You do have to make an assumption (or have an explicit control) for gamma, but that's really it. To your point, how the signal is then translated into physical light is definitely dependent upon display-specific factors.
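As a sketch of how straightforward that conversion is (assuming full-range signals, BT.709 luma coefficients, and a pure power-law gamma of 2.4; real decoders also deal with the 16-235 studio range and fixed-point rounding):

```python
import numpy as np

KR, KB = 0.2126, 0.0722          # BT.709 luma coefficients
KG = 1.0 - KR - KB

def ycbcr_to_linear_rgb(y, cb, cr, gamma=2.4):
    """Full-range Y'CbCr -> R'G'B' -> linear RGB.

    The matrix step is fixed by the standard; the one assumption you
    have to make (or expose as a control) is the decode gamma.
    """
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    rgb_prime = np.clip([r, g, b], 0.0, 1.0)
    return rgb_prime ** gamma    # power-law expansion to linear light

# Reference white (Y'=1, Cb=Cr=0) decodes to linear (1, 1, 1).
print(ycbcr_to_linear_rgb(1.0, 0.0, 0.0))
```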


----------



## glaufman

Bear5K said:


> No worries. It's a good thing I keep this thread subscribed, otherwise it might be months before I remember to check back in on it (seems like a flurry of activity, and then dormant...).


I have the opposite problem: I subscribe to so many just to make sure I don't miss anything important that it takes me over 2 hrs a day just to browse most and read the few that I find interesting... 


> Definitely more expensive chips and more of them.


ARG. But THAT means increased BOM cost! (Sorry, I just couldn't resist :whistling:)
Seriously though, I do see your points.


> People who talk about a CMS not being fully-featured because it doesn't include independent adjustments for the secondaries, separate from the primaries, among other things, don't actually know how color math works.


I wasn't going to get to this for a few months, but since you mention it...
Why is it that people sometimes report that their secondaries don't fall on the line between their primaries? Or for that matter report secondary luminances that aren't the sum of the primary luminances? (that IS the correct math, isn't it?) Is this simply a bad LUT or improper calculations or something more, er, sinister?



lcaillo said:


> The sweetness of that spot also depends on assumptions about what it takes to get "linear" RGB. Add to that characteristics of the display that may require some additional consideration for spectral densities that are not consistent with the approximations on the camera and encoding side, or primary limitations, or extra color wheel segments, or having to modulate panel response in various ways, or frame rate interactions with display latencies or a dozen other variables and we end up with non-trivial challenges. But then, that is why we need tools like CalMAN, right?


:rofl: Nice Len. <insert-shameless-plug-here> And you don't even work for them... (do you?) :neener:


----------



## Bear5K

glaufman said:


> I wasn't going to get to this for a few months, but since you mention it...
> Why is it that people sometimes report that their secondaries don't fall on the line between their primaries? Or for that matter report secondary luminances that aren't the sum of the primary luminances? (that IS the correct math, isn't it?) Is this simply a bad LUT or improper calculations or something more, er, sinister?


Some of it could be bad information floating around. I have some issues with a certain often-quoted "guide" on another forum. Also, it could be misunderstanding of good information. Or it could be bad tools. Or ... 

In other words, it is probably some combination of all of these issues. For a display that is worth calibrating, I doubt it is the display itself unless someone has just really screwed something up in a menu in which they didn't belong. :dumbcrazy: :paddle: :nono:

For example, unless someone finds a way around Grassmann's law, the sum of the luminances of each primary is most assuredly equal to the total luminance.
http://en.wikipedia.org/wiki/Grassmann's_law_(optics)
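If you want to sanity-check that on your own display, sum your meter's luminance readings for full-field red, green, and blue windows and compare against 100% white (the numbers below are made-up example readings in rough BT.709 proportions):

```python
# Hypothetical meter readings in cd/m^2 for 100% R, G, B windows.
y_red, y_green, y_blue = 21.3, 71.5, 7.2
y_white_measured = 100.0          # reading for a 100% white window

# Grassmann additivity: the primaries' luminances should sum to white.
predicted = y_red + y_green + y_blue
error_pct = abs(predicted - y_white_measured) / y_white_measured * 100.0
print(f"predicted {predicted:.1f} vs measured {y_white_measured:.1f} "
      f"({error_pct:.2f}% off)")
```

A large mismatch here points at the meter, the patterns, or a display doing dynamic tricks (ABL, dynamic contrast), not at a hole in Grassmann's law.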


----------



## glaufman

That's a shame. I've studied well the guide of which you've spoken.


----------



## lcaillo

Bear5K said:


> So that we are clear, the conversion from R'G'B' or YCbCr to linear RGB is very straight-forward. You do have to make an assumption (or have an explicit control) for gamma, but that's really it. To your point, how the signal is then translated into physical light is definitely dependent upon display-specific factors.


Yes, of course, the conversion is easy linear math. It is the assumption about gamma that is not so straightforward. Encode gamma can vary a great deal. There is no way for the display to know what it was, so the linearity is really an approximation. The error may not matter much, but in some cases can be significant. This is one reason that sources vary so much and the education of the user with regard to how to use controls even on a calibrated display is so important. It is also a reason that a properly calibrated display is essential. At least you have a starting point in the calibrated settings, which takes out much of the most objectionable variance in the results.


----------



## glaufman

lcaillo said:


> Encode gamma can vary a great deal.


And here I thought all along that at least the movie studios stuck to the 0.45 encode... at least from what's "seen" on the monitor to the signal...


----------



## Bear5K

glaufman said:


> And here I thought all along that at least the movie studios stuck to the 0.45 encode... at least from what's "seen" on the monitor to the signal...


By and large they do, though you can't put too much of a universal law around this (the SMPTE and ITU standards are very clear). A lot of that is hard-coded into the capture devices themselves. If the tech working on the production flow changes it, then it is probably intentional. Decode gamma is all over the map.
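The arithmetic of that chain is simple enough to sketch (ignoring the linear toe segment the BT.709 transfer function actually has near black):

```python
# Camera-side encode at ~0.45, then three plausible decode gammas.
scene = 0.18                      # 18% scene reflectance ("mid gray")
encoded = scene ** 0.45           # what ends up in the signal

for decode_gamma in (2.2, 2.4, 2.5):
    displayed = encoded ** decode_gamma
    print(f"decode {decode_gamma}: 18% gray -> {displayed:.3f} of peak")
```

Same encoded signal, three noticeably different mid-gray levels on screen -- which is exactly why the decode side being "all over the map" matters so much more than the well-standardized encode side.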


----------



## glaufman

Bear5K said:


> If the tech working on the production flow changes it, then it is probably intentional.


Yes, they're free to "bake in" anything they want... sometimes I wonder if we couldn't make a distinction between what we call "gamma" for the encode/decode chain vs. "gamma" of a particular image...


----------



## Bear5K

glaufman said:


> Yes, they're free to "bake in" anything they want... sometimes I wonder if we couldn't make a distinction between what we call "gamma" for the encode/decode chain vs. "gamma" of a particular image...


You may be over-thinking the issue. When someone pushes the gamma on a scene intentionally, it is to create a particular look. If you correct for that push, then you lose rendering intent. What "ought" to exist are concrete guidelines that tie reproduction environment to recommended gamma in a way that accounts for the "desired" home environment (there are very strict/rigid guidelines for grading studios), so we can preserve rendering intent across heterogeneous environments.


----------



## glaufman

I very often over think issues... it's what I do...:nerd:
But I think I'm OK on this one... I don't want to correct for any intended push... what I'm talking about is strictly language/usage to avoid what I see as some confusion among some of us less edumacated folks...


----------



## Bear5K

glaufman said:


> I very often over think issues... it's what I do...:nerd:
> But I think I'm OK on this one... I don't want to correct for any intended push... what I'm talking about is strictly language/usage to avoid what I see as some confusion among some of us less edumacated folks...


All that the "average" enthusiast who cares about calibration and preserving rendering intent (which already puts you way, way above average) needs to know is to start with a 2.5 average gamma and adjust it downward to correct for ambient light. You then also need to potentially tweak specific parts of the gamma curve for display-specific limitations (and a bit of taste/content preference).

For most people with non-dedicated environments, something between a 2.0 and a 2.3 works, depending upon the specifics. In a bat cave, you can make a 2.5 work well.
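A quick way to see why ambient light drives the target downward: compare the near-black output that different power-law targets produce for the same 10% input signal.

```python
# Output level for a 10% input under different gamma targets.
signal = 0.10
for gamma in (2.0, 2.3, 2.5):
    out = signal ** gamma
    print(f"gamma {gamma}: 10% signal -> {out * 100:.2f}% of peak white")
```

At 2.5 the 10% signal lands around a third of where it lands at 2.0. In a bat cave your eyes adapt and resolve that; in a lit room the ambient light washes it out, so you lower the gamma to lift the shadows back above the room's noise floor.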


----------



## glaufman

Bear5K said:


> (which already puts you way, way above average)


Awhhh! :heehee:
OK, granted the average guy doesn't need to be specific about language, but I've seen so much confusion between so-called experts (albeit not on this forum) when one's talking about encode, one's talking decode, one's talking about rendering... but I digress as usual...


> You then also need to potentially tweak specific parts of the gamma curve for display-specific limitations (and a bit of taste/content preference).


I'm under the impression that without an outboard processor, this really can't be done, right? What displays have this resolution on their gamma adjustments, even in the service menus? Is there some way to adjust gamma I'm missing?


----------



## glaufman

OK... I've given you guys a break for a while... time for another one... yes, that means I've been thinking again... (sorry about that...)...
CCFL backlights experience a color shift if they're set too bright, yes? Do LED edgelights do the same thing? How about backlights if you've left local dimming off? Local dimming on?
Assuming you've turned off all CE dimming...
Would you ever turn the backlight down to achieve a better D65? Of course, you still have your normal cuts/gains...


----------



## Bear5K

glaufman said:


> I'm under the impression that without an outboard processor, this really can't be done, right? What displays have this resolution on their gamma adjustments, even in the service menus? Is there some way to adjust gamma I'm missing?


My RS10 and Pioneer Kuro both have the ability to select different gamma curves. I believe that the Kuro can adjust its gamma curve at a more granular level with the ISF controls turned on (I've never bothered with mine since it is the "daily driver" TV). In general, though, yes, you will need an external processor for most of this.



glaufman said:


> OK... I've given you guys a break for a while... time for another one... yes, that means I've been thinking again... (sorry about that...)...
> CCFL backlights experience a color shift if they're set too bright, yes? Do LED edgelights do the same thing? How about backlights if you've left local dimming off? Local dimming on?
> Assuming you've turned off all CE dimming...
> Would you ever turn the backlight down to achieve a better D65? Of course, you still have your normal cuts/gains...


Sorry I missed this one from a while back. To the best of my knowledge, LEDs do not color shift. The biggest "hit" on them is the relatively "spiky" (i.e., narrow) frequency range at which they operate, making some measurement instruments less useful in measuring performance. In order to measure them successfully, you need a spectroradiometer with a sufficiently narrow measurement interval or a colorimeter that has specifically been calibrated for use with LEDs.

As for CCFLs, I cannot attest to what happens in a lab, but my experience in the field where I've seen color shifting from an LCD panel seems more of an issue with the LCD side of the panel, rather than the backlight.

Net answer: use the backlight control to get the white point close to where you want your maximum luminance (i.e., not torch mode unless you have really specific reasons/issues). If the backlight control measurably affects grayscale, see if you can return the panel and get a different one.


----------



## glaufman

Bear5K said:


> Sorry I missed this one from a while back.


No worries!


> or a colorimeter that has specifically been calibrated for use with LEDs.


This one has always perplexed me a little... different displays presumably have a choice of what LEDs they use, as such, how does a single profile matrix make a colorimeter accurate enough across the range of possibilities?


> Net answer: use the backlight control to get the white point close to where you want your maximum luminance (i.e., not torch mode unless you have really specific reasons/issues). If the backlight control measurably affects grayscale, see if you can return the panel and get a different one.


OkeeDokey!


----------



## Bear5K

glaufman said:


> This one has always perplexed me a little... different displays presumably have a choice of what LEDs they use, as such, how does a single profile matrix make a colorimeter accurate enough across the range of possibilities?


It gets somewhat complicated, but the issue has to do with tolerances. In other words, a company/manufacturer can calibrate the meter to a particular combination of LEDs, but the accuracy will diminish as you move away from that spec. The upside is that LEDs are actually, in many respects, a more homogeneous lighting technology than what has been used in the past. They still have unit-to-unit variation and manufacturer (of the LEDs and spec) variation, but you don't get as much batch-to-batch variation as you did with things like phosphors (CRT, plasma, and to an extent, CCFL). So, you essentially get a calibration that is more accurate for the technology than using something not dedicated to the technology, but you suffer from wider variances in what manufacturers might put into the display, if that makes sense. Since there are very few manufacturers out there making the necessary blue and "white" LEDs, that tends to constrain the space of what an OEM might actually put into the TV.

Net-net: figure out what the reference display is, and then know that as you drift beyond that manufacturing "group" (e.g., Samsung/Sony/JVC), you might get increased error, but not necessarily enough to invalidate the calibration (LEDs are tough, but doable). To be really sure about things, that's where the $$$ meters come into play, or contracting for a specific calibration for the meter.


----------



## perritterd

I started to calibrate my TV and I have a question about the *"Service Menu"*, not the User Menu, for a Samsung LN32a550 TV. If someone (ME...) has gone into the SM and made some changes to the WB & WB-Movie numbers, can they be reset to the factory defaults using the "Factory Reset" setting in the SM? I neglected to write them down before changing them and hope to get everything back to the default settings before trying to calibrate my TV again. If not, where can you locate the factory settings for the SM that came with the TV from the manufacturer? 

Thanks for any help.
Bob


----------



## lcaillo

The only place that might have default settings is the service or training info. I do not recommend a reset in the service menu unless you know what parameters it affects.


----------



## glaufman

That all makes sense Bill. Thanks. Is my understanding correct though, that some of the affordable colorimeters out there are ok unprofiled, even on LED back/edge lights, for grayscale work, and the profiling is only really necessary for CMS work? I'm talking about the C5 and DTP94.


----------



## Bear5K

perritterd said:


> I started to calibrate my tv and I have a question about the *"Service Menu"*, not User Menu, for a Samsung LN32a550 tv. If someone (ME...) has gone into the SM and made some changes to the WB & WB-Movie numbers, can they be reset to the factory defaults if you use the "Factory Reset" setting in the SM? I neglected to write them down before changing and hope to get everything back to the default settings before retrying to calibrate my tv again. If not, where can you locate the factory settings for the SM that came with the tv from the manufacturer?
> 
> Thanks for any help.
> Bob


Agree with Leonard on this one. Most SM "resets" take the display to a common set of values before any device-specific settings are loaded in the production line. In other words, the reset could be worse than simply working yourself out of the current hole. 

Always, always, always write down the values you see in the service menu before making any changes. Also, it is usually worth investing the $20 - $30 for a copy of your TV's service manual before you start poking around. It's not a guarantee, but it should sober you up for how little explanation/information is available should you decide to skip rule #1.


----------



## lcaillo

Lots of service manuals are available for free if you dig. Virtually all are available for under $20 online. There are suggested sources in the DIY repair stickies and there are vendors in the vendor listings that have manuals.


----------



## perritterd

Thanks Bear5K and lcaillo. You are both, of course, correct. I should have been more prepared instead of just jumping in without much thought! I am looking for a service manual here now but unable to locate one that might have all the settings from the factory shipping setup...but, I'll keep looking.

Thanks to both of you.
Bob


----------



## randal_r

I was hoping that someone might be able to help me. I am a newbie, and have attempted to do a video calibration. I have a Sony KDL52V5100; although it does have controls for gray level adjustments, it does not seem to have controls to adjust colors other than the standard ones found through the remote control (Color & Hue). What is the way to adjust the x,y,Y values for each color? Is this done via the Service Menu? Whatever help anyone can give will be appreciated. 

Randal


----------



## mechman

You're stuck with what the manufacturer gives you. I have a Sony KDL-52EX700 and I have no controls outside of the color and hue controls to adjust gamut. And if you have the controls to adjust grayscale in the user menu, then the gamut controls would be there as well if the manufacturer were to offer them. I don't think there are many, if any, manufacturers that have calibration controls in the service menu any more. 

With just the color and hue controls, you can get it close enough. Here are my chromaticity and luminance charts from my display. 

 

 

I'm in the process of touching up my displays lately and I'm sure I can get the luminance a bit better than what the above image shows. And my worst DeltaE is 2.9 for green. It is what it is.


----------



## highendallday

Are there any calibrators in indianapolis, indiana or does anybody know a calibrator in that area.


----------



## mechman

Try Jeff Meier from AccuCal. He's in Missouri but he does tours all year long. More than likely he will be stopping in IN sometime soon. 

Plus, he's one of the best.


----------



## sasamaes

Hello, 

I have a Samsung LCD TV, model LE32B651, and I need to re-calibrate ADC values in the service menu. My concern is that the TV goes into the SM under "dynamic" (or "standard") mode, and calibrating in those modes seems a little odd to me. The service menu has this "WB Movie" option which I can enable and roll back to ADC calibration... should I do that? The service manual has no word about it. 

Thank you very much for your answer.


----------



## randal

How often does a calibrator need to go into a service menu, or do they ever? If so, why?

Randal


----------



## michael tlv

Greetings

Sony ... no
LG ... no
Samsung ... no
Sharp ... no
Panasonic ... no in Custom ... yes if THX mode
JVC ... no
Epson ... no


These apply to the higher end units of the brand, not the budget units which "may" have fewer controls.

Regards


----------



## mc_lover

I am so confused over video calibration; can anyone help me with it? I appreciate it.

- I use a laptop as the video source: MKV movies played over an HDMI cable to an LED display.
As a Spyder4Elite user (a true beginner) I can basically calibrate a PC or laptop monitor.

but when I connected the laptop to a TV to do a calibration, there are so many questions: 

*- Backlight, brightness, contrast.*
what should I do with the backlight? Turn it all the way down to "ZERO"?

*- Sharpness*
Can sharpness be calibrated using the Spyder meter?

*- Color*
Spydercal didn't tell me a thing about how to adjust color.

I don't mind buying a Calman 5. I.. I just want to know if Calman 5 can help me with my situation or not.

It seems I am in a deep black pit.


----------



## michael tlv

Greetings

What you are finding out is that when you buy a hammer, the hammer does not come with instructions on how to build a house. 

Or books on brain surgery do not teach people to be doctors first.

Buying Calman or any other program does not mean you can sit back and let the software do the work. Magic hands do not sprout from your laptop to calibrate a TV.

To calibrate a display, you need to learn how to do it. Software is a tool, but it won't teach you how to calibrate. Hammer ... instructions to build a house.

To learn to calibrate, you will have to spend a lot of time at DIY web sites reading and teaching yourself how to do this. At some point you might be proficient enough to do a reasonable job.

Much of this comes down to how much you value your time ($$). Is the value of your time worth little? If so, then spend the next year scrounging around for information and you will get to a calibrated TV eventually. It has been surmised that even when people don't place much value on their time ... they still put something like $3/hr on it. So if we go with something like that ... you are looking at spending at least 100 hours in your learning process. (100 x $3 = $300)

If you value your time more and want to get to an end result faster, then you will have to get some professional level training on how to do this. (Starts at $100 ... and up up up after that.) The 7 hours of training videos available from the link on this site is your start to being a competent DIYer. More efficient use of your time and you have more certainty that you did it correctly.

regards


----------



## Hagar1

Michael,

I'd be interested in one more opinion on this. 

I have a Video EQ Pro hooked up to a JVC RS25 (450 hours). I'm using Custom 1 and Normal for gamma in the RS25, and I've set contrast and brightness as well.

I also find I need more room in the CMS. I select Advanced in the Video EQ to allow a wider gamut; instead of 0-100, it's now 0-200. Brightness usually needs to be around 135 and it's perfect.

The question is, IF I use over 100, am I boosting too much?

Eric at AV Foundry doesn't recommend it,

In general, it's a bad idea to set either the saturation or brightness above 100. You're going to drive something into over saturation or clipping at some point. If you think about it as a multiplier - sending a full white signal at 255 (8-bit level) with a multiplier of 117% would result in 298, which doesn't fit in 8-bits anymore.

One thing to be very careful of here is that you run plenty of high level content through it before you are done calibrating. I've heard (and seen) some calibrations that are really good by the numbers in CalMAN, and make a nice picture on most content. But then during something like an explosion scene, where the picture level is high, the whole thing clips and gives green/red flashes. Sometimes it's just "sparklies" around the bright edges. What happens is that these bright pixels go beyond the 8 or 10 bit limit and cause problems.

Some implementations will "clamp" the value so that anything >255 is limited to 255. Others could wrap so that 298 results in 42. The first method of clamping fails more gracefully as over-saturation just quietly turns to pure white. It's been a long time since we worked on that part of the Eq, so I can't really remember which it uses.
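Eric's 255 x 117% example is easy to sketch (hypothetical helper names, not AV Foundry's actual code):

```python
def scale_clamp(value, gain):
    """Clamp overflow to the 8-bit ceiling -- fails gracefully."""
    return min(round(value * gain), 255)

def scale_wrap(value, gain):
    """Let overflow wrap modulo 256 -- fails spectacularly."""
    return round(value * gain) % 256

# Full white pushed 17% past the top of the 8-bit range:
print(scale_clamp(255, 1.17))   # prints 255 (quietly pinned to white)
print(scale_wrap(255, 1.17))    # prints 42  (298 wraps around)
```

The wrap case is where the green/red flashes and "sparklies" on bright scenes come from: pixels that overflow snap to a completely unrelated low value instead of saturating.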

Also, since the CMS works as a linear function, there is some interdependence between the controls. Meaning you might be able to increase saturation to say 120 if you reduce luminance by 5 or so.

So what does the RS25 at +50 mean? Can I boost it in the RS25 and then further adjust it in the Video EQ?

And why does the RS25 gamma only go to 95? Calman wants 100, and my Video EQ defaults to 100. What's the deal?

I'm about to pay someone to come in and do this. Who do you recommend in the Puyallup, WA area?

Thanks,
Dave


----------



## michael tlv

Greetings

Get the projector set up as best you can. Then add the Video EQ into the chain. This minimizes the impact of the EQ device. When you go too far, the EQ makes some parts of the image appear solarized ... and discolored.

Test patterns that show RGBCMY in the 235-253 range can more easily show that this is happening.

regards


----------



## Hagar1

Did as you said and voila! :sn:

However, what do you think the chances are of this?

I checked all my cables in the back. I had a red screen flash and thought, this wasn't right. So I replaced my Oppo HDMI cable and now all of a sudden, I wasn't lacking in any color and in fact it dialed right in with room to spare.

Have you ever had an experience where a cable gives you trouble like that?

Thanks for your reply and help.

Now I can watch a movie this weekend and enjoy it.


----------



## michael tlv

I have seen HDMI cables go bad over time ... ... and it isn't just sparklies ... it can be tearing as well ... among other things.

Regards


----------

