This is an automated archive.
The original was posted on /r/hometheater by /u/TRIPMINE_Guy on 2023-08-18 06:16:26+00:00.
So I've been messing around with a crt monitor after only ever using lcds. This isn't a crt circle jerk post, but I wanted to talk about some things I've deduced from my experience using my crt and how they relate to how we perceive contrast. I have no knowledge of the intricacies of contrast and all that, so please point out any flaws in my conclusions.
First off, in a dark room my crt seems to have a more vibrant image than my lcd even though my lcd should have a higher contrast ratio, even on scenes with no dark content. My theory as to why is that the eye is better at discerning colors when its surroundings are dark, and conversely that brighter light reduces the eye's ability to perceive color. We know ambient light does this to your tv, so why wouldn't the same also apply to the light output by the display itself? So even though my crt has lower contrast, it is dim enough that my eye perceives more colors on it than on a brighter display that forces my eye to not perceive as many.
I decided to test my theory further by lowering my crt's refresh rate to only 48hz, which introduced a very noticeable flicker and therefore meant even more darkness overall, yet I perceived an even brighter image! (Sidenote: others on crt forums think I'm nuts because I actually prefer to watch stuff at 48hz now, since the image looks even better to me in terms of color.) I can't decide whether the flickered image has a higher perceived contrast than the non-flickered image on my crt, but it is 100% perceived as brighter to my eye. I know it is not actually brighter, because someone I talked to noticed this himself independently of me, measured the nits, and said they were the same at any refresh rate.
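For anyone curious why the meter reads the same: a light meter integrates over time, so fewer-but-stronger flashes per second can average out to exactly the same nits as more-but-weaker ones. Here's a rough sketch of that idea in Python (all numbers are made up, and it assumes an idealized square-wave phosphor pulse rather than real exponential phosphor decay):

```python
# Rough sketch with made-up numbers (not measurements): a light meter
# integrates light over time, so fewer-but-stronger pulses per second can
# read exactly the same average nits as more-but-weaker ones.
# Assumes an idealized square-wave phosphor pulse; real decay is exponential.

def measured_nits(refresh_hz, pulse_ms, energy_per_second=100.0, window_s=1.0):
    """Simulate a meter averaging a pulse train over window_s seconds.
    The tube emits the same total light per second at every refresh rate,
    split into refresh_hz short pulses."""
    dt = 1e-5                                          # 10 microsecond time step
    period_s = 1.0 / refresh_hz
    pulse_s = pulse_ms / 1000.0
    peak = (energy_per_second / refresh_hz) / pulse_s  # nits while a pulse is on
    t, total = 0.0, 0.0
    while t < window_s:
        on = (t % period_s) < pulse_s                  # are we inside a pulse?
        total += (peak if on else 0.0) * dt
        t += dt
    return peak, total / window_s                      # instantaneous peak, meter reading

for hz in (48, 60, 85):
    peak, avg = measured_nits(hz, pulse_ms=2)
    print(f"{hz} Hz: peak ~{peak:.0f} nits, meter reads ~{avg:.1f} nits")
```

So the measured brightness stays put at every refresh rate; whatever my eye is responding to at 48hz has to be the flicker itself, not extra light.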
These two things have led me to conclude that a higher-nit display that raises the brightness level across the entire screen is counterproductive to perceiving more colors, and therefore higher nits aren't the be-all and end-all of color vibrancy.
Now, hdr might seem to throw a wrench in my claim that higher nits are bad, but looking closer, hdr and my crt in 48hz flicker mode are actually doing the same thing. That is, instead of just raising the light level across the entire scene, they keep the average light level of a scene lower while making only parts of it brighter. In my 48hz crt, the "reduced average scene brightness" is really the overall average light level dropping because the phosphors spend less of each second lit up, while my "increased specular highlights" are really the entire scene compared against the black between frame updates. In fact hdr does it better, since my method doesn't discriminate about what gets selectively raised; everything is raised relative to the longer black period between frames.
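To put some rough numbers on the hdr comparison (these pixel values are invented, just to show the arithmetic): keeping most of the scene dim with a few bright highlights gives a much lower average light level than uniformly cranking everything up, which is the spatial version of what the 48hz flicker does in time.

```python
# Rough sketch with invented pixel values: hdr keeps the scene average low
# while a small area goes very bright, whereas uniformly boosting sdr drags
# the average of the whole scene up. 'share' is the fraction of the screen
# at that brightness.

def average_nits(regions):
    return sum(nits * share for nits, share in regions)

sdr_boosted = [(300, 1.00)]                # whole scene raised to 300 nits
hdr_scene   = [(1000, 0.05), (80, 0.95)]   # 5% highlights at 1000 nits, rest dim

print("sdr boosted average:", average_nits(sdr_boosted), "nits")  # 300.0
print("hdr scene average:  ", average_nits(hdr_scene), "nits")    # 126.0
```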
Now there is something I don't understand: I've read that people with oleds say black frame insertion reduces the color vibrancy of an oled, which flies in the face of everything I've said. I'm not sure why this is, but I have found some studies saying that flickering light can have a higher perceived brightness than a steady light, though only at certain flicker frequencies. It seems very fortunate, or perhaps unfortunate, that my crt falls in that range but black frame insertion does not.
I don't have an oled yet to make a comparison, so if anyone here has an oled and a crt that's not worn out, try out a 48hz mode and compare sdr content on the two if you want. Like I said, I'm not claiming crt has better colors than oled or anything. What I am claiming is that image vibrancy should be tied to how humans perceive color, which is largely relative to the light in the room, rather than to some objective measure like contrast, since, like I said, my crt looks more vibrant than my lcd to my eye even though its contrast is objectively lower.