DetroitEWarren
Question about which one I should use. When I mess around with the display settings and pick 12-bit color, HDR says it's not active. But if I change to Always HDR, the output drops to 8-bit and Dolby Vision activates at 8-bit.
I can't get 12-bit when Dolby Vision is active, only 8-bit, and Dolby Vision at 8-bit doesn't look as good as non-HDR at 12-bit.
Which one is better? Why can't Dolby Vision use 12-bit? And why does 12-bit say no HDR but look better than Dolby Vision at 8 bits?