4K Firestick: should I leave it on 12-bit with no HDR, or 8-bit Dolby Vision?

DetroitEWarren

Question about which one I should use. When I mess around with the display settings and pick 12-bit color, HDR says it's not active. But if I change to Always HDR, it drops to 8-bit and activates Dolby Vision at 8-bit.

I can't get 12-bit when Dolby Vision is active, only 8-bit. Dolby Vision at 8-bit doesn't look as good as non-HDR at 12-bit.

Which one is better? Why can't Dolby Vision use 12-bit? Why does 12-bit say no HDR but still look better than Dolby Vision at 8 bits?
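For reference, bit depth on its own just sets how many shades each color channel can carry; here's a quick sketch of the arithmetic (nothing Firestick-specific, just the raw numbers):

```python
# Shades per channel and rough total color count for common bit depths.
for bits in (8, 10, 12):
    shades = 2 ** bits        # levels per R/G/B channel
    colors = shades ** 3      # rough total color combinations
    print(f"{bits}-bit: {shades:,} shades per channel, ~{colors:,} colors")

# 8-bit:  256 shades per channel,   ~16,777,216 colors
# 10-bit: 1,024 shades per channel, ~1,073,741,824 colors
# 12-bit: 4,096 shades per channel, ~68,719,476,736 colors
```

More bits mainly means smoother gradients; by itself the bit depth says nothing about whether the signal is actually HDR.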
 

Barney Rubble

DetroitEWarren said:
Question about which one I should use. When I mess around with the display settings and pick 12-bit color, HDR says it's not active. But if I change to Always HDR, it drops to 8-bit and activates Dolby Vision at 8-bit.

I can't get 12-bit when Dolby Vision is active, only 8-bit. Dolby Vision at 8-bit doesn't look as good as non-HDR at 12-bit.

Which one is better? Why can't Dolby Vision use 12-bit? Why does 12-bit say no HDR but still look better than Dolby Vision at 8 bits?
There are no retail displays that truly support 12-bit, only 10-bit, and nothing is encoded in 12-bit anyway. It can't do both because it's probably a bandwidth issue. Make sure your TV supports HDMI 2.1 and that your HDMI cable does too. See if there is a 10-bit option, since your TV is 10-bit at most. 8-bit Dolby Vision should look fine, as the only thing it will do is dither, meaning at worst you'll see gradient lines (banding) in background images. It could also be a calibration problem, or if the content you're trying to watch isn't actually encoded for Dolby Vision, then you shouldn't be using it anyway.
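For a rough sense of the bandwidth point, here's a back-of-the-envelope sketch. Nothing here is Fire TV-specific: the 18 and 48 Gbps figures are the usual HDMI 2.0 / HDMI 2.1 link limits, and OVERHEAD is just a ballpark factor for blanking intervals and HDMI's own encoding overhead.

```python
# Very rough HDMI bandwidth check for 4K60 at different bit depths,
# ignoring chroma subsampling.
WIDTH, HEIGHT, FPS, CHANNELS = 3840, 2160, 60, 3   # full 4:4:4 / RGB
OVERHEAD = 1.5                                     # blanking + encoding, ballpark
HDMI_2_0, HDMI_2_1 = 18, 48                        # approximate link limits, Gbps

for bits in (8, 10, 12):
    gbps = WIDTH * HEIGHT * FPS * CHANNELS * bits * OVERHEAD / 1e9
    if gbps <= HDMI_2_0:
        verdict = "just fits an 18 Gbps (HDMI 2.0-class) link"
    elif gbps <= HDMI_2_1:
        verdict = "needs an HDMI 2.1-class link or chroma subsampling"
    else:
        verdict = "does not fit even HDMI 2.1"
    print(f"4K60 {bits}-bit 4:4:4: ~{gbps:.1f} Gbps -> {verdict}")

# Roughly: 8-bit ~17.9 Gbps, 10-bit ~22.4 Gbps, 12-bit ~26.9 Gbps. Only the
# 8-bit signal squeezes under 18 Gbps, which is consistent with the stick
# falling back to 8-bit (and letting the TV dither) once an HDR/DV mode kicks in.
```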
 

DetroitEWarren

Barney Rubble said:
There are no retail displays that truly support 12-bit, only 10-bit, and nothing is encoded in 12-bit anyway. It can't do both because it's probably a bandwidth issue. Make sure your TV supports HDMI 2.1 and that your HDMI cable does too. See if there is a 10-bit option, since your TV is 10-bit at most. 8-bit Dolby Vision should look fine, as the only thing it will do is dither, meaning at worst you'll see gradient lines (banding) in background images. It could also be a calibration problem, or if the content you're trying to watch isn't actually encoded for Dolby Vision, then you shouldn't be using it anyway.
When I set it to 10-bit, Dolby Vision still only activates at 8-bit when HDR is set to Adaptive. If I switch to Always HDR, Dolby Vision works, but never at 10-bit, if you get what I'm saying.

When I switch to Adaptive, it says 10-bit or 12-bit according to what I have it set at, and it looks a lot better, with darker colors, than Dolby Vision. 10-bit and 12-bit look exactly the same to me. But they both look better than Dolby Vision at 8 bits, which is HDR.

Which one should I use? It looks better when I set it to no HDR at 10-bit or 12-bit than it does with Dolby Vision at 8 bits.
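A side note on why 10-bit and 12-bit look identical: per the reply above, the panel most likely tops out at 10 bits, so the extra two bits just get thrown away. Here's a small sketch of the banding/dithering idea from that reply; numpy and the quantize helper are purely illustrative.

```python
import numpy as np

# A smooth 0-to-1 ramp (think of a clear-sky gradient), oversampled so
# every code value at each bit depth can actually appear.
ramp = np.linspace(0.0, 1.0, 100_000)

def quantize(signal, bits, dither=False):
    """Quantize a 0..1 signal to 2**bits levels, optionally adding
    +/- half a step of noise (dither) before rounding."""
    levels = 2 ** bits - 1
    noise = (np.random.rand(signal.size) - 0.5) / levels if dither else 0.0
    return np.round(np.clip(signal + noise, 0.0, 1.0) * levels) / levels

for bits in (8, 10, 12):
    steps = len(np.unique(quantize(ramp, bits)))
    print(f"{bits}-bit ramp has {steps} distinct steps")   # 256, 1024, 4096

# Fewer steps means the hard "gradient lines" (banding) mentioned above.
# Dither swaps those hard edges for fine noise before rounding, which is
# why 8-bit output can still look smooth on a gradient:
dithered = quantize(ramp, 8, dither=True)
```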
 

Barney Rubble

DetroitEWarren said:
When I set it to 10-bit, Dolby Vision still only activates at 8-bit when HDR is set to Adaptive. If I switch to Always HDR, Dolby Vision works, but never at 10-bit, if you get what I'm saying.

When I switch to Adaptive, it says 10-bit or 12-bit according to what I have it set at, and it looks a lot better, with darker colors, than Dolby Vision. 10-bit and 12-bit look exactly the same to me. But they both look better than Dolby Vision at 8 bits, which is HDR.

Which one should I use? It looks better when I set it to no HDR at 10-bit or 12-bit than it does with Dolby Vision at 8 bits.
Basically, I only use HDR or Dolby Vision if what I'm watching was actually encoded for it. If you force it, it's gonna look trash cuz it's just guessing what it should look like and wasn't actually calibrated to look right. So I would turn it off unless the content specifically says it's HDR or DV.
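If you want to check whether a local file is actually encoded for HDR before forcing anything, one option is to ask ffprobe for the video stream's transfer characteristics. This assumes ffprobe is installed, and movie.mkv is just a placeholder path.

```python
import json
import subprocess

def video_color_info(path):
    """Return the video stream's color metadata as reported by ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=color_transfer,color_primaries,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

info = video_color_info("movie.mkv")                 # placeholder path
transfer = info.get("color_transfer", "unknown")

# smpte2084 = PQ (HDR10 and most Dolby Vision base layers), arib-std-b67 = HLG.
# Anything else (usually bt709) is plain SDR, so forcing HDR on it just makes
# the player guess.
print(transfer, "-> HDR" if transfer in ("smpte2084", "arib-std-b67") else "-> SDR")
```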
 

StatUS

First response is spot on.

But what do you mean when you say the 12-bit, non-HDR picture "looks better" than DV? Is it just brighter, better color?
 

DetroitEWarren

Barney Rubble said:
Basically, I only use HDR or Dolby Vision if what I'm watching was actually encoded for it. If you force it, it's gonna look trash cuz it's just guessing what it should look like and wasn't actually calibrated to look right. So I would turn it off unless the content specifically says it's HDR or DV.
Good looking out. This is what I needed to know.

Cancelled Xfinity last month and got a year-long IPTV subscription. I'm using the Sparkle TV player because I can rewind live TV.

Whenever I use Kodi to watch HDR movies, Dolby Vision looks good ASF. But with the IPTV player it looks bad. The 10/12-bit with no HDR looks a lot better because the IPTV streams aren't encoded in Dolby Vision.

I get it now.

But when I'm using 10/12-bit, I'm not actually using HDR, right? Even though the colors look really good?
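For context on that last question: bit depth and HDR are separate things. HDR is about how the code values map to brightness (the transfer function and its metadata), not about how many code values there are. Here's a small sketch comparing the PQ curve that HDR10 and Dolby Vision are built on (SMPTE ST 2084) with a plain SDR gamma curve; the 100-nit SDR peak is just a conventional assumption.

```python
# PQ (SMPTE ST 2084) EOTF vs. a simple SDR gamma curve: the difference that
# makes a signal "HDR" is this mapping, not the bit depth it is carried in.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_nits(v):
    """Luminance in nits for a normalized PQ code value v in [0, 1]."""
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_nits(v, peak=100):
    """Rough BT.1886-style display gamma with an assumed 100-nit peak."""
    return peak * v ** 2.4

for v in (0.25, 0.5, 0.75, 1.0):
    print(f"code {v:.2f}: SDR ~{sdr_nits(v):6.1f} nits, PQ ~{pq_nits(v):7.1f} nits")

# SDR tops out around 100 nits at code 1.0, while PQ is defined up to
# 10,000 nits -- same code values, very different brightness mapping.
```

Whether those values are carried in 8, 10, or 12 bits only affects how finely the curve is sampled, which is what the dithering mentioned earlier papers over.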
 

Barney Rubble

DetroitEWarren said:
Good looking out. This is what I needed to know.

Cancelled Xfinity last month and got a year-long IPTV subscription. I'm using the Sparkle TV player because I can rewind live TV.

Whenever I use Kodi to watch HDR movies, Dolby Vision looks good ASF. But with the IPTV player it looks bad. The 10/12-bit with no HDR looks a lot better because the IPTV streams aren't encoded in Dolby Vision.

I get it now.

But when I'm using 10/12-bit, I'm not actually using HDR, right? Even though the colors look really good?
You should know you're in HDR mode if the logo pops up in the top right, or if you go into your menu to change brightness and shyt and it says it's in HDR mode.
 