SO IS MY HDTV FLABBY AND SICK?

lamont614

Superstar
Joined
May 2, 2012
Messages
15,635
Reputation
1,462
Daps
31,052
50-inch Panasonic Viera TC-P50C2 :flabby:

Won it at work back in 2011 or '12, I forget


TC-P50C2 - PanasonicB2C


I've had my plasma since 2011 and it's still beautiful.
I still get compliments when I have chicks over watching Blu-rays or Netflix.
I love my plasma, but with the PS4 Pro coming, should I upgrade or will my TV blow up? :laff:

I just realized my TV is still 720p :sadcam:


Will I notice a difference on my TV if I get the PS4 Pro?



Do they even make plasmas anymore?

@Fatboi1
 

Fatboi1

Veteran
Supporter
Joined
May 6, 2012
Messages
60,193
Reputation
7,898
Daps
110,228
Yes.

A 4K HDR 10-bit panel is a night and day difference.

There are literally more colours visible on it than are possible on current TVs.

Here, read this, I found it online:
---------------------

WHAT IS HDR?


HDR10 is an open standard in the industry. It has an odd, hard to remember name. That’s why you probably won’t see “HDR10” listed on many specification sheets or boxes. The TV will simply say it supports “HDR” and you’ll have to assume it supports HDR10 content. ~How-To Geek

  • 10-bit Colour
    10-bit Colour: 2^30 gives a total of 1,073,741,824 (1.07b) colours
    8-bit Colour: 2^24 gives a total of 16,777,216 (16.8m) colours

  • Rec. 2020
    This means you will have available colours much closer to the visible spectrum of our eyes. While HDR Media Profile doesn't specify how much of this should be covered, a UHD Premium label on a display means it has at least 90% of DCI P3...
    [image: hY2lpzh.jpg]

    Screens of today use Rec.709 (much smaller) - this, by the way, is why it is important for us not to try and view HDR images directly on our non-HDR screens. The colours will all fall in the wrong place.
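The colour counts above are just powers of two (shades per channel, cubed for RGB); a quick sanity check:

```python
def colour_count(bits_per_channel):
    """Total RGB colours: 2^bits shades per channel, raised to the
    third power for the red, green and blue channels."""
    return (2 ** bits_per_channel) ** 3

print(colour_count(8))   # 16777216 -- 16.8m, "True Colour"
print(colour_count(10))  # 1073741824 -- 1.07b, "Deep Colour"
```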


SO, WHAT DO WE SEE?

[image: colour_infographic.jpg]


On a conventional display, whatever the game is actually rendered in gets reduced to fit Rec. 709, so we lose the ability to show millions of natural colours that we see in real life. With Rec. 2020 (which is what HDR maps to) we are able to display them.

[image: 1.jpg]


A good example is to look at a picture of a red flame. It would have really bright reds, but in Rec. 709 that red isn't available. So what do we do? At the moment, that red is mixed with blues and greens to make it as bright as we want. This is the technique we have had to deal with up until HDR.

On an HDR screen (and if this image itself was made for Rec. 2020) the colour of that flame would be closer to how it is in real life.


WHAT'S THE DEAL WITH THIS TALK OF MORE BRIGHTNESS?

This is a hard one. I won't embed them on this page, but throughout GAF you see these horribly inaccurate SDR (normal) vs. HDR images where the HDR side is simply brighter and more saturated. While these images are technically false, they're not entirely misleading.

This is the answer I have decided to go with:

[image: UHD-Premium-logo-1.jpg]


The above logo refers to a specification a TV must meet before it may be advertised as UHD Premium. The requirements are:

  • At least 3,840 x 2,160 (4K)
  • 10-bit Colour
  • At least 90% of DCI P3
  • More than 1000 nits (peak brightness) and less than 0.05 nits black level or,
  • More than 540 nits (peak brightness) and less than 0.0005 nits black level (since you don't need high brightness if your blacks are so good, like an OLED)
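As a sketch, those requirements (including the two alternative brightness/black-level routes) can be written as a simple checklist; the function and parameter names here are my own, not from any official certification tool:

```python
def is_uhd_premium(width, height, bits, dci_p3_coverage, peak_nits, black_nits):
    """Rough check against the UHD Premium requirements listed above."""
    resolution_ok = width >= 3840 and height >= 2160
    colour_ok = bits >= 10 and dci_p3_coverage >= 0.90
    # Either contrast route satisfies the spec:
    lcd_route = peak_nits > 1000 and black_nits < 0.05     # bright LCD
    oled_route = peak_nits > 540 and black_nits < 0.0005   # deep-black OLED
    return resolution_ok and colour_ok and (lcd_route or oled_route)

# A typical OLED: modest peak brightness but near-perfect blacks
print(is_uhd_premium(3840, 2160, 10, 0.95, 600, 0.0001))  # True
```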

What's happening is that while HDR10 is an open platform, and contrast ratio requirements aren't part of the specifications (there are none), TV companies are sort of pretending it DOES have minimum specifications -- and that all TVs prior to HDR10 are garbage with low brightnesses and horrible blacks. A lot of the comparisons you are looking at are UHD Premium compliant TVs that are showing their HDR compatibility along with their contrast ratio standards.

Does that mean a non-UHD Premium TV can't do HDR? No. It's just really unlikely that you'll get a screen sporting HDR that will have terrible contrast ratio, although it is certainly possible.

Some additional perspective:
...Philips, too, is sticking to HDR-only branding, although none of its tellys meet the UHD Premium specification anyway; its top-tier ‘HDR Premium’ models only go up to 700nits, with its ‘HDR Plus’ sets coming in around 400nits. ~ WHAT HI*FI




A COMPARISON BETWEEN 8-BIT AND LOWER?

If this looks like a minor difference, sure. You could write an entire paper on how we process the images we see. Essentially, we're pretty good at putting colours next to each other to make them appear like another:

[image: f2N356N.png]


I used Error Diffusion simply to prove a point. Even with just 256 total colours (out of 16.8 million), we have come a long way with compression techniques and can do enough with a limited palette that you wouldn't see much of a difference. You don't need to use all these colours at the same time; it's more about which colours you can use. If you get a scene that needs lots of shades of the same colour, that's where you need more colour depth:

[image: M7Gw0ha.png]


Here I used Nearest to prove a point (I'm cheating). Anyway, you can see in this image that there is no way for the limited number of colours to show BFF-chan's face without some serious issues. We only have 256 colours available.

These comparisons are completely cheating anyway. If you had a display that could only produce 256 colours, that doesn't mean 256 at once, it means 256 overall. That means our 256-colour display can't produce both images. What you're more likely to get is this:

[image: yHWCLnC.png]


So with HDR10, there are some scenes we can show that we currently don't even think of showing. Think back to the picture of the fire, can you see the individual bright reds in the middle of the flame? No, because that information isn't there and our SDR screens couldn't display it anyway. So you have to understand, these images won't even show you anything new on your HDR display because they were shot in 8-bit.
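To make the "Nearest" idea concrete, here's a minimal sketch that snaps an 8-bit RGB pixel to a fixed 3-3-2 palette (8 levels of red, 8 of green, 4 of blue = 256 colours); the function is hypothetical, just to illustrate why nearby shades band together:

```python
def quantize_332(r, g, b):
    """Map an 8-bit RGB pixel to the nearest colour in a fixed 3-3-2
    palette: 8 red levels * 8 green levels * 4 blue levels = 256 colours."""
    def snap(value, levels):
        step = 255 / (levels - 1)            # spacing between palette levels
        return round(round(value / step) * step)
    return snap(r, 8), snap(g, 8), snap(b, 4)

# Two nearby skin tones collapse to the same palette entry --
# exactly the banding visible on BFF-chan's face:
print(quantize_332(230, 180, 160))
print(quantize_332(225, 175, 158))  # same output as the line above
```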

IS IT LIKE LIMITED VS. FULL COLOUR SPACE?

Yes... But the differences are FAR greater for what we call DEEP COLOR:

  • LIMITED : < 8/24-bit : (16 - 235) = 220*220*220 = 10,648,000 (10.6m) Colours
  • FULL : = 8/24-bit : (0 - 255) = 256*256*256 = 16,777,216 (16.8m) Colours
  • DEEP : 10/30-bit : (0 - 1023) = 1024*1024*1024 = 1,073,741,824 (1.07b) Colours
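Those counts follow directly from the number of levels per channel (using the standard 16-235 limited video range, which is 220 levels inclusive); a quick check:

```python
def range_colours(lo, hi):
    """Colour count when each RGB channel is restricted to levels lo..hi."""
    levels = hi - lo + 1
    return levels ** 3

print(range_colours(16, 235))   # LIMITED: 220^3 = 10648000
print(range_colours(0, 255))    # FULL:    256^3 = 16777216
print(range_colours(0, 1023))   # DEEP:   1024^3 = 1073741824
```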



8-BIT? 24-BIT? 10-BIT? 30-BIT?

ikr? 8 bits means 256 shades per colour channel. Since we use red, green and blue, we sometimes call this 24-bit (8*3) True Colour. 10-bit is 1024 shades, etc.


IS HDR10 JUST 10-BIT COLOUR THEN?

No, HDR Media Profile is other stuff (in super simple terms):

  • EOTF: SMPTE ST 2084, or Perceptual Quantizer (PQ), lets us use Rec. 2020 and much higher luminance.
  • METADATA: SMPTE ST 2086, MaxFALL, MaxCLL - This allows the device to tell the screen what to do with the image. This is why your 10-bit monitors can't display HDR content etc.
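For the curious, the PQ curve (SMPTE ST 2084) mentioned above maps a 0-1 signal value to absolute luminance up to 10,000 nits; a sketch built from the constants published in the standard:

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal (0..1) -> luminance in nits."""
    m1 = 2610 / 16384         # 0.1593...
    m2 = 2523 / 4096 * 128    # 78.84375
    c1 = 3424 / 4096          # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(1.0))  # 10000.0 -- full code value maps to 10,000 nits
print(pq_eotf(0.5))  # ~92 nits -- half the signal is nowhere near half the light
```

The steep shape is the point: most code values are spent on the darker end, where our eyes are most sensitive, which is how 10 bits can stretch across 10,000 nits without visible banding.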
 

lamont614

Superstar
Joined
May 2, 2012
Messages
15,635
Reputation
1,462
Daps
31,052
:wtf: A 50-inch with 720p? :mindblown:



:laff:


I don't even notice, to be honest. I never heard anybody say my shyt was weak, but browsing the coli got me thinking I need to upgrade. It's not a necessity, so I'll wait.


Just wanted to know if I'm pouring jelly on myself getting the PS4 Pro with this TV.
 

lamont614

Superstar
Joined
May 2, 2012
Messages
15,635
Reputation
1,462
Daps
31,052

So will the PS4 Pro work with my TV?
 

NatiboyB

Veteran
Supporter
Joined
Apr 30, 2012
Messages
65,178
Reputation
3,816
Daps
103,506
Yes it is. It's damn near two picture-quality leaps behind (1080i/1080p, then 4K, and maybe even 3D).
 
Top