Is there any point to uploading in 32-bit color depth to YouTube, or is 8-bit enough?

32-bit render processing takes outrageously long.

YouTube will just re-encode it down to the shittiest MP4 quality with maximum compression anyway. No reason to upload 32-bit.

Why would you not want as much quality as possible?

Legitimate argument. Still a bummer, since 8-bit basically guarantees inaccurate colors, but whatever.

Because the render time is doubled with no apparent difference on YouTube. Answer the question.

8 bit color depth is 256 colours.

You guys have literally no idea what you're talking about. You will never see any quality improvement in rendered footage at anything higher than 8-bit.
Human eyes can't see detail beyond 8-bit, and no monitor you can buy will display anything higher than 8-bit. Higher bit depths are for when you are EDITING and COMPOSITING: they let you push colour-correction effects much further without introducing banding. When you output, you're back to 8 bits per channel in sRGB colour space, which is as much as you'll ever be able to see.
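if you don't believe the banding bit, here's a rough numpy sketch you can run yourself (made-up demo values, not anyone's actual pipeline): crush a gradient down, bring it back up, and count how many levels survive.

import numpy as np

grad = np.linspace(0.0, 1.0, 1024)                  # smooth source ramp

# 8-bit intermediates: re-quantize after every operation
g8 = np.round(grad * 255) / 255                     # store source as 8-bit
g8 = np.round(g8 * 0.2 * 255) / 255                 # darken to 20%, store again
g8 = np.round(np.clip(g8 * 5.0, 0, 1) * 255) / 255  # brighten back, store again

# float intermediates: quantize exactly once, at output
gf = np.round(np.clip(grad * 0.2 * 5.0, 0, 1) * 255) / 255

print(len(np.unique(g8)))   # 52  -> visible banding
print(len(np.unique(gf)))   # 256 -> smooth ramp

same final 8-bit output either way, but the 8-bit pipeline threw away levels it can never get back.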

Then just wait. It's not like you do anything but fap to hentai and turn perfectly good food into poo.

In an ideal scenario you're perfectly right.

But in a Vegas / render / YouTube scenario, 8-bit can cause a washed-out picture.

256 colours PER CHANNEL, retard.
that's 256*256*256 = 16,777,216, about 16.7 million colours

32bit

That has nothing to do with it being 8-bit and everything to do with you not rendering your footage into the proper colour space, which for a monitor is sRGB with a gamma of 2.2.
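for anyone who wants numbers instead of words, here's a rough numpy sketch of the two usual suspects behind the washed-out look (illustrative values only, not Vegas's actual internals):

import numpy as np

# 1) studio/TV levels (16-235) shown on a full-range 0-255 monitor
#    without expansion: black sits at 16, white at 235 -> flat grey image
def studio_to_full(x):
    return np.clip((x.astype(np.float32) - 16.0) * (255.0 / 219.0), 0, 255).astype(np.uint8)

print(studio_to_full(np.array([16, 128, 235], dtype=np.uint8)))  # [  0 130 255]

# 2) linear-light values written out with no gamma encoding: an sRGB
#    monitor expects roughly value**(1/2.2), so linear output looks off
linear = np.array([0.0, 0.18, 1.0])              # 0.18 = standard 18% grey
print(np.round(linear ** (1.0 / 2.2) * 255))     # [  0. 117. 255.]

if your blacks are sitting at 16 instead of 0, that's your washed-out picture right there.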

8bit

they're both straight out of a YouTube video, 1440p 60fps, yadda yadda

virtually no difference outside of a sharpen effect, which came out different on the 8bit render

not worth the render time

These are both 8-bit JPEGs, you silly sod.

i beg to actually differ with your claims

So, explain how an 8-bit image format can magically be in 32 bits. I'll be jolly impressed if you can.

computers magic ain't gotta explain shit

Is it exactly the same frame? Because I see a clear difference.

Pretty sure YouTube downconverts to 8bit and TV levels anyway, so you might as well upload a 1080p mp4 that's already converted

exactly the same and no you don't see shit

i watch youtube vids at 360p with 1.25x speed. who cares?

Here's your difference, retard.
Proof that you're a faggot.

the difference is subtle enough to be blown away by what YouTube will do to it
don't even worry about it

...

>caring about minute details on a 4 megabutt web video

i doubt anyone'd notice if you rendered at 720p and upscaled it before uploading

>Human eyes can't see detail beyond 8-bit
that's wrong

Nope, read a fucking book or better yet use your fucking eyes.

You should have no trouble reading this then

Required reading.
x264.nl/x264/10bit_02-ateme-why_does_10bit_save_bandwidth.pdf

Though this 32bit bullshit is bullshit.

No, you're a faggot!

top half is without dither, bottom with dither
it's not easy to see, but there is banding

if 8bit was enough, there would be no discernible difference

Can't see any difference, but I guess that's cos I'm a pleb who doesn't watch anime right?

read

speaking of dither, this is one way you can benefit from rendering above 8bit, even when the target is 8bit:
it lets you dither when downsampling to 8bit
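quick numpy sketch if you want to play with it (plain uniform noise here; real encoders use error diffusion or noise shaping, but it's the same idea):

import numpy as np

rng = np.random.default_rng(0)
grad = np.linspace(0.0, 1.0, 4096)    # high-precision source gradient

# straight truncation to 8bit: clean steps -> visible bands
hard = np.round(grad * 255) / 255

# add sub-step noise before quantizing: the band edges get broken up
# and your eye averages them back into a smooth ramp
dith = np.clip(np.round(grad * 255 + rng.uniform(-0.5, 0.5, grad.shape)), 0, 255) / 255

print(len(np.unique(hard)), len(np.unique(dith)))   # same 256-level palette either way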

give this a try: zoom in and scroll left and right
(also yes)

If I have to zoom in and move it about in some weird way, then it's not really visible to the naked eye, is it?

if you can see any difference between the top and bottom under any circumstances (obviously not including actually modifying the image),
then you can in fact see more than what 8bit can do, since the only difference between the top and bottom is dither; they are made up of exactly the same palette

nah nigga it's a 1440p render with huge-ass variable bitrates

youtube gonna ruin it but still

literally no difference outside of the sharpen effect i mentioned earlier

Yes, but in reality this would be moving at 25 frames per second, and you wouldn't be pressing your face into the screen and straining your eyes to just about make out the smallest difference. 8-bit is on the very outer limit of what your eyes can discern, and besides that I really can't see a difference between the top and bottom of that gradient even if I zoom in and move my head around or whatever. I certainly won't see it any smoother in 10-bit. Besides, this is all pointless anyway, as your monitor can only ever show you 8-bit colour. Even if you encode at 10-bit and try to play it back, you're still only going to see it in 8-bit, because that's what your monitor is capable of showing.

not everyone will be able to see it
my vision isn't perfect, but some people's is worse
that and your display may not be good enough either (many cheap LCDs for example are only 6bit with their own dithering!)

8bit is close enough for most things, and i'm not disputing that, all i'm saying is that it's not at or above the limits of human eyesight

>Human eyes can't see detail beyond 8-bit and no monitor that you can buy will display anything higher than 8-bit.
Wrong on both counts.
Human eyes can see detail beyond 8 bit. Just not necessarily for all colours and not necessarily under all viewing conditions. Remember that our viewing experience is a semi-active process, with both our eyes and brains trying to interpret and correct the raw incoming data before we experience it. That's how we got the blue/black vs. white/gold dress thing. That's how we get the 'stare at a weird picture for 20 seconds and then suddenly we see colours on a black-and-white image' gifs. That being said, 8bit gives enough fidelity for us not to notice the difference under normal use.

For medical purposes (viewing X-rays, etc.) there are dedicated displays supporting 10-bit colour depth.

Yeah it is. And I'm looking at this on a 27-inch iMac, so the monitor is good. You really can't discern the difference, but whatever, I guess we're at the point in the argument where it's just subjective. I call you an autist, you call me a pleb. That's it, I guess.

None of that shit has anything to do with bit-depth.
Also show me a 10bit monitor for medical use. I don't buy it.

>Also show me a 10bit monitor for medical use. I don't buy it.
necdisplay.com/p/medical-diagnostic-displays/md211c3

did you try scrolling horizontally?

your eyes (or probably, your brain) will tend to blend similar colors together, so just staring at it unmoving will be rather ineffective (once you spot a band, you should be able to hold your view of it)

Fair enough. I don't buy that it makes chinese cartoons look better though. Or that it's actually that useful, like lives were saved because of it, but you got me, they do exist.

>I don't buy that it makes chinese cartoons look better though.
True, it doesn't, since the source material is only 8bit. Buying this monitor to watch anime is like buying a HMMWV to park on your lawn instead of the driveway.
>Or that it's really that actually useful,like lives were saved because of it
Dammit man, I'm an engineer, not a doctor. Go ask them if you want to know whether it saves lives.
Besides, my point was that humans do, under specific conditions, see more than 8bit worth of colours, and there are monitors that can show more than 8bit worth of colours.
That being said, I also believe using more than 8bit on a video is wasteful.

Agreed.
Holy fuck, Cred Forums resolved an argument!
We did it, boys, first time ever.

>That being said, I also believe using more than 8bit on a video is wasteful.
you'd be right
most video isn't even close to RGB24; it's YUV420, which exploits several limitations of human vision to cut down the amount of raw data. most notably, chroma (colour) is stored at only a quarter the resolution of luma (brightness)
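rough numpy sketch of what 4:2:0 actually throws away (BT.601 coefficients; a real codec does this in fixed point with proper filtering, this is just the shape of it):

import numpy as np

rgb = np.random.default_rng(1).integers(0, 256, (4, 4, 3)).astype(np.float32)

# RGB -> YUV with BT.601 coefficients
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
y = 0.299 * r + 0.587 * g + 0.114 * b
u = -0.147 * r - 0.289 * g + 0.436 * b
v = 0.615 * r - 0.515 * g - 0.100 * b

# 4:2:0: luma stays full resolution, each 2x2 block of chroma gets
# averaged down to one sample (a quarter of the chroma samples)
u420 = u.reshape(2, 2, 2, 2).mean(axis=(1, 3))
v420 = v.reshape(2, 2, 2, 2).mean(axis=(1, 3))

print(y.size + u420.size + v420.size, "samples vs", rgb.size, "for RGB")  # 24 vs 48

half the raw samples are gone before the encoder even starts compressing.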