In the histogram display I am implementing, I managed to display an RGB histogram successfully. When compared with Photoshop's, it seems spot-on!
(All screenshots below are from the same image.)
Now, for the Luminosity histogram, I used the following formula:
Y = 0.2126 R + 0.7152 G + 0.0722 B
(as discussed here)
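For reference, here is a simplified sketch of what I am doing (numpy-based; the function name is just for illustration), applying the coefficients directly to the 8-bit values as stored in the file:

```python
import numpy as np

def luminosity_histogram(img):
    # img: H x W x 3 uint8 array, sRGB values as read from the file
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    # Apply the Rec. 709 coefficients directly to the encoded values
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Bin into 256 buckets, like an ordinary 8-bit histogram
    hist, _ = np.histogram(y, bins=256, range=(0, 256))
    return hist
```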
However, mine seems to be off from Photoshop's Luminosity Histogram (and from other applications' as well):
In fact, my Luminosity Histogram (calculated from the above formula) seems to be equal to Photoshop's RGB Histogram:
What is the proper formula to compute the Luminosity Histogram from RGB values?
P.S.: I've tried the following formulas, but none of them gets me close to Photoshop's actual Luminosity Histogram:
Y = 0.2126 R + 0.7152 G + 0.0722 B
Y = 0.299 R + 0.587 G + 0.114 B
Y = 0.33 R + 0.5 G + 0.16 B
Y = 0.375 R + 0.5 G + 0.125 B
(Please note: I do understand that, due to color-space differences, the actual formulas used by different applications, etc., histograms tend to differ slightly from one application to another. I used Photoshop as a common example. I also compared with other image-editing software, and the differences were the same.)
Prior to applying the coefficients, you must linearize the sRGB values.
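A minimal sketch in Python of what that means, assuming 8-bit sRGB input (numpy used for brevity; names are illustrative):

```python
import numpy as np

def srgb_to_linear(c):
    # Decode the sRGB transfer curve; c is normalized to 0..1
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def relative_luminance(img):
    # img: H x W x 3 uint8 sRGB array -> linear luminance Y in 0..1
    rgb = srgb_to_linear(img.astype(np.float64) / 255.0)
    # Apply the Rec. 709 / sRGB coefficients to the *linearized* values
    return 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
```

A luminance histogram would then bin these linear Y values (or their re-encoded 8-bit form).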
The appropriate term here is luminance. Luminosity is light output over time, a term used in astronomy for the intensity of starlight.
sRGB, and most RGB color spaces used for storing or transmitting color data, encode values with a gamma or transfer curve, a.k.a. a "TRC" or tone response curve.
On the other hand, luminance is a linear measure of light intensity. While luminance is spectrally weighted according to human sensitivity to different wavelengths, it is not relative to the human perception of lightness/darkness: human lightness perception is non-linear relative to physical light intensity.
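As an illustration of that non-linearity, relative luminance Y can be mapped to CIE lightness L* (which is approximately perceptually uniform) with the standard CIELAB formula; a scalar sketch:

```python
def luminance_to_lstar(y):
    # CIE L* from relative luminance Y (white = 1.0); L* runs 0..100
    if y <= 216 / 24389:           # linear segment near black
        return y * 24389 / 27      # ~903.3 * Y
    return 116.0 * y ** (1.0 / 3.0) - 16.0
```

For instance, middle gray at about 18% luminance comes out near L* = 50, i.e. halfway up the perceptual scale.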
The reasons for using a gamma curve/TRC are well described in Dr. Poynton's "Gamma FAQ".