Since any RGB value is just a 24-bit integer, it makes sense to express a color as a single hexadecimal number. And since the agreed-upon prefix for hex numbers is 0x, why did we instead introduce a new prefix (#) for colors?
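To make the premise concrete, here is a minimal C sketch (0xFF8800 is just an arbitrary example color) showing the three channels packed into, and extracted from, a single 24-bit integer:

    #include <stdio.h>

    int main(void) {
        unsigned int color = 0xFF8800;           /* 24-bit RGB packed as 0xRRGGBB */
        unsigned int r = (color >> 16) & 0xFF;   /* 0xFF */
        unsigned int g = (color >> 8)  & 0xFF;   /* 0x88 */
        unsigned int b =  color        & 0xFF;   /* 0x00 */
        printf("r=%02X g=%02X b=%02X\n", r, g, b);
        return 0;
    }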
As far as I know, the # hex notation derives from X11 (the X Window System), which dates back to 1987, before Linux and the WWW even existed.
Here's the relevant page from the Xlib manual (it documents the #RGB, #RRGGBB, #RRRGGGBBB and #RRRRGGGGBBBB forms accepted by XParseColor):
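For reference, this is the parsing that Xlib exposes through XParseColor(). A minimal sketch, assuming a running X server and the Xlib development headers (link with -lX11):

    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);   /* connect to the default X server */
        if (!dpy) return 1;
        XColor c;
        /* the same # notation the Xlib manual documents */
        if (XParseColor(dpy, DefaultColormap(dpy, DefaultScreen(dpy)), "#ff00ff", &c))
            printf("red=%04hx green=%04hx blue=%04hx\n", c.red, c.green, c.blue);
        XCloseDisplay(dpy);
        return 0;
    }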
The reason it was ultimately chosen is likely that Unix-based systems were the OS of choice in academia at the time: people were familiar with that notation and preferred using something already supported over reinventing the wheel.
When the first web browsers came out, pages were just plain text. Some browsers gave you the option to set the background and text colors, but that was a browser preference, not part of the web page itself. Still, some people wanted background colors:
"Mosaic has the capability to change the background color of a page in the prefs. I asked NCSA if it was possible for their program to read a "page color" tag, and then set the page color appropriately. They told me that would take a change in HTML specs and only then will they support it."
Bradford Bohonus (Sep 22 1994)
Of course, the whole idea behind HTML was to separate structure from presentation, so that wasn't going to happen. The idea of stylesheets appeared soon after, prior to the release of HTML 2.0 in 1995.
There were numerous proposals for stylesheets, and one early proposal actually did use the 0x hex notation:
Rob Raisch (raisch@ora.com)
Thu, 10 Jun 1993 15:23:20 -0400
...
(fo) foreground= COLOR DEFAULT:''
Describes the recommended foreground color representation
for a character. Colors are specified as 'inherit', text
names, (eg. black, white, magenta), or as RGB color
values in hexidecimal, (eg. 0x000000, 0xffffff, 0xff00ff)
(ba) background= COLOR DEFAULT:''
Describes the recommended background color representation
for a character. Colors are specified as 'inherit', text
names, (eg. black, white, magenta), or as RGB color
values in hexidecimal, (eg. 0x000000, 0xffffff, 0xff00ff)
...
Another proposal, "Style Sheets for HTML" (Joe English, 1994), suggested the following:
All the colors used by a style sheet must be declared in the (optional) COLORS element. This element contains one or more COLOR elements, each of which specifies a single color.
COLOR elements have two required attributes: ID, a unique identifier, and RGB, which defines the color. Colors are referenced by ID.
Colors are defined by their red, green, and blue components using the X11 hex notation: a pound sign followed by 3, 6, 9 or 12 hexadecimal digits. The digits are interpreted as three groups of 1, 2, 3 or 4 half-bytes, the first specifying the red component, the second green, and the third blue. Hex digits A through F may be upper or lower case.
<colors>
  <color id=red rgb="#F00">
  <color id=green rgb="#00FF00">
  <color id=blue rgb="#000000000FFFFF">
  <color id=grey rgb="#c0c0c0">
  <color id=white rgb="#FFFFFF">
</colors>
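The grouping rule quoted above is mechanical enough to sketch in C. The parse_rgb helper below is hypothetical, not from any spec; it splits 3, 6, 9, or 12 hex digits into three equal groups and left-aligns each in a 16-bit channel, which is how X11 historically treated the short # forms (CSS's later #RGB shorthand instead replicates digits, so #F00 means #FF0000):

    #include <stdio.h>
    #include <string.h>

    static int hexval(char c) {
        if (c >= '0' && c <= '9') return c - '0';
        if (c >= 'a' && c <= 'f') return c - 'a' + 10;
        if (c >= 'A' && c <= 'F') return c - 'A' + 10;
        return -1;
    }

    /* Split 3, 6, 9 or 12 hex digits into three equal groups, one per
       channel, left-aligned in 16 bits. Returns 0 on success. */
    static int parse_rgb(const char *s, unsigned short out[3]) {
        if (*s++ != '#') return -1;
        size_t len = strlen(s);
        if (len != 3 && len != 6 && len != 9 && len != 12) return -1;
        size_t n = len / 3;                      /* half-bytes per channel */
        for (int ch = 0; ch < 3; ch++) {
            unsigned v = 0;
            for (size_t i = 0; i < n; i++) {
                int d = hexval(s[ch * n + i]);
                if (d < 0) return -1;
                v = (v << 4) | (unsigned)d;
            }
            out[ch] = (unsigned short)(v << 4 * (4 - n));
        }
        return 0;
    }

    int main(void) {
        unsigned short rgb[3];
        if (parse_rgb("#F00", rgb) == 0)         /* -> f000 0000 0000 */
            printf("red=%04hx green=%04hx blue=%04hx\n", rgb[0], rgb[1], rgb[2]);
        return 0;
    }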
Although the choice is not directly attributed to Joe English, the #RGB and #RRGGBB hex notation is ultimately what ended up in the CSS1 drafts and the final recommendation, which you can trace here: https://www.w3.org/Style/CSS/history.html.
And personally, I think it also had to do with lexical analysis: the color is parsed as text against specific rules (#RGB vs #RRGGBB), rather than read as raw numeric data the way a 0x literal is in C/C++ source code. The CSS spec drafts define the lexical rules required to parse "hexcolor":
h [0-9a-fA-F]
hexcolor #{h}{h}{h}|#{h}{h}{h}{h}{h}{h}|#{h}{h}{h}{h}{h}{h}{h}{h}{h}
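As a sketch of that token rule in action, here is the same alternation expressed as a POSIX extended regex rather than an actual lex-generated scanner (the test strings are arbitrary):

    #include <regex.h>
    #include <stdio.h>

    int main(void) {
        /* POSIX ERE equivalent of the draft's rule:
           h        [0-9a-fA-F]
           hexcolor #{h}{3} | #{h}{6} | #{h}{9}        */
        regex_t re;
        if (regcomp(&re,
                    "^#([0-9a-fA-F]{3}|[0-9a-fA-F]{6}|[0-9a-fA-F]{9})$",
                    REG_EXTENDED | REG_NOSUB) != 0)
            return 1;

        const char *tests[] = { "#F00", "#00FF00", "#F0", "0xFF00FF" };
        for (int i = 0; i < 4; i++)
            printf("%-8s %s\n", tests[i],
                   regexec(&re, tests[i], 0, NULL, 0) == 0 ? "hexcolor" : "no match");

        regfree(&re);
        return 0;
    }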
So basically they wanted a textual notation for representing color codes, and the already-existing X11 syntax fit the bill. Since hex notations like 0xCAFE, $CAFE, or CAFEh are used to represent nondescript raw data of any width, imposing stricter rules on those existing notations might have led to confusion.
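As a final illustration of that point: in C, a 0x literal carries no width or channel structure at all, so the information the # notation encodes lexically simply cannot be recovered from the number:

    #include <stdio.h>

    int main(void) {
        /* As integers, leading zeros vanish: 0xF00 and 0x000F00 are the
           same number, so a 0x literal cannot tell #F00 (CSS shorthand
           for #FF0000) apart from the much darker #000F00. */
        printf("%d\n", 0xF00 == 0x000F00);   /* prints 1 */
        return 0;
    }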