Why does HTML think “chucknorris” is a color?


How come certain random strings produce colors when entered as background colors in HTML? For example:

<body bgcolor="chucknorris">

...produces a document with a red background across all browsers and platforms.



Interestingly, while chucknorri produces a red background as well, chucknorr produces a yellow background.



What's going on here?



It's a holdover from the Netscape days:



Missing digits are treated as 0[...]. An incorrect digit is simply interpreted as 0. For example the values #F0F0F0, F0F0F0, F0F0F, #FxFxFx and FxFxFx are all the same.



The quote is from the blog post A little rant about Microsoft Internet Explorer's color parsing, which covers the behaviour in great detail, including varying lengths of colour values, etc.



If we apply the rules in turn from the blog post, we get the following:

1. Replace all nonvalid hexadecimal characters with 0's:

   chucknorris becomes c00c0000000

2. Pad out to the next total number of characters divisible by 3 (11 → 12):

   c00c 0000 0000

3. Split into three equal groups, with each group representing the corresponding colour component of an RGB colour:

   RGB (c00c, 0000, 0000)

4. Truncate each of the components from the right down to two characters.

Which gives the following result:

RGB (c0, 00, 00) = #C00000, a shade of red.

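The steps above can be sketched in code. Here's a minimal Python sketch of the simplified rules from the blog post (an illustration only, not the full WHATWG algorithm, which additionally handles components longer than 8 digits and strips shared leading zeros):

```python
def parse_legacy_color(s):
    """Sketch of the simplified legacy bgcolor parsing rules
    described above; not the full WHATWG algorithm."""
    hex_digits = "0123456789abcdefABCDEF"
    # 1. Replace every non-hex character with '0'
    s = "".join(c if c in hex_digits else "0" for c in s)
    # 2. Pad with '0' until the length is divisible by 3
    s += "0" * (-len(s) % 3)
    # 3. Split into three equal components (red, green, blue)
    n = len(s) // 3
    r, g, b = s[:n], s[n:2 * n], s[2 * n:]
    # 4. Truncate each component from the right down to two characters
    return "#" + r[:2] + g[:2] + b[:2]

print(parse_legacy_color("chucknorris"))  # #c00000
print(parse_legacy_color("chucknorr"))    # #c0c000
```

Note how the same sketch also reproduces the quoted Netscape-era equivalence: F0F0F, #FxFxFx and FxFxFx all come out as #F0F0F0.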

Here's an example demonstrating the bgcolor attribute in action, to produce this "amazing" colour swatch:





This also answers the other part of the question: why does bgcolor="chucknorr" produce a yellow colour? Well, if we apply the rules, the string is:

c00c00000 → RGB (c00, c00, 000) → RGB (c0, c0, 00)

Which gives #C0C000, a light yellow-gold colour. As the string starts off as 9 characters, we keep the second C this time around, hence it ends up in the final colour value.



I originally encountered this when someone pointed out you could do color="crap" and, well, it comes out brown.



I'm sorry to disagree, but according to the rules for parsing a legacy color value posted by @Yuhong Bao, chucknorris DOES NOT equate to #CC0000, but rather to #C00000, a very similar but slightly different hue of red. I used the Firefox ColorZilla add-on to verify this.



Applying the rules for parsing a legacy colour value, I was able to correctly interpret several other strings in the same way.



UPDATE: The original answerers who said the color was #CC0000 have since edited their answers to include the correction.



Most browsers will simply ignore any non-hex characters in your colour string, substituting them with zeros.

chucknorris translates to c00c0000000, which is then padded with zeros to a length divisible by three. At this point, the browser divides the string into three equal sections, indicating the red, green and blue values: c00c 0000 0000. The extra characters in each section are dropped, which makes the final result #c00000, a reddish colour.

Note that this does not apply to CSS colour parsing, which follows the CSS standard.



Because chucknorris is not a valid colour value, the browser tries to coerce it into a hexadecimal colour code.



This seems to be an issue primarily with Internet Explorer and Opera (12) as both Chrome (31) and Firefox (26) just ignore this.



P.S. The numbers in brackets are the browser versions I tested on.






On a lighter note



Chuck Norris doesn't conform to web standards. Web standards conform to him. #BADA55



The WHATWG HTML spec has the exact algorithm for parsing a legacy color value:
https://html.spec.whatwg.org/multipage/infrastructure.html#rules-for-parsing-a-legacy-colour-value



The code Netscape Classic used for parsing color strings is open source:
https://dxr.mozilla.org/classic/source/lib/layout/layimage.c#155



For example, notice that each character is parsed as a hex digit and then shifted into a 32-bit integer without checking for overflow. Only eight hex digits fit into a 32-bit integer, which is why only the last 8 characters are considered. After parsing the hex digits into 32-bit integers, they are then truncated into 8-bit integers by dividing them by 16 until they fit into 8 bits, which is why leading zeros are ignored.
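As a rough illustration of that behaviour, here's a hypothetical Python sketch of the per-component logic described above (an approximation, not a line-for-line port of the C code):

```python
def netscape_component(component):
    """Approximate the Netscape Classic per-component behaviour
    described above: hex digits are shifted into a 32-bit accumulator
    (overflow silently keeps only the last 8 digits), then the value
    is divided by 16 until it fits into 8 bits."""
    value = 0
    for ch in component:
        digit = int(ch, 16) if ch in "0123456789abcdefABCDEF" else 0
        # Shift in the next digit; the mask mimics 32-bit overflow
        value = ((value << 4) | digit) & 0xFFFFFFFF
    # Truncate to 8 bits by repeated division; leading zeros vanish
    while value > 0xFF:
        value //= 16
    return value

print(hex(netscape_component("c00c")))    # 0xc0
print(hex(netscape_component("0000c0")))  # 0xc0
```

Both calls give the same 0xc0, showing why leading zeros are ignored and why c00c truncates to c0.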



Update: this code does not exactly match what is defined in the spec, but the only difference there is a few lines of code, which I believe were added later (in Netscape 4).



Answer:



The reason is that the browser cannot understand chucknorris as a colour, so it tries to translate it into something it can understand: in this case, a hexadecimal value.

chucknorris starts with c, which is a recognised hexadecimal character, and the browser converts every unrecognised character into 0.

So chucknorris in hexadecimal format becomes c00c0000000, padded with a zero to c00c00000000: all the other characters become 0, and the c's remain where they are.

Now the string gets divided into three groups for RGB (red, green, blue): R: c00c, G: 0000, B: 0000.

But we know a valid hexadecimal RGB component is just two characters, meaning R: c0, G: 00, B: 00.

So the real result is #C00000.



I also did the steps in the image as a quick reference:






chucknorris starts with c, which the browser reads as a hexadecimal character, because a, b, c, d, e and f are all valid hexadecimal digits.

The browser converts chucknorris to the hexadecimal value c00c00000000: every non-hex character becomes 0, and the result is padded to a length divisible by three.

That hexadecimal value is then converted to RGB format (divided into three):

c00c00000000 => R: c00c, G: 0000, B: 0000

The browser needs only 2 digits to indicate each colour component:

R: c00c, G: 0000, B: 0000 => R: c0, G: 00, B: 00 => c00000

Finally, the browser shows bgcolor = c00000 in the web browser.








