I'm trying to add functionality to a serial port web app which can represent all received bytes from 0x00 to 0xFF (including typically unprintable characters, e.g. ASCII control codes) as a single-width visible glyph. I've created a special vector font (in .woff2 format) which has visible glyphs assigned to all of these code points, as shown:
I have mostly got it working. However, when trying to show these inside a span element in the HTML of the web app, some are not working correctly, as shown below (which is just 0x00, then 0x01, then 0x02, etc.):
One example is the horizontal tab, code point 0x09, which should print after the [BS]. Rather than the HT glyph being shown, the browser is inserting whitespace for the tab (which I think ends up collapsing down to a single space because of HTML rules). This is the space indicated by the first left arrow.
If it matters, the UI consists of a span element whose child text is set by calling String.fromCharCode(0x09) (this is for the HT char in particular). The CSS assigns my custom font to this span.
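In case it helps, here is a stripped-down sketch of what I'm doing (the element, class, and font names are illustrative, not my real ones):

```js
// The span gets my custom font via a CSS class, e.g.
// .glyph-font { font-family: "ControlGlyphs", monospace; }
// where "ControlGlyphs" is loaded from the .woff2 via @font-face.
const span = document.createElement("span");
span.className = "glyph-font";
document.body.appendChild(span);

// Each received byte is appended to the span as a character:
span.textContent += String.fromCharCode(0x09); // HT: rendered as whitespace, not my glyph
```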
Is there a way to prevent the browser from interpreting special Unicode code points (e.g. horizontal tab) and just tell it to render the glyph from the font instead?
Tabs are converted to spaces under CSS (white-space) rules, not HTML rules, but even so an unconverted tab character is going to mean "advance to the next tab stop", not "render the glyph".
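As a quick sketch of why neither behaviour gives you what you want (assuming a span like the one in your question):

```js
// Default white-space handling: the tab collapses into ordinary whitespace.
span.textContent = String.fromCharCode(0x09);

// Preserving it doesn't help either:
span.style.whiteSpace = "pre"; // the tab survives, but is rendered as an
                               // advance to the next tab stop, not as the
                               // font's glyph for U+0009
```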
I don't think there's anything in CSS to control this. However, browsers can render the full Unicode space, not just 0x00 to 0xFF, so I suggest adding a 0xE000 offset to each received character so that the code points land in the Unicode Private Use Area. Then construct your font to place the glyphs in that range instead.
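For example, something along these lines (a minimal sketch, assuming the received data arrives as a Uint8Array; the names are illustrative):

```js
const PUA_OFFSET = 0xE000; // start of the Unicode Private Use Area (U+E000–U+F8FF)

// Map each received byte 0x00–0xFF to U+E000–U+E0FF, where the font
// would place its glyphs instead of at the ASCII code points.
function bytesToGlyphString(bytes) {
  return Array.from(bytes, b => String.fromCharCode(PUA_OFFSET + b)).join("");
}

// The tab byte now maps to U+E009, which the browser treats as an
// ordinary printable character and renders with the font's glyph:
span.textContent += bytesToGlyphString(new Uint8Array([0x08, 0x09, 0x0A]));
```

Because the Private Use Area has no assigned semantics, the browser applies no special handling (whitespace collapsing, tab stops, line breaking, etc.) to those code points.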