Tags: javascript, string, charat

How can a non-empty string's .charAt(0) return an empty string?


I'm really puzzled by this one... It's so unbelievable that I posted a screen capture of the Chrome debugger:

[Screen capture of the Chrome debugger: ca holds the expected character, while cb appears empty]

I quickly wrote this function in a test to compare two base64-encoded strings (a might be shorter than b). However, it always returns 0, as if the first characters were different. And in fact they are: the first character ca of string a is correct, but for some mysterious reason the first character cb of string b is empty! The string b looks correct, has the correct length (685 chars), and has the correct type (typeof(b) == 'string').
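The firstDiff helper itself only appears in the screen capture; here is a minimal sketch of what it presumably looks like, assuming it returns the index of the first differing character and -1 when the strings are identical (the signature is inferred from how it is called below):

function firstDiff(a: string, b: string): number {
  // Walk up to the longer length so a missing character also counts as a difference.
  const len = Math.max(a.length, b.length);
  for (let i = 0; i < len; i++) {
    const ca = a.charAt(i);
    const cb = b.charAt(i);
    if (ca !== cb) return i; // in the failing case described above this triggers at i === 0
  }
  return -1; // no difference found
}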

The calling code (actually TypeScript) is this, in case it helps:

requestGET('qBandConfig.sql', { jobid: this.job.jobid }) // get the blob from db 
    .then(json => {
      const base64:string = json[0]['bandconfig'] // type enforced to make sure
      this.editor.setBands(base64)
        .then(() => {
          // test that we find the same blob when re-encoding
          const check:string = this.editor.getBandsBLOB()
          const diff = firstDiff(check, base64) // always return 0 ????
          if (diff > -1 && check[diff] !== '=')
            this.log.warning('encoded blob ', check, ' is different from decoded ', base64)
        })
    })

Solution

  • It could contain a non-printable character such as the Zero Width Space (U+200B), known in HTML as &#8203;. I am not sure what the Chrome Dev Tools would show in such a case; it might indeed show nothing.

    Examine cb.length: if it is 0, then cb really is empty; otherwise it is not, and you can use charCodeAt() to find out what is actually there.

    Example code with Zero Width Space character:

    var x = "\u200B"; // a single Zero Width Space (U+200B), written as an escape so it is visible
    console.log("x === '" + x + "'");        // looks like an empty string
    console.log("x.length === " + x.length); // 1
    console.log("x.charCodeAt(0) === " + x.charCodeAt(0)); // 8203