In JS, I can run:

/[a-z]/.test('foo');

which returns true, because the regex matches a single lowercase letter anywhere in the string.
However:
<input pattern="[a-z]" value="foo" required />
and
console.log(document.querySelector('input').checkValidity())
returns false.
Why does HTML5 checkValidity() return a different result from JavaScript regexp.test()?
A regular expression that the control's value is checked against. The pattern must match the entire value, not just some subset.
— https://developer.mozilla.org/en/docs/Web/HTML/Element/Input
So pattern="[a-z]" is equivalent to /^(?:[a-z])$/.test('foo'), which returns false: the browser wraps the pattern as ^(?:pattern)$, so the character class must match the entire value, not just one letter of it.
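A minimal sketch of the difference, runnable in the console (the anchored forms mirror the ^(?:...)$ wrapping the spec applies to the pattern attribute; [a-z]+ is one way to accept one or more lowercase letters):

```js
const value = 'foo';

// Unanchored test: true, since 'f' alone satisfies the character class.
console.log(/[a-z]/.test(value));

// Effectively what the browser checks for pattern="[a-z]":
// the whole value must be a single lowercase letter, so this is false.
console.log(/^(?:[a-z])$/.test(value));

// pattern="[a-z]+" compiles to this instead, which is true for 'foo'.
console.log(/^(?:[a-z]+)$/.test(value));
```

With pattern="[a-z]+", checkValidity() on the input above would return true.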