This is the synopsis for strspn:
#include <string.h>
size_t strspn(const char *s1, const char *s2);
This is the description and return value for POSIX.1-2001:
The strspn() function shall compute the length (in bytes) of the maximum initial segment of the string pointed to by s1 which consists entirely of bytes from the string pointed to by s2.
The strspn() function shall return the length of s1; no return value is reserved to indicate an error.
This is (almost) the same from POSIX.1-2017:
The strspn() function shall compute the length (in bytes) of the maximum initial segment of the string pointed to by s1 which consists entirely of bytes from the string pointed to by s2.
The strspn() function shall return the computed length; no return value is reserved to indicate an error.
Is it possible for an implementation of strspn to be compliant with both POSIX.1-2001 and POSIX.1-2017? How?
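A concrete case (my own example, not taken from either standard) where the two wordings give different answers:

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *s1 = "abcde";
    const char *s2 = "abc";

    /* The maximum initial segment of s1 consisting of bytes from s2 is "abc",
     * so the "computed length" reading (ISO C / POSIX.1-2017) gives 3. */
    printf("strspn(s1, s2) = %zu\n", strspn(s1, s2));

    /* A literal reading of POSIX.1-2001's "the length of s1" would instead
     * give strlen(s1) = 5. */
    printf("strlen(s1)     = %zu\n", strlen(s1));
    return 0;
}

An implementation returning 5 here would match the POSIX.1-2001 wording literally, but conflict with ISO C and POSIX.1-2017.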
This is a bug in POSIX.1-2001.
As the POSIX description of strspn says:
The functionality described on this reference page is aligned with the ISO C standard. Any conflict between the requirements described here and the ISO C standard is unintentional. This volume of IEEE Std 1003.1-2001 defers to the ISO C standard.
And the C standard (ISO 9899:1999, 7.21.5.6 The strspn function) clearly says:
The strspn function returns the length of the segment.
Newer editions of POSIX fixed the wording to make it say the same thing as the C standard, which is what was always intended. (It was apparently noticed and changed in 2006; see https://www.opengroup.org/austin/docs/austin_330.txt. The current version of strspn in POSIX refers to this (rather cryptically) as "SD5-XSH-ERN-182 is applied" in the CHANGE HISTORY section.)
As POSIX says it "defers to the ISO C standard", I believe compliant implementations must follow the C standard when there is a conflict, such as in this case.
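For illustration only, here is a minimal sketch with the ISO C semantics; the name my_strspn is just for this example, and this is not any particular libc's implementation:

#include <stddef.h>
#include <string.h>

/* Sketch of the ISO C / POSIX.1-2017 semantics: return the length of the
 * maximum initial segment of s1 consisting entirely of bytes from s2. */
size_t my_strspn(const char *s1, const char *s2)
{
    size_t n = 0;
    while (s1[n] != '\0' && strchr(s2, s1[n]) != NULL)
        n++;
    return n;   /* the computed length; no value is reserved to indicate an error */
}

The loop checks s1[n] != '\0' before calling strchr, since strchr would otherwise match the terminating null byte of s2.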