I am trying to have a uint64_t bitfield that starts out with all bits set to 0. Then, when I call the function with a given string, each character that matches the static global array I have set up should flip its bit to 1. Currently I have the following code, but for some reason it behaves the same way no matter which string I give it. For example, when I input the string "ABC" it should print 111000. How would I get that behavior?
const size_t SETSIZE = sizeof(uint64_t) << 3;

char key[5] = { 'A', 'B', 'C', 'D', 'E', 'F' }

uint64_t set_encode(char *st) {
    int i, j;
    uint64_t set = 0;
    int length = strlen(st);
    for (i = 0; i < length; i++) {
        for (j = 0; j < 5; j++) {
            if (st[i] == key[j]) {
                printf("%c", st[0]);
                set = set | 1 << (SETSIZE - 1 - i);
            }
        }
    }
    printf("%lu\n", set);
    return set;
}
Multiple problems in your code:
const size_t SETSIZE = sizeof(uint64_t) << 3;
Bytes might not be 8 bits, so you should use const size_t SETSIZE = 64; instead: the type uint64_t, if present, is defined to be exactly 64 bits wide with no padding bits.
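If you do want to derive the width from the type instead of hard-coding 64, the portable bit count of a byte is CHAR_BIT from <limits.h>; a small sketch, with an optional C11 static assertion to document the expectation:

#include <limits.h>
#include <stddef.h>
#include <stdint.h>

/* sizeof() counts bytes, and a byte is CHAR_BIT bits (which may exceed 8) */
const size_t SETSIZE = sizeof(uint64_t) * CHAR_BIT;

/* C11: compile-time documentation that this really is 64 */
_Static_assert(sizeof(uint64_t) * CHAR_BIT == 64, "uint64_t must be exactly 64 bits");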
char key[5] = { 'A', 'B', 'C', 'D', 'E', 'F' }
The initializer has 6 characters but the explicit size is set to 5. Use char key[] = { 'A', 'B', 'C', 'D', 'E', 'F' }; instead. Note that key is not a C string, as no '\0' is present in the initializer. Note also that you are missing a ; at the end of the declaration.
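Alternatively, a string literal gives you the terminator for free, and then strchr() from <string.h> can replace the inner loop entirely. A minimal sketch of that variant (the name key and the surrounding shape are just carried over from your code):

#include <string.h>   /* strchr */

const char key[] = "ABCDEF";   /* 7 bytes: six letters plus the terminating '\0' */

/* ...then the inner loop over key inside set_encode can become: */
if (strchr(key, st[i]) != NULL) {   /* safe: st[i] is never '\0' here since i < length */
    printf("%c", st[i]);
    set |= (uint64_t)1 << i;
}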
uint64_t set_encode(char *st) {
You do not modify the string pointed to by st, so use const char *st instead.
int i, j;
i and j should be defined as size_t for consistency with length.
uint64_t set = 0;
int length = strlen(st);
strlen() returns size_t, because the length of a string might exceed the range of int. In your particular case this is not fundamental, as the function is only useful for strings with at most 64 characters, which you should also test for.
for (i = 0; i < length; i++) {
for (j = 0; j < 5; j++) {
j should be compared to sizeof(key).
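That comparison works here because each element of key is one byte; as a side note, the general element-count idiom, which also works for wider element types, is:

for (j = 0; j < sizeof(key) / sizeof(key[0]); j++)   /* element count of any array */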
if (st[i] == key[j]) {
printf("%c", st[0]);
You probably want to print st[i] instead of st[0].
set = set | 1 << (SETSIZE - 1 - i);
It would probably be more consistent to use the bits from the lowest value to the highest value, and 1 must be cast to (uint64_t) to avoid arithmetic overflow on int for strings longer than 31 characters (if int is 32 bits wide): set = set | (uint64_t)1 << i;. Note however that even with the cast, the shift operation is still undefined for shift amounts larger than 63 or negative.
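As a side note, <stdint.h> also provides the UINT64_C() macro for spelling a 64-bit constant, which some find more explicit than the cast; this sketch is equivalent to the cast above:

#include <stdint.h>

set |= UINT64_C(1) << i;   /* i must stay in 0..63, as noted above */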
}
}
}
printf("%lu\n", set);
set is not necessarily a long. You can print it as an unsigned long long, which is at least 64 bits wide: printf("%llu\n", (unsigned long long)set); Or you can use the format specifiers from <inttypes.h>: printf("%"PRIu64"\n", set);
return set;
}
Here is a corrected version:
#include <stdint.h>
#include <stdio.h>
#include <string.h>

const size_t SETSIZE = 64;

char key[] = { 'A', 'B', 'C', 'D', 'E', 'F' };

uint64_t set_encode(const char *st) {
    uint64_t set = 0;
    size_t length = strlen(st);

    if (length > SETSIZE) {
        printf("string too long: %zu bytes\n", length);
        length = SETSIZE;
    }
    for (size_t i = 0; i < length; i++) {
        for (size_t j = 0; j < sizeof(key); j++) {
            if (st[i] == key[j]) {
                printf("%c", st[i]);
                set |= (uint64_t)1 << i;
            }
        }
    }
    printf("%llu\n", (unsigned long long)set);
    return set;
}
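Since the question expected output like 111000 rather than a decimal value, note that printf has no portable binary conversion (C23 adds %b, but support is not yet universal), so you have to print the bits yourself. A minimal sketch of a helper that prints the lowest n bits, bit 0 first; set_encode("ABC") followed by set_print(set, 6) would then show 111000 (set_print is a hypothetical name, not a standard function):

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

void set_print(uint64_t set, size_t n) {
    for (size_t i = 0; i < n; i++)
        putchar((set >> i) & 1 ? '1' : '0');   /* bit i maps to character i of the input */
    putchar('\n');
}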