Tags: c++, string, binary, bitset

Why is bitset not returning the correct decimal value for binary inputs like 1110 or 1011?


Here is my code that takes a 4-character string of 1's and 0's and converts it to decimal using bitset. It returns correct values for all combinations except those involving 11's and 10's, like {1110, 1010, 1011, 1111}. For these numbers it returns the result with the MSB ignored; that is, for 1010 it gives 2 as the answer.

#include<bits/stdc++.h>
using namespace std;
#define ul unsigned long

int main(int argc, char const *argv[])
{
    int bin1=0,bin2=0,choice=0;
    ul x1=0,x2=0;
    //string binary;
    cin>>bin1;
    x1=bitset<4>(bin1).to_ulong();
    cin>>bin2;  
    x2=bitset<4>(bin2).to_ulong();
    cout<<x1<<" "<<x2<<endl;
    return 0;

}

EDIT: here is a snapshot of my results

[screenshot of the incorrect output]

Another snapshot of the same program reading another set of inputs, but this time it gives the correct output. The inputs are 1101 and 1001, and the next two lines are the output.

[screenshot of the correct output for 1101 and 1001]


Solution

  • You need the string overload of the bitset's constructor.

    template<class CharT, class Traits, class Alloc>
    
    explicit bitset( const std::basic_string<CharT,Traits,Alloc>& str,
                     typename std::basic_string<CharT,Traits,Alloc>::size_type pos = 0,
                     typename std::basic_string<CharT,Traits,Alloc>::size_type n =
                         std::basic_string<CharT,Traits,Alloc>::npos,
                     CharT zero = CharT('0'),
                     CharT one = CharT('1'));
    

    Right now bitset<4>(bin1) calls the unsigned long constructor: for the input 1010, bin1 holds the decimal number 1010 (binary 1111110010), and only its four lowest bits, 0010, are kept, which is why you see 2. From your use case it looks like simply changing the type of bin1 and bin2 to std::string should work.
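
    For illustration, here is a minimal sketch of the corrected program (variable names kept from your snippet, assuming the input is typed as the binary digits themselves, e.g. 1010, and fits in 4 bits):

    #include <bitset>
    #include <iostream>
    #include <string>
    using namespace std;

    int main()
    {
        // Read the digits as text, e.g. "1010", instead of as the decimal number 1010.
        string bin1, bin2;
        cin >> bin1;
        cin >> bin2;

        // The string constructor treats each character as one bit,
        // so "1010" yields the value 10.
        unsigned long x1 = bitset<4>(bin1).to_ulong();
        unsigned long x2 = bitset<4>(bin2).to_ulong();

        cout << x1 << " " << x2 << endl;
        return 0;
    }

    Note that this constructor throws std::invalid_argument if the string contains any character other than the zero and one characters (by default '0' and '1'), so malformed input fails loudly instead of being silently misread.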