Hi,
I want to convert an integer to a binary string. I got the following code from http://www.daniweb.com/forums/thread11049.html and it compiles fine with g++, but when I use it in a ROOT macro it gives errors.
#include <iostream>
#include <cstring>   // needed for strlen
using namespace std;

// Reverse a C string in place (XOR swap)
char *strrev(char *str) {
    char *p1, *p2;
    if (!str || !*str)
        return str;
    for (p1 = str, p2 = str + strlen(str) - 1; p2 > p1; ++p1, --p2) {
        *p1 ^= *p2;
        *p2 ^= *p1;
        *p1 ^= *p2;
    }
    return str;
}

// Convert n to a string in base b (2 <= b <= 36)
char *itoa(int n, char *s, int b) {
    static char digits[] = "0123456789abcdefghijklmnopqrstuvwxyz";
    int i = 0, sign;
    if ((sign = n) < 0)
        n = -n;
    do {
        s[i++] = digits[n % b];
    } while ((n /= b) > 0);
    if (sign < 0)
        s[i++] = '-';
    s[i] = '\0';
    return strrev(s);
}

// testing functionality
int main()
{
    int a = 32768;
    char binary[33];
    cout << itoa(a, binary, 2) << endl;
    return 0;
}
I included TString in the headers and replaced strlen(str) with str->Length(), but I still got errors. So, in general, how do I convert an int to a binary string in ROOT?
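For what it's worth, one standard-library route (a sketch, assuming the interpreter supports std::bitset; the 16-bit width here is an arbitrary choice, since the bitset width must be a compile-time constant) would be:

#include <bitset>
#include <iostream>
#include <string>

int main() {
    int a = 32768;
    // Width must be a compile-time constant; 16 bits is arbitrary here
    std::string binary = std::bitset<16>(a).to_string();
    std::cout << binary << std::endl;  // prints 1000000000000000
    return 0;
}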
MadCow, I tried to use the C++ bitset class as you recommended in your last post, but when I execute my code I get the error message: Symbol bitset is not defined.
Hi,
From your link I found something like this: bitset<10> third (string("01011")); // initialize from string
Furthermore, my last code for binary to int (bin2dec) compiles fine with g++, but ROOT complains that it does not know what bitset is. Could you please post a simple example showing the usage of the bitset class?
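A minimal sketch of the kind of example being asked for (plain std::bitset usage with arbitrary sample values, not ROOT-specific code):

#include <bitset>
#include <iostream>
#include <string>

int main() {
    // int -> binary string
    std::bitset<8> bits(11);                      // 11 decimal
    std::cout << bits.to_string() << std::endl;   // prints 00001011

    // binary string -> int (the bin2dec direction)
    std::bitset<8> fromStr(std::string("01011"));
    std::cout << fromStr.to_ulong() << std::endl; // prints 11
    return 0;
}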