Int to binary string

Hi,
I want to convert an integer to a binary string. I got the following code from http://www.daniweb.com/forums/thread11049.html and it compiles fine with g++. But when I use it in a ROOT macro, it gives errors.

#include <iostream>
#include <cstring> // for strlen
using namespace std;

char *strrev(char *str) {
	char *p1, *p2;

	if (!str || !*str)
		return str;

	for (p1 = str, p2 = str + strlen(str) - 1; p2 > p1; ++p1, --p2) {
		*p1 ^= *p2;
		*p2 ^= *p1;
		*p1 ^= *p2;
	}

	return str;
}

char *itoa(int n, char *s, int b) {
	static char digits[] = "0123456789abcdefghijklmnopqrstuvwxyz";
	int i=0, sign;
    
	if ((sign = n) < 0)
		n = -n;

	do {
		s[i++] = digits[n % b];
	} while ((n /= b) > 0);

	if (sign < 0)
		s[i++] = '-';
	s[i] = '\0';

	return strrev(s);
}

//testing functionality
int main()
{
	int a=32768;
	char binary[33];
	cout<<itoa(a,binary,2)<<endl;
	return 0;
}

I included TString in the headers and replaced strlen(str) with str->Length(), but I still got errors. So in general, how do I convert an int to a binary string in ROOT?

Cheers.

You cannot just brute-force replace the plain byte type char with the complex class TString.

But you can take the result of itoa (a plain char array) and put it into a TString:

TString str(itoa(a,binary,2));

root.cern.ch/root/html526/TStrin … ng:TString%1

You can also use the C++ std::bitset class to achieve the same result with less code:

#include "TString.h"
#include <bitset>
#include <string>

int main(int argc, char * argv[])
{
  int a =2414;
  std::bitset<sizeof(a) * 8> bits(a);
  // C++ string
  std::string str(bits.to_string());
  // ROOT string
  TString tstr(bits.to_string());
  return 0;
}

Thank you!

MadCow, I tried to use the C++ bitset class as you recommended in your last post, but when I execute my code I get the error message: Symbol bitset is not defined.

Hi,

Are you missing the #include, and/or are you writing bitset rather than std::bitset?

Philippe.

Hi,
I have both the #include and std::bitset, exactly as suggested.

Hi,

humm … strange … How did you compile the code?

Philippe.

PS. Did you also try to use exactly the code provided?

Hi,
Actually this time I was trying to use the bitset class for converting a binary number to decimal. The code is given below:

#include "TString.h"
#include <bitset>
#include <iostream>
#include <string>
using namespace std;

// convert binary to decimal
void bin2dec()
{
	string binNum;
	cout << "Enter binary number (max 32 digits): ";
	cin >> binNum;
	cout << "The number translated to decimal is " << std::bitset<32>(binNum).to_ulong() << endl;
}

Hi,
So where did I go wrong with the bitset class, and how is it used?

cheers

Hi,

std::bitset (see cplusplus.com/reference/stl/bitset/bitset/) does not have a constructor that takes a string as an argument, you must pass it an unsigned long.

Cheers,
Philippe.

Hi,
From your link I found something like this:

bitset<10> third (string("01011")); // initialize from string

Furthermore, my last code for binary to int (bin2dec) compiles fine with g++, but ROOT complains that it does not know what bitset is. Could you please give a simple example showing the usage of the bitset class?

cheers

Sorry, my bad :frowning:

Indeed, bitset is currently not supported in interpreted mode. As a workaround, you can either use the ROOT class TBits or compile your script via ACLiC.

Cheers,
Philippe