Hello!
I have found something strange when trying to calculate bin limits for a histogram. To get the number of bins between two limits, I take the difference, divide it by the step size given in my input, and convert the result to an integer, since that is what the histogram declaration wants. However, the output is not what I expect: printed as a double, the example below gives 19, but casting it to an integer turns it into 18. Could someone explain this?
#include <iostream>
#include <iomanip>

using namespace std;

void test(){
  double sMin = 0.1;
  double sMax = 2.0;
  double step = 0.1;

  cout << "sMax before = " << sMax << endl;
  cout << "sMin before = " << sMin << endl;
  cout << "step before = " << step << endl;

  // Number of bins: cast to int vs. printed as a double
  cout << "(int)((sMax-sMin)/step) = " << (int)((sMax - sMin) / step) << endl;
  cout << "((sMax-sMin)/step) = " << ((sMax - sMin) / step) << endl;
  cout << "((sMax-sMin)/step) (WITH PRECISION 9) = " << setprecision(9) << ((sMax - sMin) / step) << endl;

  cout << "sMax after = " << sMax << endl;
  cout << "sMin after = " << sMin << endl;
  cout << "step after = " << step << endl;
}
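
To narrow it down, here is a small standalone snippet I put together (the variable name ratio is just mine for illustration, it is not from my real histogram code). It prints the quotient with 17 significant digits, so one can see whether it is exactly 19 or something just below it, and compares the plain cast with rounding via std::lround. I assume rounding would give me 19, but I would still like to understand why the truncating cast gives 18.

#include <iostream>
#include <iomanip>
#include <cmath>   // for std::lround

int main() {
    double sMin = 0.1;
    double sMax = 2.0;
    double step = 0.1;

    double ratio = (sMax - sMin) / step;

    // Show enough digits to see whether the quotient is exactly 19
    // or slightly less than 19.
    std::cout << std::setprecision(17)
              << "ratio              = " << ratio << '\n';

    // What the cast does (truncation toward zero) vs. rounding to nearest.
    std::cout << "(int)ratio         = " << static_cast<int>(ratio) << '\n';
    std::cout << "std::lround(ratio) = " << std::lround(ratio) << '\n';

    return 0;
}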