I have a string containing an integer (which has been read from a file).
I’m trying to convert the string to an int using strconv.ParseInt().
ParseInt requires that I provide a bitsize (bit sizes 0, 8, 16, 32, and 64 correspond to int, int8, int16, int32, and int64).
The integer read from the file is small (i.e. it should fit in a normal int). If I pass a bitsize of 0, however, I get a result of type
int64 (presumably because I’m running on a 64-bit OS).
Why is this happening? How do I just get a normal int? (If someone has a quick primer on when and why I should use the different integer types, that would be awesome!)
Edit: I can convert the int64 to a normal int using int(i64_var). But I still don’t understand why ParseInt() is giving me an int64 when I’m requesting a bitsize of 0.