Perhaps I need to take this to the Language Definition forum. These functions do not conform to the specification: bit size 0 is supposed to return an “int”; whether that is 32 or 64 bits is machine dependent, but it is supposed to be an “int”.
The bit size is requested because a string can express any value, including values beyond the range of any integer type, even int64. You need to tell ParseInt the bit-size limit of the result you expect so that it can convert the string properly for you. For example:
Parsing MaxUint16 (value 65,535) with a bit size for uint8 (max 255), you will get 255 as a result instead of 65,535.
If you want to keep the real value 65,535, then you need to increase the bit size to 16 (uint16) to accommodate the actual size of the data.
NOTE: I’m using the uintX types here to keep the explanation about bit-size capacity simple.
To be fair, Go is very specific with data types and, to date, has no generics. ParseInt needs to deal with multiple intX data types of different sizes (int, int8, int16, int32, and int64) at any given time, for different customers (like you and me).
Another problem with int is that its bit size depends on the CPU architecture (which also means it is not consistent across 16-bit, 32-bit, and 64-bit CPUs). That’s why it is always better to return a single type, int64 (the largest), which the user can then convert to the intX of their choice using a simple intX(...) conversion. That’s how Atoi works (https://golang.org/src/strconv/atoi.go#258).
If the standard library were to offer int, there would be requests for all the intX types. That would also mean developers wrapping ParseInt five times to return each intX data type respectively.
That makes no sense in terms of long-term maintenance, just to cover such edge-case requirements for every customer.
To spare users (with or without low-level knowledge of bit sizes and CPU architectures) the details of converting a number to an integer, a wrapper function called Atoi was created, which is essentially ParseInt(string, 10, 0).
To answer this question: since the code is open source, you can always check by reading its source code: https://golang.org/src/strconv/atoi.go#205. You can easily switch the documentation from package form to source code by changing pkg in the URL to src.
From the source code, 0 respects your CPU’s bit size. That means if you’re on, say:
a Raspberry Pi 3 running a 32-bit OS, int is 32-bit.
an Intel i5, i7, or other 64-bit CPU, int is 64-bit.
As for the lack of description, you can contribute documentation updates to the Go source code. Please remember to read the contribution guidelines first.
Personal-opinion-wise: since I call ParseInt for a specific conversion, I am already aware of the maximum data capacity. Hence, there is no reason for me to use 0 at all and let the function pick the bit size; I always specify it explicitly (8, 16, 32, or 64).