I recently came across a funny bug in our project at work, caused by some confusion about how IPv4 addresses are represented in Go's net package.
In the code we get the IP address from a net.UDPAddr:
var addr net.UDPAddr = ...
ip := addr.IP
But this IP value, when it holds an IPv4 address, is only 4 bytes long.
However, if you take an IPv4 string and call net.ParseIP
on it, you get back a 16-byte slice (the IPv4-in-IPv6 representation).
This caused us problems when we serialized the byte slices of IPs parsed with net.ParseIP
and compared them to the serialized form of an IP taken from the UDPAddr object.
Similar to this:
parsedIP := net.ParseIP("1.1.1.1") // 16-byte slice
udpIP := udpAddr.IP                // 4-byte slice for an IPv4 address
if string(parsedIP) == string(udpIP) {
    // never reached for IPv4: the slice lengths differ
    ...
}
Since we serialize the byte slices by converting them with string(),
the results will never be equal: the underlying slices have different lengths.
The proper way to compare them is the net.IP.Equal method, which treats an IPv4 address and the same address in IPv6 form as equal; alternatively, calling .String()
on each of them also works.
Why does net default to representing IPv4 addresses as 16-byte slices, instead of using the 4-byte form?
Wouldn't 4 bytes be a more sensible default?