[weird] Anomaly in memory consumed by buffered channels

Hi all,

Here is a small benchmark to show the memory consumed by a buffered channel:

package chanbench

import "testing"

// getChan1 allocates a buffered channel; the buffer size is what I vary.
func getChan1() chan int {
    return make(chan int, 101)
}

// c keeps the channel reachable so the allocation is not optimized away.
var c chan int

func BenchmarkChan1(b *testing.B) {
    b.ReportAllocs() // or run with: go test -bench . -benchmem
    for i := 0; i < b.N; i++ {
        c = getChan1()
    }
}

Running this benchmark reports different B/op values depending on the buffer size S of the channel; here are some of them:
1 -> 112 B/op
2 -> 112 B/op
3 -> 128 B/op
4 -> 128 B/op
5 -> 144 B/op
6 -> 144 B/op
7 -> 160 B/op

From these numbers I deduced that the actual memory occupied by a buffered channel of int is given by this formula (with integer division): size(channel of S ints) = 96 + Sizeof(int) * 2 * [(S + 1) / 2]
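Written as code, the deduced formula looks like this (just a sketch, assuming a 64-bit platform where Sizeof(int) is 8):

// predictedChanSize is what I would expect make(chan int, s) to occupy:
// a fixed 96-byte header plus the buffer, with the element count rounded
// up to the next even number.
func predictedChanSize(s int) int {
    const hdr = 96  // observed fixed overhead per channel
    const elem = 8  // Sizeof(int) on a 64-bit platform
    return hdr + elem*2*((s+1)/2) // 2 * [(s+1)/2] elements
}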

Why does a channel of S ints occupy the same memory as a channel of S + 1 ints when S is odd?

That alone is strange, but now comes the really weird part:
100 -> 896 B/op

101 -> 1024 B/op
102 -> 1024 B/op
103 -> 1024 B/op
104 -> 1024 B/op

199 -> 1792 B/op
200 -> 1792 B/op
201 -> 1792 B/op
202 -> 1792 B/op

300 -> 2688 B/op
301 -> 2688 B/op

998 -> 8192 B/op
999 -> 8192 B/op
1000 -> 8192 B/op
1001 -> 8192 B/op

What is going on here? For buffered channels with more than about 100 elements there is an additional overhead; where does it come from?

Does it have something to do with scaling, or with padding of the data structure as an optimization? Or does my benchmark misbehave for larger sizes? How can this behaviour be explained?

This is due to Go’s size-segregated heap model. In order to manage memory efficiently and avoid fragmentation, the runtime splits objects into heaps according to their size, so that within each heap every object has the same size. For each allocation it rounds the requested size up to the smallest heap size that fits and allocates an item from that heap. This lets the runtime manage memory easily, at the cost of some overhead per object.
The size of each heap can be retrieved by calling runtime.ReadMemStats and inspecting the Size field of the BySize entries. Calling it gives the following numbers on my machine:
0, 8, 16, 32, 48, 64, 80, 96, 112, 128, 144, 160, 176, 192, 208, 224, 240, 256, 288, 320, 352, 384, 416, 448, 480, 512, 576, 640, 704, 768, 896, 1024, 1152, 1280, 1408, 1536, 1792, 2048, 2304, 2688, 3072, 3200, 3456, 4096, 4864, 5376, 6144, 6528, 6784, 6912, 8192, 9472, 9728, 10240, 10880, 12288, 13568, 14336, 16384, 18432, 19072
So you can see that 896, 1024, 1792, 2688 and 8192 are all on the list.
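For completeness, here is a minimal program that prints those size classes on your own machine (the exact list varies between Go versions):

package main

import (
    "fmt"
    "runtime"
)

func main() {
    var m runtime.MemStats
    runtime.ReadMemStats(&m)
    // BySize holds per-size-class allocation statistics; Size is the
    // maximum object size served by that class.
    for _, class := range m.BySize {
        fmt.Println(class.Size)
    }
}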

This is a great answer. One question though: if I allocate an object larger than 19072 bytes (the largest heap), and escape analysis decides it should go to the heap, what happens then? What is the mechanism for that?

Large objects get rounded up to the nearest multiple of the page size and are served from a heap of such-sized allocations.
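If you want to observe this, here is a benchmark sketch along the same lines as the one above (the 40000-byte size is just illustrative, and the exact B/op figure depends on the Go version and its page size):

package chanbench

import "testing"

var sink []byte

func BenchmarkLargeAlloc(b *testing.B) {
    b.ReportAllocs() // or run with -benchmem
    for i := 0; i < b.N; i++ {
        // 40000 bytes is well past the listed size classes, so the
        // allocation should be rounded up to whole runtime pages.
        sink = make([]byte, 40000)
    }
}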

What do you mean by rounded up to the page size? What if an object is larger than one page?
Could you please explain, or maybe link a short and easy tutorial on this?

I linked to one such in your other topic.
