JSON Decode input buffering - Feature Proposal?

I was surprised to find that when parsing JSON into a struct from an io.Reader using Decode (encoding/json), the entire input is stored in memory rather than being read out of the Reader incrementally, as most consumers of io.Reader do.

I am reading JSON objects that I have no control over, so I expected to be able to prevent out-of-memory issues by streaming the contents through the Decoder.

After reading up, I found that Decoders are apparently not intended to work this way; instead, they are meant for streams of many separate objects arriving via the Reader.

Question 1: How do I control the memory usage of this process? Without being able to cap the memory usage, how can I run these jobs safely, or choose a safe level of concurrency?

Example

Suppose I want to fill the following struct:

type user struct {
    Name string
}

And I am given the JSON:

{"Junk": ["...large", "nested array", "of great size"], "Name": "jojo"}

Would it not be desirable to read the token Junk, realize the key is not required, and skip its value (everything up to the next comma Delim)?

Currently, Decode repeatedly resizes its internal buffer until it completely contains the raw bytes of the value BEFORE parsing out the required keys.

Question 2: Would this warrant a feature proposal at https://github.com/golang/go/issues, or is there a memory-safe way to code this that I have missed?