Tags: go, base64, out-of-memory, pprof

Memory consumption when encoding base64


I have problems with memory consumption in my software, which uses Go's encoding/base64 package.

My software splits a video file into separate images (gocv Mats), converts them to base64 strings, and saves them to a file in JSON format.

During testing I found that memory usage keeps piling up until the OOM reaper kills the process.

Investigation with pprof showed that the memory allocated by encoding/base64 seems to pile up.

I took pprof snapshots after each image frame; the allocated memory of encoding/base64 rises from 976.89kB (flat) to 4633.54kB (flat) shortly before the OOM reaper kills the process.
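
(For reference, a minimal sketch of how such per-frame heap snapshots can be written with the standard runtime/pprof package; the helper name and file naming here are only illustrative, not the exact code I used:)

import (
    "fmt"
    "os"
    "runtime"
    "runtime/pprof"
)

// dumpHeapProfile writes a heap snapshot to a numbered file after a frame
// has been processed, so the profiles can be compared over time.
func dumpHeapProfile(frame int) error {
    f, err := os.Create(fmt.Sprintf("heap_%04d.pprof", frame))
    if err != nil {
        return err
    }
    defer f.Close()
    runtime.GC() // run a GC so the profile reflects the current live heap
    return pprof.WriteHeapProfile(f)
}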

Beginning:
      flat  flat%   sum%        cum   cum%
  976.89kB 32.29% 32.29%   976.89kB 32.29%  encoding/base64.(*Encoding).EncodeToString
  512.50kB 16.94% 49.23%   512.50kB 16.94%  runtime.allocm
  512.20kB 16.93% 66.15%   512.20kB 16.93%  runtime.malg
  512.05kB 16.92% 83.08%  1488.94kB 49.21%  runtime.main
     512kB 16.92%   100%      512kB 16.92%  time.resetTimer (inline)
         0     0%   100%   976.89kB 32.29%  main.Process

End:
Showing nodes accounting for 6170.44kB, 100% of 6170.44kB total
      flat  flat%   sum%        cum   cum%
 4633.54kB 75.09% 75.09%  4633.54kB 75.09%  encoding/base64.(*Encoding).EncodeToString
 1024.41kB 16.60% 91.69%  1024.41kB 16.60%  runtime.malg
  512.50kB  8.31%   100%   512.50kB  8.31%  runtime.allocm
         0     0%   100%  4633.54kB 75.09%  main.Process

The list command shows me the corresponding code:

(pprof) list encoding/base64
Total: 2.95MB
ROUTINE ======================== encoding/base64.(*Encoding).EncodeToString in /usr/local/go/src/encoding/base64/base64.go
  976.89kB   976.89kB (flat, cum) 32.29% of Total
         .          .    175:
         .          .    176:// EncodeToString returns the base64 encoding of src.
         .          .    177:func (enc *Encoding) EncodeToString(src []byte) string {
         .          .    178:   buf := make([]byte, enc.EncodedLen(len(src)))
         .          .    179:   enc.Encode(buf, src)
  976.89kB   976.89kB    180:   return string(buf)
         .          .    181:}
         .          .    182:
         .          .    183:type encoder struct {
         .          .    184:   err  error
         .          .    185:   enc  *Encoding

So in my Go code, the corresponding line was:

func Process(img gocv.Mat) (myImage Images) {

    detectImg, detectClass, detectBoxes := Detect(&net,
                                           img.Clone(),
                                           0.45, 0.5,
                                           OutputNames, classes)
    defer detectImg.Close()

    // convert gocv.Mat to []byte
    myImg, _ := detectImg.ToImage()
    myJPG := new(bytes.Buffer)
    jpeg.Encode(myJPG, myImg, &jpeg.Options{Quality: 95})
    myBytes := myJPG.Bytes()

    // memory consuming
    encodedString := base64.StdEncoding.EncodeToString(myBytes)

    // [...]

    return myImage
}

How can I release the memory of "encodedString" in this case so that it does not pile up? (Update: answers say this is neither necessary nor possible.)
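
(Update, for reference only, since the answers show this was not the real problem: the string cannot be freed explicitly in Go; the garbage collector reclaims it once it is unreachable. If the per-frame allocation itself were the concern, one could reuse a single scratch buffer across frames with Encoding.Encode instead of EncodeToString. A sketch; bufB64 and encodeFrame are hypothetical names:)

import "encoding/base64"

// bufB64 is a scratch buffer reused across frames so that no fresh encoded
// string is allocated for every image.
var bufB64 []byte

func encodeFrame(myBytes []byte) []byte {
    n := base64.StdEncoding.EncodedLen(len(myBytes))
    if cap(bufB64) < n {
        bufB64 = make([]byte, n)
    }
    bufB64 = bufB64[:n]
    base64.StdEncoding.Encode(bufB64, myBytes)
    return bufB64 // only valid until the next call; copy it if it must live longer
}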

Or is it maybe not my code that is at fault, and the memory leak is in the base64 package? (Update: answers say this is definitely not the case.)


Solution

  • My question above was asked from a totally wrong angle.

    Base64 was not the problem at all; it was just the top memory consumer displayed by pprof, which led me to the faulty conclusion that base64 was the problem.

    I assumed that pprof would show me all the memory consumption of my Go program, including gocv. gocv is a C wrapper around OpenCV, and its memory consumption is not visible to pprof because it is allocated in C code (I did not know that when I asked the question). The memory pprof reports does not include what is used by C-wrapper libraries for Go such as gocv, so the biggest part of the memory consumption was not visible to Go at all. The helpful hint from JimB was:

    Seeing how you are using a Go wrapper around opencv, the memory you are concerned with is probably not even allocated by Go. In that case you do need to ensure that everything is properly closed or released according to their documentation, because the bulk of the work is done in C++, not Go. Even if you are cleaning up properly however, you still need to be aware of your memory limitations and ensure you are not trying to hold too much data at any given point.
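
    One way to see this (a sketch, not part of the original question) is to compare what the Go runtime thinks it has allocated against the resident set size the kernel reports for the process; if RSS is far larger than the Go heap, the memory lives outside Go, for example in OpenCV behind the gocv wrapper. This assumes Linux and reads /proc/self/status:

    import (
        "fmt"
        "os"
        "runtime"
        "strings"
    )

    // printMemComparison prints the Go heap size next to the process RSS.
    func printMemComparison() {
        var ms runtime.MemStats
        runtime.ReadMemStats(&ms)
        fmt.Printf("Go HeapAlloc: %d kB\n", ms.HeapAlloc/1024)

        status, err := os.ReadFile("/proc/self/status") // Linux only
        if err != nil {
            return
        }
        for _, line := range strings.Split(string(status), "\n") {
            if strings.HasPrefix(line, "VmRSS:") {
                fmt.Println("Process", line) // e.g. "Process VmRSS:  123456 kB"
            }
        }
    }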

    Once I cleaned up the gocv objects, memory consumption went down significantly. I used this to close the objects:

    defer obj.Close()
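
    For illustration, here is a sketch of how that cleanup could look in the Process function from the question (it assumes Detect does not close or take ownership of the Mat it receives; adjust to your own Detect implementation):

    func Process(img gocv.Mat) (myImage Images) {
        // Keep a handle on the clone so it can be released; passing
        // img.Clone() inline leaves nothing to call Close() on.
        clone := img.Clone()
        defer clone.Close()

        detectImg, detectClass, detectBoxes := Detect(&net, clone, 0.45, 0.5, OutputNames, classes)
        defer detectImg.Close()

        // ... JPEG and base64 encoding as in the question ...
        _, _ = detectClass, detectBoxes

        return myImage
    }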