Tags: go, http, upload, progress

PUT upload a file's byte range with streams and progress


I just got started with Go and need some help. I would like to upload a certain range of bytes from a file.

I already accomplished this by reading the bytes into a buffer, but that increases memory usage. Instead of reading the bytes into memory, I want to stream them while uploading and track the upload progress. I did something like this in Node.js, but I am struggling to put the pieces together in Go. The code that I have now looks like this:

func uploadChunk(id, mimeType, uploadURL, filePath string, offset, size uint) {
    // open file
    file, err := os.Open(filePath)
    panicCheck(err, ErrorFileRead) // custom error handler
    defer file.Close()

    // move to the first byte of the chunk
    _, err = file.Seek(int64(offset), io.SeekStart)
    panicCheck(err, ErrorFileRead)

    // read the whole chunk into an in-memory buffer
    buffer := make([]byte, size)
    _, err = io.ReadFull(file, buffer)
    panicCheck(err, ErrorFileRead)
    fileReader := bytes.NewReader(buffer)

    request, err := http.NewRequest(http.MethodPut, uploadURL, fileReader)
    panicCheck(err, ErrorFileRead)

    client := &http.Client{
        Timeout: time.Second * 10,
    }

    response, err := client.Do(request)
    panicCheck(err, ErrorFileRead)

    defer response.Body.Close()

    b, err := httputil.DumpResponse(response, true)
    panicCheck(err, ErrorFileRead)
    fmt.Println("response\n", string(b))
}

Could you help me figure out how to stream the upload and track its progress?

Thanks


Solution

  • You can wrap the file with io.LimitReader so that only the number of bytes you want is read from it. The reader returned by io.LimitReader is an *io.LimitedReader.

    file.Seek(int64(offset), io.SeekStart)
    fileReader := io.LimitReader(file, int64(size))
    
    request, err := http.NewRequest(http.MethodPut, uploadURL, fileReader)
    

    And for S3 you will want to ensure that you don't use chunked encoding by explicitly setting the ContentLength:

    request.ContentLength = int64(size)
    

    As for tracking upload progress, see Go: Tracking POST request progress; a sketch that puts it together with the pieces above follows.
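
    Putting this together, here is a minimal sketch of a streaming upload with progress reporting. The progressReader type, the uploadChunkStreaming function, and the values used in main are hypothetical names chosen for illustration, not part of any library; the idea is simply to wrap the limited reader in an io.Reader that counts bytes as the HTTP client consumes the body.

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "net/http/httputil"
        "os"
        "time"
    )

    // progressReader is a small io.Reader wrapper that counts the bytes read
    // from the underlying reader and reports them through a callback.
    // (Hypothetical helper type, not part of the standard library.)
    type progressReader struct {
        r        io.Reader
        total    int64
        read     int64
        onChange func(read, total int64)
    }

    func (p *progressReader) Read(b []byte) (int, error) {
        n, err := p.r.Read(b)
        p.read += int64(n)
        if p.onChange != nil {
            p.onChange(p.read, p.total)
        }
        return n, err
    }

    // uploadChunkStreaming seeks to offset, streams at most size bytes of the
    // file as the PUT body, and reports progress while the HTTP client reads
    // the body. (Hypothetical function name and signature.)
    func uploadChunkStreaming(uploadURL, filePath string, offset, size int64) error {
        file, err := os.Open(filePath)
        if err != nil {
            return err
        }
        defer file.Close()

        // Move to the first byte of the chunk.
        if _, err := file.Seek(offset, io.SeekStart); err != nil {
            return err
        }

        // Limit the read to the chunk size and wrap it to report progress.
        body := &progressReader{
            r:     io.LimitReader(file, size),
            total: size,
            onChange: func(read, total int64) {
                fmt.Printf("\ruploaded %d/%d bytes (%.1f%%)", read, total, 100*float64(read)/float64(total))
            },
        }

        request, err := http.NewRequest(http.MethodPut, uploadURL, body)
        if err != nil {
            return err
        }
        // Declare the exact body length so the client does not fall back to
        // chunked transfer encoding (important for S3).
        request.ContentLength = size

        client := &http.Client{Timeout: 10 * time.Second}
        response, err := client.Do(request)
        if err != nil {
            return err
        }
        defer response.Body.Close()

        dump, err := httputil.DumpResponse(response, true)
        if err != nil {
            return err
        }
        fmt.Println("\nresponse\n", string(dump))
        return nil
    }

    func main() {
        // Hypothetical values, purely for illustration.
        if err := uploadChunkStreaming("https://example.com/upload", "video.mp4", 0, 1<<20); err != nil {
            fmt.Println("upload failed:", err)
        }
    }

    Because the client reads the request body as it writes it to the connection, the callback fires roughly as bytes go out on the wire, modulo the transport's internal buffering.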