In my Go program, I am uploading a multi-gigabyte file to Google Cloud Storage (GCS). I am using the client library here: https://pkg.go.dev/cloud.google.com/go/storage
My code is similar to the example in the documentation, where r is a file reader:
client, err := storage.NewClient(ctx)
if err != nil {
    return err
}
defer client.Close()

w := client.Bucket(l.Bucket).Object(fileName).NewWriter(ctx)
w.ContentType = ""
w.ACL = []storage.ACLRule{{Entity: storage.AllUsers, Role: storage.RoleReader}}
if _, err := io.Copy(w, r); err != nil {
    return err // ########## Error is returned on this line! ##########
}
if err := w.Close(); err != nil {
    return err
}
Small files succeed. However, large files take over an hour to upload, and my laptop goes to sleep before the upload completes. When the laptop wakes, the connection has been interrupted and the following error is returned (I scrubbed the IP addresses for privacy):
read tcp x.x.x.x:ppp->X.X.X.X:PPP: read: connection reset by peer
Instead, I want to resume the upload. From the GCS documentation, I see that resumable uploads are supported (https://cloud.google.com/storage/docs/resumable-uploads), but I am unsure how to use this functionality from the Go client.
One last important detail: I do not want to retry the upload from the beginning; I want to continue from, and retain, the progress made before the interruption.
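For reference, based on my reading of the package docs, the Writer already uses the resumable-upload protocol under the hood for anything larger than its ChunkSize, and something like the following tunes the per-chunk retry behavior. But as far as I can tell this only retries chunks within the same live session; it does not persist progress across an interruption like mine:

```go
// Sketch of what I have gathered from the package docs (I may be
// misreading what these fields guarantee). Assumes client, l.Bucket,
// fileName, and ctx from the snippet above, plus an import of "time".
w := client.Bucket(l.Bucket).Object(fileName).
    // Chunk writes are not retried by default; opt in to retrying them.
    Retryer(storage.WithPolicy(storage.RetryAlways)).
    NewWriter(ctx)
w.ChunkSize = 16 * 1024 * 1024          // upload in 16 MiB resumable chunks
w.ChunkRetryDeadline = 10 * time.Minute // keep retrying each failed chunk for up to 10 min
```

Is there a way to go further than this and actually resume the same upload session after the connection is reset?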