The whole concept of the io.Reader interface in Go is pure genius (the same applies to io.Writer, by the way). Recently, I wanted to inspect a .tar.gz file hosted on a server and extract a single file from it.
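
Part of what makes it so powerful is how small the interface is: the entire contract, as defined in the standard library's io package, is a single method. Anything that can produce a stream of bytes can implement it, which is exactly what makes the trick below work.

type Reader interface {
    Read(p []byte) (n int, err error)
}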

Normally, you would download the file to disk, gunzip it, untar it and then keep the file you want. By using the io.Reader interface, you can do all of this without creating any temporary files:

package main

import (
    "archive/tar"
    "compress/gzip"
    "io"
    "net/http"
    "os"
    "time"

    "github.com/pieterclaerhout/go-log"
)

func main() {

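    // The path where the extracted file will be written.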
    targetFilePath := "file-to-save"

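    // Use an HTTP client with a timeout so the download cannot hang forever.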
    httpClient := http.Client{
        Timeout: 5 * time.Second,
    }

    req, err := http.NewRequest("GET", "https://server/file.tar.gz", nil)
    log.CheckError(err)

    resp, err := httpClient.Do(req)
    log.CheckError(err)
    defer resp.Body.Close()

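    // The response body is an io.Reader, so it can be wrapped in a gzip
    // reader without writing anything to disk.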
    uncompressedStream, err := gzip.NewReader(resp.Body)
    log.CheckError(err)

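    // The gzip stream is also an io.Reader, so it can be fed straight into
    // a tar reader.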
    tarReader := tar.NewReader(uncompressedStream)

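    // Walk the tar entries until we find the file we are looking for.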
    for {

        header, err := tarReader.Next()

        if err == io.EOF {
            break
        }

        log.CheckError(err)

        if header.Name == "the-file-i-am-looking-for" {

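            // Stream the entry straight from the tar reader into the target file.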
            outFile, err := os.Create(targetFilePath)
            log.CheckError(err)
            defer outFile.Close()

            _, err = io.Copy(outFile, tarReader)
            log.CheckError(err)

            break

        }

    }

}

The beauty is in these lines:

resp, err := httpClient.Do(req)
log.CheckError(err)

uncompressedStream, err := gzip.NewReader(resp.Body)
log.CheckError(err)

tarReader := tar.NewReader(uncompressedStream)

Since the response body of an HTTP request implements the io.Reader interface, you can wrap it in a gzip.Reader, which gives you the uncompressed tar stream. That stream also implements io.Reader, so it can be "untarred" by feeding it into a tar.Reader.
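
The same chain works with any other io.Reader, not just an HTTP response body. As a quick sketch (the local file name is just a made-up example), here is the same composition reading an archive from disk and listing its entries, since *os.File implements io.Reader as well:

package main

import (
    "archive/tar"
    "compress/gzip"
    "fmt"
    "io"
    "os"

    "github.com/pieterclaerhout/go-log"
)

func main() {

    // An *os.File is an io.Reader too, so it takes the place of the
    // HTTP response body. The file name is only an example.
    f, err := os.Open("archive.tar.gz")
    log.CheckError(err)
    defer f.Close()

    // The rest of the chain is identical: gzip on top of the file,
    // tar on top of the gzip stream.
    uncompressedStream, err := gzip.NewReader(f)
    log.CheckError(err)

    tarReader := tar.NewReader(uncompressedStream)

    // This time, simply list the entries instead of extracting one.
    for {
        header, err := tarReader.Next()
        if err == io.EOF {
            break
        }
        log.CheckError(err)
        fmt.Println(header.Name)
    }

}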

You can find a full example of how this can be used in the DatabaseDownloader type of my GeoIP project.