Multipart Upload to Google Cloud Storage with Authlib

Uploading files to Google Cloud Storage with requests instead of the Google Python Client.

In our last post, Access Google Analytics API, I said that a Google Service Account is no different from a JWT Bearer authorization grant: what you need to do is fetch the access token with Authlib's AssertionSession. But you don't really need to fetch the token yourself, since AssertionSession handles it automatically.

First, let's create a requests session from the Google service account config file, with the scope https://www.googleapis.com/auth/cloud-platform. Remember to enable the Google Cloud Storage API in the Cloud Console.

import json
# before v0.13
from authlib.client import AssertionSession
# after v0.13
from authlib.integrations.requests_client import AssertionSession

def create_assertion_session(conf_file, scope, subject=None):
    with open(conf_file, 'r') as f:
        conf = json.load(f)

    token_url = conf['token_uri']
    issuer = conf['client_email']
    key = conf['private_key']
    key_id = conf.get('private_key_id')

    header = {'alg': 'RS256'}
    if key_id:
        header['kid'] = key_id

    # Google puts scope in payload
    claims = {'scope': scope}
    return AssertionSession(
        grant_type=AssertionSession.JWT_BEARER_GRANT_TYPE,
        token_url=token_url,
        issuer=issuer,
        audience=token_url,
        claims=claims,
        subject=subject,
        key=key,
        header=header,
    )

session = create_assertion_session(
    'your-google-conf.json',
    'https://www.googleapis.com/auth/cloud-platform',
)
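
If you want a quick sanity check that the session can fetch a token and call the API, you can list the buckets in your project. This is just a sketch; replace your-project-id with your actual project ID (the endpoint is the Cloud Storage JSON API buckets list):

# optional sanity check: list buckets to confirm the token exchange works
resp = session.get(
    'https://www.googleapis.com/storage/v1/b',
    params={'project': 'your-project-id'},  # replace with your project ID
)
print(resp.status_code)
print(resp.json())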

You can also use GoogleServiceAccount from loginpass so that you don't need to write the code above. It can be as simple as:

from loginpass.google import GoogleServiceAccount

session = GoogleServiceAccount.from_service_account_file(
    'your-google-conf.json',
    'https://www.googleapis.com/auth/cloud-platform',
)

This session is a requests session, so it has the same API as requests, such as session.get and session.post. Reading the documentation on Google's website, JSON API: Performing a Multipart Upload, let's figure out what we need to do:

  1. figure out what metadata we should send
  2. create a multipart form as the POST payload

We will use requests-toolbelt to create the multipart form, which can also send streaming data. The code will look like:

import json
from requests_toolbelt import MultipartEncoder

bucket = 'your-bucket-name'
url = 'https://www.googleapis.com/upload/storage/v1/b/{}/o?uploadType=multipart'.format(bucket)

# file name to be saved in bucket
name = 'foo/bar.jpg'
metadata = {
    'name': name,
    'cacheControl': 'public, max-age=5184000',
}
# obj can be a file / bytes or anything that requests supports
obj = open('example.jpg', 'rb')
files = [
    ('file', (name, json.dumps(metadata), 'application/json')),
    ('file', (name, obj, 'image/jpeg')),
]
encoder = MultipartEncoder(files)
headers = {'Content-Type': encoder.content_type}

# use the session created above
resp = session.post(url, data=encoder, headers=headers)
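
The upload endpoint returns the created object resource as JSON. A minimal sketch of handling the response (and closing the file we opened) might look like this:

if resp.status_code == 200:
    info = resp.json()
    # "name" and "size" are fields of the returned object resource
    print(info['name'], info.get('size'))
else:
    print('upload failed:', resp.status_code, resp.text)
obj.close()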

The metadata in this example contains name and cacheControl; of these, only name is required. You can add more metadata if you want.
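
For example, a sketch of a richer metadata object could look like the one below; contentDisposition and the custom metadata map are optional fields of the objects resource (check the JSON API reference for the full list):

metadata = {
    'name': name,                                  # required: object name in the bucket
    'cacheControl': 'public, max-age=5184000',     # optional: Cache-Control header
    'contentDisposition': 'inline',                # optional: Content-Disposition header
    'metadata': {'uploaded-by': 'authlib-demo'},   # optional: custom key-value metadata
}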


Check out our guide: Upload to Google Cloud Storage from the browser directly.