So you’ve created your buckets, and now you want to use the power of the cloud to serve your content. With a can-do attitude and the details of this post, you’ll learn how to get your data into Cloud Storage with a variety of upload methods. Let’s go!
When you upload an object to your Cloud Storage bucket, it consists of the data you want to store, along with any associated metadata. When it comes to the actual uploading, you’ve got a few different options to choose from, which we’ll go over below. For more detail, check out the documentation. And for general, conceptual information on uploads and downloads, read this.
First, we’ll cover the Cloud Console. It provides an in-browser experience where you can easily click to create buckets and folders, and then choose files, or drag and drop them from your local machine, to upload.
For production environments, you may want an automated, command line solution.
For this, we provide the gsutil tool. gsutil is a Python application that lets you access Cloud Storage from the command line, giving you the ability to do all sorts of things like creating buckets, moving objects, and even editing metadata.
To use it, run the gsutil program with a variety of command line options. For example, this command uploads a directory of files from your local machine to your Cloud Storage bucket using parallel upload.
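A typical form of that command looks like the following; the directory and bucket names here are placeholders, so substitute your own:

```shell
# -m performs the copy in parallel across multiple threads/processes,
# and -r recurses into the local directory
gsutil -m cp -r ./my-local-directory gs://my-bucket
```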
And this command lists out specific objects that have a version-specific URL using a wildcard.
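One way to write such a listing, with a placeholder bucket name, is:

```shell
# -a includes noncurrent object versions, printing each object's
# version-specific (generation-numbered) URL; * is a wildcard over object names
gsutil ls -a gs://my-bucket/images/*.png
```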
You can find more cool stuff to do with the gsutil tool in the documentation.
At some point, you might need to interface with Cloud Storage directly from your code, rather than shelling out to a command-line tool. You can include the client libraries in your code and call a simple API to get data into a bucket or folder.
And before you even ask about language, with options in C++, C#, Go, Java, Node.js, PHP, Python, and Ruby—we’ve got you covered.
For example, check out this Python code to upload an object to a Cloud Storage bucket:
```python
from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    # bucket_name = "your-bucket-name"
    # source_file_name = "local/path/to/file"
    # destination_blob_name = "storage-object-name"

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)

    blob.upload_from_filename(source_file_name)
```
Check out even more code samples here.
JSON and XML
And finally, if none of that does the trick, there are always the JSON and XML APIs, which let you issue an HTTP POST request to upload data directly to a bucket or folder. It’s a bit more complex, but it’s there if you need it.
```
POST /OBJECT_NAME HTTP/2
Host: BUCKET_NAME.storage.googleapis.com
Date: DATE
Content-Length: REQUEST_BODY_LENGTH
Content-Type: MIME_TYPE
X-Goog-Resumable: start
Authorization: AUTHENTICATION_STRING
```
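To make the shape of that request concrete, here’s a minimal Python sketch that builds the header dictionary for initiating a resumable upload. The bucket name, content type, and authorization string below are placeholders; in a real request, the Authorization value comes from your OAuth 2.0 credentials.

```python
from email.utils import formatdate

def resumable_upload_headers(bucket_name, content_length, content_type, auth_string):
    """Build headers for a resumable-upload initiation request.

    All argument values used below are placeholders; a real request needs
    a valid OAuth 2.0 access token in auth_string.
    """
    return {
        "Host": f"{bucket_name}.storage.googleapis.com",
        "Date": formatdate(usegmt=True),   # current time in HTTP date format
        "Content-Length": str(content_length),
        "Content-Type": content_type,
        "X-Goog-Resumable": "start",       # ask Cloud Storage to start a resumable session
        "Authorization": auth_string,
    }

headers = resumable_upload_headers("my-bucket", 0, "image/png", "Bearer ya29.placeholder")
```

You’d then send the POST with an HTTP client of your choice and read the resumable session URI from the Location header of the response.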
Cloud Storage Transfer Appliance
Now, for you folks with LOTS of data, it’s worth noting that it might not be feasible to upload all of that data directly from your on-prem systems to Google Cloud—for that you can use the Cloud Storage Transfer Appliance.
We ship you a fancy device, you connect it, add your data, and send it back to us. Plus you get this cool-looking box on your desk for a while, which can be a great conversation starter, if you’re into that kind of thing. More details here.
More clouds, more problems? Not so!
Don’t worry if your data is in another cloud; we’ve got easy-to-use guides to help you get up and running with a multicloud environment and move that data over to Cloud Storage.
Of course, now that the data is in Cloud Storage, you’ve got to figure out the best ways to serve it to your users worldwide. Stay tuned for best practices around getting that data out into the world in our next post.
By Jenny Brown (Cloud Developer Advocate)
Source: Google Cloud Blog