
How to handle server-side file upload; tldr: you don't
File uploads are a common feature in modern applications, whether it’s for submitting tax documents, updating a profile photo, or providing context files to an AI like ChatGPT. Traditionally, your backend server would handle these upload requests, performing various validations before moving the received bytes to a file storage service. Popular choices for file storage include Amazon S3 and Google Cloud Storage; most cloud providers offer these as part of their suite of tools.
However, handling these upload requests undeniably consumes significant resources from your backend machines, regardless of whether you’re using a serverful or serverless architecture. As your application scales to accommodate a larger user base, you’ll inevitably find yourself scaling either horizontally or vertically, which directly translates to increased charges on your credit card. Surely, there must be a more efficient way, right?
Introducing Presigned URLs: Offloading the Work
What if I told you that you could delegate upload requests directly to the file storage provider itself? Yes, you heard that correctly! Most major cloud providers support what are called presigned URLs. A presigned URL is, as the name suggests, an authenticated URL that is “pre-signed” by your backend. This allows you to securely share it with your user-facing application, enabling clients to directly perform uploads (or downloads) to the specified URL. The key caveat is that you must securely generate this URL on your backend, but that’s essentially the extent of your server’s involvement.
In this blog post, I’ll demonstrate how to generate a presigned URL using Google Cloud Storage. However, other cloud providers offer similar mechanisms, so always consult their documentation – don’t be a lazy bum!
Example of a presigned URL:
https://storage.googleapis.com/example-bucket/cat.jpeg?X-Goog-Algorithm=
GOOG4-RSA-SHA256&X-Goog-Credential=example%40example-project.iam.gserviceaccount.com
%2F20181026%2Fus-central1%2Fstorage%2Fgoog4_request&X-Goog-Date=20181026T18
1309Z&X-Goog-Expires=900&X-Goog-SignedHeaders=host&X-Goog-Signature=247a2aa45f16
9edf4d187d54e7cc46e4731b1e6273242c4f4c39a1d2507a0e58706e25e3a85a7dbb891d62afa849
6def8e260c1db863d9ace85ff0a184b894b117fe46d1225c82f2aa19efd52cf21d3e2022b3b868dc
c1aca2741951ed5bf3bb25a34f5e9316a2841e8ff4c530b22ceaa1c5ce09c7cbb5732631510c2058
0e61723f5594de3aea497f195456a2ff2bdd0d13bad47289d8611b6f9cfeef0c46c91a455b94e90a
66924f722292d21e24d31dcfb38ce0c0f353ffa5a9756fc2a9f2b40bc2113206a81e324fc4fd6823
a29163fa845c8ae7eca1fcf6e5bb48b3200983c56c5ca81fffb151cca7402beddfc4a76b13344703
2ea7abedc098d2eb14a7
Setting Up GCP Permissions for Presigned URLs
First, create a custom service account in GCP (which I highly recommend) for your backend service to use, and ensure you assign the appropriate roles to it:
- Storage Object User (roles/storage.objectUser): Use this role to create signed URLs for both uploading and downloading objects. This role is also required if the signed URL could overwrite an existing object.
- Storage Object Viewer (roles/storage.objectViewer): Use this role if you only want to create signed URLs for downloading objects.
- Storage Object Creator (roles/storage.objectCreator): Use this role if you only want to create signed URLs for uploading objects and the uploaded object won’t overwrite an existing object in the bucket.
- Service Account Token Creator (roles/iam.serviceAccountTokenCreator): Required so the service account can sign URLs via the IAM API when no private key file is available locally.
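If you prefer the CLI, that setup can be sketched with gcloud roughly as follows; the service account, project, and bucket names are placeholders, so adapt them to your environment:

```shell
# Create a dedicated service account for the backend (name is a placeholder)
gcloud iam service-accounts create upload-signer \
  --display-name="Presigned URL signer"

# Let it create and overwrite objects in the bucket
gcloud storage buckets add-iam-policy-binding gs://example-bucket \
  --member="serviceAccount:upload-signer@example-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectUser"

# Let it sign as itself via the IAM API (needed when no private key
# file is available locally, e.g. on Cloud Run with default credentials)
gcloud iam service-accounts add-iam-policy-binding \
  upload-signer@example-project.iam.gserviceaccount.com \
  --member="serviceAccount:upload-signer@example-project.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"
```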
Generating the Presigned URL in Node.js
Next, we’ll generate the URL from our backend. For this demonstration, I’ll use Node.js, so you’ll need to install the @google-cloud/storage package first. Here is sample code for generating the presigned URL:
import { Storage } from "@google-cloud/storage";

const storage = new Storage({
  // Provide credentials based on the service account key file,
  // or rely on Application Default Credentials.
});

async function generateUploadUrl(bucketName, fileName) {
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl({
      version: "v4",
      action: "write",
      expires: Date.now() + 3 * 60 * 1000, // valid for 3 minutes
      contentType: "application/octet-stream",
    });
  return url;
}
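Wiring this into an HTTP endpoint might look like the following sketch, using only Node’s built-in http module (no framework). To keep the snippet self-contained, generateUploadUrl is stubbed out here; in a real server you would import the function above instead, and the route and port are arbitrary choices:

```javascript
import http from "node:http";

// Stand-in for the real generateUploadUrl above; in an actual server,
// import it from the module that owns the Storage client. The returned
// URL shape here is purely illustrative.
async function generateUploadUrl(bucketName, fileName) {
  return `https://storage.googleapis.com/${bucketName}/${encodeURIComponent(fileName)}?X-Goog-Signature=stub`;
}

const server = http.createServer(async (req, res) => {
  const { pathname, searchParams } = new URL(req.url, `http://${req.headers.host}`);
  if (req.method === "POST" && pathname === "/upload-url") {
    // The client asks for an upload URL for a given file name
    const fileName = searchParams.get("file") ?? "upload.bin";
    const url = await generateUploadUrl("example-bucket", fileName);
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ url }));
    return;
  }
  res.writeHead(404);
  res.end();
});

server.listen(3000);
```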
Naturally, you should integrate this generateUploadUrl function into a web framework or server to handle incoming requests, generate the presigned URL, and return it to the frontend client. The client can then use any standard HTTP client to upload the file directly to the received URL, for example with curl:
curl -X PUT -H 'Content-Type: application/octet-stream' --upload-file my-file '${url}'
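From JavaScript, the same upload can be done with the Fetch API (available in browsers and Node 18+). The helper name below is mine, not from any library; note that the Content-Type header must match the contentType the URL was signed with:

```javascript
// Upload bytes directly to a presigned URL. The Content-Type must match
// the contentType used when signing ("application/octet-stream" above).
async function uploadToPresignedUrl(url, bytes) {
  const res = await fetch(url, {
    method: "PUT",
    headers: { "Content-Type": "application/octet-stream" },
    body: bytes,
  });
  if (!res.ok) {
    throw new Error(`Upload failed with status ${res.status}`);
  }
  return res;
}
```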
Considerations and Alternatives
By delegating the file upload process directly to the file storage provider, you significantly reduce the resource burden on your backend. However, is this always the best way to handle file uploads? As with most things in real life, it depends.
For example, you cannot perform real-time validation during the upload itself; the best you can do is validate the file after it has been uploaded, for instance with Firebase Functions or Cloud Run triggered by Cloud Events. This approach works well if you have a reasonable level of trust in your users, and for most general workloads I still consider it an optimal way to handle file uploads.
import { onObjectFinalized } from "firebase-functions/v2/storage";

export const onFileUploaded = onObjectFinalized({}, (event) => {
  // Path of the uploaded object within the bucket
  const path = event.data.name;
  // ... inspect or validate the uploaded file here
});
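If you go the post-upload validation route, the check itself can be a plain helper like the sketch below. The size limit and allowed content types are illustrative assumptions, and the commented-out deletion snippet assumes a Storage client like the one created earlier:

```javascript
// Illustrative post-upload validation: the size limit and the allowed
// content types below are assumptions, not fixed recommendations.
const MAX_BYTES = 10 * 1024 * 1024; // 10 MiB
const ALLOWED_TYPES = new Set(["image/jpeg", "image/png", "application/pdf"]);

// event.data from onObjectFinalized carries the object's metadata;
// note that `size` arrives as a string.
function isValidUpload({ size, contentType }) {
  return Number(size) <= MAX_BYTES && ALLOWED_TYPES.has(contentType);
}

// Inside the handler you could then remove offending objects, e.g.:
// if (!isValidUpload(event.data)) {
//   await storage.bucket(event.data.bucket).file(event.data.name).delete();
// }
```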
There is also another method for handling file uploads: directly using a client library in your frontend application. Firebase, for instance, provides the Firebase Storage package, which enables direct uploads from the client. Additionally, you can enforce validation rules using Firebase Storage Security Rules, which can be particularly useful in scenarios where you are heavily invested in the Firebase ecosystem. However, this adds an extra package to your frontend application, which might be undesirable in some contexts.
It’s worth noting that presigned URLs aren’t exclusively for upload purposes; they can also be used to generate download URLs, although this is often less critical in my opinion. Stay tuned for my next blog post, where I’ll delve into even better ways to expose your images from cloud file storage.