"Will code for travel"

Search technical info
How to upload file to Google Cloud Storage from PHP?
Written: 2019-06-05 10:34:41 Last update: 2019-11-21 18:28:45

Following the official 'getting started' guides just to learn how to use PHP to upload file(s) to Google Cloud Storage (GCS) is tiresome: there are many pages to read, especially for a first-time user of Google Cloud Platform (GCP), and searching and jumping between pages can easily take a full day just to understand the concepts of GCP and GCS. This article is written for one purpose only: to provide a quick-help guide for PHP developers who want to upload files to GCS programmatically. It briefly describes what GCS is, how to use GCS for free, how to get the GCS library into a local PHP development environment, how to authenticate a server-to-server connection, which security measures to take when keeping a private key, working PHP code to upload files, and how to set up CORS to allow access to static files from another domain. All of this information is packed into just one page; I will try not to explain too broadly but write only the essential information to get started, with many URL links to reference pages for more detail on specific subjects.

Background (reason to write this article)

These days many people have a private website; some even have multiple websites for their blog, personal diary, photo albums, etc. Many of these private websites use PHP on the back end (server side), because they are hosted at cheap shared hosting companies that run Linux and provide cPanel, Plesk, or another hosting control panel. Why PHP? Because it is general purpose (not only for websites), open source, easy to set up a development environment for, and easy to learn, with a great community. Fact: PHP is very popular (in 2019 more than 50% of all websites in the world used PHP); do you wonder which popular websites use PHP?

This quick.work website also currently uses PHP, hosted on a cheap shared hosting plan that costs only around $1 (one dollar) per month and includes just 1 GB of storage (for PHP files, HTML files, images, the MySQL database, configuration files, etc.). The storage size is enough for now but I will definitely need more later, and this cheap hosting performs slower than a dedicated VPS (Virtual Private Server). Hence this how-to article, which describes how to use GCS as free storage to increase capacity and to speed up content serving.

Google Cloud Storage (GCS) helps to expand server storage, provides an easy and secure way to manage files, and serves files faster through Google's multi-region cloud infrastructure.

Why choose GCS? There are several good reasons, but for me personally the most important is that GCS can make certain file(s), or all files inside a bucket, readable (viewable) by the public: every file we upload gets a unique URL that anyone can retrieve/download, so it works like a free CDN service. GCS offers an 'Always Free' tier for storage up to 5 GB, which must be located in one of the regions 'us-west1', 'us-central1', or 'us-east1'. 5 GB is plenty to store the static files of a website (.json, .css, .js, .jpg, .png, .html, etc.), and GCS provides the reliability and speed of Google Cloud Platform (GCP). Please note that by default no file inside a bucket is publicly readable; to make all files publicly readable we must edit the bucket permissions and add a new member 'allUsers' with the role 'Storage Object Viewer', for example as shown below.
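
For illustration, a minimal sketch using gsutil with a hypothetical bucket name 'my-bucket'; this grants the 'Storage Object Viewer' role to 'allUsers', the same change the console UI makes:

# make every object in 'my-bucket' publicly readable (use with care!)
gsutil iam ch allUsers:objectViewer gs://my-bucket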

To actually do the exercise (coding in PHP) of uploading file(s) to GCS, we first need a GCP account. If you don't have a GCP account yet, please create a new one; you may be asked to set up billing before using GCS. If so, just do it: Google won't charge your credit card as long as the total size of all files stays under 5 GB.

Inside the GCP console we need to create a new project and then create a bucket. Next we need to generate a service account; a service account belongs to an application, not to a member (person). To allow our PHP hosting server (a third-party server) to upload files to GCS using a service account, we must generate a key for it. This key is a private key stored as a file in either JSON (recommended) or P12 format; from here on I will use the words 'private key' to refer to the service account's JSON key file. To create a private key, go to the "APIs & Services" menu, open the Credentials page, and create a new "Service account key" [enables server-to-server, app-level authentication using robot accounts]. When creating a new key we need to select a 'Role' (the working scope or purpose of this key); this role is very important for security purposes.

To create a new service account (a gcloud CLI alternative is sketched after this list)

  1. Please log in to the Google Cloud Console.
  2. Choose our project (if we have more than 1 project).
  3. Open the left menu (click the hamburger button at the top left).
  4. Select "IAM & Admin" and click "Service accounts".
  5. At the top, click the button "CREATE SERVICE ACCOUNT".
  6. On the first screen, we need to write a name and description.
  7. On the second screen, select permissions for this new account; we must select at least 1 role to allow uploading files to Google Cloud Storage: 'Storage' -> 'Storage Object Admin'.
  8. On the third screen, click "Create key" to download the key file; for Key type, select JSON (not the old P12 type). The JSON file will then be downloaded and we can use it immediately.
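
Alternatively, the same three steps can be done from the command line. Below is a minimal sketch using the gcloud CLI, assuming a hypothetical project 'my-project-id' and service account name 'my-uploader' (adjust both to your own values):

# 1. create the service account
gcloud iam service-accounts create my-uploader --display-name="my-uploader"

# 2. grant it the Storage Object Admin role on the project
gcloud projects add-iam-policy-binding my-project-id \
    --member="serviceAccount:my-uploader@my-project-id.iam.gserviceaccount.com" \
    --role="roles/storage.objectAdmin"

# 3. generate and download the JSON private key
gcloud iam service-accounts keys create key.json \
    --iam-account=my-uploader@my-project-id.iam.gserviceaccount.com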

Please note that a newly created service account can by default access all buckets. If we have multiple buckets and want to control which service account can access which bucket, we need to change the bucket or object permissions using IAM. In my case I have several buckets and several service accounts, and I only allow a specific service account to upload to a specific bucket, so I need to prevent the newly created service account from uploading/deleting files in other buckets. We can see each bucket's permissions, but we cannot delete the service account's access to the bucket in the UI, because by default the service account inherits access to all buckets (the delete/edit icon is grayed out/disabled), so we need to remove the inherited permission by using gsutil.

# to remove the inherited permission of 'my-service-account' on 'my-other-bucket'
gsutil iam ch -d serviceAccount:[email protected]:objectAdmin gs://my-other-bucket

After the inherited permission is removed, we can delete the service account from the bucket using the UI console; we need to repeat this for this service account on all other buckets. Why did Google design this permission inheritance? I am guessing for ease of use (by default a service account can access all buckets).
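
To verify the result, we can print the bucket's current IAM policy (again using the hypothetical bucket name above):

# show the IAM policy bound to the bucket
gsutil iam get gs://my-other-bucket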

If this private key will be stored outside Google Cloud, as in this case (shared hosting at a third-party, non-Google company), then please note the following security measures:

  • Never, ever create a private key with the "Project Owner" (highest) role and then store the generated key file outside Google Cloud (at a third-party shared hosting company), because the "Project Owner" role has access to all GCP services, including App Engine, Compute Engine, Datastore and many other big Google services; imagine what would happen if a bad person found it and used it for bad purposes.
  • Make sure to choose the least-privilege role for the key. In this case, if we only need to create new objects (files), choose "Storage Object Creator" (roles/storage.objectCreator); but this role cannot REPLACE or DELETE files, so if you want to CREATE, REPLACE and DELETE files, choose "Storage Object Admin" (roles/storage.objectAdmin).
  • If you are working in a team or using a code repository like GitHub, please be aware of the security needed for handling this private key file; do read about how to keep your Google Cloud service account keys safe.

Inside the GCP console there is a full UI to create/delete buckets (what is a 'bucket'? Simply think of a bucket as a hard drive that stores folders/files; we can create and use many buckets in a project at the same time), create/delete folders, and upload/delete files directly without any programming. Luckily for us, this feature makes testing easier: after we upload a new file from PHP we can easily see it in the GCP console, and we can delete any file if we want to. GCS also provides a great and easy Access Control List (ACL) to specify whether certain files are readable by the public (anyone) or private, only for ourselves.

How easy is it to upload files to GCS programmatically using PHP? It is very easy: only a few lines of PHP code. Before we play with the code, I am assuming that your local machine is already set up for PHP development with a running Apache or NGINX web server, maybe using one of the PHP bundles; my personal experience has been with LAMP (Linux, Apache, MySQL and PHP), MAMP (for macOS) and WAMP (for Windows). The next step is to get (download) the GCS library for PHP.

  1. Install Composer (Dependency Manager for PHP) globally on the local machine.
  2. We need to keep our code as small as possible because we will upload and store it on our server later, so we install only the google/cloud-storage library without the other Google libraries. In a terminal (command line), create an empty folder, go inside it, and run
    $ composer require google/cloud-storage
  3. Composer will get the latest stable version. After Composer successfully finishes fetching the Google Cloud Storage library, the folder will contain a subfolder called 'vendor'; it is very important that we don't change anything inside it, and we will need to upload all files/folders inside 'vendor' to our shared hosting later. (A quick sanity check is sketched after this list.)
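
To quickly verify the installation, a minimal sanity check (assuming it runs from the folder that contains 'vendor'):

<?php
  // verify that Composer's autoloader can find the GCS client class
  require_once 'vendor/autoload.php';
  var_dump(class_exists(\Google\Cloud\Storage\StorageClient::class)); // bool(true) if installed
?>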

Once we have the GCS library we can start using PHP code to upload any file to GCS. The PHP code below uses a dummy $privateKeyFileContent; you need to copy your own private key JSON file content into $privateKeyFileContent for your testing. This approach is only for quick testing, to verify whether the private key value is valid or not; it is not for production, because putting the private key value inside a PHP file is not secure.

<?php

  // install Google's Cloud Storage library with 'composer require google/cloud-storage'

  // load GCS library
  require_once 'vendor/autoload.php';  
  
  use Google\Cloud\Storage\StorageClient;

  // Please use your own private key (JSON file content) and copy it here
  // your private key JSON structure should look similar to the dummy value below.
  // WARNING: this is only for QUICK TESTING to verify whether private key is valid (working) or not.  
  // NOTE: to create private key JSON file: https://console.cloud.google.com/apis/credentials  
  $privateKeyFileContent = '{
      "type": "service_account",
      "project_id": "my-project-id",
      "private_key_id": "aaabbbcccc",
      "private_key": "-----BEGIN PRIVATE KEY-----\nblablablablabla=\n-----END PRIVATE KEY-----\n",
      "client_email": "[email protected]",
      "client_id": "12345123451234512345",
      "auth_uri": "https://accounts.google.com/o/oauth2/auth",
      "token_uri": "https://oauth2.googleapis.com/token",
      "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
      "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/blablabla%40blabla.iam.gserviceaccount.com"
    }';

  /*****
  * NOTE: if the server is shared hosting at a third-party company
  * then the private key should not be stored as a file;
  * it may be better to encrypt the private key value and then
  * store the 'encrypted private key' value as a string in the database,
  * so that every time before using the private key we ask for a user-input password (from the UI) to decrypt it.
  ******/

  // connect to Google Cloud Storage using private key as authentication
  try {
    $storage = new StorageClient([
        'keyFile' => json_decode($privateKeyFileContent, true)
    ]);
  } catch (Exception $e) {
    // maybe an invalid private key ?
    print $e;
    return;
  }

  // set which bucket to work in
  $bucketName = 'my-bucket-name';
  $bucket = $storage->bucket($bucketName);

  // get local file for upload testing
  $fileContent = file_get_contents('qw.png');

  // NOTE: if the 'folder' or 'tree' path does not exist, it will be created automatically!
  $cloudPath = 'images/a/b/c/qw.png';

  $isSucceed = uploadFileContent($bucket, $fileContent, $cloudPath);
    
  if($isSucceed == true) {
    echo 'SUCCESS: to upload ' . $cloudPath . PHP_EOL;  
    
    // TEST: get object detail (filesize, contentType, updated [date], etc.)
    $object = $bucket->object($cloudPath);
    print_r($object->info());
  } else {
    echo 'FAILED: to upload ' . $cloudPath .  PHP_EOL;    
  }

  // testing, to see all files in 'images' folder
  printFiles($bucket, 'images');

  return;

function uploadFileContent($bucket, $fileContent, $cloudPath) {

  // upload/replace file 
  $storageObject = $bucket->upload(
      $fileContent,
      ['name' => $cloudPath]
      // if $cloudPath already exists then the file will be overwritten without confirmation
      // NOTE: 
      // a. do not put a '/' prefix; '/' becomes a separate folder name !!
      // b. the private key MUST have the 'storage.objects.delete' permission to replace a file !
  );

  // did it succeed ?
  return $storageObject != null;
}

function printFiles($bucket, $directory = null) {

  if($directory == null) {
      // list all files
      $objects = $bucket->objects();
  } else {
      // list all files within a directory (sub-directory)
      $options = array('prefix' => $directory);
      $objects = $bucket->objects($options);
  }

  foreach ($objects as $object) {
      print $object->name() . PHP_EOL;
      // NOTE: if $object->name() ends with '/' then it is a 'folder'
  }
}
?>

There are many APIs in the GCS library; for the complete list, please see the GCS API documentation. We can do many things, including renaming, deleting, copying to another cloudPath, reading file content, downloading files, changing file permissions (ACL), etc.
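
For illustration, here is a minimal sketch of a few of these calls (assuming the $bucket variable and the hypothetical cloudPath from the code above; the method names come from the google/cloud-storage library):

<?php
  // get a handle to an existing object
  $object = $bucket->object('images/a/b/c/qw.png');

  // read the file content as a string
  $content = $object->downloadAsString();

  // download the file to a local path
  $object->downloadToFile('/tmp/qw.png');

  // copy the object to another cloudPath in the same bucket
  $copy = $object->copy($bucket, ['name' => 'images/backup/qw.png']);

  // make this single object publicly readable (ACL)
  $object->acl()->add('allUsers', 'READER');

  // delete the object
  $object->delete();
?>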

Everyone needs to be security conscious, especially when it comes to keeping a private key in a remote place. To keep this article short and relevant I will not go deep into security measures; I will just give my personal recommendations for securing the private key.

  • The above PHP code is just for testing; please do NOT copy the private key value 'as-is' into your PHP file.
  • Do not store the private key as a file; store the private key content as a string inside the database, in a secure table.
  • Use encryption: just pick one symmetric-key algorithm. Using encryption means that every time we want to use the private key, we must manually type in a password to decrypt the encrypted private key.

It is easy to use encryption in PHP; you can find many code samples and libraries out there. Below is simple working logic from https://gist.github.com/ve3/0f77228b174cf92a638d81fddb17189d, which I changed a little to simplify the code; we just need to call encrypt(..) and decrypt(..) to secure our private key content with whatever password we want to use.

<?php 
function decrypt($key, $encryptedString, $encryptMethod = 'AES-256-CBC') {

  $json = json_decode(base64_decode($encryptedString), true);
  if (!is_array($json)) {
      // not valid base64-encoded JSON
      return null;
  }

  $salt = hex2bin($json['salt']);
  $iv = hex2bin($json['iv']);
  // NOTE: hex2bin() does not throw; it returns false (with a warning) on invalid input
  if ($salt === false || $iv === false) {
      echo 'decrypt(..): invalid salt or iv';
      return null;
  }

  $cipherText = base64_decode($json['ciphertext']);
  $iterations = intval(abs($json['iterations']));
  if ($iterations <= 0) {
      $iterations = 999;
  }
  $hashKey = hash_pbkdf2('sha512', $key, $salt, $iterations, (encryptMethodLength() / 4));

  unset($iterations, $json, $salt);
  $decrypted = openssl_decrypt($cipherText, $encryptMethod, hex2bin($hashKey), OPENSSL_RAW_DATA, $iv);

  unset($cipherText, $hashKey, $iv);
  return $decrypted;
}// decrypt

function encrypt($key, $messageToEncrypt, $encryptMethod = 'AES-256-CBC') {

  $ivLength = openssl_cipher_iv_length($encryptMethod);

  $iv = openssl_random_pseudo_bytes($ivLength);

  $salt = openssl_random_pseudo_bytes(256);
  $iterations = 999;
  $hashKey = hash_pbkdf2('sha512', $key, $salt, $iterations, (encryptMethodLength() / 4));
  $encryptedString = openssl_encrypt($messageToEncrypt, $encryptMethod, hex2bin($hashKey), OPENSSL_RAW_DATA, $iv);
  $encryptedString = base64_encode($encryptedString);

  unset($hashKey);
  $output = ['ciphertext' => $encryptedString, 'iv' => bin2hex($iv), 'salt' => bin2hex($salt), 'iterations' => $iterations];
  unset($encryptedString, $iterations, $iv, $ivLength, $salt);

  return base64_encode(json_encode($output));
}

function encryptMethodLength(){
  $encryptMethod = 'AES-256-CBC';
  $number = filter_var($encryptMethod, FILTER_SANITIZE_NUMBER_INT);
  return intval(abs($number));
}// encryptMethodLength
?>
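
For example, a quick usage sketch tying encrypt(..)/decrypt(..) together with the upload code above (the password here is only a placeholder; in practice it would come from user input):

<?php
  // encrypt the private key JSON content once, then store $encrypted in the database
  $password = 'my-secret-password'; // supplied by the user via the UI, not hard-coded
  $encrypted = encrypt($password, $privateKeyFileContent);

  // later: decrypt it right before connecting to GCS
  $decrypted = decrypt($password, $encrypted);
  $storage = new Google\Cloud\Storage\StorageClient([
      'keyFile' => json_decode($decrypted, true)
  ]);
?>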

After all testing is finished we are ready to deploy our GCS library: simply upload all of the GCS library files from local development to our remote server (the shared hosting company); the 'vendor' folder mentioned earlier, with all files inside it, must be uploaded completely. The current "google/cloud-storage" library version is "1.12": 676 files/folders in total, 2.7 MB of files, only 914 KB as a compressed zip file; fortunately it is not too big.
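
If it helps with the transfer, one shortcut (assuming a hypothetical archive name, and that your hosting control panel can extract archives) is to upload one compressed file instead of 676 small files:

# compress the vendor folder locally, upload vendor.zip, then extract it on the server
zip -r vendor.zip vendor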

You can upload all kinds of file types and download them easily too, but if you want to fetch a file using JavaScript's XMLHttpRequest (XHR), by default GCS will not allow any other domain to read the file because of CORS. To solve this problem we need to create a configuration that allows one or more domains to fetch files via XHR; unfortunately the GCP console has no UI for adjusting this configuration, so we need to use gsutil to set CORS on the bucket(s).

To use gsutil, we can either run it in Cloud Shell (inside the Cloud Console) or in our local terminal (if gsutil is installed). To see the current CORS configuration:

gsutil cors get gs://mybucket

To set/change CORS configuration:

gsutil cors set mycorssettingfile.txt gs://mybucket

In my case, I am allowing my domain quick.work to be able to read any file, so I created a JSON file (example: mycorssettingfile.txt) like below.

[
  {
    "origin": ["https://quick.work", "http://quick.work", "http://127.0.0.1"],
    "responseHeader": ["Content-Type"],
    "method": ["GET"],
    "maxAgeSeconds": 3600
  }
]
NOTE: "http://127.0.0.1" is only used during local development; after development is finished it must be removed/updated.

To verify whether the CORS setting for our bucket is working, we can open our web address in Chrome (for example http://127.0.0.1), open the Chrome developer console, and try to get the content using fetch (Ajax); we can view the content if the CORS setting is configured properly.

JSON.stringify(await (await fetch('https://storage.googleapis.com/quick.work/articles/article_26.json')).json())

Using GCS to serve static files is an excellent idea. My website quick.work uses a Single Page Application (SPA) structure: the initial data (index.php) comes from the shared hosting and all static files (.jpg, .json, .js, .css, etc.) are uploaded to GCS. It feels much faster than loading everything from shared hosting, and many people already use GCS for this same purpose. My $1/month cheap shared hosting will be used for the Content Management System (CMS) and as secondary backup data (just in case Google Cloud Storage is down).

Maybe you have a question about why I use shared hosting and not Google App Engine (GAE)? There are some inconveniences for me in using GAE: I prefer to access a 'traditional MySQL' instead of Google's managed MySQL (Cloud SQL), GAE pricing is dynamic and I prefer a fixed price (to avoid bad surprises), and other personal reasons ^.^

I hope this article saves you precious time during your implementation; if you have any questions, please don't hesitate to ask me, I would love to help.
