If you've ever wondered if you can do serious cloud development and set up a professional workflow from a Chromebook, I'm here to tell you it's not only possible but also a fantastic experience. I’ve been using my HP Elite Dragonfly Chromebook, with its built-in Linux environment, as my command center for this entire project. My goal? To build and deploy a Drupal blog from scratch on Google Cloud, complete with an automated CI/CD pipeline using GitHub Actions.

It's been a journey of configuration, troubleshooting, and learning, and I wanted to share the story so far.

The Foundation: Server and Database

Every website needs a home, and for this project, that home is a Google Compute Engine virtual machine. I spun up a standard Debian Linux instance to serve as my web server and a separate Cloud SQL instance to handle the database. This is a pretty standard setup, but it keeps the web server and the database decoupled, which is great for scalability and management.

The first minor hurdle appeared almost immediately. After installing the web server software (Apache and PHP), I realized the default version was PHP 8.2, but the latest version of Drupal runs best on PHP 8.3. This led to my first real lesson: how to manage software versions on a Linux server. The solution was to add a trusted third-party package repository (maintained by Ondřej Surý) that provides up-to-date PHP builds for Debian. A few commands later, my server was running the correct version.
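For the curious, the process looked roughly like the sketch below. The exact key-installation steps for the Sury repository change over time, so treat this as an outline and check the repository's own instructions before running it:

```shell
# Prerequisites for adding an external apt repository
sudo apt-get update
sudo apt-get install -y apt-transport-https ca-certificates curl lsb-release

# Trust the repository's signing key (exact key URL/steps may have changed)
curl -sSL https://packages.sury.org/php/apt.gpg \
  | sudo tee /usr/share/keyrings/sury-php.gpg > /dev/null

# Register the repo for this Debian release, then install PHP 8.3 + common Drupal extensions
echo "deb [signed-by=/usr/share/keyrings/sury-php.gpg] https://packages.sury.org/php/ $(lsb_release -sc) main" \
  | sudo tee /etc/apt/sources.list.d/sury-php.list
sudo apt-get update
sudo apt-get install -y php8.3 php8.3-cli php8.3-mysql php8.3-gd php8.3-xml php8.3-mbstring libapache2-mod-php8.3
```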

The Drupal Dance: Permissions and Configuration

With the server ready, it was time to install Drupal using Composer. If you've ever set up a CMS manually, you'll be familiar with the "permissions dance." This is where the installer tells you it can't write to certain directories. In my case, it was the classic combo: the sites/default/files directory didn't exist, and the settings.php file needed to be created from a template. A few mkdir, cp, and chown commands to give the web server the right permissions, and we were through to the database configuration screen.
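In concrete terms, the "dance" came down to a handful of commands like these, assuming the standard Composer project layout where the docroot lives in web/ (your paths and web-server user may differ; www-data is the Apache user on Debian):

```shell
# From the Drupal project root: create the writable files directory
mkdir -p web/sites/default/files

# Create settings.php from the shipped template
cp web/sites/default/default.settings.php web/sites/default/settings.php

# Let the web server own what it needs to write
sudo chown -R www-data:www-data web/sites/default/files
sudo chgrp www-data web/sites/default/settings.php
sudo chmod 664 web/sites/default/settings.php
```

(Once the installer finishes, Drupal recommends tightening settings.php back to read-only.)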

This is where having a separate Cloud SQL instance came into play. Instead of using localhost, I had to point Drupal to the public IP address of my database instance, and just like that, the Drupal site was live!
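The relevant change lives in Drupal's settings.php, in the $databases array. This is a sketch with placeholder credentials and an example IP, not my real values:

```
// web/sites/default/settings.php -- placeholder values
$databases['default']['default'] = [
  'database' => 'drupal',
  'username' => 'drupal_user',
  'password' => 'REPLACE_ME',
  'host' => '203.0.113.10',  // Cloud SQL public IP, not localhost
  'port' => '3306',
  'driver' => 'mysql',
  'prefix' => '',
];
```

(For the connection to work, the VM's IP also has to be added to the Cloud SQL instance's authorized networks.)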

The Cloud Conundrum: When Your IP Address Disappears

Things were going great... until they weren't. I could access the site via its IP address, but when I pointed my domain name to it using Cloudflare, I was hit with a dreaded 521 error. This error means Cloudflare can't reach your server. After double-checking the firewall rules in Google Cloud and confirming Apache was running, I discovered the culprit. I had resized my VM instance to adjust its performance, and in the process, Google Cloud had assigned it a brand new IP address!

This was a fantastic real-world lesson in how cloud infrastructure works. I had to update the IP in my Cloudflare DNS settings, in the authorized network settings for my Cloud SQL database, and in the secrets for my planned GitHub Actions workflow. (Pro-tip: I've since assigned a static IP to the VM to prevent this from happening again.)
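Reserving the address is a one-liner with gcloud: you can promote the VM's current ephemeral external IP to a static one in place. The address name, region, and IP below are placeholders:

```shell
# Promote the VM's current ephemeral external IP to a reserved static address
gcloud compute addresses create drupal-web-ip \
  --region=us-central1 \
  --addresses=203.0.113.10

# Confirm the reservation
gcloud compute addresses describe drupal-web-ip --region=us-central1
```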

From Server to Local: Setting Up a Real Workflow

With the site running, it was time to set up a proper development workflow. This meant getting the code off the server and onto my local Linux environment on the Chromebook. This is where we hit our next set of permission puzzles. My initial attempts to copy the files with rsync kept failing with a "Permission denied" error, even after adding my SSH keys to the server.

We pivoted. Instead of copying thousands of small files, I SSH'd into the server, created a compressed tar.gz archive of the project, and downloaded that single file to my local machine. It was much faster and bypassed the permission issues entirely. I now had a local copy of the site, ready for version control.
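The round trip looked something like this, with the username, host, and paths as placeholders for my actual setup:

```shell
# On the server: build one compressed archive of the project
# (excluding the bulky user-uploads directory)
ssh user@203.0.113.10 \
  "cd /var/www && sudo tar -czf /tmp/drupal-site.tar.gz --exclude='web/sites/default/files' drupal"

# On the Chromebook: pull down the single file and unpack it
scp user@203.0.113.10:/tmp/drupal-site.tar.gz .
tar -xzf drupal-site.tar.gz
```

One archive means one permission check and one transfer, instead of thousands, which is why it sidestepped the rsync failures.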

The Final Frontier: Automating with GitHub Actions

The end goal has always been CI/CD—a way to automatically deploy changes to my site whenever I push code to my GitHub repository. We set up a private repository, configured it to use SSH for authentication, and created the deploy.yml workflow file for GitHub Actions.
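A minimal version of that workflow looks like the sketch below. The secret names (SSH_PRIVATE_KEY, SSH_USER, SERVER_IP) and the destination path are my own choices, not GitHub conventions:

```yaml
# .github/workflows/deploy.yml -- minimal sketch
name: Deploy to GCE
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install SSH key
        run: |
          mkdir -p ~/.ssh
          echo "${{ secrets.SSH_PRIVATE_KEY }}" > ~/.ssh/id_ed25519
          chmod 600 ~/.ssh/id_ed25519
          ssh-keyscan -H "${{ secrets.SERVER_IP }}" >> ~/.ssh/known_hosts
      - name: Sync files to the server
        run: |
          rsync -az --delete -e "ssh -i ~/.ssh/id_ed25519" \
            ./ "${{ secrets.SSH_USER }}@${{ secrets.SERVER_IP }}:/var/www/drupal/"
```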

And that's where we are right now, on the cusp of a fully automated workflow. Our first attempts to run the deployment script have been met with one final, stubborn Permission denied error. This time, it's not about logging in; it's about the GitHub Actions user not having permission to write files into the web server's protected directory.

We've just updated the deployment script with a more robust strategy: copying the files to a temporary "staging" directory and then using sudo to move them into the final location. This should be the final piece of the puzzle.
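The revised server-side step is sketched below. It assumes the deploy user has a sudoers rule allowing these specific commands without a password; the staging path and web root are placeholders:

```shell
# Deploy user writes to a staging directory it owns...
STAGING=/tmp/drupal-deploy
rsync -az --delete ./ "$STAGING/"

# ...then sudo moves the files into the protected web root
sudo rsync -a --delete "$STAGING/" /var/www/drupal/
sudo chown -R www-data:www-data /var/www/drupal
```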

In Part 2, I'll walk through the results of this new strategy and, hopefully, celebrate a fully functioning, automated CI/CD pipeline for my new Drupal blog, all managed from a Chromebook. Stay tuned!

