Hugo, I just recently discovered, is one of many static site generators. You feed them content in the form of Markdown files and they spit out HTML pages which you can then provide to a web server for hosting.
This in and of itself might not be very interesting, but the potential hosting environments are what particularly interest me.
Why go static?
Since the pages are HTML, you don’t need to worry about having a server dynamically render a page when a user makes a request to your site.
No rendering language (JSP, PHP, NodeJS, etc.) means no application to run.
No application to run means no paying for or dealing with backend server hosting.
This is beneficial in two ways: cost and simplicity.
Cost is pretty obvious; dynamic hosting can get relatively expensive when we compare it to free.
Simple HTML pages don’t need any server-side processing prior to serving as they’re already HTML. This cuts down on the time it takes to actually deliver the page to the user.
no server-side processing + small file size = super fast website
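To see how little is involved, here's a hypothetical sketch: any plain file server can host the output, with no app runtime in the loop. The directory and port below are made up for the demo.

```shell
# Stand-in for a generated site: one static HTML file (illustrative path)
mkdir -p /tmp/static-demo
echo '<h1>Hello from a static file</h1>' > /tmp/static-demo/index.html

# Serve it with a generic static file server; no PHP/Node/JSP involved
cd /tmp/static-demo
python3 -m http.server 8123 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1

# The server just streams the file bytes as-is
curl -s http://localhost:8123/index.html
# prints: <h1>Hello from a static file</h1>
kill $SERVER_PID
```

Swap `python3 -m http.server` for nginx, Apache, or any CDN and nothing about the content changes; that's the whole appeal.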
One really good free option for hosting static pages is GitHub Pages. You create a repo with the name `username.github.io` and drop your content in. Easy.
Another essentially free option is Google Cloud Storage (GCS), Google Cloud's object storage service. The process is similar: create a bucket, drop your files in, and profit.
We’ll be taking a look at both today so let’s jump in.
Forget Bart, let’s go with Hugo
First we install Hugo with Homebrew:

```shell
brew install hugo
```
Next we check that it's installed correctly by running `hugo version`. Sweet. Now we tell Hugo to create a new site for us (`my-site` can be whatever you like):
```shell
hugo new site my-site
```
Hugo will then tell you that it created the site, the path to it, and some next steps:

```
Congratulations! Your new Hugo site is created in /Users/YOUR_USER/path/where/you/ran/hugo/my-site.
...
```
Almost there. Now we need to set up a theme. There are lots to choose from here; we'll use the beautifulhugo theme. We need to check out the theme's files from GitHub and then tell Hugo we want to use it.
```shell
# Move to our site's directory
cd my-site

# Check out the theme's files
git init
git submodule add https://github.com/halogenica/beautifulhugo.git themes/beautifulhugo

# Tell Hugo we want to use the beautifulhugo theme by appending it to the config file
echo 'theme = "beautifulhugo"' >> config.toml
```
Success! We’ve installed and configured Hugo!
One last thing we'll do is create a test post. We use `hugo new` to create a new post, provide it the path where we want to create it (the path is relative to the `content` directory of our site), and then add some content to it. We can also manually create files in the content directory and Hugo will still pick them up. The upside to using `hugo new` is that it'll add some template markup to the post file for us.
```shell
hugo new posts/my-first-post.md

# Add some content to the post file
echo '# Hello World from Hugo!' >> content/posts/my-first-post.md

# Show the generated post file with our content
cat content/posts/my-first-post.md
```

```
---
title: "My First Post"
date: 2019-12-10T11:17:47-05:00
draft: true
---

# Hello World from Hugo!
```
The final step is to test that Hugo works and our ground-breaking content displays correctly. We’ll run the hugo server with drafts mode enabled so that we can see our post from above. This is because Hugo marks new posts as drafts by default (more info about draft mode here):
```shell
# -D and --buildDrafts are equivalent flags
hugo server -D
```

```
...
Web Server is available at http://localhost:1313/ (bind address 127.0.0.1)
...
```
Gimme, Gimme, Gimme (some HTML)
We’ve got our site ready to go and all we need are the generated HTML files.
Before we start generating, we need to do one important thing: move the posts out of draft mode. For each post that we want to publish, set the `draft` option at the top of the post's Markdown file to `false`:
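As a quick, self-contained sketch of that edit (the path is a throwaway stand-in, not our real site, and this uses GNU `sed`; on macOS you'd write `sed -i '' ...`):

```shell
# Throwaway copy of a post's front matter (illustrative path)
mkdir -p /tmp/draft-demo
cat > /tmp/draft-demo/my-first-post.md <<'EOF'
---
title: "My First Post"
draft: true
---
EOF

# Flip the draft flag in place
sed -i 's/^draft: true$/draft: false/' /tmp/draft-demo/my-first-post.md

grep '^draft' /tmp/draft-demo/my-first-post.md
# prints: draft: false
```

You can just as easily open the file and edit the line by hand; the point is only that `draft: true` becomes `draft: false` before generating.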
Now we can generate the files, and this is super simple to do with Hugo. In our site's main directory we run `hugo`. That's it. Really.
Hugo will let you know what it generated, and you'll now notice there's a `public/` directory in our site's folder holding the generated HTML files.
Now that we know Hugo works (I never doubted him), it’s time to take it to the cloud!
Git ready, Git set, go!
Now that we have those sweet, sweet generated files, we can push them real good to the cloud(s) for hosting. The first hosted option we're going to use is GitHub Pages and it's pretty ~~difficult~~ ~~error-prone~~ ~~convoluted~~ straightforward.
Head over to your GitHub account and create a new repo with the name `YOUR_USERNAME.github.io`. Mine will be `tunzor.github.io`.
Before we copy our generated files over, we need to make one very important change: update the base URL in our site's `config.toml` file. You should see the `baseURL` configured as `"http://example.org/"`; we need to change that to use the GitHub Pages URL (which is the name of our repo):

```toml
baseURL = "tunzor.github.io"
```
Now that we’ve updated that config, we need to regenerate the files. In our site’s directory:
```shell
# Delete public folder and its contents
rm -rf public/

# Regenerate our static files
hugo
```
We want to copy the `public/` directory's contents to an empty directory so that we can push it to our new GitHub repo.
```shell
# ~/hugo-blog is a temp directory; make it whatever you like
cp -r public/ ~/hugo-blog/

# Move to the directory, add all the generated files, and push it real good
cd ~/hugo-blog
git init
git add .
git commit -m "Initial commit with an informative commit message."
git remote add origin https://github.com/tunzor/tunzor.github.io.git
git push -u origin master
```
Navigate to your GitHub Pages URL, which is conveniently the name of our repo (`YOUR_USERNAME.github.io`), and bask in its warm, glowing, warming glow.
We’re [going to be] on cloud nine!
To get our site into the Google Cloud, we'll use the Google CLI tool `gsutil` to actually do the upload, but before we can use it, we have to install it. It comes packaged with the `gcloud` tool, which is how we can interact with the Google Cloud APIs. Instructions for setting it up are well documented here. Once we've got it installed and working, we'll need to initialize it so we can authenticate and start doing cool stuff.
In GCP, a project ID and project name are different. During creation, if the name we pick is globally unique, GCP will use it as both the project name and the project ID. If it's not, GCP will generate a unique ID for us; we don't want this to happen as it won't be as easy to remember.
```shell
# This opens a browser for authentication and sets gcloud config properties
gcloud init

# This will create a new GCP project
gcloud projects create my-super-unique-projects-name
```
Now that we’ve got our project up and running, we have to create a GCS bucket.
Unlike the project ID, which GCP can generate for us, the bucket name must be globally unique, so be specific. We'll be creating it in the `us-east1` region (this is part of the GCP free tier). Then we'll modify the access control list to make the bucket and its contents public.
Note: the `gsutil web set` command below is NOT necessary when using a plain bucket name, as it doesn't seem to work for them; you'll need to set up a domain and use that as the bucket name for it to work. For example, requesting `/posts/my-first-post/` won't automatically serve `/posts/my-first-post/index.html`. The site will still technically work, though, if we path directly to the index file: `https://storage.googleapis.com/BLOG-NAME/posts/my-first-post/index.html`. This seems like it might be related to DNS CNAME configuration.
```shell
# -p specifies the GCP project ID; we created it above
# -l is the bucket region
gsutil mb -p my-super-unique-projects-name -l us-east1 gs://toninos-blog

# Give everyone (read: unauthenticated users) read access
gsutil iam ch allUsers:objectViewer gs://toninos-blog

# Sets index.html as the default page if the path is a directory,
# e.g. /posts/my-first-post will serve /posts/my-first-post/index.html
gsutil web set -m index.html -e 404.html gs://toninos-blog
```
As we did for the GitHub Pages part above, we'll need to update the `baseURL` config before we push to GCS. Update it to use the GCS URL and the name of the bucket we created above, then regenerate the files as we did before:
```toml
# Update this line in config.toml
baseURL = "https://storage.googleapis.com/toninos-blog"
```

```shell
# Delete public folder and its contents
rm -rf public/

# Regenerate our static files
hugo
```
Next we need to copy the public directory’s contents to the bucket.
```shell
# Move into our public directory
cd public/

# -m for multi-thread/process
# -r to recursively copy (child directories/files)
gsutil -m cp -r * gs://toninos-blog
```
Aaaaand we’re done!
We can now navigate to our bucket using the `baseURL` from above, add `index.html` to it, and we should see our site!