Checking my monitoring logs I have 26 minutes of downtime attributable to Netlify since 03 March 2017. That's about 99.994% uptime.
If I attribute every single outage to them, that's 43 minutes since 03 March 2017. Some of those were not Netlify errors, but even if we are uncharitable/inaccurate and pin them all on Netlify, that still works out to 99.98% uptime.
Not bad for a free service.
They're also very transparent about service issues:
And support is very responsive. I recently talked to them about deprecating TLS 1.0 and 1.1, for instance, and about providing an option to force TLS 1.2 if users want it. They were quick to respond and helpful.
So if you are seeing unusual issues, get in touch with them, even if you're a free user they'll still talk to you.
I use hugo and then set up my output folder (where the generated output goes) as a git repo. Generate the site -> git commit/push -> Caddy. Caddy has a feature to pull in content from a git repo, so when you couple that with the built-in Let's Encrypt support it's dead simple.
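For reference, the Caddy side of that setup can be just a few lines. This is Caddy v1 Caddyfile syntax with the http.git plugin; the domain and repository here are placeholders:

```
example.com {
    root /var/www/site
    # pull the generated site from the git repo, re-fetching periodically
    git https://github.com/user/site-output.git
}
```

With a hostname in the site address like this, Caddy obtains and renews the Let's Encrypt certificate automatically.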
I stopped using Caddy when I discovered that it didn't support one of the unusual TLDs I had. Maybe I should give it another go, though. That was over a year ago.
It was a bit of work to get everything running, but there's very little to actually maintain afterward. I'm definitely going to start replicating the setup elsewhere (including my own homepage, which is currently down due to its VPS having failed hard and me not having enough time to rebuild it).
I simply have a travis-ci.org setup that runs "rsync" after "jekyll build". And when I'm not in the mood to wait on Travis, I just "rsync" from localhost after building.
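A sketch of what that kind of .travis.yml can look like, assuming a Gemfile that provides jekyll; the host and path are made up:

```yaml
language: ruby
script:
  - bundle exec jekyll build
deploy:
  provider: script
  script: rsync -az --delete _site/ deploy@example.com:/var/www/site/
  skip_cleanup: true
  on:
    branch: master
```

(rsync over SSH would also need a deploy key or credentials configured in the Travis settings.)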
Having an automated build system has advantages, though: if you get a PR on your website repository fixing typos and so on, you just have to merge it and the content gets published, so you can do it from your phone. And yes, I have had PRs, since I publish 2 project documentation websites this way.
Also folks, you don't need a CDN or Cloudflare, or any of that: you just need a healthy Nginx setup hosted at a decent VPS provider. I've had my websites withstand Reddit- and HN-level traffic just fine, paying $5 per month to host about 4 static websites, plus other stuff.
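For what it's worth, the Nginx side of that can be very plain. A sketch of a server block for one static site (domain and paths are placeholders; TLS is left out for brevity):

```nginx
server {
    listen 80;
    server_name example.com;
    root /var/www/example.com;
    index index.html;

    # long cache lifetimes for static assets; HTML stays short-lived
    location ~* \.(css|js|png|jpg|svg|woff2)$ {
        expires 30d;
        add_header Cache-Control "public";
    }

    # compress text content on the fly
    gzip on;
    gzip_types text/css application/javascript image/svg+xml;
}
```

Serving files straight off disk like this is exactly what Nginx is optimized for, which is why a cheap VPS holds up under a traffic spike.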
I also hate Medium, Blogger, WordPress and any of that crap, I hate their bloat and trackers, and I do think having your own website published from a Git repository is worth it. Yes, there is a cost in maintaining my websites, but I do so willingly, because they are mine.
PS: shameless plug — https://alexn.org
You are also right about not needing a CDN. My site has occasionally become momentarily popular and my $5 hosting VM hasn't even blinked on my completely static site. A database is a fine thing, but you don't want to be serving web pages out of one. That's why I finally ditched WordPress.
[1] https://sheep.horse/tagcloud.html#computing - a complete waste of your time.
More specifically, I like the dry humor and scattershot nature of the content. Reminds me of how the web used to be.
I do my editing with hugo in server mode so I can WYSIWYG edit my pages. Then I run a bash script:
#!/bin/bash
# build the site from markdown + templates
hugo -s ~/sitedir
# sync the generated output to the S3 bucket (sync is recursive by default)
aws s3 sync ~/sitedir/public s3://sitedirbucket
# invalidate the CDN distribution so content delivery is nice and fresh!
aws cloudfront create-invalidation --distribution-id XXXXXXXXXX --paths "/*"
echo "All done"

I like the CodeBuild solution for the times when I'm editing on my phone or a shared computer. I push to GitHub, and CodeBuild handles:
* build (as above, plus asset processing and minification)
* deploy (s3 sync, plus some fiddling to add 301 redirects)
* ping search engines
I keep a lot of drafts and temporary notes in my local checkouts and doing build/deploy on a fresh checkout helps to ensure they don't slip onto the public website.
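That CodeBuild pipeline could be captured in a buildspec along these lines; the generator command, bucket, and distribution id are placeholders, and the redirect fiddling and search-engine ping are left out:

```yaml
version: 0.2
phases:
  build:
    commands:
      - hugo --minify            # build plus minification
  post_build:
    commands:
      - aws s3 sync public/ s3://example-bucket --delete
      - aws cloudfront create-invalidation --distribution-id EXAMPLE123 --paths "/*"
```

Building from a fresh checkout also falls out naturally here, since CodeBuild clones the repo for every run.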
I wonder how much cool stuff you could add to this build gem. Minification, staging, SCSS, etc.
The c9 setup is really nice for online page editing and for compiling the static pages. I am not quite sure whether I would really use it in my current workflow, though.
Disclaimer: I am the author of github-bucket.
[1] https://gohugo.io/ [2] https://github.com/berlam/github-bucket
Is this because of people wanting to host on static-asset only servers (GitHub Pages, S3 Website, etc) or is there some other benefit above simply using any standard blogging software? If it's a question of speed, that's what caching does.
It's simpler on the server, as you say; you can serve the files from pretty much anywhere, and you need fewer resources to do so. I appreciate the cache idea (I used to do it myself with WordPress, and still do with MyBB), but it's imperfect and there are always misses, especially if someone is actively cache busting to DoS your site.
Much less hassle to set up. If you're going to do it 'properly', you're going to want to run your dynamic site in a chroot jail, run the PHP process under a unique user per site (especially with nginx), set up unique database users and databases per site, secure your credentials, have a version control setup and update functionality, and on and on. There's a lot to do. You can automate it (I have), but it's still annoying and requires monitoring.
You can move almost anywhere almost instantly, with just a git push/rsync and a DNS change.
Hugely reduced attack surface. It's literally a collection of text files.
The benefits are somewhat reduced if you get someone else to manage your hosting for you, but it remains simpler to move and usually cheaper to host as you need no database service.
Anyway, that aside: so it's literally just the desire to have a zero-footprint blog. I can appreciate that notion, but I'm surprised this trend came about with brand-new tools as opposed to just packaging up the output of existing blog generators.
Sounds like you should build a WP plugin that generates a static/exportable site. You know the space and clearly understand the market dynamics.
Web sites are software products. If you think like a coder, you want to run them like a coder. If I were just a writer, I'd probably think very differently.
The new thing for me was using GitLab CI/CD. I taught the 'customer' how to edit on the GitLab website and do merges. Now changes are deployed automagically without me needing to get involved.
Best part: no WordPress databases I need to worry about!
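For reference, a minimal .gitlab-ci.yml for that kind of Pages setup might look like this. I'm assuming Hugo here (the comment doesn't say which generator); GitLab Pages requires the job to be named pages and the output to land in public/:

```yaml
image: registry.gitlab.com/pages/hugo:latest
pages:
  script:
    - hugo
  artifacts:
    paths:
      - public
  only:
    - master
```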
Maybe I’m unaware of potential issues for a static site?
* Build systems besides Jekyll
* HTTPS for custom domains
...but the second problem can easily be solved with Cloudflare and the first one can be solved with a git subtree push. Beyond this it's pretty fully featured.
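The git subtree push trick, for anyone who hasn't used it: you keep the generated output in a subdirectory and publish only that directory as a separate branch. A throwaway demo, with a local bare repo standing in for the GitHub remote (all names here are made up):

```shell
set -e
# work in a scratch directory
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare pages-remote          # stand-in for the GitHub remote
git init -q site && cd site
git remote add origin ../pages-remote
# pretend this is the generator's output directory
mkdir public
echo '<h1>hello</h1>' > public/index.html
git add -A
git -c user.name=demo -c user.email=demo@example.com commit -qm "build output"
# publish only the public/ subtree as the gh-pages branch
git subtree push --prefix public origin gh-pages
```

Against GitHub Pages the remote would be your site repository; the gh-pages branch ends up with index.html at its root, which is what Pages serves.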
So an attacker can still alter/intercept content between GitHub Pages and Cloudflare before it gets to the visitor.
To some, the illusion of security might be considered more harmful than knowing you have none at all.
As an alternative, GitLab Pages offers HTTPS on custom domains, provisioned by Let's Encrypt, I believe.
Netlify does something similar.
Both alternatives also allow any build system you configure.
Terraform config here: https://github.com/charlieegan3/personal-website/tree/master...
Sounds "just fine" for your own personal blog, sure…