Doing static versions.

Most websites are static HTML pages, yet people continue to spin up web-server infrastructure running Apache or Nginx to serve them. If you want to keep costs low, you don't need any of that. You could use DigitalOcean, but I'm an AWS guy, so we'll do things the AWS way.

So let's talk about the goal. The more you work with cloud-based infrastructure, the more you understand it's not about the hardware; it's about the software and the code. Being able to version control your system is fantastic, and that extends to versioning your infrastructure: your hardware is no longer a lovely little pet you maintain, it has become the cattle that you slaughter and eat. Meaning you don't care about it; if it becomes an issue, you delete it and move on.

If you plan on doing a blog site, your best bet is to pay the small yearly cost to WordPress.com; they do a fantastic job. But most websites just post simple information. Think of a restaurant: how often does that experience change? The menu might change weekly or monthly, but contact information, location, etc. do not. This is where a statically hosted web page is perfect.

Now, since we all love Google, I did a quick search using these keywords:

automate static S3 GitHub

AWS is very powerful, and someone else already made a blog post covering what I wanted to explain:

Automate static website deployment from Github to S3 using AWS CodePipeline – sithum DevOps

If you follow that post, it will do everything you need, provided you're using GitHub. It shows how to create a CodePipeline that pushes updates to S3 whenever you submit changes to your GitHub project. A couple of parts are left out, though. One is SNS (Amazon Simple Notification Service), which is used to set up email or SMS notices for your pipeline: success, failure, or a request for manual intervention if something happens to it. Another is that your S3 buckets need to already exist. I'll post another article going over what I would call "AWS 101", outlining the security, billing, and alert notices to set up first in your AWS account.
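If you'd rather script those missing pieces than click through the console, here's a minimal sketch using Python and boto3. The bucket name, topic name, region, and email address are all placeholders I made up (they're not from the original post), and for brevity it skips the public-read bucket policy you'd also need for website hosting:

```python
import boto3

REGION = "us-east-1"              # placeholder region
BUCKET = "example-static-site"    # placeholder; bucket names must be globally unique
EMAIL = "you@example.com"         # placeholder address for pipeline notices

# Create the S3 bucket that will hold the site content.
s3 = boto3.client("s3", region_name=REGION)
s3.create_bucket(Bucket=BUCKET)   # outside us-east-1, add CreateBucketConfiguration

# Turn on static website hosting so S3 serves index/error pages directly.
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Create an SNS topic and subscribe an email address to it; point the
# pipeline's notifications at this topic for success/fail notices.
sns = boto3.client("sns", region_name=REGION)
topic_arn = sns.create_topic(Name="static-site-pipeline-notices")["TopicArn"]
sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint=EMAIL)
print(f"SNS topic ready: {topic_arn}")
```

One gotcha: SNS emails a confirmation link to that address, and the subscription isn't live until you click it.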

Another little thing to add to this pipeline would be invalidating CloudFront data. (CloudFront is a CDN that distributes your content from an origin, yours being S3, to geographically located edge servers around the globe.) Just because you updated S3 with newer code doesn't mean you'd be serving the most up-to-date content, that is, if you're using CloudFront. You would either need to let the content expire on those edge servers, usually after 24 hours, at which point a fresh pull from your origin is done, or you would process an invalidation request, which deletes those files from the edge caches. (However, this can be costly if you are continually invalidating single files all the time.) But CloudFront really deserves a post of its own.
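If you do bolt invalidation onto the pipeline, the call itself is simple. Here's a rough boto3 sketch; the distribution ID is a made-up placeholder:

```python
import time
import boto3

DISTRIBUTION_ID = "E1234EXAMPLE"  # placeholder CloudFront distribution ID

# Ask CloudFront to drop every cached object so the next request pulls
# fresh content from the S3 origin. CallerReference must be unique per call.
cloudfront = boto3.client("cloudfront")
response = cloudfront.create_invalidation(
    DistributionId=DISTRIBUTION_ID,
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),
    },
)
print("Invalidation started:", response["Invalidation"]["Id"])
```

Note the catch-all /* counts as a single invalidation path, which helps keep the cost down compared to invalidating files one by one.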

Also, keep in mind that each time your pipeline runs, i.e., each time you submit updates, it pushes new content into S3. And having an "active pipeline" costs $1.00 per month. That cost does not include the other service charges that might be associated with running the pipeline, i.e., data transfer costs, etc. So keep that in mind.

Some time spent in Vegas with AWS

Another year, another AWS re:Invent conference, this time with more people and longer lines. The downside: the re:Play after party was insane. A good time, but this year it had multiple issues, with 15-20 minute waits only to find out "we're out of food". Wow. Not just one area, but multiple areas out of food. I think a serious logistics issue occurred. In past years I never waited longer than a couple of minutes for drinks, and hardly at all for food. I'll say it was a communication issue this time; hopefully next year (2019) the process of getting food out will be smoother.

On the plus side, I did have access to the Certification Lounge, which made getting drinks, snacks, and coffee a very quick experience. However, most of the time all the nice seating was occupied; I think they had roughly 50 nice seats and a few small benches. It needs to be a bigger area for sure!

For the most part it was a very good conference overall; I heard 55K+ people attended. Of course the sessions are always packed, so it's a good thing they're recorded. This year they added little pins to track down, either by word of mouth or by doing certain sessions and activities, and you're awarded one of 60 pins. Lots of vendors did the same thing, so there were lots of pins to collect, some even doing different pins for different days.

A nice AWS swag item they added was a little water bottle, similar to what the Salesforce conference was doing. This made dealing with the Las Vegas heat nicer! Other than walking 10+ miles daily, the conference was awesome. Hope to see everyone again next year.