How to shut down a kops Kubernetes cluster on AWS

While playing with the excellent kops I needed to shut down the cluster while it was not in use. This proved harder than expected: every time I stopped the machines, something would turn them back on. It turned out to be the AWS Auto Scaling groups.

To fix this, I didn't want to mess with the kops settings from the AWS side, so I had to find the kops way of doing it.

What I ended up doing was setting the instance groups to a size of 0 using kops edit.

For the nodes

kops edit ig nodes

and set maxSize and minSize to 0
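
In the editor, the instance group manifest looks roughly like this (a sketch; your machineType, subnets, and labels will differ). The only fields that need to change are maxSize and minSize:

apiVersion: kops.k8s.io/v1alpha2
kind: InstanceGroup
metadata:
  labels:
    kops.k8s.io/cluster: staging.espresive.com
  name: nodes
spec:
  machineType: t2.medium
  maxSize: 0   # scaled down from 2
  minSize: 0   # scaled down from 2
  role: Node
  subnets:
  - us-west-2a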

For the master, I first had to figure out its name:

$ kops get ig
Using cluster from kubectl context: staging.espresive.com

NAME               ROLE    MACHINETYPE  MIN  MAX  SUBNETS
master-us-west-2a  Master  m3.medium    0    0    us-west-2a
nodes              Node    t2.medium    0    0    us-west-2a

Then, with the name of my master

kops edit ig master-us-west-2a

and again set maxSize and minSize to 0
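
The master's manifest is edited the same way; only the name, role, and machine type differ (a sketch, using the values from the kops get ig output above):

spec:
  machineType: m3.medium
  maxSize: 0
  minSize: 0
  role: Master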

Lastly, I had to update my cluster:

kops update cluster --yes
kops rolling-update cluster
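
If you want to confirm from the AWS side that the Auto Scaling groups really went to zero, something like this works (assuming the AWS CLI is configured for the same account and region):

aws autoscaling describe-auto-scaling-groups \
  --region us-west-2 \
  --query 'AutoScalingGroups[].[AutoScalingGroupName,MinSize,MaxSize,DesiredCapacity]' \
  --output table

Note that kubectl will stop responding once the master is gone, since the API server goes with it, but kops commands keep working because they read from the state store in S3.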

Awesome, the cluster is offline now! No need to go into the AWS console.

If you want to turn your cluster back on, revert the settings: change your master back to at least 1 and your nodes to your liking (I use 2).
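
For reference, bringing the cluster back up is the same flow in reverse (a sketch; adjust the master name and node counts for your cluster):

kops edit ig master-us-west-2a   # set minSize and maxSize back to 1
kops edit ig nodes               # set minSize and maxSize back to 2, or whatever you prefer
kops update cluster --yes
kops rolling-update cluster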