2015 was the year of Docker. I too was very skeptical of using Docker in production, but coming to the end of the year, my views on it have changed drastically. This change was mostly due to the improvements in the developer workflow and the maturation of tools like docker-compose (formerly Fig).
My three major concerns with Docker were:

- How to store persistent data, which is essential in cases like running a database or an image datastore.
- Deployment of containers.
- Running development tools like Guard, which listens for file changes and runs tasks like tests and code linting.
The first two concerns were resolved, but the last one still lingers. The issue is mostly due to the way Docker is set up: during development I run Docker through boot2docker, since I work on a Mac. Guard is a really essential tool while developing Rails backends, since it runs your tests whenever your code changes. This provides rapid feedback and helps you get into the RED GREEN RED code flow. However, Docker doesn't push file-change events back to Guard, rendering it useless.
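One partial workaround I've tried (a sketch, assuming Guard runs inside the container; not a full fix) is to make Guard poll the filesystem instead of waiting for change events, which never make it through the boot2docker shared folder:

```shell
# Inside the container: --force-polling makes Guard's listener scan the
# filesystem on an interval instead of relying on inotify events;
# --latency sets how often it polls, in seconds.
bundle exec guard --force-polling --latency 2
```

Polling is slower and burns CPU, which is why this problem still feels unresolved to me.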
I would like to end this post on a high note: the Docker community is growing, and there are a lot of changes coming to Docker.
This is the first blog post in the Vim a Day series.
The post is about the ctrlP plugin, which is available here. This plugin is really useful, since it lets you jump between files really quickly.
If you are using Vundle, it's really easy to install.
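A minimal Vundle setup might look like this (note the plugin's repository moved from kien/ctrlp.vim to ctrlpvim/ctrlp.vim, so the exact path depends on when you read this); add the line to your .vimrc and run :PluginInstall:

```vim
" Between call vundle#begin() and call vundle#end() in .vimrc
Plugin 'ctrlpvim/ctrlp.vim'
```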
Now pressing Ctrl+P does a quick fuzzy search in the current directory. Here are some of the quick shortcuts that I came across:
- <c-t> opens the file in a new tab
- <c-v> opens the file in a vertical split
- <c-x> opens the file in a horizontal split
Lastly, there is an essential configuration option that can really speed up the fuzzy search:

let g:ctrlp_user_command = ['.git/', 'git --git-dir=%s/.git ls-files -oc --exclude-standard']

This makes ctrlP list files through git, so anything inside the .git folder or matched by .gitignore is skipped.
Today I upgraded one of my old Macs with an SSD. The SSD is freaking fast. But since the SSD was less than half the size of the existing HDD, I thought it would be a good idea to move the Docker host virtual machine to the HDD instead of the SSD.
Doing this was pretty simple.
First, make sure the virtual machine is switched off. Then click on Settings, and then on the Storage button, which is the third from the left.
Select disk.vmdk and click on the floppy icon with the minus button. (For those who don't know what a floppy is, click here.) Then click OK to save it.
Next, open the following path in Finder: ~/.docker/machine/machines/default
Then copy the disk.vmdk drive to the HDD.
Once it is copied, you can delete it from this folder.
Now we need to connect the virtual drive back to the virtual machine.
Click on File and select Virtual Media Manager.
Then click on the Remove button for the virtual drive. If you don't follow this step, VirtualBox will not allow you to connect the drive to the virtual machine.
Now you are ready to connect the virtual drive. Click on the Settings button for the virtual machine, select Storage, click on the plus HDD button, and select the virtual drive from its new location on the HDD.
Kitematic will keep using the same virtual machine and will not notice a difference.
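The same steps can also be done from the command line with VBoxManage. This is only a sketch: it assumes the machine is named default and its storage controller is called "SATA" (check yours with VBoxManage showvminfo default), and the HDD path is an example:

```shell
# Detach the disk from the VM (controller name and port are assumptions)
VBoxManage storageattach default --storagectl "SATA" --port 1 --device 0 --medium none

# Remove the old path from VirtualBox's media registry
VBoxManage closemedium disk ~/.docker/machine/machines/default/disk.vmdk

# Attach the copy that now lives on the HDD (example path)
VBoxManage storageattach default --storagectl "SATA" --port 1 --device 0 \
  --type hdd --medium /Volumes/HDD/docker/disk.vmdk
```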
Recently I bought an old Mac from my friend. This Mac is pretty old; it has two graphics cards, one integrated and one discrete. Switching between the cards was really important, since the machine would heat up while using the dedicated GPU.
I found a tool online at gfx.io. It lets me easily switch the GPU using a status bar application.
Last week my colleague Victor from Paack and I were digging through the server to look at the logs.
```
-rw-rw-r-- 1 dev      dev  3.5M Nov 25 17:33 newrelic_agent.log
-rw-r--r-- 1 www-data root   1G Nov 25 17:55 nginx.access.log
-rw-r--r-- 1 www-data root   1G Nov 25 17:55 nginx.error.log
-rw-rw-r-- 1 dev      dev     0 Apr 30  2015 production.log
-rw-rw-r-- 1 dev      dev    1G Nov 25 17:33 puma.access.log
-rw-rw-r-- 1 dev      dev   10G Nov 25 12:03 puma.error.log
-rw-rw-r-- 1 dev      dev     0 Dec  2 17:34 puma.log
-rw-rw-r-- 1 dev      dev   50M Dec  2 17:52 sidekiq.log
```
When we looked at this folder we were shocked: our log files had grown to several gigabytes each. That was a huge problem, since it could result in slower requests as each one appends to these ever-growing files.
After some googling, we found that logrotate comes as part of Ubuntu, so we thought of using it.
The configuration was pretty simple. Logrotate configuration is really cool; it looks like a bunch of DSL directives:
- weekly: runs the rotation weekly
- size: the maximum size a log file can reach before it is rotated
- rotate 30: keep a maximum of 30 rotated files
- missingok: ignore the error if the file is missing
- compress: compress the old logs with gzip
- copytruncate: copy the old log file to the rotated name, then truncate the original in place so the running process can keep writing to it
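Put together, a minimal logrotate config using those directives might look like this (the log path and size limit here are assumptions; adjust them to your app):

```
/home/dev/app/shared/log/*.log {
  weekly
  size 100M
  rotate 30
  missingok
  compress
  copytruncate
}
```

Drop a file like this into /etc/logrotate.d/ and the daily logrotate cron job will pick it up.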
More information can be found on the logrotate documentation website.