Last month I released a plugin for fog, a Ruby gem that helps you manage servers across different cloud providers.

CloudAtCost is one of those unique providers where you pay once and get a server for life, so you basically don't have to worry about paying for the machine every month. However, the UI for creating a virtual machine is extremely horrible. But then I came across their API and noticed that you can set the RAM for each machine through it, unlike most other providers, which only offer fixed instance sizes.

So here is how I used fog-cloudatcost to set up multiple VPN servers.

require 'fog'

cac = Fog::Compute.new({
  :provider  => 'CloudAtCost',
  :email     => 'example@email.com',         # Your email address
  :api_key   => 'poiuweoruwoeiuroiwuer', # Your API token
})
Then we list the OS templates that are available:

cac.templates.each do |image|
  puts image.id
  puts image.detail
end

server = cac.servers.create :cpu         => 1,    # 1, 2 or 4 cores
                            :ram         => 1024, # in MB, multiples of 4, minimum 512
                            :storage     => 10,   # in GB
                            :template_id => 75    # OS template ID from the list above
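
That gives you one box. Since the whole point was multiple VPN servers, the same call can simply be repeated; a minimal sketch, assuming the same size and template work for every box:

# Request three identical servers to use as VPN endpoints.
3.times do |i|
  cac.servers.create :cpu         => 1,
                     :ram         => 1024,
                     :storage     => 10,
                     :template_id => 75
  puts "Requested VPN server ##{i + 1}"
end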

 



2015 was the year of Docker. I too was very skeptical about using Docker in production, but coming to the end of the year my views on it have changed drastically. This change was mostly due to improvements in the developer workflow and the maturation of tools like docker-compose (formerly fig).

My three major concerns with Docker were:

  • Storing persistent data, which is essential in cases like running a database or an image datastore.
  • Deploying containers.
  • Running development tools like guard, which listens for file changes and runs tasks like tests and code linting.

The first two of my concerns were resolved, but the last one still lingers. This issue is mostly down to the way my Docker setup works: during development I run Docker through boot2docker, since I work on a Mac. Guard is a really essential tool when developing Rails backends, since it runs your tests whenever your code changes. This gives rapid feedback and helps you get into the red/green flow. However, the file-change notifications never make it through to guard running inside the container, rendering it useless.
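
For context, the Guardfile I lean on looks roughly like the sketch below, assuming guard-rspec and the usual default watch patterns:

guard :rspec, cmd: 'bundle exec rspec' do
  # Re-run a spec when it changes.
  watch(%r{^spec/.+_spec\.rb$})
  # Run the matching spec when application code changes.
  watch(%r{^app/(.+)\.rb$}) { |m| "spec/#{m[1]}_spec.rb" }
  # Run the whole suite when the spec helper changes.
  watch('spec/spec_helper.rb') { 'spec' }
end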

I would like to end this post on a high note. The Docker community is growing and there are a lot of changes still to come to Docker.



Last Saturday I participated in the dec0de hackathon with my friend. Unlike previous hackathons I have taken part in, this time I had an idea in mind days before the event; all that was left was implementing it during the hackathon.

The idea was pretty straightforward: a mobile app that tracks transactions across your different bank accounts through the SMS messages you receive after every transaction.

Here is the presentation:

After thinking about it for a while, I realized that it was a combination of the presentation and the idea that made us win first place at the hackathon.


Moreover, we were also featured in a couple of articles online:

http://www.thenational.ae/business/the-life/race-to-the-app-store-innovative-minds-tackle-dubai-hackathon



Today I was working with one of the startups at the Cribb on their project, cleaning up their code and fixing a critical issue with it.

The codebase was not large, but it was fairly modular. Being an MVP, the product worked and was serving a bunch of users. Unfortunately, the code most important to them was a bit buggy: it was creating multiple connections to the database on a single request. The problem was made worse by the limit the hosting provider put on the number of database connections.

So we sat down to fix the code. Unfortunately the code didn't have tests; I wouldn't blame them, as it was an MVP. So we tested the website manually, turned on debug messages, and were off fixing the issue.

The main problem was in the database connection class. This class was used to create connections to the database, and every instance of it would open another connection.

The obvious solution that came to mind was to write a singleton, which is mostly frowned upon, but we decided to do it anyway. Moreover, it had been a really long time since I had touched PHP code, so after going through some old code from other projects I realized how easy it is to forget this stuff. Code I once thought I had mastered was completely erased from my head; luckily the concepts stayed with me.

Fixing the issue was straightforward: we created a singleton database connection class which the other models use to query for information. This brought the connection count down to just one, as it is supposed to be.
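
The project itself was PHP, but the shape of the fix is easy to show. Here is a minimal Ruby sketch of the same idea, with the class name, the pg driver and the query helper all just illustrative:

require 'singleton'
require 'pg' # assuming a PostgreSQL database for the sketch

class DatabaseConnection
  include Singleton

  def initialize
    # Opened once, the first time DatabaseConnection.instance is called.
    @connection = PG.connect(dbname: 'app')
  end

  def query(sql, params = [])
    @connection.exec_params(sql, params)
  end
end

# Every model goes through the same instance, so the process
# only ever holds a single connection.
DatabaseConnection.instance.query('SELECT * FROM users WHERE id = $1', [1])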

Singletons are well known for being an anti-pattern, since they make testing difficult and tend to get used as global variables. But in this case the singleton pattern was really beneficial.



A couple of months ago I worked on a Chef recipe to deploy pptpd on a server, and I used it to set up my own pptpd server on Host1Plus.
I even made it open source on GitHub. Last week I had to set up another pptpd server, so this was the right time to reuse the recipe I wrote. It had been more than a month since I had used Chef and knife-solo to set up a server.

I tried out my recipe and, of course, things were not working. There were multiple issues with the recipe. The documentation was as good as no documentation; I had to go through the Ruby code to actually understand the parameters. Moreover, I had to restart my server to get the iptables rules to apply, and on top of that the iptables rules in my recipe were not complete. Basically, the server would receive data from the client but would not forward it to the interface connected to the network.
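
For the record, the missing piece boiled down to enabling IP forwarding and masquerading traffic out of the public interface. A minimal sketch of what the recipe needs to do, assuming eth0 is the outward-facing interface:

# Let the kernel forward packets between the VPN and the network.
execute 'enable-ip-forwarding' do
  command 'sysctl -w net.ipv4.ip_forward=1'
  not_if 'sysctl -n net.ipv4.ip_forward | grep -q 1'
end

# NAT the VPN clients' traffic out through the public interface.
execute 'masquerade-vpn-traffic' do
  command 'iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE'
  not_if 'iptables -t nat -C POSTROUTING -o eth0 -j MASQUERADE'
end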

So with all of this chaos, the lesson I learnt is that open source is good, but open source is not just about writing code. The only way to make open source work is by supplementing your code with documentation and tutorials that help others, or the future you, get started with your project.



Bitbucket is a good Git hosting service, especially since they let you create unlimited private repos for free. However, Bitbucket doesn't let you use the same SSH key with two different users. Now you might be wondering why the hell I need two different SSH keys. My answer to that is "I wanted to keep my two accounts separate".

I came across this post on Stack Overflow, and it pretty much solved my issue. All I had to do was change my ~/.ssh/config file to this:

Host bitbucket1
  HostName bitbucket.org
  User git
  IdentityFile ~/.ssh/id_repo1

Host bitbucket2
  HostName bitbucket.org
  User git
  IdentityFile ~/.ssh/id_repo2

You will also have to update the Git remote URL for the repos:

git remote set-url origin bitbucket1:user/repo1



It's been about two months since I sat down to write a blog post. Just like my dieting, there are always highs and lows. Now that the five projects are done with, I can safely return to my computer and type away.

(image: monkey typing)

Usually when I am on a diet I start the week off with high hopes and energy, not consuming those awesome delicacies on my way home. But as the week passes I revert back to my old habits and eat out of control.
The same thing happens when I start blogging: it starts off with a couple of posts every week, and as work piles up I stop writing, and it all ends up in this lonely graveyard I call my blog.

So this time I will make it a habit to write at least one post every week.



I am really crazy about the concept of time travel, not regular travel (I hate travelling anywhere the weather is bad). By writing this blog I have actually done some sort of time travel, or something close to it.

Now you might be wondering what I am talking about. That's where the crazy inception comes in: the inception here is the blog post.
For me, blog posts are a way of reminding the future me how to get things done. I might have learnt something of utter importance a year ago, and due to the way my brain functions, my long-term memory is pretty crap without exercise.

This post is a note for myself in the future.



After looking online for a long time for a stable and secure way to set up Ruby for production, I was forced to install Ruby 2.1.2 from source. These are the steps I had to run to get Ruby running.
First, to build Ruby we need a few packages, so we start by updating the package information:

sudo apt-get update

Once that is done, the package manager knows where to get the packages and the package list is up to date.
To build Ruby on the machine we run the following command, which installs the build tools and libraries:

sudo apt-get install openssl libreadline-dev curl zlib1g zlib1g-dev libssl-dev libyaml-dev libxml2-dev libxslt-dev autoconf libc6-dev ncurses-dev automake libtool bison libcurl4-openssl-dev build-essential

 

Once that is done, we download the Ruby source and extract it into a directory. Next we run the configure script, which checks for the dependencies required for the build. Then it is time to build: make compiles the binary from source, and make install installs it on the machine.

wget http://cache.ruby-lang.org/pub/ruby/2.1/ruby-2.1.2.tar.gz
tar xzvf ruby-2.1.2.tar.gz
cd ruby-2.1.2
./configure
make
sudo make install

 

  
There you go, you now have Ruby 2.1.2 running on your machine.



Chef has really sped up the deployment process for me. It basically lets me define my infrastructure as code that is version controlled and redeployable in seconds. Today I will be working with chef-solo through Vagrant.
Vagrant allows me to create virtual machines, which helps me isolate dev environments from each other.

We start by creating the Vagrantfile; a minimal sketch of the one I start from is below.
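
The box name and the forwarded port here are just assumptions, pick whatever fits your project:

# Vagrantfile
Vagrant.configure('2') do |config|
  # A plain Ubuntu box that chef-solo will provision later.
  config.vm.box = 'ubuntu/trusty64'

  # Vagrant forwards guest SSH to host port 2222 by default,
  # which is the port knife solo bootstrap connects to below.
  config.vm.network 'forwarded_port', guest: 80, host: 8080
end
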
To manage the recipes and cookbooks I will be using librarian-chef. We initialize the Chef repo with the following command:

librarian-chef init

 

This creates the required Cheffile, which we will use to declare the cookbooks we need.
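
For a LAMP stack the Cheffile ends up looking something like the sketch below; the cookbook names here are only an assumption, swap in whichever ones you actually use:

# Cheffile
site 'https://supermarket.chef.io/api/v1'

cookbook 'apache2'
cookbook 'mysql'
cookbook 'php'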

To get the relevant cookbooks we have to run the following command.

librarian-chef install

This downloads the cookbooks mentioned in the Cheffile into the cookbooks directory.
We can then use knife to get the cookbooks onto the machine:

knife solo bootstrap vagrant@localhost -p 2222

This installs the Chef client on the Vagrant box. It also creates a .json file in the nodes directory; inside this file we list the recipes the Chef client should run.
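
knife solo names the node file after the host, so in this case it ends up as something like nodes/localhost.json. Assuming the LAMP cookbooks from the Cheffile sketch above, the run list looks roughly like this:

{
  "run_list": [
    "recipe[apache2]",
    "recipe[mysql::server]",
    "recipe[php]"
  ]
}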

Once the file is ready, running the knife solo bootstrap command again sets up the LAMP stack on the machine.