How to integrate your existing roles with Vagrant

If you have created an Ansible role and you want to reuse it with Vagrant, there are a couple of things you must know to run that role in a Vagrant machine.

As you may know, you can use Ansible playbooks with Vagrant. In fact, it is really easy. In your Vagrantfile you will have something like:

Vagrant.configure("2") do |config|
  config.vm.provision "ansible" do |ansible|
    ansible.playbook = "playbook.yml"
  end
end

But if you have an existing role and you want to apply it to the Vagrant virtual machine, you need to:

  1. Create an inventory alias for that server in your inventory file as follows:
    vagrant ansible_host=127.0.0.1 ansible_port=2222
  2. Create a playbook in the same directory as the Vagrantfile and target the alias created above. In our case we will call that file my_role.yml. Apply there the role (or roles) that you want to run on your server:
      - hosts: vagrant
        user: vagrant
        become: yes
        roles:
          - my_role
  3. Put a roles directory in the same location as the Vagrantfile, containing the roles you need for your Vagrant virtual machine. In our example it holds only a directory called ‘my_role’ with our templates, files, default vars and main tasks.
  4. Finally, inside the Vagrantfile configuration you must set the name of the virtual machine to match the alias specified above:
Vagrant.configure("2") do |config|
  config.vm.define "vagrant"
  config.vm.provision "ansible" do |ansible|
    ansible.playbook = "my_role.yml"
  end
end
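Putting the four steps together, the directory next to the Vagrantfile ends up looking something like this (the layout under my_role/ follows the standard Ansible role structure; the inventory file name and the exact subdirectories are just an example, only the ones you actually use are required):

```text
.
├── Vagrantfile
├── inventory        # contains the "vagrant" alias from step 1
├── my_role.yml      # the playbook from step 2
└── roles/
    └── my_role/
        ├── defaults/
        │   └── main.yml
        ├── files/
        ├── tasks/
        │   └── main.yml
        └── templates/
```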

Then you can easily run Vagrant with your roles.
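With everything in place, provisioning happens through the usual Vagrant workflow:

```shell
# First boot runs the Ansible provisioner automatically
vagrant up

# Re-run the provisioner (and therefore your role) on a running VM
vagrant provision
```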

Run Ansible tasks on a remote server using an SSH tunnel

If you want to run an Ansible playbook on a remote server through an SSH tunnel, you can use the following procedure:

Create an entry in your inventory file pointing the host at localhost and at the local port you want to use for the SSH tunnel. In our example we will use ‘tunnel’ as the server alias:

tunnel ansible_host=127.0.0.1 ansible_port=2222

The playbook should proceed as follows:

  1. Connect to localhost to create the tunnel.
  2. Connect to the remote server through the tunnel (which appears locally as localhost:2222) and run the tasks.
  3. Connect to localhost to tear down the tunnel.

So, first of all, kill any remaining SSH sessions that may be using the port configured above, then create the new connection. Note that we also prompt for the remote server IP (or hostname) and the remote SSH port. You don’t need to do that if you always connect to the same server or if you already know the remote SSH port; in that case you can hard-code them in your playbook instead of using variables:

- hosts: 127.0.0.1
  connection: local
  vars_prompt:
    - name: "hostname"
      prompt: "Enter remote server hostname or IP"
      private: no
    - name: "ssh_port"
      prompt: "Enter remote ssh port"
      private: no
  tasks:
    # pkill matches the full tunnel command line; || true keeps the task
    # from failing when no previous tunnel exists
    - name: "Kill previous tunnels on local port 2222"
      shell: pkill -f 'ssh -fN -L 2222' || true

    # -f backgrounds ssh, -N skips running a remote command: forward only
    - name: Create SSH tunnel
      shell: ssh -fN -L 2222:localhost:{{ ssh_port }} {{ hostname }}

Now that the tunnel has been established, you can run commands on the remote server with the following play:

- hosts: tunnel
  user: <user with ssh access>
  tasks:
    - name: "Remote task"
      ...

It is important to remark that you must know which user has SSH access to that server, and you must use either key-based authentication or the same credentials used for localhost.
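For example, with key-based authentication the user and key can go straight into the inventory entry (the user name and key path below are illustrative, not from the original setup):

```ini
tunnel ansible_host=127.0.0.1 ansible_port=2222 ansible_user=deploy ansible_ssh_private_key_file=~/.ssh/id_rsa
```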

To finish your playbook cleanly, it is better to kill the SSH tunnel:

- hosts: 127.0.0.1
  connection: local
  gather_facts: no
  tasks:
    # Same pattern as in the setup play: terminate the backgrounded tunnel
    - name: "Killing ssh tunnel process"
      shell: pkill -f 'ssh -fN -L 2222' || true
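All three plays can live in a single playbook file, one after another, and run in order with a single command (the playbook and inventory file names here are illustrative):

```shell
ansible-playbook -i inventory tunnel_playbook.yml
```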