network-automation-blog

A collection of posts on technologies used in the area of network automation and programmability.


This post walks through a small pet project of mine: automating the fetching of refs from remotes for the Git projects I work with, using Ansible.

Staying up to date with projects

Working with multiple people on multiple projects can be difficult to keep track of. Sometimes you open a new branch to start on a new feature, get your work done, and submit a pull request, only to have your reviewer(s) curse you for forgetting to rebase your branch on the destination branch. As automation engineers, we want all forgetful, repetitive tasks to be done for us in some shape or fashion. For me, this is one such small task that I wanted done by running a single command on my system.

The process

I keep my Git projects within my WSL setup, where the root folder is my home folder. Within the home folder, I have a mix of different types of files and folders.

So chalking out a straightforward way to fetch all refs for all Git repositories within my root directory looked like this:

  1. Get a list of all directories in the root directory
  2. For each directory, find out if it is a Git repository
  3. If it is a Git repository, find out if remote repository tracking is configured
  4. If points 2 and 3 are satisfied, fetch the remote refs for that repository
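To make the logic concrete, the four steps above can be sketched as a plain shell function (a hypothetical sketch only; the Ansible playbook below is what I actually use):

```shell
#!/bin/sh
# fetch_all_repos walks every directory under the given root,
# keeps only Git repositories with at least one remote configured,
# and fetches refs from all of their remotes.
fetch_all_repos() {
    root="$1"
    for dir in "${root}"/*/; do                          # step 1: every directory
        [ -d "${dir}.git" ] || continue                  # step 2: is it a Git repo?
        [ -n "$(git -C "${dir}" remote)" ] || continue   # step 3: any remotes?
        git -C "${dir}" fetch --all                      # step 4: fetch remote refs
    done
}

# Example: fetch_all_repos "$HOME"
```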

For me, this looked like a good opportunity to use Ansible, because I find writing YAML easier than dealing with shell scripts.

NOTE: This works properly only when your Git repositories have personal access tokens configured to interact with your remote Git repositories. Read this for GitHub

The playbook

To get a basic hang of Ansible, please see my post on Ansible. I create a file named git-fetch-remotes.yml and define the skeleton of the playbook:

---

- name: FETCH REMOTES FOR YOUR REPOS
  hosts: localhost
  gather_facts: no

  vars:
    root_directory: "/home/username/"

The playbook runs on localhost, which is the Ansible controller itself, and the root_directory variable tells the playbook where to look for Git repositories.

Now to get to the tasks:

  tasks:

  - name: LIST ALL DIRECTORIES IN ROOT WITH INDICATOR
    shell:
      cmd: "ls -d */"
      chdir: "{{ root_directory }}"
    register: dirs

  - name: LOOP THROUGH LIST OF ITEMS IN ROOT DIRECTORY
    include_tasks: find-git-repos.yml
    loop: "{{ dirs.stdout_lines }}"

  - name: GIT FETCH FOR REMOTES
    command:
      chdir: "{{ root_directory ~ item }}"
      argv:
        - git
        - fetch
        - "--all"
    loop: "{{ git_repos | default([]) }}"
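One detail the playbook relies on: `ls -d */` prints each directory name with a trailing slash, which is why `root_directory ~ item` concatenates into a valid path without needing an extra separator. A quick check:

```shell
#!/bin/sh
# Show that `ls -d */` emits directory names with a trailing slash,
# so "/home/username/" ~ "project/" joins into a clean path.
tmp=$(mktemp -d)
mkdir "${tmp}/project"
cd "${tmp}"
ls -d */    # prints: project/
```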

A rough run-through of the tasks here:

  1. The first task lists every directory under root_directory (the trailing slash in ls -d */ keeps only directories)
  2. The second task includes find-git-repos.yml once per directory, building up a git_repos list
  3. The third task runs git fetch --all in every directory collected in git_repos

And a look into find-git-repos.yml:

---

- name: GET LIST OF ALL ITEMS IN DIRECTORY
  command:
    cmd: "ls -a"
    chdir: "{{ root_directory ~ item }}"
  register: ls_a

- name: IF GIT REPO EXISTS IN DIRECTORY, REGISTER CONFIGURED GIT REMOTES
  command:
    cmd: "git remote -v"
    chdir: "{{ root_directory ~ item }}"
  when: '".git" in ls_a.stdout_lines'
  register: remotes_list

- name: IF GIT REMOTES PRESENT, ADD TO git_repos LIST
  set_fact:
    git_repos: "{{ git_repos | default([]) | union([item]) }}"
  when: remotes_list.stdout is defined and remotes_list.stdout != ""

Once this playbook is ready, you can keep an alias for it in your .bashrc or similar file (depending on your *nix operating system). I keep an alias called go-fetch, so running it from my Bash shell executes the playbook:

alias go-fetch="ansible-playbook /path/to/git-fetch-remotes.yml"

Additionally, you can have the fetch run on a schedule on your machine using a cron job by adding an entry to your crontab.
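For example, a crontab entry along these lines would fetch every hour (a hypothetical schedule; adjust both paths for your system). Note that cron does not expand shell aliases, so the entry calls ansible-playbook directly rather than go-fetch:

```shell
# Run `crontab -e` and add a line like this; cron uses a minimal
# PATH, so the full path to ansible-playbook is spelled out.
0 * * * * /usr/bin/ansible-playbook /path/to/git-fetch-remotes.yml
```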

References