Zero to Bitbucket in three weeks flat

Reading Time: 3 minutes

We have all been there. When we started bitHound, I knew we would inevitably be supporting multiple platforms and different environments for analyzing code. However, with several code hosting platforms out there and limited resources available, we had to pick one platform to start with, and do so quickly.

This decision was then followed by the pile of assumptions, shortcuts, leaky abstractions and general hackery that defines any software project. Eventually, the time came to clean things up, get back on track and focus on adding true support for an additional code management platform: Bitbucket.

Step 1: Who are you?

Authentication and authorization are always a tricky part of any project; however, bitHound didn’t need to store much user information beyond the GitHub OAuth token kept on the session.

Adding support for Bitbucket was just a matter of creating real user records that linked both the Bitbucket and GitHub tokens to a single user.

This turned out to be quite simple, since we use Passport and could wire up support for multiple OAuth providers quickly.
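As a sketch of that wiring, each Passport verify callback can funnel into one helper that stores the provider's token on a shared user record. The helper, field names and strategy snippet below are hypothetical, not bitHound's actual code:

```javascript
// Minimal sketch: one user record holding OAuth tokens from several
// providers. Field and function names are hypothetical.
function linkProvider(user, provider, token) {
  // Merge the new token in without clobbering tokens from other providers.
  var tokens = Object.assign({}, user.tokens);
  tokens[provider] = token;
  return Object.assign({}, user, { tokens: tokens });
}

// Each Passport verify callback would funnel into the same helper, e.g.:
//   passport.use(new BitbucketStrategy(opts, function (access, refresh, profile, done) {
//     done(null, linkProvider(findUser(profile), 'bitbucket', access));
//   }));

var user = { id: 42, tokens: { github: 'gh-token' } };
var linked = linkProvider(user, 'bitbucket', 'bb-token');
console.log(linked.tokens); // both tokens now live on one record
```

The key point is that the strategies differ per provider, but they all converge on the same user record, so the rest of the app never cares which OAuth flow produced the token.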

Step 2: Hiding the cruft

Our UI had grown rather organically – adding pages, features and controls over the course of a year of development. Most of our views required us to get information from GitHub about repositories, users, commits, files, etc. Over time we had abstracted most of these calls into an API wrapper, but we still used the raw objects GitHub returned to render our pages.

We started with a simple provider -> API -> map combo that let us extract and modularize our types, map them to APIs and provide a standard interface:

  var provider = require('providers/github');

  provider.repo.get({owner: 'bithound', repo: 'app'}, function (err, repo) {
    // render the page
  });
Where the provider call would map which API calls are needed to build up the new standard repo type:

var api = require('github/api');
var map = require('maps/repo');

function get(opts, callback) {
  api.getRepo(opts, function (err, repo) {
    if (err) return callback(err);
    api.getLanguages(opts, function (err, languages) {
      if (err) return callback(err);
      map.github.repo(repo, languages, callback);
    });
  });
}
After some refactoring and getting all of our pages using the new provider, it was just a matter of matching the Bitbucket API calls to our entities. One of the best tools I had while working on this was the REST API browser. Being able to quickly test and browse APIs to find the ones that matched what I needed was very useful.

Once we had the list of APIs, we were surprised that there were very few cases where we were missing something GitHub had provided us. Between the version 1 and version 2 API endpoints, we had access to everything we needed:

var api = require('bitbucket/api');
var map = require('maps/repo');

function get(opts, callback) {
  api.getRepo(opts, function (err, repo) {
    if (err) return callback(err);
    api.defaultBranch(opts, function (err, branch) {
      if (err) return callback(err);
      map.bitbucket.repo(repo, branch, callback);
    });
  });
}
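To illustrate the map layer both providers share, here is a hypothetical maps/repo module that normalizes each provider's raw payload into one standard repo shape. The field choices are assumptions for illustration, not bitHound's actual schema:

```javascript
// Hypothetical maps/repo module: normalize each provider's raw payload
// into one standard repo shape, so pages render without caring where
// the data came from. Field names are assumptions.
var map = {
  github: {
    repo: function (raw, languages, callback) {
      callback(null, {
        owner: raw.owner.login,            // GitHub nests the owner object
        name: raw.name,
        defaultBranch: raw.default_branch,
        languages: Object.keys(languages || {})
      });
    }
  },
  bitbucket: {
    repo: function (raw, branch, callback) {
      callback(null, {
        owner: raw.owner,                  // Bitbucket returns a plain string
        name: raw.slug,
        defaultBranch: branch && branch.name,
        languages: raw.language ? [raw.language] : []
      });
    }
  }
};

// Both providers now produce the same shape:
var ghRepo, bbRepo;
map.github.repo(
  { owner: { login: 'bithound' }, name: 'app', default_branch: 'master' },
  { JavaScript: 1000 },
  function (err, repo) { ghRepo = repo; });
map.bitbucket.repo(
  { owner: 'bithound', slug: 'app', language: 'javascript' },
  { name: 'master' },
  function (err, repo) { bbRepo = repo; });
```

Once the map layer owns all the shape differences, everything above it – views, routes, background jobs – only ever sees the standard entity.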
Step 3: What do you mean they are not the same?

Once we had most of the functionality and UI working, we needed to make sure our backend processes could work with the repositories.

Our back-end services at the time took the owner/repo and access token, and used that information to clone, analyze and save the results.

While GitHub allows us to clone via HTTPS with the access token, Bitbucket does not. Initially we were a little scared, because cloning with just the token was so easy. Once we got over our initial fear, though, we found it would actually be simple to handle this with Bitbucket.

Bitbucket provides two APIs for adding SSH keys to allow cloning of private repositories. Deploy keys can be added to a repository by an admin and allow read-only access to the code. User-level SSH keys give an application the same access the user has to all of their repos.

We needed to be able to clone and analyze any repo a user had access to, regardless of admin rights, so we settled on adding a user SSH key. This is done automatically once a user attempts to process a private repo. The key is unique per user and can be rotated at regular intervals.
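Grounded only in the note above that keys are per user and rotated at regular intervals, a rotation check might look like this. The helper name, field name and 30-day interval are all hypothetical:

```javascript
// Hypothetical helper: decide whether a user's SSH key is due for
// rotation. The actual interval and field names may differ.
var ROTATE_AFTER_DAYS = 30;

function keyNeedsRotation(user, now) {
  if (!user.ssh_key_created_at) return true; // no key yet: create one
  var ageMs = now - new Date(user.ssh_key_created_at).getTime();
  return ageMs > ROTATE_AFTER_DAYS * 24 * 60 * 60 * 1000;
}

var now = Date.parse('2015-06-01T00:00:00Z');
var fresh = keyNeedsRotation({ ssh_key_created_at: '2015-05-20T00:00:00Z' }, now);
var stale = keyNeedsRotation({ ssh_key_created_at: '2015-01-01T00:00:00Z' }, now);
console.log(fresh, stale); // false true
```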

Using a custom SSH key with Git is really easy: you point the GIT_SSH environment variable at an alternative script for Git to call in place of ssh.

exec('git clone ' + url + ' ' + target_folder, {
  env: {
    BH_KEY: user.ssh_key_path,
    GIT_SSH: path.join(__dirname, '../etc/ssh/')
  }
}, callback);

in which our script is:


#!/bin/sh
exec /usr/bin/ssh -o StrictHostKeyChecking=no -i "$BH_KEY" "$@"

Overall, building our integration with Bitbucket was a very smooth, positive experience. Atlassian’s Bitbucket development team was both informative and supportive. They have some great tools and APIs available to developers, and we are really excited to see what will happen now that the integration is complete.