OpenMDM Development Workflow

The OpenMDM project uses an Agile project management methodology. Our work revolves around three-week Sprints. The Sprint artifacts are derived from our JIRA Story/Issue backlog. Our Sprints are managed using a set of standard meetings:

  • Grooming
    • The participants of this meeting are the Product Manager, the development team, and the QA team. This meeting happens before the sprint starts. The team reviews the backlog for stories that may be included in the next sprint and ensures that the "Story Points" for those stories are accurate.
  • Sprint Planning
    • The participants of this meeting are the Product Manager, the development team, the QA team, and the project/product stakeholders. The team first reviews and prioritizes the stories that were not completed in the previous sprint and pulls them into the next sprint, then reviews and prioritizes stories from the backlog, choosing the ones it can complete in the next sprint.
  • Sprint Review
    • The participants of this meeting are the Product Manager, the development team, the QA team, and the project/product stakeholders. If demonstrable, the engineers show their completed stories to the stakeholders. Only stories that are complete should be demonstrated in this meeting. The criteria for completeness is as follows:
      • Code has been committed to a feature branch.
      • The code passes all unit tests.
      • The code meets all code coverage requirements.
      • A pull request has been created to merge the code into the "develop" branch. A set of "Reviewers" has been assigned to that pull request.
      • The "Reviewers" have reviewed the code and "Approved" the pull request.
      • The developer has "Merged" the code into the develop branch.
      • The artifact has successfully been deployed to a QA instance.
      • A QA engineer has marked the Story/Issue as "Passed QA". We are being a little lax about this requirement since we only have one QA engineer.
    • Engineers should be demonstrating their work from the QA instance.
  • Sprint Retrospective
    • The participants of this meeting are the Product Manager, the development team, and the QA team. In this meeting, each participant is asked to describe what went well and what could be improved from the previous sprint.

Sprint Workflow

  • During Sprint Planning, we will attempt to assign Stories/Issues to engineers
  • After the Sprint Planning meeting, engineers (QA & developer) will task out their stories in JIRA
  • When an engineer starts working on a Story/Issue, they should "Start Progress" in JIRA. Right now we are ignoring the "Begin Business Analysis" and "BA Review Complete" statuses, so you can skip through them.
  • The Technology Center uses Git for version control. Git is a distributed version control system (DVCS) as opposed to a centralized version control system (CVCS). If you have not used a DVCS, (Git, Mercurial, Bazaar), please go through an online tutorial, as there are significant differences between DVCS & CVCS. Below are a couple of tutorials: 
  • In Git, you should create a branch off of the "develop" branch with a branch name matching the format "feature/MDM-XXX", where MDM-XXX matches your JIRA Story/Issue:
    • git checkout develop
    • git pull
    • git branch feature/MDM-XXX
    • git checkout feature/MDM-XXX
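The branch-creation steps above can be sketched end to end in a throwaway repository. This is a minimal demo, not the real project checkout: the scratch-repo setup lines and the MDM-123 issue key are placeholders for illustration only.

```shell
#!/bin/sh
set -e
# Scratch repo so the example is self-contained; skip this setup in real work.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name "Dev"
git checkout -q -b develop
git commit -q --allow-empty -m "initial commit"

# The workflow steps: start from an up-to-date develop, then branch.
git checkout -q develop
# git pull                           # skipped here: the scratch repo has no remote
git checkout -q -b feature/MDM-123   # shorthand for git branch + git checkout
git branch --show-current            # the new feature branch is now checked out
```

As shown, `git checkout -b feature/MDM-XXX` combines the `git branch` and `git checkout` steps into a single command.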
  • For the duration of your Story/Issue work, you can and should commit your work often to your local repository. You can and should also push your work to Bitbucket. Be aware that when you push your branch to Bitbucket, a Jenkins build will be triggered, the unit tests will be run and, in the future, Cobertura code coverage thresholds will be checked. We may want to limit the length of this build's test run to a short (10 minutes or less) sanity test and then add a longer nightly test.
    • Use "git add" to "stage" the files you have changed or want to add to the repository 
    • The following command can be used to commit to your local repository:
      • git commit -m "MDM-XXX - a description of the change"
  • As you add code to your repository, make sure you are adding unit/Postman tests and, in the future, checking the Cobertura code coverage thresholds using the following commands (does Cobertura give coverage results for Postman tests? -Mark):
    • Run unit tests
      • mvn clean package
    • IN THE FUTURE: Generate Cobertura code coverage web reports:
      • mvn clean cobertura:cobertura
        • In the module being built, run the following command (on a Mac) to open the report:
          • open target/site/cobertura/index.html
      • Can Cobertura check coverage for both JUnit and Postman tests?

    • IN THE FUTURE: Check Cobertura code coverage thresholds:
      • mvn clean cobertura:check
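For when we turn the check on: `cobertura:check` reads its thresholds from the plugin configuration in the module's `pom.xml`. A sketch, assuming the `cobertura-maven-plugin`; the version and threshold numbers below are placeholders we would still need to agree on as a team.

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>cobertura-maven-plugin</artifactId>
  <version>2.7</version>
  <configuration>
    <check>
      <!-- Placeholder thresholds; not agreed team values. -->
      <totalLineRate>80</totalLineRate>
      <totalBranchRate>70</totalBranchRate>
      <haltOnFailure>true</haltOnFailure>
    </check>
    <formats>
      <format>html</format>
    </formats>
  </configuration>
</plugin>
```

With `haltOnFailure` set, `mvn clean cobertura:check` fails the build when coverage drops below the thresholds, which is what would let Jenkins enforce them.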
  • Also, as you push your branch to Bitbucket, make sure the Jenkins builds are passing. If the Jenkins builds are not passing, your Story/Issue is not complete.
  • As you are working on your feature branch, it's a good idea to merge "develop" into your branch occasionally.
    • There are multiple ways of doing this. One way is to have your branch checked out locally and run the following commands (fetch first so that origin/develop is up to date):
    • git fetch origin
    • git merge origin/develop
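The merge-from-develop step can be sketched with a pair of throwaway repositories, one standing in for Bitbucket. Repo paths and issue keys here are demo placeholders.

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d)

# Stand-in for the Bitbucket (origin) repository.
git init -q "$tmp/origin"
cd "$tmp/origin"
git config user.email dev@example.com
git config user.name "Dev"
git checkout -q -b develop
git commit -q --allow-empty -m "MDM-100 - initial develop commit"

# Your local clone, with a feature branch based on origin/develop.
git clone -q "$tmp/origin" "$tmp/clone"
cd "$tmp/clone"
git config user.email dev@example.com
git config user.name "Dev"
git checkout -q -b feature/MDM-123 origin/develop

# Meanwhile develop moves ahead on origin...
git -C "$tmp/origin" commit -q --allow-empty -m "MDM-101 - change on develop"

# ...so refresh your view of origin, then fold develop into your branch.
git fetch -q origin             # updates origin/develop locally
git merge -q origin/develop     # a fast-forward here; may create a merge commit
git log -1 --pretty=%s          # the feature branch now has the develop change
```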
  • Push your code from your local repository into the Bitbucket repository (origin)
    • git push origin feature/MDM-XXX
  • Before you do your "Pull Request", merge develop into your feature branch.
  • Once your coding is complete and the Jenkins builds are passing, you can do a Bitbucket "Pull Request".
    • When you do a "Pull Request", pick some reviewers.
    • If the reviewers make comments or change requests, make those changes on your feature branch and update the "Pull Request" in Bitbucket.
    • Mark your JIRA Story/Bug/Task as "Ready for Code Review"
  • Once the review is complete and all the changes have been made as a result of the review and the reviewers have "Approved" the "Pull Request" in Bitbucket, you, the developer, can perform the "Merge" into the develop branch.
    • Mark your JIRA Story/Bug/Task as "Passed Code Review"
  • Merging into the develop branch will trigger a deployment to the QA environment.
  • Go out to the QA environment and do a quick sanity test on your Story/Issue.
  • Mark your Story/Issue as "Ready for QA". Marking a story as "Ready for QA" means that it has been deployed to the QA environment and is ready for a QA engineer to do manual and automated testing against that QA deployment.
  • Make sure to update any documentation (READMEs, Confluence pages) with any new behavior/set-up required resulting from your Story/Issue
  • Once the QA engineer has completed their work on a Story/Bug/Task, they will mark the story as "Ready to Deploy".
  • "Ready to Deploy" means that this story is ready to be deployed to the Production environment. (Currently, we do not have a production environment)