
Azure DevOps “Out of the Box” – Getting Started with Customizations

New to Azure DevOps? Here are a few customizations you can make with minimal experience that deliver maximum value. User Stories are an essential part of delivering with agile methodologies, and Azure DevOps provides a basic template for creating a User Story, with fields such as title, description, and acceptance criteria. However, there are a few additional fields the author of user stories can capture to get more out of their agile journey, such as MoSCoW priority, Precedence, and Size Estimate, to name a few.

In addition, there is a Marketplace (i.e., a library) of Azure DevOps Extensions that can enhance your users’ DevOps experience. This post covers the recommended extensions to apply to “Out of the Box” implementations of Azure DevOps.

Azure DevOps “Process” Updates: New Fields

Adding fields to a User Story is very simple, as long as you have access to do so. Upon opening your Azure DevOps (ADO) project, select “Project Settings”, and the “Project details” page should appear. Select the “Process” defined for that project, e.g., “Scrum”. Depending upon which Process type is selected, “Scrum” or “Agile”, you will see “Product Backlog Item” or “User Story”; the two terms are used interchangeably here. Note that only “inherited” processes can be modified, and only by the “Project Collection Administrators” group.

Process Change: Work Item Types

A list of Work Item Types appears. Select “User Story” or “Product Backlog Item”. The layout of the work item will be displayed. Now you can add fields by selecting the “New Field” button.
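If you want to confirm programmatically which work item types your project’s process exposes (e.g., “User Story” under Agile versus “Product Backlog Item” under Scrum), here is a minimal sketch against the Azure DevOps REST API. The organization, project, and personal access token values are placeholders, and the `api-version` is an assumption that may differ on your instance.

```python
# Minimal sketch: list the work item types exposed by a project's process.
# ORG, PROJECT, and PAT are placeholders; the api-version value is assumed.
import requests

ORG = "your-organization"        # hypothetical organization name
PROJECT = "YourProject"          # hypothetical project name
PAT = "your-personal-access-token"

url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workitemtypes?api-version=7.0"
resp = requests.get(url, auth=("", PAT))   # PAT goes in the password slot of basic auth
resp.raise_for_status()

for wit in resp.json()["value"]:
    print(wit["name"])   # e.g. Epic, Feature, User Story / Product Backlog Item, Task, Bug
```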

User Story – MoSCoW for MVP

For a Minimum Viable Product (MVP), where is the line drawn to get the product “out the door”? One answer is a prioritization methodology called MoSCoW, in which the capitalization is important; the letters stand for the following (a query sketch follows the list):

  • “Must Have” – we aren’t going to production without it.
  • “Should Have” – borderline must have, but it could fall off the MVP list if there is pressure to reduce scope to meet timelines, for example.
  • “Could Have” – a story identified but not prioritized in the currently targeted MVP.
  • “Won’t Have” – identified and then forgotten. It will never reach prod.
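Once a custom MoSCoW field exists, you can query it like any other field. Below is a minimal sketch using a WIQL query over the REST API; the field reference name `Custom.MoSCoW` is a hypothetical example (inherited-process fields typically get a `Custom.` prefix, but check yours), and the organization, project, and token values are placeholders.

```python
# Minimal sketch: find all "Must Have" stories via a WIQL query.
# Custom.MoSCoW is a hypothetical field reference name; ORG/PROJECT/PAT are placeholders.
import requests

ORG, PROJECT, PAT = "your-organization", "YourProject", "your-personal-access-token"

wiql = {
    "query": (
        "SELECT [System.Id], [System.Title] "
        "FROM WorkItems "
        "WHERE [System.WorkItemType] IN ('User Story', 'Product Backlog Item') "
        "AND [Custom.MoSCoW] = 'Must Have' "
        "ORDER BY [System.Id]"
    )
}

url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/wiql?api-version=7.0"
resp = requests.post(url, json=wiql, auth=("", PAT))
resp.raise_for_status()

ids = [item["id"] for item in resp.json()["workItems"]]
print(f"{len(ids)} 'Must Have' stories:", ids)
```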

User Story – Precedence (Prioritization)

This field is reminiscent of the original BASIC programming language, which used line numbers 10, 20, 30, etc., to define execution sequence. As in BASIC, assign precedence in increments of 10, so there is room later on to fit in additional work items.
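A trivial sketch of the “number by 10s” idea: assign precedence values in increments of 10 so a new story can later be slotted between two existing ones without renumbering everything. The backlog titles below are made up for illustration.

```python
# Assign precedence in increments of 10, BASIC-style, leaving room to insert later.
backlog = ["Login page", "Password reset", "Audit logging", "Export to CSV"]

precedence = {title: (i + 1) * 10 for i, title in enumerate(backlog)}
print(precedence)   # {'Login page': 10, 'Password reset': 20, ...}

# Later, a new story can slot between 10 and 20 without renumbering the rest.
precedence["MFA enrollment"] = 15
```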

Priority within the Sprint for a given team member

How should someone on the implementation team prioritize their work? This is especially important if the team runs out of time in a sprint and must produce the highest business or technology value first.

Priority within a Sprint for all team members

Collectively, with input from the product owner or team tech lead, this field captures the most important work items to deliver within a sprint.

User Story: Size Estimate (paired with Story Points)

Relative, standardized effort estimations are essential so that everyone on the implementation team is “on the same page” when sizing user stories. Although “Story Points” is an “Out of the Box” field for User Stories, a “Size Estimate” field is not. Relative effort estimations I’ve used before are tee shirt sizes (X-small, small, medium, large, X-large), which can be correlated to Story Points to attempt to quantify the effort in days.
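One way to keep the two fields consistent is a simple correlation table between tee-shirt sizes, Story Points, and rough effort in days. The specific numbers below are illustrative assumptions, not a standard; calibrate them to your own team’s history.

```python
# Illustrative correlation of Size Estimate (tee-shirt) to Story Points and rough days.
# These numbers are assumptions for demonstration; tune them to your team's history.
SIZE_TO_POINTS = {"XS": 1, "S": 3, "M": 5, "L": 8, "XL": 13}
POINTS_TO_DAYS = {1: 1, 3: 3, 5: 5, 8: 7.5, 13: 10}

def estimate_days(size: str) -> float:
    """Translate a tee-shirt Size Estimate into approximate effort in days."""
    return POINTS_TO_DAYS[SIZE_TO_POINTS[size]]

print(estimate_days("M"))   # 5
```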

User Story: Lead Developer

A custom “Lead Developer” field is valuable for quickly identifying who performed the work. The person currently in the “Assigned To” field may not be the developer who implemented the User Story; by the end of the workflow, it is most likely a QA tester or the Product Owner accepting the story.

This could be helpful if you want to track each developer’s progress either by the SUM of Story Points or the COUNT of Stories.
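A small sketch of that rollup: given stories annotated with the custom Lead Developer field and their Story Points, compute the SUM of points and COUNT of stories per developer. The records here are fabricated for illustration; in practice you would pull them from an ADO query or an Analytics view.

```python
# Roll up Story Points (SUM) and Stories (COUNT) per Lead Developer.
# The sample records are made up; real data would come from an ADO query or Analytics view.
from collections import defaultdict

stories = [
    {"lead_developer": "Avery", "story_points": 5},
    {"lead_developer": "Avery", "story_points": 8},
    {"lead_developer": "Jordan", "story_points": 3},
]

points_sum = defaultdict(int)
story_count = defaultdict(int)
for s in stories:
    points_sum[s["lead_developer"]] += s["story_points"]
    story_count[s["lead_developer"]] += 1

for dev in points_sum:
    print(f"{dev}: {points_sum[dev]} points across {story_count[dev]} stories")
```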

Risks to Complement Issues

If you’re tracking “Issues,” an “Out of the Box” Azure DevOps work item, then why not add a custom work item type in the “Process” section called “Risk,” along with any fields you would like to track on that custom Risk object?

Azure DevOps Extensions

Analytics

Created by Microsoft, this extension may or may not already be rolled into the core Azure DevOps product. It’s ideal if you want to externalize in-depth reporting using Microsoft Power BI.

Open in Excel

Created by Microsoft DevLabs, this extension may or may not already be rolled into the core Azure DevOps product.

Azure DevOps Office® Integration 2019

The best tool for importing and exporting work items between Azure DevOps and MS Excel. It can be downloaded here.

Delivery Plans

Created by Microsoft, this extension may or may not already be rolled into the core Azure DevOps product. It’s the closest thing I’ve seen (for free) to a graphic depiction of delivery timeframes in a Gantt-like chart. You can’t print or export it, which is a massive inhibitor to sharing your timelines with stakeholders outside the ADO universe.

Estimate

Created by Microsoft DevLabs, this extension may or may not already be rolled into the core Azure DevOps product. It’s Planning Poker in Azure Boards. I enjoy Planning Poker, but this integration may be more convenient because it can save the Story Point values directly to the User Stories. Also, note that some corporate environments BLOCK “Planning Poker” sites at the firewall due to the words in the URL.

Feature timeline and Epic Roadmap

This Azure DevOps extension by Microsoft DevLabs is a close 2nd to the “Delivery Plans” visualization of deliverables. Again, no export or print capabilities.

Retrospectives

This extension is a “Must Have” for all teams leveraging the Scrum Retrospectives session. This extension, built by Microsoft DevLabs, is highly configurable and is ideal for remote teams unable to perform this activity in person.

Recipe for Optimization: Waterfall, Agile, and Scrum

Many firms try to graduate from Waterfall to Agile without completing the journey. The team may be embedded in an organization whose leadership remains strongly tied to traditional project plans with milestones. How can three schools of thought coalesce into an SDLC where all sides (mostly) buy into the resulting process?

The challenge with integrating new tools and process updates is to make sure there are no gaps in the new, incremental process. The more changes in people, processes, and technology, the greater the need to independently assess the target state SDLC.

Capability Maturity Model (CMM)

The Capability Maturity Model (CMM) is a development model created in 1986 after a study of data collected from organizations that contracted with the U.S. Department of Defense, who funded the research. The term “maturity” relates to the degree of formality and optimization of processes, from ad hoc practices, to formally defined steps, to managed result metrics, to active optimization of the processes.

The model’s aim is to improve existing software development processes, but it can also be applied to other processes.

Capability Maturity Model (CMM) Wikipedia

Tools Help Shape and Reinforce Product Life Cycle

Process Requirements: Epics, Features, and User Stories

From a top-down perspective, a discrete hierarchy of requirement elements helps logically organize the product requirements and much more. An Epic is the highest level of requirements definition: a theme of Features bundled together, e.g., for a major release. Features are the next level of requirements definition and are associated with Epics as children. User Stories are the detailed-level requirements and are usually formulated as a narrative. As with use cases, there are personas or actors that operate on the product/system and drive the implementation of a Feature. Successfully defined user stories have “Acceptance Criteria” against which the QA and/or Product Owner declares the User Story implemented according to spec.

Tools for Managing Requirements Implementation

Many SDLC requirements management products, such as Microsoft Azure DevOps and Atlassian JIRA, allow you to define a product backlog of Features and User Stories to be implemented by an implementation team. In addition, the QA implementation team members can create test coverage, i.e., associating Test Cases with each of the User Stories, to be executed once (or in parallel as) the user story enters some form of “Test Ready” state. Finally, the implementation team may create Tasks as children of a User Story to help granularly track the implementation, such as Database Tasks, UI Tasks, or Interface Tasks.
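To illustrate the hierarchy in tooling terms, here is a minimal sketch that creates a User Story under an existing Feature using the Azure DevOps work item REST API and a JSON Patch body. The organization, project, token, and parent ID are placeholders, and the `api-version` is an assumption; JIRA offers analogous REST endpoints.

```python
# Minimal sketch: create a User Story parented to an existing Feature in Azure DevOps.
# ORG/PROJECT/PAT/PARENT_FEATURE_ID are placeholders; the api-version value is assumed.
import requests

ORG, PROJECT, PAT = "your-organization", "YourProject", "your-personal-access-token"
PARENT_FEATURE_ID = 1234   # hypothetical existing Feature work item id

url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workitems/$User%20Story?api-version=7.0"
patch = [
    {"op": "add", "path": "/fields/System.Title",
     "value": "As a user, I can reset my password"},
    {"op": "add", "path": "/fields/Microsoft.VSTS.Common.AcceptanceCriteria",
     "value": "Reset email arrives within 5 minutes; old password no longer works."},
    # Hierarchy-Reverse links the new story up to its parent Feature.
    {"op": "add", "path": "/relations/-",
     "value": {"rel": "System.LinkTypes.Hierarchy-Reverse",
               "url": f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workItems/{PARENT_FEATURE_ID}"}},
]

resp = requests.post(url, json=patch, auth=("", PAT),
                     headers={"Content-Type": "application/json-patch+json"})
resp.raise_for_status()
print("Created work item", resp.json()["id"])
```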

Agile Manifesto on Documenting Requirements

The Agile Manifesto reinforces the “right” amount of documentation:

Working software over comprehensive documentation

That is, while there is value in the items on the right, we value the items on the left more.

The Agile Manifesto

Classically, in a Waterfall SDLC, we await completed documentation such as the finalized Business Requirements document and technical specifications. Leveraging an Agile approach, a Sprint can incorporate incremental business requirements definition and iterate with evolving documentation. In addition, User Stories dictate the requirement in a practical way, where we can see the Persona travel through the User Story, ultimately meeting the “Acceptance Criteria”.

There’s Nothing like a Good Gantt Chart

Gantt charts provide visual timelines for tasks and milestones, showing dependencies between tasks, and predecessor definitions dynamically push dependent work items. Typically, classic waterfall maps out milestones going well beyond the near term. Agile may look toward the delivery of only one or two sprints ahead, with sprints varying in length from one to six weeks each. In some instances, teams applying SAFe (Scaled Agile Framework) plan a Program Increment, which attempts to look 8+ weeks ahead.

There are several ways to overlay classic Gantt chart visuals over the product backlog delivery timeframes. Depending on the toolset you use, such as Microsoft Azure DevOps or Atlassian JIRA, these visuals may be provided “out of the box”, by leveraging 3rd-party extensions, or even by exporting the product backlog data and reporting on it with a 3rd-party tool such as Microsoft Power BI.

Burndown Delivers Value

Neophytes to Agile may not initially be exposed to Burndown Charts. Scrum masters, akin to project managers, attempt to measure the health of initiatives using Key Performance Indicators (KPIs) and, in the case of Agile and Scrum, leverage sprints, story points, and average sprint velocity.

Burndown Release Chart
  • “Story Points Remaining” – All of the user stories carry “Story Points.” Story points are derived from collective, relative effort estimations: each person on the team sizes each story relative to stories previously estimated, implementation team members use a consistent scale such as the Fibonacci Sequence, everyone reveals their answer at the same time, and a consensus is reached for the story. Story Points Remaining is the aggregate of points for a defined major/minor release.
  • “Items Not Estimated” – stories in the “initiative” product backlog that have not yet been estimated. This number can skew the overall burndown’s estimated completion date/sprint, because these items represent remaining work whose points are currently unknown; i.e., the “Projected Completion” will not be accurate.
  • “Total Scope” – the total number of story points for the “initiative”, regardless of user story completion status. There may be an upward tick of Total Scope over the course of the initiative, as we are agile and can accommodate changes or increases in scope.
  • “Remaining” – the bar chart showing a downward trend in the remaining scope for the initiative. Remaining may also tick upward as “Items Not Estimated” become estimated.
  • “Burndown” – should be a downward trend, and depending on the tool that derives the graph, it may predict the projected completion of the initiative based on several factors, including average velocity per sprint (a projection sketch follows this list).
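A back-of-the-envelope version of that projection: divide the remaining estimated scope by the team’s average velocity to forecast how many sprints remain. The numbers are invented for illustration; a real burndown tool also accounts for unestimated items and scope changes.

```python
# Back-of-the-envelope burndown projection: sprints remaining = remaining points / avg velocity.
# All numbers here are invented for illustration.
import math

completed_per_sprint = [21, 18, 24, 19]   # historical velocity (points per sprint)
remaining_points = 96                     # estimated scope still to deliver
unestimated_items = 7                     # stories with no points yet (skews the forecast)

avg_velocity = sum(completed_per_sprint) / len(completed_per_sprint)
sprints_remaining = math.ceil(remaining_points / avg_velocity)

print(f"Average velocity: {avg_velocity:.1f} points/sprint")
print(f"Projected completion in ~{sprints_remaining} sprints "
      f"(optimistic: {unestimated_items} items are still unestimated)")
```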

Daily Scrum v. Daily Status – Removing Blockers

Daily, Weekly, and Biweekly status update sessions with the implementation team are no match for Daily Scrum sessions, which primarily focus on Blockers. Blockers may be Issues that impede progress for the implementation of User Stories. We all focus on unblocking team members so they can implement stories and we can earn Story Points.

Collective, Relative, Effort Estimations

The classic developer SWAG for effort estimations is “two weeks,” which may have no basis in reality. Performing relative effort estimations allows the team to apply a reproducible methodology: we compare the size of a change relative to other changes we have made to the system. Any scale will do, so long as you apply the method consistently. For example, you can use tee shirt sizes: Extra Small (XS), Small (S), Medium (M), Large (L), or Extra Large (XL).

Some teams use a sequence of numbers, most notably the Fibonacci Sequence: 1, 2, 3, 5, 8, 13, 21, 34, and so on. With many of my teams, we use 1, 3, 5, 8, 13, and 20, a “modified” Fibonacci Sequence suited to 3-week sprints. If user stories are the team’s discrete unit of requirements to implement, each story can carry “Story Points,” and these points are populated using the sequence. Your team can equate the values roughly as follows (a rounding sketch appears after the list):

  • 1 – one day or less; ideal for a small change or spike
  • 3 – three days or less to implement the change
  • 5 – one business week
  • 8 – a week and a half
  • 13 – two weeks
  • 20 – three weeks
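As referenced above, here is a small sketch of snapping a raw day estimate onto the modified Fibonacci scale; the point-to-day thresholds mirror the list above and are, of course, team-specific assumptions.

```python
# Snap a raw effort guess (in days) up to the team's modified Fibonacci Story Point scale.
# The day thresholds mirror the list above; calibrate them to your own sprint length.
POINT_SCALE = [(1, 1), (3, 3), (5, 5), (8, 7.5), (13, 10), (20, 15)]  # (points, max days)

def to_story_points(days: float) -> int:
    for points, max_days in POINT_SCALE:
        if days <= max_days:
            return points
    raise ValueError("Larger than 20 points: split the story before estimating it.")

print(to_story_points(4))   # 5  (between three days and a business week)
print(to_story_points(9))   # 13 (about two weeks)
```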

When deriving “Story Points,” the implementation team must agree that story points are inclusive of system integration testing.

Perception – Stakeholder Point of View

Stakeholders want a holistic view of the project/product health. Actually, that is just some stakeholders; others may simply want to know how many open Bugs currently exist with a severity of one. The Scrum Master can develop dynamic reports and dashboards in Azure DevOps and other tools for whoever wants a peek into the product/project health.

Charts help communicate a message and help shape our point of view. Different project stakeholders have different needs and perspectives. Both Agile principles and Waterfall methodologies have inspired visual mediums that reflect the Key Performance Indicators (KPIs) of a project or product evolution.

Agile, what have you done for me lately?

At the end of each sprint, during the Scrum Sprint Close ceremony, the implementation team members demonstrate and discuss each of their completed user stories. The Product Owner (PO) accepts or reopens each user story based on whether its Acceptance Criteria have been met. Each user story accepted by the Product Owner has Story Points associated with it; all the accepted user stories “earn” story points for the team, and the points accumulated in a sprint are the team’s velocity for that sprint.

There are lots of ways the Sprint Close can go “Pear Shaped”.

  • “Acceptance Criteria” were not as detailed as required; the user story results were not entirely what the Product Owner expected.
  • The implementation team took on too many stories and was not able to start/complete the projected stories for the sprint.
  • By failing to deliver on the Story Points committed at the Sprint “Open/Planning”, the team’s average sprint velocity will likely go down.

As a team, make sure you are prepared for the Sprint Close by performing Product Backlog Refinement days in advance to confirm things like “Acceptance Criteria” verbiage with the implementation team and the Product Owner. Work in Progress (WIP) limits can also help the team focus on their bandwidth and constrain how many user stories the team works on at one time, thus minimizing over-promising to the Product Owner.

Waterfall Gates Persist

  • User Acceptance Testing – The business team(s) insist on validating anything before it goes into the production environment.
  • Approvals from Internal Teams – conformity to organizational architecture standards, for example, must be approved when changes to the target state architecture are proposed.

Questions and Comments Appreciated

Please let me know if I missed any other Agile, Scrum, and Waterfall areas that can cohabitate/coalesce into a cohesive SDLC.

Azure DevOps: In Search of Exceptional Reporting Falls Short of Expectations

Azure DevOps (ADO) reporting, whether “out of the box” or via extensions, lacks the robust project delivery timeline reporting many rely upon for conveying project timelines. Extensions by Microsoft DevLabs such as “Delivery Plans”, “Feature Timelines,” and “Epic Roadmap” all fall short of the mark. Why? The ADO product targets Sprint delivery and primarily focuses on reporting with the most common graphical paradigms for measuring Project / Sprint progress, such as the Sprint Burndown, rather than a Gantt chart.

Still Can’t Print

Sounds like a small ask, but it’s not. None of the aforementioned Azure DevOps Extensions gives the user the ability to print out timelines. I don’t think anyone said, “let’s not let users do that”; printing out “Gantt-like” charts is simply not easy given the special formatting constraints. If you’ve ever tried to print an MS Project Gantt chart, you know the pain of adjusting print parameters to get it just right, e.g., fit to NN page(s).

Still Can’t Share Outside Azure DevOps

This relates to the “Still Can’t Print” issue. In the best-case scenario, users should be able to print their “Gantt-like” charts to a PDF, and then the PDF can be used to externalize and vocalize the timelines, for example, with the “Feature Timelines” extension. Yes, you can send a link to these timeline visualizations; however, any user who clicks the link will need some license set up in ADO to see the page.

Reporting Across Projects and Teams within an Org

The Azure DevOps “Delivery Plans” extension has finally empowered users to report across any and all projects in your ADO organization. In addition, filtering by Team is also available if you have multiple teams working within a project. A portfolio manager could look across their portfolio of projects and see only what is relevant to them.

Markers

The Azure DevOps “Delivery Plans” extension allows the user to add markers/milestones to a project timeline. Product “release indicators” could be added to the timeline.

Delivery Plans Extension

Delivery Plans seems to be the most promising visualization tool, with additional capabilities noted:

  • Styling Rules (Colors, Bold, Italic, Underline)
  • Fields Displayed on Cards (up to 17)
  • Tag Colors

Drawbacks to Implementation

  • Field Criteria doesn’t include AND | OR Logic
  • “Feature Timelines” does a vertical “Group By” on Epic, which seems to be a better “delivery focused” view, showing which features will be delivered within a specific Epic instead of delivery grouped by Team. At the least, we should have the option to do either.

Power BI with Gantt Chart Reporting Against ADO

The closest you will come to Azure DevOps project plan reporting is to utilize the Azure DevOps data source from within Power BI and install the Gantt chart visualization designed to report on ADO.

Power BI and Azure DevOps: Reporting “outside the box” to Stakeholders

Microsoft Azure DevOps (ADO) Reporting

With a single Power BI report, users can report against ALL of their Azure DevOps servers and ADO Projects, and the data stays up to date.

Out of the Box Capabilities

For those who need to pull data out of Microsoft Azure DevOps for reporting purposes, there are challenges when attempting to provide that information outside of Azure DevOps.

Typically, if I want to share project reports with my stakeholders, I would provide them a link to these dynamic dashboards, which focus on what they want to see. Project stakeholders may want to see an upcoming production release “bill of health” view, e.g., Burndown chart, Average Velocity, open critical bugs, etc.

However, what if some of your stakeholders don’t have or want access to Azure DevOps? Well, you could take a screen capture of a dashboard, and email your stakeholders that information or…

Power BI to the Rescue

Using both Power BI Desktop, a free license, and cloud Power BI Pro within the Office 365 suite of products, you can create a suite of reports against the Azure DevOps data, and share those reports on a schedule of your choosing. There are also several Analytics / Views that come with Azure DevOps to get you started.

Step 1: Select the Data Source:

Launch the Power BI Desktop application, available from the Microsoft marketplace. Select “Get Data” after launching the application. A list of data sources is then displayed. Select the “Online Services” data source group, “Azure DevOps (Beta)”, then “Connect”.

Power BI Data Source

The user should then be presented with an Azure DevOps login.

ADO Login

Enter your Azure DevOps instance details for connecting to your site. If you are already logged into Azure DevOps in another browser tab, no additional authentication is required. You should now be presented with a list of Analytics / Views that come with ADO “out of the box”.

ADO Analytics Views in Power BI

Just for demonstration purposes, please select the first item on the list, “Bugs – All History by Month”. A preview of the data should be shown on the right side of the panel. Select the “Load” button, which should be enabled if you’ve followed the steps thus far.

On the right side of the screen, there should be a panel called “Fields”. You can select all or some of the columns/fields within the View that was pulled from ADO. As you select the fields, they should populate on the left side of the screen, “Page 1” of the Power BI report. At this point, you may leverage your Power BI prowess to build graphical visualizations of the data you’ve imported.

Power BI Graphical Reports

Save your Power BI report, and then “Publish to Power BI”. The default destination is “My Workspace”, which should already be defined if you use Power BI Pro in Office 365. Save the report and close the Power BI Desktop app, then open the Power BI cloud app from Office 365.

Open the “My Workspace” folder, and look for the “Dataset” and accompanying Power BI “Report” you just created. Click on the “Dataset” with the same name as your report to open it. Select the “Refresh” menu, then the “Schedule Refresh” menu item. Define your schedule to run BEFORE you push the report via email to your stakeholders.


Go back to your home screen, select “My workspace”, then select the report you’ve created. Once the report appears, select the “Subscribe” menu, then select the menu item “+ Add new Subscription”. Populate the who, what, and when, then select the “Save and Close” button.

Azure DevOps View Creation

That’s it. You could then start to create your own Analytics Views from within Azure DevOps, and then create Power BI reports.

Please note:

“Analytics views are data sets that are exposed to Power BI. You can use views to create reports based on your Azure DevOps data. This feature is in preview.” See the documentation topic “How do I use analytics views?”
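If you would rather pull the same Analytics data programmatically instead of through the Power BI connector, the Analytics service also exposes an OData endpoint. The sketch below is a best-effort assumption: the OData version segment (`v3.0-preview` here) and the entity/field names vary by product version, so verify them against your instance’s Analytics documentation before relying on it.

```python
# Hedged sketch: query the Azure DevOps Analytics OData endpoint for open bug counts.
# The OData version segment and entity/field names are assumptions; verify on your instance.
import requests

ORG, PROJECT, PAT = "your-organization", "YourProject", "your-personal-access-token"

url = (
    f"https://analytics.dev.azure.com/{ORG}/{PROJECT}/_odata/v3.0-preview/WorkItems"
    "?$apply=filter(WorkItemType eq 'Bug' and State ne 'Closed')"
    "/groupby((Priority), aggregate($count as Count))"
)

resp = requests.get(url, auth=("", PAT))
resp.raise_for_status()

for row in resp.json()["value"]:
    print(f"Priority {row['Priority']}: {row['Count']} open bugs")
```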