Transitioning from Agile, Iterative Project Initiatives to Production Support, DevOps

Moving from dedicated, project-funded efforts that use Agile and Scrum ceremonies, such as Sprint Planning, Backlog Refinement, and Sprint Close demos, to a production support process built on the DevOps (Development and Operations) model requires a deliberate transition path to be successful.

People, Processes, and Technology need to shift along with this change in the Software Development Lifecycle (SDLC) mandated by management.

Commitment v. “Pulling from the Backlog” Mindset

Agile Teams Leveraging Scrum Ceremonies

At the foundation of Agile with the application of Scrum ceremonies is a commitment from the team and the individuals on it to implement User Stories within an agreed cadence, a Sprint (e.g., two weeks). The product owner and the implementation team articulate what is required to implement each story, produce a collective, relative effort estimate in the form of Story Points, and agree to complete the set of user stories within the sprint. For each user story, the “Definition of Done” is clearly articulated in the form of “Acceptance Criteria,” and these criteria are used as a guidepost for software development and quality assurance.

As an Agile, Scrum team, you may view your product backlog differently than you would in a DevOps (Development and Operations) model. Scrum teams focus on the day-to-day work of implementing user stories and Bugs to fulfill their commitments for the current Sprint, and they typically discuss progress on work items and blockers each day during a Daily Scrum or “Daily Standup.”

DevOps – Pull from the Backlog

Unlike Scrum teams, who set up sessions to measure progress in a particular cadence, e.g., two-week sprints with Sprint Planning and Sprint Close sessions, DevOps team members pull from the backlog as their bandwidth becomes available. The [Business] Product Owner and the DevOps team may hold regular or ad hoc Backlog Grooming / Refinement sessions to ensure the user stories are ready for implementation and prioritized appropriately.

The Product Owner and DevOps team periodically hold Backlog Refinement sessions to make sure the prioritized User Stories have all of the elements necessary to implement them. During these sessions, team members perform a relative effort estimation of each user story: how long will it take to implement? Each team member indicates how much effort they feel is needed to implement the product backlog item (a.k.a. User Story). See articles on planning poker, a collective, relative effort estimation process/tool that standardizes how these estimations are performed.
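
For illustration, here is a minimal sketch of a planning-poker round in JavaScript; the estimation scale and the consensus rule below are assumptions, not part of any particular tool.

```javascript
// Minimal sketch of a planning-poker round (illustrative scale and consensus rule).
const scale = [1, 2, 3, 5, 8, 13];

function pokerRound(votes) {
  if (!votes.every((v) => scale.includes(v))) {
    throw new Error("Votes must come from the agreed scale");
  }
  const unique = [...new Set(votes)].sort((a, b) => a - b);
  if (unique.length === 1) {
    return { consensus: true, storyPoints: unique[0] };
  }
  // No consensus yet: the low and high voters explain their reasoning, then the team re-votes.
  return { consensus: false, low: unique[0], high: unique[unique.length - 1] };
}

console.log(pokerRound([3, 3, 3]));  // { consensus: true, storyPoints: 3 }
console.log(pokerRound([2, 5, 13])); // { consensus: false, low: 2, high: 13 }
```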

When one of the DevOps team members has the bandwidth to take on one of the stories, they pull it off the backlog and move it to the Board for implementation. [Kanban] Boards have an agreed workflow to allow DevOps team members to move items through the agreed software development lifecycle (SDLC).

Production Critical Alerts Take Precedence

In this process, there is no commitment or agreement on when the team member will finish their work on the user story, i.e., no “complete by Sprint Close” cadence. Story Points, or a Product Backlog Item size estimate, give the individual and the team an indication of how long the Product Backlog Item (PBI) might take to implement. Unfortunately, the DevOps (Development / Operations) team member’s responsibilities stretch beyond “new” work from the Product Backlog. Operations duties, such as reacting to critical application monitoring alerts from the production environment, may take higher precedence.

Where Am I?!?

DevOps team members may have frequent disruptions in their work from production issues and have their heads spinning, switching back and forth between implementing PBIs and handling ad hoc issues from production applications. The Kanban board is one way to get everyone back on the same page with the changes in progress. At a glance, we can visualize the progress of user stories, bugs, and associated tasks on the Kanban board.

Kanban board

Anatomy of a Great Kanban Board

Moving from a Scrum team to a DevOps team, you may, as an individual, look at the Kanban board from time to time, such as when you have bandwidth available to work on Bugs or Product Owner-prioritized User Stories. The following does not assume your project team transitioned to a DevOps model or that a separate DevOps team took over; it applies either way.

Columns Match your DevOps, SDLC Workflow

Regardless of who is doing the work, it is essential to map out how the work will be done moving forward: the software development lifecycle (SDLC) under DevOps constraints. The DevOps team will establish the states each work item moves through as they apply to that team. For example, there may not be dedicated QA team members, but there would still be a testing step to verify the implementation of a Bug fix or User Story.

For example, on a Scrum team, going from “Dev Complete” to “Testing Complete” may have required a “Release Management” phase, i.e., promoting code from DEV to TEST environments, and there may have been a phase to run a cursory or “smoke test” before reaching “QA Approved.” The DevOps SDLC may no longer require a smoke test due to the team’s composition. Long story short, it’s essential to get your process agreed to and implemented on the DevOps team Kanban board. Each column represents a state, and the idea is to move Product Backlog Items (PBIs) from left to right, terminating at the “Closed” status.
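
As a sketch of the idea, the agreed columns can be treated as an ordered list of states, with “Blocked” handled separately; the column names below are examples only, not a prescribed workflow.

```javascript
// Minimal sketch of a Kanban workflow definition; the column names are examples only
// and should be replaced with the states your DevOps team actually agrees on.
const columns = ["New", "Approved", "In Progress", "Dev Complete", "Testing Complete", "Closed"];

// Enforce left-to-right movement through the agreed SDLC.
// "Blocked" is treated as a parallel state an item can enter or leave at any point.
function canMove(fromState, toState) {
  if (fromState === "Blocked" || toState === "Blocked") return true;
  return columns.indexOf(toState) > columns.indexOf(fromState);
}

console.log(canMove("In Progress", "Dev Complete")); // true
console.log(canMove("Dev Complete", "Approved"));    // false (moving backwards)
```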

Identifying and Removing Blockers

It’s all about keeping the momentum going forward. If we cannot work on a Bug or User Story because we are Blocked for any reason, that is time wasted without progress. As a team, we should always be on the lookout for Blocking Issues that prevent our teammates or us from moving forward. Once identified, we aggressively look for ways to unblock ourselves or our teammates. The Kanban board typically has a “Blocked” status column, so it is very visible to the team once a PBI is marked as Blocked. Of course, the “Blocked” identification and remediation process is not limited to DevOps or Scrum teams.

The HOV Lane for Critical Production Issues

In some cases, changes to production code or configuration need to be handled by the DevOps team. Production issues that require “priority treatment,” e.g., Severity = Critical, may go in a “swimlane” on the Kanban board, which clearly signals that these Product Backlog Items (PBIs) are the team’s top priority (see figure above).

Definition of Done – Acceptance Criteria

As in Scrum ceremonies, the “Definition of Done” should be clearly articulated in the PBIs (i.e., user stories and Bugs). Sometimes the Definition of Done fits well in the “Acceptance Criteria” field of the PBI, i.e., the things that must appear in the code or surface in the UI for the item to be accepted as “Closed” or “Done.”

Work in Progress (WIP) Limits

On some teams, there is a concern about “workflow blockage” at a given state in the SDLC process. For example, there could be 20 PBIs in the “In Progress” state for three DevOps team members. This could be flagged as excessive: the team is trying to do too much simultaneously, and it may contribute to confusion about the current state of any given work item. Some Kanban board tools allow you to apply WIP limits so that no more work items can be added to a given status on the board. The same limit can be enforced on a standard paper Kanban board.
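
Here is a minimal sketch of what a WIP limit check amounts to; the column name and limit value are assumptions a team would set for itself.

```javascript
// Minimal sketch of a WIP limit check before pulling another item into a column.
// The column name and the limit (e.g., 3 items per member x 3 members) are assumptions.
const wipLimits = { "In Progress": 9 };

function hasCapacity(column, board) {
  const count = board.filter((item) => item.state === column).length;
  const limit = wipLimits[column];
  return limit === undefined || count < limit;
}

const board = [
  { id: 101, state: "In Progress" },
  { id: 102, state: "Testing" },
];
console.log(hasCapacity("In Progress", board)); // true while the column is under its limit
```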

Product Documentation

If two separate teams are transitioning the work, documentation may be vital to a successful transition and ongoing product maintenance. Many agile teams are lighter on documentation and trust that the product speaks for itself. In the best case, user stories exist that cover the team producing/updating a functional specification document and a wireframe collection. The more probable situation is that we have a pristine set of Features and associated User Stories, each with a clear description and, most importantly, “Acceptance Criteria” that may be used for developing and validating the functionality of the system. Knowledge transfer documentation can be derived from these User Stories.

Always Room to Improve – Retrospective

Although a Retrospective session is typically attributed to Scrum, you don’t have to be engaged in Scrum activities to perform a retrospective. Depending upon the DevOps team’s composition, it could be a collective, grassroots suggestion, or the DevOps team manager can recommend and facilitate the session. It is better if a team peer fulfills the role of facilitator, and some retrospective tools allow anonymous feedback.

Good luck on your journey, and if you have any questions, please reach out.

“Must Have” Power Automate Enhancements for Solution Maturity

My aspirations of commercializing “externally” facing, manually executed workflows using the MS Flow SDK are at an impasse. Several key changes, listed below, are required to move forward.

In addition to removing my “Blockers” for leveraging the MS Flow SDK, I have a “wish list” below that contains features that would enhance the overall Power Automate solution.

“My Flows” Folder and Tag Hierarchy

I have asked for this feature since Day 1, when MS Flow was released. Organizing your Flows is difficult within the current flat structure: users cannot create folders and organize their content.

Organizing Flows
  • The user should be able to create folders and put Flows in them.
  • Attach “Tags” to each Flow, implement another view of “My Flows” grouped by Tag, and allow a single Flow to appear within multiple Tag views.

Version Control

The user should be able to save workflow iterations, compare versions of a workflow, and revert to a previous version.

Security Model Enhancements – Sharing Workflows

Implement execution permissions without the ability to manage or read any other workflow information. Introduce the concept of a two-tiered security model:

  • Admin user: the current, full-access view that all Power Automate users have today.
  • A new user profile/security group that can only execute specific workflows where permissions to a Flow have been explicitly granted.
  • This would eliminate the need for another Power Automate account used only to execute shared workflows.
  • You should only need one account with a Premium Connector ($) license if the secondary account is used solely for execution.

Microsoft Power Automate SDK to Build Externalized Applications

When I first started looking at Microsoft Flow, the previous name of Microsoft Power Automate, I recognized its high value and potential uses within my organization: almost innumerable Connectors to 3rd-party applications, with “sensing” / triggers for many of the participating applications. Huge potential, and it won’t “break the bank” at 15 USD per user/month, which includes all of the “Premium Connectors.”

I started automating processes for both personal and business use. I upped my social media game, for example, sending myself a mobile notification when there was a potentially interesting tweet, or emailing myself a news article when an RSS feed contained keywords I was monitoring. My client had needs around Microsoft Azure DevOps (ADO) that it could not meet “out of the box,” so I took on those automated workflows with ease.

Venture using New Business Model with msflowsdk

Then I thought it would be great to commercialize some of these workflows. However, I came to realize several technical limitations. First, to execute one of these workflows manually, you have to execute it from within the MS Power Automate web or mobile application, logged in with your Power Automate credentials. As a Power Automate user, you can “Share” flows with other Power Automate users, but that would require your web app customers to have Power Automate accounts, paying as much as 15 USD per month. We would have to think in terms of a generic “Production” application user, potentially shared with all external, commercial users.

Providing Custom Interface using HTML and JavaScript

Then I realized there was one way to present Power Automate Flows without the Power Automate Web UI or the Power Automate iPhone app: use the MS Flow SDK to build HTML and JavaScript Web applications. Unfortunately, you still have to log in with a Power Automate user that has access to the flow, but the User Interface is highly customizable using the MS Flow SDK.

Using “Generic” Test User for MS Power Automate

  • Limit the access of the user who does have access to your Microsoft Power Automate workflows. Granularize the permissions as much as possible, such as executing the XYZ Power Automate Flow without permission to read/see ALL of the Power Automate workflows. At the moment, that doesn’t seem possible (TBD); I need to recheck the Azure Portal and the client app registration.
  • Is my approach to commercializing MS Power Automate apps even supported from a Microsoft business perspective? I don’t know yet. I read the article Types of Power Automate licenses and need to reread it.
  • I need the ability to grant “Execute” access to specific MS Power Automate workflows for users without the ability to create or read any workflows of their own; perhaps a limited Power Automate user license?

Create an Azure AD User For Each Customer

  • A technically seamless implementation would be required to add Azure AD users who have paid a commercial fee for the Web app powered by Power Automate, or I could embed advertisements into the Azure / Power Automate custom Web App.

The Experiment – SMS Delay

Wouldn’t it be fun to send a text message with a delay: enter a text message and parameterize the delay in N minutes? How fast could I write the app across multiple platforms, desktop and mobile? The backend and mid-tier would probably be the longest aspect of developing this app; the front end just needs to be a responsive Web App that resizes to fit the platform. But the back-end tiers of the stack, how fast could I develop those? Less than an hour using Power Automate.

Power Automate Workflow

This Power Automate workflow has three steps: the “Manual Trigger,” the “Delay,” and the Twilio action “Send Text Message (SMS),” which happens not to be a “Premium” connector.

Power Automate: SMS-Delay

Front End Code to Integrate

With a combination of HTML, JavaScript, and the MS Flow SDK, I was able to put the SMS-Delay app together rather swiftly, covering everything from Azure authentication into my app through to execution.
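
For reference, here is a minimal sketch of the kind of embedding code involved. It is based on the msflowsdk-1.1.js widget approach, so treat the widget type, settings, and token handling as assumptions to verify against Microsoft’s documentation; token acquisition (e.g., via MSAL) is not shown.

```html
<!-- Minimal embedding sketch (assumed widget API; verify against current Microsoft docs). -->
<div id="flowWidget"></div>
<script src="https://flow.microsoft.com/Content/msflowsdk-1.1.js"></script>
<script>
  var sdk = new MsFlowSdk({
    hostName: "https://flow.microsoft.com",
    locale: "en-US"
  });

  // Render the flows widget into the placeholder div above.
  var widget = sdk.renderWidget("flows", {
    container: "flowWidget"
  });

  // The widget asks the host page for an Azure AD access token. The token must belong to a
  // user licensed for Power Automate with access to the flow; this is the limitation above.
  widget.listen("GET_ACCESS_TOKEN", function (requestParam, done) {
    done(null, { token: myAzureAdAccessToken }); // token acquisition (e.g., MSAL) not shown
  });
</script>
```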

SMS Delay UI

Give SMS-Delay a Try

Would you like to try out this Power Automate manual workflow? Provide ANY login you would like to use for Azure AD authentication; the user must have at least a Microsoft Power Automate FREE license. Once you provide the user name to me, I will update Azure AD to grant you permissions to the app and then send you a note so you can give it a try: SMS Delay Application (rosemansolutions.com)

To Be Continued

For the next steps, I’d like to…

  • Publish the project HTML and JavaScript code I used to create this app
  • Solve the riddle of the Power Automate authentication
  • Create this and many other applications using the MS Flow SDK

Anonymous Authentication or Limited Authentication

Limit the authentication using very granular Power Automate controls, which may or may not yet be implemented: a limited Power Automate user granted permission ONLY to execute a specific workflow.

Is it possible to execute a Power Automate workflow with anonymous credentials and not necessarily have a Power Automate user account?

Digital Download: Content and License Transfer – Business Model In Jeopardy!

GameStop reminds me of Redbox and Netflix facing business model decimation as we transitioned from DVDs and Blu-ray to streaming digital content. No more physical medium to borrow or rent, just streaming data from massive content libraries. Netflix pivoted early, survived, and is now thriving on a revised business model.

GameStop’s pre-owned buy-and-sell business model is in jeopardy and has been for some time now. All of the major game consoles let users purchase games via digital download, and there is no way to transfer that digital content and license to anyone else. If there were a way to transfer the digital content and associated license for a game, GameStop’s pre-owned business model might thrive again.

Securely Transfer Digital Content and License

There are several possibilities for implementing this transfer. One option could leverage a large-capacity SD card: software on the console pushes the digitally downloaded game onto the SD card along with the correlated license. The opposite should also be true: pop in an SD card with a loaded game and license, and that content could be transferred to any console from the same manufacturer.

Is this a job for Blockchain?

Leveraging several design features of Blockchain should put software game designers at ease. A Blockchain architecture would guarantee the uniqueness of a digital license and ownership of the associated digital game. The content could be stored in the cloud, similar to NFT art and video content. This should NOT be confused with using ETH to purchase NFTs with cryptocurrency; in this scenario, we would simply exchange the SD card medium for a Blockchain architecture.

So, why is this not implemented already?

The game console manufacturers don’t profit from trading and selling pre-owned games, so there is no push for this. GameStop should be leading the charge on this endeavor, offering to implement this module in all major gaming systems or outsourcing its implementation. Worst case, form an ad hoc committee to derive standards for implementing the module. The game console manufacturer market is a monopoly or, at a minimum, an oligopoly. Could anti-trust legislation be applied here to Microsoft Xbox, Sony PlayStation, and Nintendo Switch?

Online, Gaming as a Service (GaaS) – Not Applicable

To state the obvious, online gaming or Gaming as a Service (GaaS) business models, which charge monthly or annual fees to access a game service, do not apply to the one-time purchase of a game where the customer owns “the game.”

Azure DevOps “Out of the Box” – Getting Started with Customizations

New to Azure DevOps? Here are a few customizations you can make with minimal experience that deliver maximum value. User Stories are an essential part of delivering with agile methodologies, and Azure DevOps provides a basic template for creating a User Story: title, description, and acceptance criteria. However, there are a few additional fields the author of user stories can capture to maximize the agile journey, such as MoSCoW priority, Precedence, and Size Estimate, to name a few.

In addition, there is a Marketplace (i.e., library) of Azure DevOps Extensions that can enhance your users’ DevOps experience. This post covers recommended extensions to apply to “Out of the Box” implementations of Azure DevOps.

Azure DevOps “Process” Updates: New Fields

Adding fields to a User Story is very simple, as long as you have access to do so. Upon opening your Azure DevOps (ADO) project, select “Project Settings,” and the “Project details” page should appear. Select the “Process” defined for that project, e.g., “Scrum.” Depending upon which Process type is selected, “Scrum” or “Agile,” you will see “Product Backlog Item” or “User Story”; both are used interchangeably here. Note that only “inherited” processes can be modified, and only by the “Project Collection Administrators” group.

Process Change: Work Item Types

A list of Work Item Types appears. Select “User Story” or “Product Backlog Item,” and the layout of the work item will be displayed. Now you can add fields by selecting the “New Field” button.

User Story – MoSCoW for MVP

For a Minimum Viable Product (MVP), where is the line drawn to get the product “out the door”? Here is a methodology called MoSCoW, largely self-explanatory, in which the capitalization is important and the letters stand for:

  • “Must Have” – we aren’t going to production without it.
  • “Should Have” – borderline must-have, but it could fall off the MVP list if there is pressure to reduce scope to meet timelines, for example.
  • “Could Have” – a story identified but not prioritized in the currently targeted MVP.
  • “Won’t Have” – identified and then forgotten. It will never reach prod.

User Story – Precedence (Prioritization)

This is reminiscent of the original BASIC programming language, which used line numbers of 10, 20, 30, etc., to define execution sequence. As in BASIC, implement precedence in increments of 10 so there is room later on to fit in additional work items.

Priority within the Sprint for a given team member

How should someone on the implementation team prioritize their work? This is especially important if the team runs out of time in a sprint and can only deliver the highest business or technology value first.

Priority within a Sprint for all team members

Collectively, with input from the product owner or team tech lead, these are the most important work items to deliver within a sprint.

User Story: Size Estimate (paired with Story Points)

Relative, standardized effort estimates are essential so that everyone on the implementation team is “on the same page” when sizing user stories. Although “Story Points” is an “Out of the Box” field for User Stories, a “Size Estimate” field is not. Relative effort estimates I’ve used before are tee-shirt sizes (X-Small, Small, Medium, Large, X-Large), which can be correlated to Story Points to attempt to quantify the effort in days.
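
As an illustration, the correlation can be captured in a simple lookup; the numbers below are assumptions each team would calibrate for itself.

```javascript
// Illustrative Size Estimate -> Story Points / days mapping (numbers are assumptions).
const sizeToEstimate = {
  "X-Small": { storyPoints: 1, days: 0.5 },
  "Small":   { storyPoints: 2, days: 1 },
  "Medium":  { storyPoints: 3, days: 2 },
  "Large":   { storyPoints: 5, days: 4 },
  "X-Large": { storyPoints: 8, days: 7 },
};

console.log(sizeToEstimate["Medium"]); // { storyPoints: 3, days: 2 }
```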

User Story: Lead Developer

A custom “Lead Developer” field is valuable for quickly identifying who performed the work. The current “Assigned To” person may not be the developer who implemented the User Story; by the end of the workflow, it is most likely a QA tester or the Product Owner accepting stories.

This could be helpful if you want to track each developer’s progress either by the SUM of Story Points or the COUNT of Stories.
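
If you want to roll that up programmatically, here is a rough sketch against the Azure DevOps REST API. “Custom.LeadDeveloper” is a placeholder reference name for the custom field (check the actual name under the process fields), and the Story Points field name differs between the Agile and Scrum processes.

```javascript
// Rough sketch: COUNT of stories and SUM of Story Points per Lead Developer via the ADO REST API.
// Assumes Node 18+ (global fetch) and a PAT in ADO_PAT; "Custom.LeadDeveloper" is a placeholder
// reference name, and Story Points is the Agile-process field (the Scrum process uses Effort).
const org = "myorg", project = "myproject";
const auth = "Basic " + Buffer.from(":" + process.env.ADO_PAT).toString("base64");
const base = `https://dev.azure.com/${org}/${project}/_apis/wit`;

async function storyPointsByLeadDeveloper() {
  // 1. WIQL query for the closed User Stories we want to report on.
  const wiql = await fetch(`${base}/wiql?api-version=7.0`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: auth },
    body: JSON.stringify({
      query: "SELECT [System.Id] FROM WorkItems WHERE [System.WorkItemType] = 'User Story' AND [System.State] = 'Closed'",
    }),
  }).then((r) => r.json());

  // 2. Fetch just the fields we need (up to 200 ids per request).
  const ids = wiql.workItems.slice(0, 200).map((wi) => wi.id).join(",");
  const items = await fetch(
    `${base}/workitems?ids=${ids}&fields=Custom.LeadDeveloper,Microsoft.VSTS.Scheduling.StoryPoints&api-version=7.0`,
    { headers: { Authorization: auth } }
  ).then((r) => r.json());

  // 3. Group locally. Identity fields can return an object, so fall back to displayName.
  const totals = {};
  for (const wi of items.value) {
    const lead = wi.fields["Custom.LeadDeveloper"];
    const name = (lead && lead.displayName) || lead || "(unassigned)";
    const points = wi.fields["Microsoft.VSTS.Scheduling.StoryPoints"] || 0;
    totals[name] = totals[name] || { count: 0, storyPoints: 0 };
    totals[name].count += 1;
    totals[name].storyPoints += points;
  }
  return totals;
}

storyPointsByLeadDeveloper().then(console.log);
```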

Risks to Complement Issues

If you’re tracking “Issues,” an “Out of the Box” Azure DevOps work item, then why not add a custom object in the “Process” section called “Risk,” along with any fields you would like to track on that custom Risk object?

Azure DevOps Extensions

Analytics

Created by Microsoft, this extension may or may not already be rolled into the core Azure DevOps product. It’s ideal if you want to externalize in-depth reporting using Microsoft Power BI.

Open in Excel

Created by Microsoft DevLabs, this extension may or may not already be rolled into the core Azure DevOps product.

Azure DevOps Office® Integration 2019

The best tool for importing and exporting work items between Azure DevOps and MS Excel. It can be downloaded here.

Delivery Plans

Created by Microsoft, this extension may or may not already be rolled into the core Azure DevOps product. It’s the closest thing I’ve seen (for free) to a graphic depiction of delivery timeframes in a Gantt-like chart. You can’t print or export it, which is a massive inhibitor to sharing your timelines with stakeholders outside the ADO universe.

Estimate

Created by Microsoft DevLabs, this extension may or may not already be rolled into the core Azure DevOps product. It’s Planning Poker in Azure Boards. I enjoy Planning Poker, but this integration may be more convenient because it can save the Story Point values directly to the User Stories. Also, note some corporate environments BLOCK “Planning Poker” on the firewall due to the words in the URL.

Feature timeline and Epic Roadmap

This Azure DevOps extension by Microsoft DevLabs is a close 2nd to the “Delivery Plans” visualization of deliverables. Again, no export or print capabilities.

Retrospectives

This extension is a “Must Have” for all teams leveraging the Scrum Retrospectives session. This extension, built by Microsoft DevLabs, is highly configurable and is ideal for remote teams unable to perform this activity in person.

Who’s at the Front Door…Again?

Busy Time of Year, Happy Holidays

The holiday season brings lots of people to your front door. If you have a front door camera, you may be getting many alerts from your front door that let you know there is motion at the door. It would be great if the front doorbell cameras could take the next step and incorporate #AI facial/image recognition and notify you through #iOS notifications WHO is at the front door and, in some cases, which “uniformed” person is at the door, e.g. FedEx/UPS delivery person.

Ring iOS Notification

This facial recognition technology is already baked into Microsoft #OneDrive Photos and Apple #iCloud Photos. It wouldn’t be a huge leap to apply facial and object recognition to catalog the people who come to your front door as well as image recognition for uniforms that they are wearing, e.g., UPS delivery person.

iCloud/OneDrive Photos identify faces in your images and group them by likeness so the owner of the photo gallery can identify a group of faces as, for example, Grandma. It may take one extra step for the camera owner to log in to the image/video storage service and classify a group of videos, converted to stills, containing Grandma’s face. Facebook (Meta) can also tag the faces within pictures you upload and share, and the Facebook app can “guess” faces based on previously uploaded images.

There would be no need to launch the Ring app to see who’s at the front door. Facial recognition can remove the step required to find out what caused the motion at the front door and simply post the iOS notification with the “who’s there.”

One less step than launching the Ring app to see who is at the front door.

Power Automate Goes Beyond “Out of the Box” Azure DevOps Automation Workflow

Microsoft Azure DevOps (ADO) provides “out of the box” process workflow features for building automation rules. One caveat found thus far: when the criteria for a rule are met on an ADO work item update, the rule cannot update the work item’s Tags (e.g., append tags). Bizarre but true.

In this case, the user must leverage Power Automate to update the ADO work item to append Tags to the work item(s) that meet the criteria.

Thanks, Omer, for pointing out this shortfall so we could plug the hole with Microsoft Power Automate. Note: other fields can be updated when the rule is executed, just not the Tags field; special logic is required to update Tags, i.e., Replace, Append, or Remove.

Again, Microsoft Power Automate to the Rescue.
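
For reference, here is a rough sketch of the underlying Azure DevOps REST call that performs the append. In Power Automate, the same update is typically made through the Azure DevOps connector or an HTTP action; the organization, project, work item id, and PAT handling below are placeholders.

```javascript
// Rough sketch of the Azure DevOps REST call that appends a tag to a work item.
// org, project, work item id, and the PAT are placeholders. Assumes Node 18+ (global fetch).
const org = "myorg", project = "myproject", workItemId = 1234;
const auth = "Basic " + Buffer.from(":" + process.env.ADO_PAT).toString("base64");
const url = `https://dev.azure.com/${org}/${project}/_apis/wit/workitems/${workItemId}?api-version=7.0`;

async function appendTag(newTag) {
  // System.Tags is a single semicolon-delimited string, so read the current value first
  // and merge; otherwise a plain update would Replace rather than Append.
  const current = await fetch(url, { headers: { Authorization: auth } }).then((r) => r.json());
  const existing = current.fields["System.Tags"] || "";
  const merged = existing ? `${existing}; ${newTag}` : newTag;

  await fetch(url, {
    method: "PATCH",
    headers: { "Content-Type": "application/json-patch+json", Authorization: auth },
    body: JSON.stringify([{ op: "add", path: "/fields/System.Tags", value: merged }]),
  });
}

appendTag("Power-Automate-Updated");
```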

Power Automate, AI Builder Extract Info from Docs, and Redact

Out of the Box, AI Builder Extraction

I’m looking at some interesting new functionality from the Microsoft Power Platform, specifically Power Automate. You can ingest documents, have AI Builder parse them for predetermined fields (like the fields in an invoice), and then insert that invoice data into a database.

Automatic Redaction of Information

Taking this a step further, I propose leveraging MSFT Power Automate to trawl your Data Lake flat files and, with AI Builder, redact sensitive information patterns such as SSNs. As data objects are posted to the Azure Data Lake, Power Automate processes the files, updates a copy, and moves the original to a secure repository. This methodology could be used to protect Personally Identifiable Information (PII) and meet other security compliance regulatory mandates.
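
To make the redaction step concrete, here is a small illustration of SSN-pattern redaction written outside Power Automate; in the proposed flow, AI Builder and Power Automate would be the pieces identifying and rewriting these patterns.

```javascript
// Illustration of the redaction step only, outside Power Automate / AI Builder: replace
// SSN-shaped patterns in a flat file's text before the sanitized copy is written back.
const ssnPattern = /\b\d{3}-\d{2}-\d{4}\b/g;

function redactSsns(text) {
  return text.replace(ssnPattern, "***-**-****");
}

console.log(redactSsns("Employee 4471, SSN 123-45-6789, start date 2021-06-01"));
// "Employee 4471, SSN ***-**-****, start date 2021-06-01"
```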

For more information, see Power Platform, AI Builder, Overview of the document processing model.