If more organizations are investing in self-service BI programs, how are these new dashboards being tested before they are used in critical decision making? How are organizations validating new SaaS applications before they are deployed to users? What about newly deployed marketing tools? What steps are being taken to ensure KPI changes are as expected and no new issues are introduced?
Might I suggest that we give QA a seat at the table? Perhaps we should call it DevQOps.
- QA should automate regression tests so that agile development and iterative releases can proceed efficiently and reliably
- QA should validate application performance, especially when response time is critical to success and usage growth is expected
- QA should help minimize the work required in user acceptance testing, since business users are often ill-equipped to test applications across multiple flows, data inputs, and boundary conditions
- QA should lead efforts to validate security and perform other code validations
- QA should ensure the user experience is optimized for different devices and browsers
- QA should manage business risks by itemizing them, prioritizing them, and developing action plans to mitigate them
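To make the first bullet concrete, here is a minimal sketch of the kind of automated regression test QA could own for a KPI calculation. The `conversion_rate` function and its expected values are hypothetical, invented for illustration; the point is that boundary conditions like zero traffic get checked on every release, not just when a business user happens to notice.

```python
# Hypothetical KPI helper plus regression checks (all names are illustrative).

def conversion_rate(conversions, visits):
    """Return conversions/visits as a percentage; zero visits yields 0.0."""
    if visits == 0:
        return 0.0  # boundary condition: avoid division by zero
    return round(100.0 * conversions / visits, 2)

def test_typical_input():
    # Happy path a business user would likely try by hand.
    assert conversion_rate(25, 1000) == 2.5

def test_boundary_zero_visits():
    # Edge case users rarely think to test during acceptance testing.
    assert conversion_rate(0, 0) == 0.0

if __name__ == "__main__":
    test_typical_input()
    test_boundary_zero_visits()
    print("all regression checks passed")
```

Wired into a CI pipeline (pytest or similar), checks like these run on every build, which is what makes iterative releases reliable rather than risky.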
Why Give QA a Seat at the Table?
Nothing in this list of responsibilities is new, and QA has been performing some or all of these roles in many application development practices for some time. If you're already investing in DevOps, you can't legitimately claim or achieve continuous delivery without factoring in some of these practices. My issue is that these needs are not given equal billing in a DevOps transformation, and QA seems to be the middle child squeezed out by its more aggressive Dev and Ops siblings.
What is new is that many organizations are building customer-facing applications and proprietary workflows as part of their digital transformation programs. QA is often an afterthought in the budgeting process, and maturing QA runs a distant third behind Dev and Ops practices. The mindset among many business managers is that QA is still "the thing you do after the application is developed," that developers should be writing bug-free code, and that whatever validation is required can be performed by the expected business users. They are more likely to invest in additional developer resources if they believe it will gain them functionality or speed to market, or in systems engineers if it will gain them reliability or help reduce costs. The CIO and IT leaders in these programs are then more likely to invest their efforts in maturing development and operational practices without learning quality assurance tools, practices, and governance.
No QA = Legacy Application?
Whether you're moving to DevOps or investing in digital transformation, I suggest that leaving out or underinvesting in QA is a mistake. Ever hear of anyone going back and easily adding a QA practice to a legacy application? It's hard and often more expensive to do after the fact, so it's less likely to be done.
In fact, I would argue that a successful application without QA is the definition of a legacy application.