Monday, June 23, 2014

How to Conduct a Quality Assessment Under a Short Timeframe With Limited Resources

As part of its service offering, Prolifics provides clients (or prospective clients) an assessment of their current quality processes. Recently, Prolifics engineers were asked to go onsite at a client location to conduct such an assessment of a complex, multi-tier application. However, the client would agree to only a very short timeframe for the assessment. Prolifics accepted this challenge, knowing that it would be a demanding project. This brief article describes, at a high level, the approach taken, the constraints we worked under and the end result.

Rapid Assessment Approach
For such an ambitious project, every hour counts. Thus, it is essential to initiate the project with remote outreach to the client, even before going onsite, and to gather as much essential information as possible. This advance work can take the form of reviewing any existing documentation about the client and the systems under test. Reviewing the early information will, in turn, suggest additional items to request, again in advance of going onsite or during the first day or two. For the assessment at this client, we reviewed relevant correspondence and other documentation exchanged between the client and the Prolifics quality practice. There was recent documentation from an initial round of discussions which included both the Head of the Testing Practice and the Director. We also held pre-assessment conference calls with the Sales team, who provided helpful insights into the client organization.

We then asked the client to provide an initial set of test plan documents and test cases, along with high-level information on system architecture. Every day of “front loading” works to the engineer’s advantage. Thus, early in the project we obtained a basic sense of how the client’s QA team conducted testing within their test cycle.

We also made early requests for client network access and for access to the test case and defect tracking repositories. Both are typically “long lead time” items that can further constrain the assessment schedule, so it is important to request them well in advance of going onsite.

The day before our client visit, we developed a preliminary project plan, together with milestones and dates that we wanted to achieve for the assessment.

Selecting a Methodology
Various options are available for assessment methods. It is important to choose a methodology that provides as much quantitative and qualitative measurement as possible in the time allotted. Specifically, the assessment method must meet the following criteria:

  • It must be comprehensive enough to cover testing processes in both breadth and depth.
  • It must be flexible enough to adapt to the constraints of limited time and resources.
  • It must be sufficiently objective to have credibility with the client.

With the Practice Head’s guidance, we used a Test Process Improvement (TPI) methodology to assess this client. The Prolifics version of TPI measures up to twenty (20) dimensions of testing effectiveness. In the brief two weeks we had for this assessment, we had to select which of those twenty dimensions to assess. The selection came down to a combination of the dimensions most important to the testing process and those most readily measurable, as sketched below.
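To make that trade-off concrete, here is a minimal sketch (in Python) of how such a selection could be scored. The dimension names, scores and the priority rule are hypothetical illustrations for this article, not the actual Prolifics TPI model:

    # Hypothetical TPI-style dimension selection: score each candidate
    # dimension on importance and on how readily it can be measured in
    # the time available, then keep the top few.
    dimensions = {
        "Test strategy":     {"importance": 5, "measurability": 4},
        "Defect management": {"importance": 5, "measurability": 5},
        "Test case design":  {"importance": 4, "measurability": 5},
        "Metrics":           {"importance": 3, "measurability": 2},
        "Test environment":  {"importance": 4, "measurability": 2},
        "Static testing":    {"importance": 2, "measurability": 3},
    }

    def priority(scores):
        # Rank by combined importance and measurability.
        return scores["importance"] * scores["measurability"]

    BUDGET = 4  # dimensions the two-week schedule can credibly cover
    selected = sorted(dimensions, key=lambda d: priority(dimensions[d]),
                      reverse=True)[:BUDGET]
    print("Dimensions selected for the assessment:", selected)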

Conducting the Assessment
Knowing that we had just two weeks for the assessment, we put considerable energy into “front loading” our activities during the first week. This involved meeting the client’s key managers and leaders and scheduling a round of one-on-one interviews with those individuals. It is important to prepare these interviews in advance of the actual meetings: plan the interview questions and carefully document the answers. Our interviewees were the Business Owner, QA Manager, QA Team Leads, Development Manager and Leads, Project Manager, Business Analysts and test automation staff.

To carry out our assessment fully, we made follow-up requests for information we had not yet received, and also requested system requirements, specifications and documentation. Several of these items were identified as useful during those interviews and meetings.

Data Gathering and Analysis
Reviewing test artifacts (Test Strategies, Test Plans, Test Cases) is a natural starting point. However, system requirements provide the initial “blueprint” for understanding the application under test. While clients may at first be reluctant to disclose requirements or design documents, it is essential to review a sufficient sample of requirements in order to understand what the application is intended to do. Then, interpret the Test Plans, cases, etc. in light of those requirements; a simple traceability check like the one sketched below makes coverage gaps visible.
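As an illustration, here is a minimal sketch (in Python) of a requirements-to-test-case traceability check. The requirement and test case IDs are hypothetical, and in practice the mapping would be exported from the client’s test management repository rather than typed in by hand:

    # Requirements the application is supposed to satisfy.
    requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004", "REQ-005"}

    # Test case -> requirements it claims to cover, per the test plans.
    test_cases = {
        "TC-101": {"REQ-001", "REQ-002"},
        "TC-102": {"REQ-002"},
        "TC-103": {"REQ-004"},
    }

    covered = set().union(*test_cases.values())
    uncovered = requirements - covered

    print("Coverage: %d of %d requirements" % (len(covered), len(requirements)))
    print("Requirements with no test cases:", sorted(uncovered))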

From there, defect reports provide the essential, interlocking pieces of information. Though much can be gleaned from reviewing defect reports in isolation, a more complete interpretation is possible when requirements, test strategies, plans and test cases are considered first. Defect reports can then be assessed in full context.

With requirements and artifacts in hand, we began preliminary analysis and compilation of the data. Day after day, we added incremental data, performed further analysis and began to identify deficiencies, patterns and gaps in the quality processes (a simple example of this kind of tallying appears below). For areas of significant concern, we continued to gather more data to strengthen and validate our analysis.
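For example, here is a minimal sketch (in Python, with hypothetical defect records and functional area names) of the kind of tallying that surfaces patterns, in this case clustering defects by area and severity:

    from collections import Counter

    # Hypothetical defect records pulled from the tracking repository.
    defects = [
        {"id": "DEF-31", "area": "Billing", "severity": "High"},
        {"id": "DEF-32", "area": "Billing", "severity": "High"},
        {"id": "DEF-33", "area": "Login",   "severity": "Low"},
        {"id": "DEF-34", "area": "Billing", "severity": "Medium"},
        {"id": "DEF-35", "area": "Reports", "severity": "High"},
    ]

    by_area = Counter(d["area"] for d in defects)
    high_sev = Counter(d["area"] for d in defects if d["severity"] == "High")

    # Areas where high-severity defects cluster warrant deeper data gathering.
    for area, total in by_area.most_common():
        print("%s: %d defects, %d high severity" % (area, total, high_sev[area]))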

Frequent Communication
We decided from the outset to mandate frequent communication among the assessment team. This included the engineers onsite, our project manager, the practice director and the sales team. At designated intervals, and on an as-needed basis, we also included the client executive.
Our communication included daily conference calls among the team members listed above, followed by daily meeting minutes emailed to the team and to certain Prolifics executives. There were other communications (follow-up meetings, email, etc.) as needed to keep the assessment moving forward and to deal with impediments. All of this created transparency, delineated our progress and identified potential problems early. Trust among team members grew, which helped this ambitious schedule proceed with less stress.

Interim Progress Briefing
At the end of week one, we delivered an abbreviated status presentation to the client. This briefing confirmed that we were on the correct path for the assessment, clarified which areas needed deeper investigation and which required little additional data gathering, and provided an opportunity to share the preliminary findings our analysis was surfacing.
Although the presentation was intentionally limited in content, it served to firm up both our (Prolifics) and the client’s sense that the assessment was proceeding in the correct direction and making meaningful progress.

It is important that this particular briefing be a formal, scheduled milestone event. This gives the assessment team a midpoint goal to achieve. The client also has the opportunity to receive preliminary results, ask questions and offer suggestions and guidance. This interim briefing surfaced no surprises.  We confirmed that we were on the right track and that progress was being made. Our client indicated they were looking forward to the final report. We took this as a good sign.

The Second Week
After the interim briefing, we pushed to gather additional information. We conducted second-round interviews with client staff where needed. Then we went deeper into the client’s test artifacts and testware systems. Reviewing the way quality information is structured and organized, and comparing that to release requirements, is a very useful aspect of an assessment. Patterns became clearer, as did gaps and deficiencies.

While keeping good notes and other documentation is necessary throughout the assessment, it is vital to begin outlining the assessment report as early as possible. Ideally, start the report outline as the data begin to coalesce into facts, and then as facts emerge into findings.

We then circulated drafts of the assessment report within the quality practice as early as practical. As is human nature, much of the feedback on the draft report arrived late, bumping right up against the deadline. Fortunately, we had already incorporated many early comments and suggestions, so we were able to accommodate significant last-day revisions.

Delivering the Assessment
The act of delivering an assessment report to the client does not simply occur at the conclusion of the project.  It is a cumulative effort that actually begins days before.  An assessment is like a “report card” to the client.  More often than not, it will convey information that is critical of their processes.  Providing negative information must be done with sensitivity.  Working under ambitious deadlines can be stressful to all involved.  Adding negative findings into the mix can lead to unpredictable results.  Therefore, these dynamics must be anticipated and managed.

As described earlier, we built frequent, open communication into the project and approached the assessment in a balanced and fair way with all of those involved. Client stakeholders were hearing about our day-to-day activities from their staff, so it is essential that this informal feedback be positive. Objectivity is also important: we presented positive findings along with critical, or negative, findings. All of this fosters trust.

The engagement culminated in an onscreen presentation for the client. As the assessment report was lengthy, we delivered the presentation as an executive briefing, with a summary up front. The client requested a preview of the detailed section that followed, which we were happy to provide. The client then asked questions, and our answers were crisp and to the point. This laid the groundwork for the client to read the entire report.

Our client recognized the hard work that went into a very short two-week engagement. It was evident that our approach was objective and that the results had integrity.

To learn more about Prolifics' testing practice, visit our website.

Dex Malone is a Delivery Manager at Prolifics specializing in Independent Verification and Validation (IV&V). With over twenty-five years of experience in large, complex IT systems, he has held Quality Engineering leadership roles in software development across various industries, including regulated environments in healthcare, telecommunications, general business, banking and finance, in both the public and private sectors. His interests are in software security and privacy. When not in front of a computer, he can be found with family, in the mountains or at the shore.