Fresh Links Sundae – April 29, 2012 Edition

Fresh Links Sundae encapsulates some pieces of information I have come across during the past week. They may or may not be ITSM related. Often they are from people whose work I admire, and I hope you will find something of value.

Managing the Business of IT Needs More Than Just Good Project Management Robert Stroud discussed the three key elements of the “Business of IT” (Portfolio Analysis, Financial Transparency, and Performance Management) and why it is critical to execute them well. (CA on Service Management)

End users: should we put them in padded cells? David Johnson discussed the term “end user” and why people-oriented considerations are important in any infrastructure design decision. (Computerworld UK)

Do you have a people strategy? Seth Godin argued that strategies for communication media such as email, web, and mobile do not address the most important strategy of all. (Seth’s Blog)

Help Desk 101 – 10 Things to Consider for your EMAIL ONLY Support Team Joshua Simon gave ten solid suggestions on running an email-only support operation. (ITSM Lens)

What is Service Management? Rob England gave a detailed run-down of the service management concepts using a railway example. (The ITSM Review)

ITSM Customer Relationships: Mad Customer Disease Julie Montgomery talked about ways to help customers get things done effectively, efficiently, economically, and equitably so they get value for money. (Plexent)

SDITS 12 – A New Beginning? James Finister shared his recent experience at SDITS 12. (Core ITSM)

The cult of innovation Rob England discussed why innovation for its own sake is counter-productive and why instead we need to concentrate on the efficiency and effectiveness of what we do for the organization. (The IT Skeptic)

You Don’t Need This “Recovery” Umair Haque discussed why we might be in what he terms a eudaimonic depression and suggested what to do about it. (Harvard Business Review)

Overcome the Addiction to Winning Marshall Goldsmith discussed the importance of not winning at everything, including the meaningless or trivial stuff. (Marshall Goldsmith)

COBIT 5 and What You Can Leverage for ITSM Work

ISACA recently released COBIT 5, a governance and management framework that can help organizations create optimal value from IT. If you are familiar with COBIT, hopefully you have already downloaded the framework documents. If you are not familiar with COBIT or ISACA, follow this link to get more information on the framework. In this post, I will outline some of the useful information you can leverage from COBIT to help you in your ITSM journey, based on my early perusal of the framework.

Good Practices

For a number of processes we use in ITSM, there is a corresponding one in COBIT. For example, DSS02 in COBIT, “Manage Service Requests and Incidents,” maps approximately to the Incident Management and Service Request Management processes in ITIL. Within DSS02, COBIT breaks the process down further into seven management practices, and each management practice has a number of associated activities. If you want to implement or improve an ITIL Incident Management process for your organization and wonder what is considered good practice, these management practice activities can provide some valuable insights for your effort. Tailor those activities into exactly what you would do in your organization, and you have a list of good practices for your shop.
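To make that tailoring step concrete, here is a minimal sketch of how the tailored activities could be tracked as a simple checklist. The practice and activity names below are hypothetical placeholders, not text quoted from COBIT 5.

```python
# A hedged sketch: map (hypothetical) management-practice names to the
# tailored activities your shop actually performs. The names are placeholders,
# not quotes from the COBIT 5 text.
good_practices = {
    "Classify and prioritize incidents": [
        "Use the four-level priority matrix agreed with the service desk",
        "Record the classification in the ticketing tool at first contact",
    ],
    "Investigate and diagnose incidents": [
        "Search the known error database before escalating to level 2",
    ],
}

for practice, local_activities in good_practices.items():
    print(practice)
    for activity in local_activities:
        print("  -", activity)
```

A plain spreadsheet works just as well; the point is simply to keep the generic practice and your shop-specific activity side by side so the list can be reused for later assessments.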

Metrics

For each process, COBIT 5 outlines various IT-related and process goals that the process contributes directly toward. Next to each goal, COBIT lists recommended metrics for measuring those goals. Of course, depending on your organization and the availability of certain service management data, you will have to fine-tune those metrics for your environment. Still, COBIT offers an excellent starting point for defining the list of metrics you plan to capture.

RACI Chart

For each process, COBIT 5 has a RACI chart that shows who is responsible and/or accountable for certain key management practices within the process. Granted, the RACI chart can be high-level and somewhat generic. It nevertheless offers a good starting point for anyone who is working on a process design exercise or just wants to better define the roles and responsibilities within their environment.
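As a rough illustration of how a tailored RACI chart could be captured and queried once you have adapted it to your own roles, here is a minimal sketch. The roles, the practice name, and the assignments are hypothetical examples, not taken from the COBIT 5 chart.

```python
# A minimal sketch of a RACI chart as a lookup table. Roles, practice names,
# and assignments are hypothetical examples for illustration only.
raci_chart = {
    "Log and classify incidents": {
        "Service Desk Manager": "A",            # Accountable
        "Service Desk Analyst": "R",            # Responsible
        "Incident Manager": "C",                # Consulted
        "Business Relationship Manager": "I",   # Informed
    },
}

def roles_with(raci_code, practice):
    """Return the roles assigned a given RACI code for a practice."""
    return [role for role, code in raci_chart[practice].items() if code == raci_code]

print(roles_with("R", "Log and classify incidents"))  # ['Service Desk Analyst']
```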

In summary, I must say I like what I have seen from COBIT 5 so far because the framework offers a great deal of good information to use for your ITSM work. I definitely recommend downloading and checking out the new framework further. On Tuesday, April 17, 2012, Debbie Lew of Ernst & Young and Robert Stroud of CA hosted an education session on COBIT 5 during the ISACA Los Angeles Chapter’s annual spring conference. Normally the presentation deck is available only to attendees of the conference. Ms. Lew has graciously given me permission to make the presentation deck available via this blog. Check out their deck for more information on COBIT 5, and feel free to post questions and comments.

DIY Process Assessment Wrap-up – Constructing the Report and Presenting the Results

This is the concluding post in the DIY Process Assessment series. In the previous posts, we went through lining up the approaches and resources, planning various aspects of the assessment, running the assessment and collecting the data, and eventually making sense of the data collected. The last major steps are to write up the report and present the results to the stakeholders.

Writing up the Report

The final report should summarize the assessment effort, provide solid findings on the current maturity level, and suggest both near-term and long-term actions for improvement. Generally, the assessment report will contain the following elements:

  • Executive Summary
    • Short summary of project background and problem definition
    • Brief description of the assessment methodology used
    • Summary of maturity scores for each process assessed
    • Discussion on the integration between processes and other comparative benchmark information
  • Project Scope – mention the processes and organization units covered under the assessment
  • Overall conclusion, recommendations, and next steps
    • Did the conclusions appear to be logically drawn based on the data gathered?
    • Did the results confirm the perceived problem?
    • Are the recommendations aligned logically with the conclusions?
    • A roadmap showing the sequence of actions and dependencies between actions
  • Analysis of the Processes (for each process)
    • Scores or maturity levels by process
    • Process goals, intended outcomes, and perceived importance
    • Process-specific conclusions and recommendations
  • Organizational Considerations
    • Any noteworthy factors encountered during the assessment that could provide more insight or context on the conclusions
    • Any other organization-related factors that should be taken into account when implementing the recommendations or actions

Presenting the Results

When presenting the results, keep the following suggestions in mind.

  • Depending on your organization, you may use different types of meetings or communication vehicles to present the results. At a minimum, I feel the project sponsor should host at least one presentation with all assessment participants and the senior leadership team.
  • Hold additional meetings with the process owners to discuss the results and to identify quick-wins or other improvement opportunities.
  • Anticipate questions and how to address them, especially the ones that could be considered emotional or sensitive due to organization politics or other considerations.

It took seven posts in total to cover this process assessment topic, and I feel we have only covered the subject at a somewhat rudimentary level. There are more areas to drill into in depth, but everything we have covered so far makes a very good starting point. As you can see from the steps involved, the assessment is not a trivial effort. Before you go off and start planning the next assessment, some people might ask one important question: “Why bother?” I can think of a few good reasons for taking the time to plan and to do the assessment.

  1. Most organizations do not have their processes at the minimally effective level needed to support their business or operations. They want to fix or improve those processes, and a process assessment effort can help identify where things might be broken and need attention. The problem definition is a key area to spend some effort on.
  2. Many organizations undertake process improvement projects and need some way to measure progress. Process assessment helps not only to establish the initial benchmark but also to provide subsequent benchmarks that can be used to calculate the progress. A lot of us do measurements by gut feel. Intuition and gut feel can sometimes be right about these things, but having a more concrete measurement is much better.
  3. Along the same line of reasoning, I cannot think of a better way to show evidence of process improvement or ROI to your management or project sponsor than with a formal assessment. Many people run process improvement initiatives as grass-roots or informal efforts with internal funding due to organizational realities. At some point, you may find yourself needing to go to management and ask for a real budget for time, people, and tools. Having a structured approach to show the potential contributions or ROI down the road can only help your cause.

In conclusion, process assessment can be an effective way to understand where your process pain points are, how to address those pain points, and how far your organization has come in terms of improvement. All meaningful measurements usually take two or more data points to calculate the delta. Conducting process assessments periodically can provide the data points you need to measure your own effectiveness and to justify further improvement work.

Links to other posts in the series

Fresh Links Sundae – April 22, 2012 Edition

Fresh Links Sundae encapsulates some pieces of information I have come across during the past week. They may or may not be ITSM related. Often they are from people whose work I admire, and I hope you will find something of value.

Why a “rules based” approach to Change Management always fails Glen Taylor discussed why rule-based change management practices have limited effectiveness and why a risk-based approach is the better target. (ITSM Portal)

COBIT 5 Miscellany Geoff Harmer gave his initial impression of COBIT 5 and how it differs from the previous version of the framework. (ITSM Portal)

IT Metrics Planning: The Business Meeting Julie Montgomery suggested ways for IT and business to work together and come up with metrics that can help both organizations. (Plexent Blog)

At the Helm of the Data Refinery: Key Considerations for CIOs Perry Rotella discussed how the “data refinery” is the new strategic operating model for companies and why the CIO is the executive best positioned to lead the enterprise forward in this new model. (Forbes)

5 Ways to Access the Power of the Hive for ITIL Initiatives Jeff Wayman discussed ways to leverage a diverse group of people for the benefit of ITSM initiatives. (ITSM Lens)

7 Benefits of Using a Known Error Database Simon Morris gave an in-depth discussion of KEDB and suggested ways to extract value and benefits from it. (The ITSM Review)

The ABC of ITSM: Why Building The Right Process Matters Ben Cody discussed the human aspect of ITSM and why a positive dedication to “process” should be at the heart of how organizations solve complex IT services challenges. (The ITSM Review)

How to Make Your Company More Like Apple Daniel Burrus talked about how companies, large or small, can build their future by competing on things other than price. (Strategic Insights Blog)

An Asshole Infested Workplace — And How One Guy Survived It Surviving a toxic work environment is not a trivial undertaking: you do what you can and have to do without spreading the toxic atmosphere further. (Bob Sutton)

How to fix IBM in a week Robert Cringely wrote a long series of blog entries discussing what is going on within IBM, what is wrong, and how to fix it, maybe. (I, Cringely)

ISACA Los Angeles Chapter Spring Conference, Week of April 16, 2012

Apologies to the readers of the blog.

There will be no posting this week due to my volunteer work with the ISACA Los Angeles Chapter Spring Conference Committee. ISACA LA is celebrating the 40th anniversary of its annual 3-day conference. This education event covers fundamental information systems auditing concepts and emerging technology risks. The conference also provides rich opportunities for its attendees to network with other governance, assurance, and security professionals. The Spring Conference has turned into the leading IT governance, security, and assurance event for the Southern California area. The 2012 conference attracted over 300 participants.

I have been working with the Spring Conference organizing committee for the last nine years. The committee has always been made up of dedicated volunteers who give their time and superb effort to deliver a professional-quality event for the benefit of the Chapter’s members. I have nothing but great things to say about this group of people, whom I have come to know and respect.

I will be back next week. In the meantime, if you are curious about the ISACA LA Spring Conference, head over to the Spring Conference website and check it out.

Fresh Links Sundae – April 15, 2012 Edition

Fresh Links Sundae encapsulates some pieces of information I have come across during the past week. They may or may not be ITSM related. Often they are from people whose work I admire, and I hope you will find something of value.

5 Ways to Fix Your High Value Jerks Susan Cramm suggested strategies to deal with “talent jerks” who deliver results yet intimidate their colleagues and reports in the organization. (Valuedance)

Moving IT into the unknown with boldness, courage and strength to drive business value Robert Stroud discussed the importance of transforming IT from followers of the business to equal partners sharing in the common goals of the organization’s mission. (CA on Service Management)

Man Alive, It’s COBIT 5: How Are You Governing And Managing Enterprise IT? With the release of COBIT 5, Stephen Mann outlined his initial thoughts on the new framework from ISACA. (Forrester Blogs)

A Change Management Strategy for Clouds in Azure Skies Jeff Wayman discussed five Change Management strategies that can promote success in your cloud operations. (ITSM Lens)

Meet your iceberg. Now in 3D Roman Jouravlev explained why selling IT processes to business customers is, in most cases, pointless and doomed from the start. (ITSM Portal)

Leadership Encourages Hope Bret Simmons discussed what leaders can do to give their followers hope: in his words, the belief that one knows how to perform and is willing to direct and sustain consistent effort to accomplish goals that matter. (Positive Organizational Behavior)

10 Predictions from Experts on Big Data will Impact Business in 2012 Experts at Forrester, Gartner, Ovum, O’Reilly, and more offered ten predictions on how the Big Data realm will develop and impact business in 2012. (Evolven Blog)

Too much information Barclay Rae talked about the ‘inconvenient truth’ that conventional IT reporting is, for the most part, of little business or IT management value. (BarclayRae Website)

Reducing Negativity in the Workplace Marshall Goldsmith discussed a simple, yet effective strategy to reduce “whining time.” (Marshall Goldsmith)

The Great Collision Umair Haque talked about a Great Collision in which the future we want is at odds with the present we choose, and what to do about it. (Harvard Business Review)

DIY Process Assessment Execution – Analyzing Results and Evaluating Maturity Levels

In the previous post, I gave an example of a process assessment survey. Using a one-to-five scale model, you can arrive at a weighted (or simple average) score for a given process after collecting the data from the assessment participants. The more data points (or survey results) you can collect, the more realistic (and hopefully accurate) the process maturity score will be. Before you take the process maturity scores and start making improvement plans, I think there are two other factors to consider when analyzing and evaluating the overall effectiveness of your processes. The two additional factors are:

  • Perceived importance of the process:

In addition to measuring the maturity level of a process, I think it is also important to measure how your customers and the business perceive the importance of the process. That information can help in gauging and prioritizing the investments that should go into the improvement plan for a process. For example, a process with a low maturity level but perceived to be of high importance to the business may be a good candidate for some serious and well-planned investment. On the other hand, a process that has a high maturity level in IT’s eyes but is perceived to have lower importance to the business may signal that you should take a further look at the current investment level and see whether some scaling back or reallocation of funds could be an option. After all, we want to be in a position where the investment in any process yields the most value for the organization overall. We simply cannot make decisions on the improvement plans without understanding the perceived business value.

Measuring the perceived importance accurately requires asking the right questions and getting feedback from the right audience. People from the senior management team or IT customers who are considered power users are probably in a better position than others to provide this necessary insight. Also, simply asking IT customers how important a process is to the organization may not be effective, because those customers are not likely to be as familiar with the nitty-gritty of IT processes as we are. We will need to find a way to extract the information by stating the questions in a way that our customers can understand and respond to, without getting into too much technical jargon.

As an example, the result of this analysis could be a bar chart showing the maturity level and the perceived importance level for the processes under assessment.
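For those who want to produce that chart quickly, here is a hedged sketch using matplotlib. The process names and scores are made up purely for illustration and would come from your own survey data.

```python
# A minimal sketch of a maturity-versus-perceived-importance bar chart.
# Scores below are made-up illustrations on a 1-to-5 scale; requires matplotlib.
import matplotlib.pyplot as plt

processes = ["Incident Mgmt", "Problem Mgmt", "Change Mgmt", "Capacity Mgmt"]
maturity = [3.2, 2.1, 2.8, 3.5]        # assessed maturity
importance = [4.5, 3.9, 4.2, 2.6]      # perceived business importance

x = range(len(processes))
width = 0.35
plt.bar([i - width / 2 for i in x], maturity, width, label="Maturity")
plt.bar([i + width / 2 for i in x], importance, width, label="Perceived importance")
plt.xticks(list(x), processes, rotation=15)
plt.ylim(0, 5)
plt.ylabel("Score (1-5)")
plt.legend()
plt.tight_layout()
plt.show()
```

A large gap between the two bars for a process (low maturity, high importance) is the visual cue for where a well-planned investment may pay off first.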

  • Degree of Integration Between Processes

Another factor to consider before taking a process maturity score and making an improvement plan is how well the processes integrate with one another. Most ITSM processes rarely act alone, and the effectiveness of an overall ITSM program also depends on the level of integration between processes. Assessing how well one process integrates with another generally involves looking at how well the output from one process is used in other processes. Some examples of process integration for problem management include:

    • Processes Providing Input Into Problem Management:
      • Capacity management process could provide historical usage and capacity trends information to aid the root cause analysis or formulation of permanent solutions.
      • Incident management process could provide incident details for the root cause analysis activities. Incident data can also enable proactive problem management through the use of trend analysis.
      • Configuration management process could provide relationship information between configuration items, which can help in determining the impact of problems and potential resolutions.
    • Processes Receiving Output from Problem Management:
      • Incident management process could receive known error records and details of temporary fixes in order to minimize the impact of incidents.
      • Change management process could receive requests for change triggered by problem management to implement permanent solutions to known errors.

What scale should you use to rate the integration between processes? I think a simple scale of one to five should work just fine. For example:

    • One could indicate the output from the originating process is used inconsistently by the target process
    • Two could indicate the output from the originating process is used consistently but only informally by the target process
    • Three could indicate the output from the originating process is used consistently by the target process in a documented manner
    • Four could indicate the output from the originating process is used consistently to support the target process in a managed way
    • Five could indicate the output from the originating process is used consistently to support the target process in an optimized way

You define what the scale really means for your environment in a way that is easily understandable by your team. Also keep in mind that not all processes must integrate seamlessly with other processes on every possible front in order to have an effective ITSM program; however, good use of the integration scores can help us uncover opportunities to capitalize on our strengths or to improve upon our challenges. For example, a low integration score between the incident and problem management processes could signal an opportunity to improve how those two processes exchange and consume each other’s output. If we find the known error database is not being utilized as much as it should be during incident triage, we should dig in further and see what actions we can take to improve the information flow. If the problem management process is being hampered by a lack of accurate incident information coming from the incident management process, the integration score should point to the need to raise the quality of information exchange between the two processes.

As an example, the result of the process integration analysis could be a two-by-two chart showing the integration scores between processes.
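As one way to make those scores actionable, here is a minimal sketch that records pairwise integration scores and lists the weakest links first. The process pairs, scores, and threshold are made-up examples, not prescribed values.

```python
# A minimal sketch of recording integration scores (1-5) between a source
# process and the target process that consumes its output, then flagging
# weak links. All values are hypothetical examples.
integration_scores = {
    ("Incident Management", "Problem Management"): 2,
    ("Capacity Management", "Problem Management"): 3,
    ("Configuration Management", "Problem Management"): 4,
    ("Problem Management", "Change Management"): 3,
}

WEAK_LINK_THRESHOLD = 3  # anything below this deserves a closer look

for (source, target), score in sorted(integration_scores.items(), key=lambda kv: kv[1]):
    flag = "  <-- improvement opportunity" if score < WEAK_LINK_THRESHOLD else ""
    print(f"{source} -> {target}: {score}{flag}")
```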

We have come a long way in this DIY process assessment journey, from gathering the potential resources, planning for the assessment, and executing the assessment, to analyzing the results. In the next and concluding post on the process assessment topic, we will discuss presenting the assessment results and suggest some quick-win items to consider as part of the follow-up activities.

DIY Process Assessment Execution – Process Survey Example

In the last DIY assessment post, we discussed the data gathering methods and instruments to use for the surveys, workshops, and interviews. No matter what method(s) you end up deploying for your assessment, you will need a list of good/effective/best practices for a process in order to formulate the assessment questions. In the first post of the series, we talked about what reference sources you can use to come up with a list of good practices for a given process. In this post, we will illustrate an example of what the good practices and survey questions might look like for Problem Management.

Problem Management Process Assessment Questionnaire Example

As you look through the example document, I would like to point out the following:

  1. Each question in the questionnaire represents a good practice that is part of what a mature process would look like. To come up with the list of practices, I leveraged the information from ISO/IEC 20000 Part 2: Guidance on the Application of Service Management Systems. Helpful information sources like ITIL, ISO 20000, COBIT, etc. provide a great starting point for us DIY’ers, and for the most part there is no reason to reinvent the wheel.
  2. To rank the responses and calculate the maturity level, I plan to use the 5-point scale of CMMI. The maturity levels used by CMMI include 1) Initial, 2) Repeatable, 3) Defined, 4) Managed, and 5) Optimized. However, the maturity levels will not likely be something your survey audience will know very well, so we need to find some other ways for our survey audience to rank their answers. As you can see from the example, I used either the scale of 1) Never, 2) Rarely, 3) Sometimes, 4) Often, 5) Always or 1) Not at All, 2) Minimally, 3) Partially, 4) Mostly, 5) Completely. You don’t have to use both scales – it all depends on how you ask the questions. I could have asked all questions with the scale of 1) Never, 2) Rarely, 3) Sometimes, 4) Often, 5) Always or vice versa. In my example, I chose to mix things up a bit by using both scales just to illustrate the fact that both scales are viable for what we need to do.
  3. Some questions are better asked with closed-end options like Yes or No instead of a scale. Those questions tend to deal with whether you have certain required artifacts or deliverables. For example, you either have a documented problem management process and procedures, or you don’t.
  4. As you can see, the scale questions translate nicely when calculating the maturity level. You may calculate the maturity level by using a simple average of all responses from the scale questions, where all questions have an equal weight or preference. Depending on your environment or organizational culture, you may also assign a different weight to each question to emphasize certain practices over others. For the closed-end questions, you will need to think about what the responses of “Yes” and “No” mean when you calculate the final maturity level. For example, you may say a “Yes” for a group of questions earns a score of 3 out of 5, while a “No” equals 1. For some questions, you may even say the “Yes” response equals 5. A worked scoring sketch follows this list.
  5. This is a simplistic model for assessing and calculating the maturity level for a DIY approach. You will need to construct a similar good practice model for each process you plan to assess. Coming up with a good practice model to assess against can turn into a significant time investment. However, the majority of the effort is upfront, and you can re-use the model for subsequent assessments. If you contract out the assessment exercise to a consultant, coming up with the best practice model to evaluate your processes against is normally a deliverable from the consultant. Be sure to spend some time understanding your consultant’s model, and make sure the best practice model is applicable to your organization. It is an important way to ensure the assessment results will be meaningful and easier for everyone to understand.
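Here is the scoring sketch referenced in item 4 above. It is a minimal example, assuming scale answers map directly to one-to-five values and closed-end answers map to agreed numeric values; the questions, answer mappings, and weights are hypothetical.

```python
# A minimal scoring sketch: scale answers map to 1-5, closed-end (Yes/No)
# answers map to agreed numeric values, and each question can carry a weight.
# All questions, mappings, and weights below are hypothetical examples.
FREQUENCY_SCALE = {"Never": 1, "Rarely": 2, "Sometimes": 3, "Often": 4, "Always": 5}
YES_NO_DEFAULT = {"Yes": 3, "No": 1}    # a "Yes" here is only worth 3 out of 5
YES_NO_CRITICAL = {"Yes": 5, "No": 1}   # a "Yes" here earns full marks

# (question, response, answer mapping, weight)
answers = [
    ("Problems are reviewed at a regular meeting", "Often", FREQUENCY_SCALE, 1.0),
    ("Trend analysis is performed on incident data", "Rarely", FREQUENCY_SCALE, 2.0),
    ("A documented problem management process exists", "Yes", YES_NO_CRITICAL, 1.0),
    ("Known error records include workarounds", "No", YES_NO_DEFAULT, 1.0),
]

weighted_sum = sum(mapping[response] * weight for _q, response, mapping, weight in answers)
total_weight = sum(weight for *_rest, weight in answers)
maturity = weighted_sum / total_weight
print(f"Maturity level: {maturity:.2f} out of 5")
```

Whether you run this in a script or reproduce the same arithmetic in a spreadsheet, the important part is agreeing up front on the answer-to-number mappings and any question weights, so everyone reads the resulting score the same way.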

Please have a look at the example document and let me know what you would do to improve it. In the next post, we will continue the discussion of the assessment execution phase by examining how to analyze the results and evaluate the maturity levels. We will also discuss how inter-process integration as well as organization and culture could play a part in the maturity level assessment.

Fresh Links Sundae – April 8, 2012 Edition

Fresh Links Sundae encapsulates some pieces of information I have come across during the past week. They may or may not be ITSM related. Often they are from people whose work I admire, and I hope you will find something of value.

IT Service Management And ITIL Thinking – Brawn, Brains, Or Heart? Stephen Mann discussed ITSM people “stereotypes” and what we can learn from them in terms of communication, education, and ITSM tool selection. (Forrester Blogs)

The 7 Tenets of Providing Great Service at the Help Desk Jeff Wayman discussed seven principles a service desk can follow to provide great IT services. (ITSM Lens)

3 Ways to Simplify IT Reporting Julie Montgomery provided suggestions on effective IT reporting. (Plexent)

Eating the ITIL elephant one leg at a time Rob England discussed why sticking with the premise that an ITSM initiative is assembled from the ITIL processes could be missing the real point. (The IT Skeptic)

Dev-Ops? New-Ops? Cloud-Ops! Martin Perlin discussed what impact cloud computing has on the interaction between the Dev and Ops teams. (Evolven Blog)

Team Building without Time Wasting Marshall Goldsmith explained a simple, practical, and efficient process for team building. (Marshall Goldsmith)

Make Less Decisions Mark and Mike discussed how prioritizing and making decisions ahead of time can make later decision-making easier and help avoid distractions. (Manager Tools)

The Biggest Mistake You (Probably) Make with Teams Tammy Erickson discussed which is more important to promoting collaboration: a clearly defined approach toward achieving the goal, or clearly specified roles for individual team members? (Harvard Business Review)

The Things Customers Can Do Better Than You Bill Lee gave examples where customers can do things better than the organization thinks they can, and why an organization should involve its customers more to help grow its business. (Harvard Business Review)

Where’s the Web Heading? A Prediction Daniel Burrus gave his take on where web technologies such as Web 3.0 and even Web 4.0 might be going. (Strategic Insights Blog)

DIY Process Assessment Execution – Surveying the Participants and Gathering the Data

In the previous DIY assessment posts, we talked about what information sources to use and what considerations to take into account when planning your process assessment project. Now it is time to actually put together the instruments that will be used to gather the data and do the assessment. For the data gathering phase, there are three major activity areas to cover and manage:

  • Survey Management

Individual surveys and questionnaires, either in paper or electronic format, are often used to collect assessment data. Compared to the workshop or interview methods, surveys are generally less expensive to administer and can help in collecting sensitive data. The challenge with the survey method is that the questions need to be precise, with little room for personal interpretation, in order to ensure accuracy and usefulness of the data. Due to the difficulty of crafting precise questions for all situations, the survey is less suitable for questions that are complex and open-ended by nature, though free-form comments can still be collected via a survey.

In managing the surveys, here are some general steps and sequence to consider:

  1. Determine the survey mechanism: paper, electronic, or both.
  2. Select the processes to assess and the questions to ask.
  3. Identify the potential participants by process and collect their contact information so you know where the survey should go.
  4. Send out the survey communication with the timeline, login credentials, and other survey- or assessment-related information.
  5. Send out the surveys and monitor the completion progress. Send reminders to participants about completing the survey if necessary.
  6. Close the survey at the deadline.
  7. Follow up with the survey participants for further clarification, if necessary.
  8. Compile the survey results and provide the results to the lead assessor. The survey results can also be used as the input into the follow-up workshop and/or interview stages, if planned that way.
  • Workshop Management

The workshops usually get several or more individuals together so the assessor can interview the participants as a group. Workshops can be a time-saver compared to one-on-one interviews. They can sometimes also spur more discussions that yield good insights and useful information. At the same time, the workshop setting can limit the sharing of some sensitive information, since participants may be careful about revealing confidential data in front of others. During the workshops, I would recommend having someone from the assessment team serve as the scribe. The scribe allows the assessor to concentrate on facilitating the workshop and the interaction between participants, without the distraction of having to write down the conversations as they take place.

A workshop can be delivered using a number of formats, but it usually includes the following elements:

  1. Workshop kick-off. Participant introductions. Review of agenda and schedule.
  2. Review of the assessment project, scope, and goals.
  3. Review of the process in question, scope, and definitions.
  4. Further data gathering with more in-depth interviews within the workshop.
  5. Review of findings and discussion of the next steps.
  • Interview Management

The one-on-one interviews can be the most labor-intensive but also offer the most flexibility in collecting assessment data. They are also the best mechanism for handling complex questions and responses. A good deal of interview management is calendar, schedule, and time management. Scheduling meetings well in advance, with timely reminders, should go a long way toward helping participants stick to their interview appointments. Similar to the workshops, I would also recommend having a scribe, so the assessor can focus her attention on asking the questions and assessing the quality of the answers from the participants.

The types of question asked in an interview typically include:

  1. Open-ended questions where the participants can share descriptive information without a lot of structure.
  2. Closed-ended questions where the participants answer with short responses like “yes,” “no,” or a few other short options.

When asking a question, the interviewer should always frame the question first by stating the topic or background for the question. The interviewer will ask further probing questions, if necessary, to obtain the most complete answers for the question. When framing the questions, it is also important that the framing comments do not evolve into a leading question, where the participants feel compelled to answer the question a particular way based on how the question was asked to begin with.

Depending on the complexity and timing considerations of the assessment project, most organizations will likely take a blended approach, using all three methods for data gathering. Throughout the data gathering phase, it is important for the assessor and the scribe to maintain objectivity and not project their views or biases onto the conversations or interviews. The questions in the surveys or interviews should always be clear and precise, so they minimize the personal interpretation that could lead to biased answers. The scope assigned to the participants should also be reasonable, but still comprehensive enough that the most needed answers can be collected. In the next post, we will discuss a process survey example using the problem management process as part of the assessment scope. The example will illustrate what defines various levels of maturity within problem management and what questions to ask in the survey or interview.