Tips for Change Leaders – How to Show Your Impact

We work with a lot of leaders who are responsible for driving change. A common question they ask us is “How do I show the impact of what we’re doing?” Of course, they have their standard measures (e.g., improved outcomes, increased cost-efficiency, reduced delays), but the following are some of the tricky scenarios they share with us:

  • “Our numbers don’t show how well we’re doing … what do I do?”
  • “It will take a while before we start seeing an impact, but I need to show results now!”
  • “The team is really working better now, but we’re still not hitting our targets. How do I prove that it’s worthwhile to keep going?”

Sometimes projects involving change don’t get the support they need to realize their full potential, but it doesn’t have to be that way. Here are some tips that Change Leaders can use to tip the odds in their favor:


Tip 1: Begin with the end in mind
In addition to being one of my favorite Stephen Covey habits, beginning with the end in mind is an incredibly practical tool for successful projects. When leaders are driving change, this concept applies equally well. Some examples of applying it to change initiatives include:

  • Getting crystal clear on how things will be better once your change initiative is complete. Think about the conversations you will have in that future state, such as, “We’re way better at retaining our customers than we were in the past”. Then think about the numbers you’d like to cite to back it up, such as “We’ve decreased our customer churn rate by 60%”. To make these comparisons in the future, you will need a baseline reading of your current performance. This thought exercise can be an easy way of identifying the performance measures that will be essential to show an impact.
  • If you’ve led a change initiative before, you’ve probably learned that things rarely go as well as planned. So it’s important to set realistic expectations on when you will hit your performance targets. The rule of “under-promise and over-deliver” comes into play here.
  • Many seasoned change leaders also know that there will be some periods in the initiative where the efforts are high but the outcomes are low to non-existent. It’s important to think about the milestones along the way, or the interim performance measures, that can show that you’re making progress in the right direction, and that the initiative should keep going.
  • Where possible, choose performance measures that you can directly influence. The last thing you want when leading a change initiative is to be evaluated on a performance measure that you can’t directly influence.
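The churn example above can be sketched with a few lines of arithmetic. The figures here are purely illustrative, but they show why the baseline reading matters:

```python
# Hypothetical churn figures for illustration only -- the baseline must be
# recorded before the initiative starts, or no comparison is possible later.
baseline_churn = 0.20   # annual churn rate before the change initiative
current_churn = 0.08    # annual churn rate in the future state

# Relative reduction versus the baseline
reduction = (baseline_churn - current_churn) / baseline_churn
print(f"We've decreased our customer churn rate by {reduction:.0%}")  # 60%
```

Without the 0.20 baseline on record, the 60% claim can never be made, no matter how good the current number is.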

Tip 2: Set yourself up for success
It’s fairly common for change initiatives to generate a lot of excitement, and a lot of positive momentum where the people involved “just know” that they are making a positive impact. But at some point you do have to prove it. Some considerations to set yourself up for success include:

  • Track your performance along the way. Try to avoid what many change leaders do, which is to leave the performance evaluation to the very end. By tracking performance along the way, both you and the team can keep your eyes on the numbers that matter, and more importantly, correct course if things aren’t going in the right direction.
  • Plan for achievable interim wins. It’s easier for a change initiative to be supported if it’s showing incremental progress towards the goal. It’s harder to stay the course when it’s a situation of “just trust me … in 3 years this will all work great”. Give yourself and your change initiative some achievable wins along the way to the finish line.
  • Make sure your numbers tell the full story. If the numbers aren’t trending in the right direction but you know that the change initiative is generating positive outcomes, then it may be time to rethink your metrics. Try to be as creative as possible in thinking through how that benefit can be quantified. Stakeholder surveys can often help round out the full impact of the change. Try to avoid having the performance of the change initiative be based strictly on financial factors or productivity measures alone. There are costs to “softer” considerations; they are just harder to quantify.

Tip 3: Get some help from a data friend

Not everybody is good with numbers, performance measures, or target setting. If this is you, then do your conceptual thinking about the performance measures and then lean on someone who is good with spreadsheets, data, and/or basic statistics. They will be able to coach you on how to set up your measures so that a before-and-after comparison is valid and meaningful. They may even help you set up a tracking spreadsheet if you buy them a coffee!
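One of the first things a “data friend” might check is whether your before-and-after difference is bigger than normal week-to-week noise. A minimal sketch, with made-up weekly figures:

```python
from statistics import mean, stdev

# Made-up weekly counts of delayed orders, before and after the change
before = [42, 45, 39, 44, 41]
after = [35, 33, 36, 31, 34]

change = mean(after) - mean(before)
noise = max(stdev(before), stdev(after))

print(f"Average change: {change:+.1f} delays per week")
print(f"Typical week-to-week variation: {noise:.1f}")
# A change several times larger than the week-to-week variation is a much
# stronger "before and after" claim than a difference inside the noise.
```

Here the average drop (about 8 delays per week) is several times the typical variation, which is exactly the kind of comparison your data friend can help you make valid and meaningful.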

Hopefully these tips give change leaders some actionable tools to make sure they can show the impact of their change initiatives. There are many experts out there who I’m sure will have more to add. Please feel free to weigh in with your point of view.

Tips for Executives – How to Create a Culture of Evidence

We’re often asked, “How do we create a Culture of Evidence?” Most leaders know that they should be more evidence-based in how they work, but don’t know how to go about doing it.

We’ve all heard the phrase “Culture eats strategy for breakfast”, and anyone who’s attempted to drive change in a complex organization knows how true that statement can be. Many seasoned leaders also know that culture change doesn’t happen overnight, but here are some tips you can use to get started.


Tip 1: Paint a picture of “What a Culture of Evidence looks like”
If you want to make meaningful progress towards creating a culture of evidence, there’s no better place to start than envisioning your future state. Things to consider include:

  • How will life be better? For you, your team and for the company?
  • What opportunities will you be able to access?
  • What risks will you be able to avoid?
  • What decisions will be smarter?
  • What time will be saved?

If you can create a compelling vision of your organization in the future that thrives in a Culture of Evidence, then you can use this to win supporters.

Tip 2: Set the standard for “What counts as evidence?”
In the spirit of “crawl, walk, run”, getting started with using evidence doesn’t have to begin with hiring a team of scientists, researchers, and lawyers. To begin with, it may be as simple as using data to support your decision-making, carrying out basic research, or using spreadsheets to do “what if” analysis. Some leaders do this already, but many still rely on intuition alone to make their decisions.

The following is an illustrative example of “what counts as evidence?”:

  • A declarative statement of your position such as “I believe that we should launch a social media awareness campaign for our red widgets”
  • Some form of objective proof that shows how you formed your position, such as “According to our market data 85% of our target customers have never heard of our red widgets, and 57% of them use social media. The campaign would be cost effective even if it only generated a 5% increase in our market share.”
  • A disclosure of what you don’t know, such as “Admittedly our market data is one year old, so we’re assuming that the patterns still hold.”
  • An action statement, such as “I’d like to update our market data but the delays and costs outweigh the risk of missing an opportunity … I recommend that we launch the campaign and track performance.”

The ultimate goal of evidence is that it holds up to the review process, meaning that another leader could review the evidence and arrive at the same conclusions. Along those lines, “what counts as evidence?” could be just that … an objective analysis that has been peer reviewed.


Tip 3: Put the tools in place
To set your team up for success, you will want to make sure that the basic tools are available for evidence-based thinking. Some questions to consider include:

  • Are the right investments being made to collect the right data?
  • Does your team have access to the data they need? Is the data being collected at the source but not making it into the data warehouse? Or is the data there, but the privacy levels are too restrictive?
  • Do they have the skills for working with the data, or alternatively, is the right information available in insightful reports or visual dashboards?
  • Do they have the right technical and human resources to perform deeper analyses in response to important business questions that arise?

Tip 4: Lead by example
If you want to convince your team and your peers that you are fully behind this idea of a Culture of Evidence, then you’ll need to walk the talk. This will require effort at the beginning, but after a while it will become just “the way things are done around here”. Leading by example can include shifting your own language from “I think this is what we should do …” into “The evidence tells me that this is what we should do …”

It can also include making a concerted effort to not do things the old way because “that’s the way we’ve always done it” but instead doing things in ways that are proven to generate the right outcomes. This relates to everyday decision-making and operations, as well as longer-term strategy and planning.

Tip 5: Reward the adopters
It is often said that “you get what you reward”. This is an easy concept to apply to building a Culture of Evidence. For example you can reward your team for using evidence in situations like:

  • Decision-making on special projects: Projects with proposals backed by supporting evidence are often approved, whereas other projects often aren’t.
  • Decision-making on budget: Budget increases (or exemptions from budget cuts) are generally provided to those departments that can prove that they need it, whereas departments that can’t prove their value miss out.
  • Decision-making on promotions: Team members who demonstrate the effective use of evidence are generally promoted to higher positions, whereas other team members aren’t.

By taking this approach, it won’t take long for people in your organization to learn that the way to win is by embracing an evidence-based approach. Team members will either adopt the new direction or self-select out of your organization. Over time this will increase the momentum of the culture change, and gradually you will find that your organization attracts talent that values a Culture of Evidence.

Tips for Executives – How to Get the Data You Need

One of the most common complaints that we hear from leaders and executives is that they have “too much data” and “not enough information”. Some examples of what they mean by “too much data” include:

  • Reports that consist of pages and pages of numbers
  • Tables of figures with no overall summary number
  • Charts that are cluttered and confusing
  • Analyses that show a lot of numbers but no “so what” message

It doesn’t have to be that way. Here are a few tips that executives can use to get the data they need:

Tip 1: Ask yourself “What information would help me be more effective?”
It may sound selfish, but you should ask yourself “What information would help me be more effective in my job?” This might be information that helps you save your own time, make better decisions, or seize big opportunities.


Another way to approach this question is to review the data that you already have available and ask yourself “What isn’t this telling me?” or “Why is this not useful to me?”

Based on this thought process, prepare a simple table with two columns. In the first column include a description of what you want, and in the second column identify why you want it. Then choose your top 3 to 5 items on the list. Now you’re ready to start the next step – following up with your Data Team and/or your Business Intelligence people to have a first conversation about your top-ranked items.

Tip 2: When people say data isn’t available, use the “5 Whys”
Many data people have difficulty seeing the world beyond the standard data they use every day. So when you meet with them and tell them about the data you need, chances are they will reply by saying “that just isn’t available”.

When it comes to data – almost anything is available – it’s just a matter of how much you’re willing to fight to get what you need.

The “5 Whys” is a simple process for getting to the root of an issue. When your data people tell you that getting the data you need is impossible, ask “why”. They will give you a list of reasons such as “it’s not in the data warehouse”, “we don’t measure that”, or “the system doesn’t allow that type of reporting”. Pick any of the reasons, and then ask “why” again, which will generate a new list of reasons. Continue until you’ve reached the root of the issue (hopefully in five or fewer “whys”). The root issue is often one or more of the following:

  • Nobody thought to ask for this before
  • At some point in the past, somebody decided that it was too hard to collect the data
  • The people running the analysis and reporting are limiting themselves based on the capabilities of their reporting tools
  • Nobody has thought of taking a prospective data collection approach, and/or nobody has thought of doing a sampling approach (to reduce data collection costs)

After a few meetings, you should now have the real reasons why you don’t currently have the information you need. You may even have a sense of how much it would cost to get it.

Tip 3: Estimate the cost of not having the information you need
This is the step where you can build your convincing argument. For each of your top-ranked ideas, think about what it’s costing you to not have access to that information.

Does it translate to lost productivity? Lost time? Missed opportunities? Lost revenue? Customer loyalty? Employee turnover? If so, then you can translate these consequences into real, tangible costs. This isn’t an exercise in high-precision activity-based costing – it’s about getting the cost estimates roughly right.

These figures give you an idea of how much your organization could potentially invest in better data and reporting. If you’re business-minded, you could even work out the investment amounts that would still generate a positive return.
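Getting the cost “roughly right” is just a few lines of arithmetic. The figures below are invented for illustration, not real benchmarks:

```python
# Invented figures for illustration: the cost of one information gap
hours_lost_per_week = 6    # time spent manually chasing and reconciling numbers
loaded_hourly_cost = 75    # fully loaded cost per hour, in dollars
working_weeks = 48

annual_cost = hours_lost_per_week * loaded_hourly_cost * working_weeks
print(f"Estimated annual cost of the gap: ${annual_cost:,}")  # $21,600

# Any data or reporting investment below this break-even figure that removes
# the wasted effort would generate a positive return on investment.
```

Repeat the estimate for each of your top-ranked items, and you have the rough business case this tip describes.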

Armed with this analysis, you’re now in a position to convince others what this information is worth. Which brings us to our last tip.

Tip 4: Gain the support of the leadership team
Chances are that the information that will help you be more effective in your role will also be useful to others on the leadership team and throughout your organization. If you can gain the support of the rest of the leadership team, you increase the chances of getting what you want.

Each team dynamic is different, but a one-on-one approach often works well. These can be quick conversations with each leader, with a real focus on “what’s in it for them”. You may be surprised by how many of your peers are equally frustrated by the lack of good information.

With the support of the team, the cost of not having the information, and some return-on-investment estimates, you’ll be able to make the case for the information you need to be successful.

These are just a few tips, but I’m sure there are many leaders out there with more great ideas and experiences. If you have suggestions, or alternate points of view, please weigh in.

 

Note: What is a Data Team?
When we refer to “Data Teams”, it’s a catch-all for the groups of technical, statistical, and subject-matter experts involved in providing information to support their organization. These teams are sometimes called “Business Intelligence”, “Decision Support”, or “Information Management”, but they can also be internal consultants such as “Operations Analysts”, “Strategic Information”, or “Research”. Many of these concepts apply equally to teams of Data Scientists.


Tips for Data Teams – The Consistency Check

Have you ever delivered an analysis, only to hear from your client that “these numbers can’t be right”? It’s hard to convince someone that your results are credible when they don’t pass the first five seconds of review. As much as we may not want to admit it, sometimes the numbers are indeed wrong. So how do we avoid these situations? One type of check that a Data Team can adopt is the “Consistency Check”. Here are some questions to ask yourself when doing a consistency check:


Question 1) Are the numbers consistent with themselves?
When building complicated analyses, different sections of the analysis can fall “out of sync” with each other if they are not all updated in the same way. When this happens it can produce inconsistent summary results (e.g., the cover page reports 255 conversions per hour, but the supporting details on other pages show 237 conversions per hour). Sometimes we place too much faith in our reporting tools and assume that they will report exactly as intended. In other situations it’s just a matter of being too close to the work. After a while the numbers are burned into your short-term memory and you lose the ability to review them with an objective eye. Suggested work-arounds include:

  • Have another member of your team do a consistency check on the results, preferably someone who hasn’t been involved in the work.
  • Take an old school approach. Print out the results, and use different colored highlighters for each type of metric. Highlight the summary numbers that represent the same result, and confirm that they are indeed consistent. Continue until you’ve highlighted all summary numbers.
  • Take another old school approach. Get your calculator out or use a separate spreadsheet, and confirm that you can replicate the summary numbers based on the results being presented. You may be surprised by how many of your clients are already doing this with your results.
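The calculator-or-spreadsheet check in the last point is also easy to automate. A minimal sketch, reusing the 255-versus-237 cover-page example from above with illustrative detail figures:

```python
# Do the supporting details reproduce the summary on the cover page?
# Detail figures are illustrative; the 255 vs. 237 mismatch echoes the
# cover-page example earlier in this section.
detail_conversions = [51, 49, 47, 44, 46]   # conversions per hour, by region
cover_page_total = 255

recomputed_total = sum(detail_conversions)
if recomputed_total != cover_page_total:
    print(f"Inconsistent: details sum to {recomputed_total}, "
          f"but the cover page reports {cover_page_total}")
```

Running a check like this before release catches exactly the kind of out-of-sync summary that fails the first five seconds of a client’s review.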

Question 2) Are the numbers consistent with your previous analyses?
When a client receives a new set of results they often pull up the previous results that you gave them. They are asking the question “how much have things changed?” You can beat them to the punch by doing this consistency check yourself. To be more specific:

  • Start with the previous result that was presented or released. Compare the summary numbers from the previous results to your current summary numbers.
  • Assess if the changes are interpretable. If they are, then this interpretation will likely be part of what you communicate when you release the new results. If the changes are not interpretable, then it’s time to go back into your current results, or your previous results to diagnose why the changes aren’t explainable.
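The previous-versus-current comparison above is just as easy to script. The metric names and the 10% threshold below are illustrative assumptions, not a standard:

```python
# Compare summary metrics from the previously released results against the
# current ones, and flag changes big enough to need an explanation.
previous = {"conversions_per_hour": 237, "avg_handle_time_min": 6.0}
current = {"conversions_per_hour": 229, "avg_handle_time_min": 9.5}

THRESHOLD = 0.10  # flag relative changes beyond 10% (illustrative cut-off)

for metric, old in previous.items():
    new = current[metric]
    change = (new - old) / old
    note = "  <-- explain before release" if abs(change) > THRESHOLD else ""
    print(f"{metric}: {old} -> {new} ({change:+.1%}){note}")
```

Any flagged metric either gets an interpretation you can include when you release the results, or it sends you back to diagnose the current or previous analysis.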

Question 3) Are the numbers consistent with other reports?
Stepping into the shoes of your audience, think about the other reports they refer to on an on-going basis. It doesn’t matter if those reports came from a completely different source – from their perspective, all data from all sources is supposed to tell the same story. In a similar manner to Question 2, you can do some additional homework so that your results are as valuable to your audience as possible. For example, you could:

  • Ask your clients if they have any other reports that they use frequently, and if they would be willing to share them with you. You can frame it honestly – you want to make sure that your results are valid, and if they are different from other sources, you want to be able to explain why.
  • Do a little research on your own, in particular reviewing any routine corporate or industry reporting. Sometimes a skeptic can be won over by proof that you did your homework. Again, if the numbers from other sources line up, it becomes something you can report as proof of consistency. If the numbers don’t line up and you can’t explain the difference, then it may be an indication that you need to review your analysis.

Question 4) Are you telling the right story?
Taking all of the above into account, you should be able to deliver your results confidently. You now know that the numbers in the report are consistent among themselves, that the analysis is consistent with previous analyses, and that the results are interpretable in comparison to other sources. This can become part of your summary and presentation of your stunning new work. Or, at the least, it can form an addendum to the email or presentation that shows your audience the effort you went through to ensure that the numbers are the right numbers. Then you have the foundation to begin telling the actual story of the analysis (the “so what” message).

These are just a few tips, but I’m sure there are many experts out there with more great ideas. If you have suggestions, or alternate points of view, please weigh in.



Reducing Rework in a Data Team

As much as we’d all like to get things done right the first time, with analysis and modeling it’s not always possible.

When delivering results, it’s fairly common to receive requests for minor revisions – and most of that we can all handle. But every so often the situation catches you by surprise. You’re delivering what you think is a great piece of work only to learn that it missed the mark completely. You hear statements like “This isn’t what I asked for!” or “You misunderstood what I asked for!” and you wonder where things went wrong.

Sometimes you can rightfully blame the person who requested the analysis, and then conveniently changed their mind. But more often the breakdown happens around communication and agreeing on expectations.


So what do you do? Here are some coping strategies:

1) Ask the question “What does a job well done look like?”
The next time you’re asked to run a major analysis where you feel that you don’t have an adequate understanding of what is being asked, try this script:

“I want to make sure that I give you what you want. Would you mind if I grabbed a couple of minutes to clarify a few things?”

Then ask your clarifying questions. For example:

  • What’s the business question that this analysis is supporting you with?
  • Do you just want the summary, or did you want the supporting details?
  • Is this analysis just for your reference, or is it going to be distributed?
  • How accurate does this need to be?

The answers to these questions can make a big difference in determining the final deliverable. If you only have time for one question, the first question is the best one to ask.

If you’re lucky enough that the person making the request is willing to spend more than a couple minutes with you, then you can try to get crystal clear on “What does a job well done look like?” The following are some of the statements that you might hear:

  • It will help me answer these questions …
  • The numbers will be consistent with our annual report
  • The summary of results will be jargon-free
  • The results will be delivered by Friday morning at 10 am, both by email as well as a color print out on my desk

2) Put your understanding in writing
Now, with your heightened clarity, you can put it into writing. A short follow-up email of the form “Thanks for clarifying. So, just to recap, I will …” provides one more opportunity for corrective feedback.

In many situations you won’t be able to do the first step (getting clear on “what a job well done looks like”) because the person making the request is too busy. Even in these situations, it’s still worthwhile putting your understanding into writing. You can write the same short email, but this time with an opening line of the form “I know you’re too busy to discuss the analysis, so I’ll make the following assumptions when I do it …” Then you can add a closing line: “Hopefully that captures it. If I don’t hear otherwise from you, I’ll deliver results based on this understanding.”

3) When delivering your result, include the original request
You’ve done the hard work of clarifying expectations, you’ve done the analysis, and now comes the easy part. When summarizing the results, make sure you attach the clarifying email to your analysis. If you’re delivering a hard copy, you can attach a printout of the clarifying email to the top.

Using this approach the person making the request will be able to see their role in the entire process. It won’t take long for people to see the value of slowing down and spending a few minutes getting clear on the request.

4) Follow up after the fact
The worst situations are when you’ve put in the hard work, but it wasn’t really what the requester wanted, so they don’t use it. They’ve wasted their time and your time, and they still didn’t get what they wanted. Because they feel embarrassed about not using the work, they often won’t bother giving you feedback.

So, it’s up to you to solicit feedback after each major deliverable. A brief check-in after the fact can yield great feedback. If you’re not getting rave reviews about the great work you did, you can ask “What could I have done to make it even better?” This seemingly innocent question prompts the requester to give candid feedback, and demonstrates that you really care about the value of your work.


These coping strategies are not for everyone, and are not needed in every situation (especially the quick and easy analyses). But it’s the times when we get it wrong where we really appreciate the value of clarifying expectations. If you have your own coping strategies, please weigh in.
