Tips for Executives – How to Create a Culture of Evidence

We’re often asked, “How do we create a Culture of Evidence?” Most leaders know that they should be more evidence-based in how they work, but don’t know how to go about it.

We’ve all heard the phrase “culture eats strategy for breakfast,” and anyone who’s attempted to drive change in a complex organization knows how true that statement can be. Seasoned leaders also know that culture change doesn’t happen overnight, but here are some tips you can use to get started.


Tip 1: Paint a picture of “What a Culture of Evidence looks like”
If you want to make meaningful progress towards creating a culture of evidence, there’s no better place to start than envisioning your future state. Things to consider include:

  • How will life be better? For you, your team and for the company?
  • What opportunities will you be able to access?
  • What risks will you be able to avoid?
  • What decisions will be smarter?
  • What time will be saved?

If you can create a compelling vision of your organization in the future that thrives in a Culture of Evidence, then you can use this to win supporters.

Tip 2: Set the standard for “What counts as evidence?”
In the spirit of “crawl, walk, run”, getting started with using evidence doesn’t have to begin with hiring a team of scientists, researchers and lawyers. To begin with, it may be as simple as using data to support your decision-making, carrying out basic research, or using spreadsheets to do “what if” analysis. Many leaders do this already, but many others still rely on their intuition to make their decisions.

The following is an illustrative example of “what counts as evidence?”:

  • A declarative statement of your position such as “I believe that we should launch a social media awareness campaign for our red widgets”
  • Some form of objective proof that shows how you formed your position, such as “According to our market data 85% of our target customers have never heard of our red widgets, and 57% of them use social media. The campaign would be cost effective even if it only generated a 5% increase in our market share.”
  • A disclosure of what you don’t know, such as “Admittedly our market data is one year old, so we’re assuming that the patterns still hold.”
  • An action statement, such as “I’d like to update our market data but the delays and costs outweigh the risk of missing an opportunity … I recommend that we launch the campaign and track performance.”

The ultimate goal of evidence is that it holds up to the review process, meaning that another leader could review the evidence and arrive at the same conclusions. Along those lines, “what counts as evidence?” could be just that … an objective analysis that has been peer reviewed.
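To show how the cost-effectiveness claim in the example above might be backed up, here is a minimal back-of-envelope sketch in Python. The market size, margin, and campaign cost are hypothetical placeholders invented for illustration, not figures from the example.

```python
# Back-of-envelope check of a claim like "the campaign is cost effective
# even at only a 5 point increase in market share". All figures are hypothetical.

total_market_units = 200_000   # annual red-widget market (units)
margin_per_unit    = 8.50      # contribution margin per unit sold
campaign_cost      = 60_000

share_point_gain = 0.05        # five points of market share, the conservative case
extra_units  = total_market_units * share_point_gain   # 10,000 extra units
extra_margin = extra_units * margin_per_unit            # $85,000 of contribution
net_benefit  = extra_margin - campaign_cost

print(f"Extra contribution margin: ${extra_margin:,.0f}")
print(f"Campaign cost:             ${campaign_cost:,.0f}")
print(f"Net benefit:               ${net_benefit:,.0f}")
```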


Tip 3: Put the tools in place
To set your team up for success, you will want to make sure that the basic tools are available for evidence-based thinking. Some questions to consider include:

  • Are the right investments being made to collect the right data?
  • Does your team have access to the data they need? Is the data being collected at the source, but it’s not being stored in the data warehouse? Or is the data there, but the privacy levels are too restrictive?
  • Do they have the skills for working with the data, or alternatively, is the right information available in insightful reports or visual dashboards?
  • Do they have the right technical and human resources to perform deeper analyses in response to important business questions that arise?

Tip 4: Lead by example
If you want to convince your team and your peers that you are fully behind this idea of a Culture of Evidence, then you’ll need to walk the talk. This will require effort at the beginning, but after a while it will become just “the way things are done around here”. Leading by example can include shifting your own language from “I think this is what we should do …” to “The evidence tells me that this is what we should do …”

It can also include making a concerted effort to not do things the old way because “that’s the way we’ve always done it” but instead doing things in ways that are proven to generate the right outcomes. This relates to everyday decision-making and operations, as well as longer-term strategy and planning.

Tip 5: Reward the adopters
It is often said that “you get what you reward”. This is an easy concept to apply to building a Culture of Evidence. For example you can reward your team for using evidence in situations like:

  • Decision-making on special projects: Proposals that come with supporting evidence are often approved, whereas other projects often aren’t.
  • Decision-making on budget: Budget increases (or exemptions from budget cuts) are generally provided to those departments that can prove that they need it, whereas departments that can’t prove their value miss out.
  • Decision-making on promotions: Team members who demonstrate the effective use of evidence are generally promoted to higher positions, whereas other team members aren’t.

By taking this approach, it won’t take long for people in your organization to learn that the way to win is by embracing an evidence-based approach. Team members will either adopt the new direction or self-select out of your organization. Over time this will increase the momentum of the culture change, and gradually you will find that your organization attracts talent that values a Culture of Evidence.

Tips for Executives – How to Get the Data You Need

One of the most common complaints that we hear from leaders and executives is that they have “too much data” and “not enough information”. Some examples of what they mean by “too much data” include:

  • Reports that consist of pages and pages of numbers
  • Tables of figures with no overall summary number
  • Charts that are cluttered and confusing
  • Analyses that show a lot of numbers but no “so what” message

It doesn’t have to be that way. Here are a few tips that executives can use to get the data they need:

Tip 1: Ask yourself “What information would help me be more effective?”
It may sound selfish, but you should ask yourself “What information would help me be more effective in my job?” This might be information that helps you save your own time, make better decisions, or seize big opportunities.


Another way to approach this question is to review the data that you already have available and ask yourself “What isn’t this telling me?” or “Why is this not useful to me?”

Based on this thought process, prepare a simple table with two columns. In the first column include a description of what you want, and in the second column identify why you want it. Then choose your top 3 to 5 items on the list. Now you’re ready to start the next step – following up with your Data Team and/or your Business Intelligence people to have a first conversation about your top-ranked items.

Tip 2: When people say data isn’t available, use the “5 Whys”
Many data people have difficulty seeing the world beyond the standard data that they use every day. So, when you meet with them and tell them about the data that you need, chances are that they will reply by saying “that just isn’t available”.

When it comes to data – almost anything is available – it’s just a matter of how much you’re willing to fight to get what you need.

The “5 Whys” is a simple process of getting to the root of an issue. When your data people tell you that getting the data you need is impossible, ask “why”. They will give you a list of reasons such as “it’s not in the data warehouse”, or “we don’t measure that”, or “the system doesn’t allow that type of reporting”. Pick any of the reasons, and then ask “why” again, which will generate a new list of reasons. Continue this until you’ve reached the root of the issue (hopefully in 5 or fewer “whys”). The root issue is often one or more of the following:

  • Nobody thought to ask for this before
  • At some point in the past, somebody decided that it was too hard to collect the data
  • The people running the analysis and reporting are limiting themselves based on the capabilities of their reporting tools
  • Nobody has thought of taking a prospective data collection approach, and/or nobody has thought of doing a sampling approach (to reduce data collection costs)

After a few meetings, you should now have the real reasons why you don’t currently have the information you need. You may even have a sense of how much it would cost.

Tip 3: Estimate the cost of not having the information you need
This is the step where you build your convincing argument. For each of your top-ranked ideas, think about what it’s costing you to not have access to that information.

Does it translate to lost productivity? Lost time? Missed opportunities? Lost revenue? Customer loyalty? Employee turnover? If so, then you can translate these consequences into real, tangible costs. This isn’t an exercise in high-precision activity-based costing – instead, it’s about getting the cost estimates roughly right.

These figures give you an idea of how much your organization could potentially invest in better data and reporting. If you’re business-minded, you can work out the investment amounts that would still generate a positive return on investment.
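To make the arithmetic concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it (hours lost, hourly rate, number of people affected, cost of the fix) is a hypothetical placeholder to be replaced with your own rough estimates.

```python
# Rough, order-of-magnitude estimate of the cost of NOT having the information,
# and the simple payback of fixing it. All numbers are hypothetical placeholders.

hours_lost_per_week = 6        # time spent hunting for / reconciling numbers
loaded_hourly_rate  = 90       # fully loaded cost of the people doing it
people_affected     = 4
weeks_per_year      = 48

annual_cost_of_gap = hours_lost_per_week * loaded_hourly_rate * people_affected * weeks_per_year

one_time_build_cost = 40_000   # e.g. a new report or data feed
annual_running_cost = 5_000

first_year_net = annual_cost_of_gap - (one_time_build_cost + annual_running_cost)
payback_years  = one_time_build_cost / max(annual_cost_of_gap - annual_running_cost, 1)

print(f"Annual cost of the information gap: ${annual_cost_of_gap:,.0f}")
print(f"First-year net benefit:             ${first_year_net:,.0f}")
print(f"Approximate payback period:         {payback_years:.1f} years")
```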

Armed with this analysis, now you’re in a position to convince others what this information is worth. Which brings us to our last step.

Tip 4: Gain the support of the leadership team
Chances are that the information that will help you be more effective in your role will also be useful to others in the leadership team and throughout your organization. If you can gain the support of the rest of the leadership team, then you can increase the chances of getting what you want.

Each team dynamic is different, but a one-on-one approach often works well. These can be quick conversations with each leader with a real focus on “what’s in it for them”. You may be surprised by how many of your peers are equally frustrated by the lack of good information.

With the support of the team, the cost of not having the information, and some return-on-investment estimates, you’ll be in a strong position to get the information you need to be successful.

These are just a few tips, but I’m sure there are many leaders out there with many more great ideas and experiences. If you have suggestions, or alternate points of view, please weigh in.

 

Note: What is a Data Team?
When we refer to “Data Teams”, we use it as a catch-all for the groups of technical, statistical, and subject-matter experts involved in providing information to support their organization. These teams are sometimes called “Business Intelligence”, “Decision Support”, or “Information Management”, but they can also be internal consultants such as “Operations Analysts”, “Strategic Information” or “Research”. Many of these concepts apply equally to teams of Data Scientists.


Tips for Data Teams – The Consistency Check

Have you ever delivered an analysis, only to hear from your client that “these numbers can’t be right”? It’s hard to convince someone that your results are credible when they don’t even pass the first 5 seconds of review. As much as we may not want to admit it, sometimes the numbers are indeed wrong, so how do we keep these situations from happening? One type of check that a Data Team can adopt is the “Consistency Check”. Here are some questions that you can ask yourself when doing a consistency check:


Question 1) Are the numbers consistent with themselves?
When building complicated analyses, different sections of the analysis can fall “out of sync” with each other if they are not all updated in the same way. When this happens, it can produce inconsistent summary results (e.g. the cover page reports 255 conversions per hour, but the supporting details on other pages show 237 conversions per hour). Sometimes we place too much faith in our reporting tools and assume that they will report exactly as intended. In other situations it’s just a matter of being too close to the work. After a while the numbers are burned into your short-term memory and you lose your ability to critically review them with an objective eye. Suggested work-arounds include:

  • Have another member of your team do a consistency check on the results, preferably someone who hasn’t been involved in the work.
  • Take an old school approach. Print out the results, and use different colored highlighters for each type of metric. Highlight the summary numbers that represent the same result, and confirm that they are indeed consistent. Continue until you’ve highlighted all summary numbers.
  • Take another old school approach. Get your calculator out or use a separate spreadsheet, and confirm that you can replicate the summary numbers based only on the results that are being presented. You may be surprised by how many of your clients are already doing this with your results. (A scripted version of this check is sketched below.)
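For teams that want to script the “replicate the summary numbers” step, here is a minimal sketch in Python. The metric, the figures, and the tolerance are hypothetical placeholders; in practice they would come from the report you are about to release.

```python
# Minimal consistency check: does the summary number on the cover page
# match what you get by re-deriving it from the supporting details?
# All figures below are hypothetical placeholders.

reported_summary = 255.0         # conversions per hour shown on the cover page

detail_rows = [                  # the supporting detail shown on later pages
    {"site": "A", "conversions": 2_400, "hours": 8},
    {"site": "B", "conversions": 1_900, "hours": 8},
    {"site": "C", "conversions": 1_388, "hours": 8},
]

derived_summary = sum(r["conversions"] for r in detail_rows) / sum(r["hours"] for r in detail_rows)

tolerance = 0.5                  # allow for rounding in the report
if abs(reported_summary - derived_summary) > tolerance:
    print(f"INCONSISTENT: cover page says {reported_summary:.0f}, details imply {derived_summary:.0f}")
else:
    print(f"Consistent: {derived_summary:.1f} conversions per hour")
```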

Question 2) Are the numbers consistent with your previous analyses?
When a client receives a new set of results they often pull up the previous results that you gave them. They are asking the question “how much have things changed?” You can beat them to the punch by doing this consistency check yourself. To be more specific:

  • Start with the previous result that was presented or released. Compare the summary numbers from the previous results to your current summary numbers.
  • Assess whether the changes are interpretable. If they are, then this interpretation will likely be part of what you communicate when you release the new results. If the changes are not interpretable, then it’s time to go back into your current results, or your previous results, to diagnose why the changes aren’t explainable. (A small scripted version of this comparison is sketched below.)
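Where this comparison lends itself to a quick script, a minimal Python sketch might look like the following; the metric names, figures, and the 10% threshold are all hypothetical placeholders.

```python
# Compare the new summary numbers against the previously released ones and
# flag the changes that are large enough to need an explanation.
# Metric names, figures, and the threshold are hypothetical placeholders.

previous = {"conversions_per_hour": 237, "avg_order_value": 41.2, "repeat_rate": 0.18}
current  = {"conversions_per_hour": 244, "avg_order_value": 57.9, "repeat_rate": 0.19}

threshold = 0.10   # flag anything that moved by more than 10%

for metric, old in previous.items():
    new = current[metric]
    change = (new - old) / old
    flag = "  <-- explain before releasing" if abs(change) > threshold else ""
    print(f"{metric:22s} {old:>8} -> {new:<8} ({change:+.1%}){flag}")
```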

Question 3) Are the numbers consistent with other reports?
Stepping into the shoes of your audience, you can think about the other reports that they refer to on an on-going basis. It doesn’t matter if the other reports that they use came from a completely different source – from their perspective, all data from all sources is supposed to tell the same story. In a similar manner to Question 2, you can do some additional homework so that your results are as valuable to your audience as possible. For example, you could:

  • Ask your clients if they have any other reports that they use frequently, and if they would be willing to share them with you. You can frame it honestly – you want to make sure that your results are valid, and if they are different from other sources, you want to be able to explain why.
  • Do a little research on your own, in particular reviewing any routine corporate reporting or industry reporting. Sometimes a skeptic can be won over by proving that you did your homework. Again, if the numbers from other sources line up, it becomes something you can report as proof of consistency. If the numbers don’t line up and you can’t explain the difference, then it may be an indication that you need to review your analysis.

Question 4) Are you telling the right story?
Taking all of the above into account, you should be able to deliver your results confidently. You should now know that the numbers in the report are consistent amongst themselves, that the analysis is consistent with previous analyses, and that the results are interpretable in comparison to other sources. This can now become part of your summary and presentation of your stunning new work. Or at least it can form an addendum to the email or presentation, showing your audience the effort you went through to ensure that the numbers are the right numbers. Then you have the foundation to begin telling the actual story of the analysis (the “so what” message).

These are just a few tips, but I’m sure there are many experts out there who have many more great ideas. If you have suggestions, or alternate points of view, please weigh in.



Reducing Rework in a Data Team

As much as we’d all like to get things done right the first time, with analysis and modeling it’s not always possible.

When delivering results, it’s fairly common to receive requests for minor revisions – and most of that we can all handle. But every so often the situation catches you by surprise. You’re delivering what you think is a great piece of work only to learn that it missed the mark completely. You hear statements like “This isn’t what I asked for!” or “You misunderstood what I asked for!” and you wonder where things went wrong.

Sometimes you can rightfully blame the person who requested the analysis and then conveniently changed their mind. But more often the breakdown happens around communication and agreeing on expectations.


So what do you do? Here are some coping strategies:

1) Ask the question “What does a job well done look like?”
The next time you’re asked to run a major analysis where you feel that you don’t have an adequate understanding of what is being asked, try this script:

“I want to make sure that I give you what you want. Would you mind if I grabbed a couple of minutes to clarify a few things?”

Then ask your clarifying questions. For example:

  • What’s the business question that this analysis is supporting you with?
  • Do you just want the summary, or did you want the supporting details?
  • Is this analysis just for your reference, or is it going to be distributed?
  • How accurate does this need to be?

The answers to these questions can make a big difference in determining the final deliverable. If you only have time for one question, the first question is the best one to ask.

If you’re lucky enough that the person making the request is willing to spend more than a couple minutes with you, then you can try to get crystal clear on “What does a job well done look like?” The following are some of the statements that you might hear:

  • It will help me answer this question …
  • The numbers will be consistent with our annual report
  • The summary of results will be jargon-free
  • The results will be delivered by Friday morning at 10 am, both by email and as a color printout on my desk

2) Put your understanding in writing
Now, with this heightened clarity, you can put your understanding in writing. A short follow-up email of the form “Thanks for clarifying. So, just to recap I will …” will provide one more opportunity for corrective feedback.

In many situations you won’t be able to do the first step (getting clear on “what a job well done looks like”) because the person making the request is too busy. But even in these situations it’s still worthwhile to put your understanding in writing. You can write the same short email, but this time it will have an opening line of the form “I know you’re too busy to discuss the analysis, so I’ll make the following assumptions when I do it …” And then, you can add a closing line: “Hopefully that captures it. If I don’t hear otherwise from you, I’ll deliver results based on this understanding.”

3) When delivering your result, include the original request
You’ve done the hard work of clarifying expectations, you’ve done the analysis, and now this is the easy part. When summarizing the results, make sure that you attach your analysis to the clarifying email. If you’re delivering it in hard copy, you can attach a print out of the clarifying email to the top.

Using this approach the person making the request will be able to see their role in the entire process. It won’t take long for people to see the value of slowing down and spending a few minutes getting clear on the request.

4) Follow up after the fact
The worst situations are when you’ve put in the hard work, but it wasn’t really what the requester wanted, and so they don’t use it. They’ve wasted their time, your time, and they still didn’t get what they wanted. Because they feel embarrassed about not using the work, they will often not bother giving you feedback.

So, it’s up to you to solicit feedback after each major deliverable. A brief check-in after the fact can yield great feedback. If you’re not getting rave reviews about the great work you did, you can ask “What could I have done to make it even better?” This seemingly innocent question prompts the requester to give candid feedback, and demonstrates that you really care about the value of your work.


These coping strategies are not for everyone, and are not needed in every situation (especially the quick and easy analyses). But it’s the times when we get it wrong where we really appreciate the value of clarifying expectations. If you have your own coping strategies, please weigh in.



Tips for Managing Priorities in a Data Team

We work with a lot of different Data Teams, and most of them are faced with the same challenge:

How do you handle all of these competing requests for information?

Below are some relatively easy-to-implement tips for dealing with this situation, but first let’s see why this can be so hard. The following are some of the more common reasons we’ve seen in the field:

  • Every request seems to be urgent. Most Data Teams are all too familiar with the expression “we need it yesterday”.
  • Every request seems to be very important. How can a Data Team not give priority to a request that comes from the CEO’s office or from the Board? What about situations where Public Relations needs good information to handle an emerging PR issue?
  • Requests for information are “free”, meaning that in most situations, the people requesting the information don’t have to pay for it. As a result, demand for information grows much faster than the capacity of the Data Team.


Here are some tips for Managing Priorities in a Data Team:

1) Keep a log of all active requests
As simple as it sounds, keeping an up-to-date log of all active requests is a “must have” enabler for managing competing requests in a Data Team. Many Data Team leads feel that they don’t need such a log, citing that they have it all under control, and that they are too busy to keep another list up to date. But such a log can help identify the capacity needed in the Data Team, and the skill mix that’s required. At minimum the Active Request Log should include the following information for each information request:

  • Who is asking for the information?
  • What are they asking for?
  • When did they ask for it?
  • Who in the Data Team is handling the request?
  • When did we promise to get it done?
  • What’s the status of the request (not started, active, completed, cancelled)?

In addition, the following information can be very helpful for planning purposes:

  • When was the information delivered?
  • How many hours of effort were involved in preparing it?
  • Was the due date pushed back? If so, how many times and by how many days?
  • Was there any feedback from the person who requested the information?

This list can be as simple as a whiteboard, a shared spreadsheet, a SharePoint list, or a Google Doc. The hard part is having the discipline to keep it up to date.
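A shared spreadsheet is usually enough; for teams that prefer something scriptable, a minimal sketch of the same log in Python could look like this. The fields mirror the lists above, and the sample entry is purely hypothetical.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Request:
    requester: str                      # who is asking for the information?
    description: str                    # what are they asking for?
    date_received: date                 # when did they ask for it?
    assigned_to: str                    # who in the Data Team is handling it?
    due_date: date                      # when did we promise to get it done?
    status: str = "not started"         # not started / active / completed / cancelled
    # optional fields that help with planning
    date_delivered: Optional[date] = None
    effort_hours: float = 0.0
    times_due_date_pushed: int = 0
    feedback: str = ""

active_log = [
    Request("VP Marketing", "Red widget conversion rates by region",
            date(2024, 3, 4), "Sam", date(2024, 3, 8), status="active"),
]
```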

2) Review the log as a Data Team every day
Having a daily 5-minute meeting as a Data Team may seem like a big burden. Who needs another meeting in their already-too-busy schedule? But if done right, a daily 5-minute meeting to review the Active Request Log can help a too-busy Data Team work together to make sure that the most important things are being worked on every day. Specific things that can be clarified during this 5-minute check-in include:

  • What must we get done today?
  • What must we get done in the next couple of days?
  • Who has the lead on each piece of work?
  • What requests need more support?
  • What counts as “good enough” for the requests that we’ll be working on today and tomorrow?

This quick meeting can set the entire Data Team in the right direction at the start of each day, and in doing so, go a long way toward reducing the last-minute scramble and making sure that the Data Team works to its full potential as a team.

3) When handling new requests, use the active request log to set expectations
If you have the discipline to do the above 2 steps, then after not too long you will have great information for managing expectations with new requests. For example, if there is a last minute urgent and important request for information, then at minimum you will now know:

  • How long will this really take us to complete? (A quick way to estimate this from the log is sketched below.)
  • Are there any recent requests for information that are similar to this one? If so, can that request be modified to meet this urgent need?
  • Will any active requests not be completed on time as a result of this new urgent request? If so, is the person making this new urgent request willing to take the heat?
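If the log has been kept up to date, answering the first question can be as simple as summarizing the effort recorded on completed requests. A minimal sketch, using hypothetical log entries:

```python
from statistics import median

# Completed entries from the Active Request Log (hypothetical figures).
completed = [
    {"description": "Monthly churn summary",         "effort_hours": 6,  "days_late": 0},
    {"description": "Ad-hoc pricing analysis",       "effort_hours": 14, "days_late": 2},
    {"description": "Board deck refresh",            "effort_hours": 9,  "days_late": 1},
    {"description": "Regional conversion deep dive", "effort_hours": 22, "days_late": 4},
]

typical_effort = median(r["effort_hours"] for r in completed)
typical_slip   = median(r["days_late"] for r in completed)

print(f"A 'typical' request has taken about {typical_effort} hours of effort")
print(f"and slipped its promised date by about {typical_slip} day(s).")
```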

In a lot of respects, most Data Teams are already carrying out all three of these functions, but often only in people’s heads. By adding a little bit of tracking and daily discipline, the Data Team can significantly improve its effectiveness, and at the same time better meet the needs of its customers.

We’re sure you have perspectives of your own on this subject. If so, please share your thoughts and ideas.



Applying “Purposeful Abandonment” to Big Data

I’ve recently been reading “Inside Drucker’s Brain” by Jeffrey Krames. I’ve read some of Drucker’s hits, but I found this book put his great ideas together in an easy-to-digest format.

One of the Drucker concepts that resonated with me is “purposeful abandonment”. He argues that it’s easy to take on more responsibility, add more products, and support more customers; the hard part is the letting go. By taking a concerted and proactive approach to identifying “what you won’t do anymore”, one creates the space needed to move forward in the areas that matter.

The concept is surprisingly relevant when applied to Data Science. Here’s my take on it:

1) Do you really need all those data fields and metrics?
The thrill of Big Data is having no limits on the number of fields that we have in our datasets. With space being so cheap, and an abundance of distributed computing power, there’s no need to scrutinize the fields that we’re tracking. But, isn’t this just a form of Parkinson’s law in action (i.e. Data expands to fill the space available for storage)? With every data field and metric comes the need to do quality assurance, test for face-validity, and understand the underlying quirks. Letting go of those “nice to have” data fields and metrics allows Data Scientists to better focus on the ones that really matter. Less time checking redundant fields and metrics equals more time for insightful and impactful analyses.


2) Do you really need all those records?
Just like the previous concept, what’s the big deal? Why not analyze all the data records in our data sets, all the time? There are certainly times when we really need the full dataset, but often this stage can wait until the first exploratory analyses have been done. Sadly, some analysts can get stuck in a mindset of always running analyses on the full dataset. And so, they spend lots of time and effort on using Big Data tools, when they could have used good old-fashioned statistical samples to just cut to the chase. Less time running all analyses on all of the data records can equal more time nimbly running exploratory analysis to find the hidden gems you’re looking for.
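As a small illustration of the sampling idea, the sketch below estimates a mean from a 1% simple random sample and compares it with the full-data answer. The data is synthetic and deliberately modest in size; the pattern is the same one you would apply, with a properly sized sample, to a genuinely large table.

```python
import random

random.seed(42)

# Stand-in for a large table of transaction values (synthetic data).
full_dataset = [random.lognormvariate(3.0, 1.0) for _ in range(1_000_000)]

# Exploratory pass: a 1% simple random sample is often enough to spot the story.
sample = random.sample(full_dataset, k=len(full_dataset) // 100)

full_mean = sum(full_dataset) / len(full_dataset)
sample_mean = sum(sample) / len(sample)

print(f"Full-data mean: {full_mean:.2f}")
print(f"1% sample mean: {sample_mean:.2f} (off by {abs(full_mean - sample_mean) / full_mean:.1%})")
```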

New Year’s Resolutions for Data Scientists

As a group, Data Scientists seem like the type of people who would seize any opportunity to improve. So, in the spirit of fun, the following are four tongue-in-cheek resolutions for this year.

1) Gain More Weight
Data Scientists are getting a lot of attention these days, which is great. We need to continue to gain our collective weight as people who help other people make sense of the ever-growing mass of data, translating what the numbers mean into something actionable for non-Data Scientists.


2) Keep Smoking!
Yes, really, keep smoking! The concept of the Data Scientist is smoking hot, and in a self-promotion kind of way, it makes sense to keep this momentum going. So this means doing things like being a good ambassador of Data Scientists as a group, and explaining to people (i.e. your mother, your neighbor, the person on the street) what the heck we do.

3) Learn a New Language … Spanish, SQL, R …
Data Scientists are human too, and so it’s not uncommon for a Data Scientist to get really comfortable with a set of analytical tools – almost too comfortable. This could be the year to broaden your horizons and try something new. Different technologies often have completely different ways of approaching the same problem, and some are better than others depending on the task at hand. Knowing the options can save a lot of time in the long run. The article Top Holiday Gifts for Data Scientists has some good references for books and other resources.
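As a tiny illustration of how different tools attack the same problem, the sketch below computes one group-by average two ways: declaratively in SQL (via Python’s built-in sqlite3 module) and procedurally in plain Python. The table and figures are hypothetical.

```python
import sqlite3

rows = [("West", 120.0), ("West", 80.0), ("East", 200.0)]   # hypothetical data

# Approach 1: declarative SQL (sqlite3 ships with Python).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
print(dict(con.execute("SELECT region, AVG(amount) FROM sales GROUP BY region")))

# Approach 2: the same answer, procedurally, in plain Python.
totals, counts = {}, {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount
    counts[region] = counts.get(region, 0) + 1
print({region: totals[region] / counts[region] for region in totals})
```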

4) Learn How to Make Friends and Influence People
Data Scientists can suffer from being too analytical, too technical and just too darn scientific. The greatest insights in the world don’t matter if they can’t be communicated to people in a way that they can be understood. Data Scientists could often do with a little help in this area. These are two books that I’d recommend for Data Scientists who are looking to improve their game at presenting:

And let’s not forget the “making friends” part. The Data Scientist community is a growing one, and as good friends there’s a lot we can learn from each other.

I’m sure there are more resolutions in store for Data Scientists – please share your suggestions and thoughts.


How to Get Started With Simulation

Many business analysts decide that they want to start using simulation not just because it’s flashy and high-brow, but also out of pure necessity. These business analysts have taken their spreadsheets as far as they can and are at a point where the spreadsheets are becoming unwieldy and ineffective at providing reliable answers to their important business questions.


These analysts often ask “How do I get started with simulation?” Is there a course that one can take? Is there a tutorial? Is there a book? Ultimately what’s the best way to get started? Here are 4 questions that I suggest they consider:

Question 1: Are you sure Simulation is for you?

I believe that most people can learn most things if they are motivated enough, and the same is true of simulation. However, there are some skills that make the learning curve easier:

  • Are you logical and process oriented? The guts of a simulation model are driven by process logic. If you’re able to look at a real-life business process and convert it into a meaningful and clear process flow map, then that’s a good sign.
  • Have you done any programming? There is a lot of “If … then” logic in simulation models, and having experience with programming (including VBA and complex spreadsheet logic) will only work in your favor. Simulation models are almost never programmed correctly the first time, so debugging skills are also very important.
  • Are you good at handling a lot of data? There is a lot of data handling involved in estimating inputs for simulation models, and most simulation models will generate a mass of data, so this is a very important skill for an effective simulation modeler.
  • Are you good at experimentation? Simulation is like a sand box, and experimenting with your model is a key part of developing, calibrating, and validating your models, as well as designing and carrying out scenario analysis.
  • Can you work without perfect information? Simulation models routinely need parameters and factors for which no data is available. The simulation modeler often needs to form credible assumptions as a workaround for incomplete information.

If you can answer “yes” to the above questions, then simulation might be a good tool for you. (A tiny illustrative model is sketched below.)
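To make the process-logic, if-then, and data-handling points concrete, here is a deliberately tiny single-server queue simulation in plain Python, with hypothetical arrival and service rates. Real models are far richer, but the ingredients are the same.

```python
import random

random.seed(1)

# A minimal discrete-event-style simulation of one server (e.g. one triage desk).
# Arrival and service rates are hypothetical placeholders.
mean_interarrival = 6.0   # minutes between arrivals, on average
mean_service      = 5.0   # minutes of service per customer, on average
n_customers       = 10_000

clock = 0.0               # arrival time of the current customer
server_free_at = 0.0      # time the server finishes the previous customer
total_wait = 0.0

for _ in range(n_customers):
    clock += random.expovariate(1.0 / mean_interarrival)   # next arrival
    start = max(clock, server_free_at)                     # wait if the server is busy
    wait = start - clock
    service = random.expovariate(1.0 / mean_service)
    server_free_at = start + service
    total_wait += wait

print(f"Average wait: {total_wait / n_customers:.1f} minutes")
```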

Question 2: Are you just dabbling or are you ready for a deep dive?

Simulation is often described as both an art and a science. Simulation is one of those skills that seems to be better developed through “doing” rather than reading books or taking courses. I’d highly recommend taking courses if you’re convinced that simulation is for you (I taught a simulation course for 5 years at the University of British Columbia). However, what you learn from a course won’t really stick unless you’re able to work on a real simulation project shortly afterwards.

Simulation is one of those skills where it’s difficult to be effective until you’ve been working with it for a while (i.e. your second simulation project will be much better and easier than your first, your third will be better still, and so on). If this is something that you’re just looking to add to your resume, then I wouldn’t bother. People who hire talent for their simulation skills can easily differentiate between a dabbler and an experienced practitioner.

A “deep dive” simulation project allows a new simulation modeler to really understand what they can and can’t do with a simulation model, the effort involved with various model elements, and the associated value that those elements add to the final conclusions. New simulation modelers sometimes learn through their first intense simulation project that simulation isn’t really for them.


Question 3: Do you have a “Simulation Worthy” problem?

Simulation is an invaluable tool if it’s applied to an important problem that cannot be solved using traditional tools. But if you could effectively answer the same question using a spreadsheet, then why wouldn’t you just use a spreadsheet? If the business problem isn’t important enough to justify spending days (sometimes weeks) of effort programming and validating your model, then you could be creating a situation where your organization perceives simulation modeling as high effort for low value.

Ideally, you would be in a situation where there is a business problem that has high value (i.e. the potential to support a million-dollar decision, or a strong potential to reduce risk, or increase efficiency). And, ideally, the problem involves complex inter-relationships between resources, and/or processes – the type of logic that is very hard or impossible to set up in a spreadsheet. And finally, the situation requires a handling of uncertainty and variability in order to fully address the business problem. We would argue that if you don’t have a problem that is “simulation worthy” then it’s best to wait until you do.

Question 4: Do you have a good example to start from?

Simulation is not like other mathematical and theoretical disciplines, in that there is no single “right answer”. There are many different ways of modeling a system, all of which can be valid (provided that the assumptions are disclosed), and it often comes down to a balancing act of model accuracy versus model complexity. Simulation modelers often add more detail and logic into their models in an effort to improve the accuracy of the model, but as they do so, the model typically becomes more complicated, more difficult to debug, more difficult to validate, and more difficult to run scenario analysis on.

When new simulation modelers are getting started it can be difficult to make these decisions. A great way to learn is to partner up with a mentor – ideally someone who has done a few simulation projects where the results actually supported a decision outcome. The INFORMS Simulation Society is a good place to start, and if you can do it, attend the annual Winter Simulation Conference next November.

If you can’t find a mentor, try learning from example models. Our company AnalysisWorks made a simple simulation model of an Emergency Department that is 100% free and available for download. Without any programming, you can interact with this 3D animated model to get a sense of the types of things you can do with a simulation model.





The Science of Data Scientists

The concept of the Data Scientist may very well be the next big thing in the field of analytics. Recently several industry leaders have weighed in on the question “What is a Data Scientist?”, but another way of looking at this is to ask the question “What is the Science of Data Scientists?”

A dictionary definition of science is a “systematic knowledge of the physical or material world gained through observation and experimentation”. So let’s look at the use of science in three things that all Data Scientists need to do in carrying out their basic work:

  1. They transform the data into a format and structure that is conducive to analysis
  2. They carry out some kind of descriptive, interpretative, or predictive analysis
  3. They communicate their results

Using Science in Data Transformation:

Anyone who’s worked with data for a while knows that the data you have available is usually less than perfect. Missing data, inconsistently formatted data, and duplicate data are fairly routine obstacles, and then linking data from different sources is even more challenging. Data Scientists are also often required to work with “secondary data” that has been generated through an operational system or process. The data was originally designed to meet a functional requirement, rather than with the intention of it being analysed in the future. Even if the data is clean and error-free, there is a requirement to reorganize the data into a structure that is conducive to the analysis that needs to be performed.

So, in response, most Data Scientists develop skills in transforming data, and are quite good at it too. They use tools ranging from statistical analysis software to standard database technologies. Where the science comes in is that there is often a lot of experimentation along the way, as the Data Scientist figures out how best to transform the data while introducing little to no error.

Many Data Scientists have learned the hard way that using a scientific method to prove that the data transformation has been done correctly ultimately saves time and reduces rework.
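In practice, “proving the transformation” often comes down to a handful of reconciliation checks run before and after each step. A minimal sketch, assuming a simple list-of-dicts dataset with hypothetical field names and figures:

```python
# Minimal reconciliation checks for a data transformation step.
# Field names and figures are hypothetical.

source = [
    {"order_id": 1, "region": "West", "amount": 120.0},
    {"order_id": 2, "region": "West", "amount":  80.0},
    {"order_id": 3, "region": "East", "amount": 200.0},
]

# Transformation under test: aggregate order amounts by region.
transformed = {}
for row in source:
    transformed[row["region"]] = transformed.get(row["region"], 0.0) + row["amount"]

# Check 1: no records silently dropped or duplicated.
assert len({row["order_id"] for row in source}) == len(source)

# Check 2: control totals are preserved across the transformation.
assert abs(sum(row["amount"] for row in source) - sum(transformed.values())) < 1e-9

# Check 3: every source category survived the transformation.
assert {row["region"] for row in source} == set(transformed.keys())

print("Transformation checks passed:", transformed)
```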

            

Using Science in Performing Analysis:

Here the use of scientific method is more obvious. It is taken as a given that Data Scientists conduct their analysis and modeling systematically, and that the essence of the work involves observation and experimentation. In carrying out the work, often “the proving” is a key component of what the Data Scientist does, so that they know they are drawing the right conclusions.

However, there is a wide range of scientific tools that Data Scientists can use to understand and interpret massive amounts of complex data. Data Scientists are not unlike other skilled experts, and can sometimes be like a carpenter with a hammer who sees every problem as a nail. For example, some Data Scientists are truly exceptional when it comes to logistic regression modeling (making the best guess of a “yes/no” variable), but then are complete novices when it comes to multivariate analysis (such as condensing information captured in 1,000 correlated variables into 10 summary variables). As is often the case with niche skills, it takes a while to really get good at using them effectively, and it’s rare to find Data Scientists who are truly effective in all domains. The scientific connection here is that Data Scientists sometimes have to come to grips with the limits of their own skill set, and have to experiment in new directions to expand their knowledge base.

Using Science in Communicating Results:

This angle is less intuitive, but ultimately what’s the point of doing high-brow analysis, if nobody is able to understand the result, or even worse, if they can’t use the result to support a key decision?

Data Scientists that are in high demand are those that are able to truly understand the business question being asked, and why it’s being asked. Then they communicate their complex findings in a way that the decision-makers can actually do something with the result.

This important skill takes a while to develop, often through experimentation (i.e. what happens when I present it this way?), and then observation (i.e. what did the CFO do with the last findings I sent her?). Even better is when the Data Scientist applies basic market research approaches to their own work, specifically by following up with their clients and/or end-users and discovering how the results could be even more useful. Or, taking a more traditional approach, they can literally post their results with online reporting tools and run analytics to see how often and how deeply their results are being viewed.

The concept of the Data Scientist is still relatively new and will be shaped by those of us who work in and around the industry. Please offer your own comments and feedback, even if you disagree with any of these ideas.

How to Allocate Resources as an Executive Team

Many executive teams feel that they could improve how they make decisions about resource allocation. These are decisions such as “which strategic initiatives should we approve for this year?” or “how much budget should we allocate to marketing versus customer service?” or “how many beds should be allocated to the surgical program, versus the medical program?” And this is at a time when organizations are all trying to do more with less – more sales with fewer sales staff, more shipments at a lower cost per shipment, more strategic initiatives with fewer leaders to push them forward, and so on.

The following are some of the common challenges that executive teams face when they are making decisions about allocating resources as a team:

  • Different people involved in the decision-making process have different goals and objectives. A win for one participant is a loss for another.
  • Decisions are often influenced by personality and emotion, as opposed to based on evidence.
  • Decision-making processes have no feedback loop. As a result, nobody keeps score on the quality of the decisions, and decision-making doesn’t improve over time.
  • Some teams have frameworks that they use to support decision-making, but it’s not working for them because the process, framework, or technology (i.e. spreadsheet, decision-support system) is too complicated to use or too cumbersome to maintain.
  • Even worse is when the team is attempting to use an “off the shelf” solution that doesn’t do a good job of capturing what’s important to them as a team.

At AnalysisWorks we’ve successfully developed and implemented solutions that help teams make resource allocation decisions. These solutions have been developed over years, and we’ve learned a lot of painful lessons along the way. Here are five tips for allocating resources successfully as an executive team:

Tip 1: See resource allocation as a decision-making process …

… as opposed to a one-time event. For example, it may very well be that you only decide which IT projects will be approved once a year; however, a year goes by quickly, and you’ll be right back at the decision-making point soon. See the design of an effective process as an investment for your future, and a means to make the most out of your scarce resources.

Tip 2: Define the common goals that your allocation should be based upon.

This should be something that the entire team can get behind. The financial side is often over-represented in these types of decisions, so it’s important to round out the goals to include non-financial aspects as well. Ideally you should see a connection here to your mission statement, organizational values, this year’s strategies, and your overall strategic plan.

Tip 3: Decide on the rules of the game.

Again, the entire executive team should agree on the rules of the game at the very beginning. The rules should be fair and transparent. The executive team should avoid reverse-engineering the process to justify one-time decisions; instead, the process should be based on agreed-upon principles. Ultimately, if the rules of the game are set up right, they will serve to communicate to the entire team what behaviours will be rewarded. It’s at this stage that the team will need to agree on what objective inputs will feed into the decision-making process.

Tip 4: Have an objective party “keep score”.

Building on the previous tip, it’s important that the rules of the game have accurate measurement, and that the score keeping is fair. This is a situation where an objective group or individual should be at the center of the process, making sure that inputs to the process are accurate and consistent. Even more important, this person or group will have the difficult job of overseeing that the outcomes of the resource allocation are measured. Specifically, they will be instrumental in closing the loop with respect to the question “Did we get the outcomes we were expecting from our resource allocation decisions?”

Tip 5: Improve now, keep it simple, and learn as you go.

Sometimes executive teams try to build the “perfect” decision-making process before they are willing to use it. At the end of the day, making a minor improvement over your current state is still a step in the right direction. For example, if your executive team makes resource allocation decisions based purely on financial information supplemented with qualitative information, then an improvement may be to quantify a non-financial consideration (i.e. scoring “alignment with our three-year strategy” on a High, Medium, Low scale is better than nothing). These processes can grow out of control, so it’s important to keep things as simple as you can get away with. And finally, chances are that along the way the executive team will identify how the resource allocation process can be improved next time. These improvements can and should be incorporated into the next round of decision-making to make the best, simplest process possible. You will know that it’s working if your executive team feels comfortable with the process, and is able to support the resource allocation decisions that are made.
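As one small example of quantifying non-financial considerations, here is a minimal scoring sketch in Python. The criteria, weights, candidate initiatives, and scores are all hypothetical; in practice they would come out of the team’s agreed “rules of the game”.

```python
# Minimal multi-criteria scoring of candidate initiatives.
# Criteria, weights, and scores (1 = Low, 2 = Medium, 3 = High) are hypothetical.

weights = {"financial_return": 0.5, "strategic_alignment": 0.3, "risk_reduction": 0.2}

initiatives = {
    "Upgrade CRM":             {"financial_return": 2, "strategic_alignment": 3, "risk_reduction": 1},
    "New shipping contract":   {"financial_return": 3, "strategic_alignment": 1, "risk_reduction": 2},
    "Data warehouse rebuild":  {"financial_return": 1, "strategic_alignment": 3, "risk_reduction": 3},
}

def total_score(scores):
    # Weighted sum across the agreed criteria.
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in sorted(initiatives.items(), key=lambda kv: total_score(kv[1]), reverse=True):
    print(f"{name:25s} {total_score(scores):.2f}")
```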