Applying “Purposeful Abandonment” to Big Data

I’ve recently been reading “Inside Drucker’s Brain” by Jeffrey Krames. I’ve read some of Drucker’s hits, but I found this book put his great ideas together in an easy-to-digest format.

One of the Drucker concepts that resonated with me is “purposeful abandonment”. He argues that it’s easy to take on more responsibility, launch more products, and support more customers; the hard part is the “letting go”. By taking a concerted and proactive approach to identifying “what you won’t do anymore”, one creates the space needed to move forward in the areas that matter.

The concept is surprisingly relevant when applied to Data Science. Here’s my take on it:

1) Do you really need all those data fields and metrics?
The thrill of Big Data is having no limits on the number of fields that we have in our datasets. With space being so cheap, and an abundance of distributed computing power, there’s seemingly no need to scrutinize the fields that we’re tracking. But isn’t this just a form of Parkinson’s law in action (i.e. data expands to fill the space available for storage)? With every data field and metric comes the need to do quality assurance, test for face validity, and understand the underlying quirks. Letting go of those “nice to have” data fields and metrics allows Data Scientists to better focus on the ones that really matter. Less time checking redundant fields and metrics equals more time for insightful and impactful analyses.
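
As a toy illustration of this idea, the sketch below (field names and numbers are invented, not from any real dataset) flags pairs of metrics so highly correlated that one of them is a candidate for purposeful abandonment:

```python
import random
import statistics

random.seed(1)

# Hypothetical daily metrics: "boxes_shipped" is almost proportional to
# "units_shipped", so it adds QA burden without adding information.
units = [random.gauss(1000, 120) for _ in range(90)]
metrics = {
    "units_shipped": units,
    "boxes_shipped": [u / 24 + random.gauss(0, 1) for u in units],  # near-duplicate
    "returns": [random.gauss(30, 8) for _ in range(90)],            # independent
}

def pearson(x, y):
    """Plain Pearson correlation, no external libraries needed."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Flag metric pairs so correlated that one is a candidate for abandonment.
names = list(metrics)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = pearson(metrics[a], metrics[b])
        if abs(r) > 0.95:
            print(f"{a} vs {b}: r = {r:.3f} (candidate for abandonment)")
```

With the synthetic data above, units_shipped and boxes_shipped come out almost perfectly correlated, flagging one of them for abandonment, while returns does not.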


2) Do you really need all those records?
Just like the previous concept, what’s the big deal? Why not analyze all the data records in our datasets, all the time? There are certainly times when we really need the full dataset, but often this stage can wait until the first exploratory analyses have been done. Sadly, some analysts can get stuck in a mindset of always running analyses on the full dataset. And so they spend lots of time and effort on Big Data tools, when they could have used good old-fashioned statistical sampling to cut to the chase. Less time running every analysis on all of the data records can equal more time nimbly running exploratory analyses to find the hidden gems you’re looking for.
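
For instance, a plain random sample is often enough for the exploratory pass. The sketch below (synthetic data, illustrative sizes) compares the mean of a million-record dataset with the mean of a 10,000-record sample:

```python
import random
import statistics

random.seed(42)

# Hypothetical "big" dataset: one million order values.
population = [random.expovariate(1 / 50.0) for _ in range(1_000_000)]

# Exploratory pass: a plain random sample often answers the question.
sample = random.sample(population, 10_000)

full_mean = statistics.fmean(population)
sample_mean = statistics.fmean(sample)
print(f"full-data mean:  {full_mean:.2f}")
print(f"10k-sample mean: {sample_mean:.2f}")
```

On most runs the sample mean lands within about 1% of the full-data mean, at a tiny fraction of the effort.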

How to Get Started With Simulation

Many business analysts decide that they want to start using simulation not just because it’s flashy and high-brow, but also out of pure necessity. These business analysts have taken their spreadsheets as far as they can and are at a point where the spreadsheets are becoming unwieldy and ineffective at providing reliable answers to their important business questions.


These analysts often ask “How do I get started with simulation?” Is there a course that one can take? Is there a tutorial? Is there a book? Ultimately what’s the best way to get started? Here are 4 questions that I suggest they consider:

Question 1: Are you sure Simulation is for you?

I have a belief that most people can learn most things if they are motivated enough, and I believe the same is true with simulation. However, there are some skills that make the learning curve easier:

  • Are you logical and process oriented? The guts of a simulation model are process logic. If you’re able to look at a real-life business process and convert it into a meaningful and clear process flow map, that’s a good sign.
  • Have you done any programming? There is a lot of “If … then” logic in simulation models, and having experience with programming (including VBA and complex spreadsheet logic) will only work in your favor. Simulation models are almost never programmed correctly the first time, so debugging skills are also very important.
  • Are you good at handling a lot of data? There is a lot of data handling to estimate inputs for simulation models, and most simulation models will generate a mass of data. This is a very important skill for an effective simulation modeler.
  • Are you good at experimentation? Simulation is like a sand box, and experimenting with your model is a key part of developing, calibrating, and validating your models, as well as designing and carrying out scenario analysis.
  • Can you work without perfect information? Simulation models routinely need parameters and factors for which no data is available. The simulation modeler often needs to form credible assumptions to work around incomplete information.

If you can answer “yes” to the above questions then simulation might be a good tool for you.
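
To make the “process logic” point concrete, here is a minimal sketch of a single-server queue (think of one service desk), using the classic Lindley waiting-time recursion. The arrival and service rates are illustrative assumptions, not from any real system:

```python
import random

random.seed(7)

# A minimal single-server queue: each customer's wait equals the previous
# customer's wait plus that customer's service time, minus the gap between
# arrivals, floored at zero (the Lindley recursion).
def simulate_queue(n_customers, arrival_rate, service_rate):
    waits = []
    wait = 0.0
    prev_service = 0.0
    for _ in range(n_customers):
        gap = random.expovariate(arrival_rate)      # time since last arrival
        wait = max(0.0, wait + prev_service - gap)  # "if the server is busy, then wait"
        waits.append(wait)
        prev_service = random.expovariate(service_rate)
    return waits

waits = simulate_queue(100_000, arrival_rate=0.9, service_rate=1.0)
print(f"average wait at 90% utilization: {sum(waits) / len(waits):.2f}")
```

For an M/M/1 queue at 90% utilization, theory puts the long-run mean wait at rho/(mu − lambda) = 0.9/0.1 = 9, so the simulated average should land in that neighborhood. Even this tiny model shows the “If … then” flavor of simulation logic, and why debugging skills matter.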

Question 2: Are you just dabbling or are you ready for a deep dive?

Simulation is often described as both an art and a science. It is one of those skills that seems to be better developed through “doing” rather than reading books or taking courses. I’d highly recommend taking courses if you’re convinced that simulation is for you (I taught a simulation course for 5 years at the University of British Columbia). However, what you learn from a course won’t really stick unless you’re able to work on a real simulation project shortly afterwards.

Simulation is one of those skills where it’s difficult to be effective until you’ve been working with it for a while (i.e. your second simulation project will be much better and easier than your first, your third will be better still, and so on). If this is something that you’re just looking to add to your resume, I wouldn’t bother. People who hire talent for their simulation skills can easily tell a dabbler from an experienced practitioner.

A “deep dive” simulation project allows a new simulation modeler to really understand what they can and can’t do with a simulation model, the effort involved with various model elements, and the associated value that those elements add to the final conclusions. New simulation modelers sometimes learn through their first intense simulation project that simulation isn’t really for them.


Question 3: Do you have a “Simulation Worthy” problem?

Simulation is an invaluable tool if it’s applied to an important problem that cannot be solved using traditional tools. But, if you could effectively answer the same problem using a spreadsheet, then why wouldn’t you just use a spreadsheet? If the business problem isn’t important enough to justify spending days (sometimes weeks) of effort programming and validating your model, then you could be creating a situation where your organization perceives simulation modeling as high effort for low value.

Ideally, you would be in a situation where there is a business problem that has high value (i.e. the potential to support a million-dollar decision, or a strong potential to reduce risk, or increase efficiency). And, ideally, the problem involves complex inter-relationships between resources, and/or processes – the type of logic that is very hard or impossible to set up in a spreadsheet. And finally, the situation requires a handling of uncertainty and variability in order to fully address the business problem. We would argue that if you don’t have a problem that is “simulation worthy” then it’s best to wait until you do.
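
One way to see why “handling of uncertainty and variability” matters: the toy model below (all numbers invented for illustration) compares a single-point “average inputs” spreadsheet answer with a Monte Carlo view of a capacity-limited profit. Because the capacity cap only cuts one way, plugging in average demand overstates the average outcome:

```python
import random
import statistics

random.seed(3)

# Illustrative numbers only: profit = min(demand, capacity) * margin - fixed cost.
CAPACITY, MARGIN, FIXED = 1000, 100, 60_000

def profit(demand):
    return min(demand, CAPACITY) * MARGIN - FIXED

avg_demand = 1000
spreadsheet_answer = profit(avg_demand)  # the single-point "spreadsheet" view

# Monte Carlo view: demand varies around the same average.
demands = [random.gauss(avg_demand, 250) for _ in range(100_000)]
simulated = [profit(d) for d in demands]

print(f"spreadsheet (average inputs): {spreadsheet_answer:,.0f}")
print(f"simulation  (mean outcome):   {statistics.fmean(simulated):,.0f}")
print(f"chance of losing money:       {sum(p < 0 for p in simulated) / len(simulated):.1%}")
```

The spreadsheet answer here is 40,000, while the simulated mean comes out meaningfully lower, and the simulation also surfaces a downside risk that the single-point answer hides entirely.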

Question 4: Do you have a good example to start from?

Simulation is not like other mathematical and theoretical disciplines, in that there is no single “right answer”. There are many different ways of modeling a system, all of which can be valid (provided that the assumptions are disclosed), and it often comes down to a balancing act of model accuracy versus model complexity. Simulation modelers often add more detail and logic into their models in an effort to improve the accuracy of the model, but as they do so, the model typically becomes more complicated, more difficult to debug, more difficult to validate, and more difficult to run scenario analysis on.

When new simulation modelers are getting started it can be difficult to make these decisions. A great way to learn is to partner up with a mentor – ideally someone who has done a few simulation projects where the results actually supported a decision outcome. The INFORMS Simulation Society is a good place to start, and if you can, attend the annual Winter Simulation Conference next November.

If you can’t find a mentor, try learning from example models. Our company, AnalysisWorks, made a simple simulation model of an Emergency Department that is 100% free and available for download. Without any programming, you can interact with this 3D animated model to get a sense of the types of things you can do with a simulation model.





How to Allocate Resources as an Executive Team

Many executive teams feel that they could improve how they make decisions about resource allocation. These are decisions such as “which strategic initiatives should we approve for this year?” or “how much budget should we allocate to marketing versus customer service?” or “how many beds should be allocated to the surgical program versus the medical program?” And this is at a time when organizations are all trying to do more with less – more sales with less sales staff, more shipments at a lower cost per shipment, more strategic initiatives with fewer leaders to push them forward, and so on.

The following are some of the common challenges that executive teams face when they are making decisions about allocating resources as a team:

  • Different people involved in the decision-making process have different goals and objectives. A win for one participant is a loss for another.
  • Decisions are often influenced by personality and emotion, as opposed to based on evidence.
  • Decision-making processes have no feedback loop. As a result, nobody keeps score on the quality of the decisions, and decision-making doesn’t improve over time.

Some teams have frameworks that they use to support decision-making, but it’s not working for them because the process, framework, and technology (i.e. spreadsheet, decision-support system) is too complicated to use, or too cumbersome to maintain.

Even worse is when the team is attempting to use an “off the shelf” solution that doesn’t do a good job of capturing what’s important to them as a team.

At AnalysisWorks we’ve successfully developed and implemented solutions that help teams make resource allocation decisions. These solutions have been developed over years, and we’ve learned a lot of painful lessons along the way. Here are 5 tips for allocating resources as an executive team:

Tip 1: See resource allocation as a decision-making process …

… as opposed to a one-time event. For example, it may very well be that you only decide which IT projects will be approved once a year; however, a year goes by quickly, and you’ll be right back at the decision-making point soon. See the design of an effective process as an investment in your future, and a means to make the most of your scarce resources.

Tip 2: Define the common goals that your allocation should be based upon.

This should be something that the entire team can get behind. The financial side is often over-represented in these types of decisions, so it’s important to round out the goals to include non-financial aspects as well. Ideally you should see a connection here to your mission statement, organizational values, this year’s strategies, and your overall strategic plan.

Tip 3: Decide on the rules of the game.

Again, the entire executive team should agree on the rules of the game at the very beginning. The rules should be fair and transparent. The executive team should avoid reverse-engineering the process to justify one-time decisions; instead, the process should be based on agreed-upon principles. Ultimately, if the rules of the game are set up right, they will serve to communicate to the entire team what behaviours will be rewarded. It’s at this stage that the team will need to agree on what objective inputs will feed into the decision-making process.

Tip 4: Have an objective party “keep score”.

Building on the previous tip, it’s important that the rules of the game have accurate measurement, and that the score keeping is fair. An objective group or individual should be at the center of the process, making sure that inputs to the process are accurate and consistent. Even more important, this person or group will have the difficult job of overseeing that the outcomes of the resource allocation are measured. Specifically, they will be instrumental in closing the loop on the question “Did we get the outcomes we were expecting from our resource allocation decisions?”

Tip 5: Improve now, keep it simple, and learn as you go.

Sometimes executive teams try to build the “perfect” decision-making process before they are willing to use it. At the end of the day, a minor improvement over your current state is still a step in the right direction. For example, if your executive team makes resource allocation decisions based purely on financial information supplemented with qualitative information, then an improvement may be to quantify a non-financial consideration (i.e. scoring “alignment with our three-year strategy” on a High, Medium, Low scale is better than nothing). These processes can grow out of control, so keep things as simple as you can get away with. And finally, chances are that along the way the executive team will identify how the resource allocation process can be improved next time. These improvements can and should be incorporated into the next round of decision-making. You will know that it’s working if your executive team feels comfortable with the process and is able to support the resource allocation decisions that are made.

What Every Executive Should Know About Simulation

Simulation modeling is an emerging management tool to support big decisions involving complex operations. The technology has been around for decades, and in recent years it has become increasingly easy to apply to almost any operation, such as patient flow in an Emergency Department, bits of data through a telecommunications network, or a global supply chain.

Just to be clear, the kinds of simulations we’re talking about here are of operations and/or system flows, not flight simulations or video games. 


Simulation is not for every organization, but it can be a valuable tool to have in your arsenal. Unfortunately, though, most executives aren’t even aware that it exists. The following tips, meant to introduce the topic of simulation, are based on 20 years of collective experience using simulation in both consulting and academic settings.

Tip 1: Simulation can be a powerful tool to look before you leap

Having a simulation of your operations can be a valuable asset for providing insight into the potential impact of crucial decisions. Simulation models can provide powerful supporting information to aid decision-making and to investigate the associated costs and benefits of multiple decision options.

Simulation models can be very useful for big decisions: situations involving the buying, selling, or reconfiguring of key system resources are ideal for simulation analysis. They allow organizations to analyze multiple potential scenarios and provide insight into nearly any situation involving uncertainty.

Big risk/reward industries such as finance, oil and gas, and aerospace have used it for years, but simulation has now become very accessible. Think of a simulation model as the next step up from your most complex spreadsheet.

Tip 2: Don’t use simulation if you’re not ready

Simulation can be expensive between the costs of staff time and software licensing, so it is important to assess whether or not your organization has the pieces in place to successfully develop a compelling model. It may be premature to jump into building a simulation if you don’t have a good profile of your activity, work flows, and financials.

Typically, organizations that use simulation effectively have already answered as many questions as they can with spreadsheet analysis. It’s better to use simulation to address business questions that can’t be addressed effectively with spreadsheets (i.e. situations involving complex logic, multiple flows of activity, uncertainty), rather than attempting to use simulation on a problem that doesn’t need it.

Tip 3: It’s not what software you use that’s important

Many tools are available on the market, ranging from simple spreadsheet models to complex custom software, but more important than the tool you are using is the team running the tool.

You need good talent to make a good simulation: There is a big difference between a good programmer and an effective simulation analyst.  An effective simulation analyst understands:

  • How to keep the model manageable, flexible and scalable
  • The importance of validating the model so you know the results can be trusted
  • How to come up with good ideas for different scenarios

An important first step is to decide whether or not your organization is interested in getting into simulation. If you’re unsure, you could try working with a university or a consulting firm to try it out. If you are committed to getting into simulation, focus early on recruiting and developing appropriate talent. It takes time to get really good, so don’t train someone if they won’t have an opportunity to use it intensely at least a few times a year.

Tip 4: Think of it as a “sand box” rather than an optimizer

It is important to set expectations right from the start.  Simulation is not a crystal ball that shows you the solution to all your future problems.  It will not tell you what to do in a given situation.

It is, however, a tool for creatively testing different potential ways of running your system. It will tell you that if you run your system in a particular way, then this is what you may expect as a result.

The ability to generate and interpret meaningful operational scenarios is why having an effective simulation analyst can be the make or break of a successful simulation modeling project.

Tip 5: You should consider using simulation if …

So how can you tell if simulation is the right tool for you?  The following are some guidelines to let you know if your situation could benefit from using simulation modeling:

  • You have a big decision to make with high potential for risk or reward.
  • You cannot afford to make mistakes, and it’s worth investing the time and effort to make sure new processes work as well as they can before being implemented.
  • You have a good understanding of your operations and system data.
  • You are able to do your first project with an analyst or team that has a track record of successfully using simulation.
  • You are innovative and ready to utilize a new management tool.
  • You are committed to use the findings and recommendations, even if they tell you what you don’t want to hear.  After all, there is no point in wasting your investment.

 

Escaping Key Performance Indicator Hell


Key Performance Indicators (KPIs) are all the rage these days. Many leaders of industry have instant access to their KPIs, but in their gut they still know that much is missing from the dashboard.

The famous Peter Drucker quote “What gets measured, gets managed” rings true here too. A bad set of KPIs can result in a lot of unproductive busy work, or as I like to call it, Key Performance Indicator Hell.

Hopefully that’s not your situation, but if it is, here are 5 tips that can help you drive performance through KPIs.

Tip 1: Emphasize the word ‘KEY’ in Key Performance Indicators

The reporting tools that make it easy to present all your KPIs in a million different ways have created a flurry of new problems:

  • Too many KPIs:  By the time you’re reporting on more than 10 KPIs, chances are that some will look good and some will look bad, just due to normal variation.  What does it mean to be doing great in 5 KPIs, ok in 3 KPIs, and terrible in 2 KPIs? 
  • Too much overlap in KPIs:  Often dashboard reports show signs of major scope creep as the report writer attempts to please all viewers.  When dashboard reports contain 2 or more KPIs that essentially represent the same underlying concept (i.e. total widget units shipped, total boxes of widgets shipped, total widgets weight shipped, etc) it makes it distracting for viewers to know what KPIs they are supposed to focus on.
  • Too many options:  Just because it’s easy to report KPIs in a variety of ways doesn’t mean that it’s the right thing to do.
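
The “too many KPIs” point is easy to demonstrate with a quick sketch. Below, ten hypothetical KPIs are all genuinely on target, yet random month-to-month noise still paints some of them red in most months (all targets and thresholds are invented for illustration):

```python
import random

random.seed(11)

# Ten hypothetical KPIs, all genuinely on target (true mean = 100). Each month
# we observe them with normal noise and flag anything below 95 as "red".
N_KPIS, MONTHS = 10, 12
red_months = 0
for _ in range(MONTHS):
    observed = [random.gauss(100, 5) for _ in range(N_KPIS)]
    if any(x < 95 for x in observed):
        red_months += 1

print(f"months with at least one 'failing' KPI: {red_months}/{MONTHS}")
# With 10 KPIs each red roughly 16% of the time, most months will show at
# least one red, from noise alone.
```

Cutting from 10 KPIs down to the 3 or 4 that are truly key shrinks this false-alarm rate noticeably, which is exactly the point of emphasizing the word ‘KEY’.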