Is Your Organization Ready for Machine Learning and Predictive Analytics?

Published on feature[23], July 2015.


Machine learning and predictive modeling are powerful drivers of digital change within your organization, and the benefits of successfully implementing them on your platform are beyond question. While your data can unlock valuable insights into the capabilities of your business, the type of thinking and decision making necessary to make these efforts succeed deserves just as much attention.

In working with companies that take on these types of transformative initiatives, we've been able to see first hand what needs to be considered before going down this road.

What is your current digital position? What is the business's agenda?

What is... the Understanding Gap?

"Only 55% of the middle managers we have surveyed can name even one of their company’s top five priorities." (see references 2 & 3 below)

A true understanding of the business's operational definitions is a critical requirement for any organization looking to take on Machine Learning or Predictive Modeling. As the models are constructed and machine learning is applied to the problems outlined, it will be the 'business' that guides the work. If their understanding is off by even a little, it can have a devastating effect on the machine learning/predictive modeling initiative.

Furthermore, these types of initiatives should lead to an evolution in the business rules. As the data gets better and the team's ability to act on it grows, the business rules become more explicit.

It will be critical to make sure the teams likely to be affected by the initiative have a clear understanding of the company's mission, vision, and goals, as they'll be required to help shape the initiative's construction. Most importantly, they need to understand how the business rules used to govern the company's success influence the overall goal. Without this, they'll have trouble building the correct product, and later, difficulty leveraging the new asset.

Evidence-Based Thinking

  • How are your teams currently using data to influence the company's decisions?
  • How many people in the organization know how to read the data?

This point can't be stressed enough. Learning to use data as part of the decision-making process tends to be difficult for many employees, and 'intelligence' rarely factors into the equation. In our experience, an employee's first brush with the 'new platform' tends to create fear and doubt in their ability to understand and leverage the new system. The largest issue (addressed under Data Competence as well) tends to be a lack of true understanding of how to use the data. Statistics is a big part of these initiatives, and without help understanding how to read the data, many employees simply won't try.

Data Competence

One early measure of how far to go with big data/machine learning initiatives is to simply ask yourself, "Are we good with our current data?"

To what extent are your teams evidence-based thinkers?

And from a pure data perspective, do you have a 'single source of truth' within the organization? Additionally, how good are your data teams? These types of initiatives will put a great deal of stress on your existing assets, eat up a lot of space, and require a certain type of expertise to maintain and grow. Without the right technical skill, you'll start with an asset that's very good and watch it slowly degrade over time.

One last point on this: statistics tends to be a big part of many big data/machine learning initiatives. If your teams don't have these skills, they'll have great difficulty 'testing' these products, or validating them for truthiness.

And if the organization has plans to consume this new asset in any of the custom software solutions used and/or managed in-house, those teams will also need to understand how to consume and test these types of applications.
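
To make the 'testing' and validation point a little more concrete, here is a minimal sketch of what validating a predictive model can look like in practice. The scikit-learn stack and the synthetic dataset are assumptions for illustration only; the original article doesn't prescribe any particular toolset.

    # Minimal validation sketch: holdout set plus cross-validation.
    # scikit-learn and the synthetic dataset are assumptions for illustration.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import cross_val_score, train_test_split

    # Stand-in for whatever labeled business data the initiative produces.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=42)

    # Hold out data the model never sees during training.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    model = RandomForestClassifier(random_state=42)

    # Cross-validation gives a range of scores, not a single flattering number.
    cv_scores = cross_val_score(model, X_train, y_train, cv=5)
    print(f"CV accuracy: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")

    # Final check on the untouched holdout set.
    model.fit(X_train, y_train)
    print(f"Holdout accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")

The specific library isn't the point. The point is that someone on the team has to know why a holdout set and cross-validation exist, and what the resulting numbers do and do not prove.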

We must know what questions we seek to answer.

This one always sounds crazy when you say it, but it's probably the most important one on the list. Big data/machine learning/predictive analytics projects take a great deal of time to reach a point of demonstrable value. If you're not clear about the types of answers you expect from the system when it's completed, you'll end up with really good answers to questions no one cares about.

Now, a good thing to point out: you can most certainly go down the path of figuring out what stories your data MAY tell you, as long as the organization accepts that it will likely take a great deal of time and money. Anyone with experience in statistics will tell you that you can MAKE data tell whatever story you want it to, especially to an inexperienced 'evidence thinker'. Starting with the question(s) you're trying to answer can help you make considerable progress on important business issues today, while your team works to understand what your data is capable of, giving you the best of both worlds.

Large research and experimentation component

Because of the amount of research and experimentation big data/machine learning/predictive analytics projects normally require, it's often hard to predict how much time will be needed to make progress.

Also, the expertise required to conduct this type of research and experimentation should not be overlooked. If you DO NOT already have this type of culture, it will often come as a shock to those on staff.

Technology Leadership

And last...

  • Who is going to be the team your employees and users look to when they need assistance with the platform?

  • Does your current technology leadership have the skills and/or experience necessary to champion these types of initiatives?

  • This is always an unfair question, but does your gut tell you that your technology leadership team is ready for this?

We work with some of the brightest software engineers, data experts, and technology leaders on the planet... and big data/machine learning/predictive modeling initiatives are some of the hardest to deliver, even for these folks.

Are you sure your team is ready?


A Prototype

The easiest way to test the waters on this entire topic is to first do an internal prototype, or test, of the machine learning mindset. Pick a problem you believe can be solved, and let your team put together a prototype that attempts to address it.

This should force them to walk down the path of understanding the type of work necessary to be successful, and help you gauge where they stand with the skills needed.

It NEEDS to be a real business problem, and should include the business unit and/or department affected.
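
For a sense of scale, such a prototype can start as small as the sketch below. The churn-style framing, file name, and column names are hypothetical, and the pandas/scikit-learn stack is an assumption for illustration; substitute whatever real problem and data your business unit owns.

    # A minimal prototype sketch: load an existing business dataset, train a
    # simple baseline model, and report how well it predicts the outcome the
    # business cares about. File name, columns, and churn framing are
    # hypothetical placeholders.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # Hypothetical extract prepared together with the affected business unit.
    df = pd.read_csv("customer_history.csv")
    X = df.drop(columns=["churned"])  # hypothetical label column
    y = df["churned"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    # A simple, explainable baseline is enough to start the conversation.
    baseline = LogisticRegression(max_iter=1000)
    baseline.fit(X_train, y_train)

    print(classification_report(y_test, baseline.predict(X_test)))

Whether the resulting numbers are good or bad matters less than the conversation the exercise forces: where the data came from, who can read the report, and what it would take to trust the model.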

Regardless of the outcome, you'll be able to sit down with the team and have a very real conversation about the impact of this type of thinking across the board. It should also allow you to introduce the idea of bringing in experts to help the team be MORE successful on the next run.


Reference

Read the full article, "Is Your Organization Ready for Machine Learning and Predictive Analytics?", at feature[23].
