Predictive Analytics Success For TV & Music

4/9/15

How is predictive analytics being used these days? Let’s look at two examples of how gathering information through social media and other outlets is being used to choose actors for television shows and how music is streamed to you.

“Big data and analytics are empowering the rule breakers” ~ Kevin Spacey

How many of you watch House of Cards on Netflix? How many knew it was green-lighted primarily on the strength of big data and predictive analytics gathered from the viewing habits and patterns of Netflix subscribers? And Kevin Spacey as the lead actor? Yep, selected primarily because of gathered data. Netflix knew that people who watched shows and movies similar to House of Cards also watched a lot of Kevin Spacey movies. They then pitched it to Kevin knowing that people who liked "A" also liked "B". Combine data from "C" and they predicted that "D" (House of Cards) would be a hit even before a pilot was shot.

Netflix has been collecting big data from its users. Every time a subscriber hits the pause button, fast-forwards, re-watches a show, abandons a show entirely after watching the first few minutes, or changes the device used to receive the service, along with MANY other "events", the action is logged, recorded, and analyzed. This allows Netflix to come up with other ideas and concepts for shows.
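A toy sketch of that kind of event logging (the event names, weights, and titles here are all invented for illustration; Netflix's real signals and models are far richer):

```python
from collections import Counter

# Hypothetical sketch: tally viewing "events" per title to spot engagement
# patterns. Event names, weights, and titles are made up for illustration.
events = [
    {"title": "House of Cards", "event": "pause"},
    {"title": "House of Cards", "event": "rewatch"},
    {"title": "Other Show", "event": "abandon"},
    {"title": "House of Cards", "event": "rewatch"},
]

engagement = Counter()
for e in events:
    # Weight re-watches as a strong positive signal, abandons as negative.
    weight = {"rewatch": 2, "pause": 0, "abandon": -3}.get(e["event"], 0)
    engagement[e["title"]] += weight

print(engagement.most_common(1))  # title with the strongest engagement signal
```

Aggregations like this, run over billions of real events, are what let a hypothesis such as "viewers of A also like B" be tested before a pilot is shot.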

Let’s now look at music. Forward-thinking companies like Spotify are using new technology like Nestify: predictive analytics that targets and anticipates specific music depending on the time of day and the activity you are engaged in, without you having to manually select the genre. Take, for example, the song “Hooked on a Feeling” by Blue Swede, which was prominently featured in the movie “Guardians of the Galaxy”. You click the “thumbs up” button on your media device while on the treadmill at 5:30pm. But how many people really want more cover songs from Bjorn Skifs and his 1974 bandmates? Nestify uses predictive analytics to distinguish between your “most listened to” genre of Adult Contemporary (because you have it on as background music while working) and that time of day when you’re typically at the gym, so it can start playing “Gonna Fly Now” from Rocky instead. Couple that with a GPS system in your device that recognizes when you actually arrive at the gym and starts your “pump it up” playlist, and BAM! Your device will know you better than you know yourself.
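The time-and-location logic described above can be caricatured as a tiny rule set (the function, playlist names, and time window are invented; a real service would learn these preferences from listening history rather than hard-code them):

```python
from datetime import time

# Toy rule-based sketch of context-aware playlist selection. A real system
# like the one described would infer these rules from behavioral data.
def pick_playlist(now: time, at_gym: bool,
                  default_genre: str = "Adult Contemporary") -> str:
    if at_gym:
        return "pump it up"            # GPS says you've arrived at the gym
    if time(17, 0) <= now <= time(19, 0):
        return "workout warm-up"       # typical gym window, not there yet
    return default_genre               # background music while working

print(pick_playlist(time(17, 30), at_gym=False))  # workout warm-up
print(pick_playlist(time(9, 0), at_gym=False))    # Adult Contemporary
```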

There is a move towards “customer-centricity” where companies are reorienting their operating models around the customer. Studies have suggested that those with leading demand-analytics capabilities showed higher commercial performance levels compared to those that used more “traditional” methods. So, in summary, if you are not using or at least considering predictive analytics, the speed of technological advancements and how data is used is going to leave you in a position of playing constant catch up.

http://www-03.ibm.com/systems/infrastructure/us/en/big-data-analytics/index.html?lnk=tab&cmp=ibmsocial&ct=stg&cr=sc&cm=h&ccy=us


ESRI Maps for Cognos: Location Analytics Strategy For Your Business

Since the 2000s, the majority of midsize to large companies have used analytics as part of their daily operations and competitive strategy. However, most of the analytical reporting and analysis tools being utilized are still crosstabs and tabular reports. This is a problem because 65% of adults are primarily visual learners and thinkers. If business analytics is about benefiting the bottom line through actionable insight, you are, at most, utilizing 35% of your organization’s analytical ability to cut costs and recognize opportunity. This is where ESRI Maps could come into play for your business.

I don’t know about you but I’m not quite John Nash from the movie “A Beautiful Mind”. Numbers in tabular tables don’t immediately jump off the page and form obvious trends. Visuals allow the user, at a glance, to quickly see patterns, trends, and correlations.

One of the most powerful visuals a company can use is a map. Geography is relevant to nearly every decision customers and organizations make. At least 80% of your BI data has a location component that relates or correlates to other pieces of your organization’s data. Mapping this geographic data allows you to see patterns, trends, and correlations that were otherwise not blatantly obvious. Location Analytics is about more than knowing where your customers live or which store they purchased from, although this is important information. It’s also about customer profiling, minimizing risk, maximizing productivity, taking your analytics program to the next level, and fully utilizing all of your organization’s brain power. For example, an insurance company used ESRI Maps for Cognos, National Weather Service data, and client geographic data to determine which clients were going to be hit by a major hail storm. They sent the customers in that geographic area a text about the storm and reminded them to put their cars in a garage. This 15-minute analysis saved the company millions of dollars in claims.
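At its core, the hail-storm example is a point-in-radius query over client coordinates. A minimal sketch, with invented coordinates, clients, and storm radius:

```python
import math

# Sketch of the hail-storm example: flag clients within a storm radius.
# Coordinates, radius, and client data are invented for illustration.
def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3959  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

clients = [("Acme Co", 27.95, -82.46), ("Far Away LLC", 40.71, -74.01)]
storm_center, storm_radius = (27.96, -82.45), 25  # center and radius in miles

at_risk = [name for name, lat, lon in clients
           if haversine_miles(lat, lon, *storm_center) <= storm_radius]
print(at_risk)  # the clients to text about the storm
```

A mapping tool like ESRI Maps does this visually and at scale, but the underlying geographic join is the same idea.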

Myth Busters

Location Analytics is not a new, cutting-edge concept. Companies using Location Analytics tools like ESRI Maps are strategically not sharing the fact that they are using this technology. Those using location analytics actually love that it is not a widespread practice, and use it to crush their competition. Organizations also love the fact that many of their competitors think it’s complicated and expensive. You don’t need a GIS (geographic information system) department.

ESRI Maps for Cognos is a simple integration into Cognos BI, and implementing it typically takes 3 to 5 days. You can test drive ESRI Maps now by downloading the ESRI for Office 30-day trial. Then take your Cognos BI data, dump it into Excel, and within hours have a quick prototype of how location analytics takes your data and decision-making power to the next level. Although it’s about so much more than analyzing where your customers live, as demonstrated in the previous example, this is where most firms get their feet wet. Most organizations already have their customers’ zip codes, or even prospect geographic data. That’s all you need to get started. A little bit of geographic data and you are ready to rock. Where do your big spenders live? Is there a pattern or trend? With this little bit of knowledge, you can now start targeting your marketing dollars and move to the next level of customer profiling.
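The "where do your big spenders live?" exercise starts as a simple aggregation by zip code. A minimal sketch with made-up orders:

```python
from collections import defaultdict

# Minimal sketch of the "where do your big spenders live?" exercise:
# total spend per zip code from invented (zip, amount) order records.
orders = [
    ("33602", 1200.0), ("33602", 950.0),
    ("10001", 80.0), ("33606", 40.0),
]

spend_by_zip = defaultdict(float)
for zip_code, amount in orders:
    spend_by_zip[zip_code] += amount

# Rank zips by total spend to decide where to target marketing dollars.
ranked = sorted(spend_by_zip.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # ('33602', 2150.0)
```

Plotting `ranked` on a map is where the patterns jump out, but the underlying prep really is this small.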


You can download the ESRI 30-day trial at: http://www.esri.com/software/arcgis/arcgisonline/evaluate. Please also take a minute to view Lodestar Solutions’ User Group video with guest Harold Bergeron of GREBIT Solutions by using this link: http://youtu.be/bAGtCHcWuoY. He explains how Location Analytics can help you be more effective and efficient.

How Fast Is Knowledge Doubling?

The total amount of information in the world is increasing. Simple enough concept to grasp, right?  Obviously, we would then conclude that human knowledge is increasing as well. (*Enter your own joke if you care to do so.) I don’t think you need a degree in rocket science to reason that one out either. However, have you truly considered what the rate of increase really is on each of these and how the future will look concerning analytics and technology?

I was watching a program not too long ago and the host said something that really caught my attention. He said that human knowledge is doubling every 13 months. That’s when I lifted my head out of the bowl of SpaghettiO’s and started really paying attention to what he was saying.

From an article on Industry Tap written by David Schilling, the host went on to say that not only is human knowledge, on average, doubling every 13 months, we are quickly on our way, with the help of the Internet, to the doubling of knowledge every 12 hours. To put it into context: in 1900, human knowledge doubled approximately every 100 years. By the end of 1945, the rate was every 25 years. The “Knowledge Doubling Curve”, as it’s commonly known, was created by Buckminster Fuller in 1982. If you want to take this even further down the proverbial road, combine it with the “singularity” theory of Ray Kurzweil (head of artificial intelligence at Google) and the ideas of Google’s Eric Schmidt and Jared Cohen, discussed in their book “The New Digital Age: Reshaping the Future of People, Nations and Business”, and you have some serious changes to technology, human intelligence, and business coming down the pike whether you like it or not.
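The quoted intervals are easy to turn into growth rates. A quick back-of-the-envelope calculation: if knowledge doubles every `interval` years, it grows by a factor of 2^(1/interval) each year:

```python
# Convert doubling intervals into annual growth factors.
def annual_growth_factor(doubling_interval_years: float) -> float:
    return 2 ** (1 / doubling_interval_years)

for label, interval in [("1900 (100 yr)", 100), ("1945 (25 yr)", 25),
                        ("today (13 mo)", 13 / 12)]:
    print(f"{label}: x{annual_growth_factor(interval):.2f} per year")

# Doubling every 12 hours would mean 2 doublings a day, 730 per year:
print(365 * 2)  # 730
```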

Here are some numbers to put the doubling rate into context, but just keep in mind that whole “doubling every 12 hours” statistic:

  • Human brain: several billion petabytes to index
  • The Internet: 5 million terabytes
  • Amount of the Internet indexed by Google: 200 terabytes, or 0.004% of the total

I’m not saying that the mapping of the human brain is planned for next Thursday, but it should open your eyes to what lies ahead of all of us. Will you be able to keep up with all the technology and information and human knowledge or will you get left in the rearview mirror using an abacus? It’s just some food (maybe SpaghettiO’s) for thought.


Forward Looking Business Intelligence

What is Forward Looking Business Intelligence (FLBI)?

Most of IBM’s Cognos customers may not know this yet, but IBM has made a seismic shift in its BI licensing structure, consolidating many of its older BI roles into five roles. Quite a few of the converted roles will end up gaining new functionality in the new licensing landscape. One of the most important changes is the introduction of a whole new role, the “FLBI role”, or Forward Looking Business Intelligence role. The FLBI role is a combination of Business Intelligence and Predictive Analytics, and it introduces a tool called SPSS Modeler for its Predictive Analytics capabilities.

Role of Predictive Analytics

With Predictive Analytics, Business Intelligence can enable smarter and more strategic decision making. With this capability, the end user can marry historic and current views of the business to help determine what is likely to happen next. Predictive Analytics processes historical data: it learns what has happened in the past, creates models from it, and then examines new sets of data against those models to obtain a prediction, the likelihood of a future event or behavior occurring.
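That learn-then-score loop can be sketched in miniature. This is a deliberately naive frequency model, nothing like the algorithms a real tool such as SPSS Modeler ships with, but the shape is the same: train on historical outcomes, then score new records:

```python
from collections import defaultdict

# Naive sketch of the learn-then-score loop: build a model from historical
# outcomes, then score new records against it. Data is invented.
def train(history):
    """history: list of (feature_value, outcome) pairs, outcome 0 or 1."""
    counts = defaultdict(lambda: [0, 0])  # feature -> [negatives, positives]
    for feature, outcome in history:
        counts[feature][outcome] += 1
    # Probability of a positive outcome for each observed feature value.
    return {f: pos / (neg + pos) for f, (neg, pos) in counts.items()}

def score(model, feature, default=0.5):
    return model.get(feature, default)  # likelihood of the future event

model = train([("month-to-month", 1), ("month-to-month", 1),
               ("month-to-month", 0), ("two-year", 0)])
print(score(model, "month-to-month"))  # ~0.67: likely to churn
```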

What is the FLBI role designed to do?

The FLBI role was designed with the need to understand the past and current state of your business, as well as future outcomes and their drivers, in mind. It gives you the tools to model your existing historical data and provide a guideline for future business behavior. By tying predictive measures to operational processes, FLBI can optimize your business outcomes. The role enables users with different levels of analytical skill to perform self-service data analysis, and it leverages flexible deployment options (desktops, browsers, or mobile devices) and collaboration capabilities across departments and applications.

Going Forward

FLBI provides a very cost effective way to enter the world of Business Intelligence and Predictive Analytics. The introduction of this role, and the utilization of tools within the role, will provide your organization or LOB with insight into your data that will lead to better business outcomes.

Lodestar Solutions can be a valuable asset in providing you with knowledge and guidance required in order to understand and implement Forward Looking Business Intelligence capabilities.

Using Analytics Effectively – 5 Tips

With Analytics becoming more mainstream, we still see many companies shying away from it; perhaps because they have been burned by previous attempts. Here are five tips for any organization looking to turn the corner and start being successful in using their data to make decisions:

  1. Make data and analysis available to employees: Analytics veterans will tell you there is big value in allowing employees to see and use data. The power transfer that occurs when this is permitted inspires insights in employees’ own work, which helps analytics programs grow and evolve; thereby providing even more value and improvement. Be sure to always ask for users’ assistance and to put some of their ideas to work. These insights will help guide the progression of the project, all the while strengthening employees’ commitment to the data analytics strategy.
  2. Return on Investment should be measured early & often: Recording wins from data analytics projects is a bottom-line requirement, but on its own it may not be enough to continue to win budgetary support. Times are tight for organizations of all sizes and in all markets, so when it comes time to verify that dollars were well spent on your data analytics efforts, be sure to include: clear measurements of cost benefit (in dollars), employee time saved (in hours and dollars, including the value of re-assignments), as well as improved outcomes that may not have a direct monetary value (improved employee morale, improved customer satisfaction, etc.).
  3. Hire an expert: Some organizations opt for pre-recorded training modules in place of consulting assistance and learn to do it themselves for the sake of saving a few bucks. Unfortunately, this mindset causes sluggish returns on the analytics investment, and we have seen analytics programs get cut for this very reason. Go ahead and get yourself that consultant and ask for a start-up package with training, or better yet, if you can find an expert, HIRE THEM, and then learn from them! Their wisdom and experience can speed your first few projects into quick wins that will help boost data-use culture within your department and maybe across the organization. These wins will also help you continue to get budget allocations, and when word gets around about your success, you may be surprised to see who else from your company comes knocking at your door.
  4. Get to the point: Top executives’ support is vital to keep analytics projects afloat, and the information and insights analysts develop are vital to top executives’ decision making. “Clear, concise and, most of all, brief” should be your mantra if you are presenting at the executive level. Speak as if you are an executive and be sure you are presenting the product of your analysis, and not the analysis itself. In order to get top leaders to support your data analytics program, they must understand the results of your analysis and how they align with achieving the organization’s overall mission(s). This is actually the largest piece we find lacking in the talent pool for analysts – people with statistical and/or data mining experience AND Business Sense.
  5. Seek to envelop the organization in data-use culture: Eventually, analytics will be standard operating procedure in all areas of your organization. However, this will take time and numerous successful analytics projects to win over management at all levels. When this happens, there will be a need for on-the-job analytics training for employees to meet the new demand from management (again – hire an expert!). The return on investment for analytics projects (particularly predictive analytics) is so high that the cost for training and/or hiring an expert ends up paying for itself through speedier results and measurable ROI.

These are just a few highlights from data analytics veterans who have seen just about everything. If you have been reluctant to get started, or feel as if you have been burned before but are still feeling interested – we are happy to help!

Predictive Analytics in the Cloud


Recent research from www.information-management.com suggests that Predictive Analytics in cloud computing is becoming more and more mainstream. Approximately two thirds of organizations surveyed this year responded that they have seen a positive impact from Predictive Analytics. Furthermore, approximately the same share of respondents answered that they are utilizing Predictive Analytics within the cloud and have current deployments, or implementations of Predictive Analytics scheduled within the next twelve months.

The two biggest factors driving this growth in Predictive Analytics are affordability and the demand for “big data”. On top of the big data issue, there is additional demand for real-time actionable data, or as some refer to it, the “velocity of data”. For more information on Predictive Analytics in the cloud and its growing role in cloud computing, see the article “Predictive Analytics in the Cloud: Research Results” by James Taylor, CEO of Decision Management Solutions.

Have you joined the Lodestar Solutions community yet? If not, why? It takes less than a minute and will ensure that you are up to date on all of Lodestar's new programs and blogs. So I beg you, don't wait another minute: go to lodestarsolutions.com and join today!


On The Bubble Regarding Predictive Analytics?

“On the bubble” – ever heard that phrase? Do you know where it comes from? Legend has it that this American expression may have originated in the Indy 500 racing community. If a driver was about to qualify for the Indy 500 race, and then another driver posted a new best time and pushed him out of the qualifying pack, his bubble would be burst. Another origin may be tied to the bubble in a carpenter’s level, which accurately displays the threshold between verging one way or the other. My favorite interpretation is a pot of water on the stove, with bubbles sticking to the bottom and sides of the pan as it heats, just about to boil.

I often hear, “We want to DO Predictive.” I then inquire back, “What does that mean to you and to your organization?” Most of the time, the person asking can’t say what it is or what it means or how they expect to “do” it, but they know they want in!

So, if you want to “DO” Predictive Analytics, you must first determine your bubble.

  • Who or what is on the bubble?
  • What action can I take to steer them one way or another before or at the moment of decision?

Thus, in the context of Predictive Analytics and Big Data, “on the bubble” means: on the verge of making the cut or not, on the verge of going one way or the other, on the verge of boiling (i.e. a student likely to drop out, a customer likely to churn and move to another provider, a shopper likely to take an offer and expand wallet share, etc.). This means that, by using historical and current data, you can focus your operations, efforts, and funding in the places where they will have the most and best impact.
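Operationally, "on the bubble" just means records whose predicted likelihood sits near the decision threshold, where an intervention is most likely to change the outcome. A minimal sketch with invented scores:

```python
# Find who is "on the bubble": predicted likelihoods near the threshold.
# Names, scores, threshold, and band width are invented for illustration.
scores = {"student_a": 0.92, "student_b": 0.51,
          "student_c": 0.08, "student_d": 0.47}

threshold, band = 0.5, 0.1
on_the_bubble = sorted(
    name for name, p in scores.items() if abs(p - threshold) <= band
)
print(on_the_bubble)  # ['student_b', 'student_d'] — focus outreach here
```

The sure things (0.92) and the lost causes (0.08) need no attention; the budget goes to the two names in the middle.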

There you have it; that is about the smallest and simplest nutshell there is for Predictive Analytics. If you’d like to DO Predictive Analytics, don’t just sit on the bubble – give us a call or contact us at Sales@LodestarSolutions.com. We would be happy to help you get started!

Predictive Analytics Made Easy

While we can’t say “long gone are the glory days of the statistician, data scientist, and analyst” – the development of IBM SPSS’ Analytic Answers is a great first step into Predictive Analytics for specific organizations that are interested, but don’t know where to start. This communique will outline some very basic principles about what Analytic Answers IS and what it is NOT. As mentioned in previous posts, Analytic Answers seeks to answer very specific questions for very specific market segments, utilizing very specific columns or variables in a data set.

First, let’s talk a little bit about the structure: Analytic Answers is a cloud based (SaaS) solution hosted by IBM. Lodestar and IBM will work with you to identify very specific parts of your data that are required in order to get your questions answered. Lodestar will work with you to prepare your historical data, which will be sent to IBM so they can build your model. About 2 weeks later, you should be able to log into a web page and run new data (a CSV file) against the model to get your results. What you receive back is your CSV file with some additional variables at the end: The prediction, the confidence score (how sure we are about the prediction), and a Prescribed Action. To be clear – IBM is very specific about the data that will and will not be accepted. For example, no personally identifiable data or personal health information will be permitted into the system – so your constituents’/customers’ PI and PHI data is still secure within your systems.  Neither Lodestar nor IBM will ever see it or have access to it in any way. If you wish for Lodestar to prepare your data for you, we recommend a non-disclosure agreement be in place before getting started.
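The round trip described above, a CSV in and the same rows back with prediction, confidence, and action columns appended, can be illustrated like this. The scoring rule and numbers below are placeholders; the real model runs in IBM's cloud and is built from your historical data:

```python
import csv, io

# Illustrative sketch of the Analytic Answers round trip: CSV rows go in,
# the same rows come back with three extra columns. The scoring rule and
# confidence value here are stand-ins, not the real hosted model.
incoming = "customer_id,tenure_months\nC1,2\nC2,48\n"

reader = csv.DictReader(io.StringIO(incoming))
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=reader.fieldnames
                        + ["prediction", "confidence", "action"])
writer.writeheader()
for row in reader:
    likely_churn = int(row["tenure_months"]) < 12   # placeholder rule
    row["prediction"] = "churn" if likely_churn else "stay"
    row["confidence"] = "0.80"                      # placeholder score
    row["action"] = "offer incentive" if likely_churn else "none"
    writer.writerow(row)

print(out.getvalue())
```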

Because this is a cloud-hosted solution, it is priced as a monthly subscription, and there is a tier structure within the pricing depending on which blueprint you need. Most of the blueprints will allow you to run your model unlimited times per month, but you may have a cap on how many cases you run per month. Standard overages are permitted, but you’ll be charged for them. You can also add to the standard package (i.e. additional questions/predictions to be made) at an additional cost. The interesting part is that when you purchase software to install and use, it is a capital expenditure; because this is a subscription service, it can instead be documented as an operating expenditure. The price point for Analytic Answers is exceptionally reasonable considering there is no additional expense for training or services, and the service basically does all the dirty work for you. Furthermore, this is an excellent opportunity to foray into Predictive Analytics as a proof of concept with high ROI at minimal cost, and it provides the ability to present your findings to IT and the upper levels to prove that Predictive Analytics should be more important to your organization.

Next let’s take a look at what questions are available to be answered. IBM has developed specific blueprints that answer specific questions for specific markets:

  • Retail: Purchase Analysis & Offer Targeting
    • Insights for creating combination offers and promotions
    • Targeted product recommendations for individual customers
    • Drive additional purchases and increase spending
    • Assists in adapting to new trends and patterns
  • Non-Profit: Donor Contribution Growth
    • Identify likely prospects for donation
    • How to increase individual donor generosity
    • Right message to right donor at right time
  • Insurance: Renewals
    • Identify policy holders at risk of not renewing
    • Select incentives most likely to retain those likely to churn
    • Prioritize proactive outreach
    • Improve policy holder satisfaction
  • Telco: Churn
    • Identify customers at risk of leaving
    • Determine incentives most likely to persuade individual subscribers to stay loyal
    • Decrease customer churn and increase customer loyalty
  • Debt Collection: Improve Outcomes
    • Determine most and least likely to pay (spend time on those most likely)
    • Recommendation on collection treatment most likely to succeed
    • Increase collection while lowering cost
  • Education: Student Success
    • Identify students at risk
    • Select action most likely to get student back on track
    • Identify patterns in dropout rates and drivers of disengagement

We wish we could tell you more about the stack of technology that IBM has sitting in the cloud that helps produce these results – but the list is long and proprietary to IBM. It is an exceptionally robust set of IBM hardware and software and automated processes and systems that have been developed to get you answers in about a minute when you run your query. With that said, I can say with certainty that the predictive outcomes are produced with IBM SPSS Modeler. If you decide that after your subscription, you would like to move operations in-house and work with Modeler, IBM will provide you with the model built for your organization to support a strong starting point. We recommend training and some services for the smoothest possible transition, and largest Return on Investment (ROI). This transition to in-house analytics will allow for ultimate freedom in data exploration and uncovering other hidden patterns and trends in your organizational data to drive efficiencies, cost savings, and business improvements.

In closing, it is interesting to note that Lodestar can also help you implement the Predictive outcomes into your existing BI solutions, so we are able to assist on the front and back ends of a project like this, making it almost entirely seamless and not requiring much time or effort on the part of your organization. IBM will be continuing to develop new blueprints to be released incrementally over the coming weeks/months/years.  If there is one developed for forecasting, we will also be able to assist in adding these fields into TM1 (as Modeler can now accept TM1 data, add a prediction and confidence interval and output the updated file back to TM1 for consumption).  Stay tuned…not sure if or when that day will come.

If you’d like to learn more about Predictive Analytics Answers and SPSS, please give us a call or send us an email at Sales@LodestarSolutions.com. We would love to work with you!

Introduction to Predictive Analytics with IBM SPSS – Part I

Most folks who know about SPSS are aware of it because they took a Statistics class in college and used IBM SPSS Statistics. It is a favorite among professors for teaching due to its easy to learn interface and the fact that the user doesn’t need to learn any coding language.

What’s not widely known is that there’s MUCH MORE than IBM SPSS Statistics in the brand family!

Because there is a lot to cover, I’m going to split this into “Beginner” and “Advanced” – this is aligned with the analytical maturity blog published earlier, so check where you are before you dive in.

The items listed in this “Beginner” post are more geared towards folks with little to no experience or exposure – maybe working in spreadsheets but interested in “getting predictive”.

IBM Analytic Answers: This new little gem is about as easy as predictive analytics gets. There are several very specific blueprints that IBM has developed for very specific needs: Insurance Renewals, Donor Contribution Growth, Prioritized Collections, Retail & Offer Targeting, Student Retention, and Telco Churn prevention. The tag line for this gem is “if you know how to upload a picture to Facebook – you can do predictive analytics!” IBM hosts this offering in a SaaS model – you just log into a webpage and upload a CSV file with variables pre-specified by IBM, and click submit. About a minute later, a file is returned to you with a prediction, a confidence interval, and an action. IBM won’t accept any personally identifiable information so your data is very secure, and it will take them about 2 weeks to build the model for you once you’ve signed up. When you submit a file, it runs against a Modeler (see below) instance in the cloud, and through a Modeler stream that IBM has created specifically for your data. Lodestar (as an IBM Business Partner) can offer services adjacent to this offering and can prep and submit the data for you, as well as provide results in a framework that will snap-fit into your business (i.e. Cognos Dashboard, etc). Pricing will vary depending on which blueprint(s) you need, any additional data you’d like to run, and how many cases you want to run against the model per month. This is a VERY cost effective launch for an organization that is interested in “getting predictive” but doesn’t know where to start.

IBM SPSS Data Collection: This family is the soup to nuts survey tool offered by IBM. Here you can create a survey and deploy it in multiple modes; paper, web, phone, mobile, even manual data entry. It also has a reporting feature available to keep you up to date with your progress. There is a Text Analytics component as well. This allows you to put those open ended responses to work by providing valuable sentiment analysis and allowing you to make that qualitative data into quantitative data to boost the accuracy of any models you might create with the data. When speaking about licensing, it is modular, so you don’t need to buy any capabilities you don’t require and can create a snap-to-fit survey solution.

IBM SPSS Statistics: This is the old familiar spreadsheet-looking package you might remember from college. The licensing methodology is modular, so there is the Base plus fourteen modules to choose from. IBM was kind enough to analyze SPSS Statistics customers’ buying habits and has created discounted bundles that align with various common analysts’ needs: Base, Standard, Professional, and Premium. There are a few stand-alone products in this family: Sample Power and Viz Designer. IBM SPSS Statistics is an excellent way to avoid common errors found in spreadsheets and has many capabilities for data cleansing, organizing, and modeling. In the most recent releases, IBM has added Monte Carlo simulation, which furnishes the decision maker with a range of possible outcomes along with how probable each outcome is (a confidence score). Statistics is friendly with open source R algorithms; they can even be stored in the drop-down menu, so any algorithms that don’t come stock can be added at any time. If you’re a Statistics customer already and notice that your jobs are running slowly (usually due to a very wide or very large data set, in the terabytes or petabytes), there is a server component that can be added. This component pushes the “crunching” to the server for additional power, and when the job is finished, the results are sent back to the client. We can help you design a statistical package that meets your needs, so just let us know.
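As a flavor of what Monte Carlo simulation does, here is a minimal sketch with invented business numbers: simulate many possible outcomes for uncertain inputs and report how probable a target result is.

```python
import random

# Minimal Monte Carlo sketch: repeatedly sample uncertain inputs and
# measure how often a target outcome occurs. All numbers are invented.
random.seed(42)  # fixed seed so the run is repeatable

def simulate_profit():
    units = random.gauss(1000, 150)       # uncertain demand
    price = random.uniform(9.0, 11.0)     # uncertain selling price
    return units * price - 8000           # minus a fixed cost

trials = [simulate_profit() for _ in range(10_000)]
p_profit = sum(t > 0 for t in trials) / len(trials)
print(f"P(profit > 0) ≈ {p_profit:.2f}")
```

A dedicated tool adds distributions, correlation between inputs, and reporting, but the core idea is exactly this loop.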

IBM SPSS Modeler: This data mining workbench is the world-class gold standard. While Statistics is great for proving a hypothesis, Modeler is more focused on hypothesis generation, allowing a user to uncover complex and hidden patterns and trends in large data sets. To be clear, this workbench is not the “black box” technique many claim data mining to be, and while it’s the easiest data mining software on the market, one should most certainly invest in learning how to use it and why one would choose one course of action over another. The workbench, unlike Statistics, begins as a blank canvas: the user pulls in icons from the tray at the bottom and connects them into a stream, or workflow. Modeler is data agnostic and friendly with any ODBC-compliant source. Some capabilities included in Modeler are data cleansing, data merging, auto modeling, auto clustering, social analytics, entity resolution, text analytics, and much, much more. Modeler is available in two “flavors”: Professional (quantitative) and Premium (qualitative, entity resolution, social analytics, etc.). Like Statistics, if you’re a Modeler user with jobs that are taking too long to crunch, there is a server component available. This is a good idea if you have very wide data sets, or are trying to crunch terabytes or petabytes (or more) of data.
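The stream idea, nodes chained into a workflow with each one transforming the data and passing it along, can be mimicked in a few lines (the node names and data below are invented; Modeler's actual nodes are configured graphically, not coded):

```python
from functools import reduce

# Toy analogue of a Modeler "stream": nodes chained into a workflow, each
# transforming the data and handing it to the next node.
def cleanse(rows):
    # Drop records with missing values.
    return [r for r in rows if r.get("value") is not None]

def derive(rows):
    # Add a derived field to each surviving record.
    return [{**r, "doubled": r["value"] * 2} for r in rows]

def run_stream(rows, nodes):
    return reduce(lambda data, node: node(data), nodes, rows)

result = run_stream([{"value": 3}, {"value": None}], [cleanse, derive])
print(result)  # [{'value': 3, 'doubled': 6}]
```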

Again – these are all good starting places. Lodestar can help you identify which, if any, would be useful for your organization. Please contact us to get started on the path to Predictive!

You can now see Part 2 of our SPSS Product Family Overview HERE.

Analytical Maturity – Where Does Your Company Rank?

When you hear someone talking about the Analytical Maturity of an organization – what does that mean? IBM has put together a handy “temperature reader” for this called the Analytics Quotient Quiz. It can give you an idea where your organization lands on the spectrum. Are you a Novice or a Master?  If you’re reading this post, I’m guessing you’re somewhere in between, or at least curious about where to start.

IBM research also indicates that Analytics-driven organizations outperform their peers by up to 3 times. The same research also shows that analytically mature companies have an Analytics Center of Excellence.

“That’s great, IBM, but who the heck has time to build a Center of Excellence? I’ve got a business to run here!”

Some shy away from implementing Analytics because it looms “out there” as a huge implementation that’s expensive and time consuming, especially if your standards for project management adhere strictly to the Waterfall methodology.

Incorporating Analytics into business processes does NOT have to be a huge implementation across the entire organization. In fact, most companies that start small with one project in one department routinely see more ROI in less time than those who opt to do a large enterprise-wide implementation (where it’s easy to overspend on software and man hours, even if you’re getting a heck of a deal). Sometimes that first Analytics project is as simple as automating repetitive reporting processes such as data gathering, merging and cleansing. A “starter” project like this frees up analysts’ time to do real analysis. You know, the kind that uncovers the big ‘ah-ha’ moments…the kind they make TV commercials about?
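A "starter" automation like that can be very small indeed: merge two extracts and cleanse them in one repeatable step instead of by hand each week. The data and field names below are invented:

```python
# Sketch of a "starter" automation project: merge a sales extract with a
# contact extract and cleanse the result in one repeatable step.
sales = {"C1": 500.0, "C2": 120.0}
contacts = {"C1": " alice@example.com ", "C2": "", "C3": "bob@example.com"}

report = []
for cust_id, amount in sorted(sales.items()):
    email = contacts.get(cust_id, "").strip()   # cleanse stray whitespace
    if not email:
        email = "MISSING"                       # flag gaps for follow-up
    report.append({"customer": cust_id, "spend": amount, "email": email})

for row in report:
    print(row)
```

Scripting this once frees the weekly hours previously spent copy-pasting, which is exactly the analyst time the paragraph above wants back.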

As a frame of reference – most “starter” projects are linked to one of these four initiatives:

  • Grow, retain, and satisfy customers (students, citizens, parents, policy holders, etc.)
  • Increase operational efficiency
  • Transform financial processes
  • Manage risk, fraud and regulatory compliance

As a result of success realized with these “starter” projects, other analytics projects become obvious as next steps. It makes sense to then prioritize these projects, and expand the overall analytics program (training, hardware, software, staffing, etc.) as it’s appropriate for other initiatives, business problems, departments, lines of business, etc. This is more aligned with the Agile project management methodology (in comparison to Waterfall methodology), and is generally more palatable from a timing and budgeting perspective. It is much easier to budget annually for controlled growth of analytics across an organization than it is to budget for a huge multi-million dollar enterprise project – for most companies, anyway.

Before you know it, many of these analytics projects are approaching maturity and automated stages, and that organization is stepping right on up through the ranks of Analytical Maturity:

  • Making more data-backed decisions
  • Reducing fraud, waste and abuse
  • Customer Satisfaction on the rise
  • Customer Churn is falling
  • Customer base and wallet share are growing

This concept of Analytical Maturity is really to say that Analytics (like many other things in life) is a journey, and not a destination. The best part is, an organization can leverage the learning curve as they go along, gaining analytical prowess and expertise at each step of the way. More often than not, the folks who act as Change Agents in promoting transformation into the analytics driven age within their organization see resounding success and career growth, alongside the obvious benefits to their organization.

Among all of the marketing around BIG Big Data projects ($$$), it should also be quietly noted that it’s a good idea to start small, perhaps with the intention that more data-backed decision making in more areas of the company over time will grow the organization into an Analytically Mature company.

In your market – can you recommend a good place to start with an analytics project?
Are your analytics projects helping you outperform your peers by 3X? If not – we should talk…
