Don’t be a data scientist whose models fail to get deployed!
An epic example of model deployment failure comes from the Netflix Prize competition. In short, it was an open competition: participants had to build a collaborative filtering algorithm to predict user ratings for films, and the winners received a grand prize of US$1,000,000. In the end, the winning model never got fully deployed.
Netflix is not alone; such dramatic outcomes occur at most companies. Recently, I have been talking with C-suite professionals at many leading analytics companies, and the biggest concern I hear is that about 50% of predictive models never get implemented.
Would you want to build a model that never gets used in the real world? It's like baking a cake that you've tasted and found wonderful, but that no one will ever eat.
In this article, I have listed the most common reasons models fail to get deployed, which you should keep in mind while building them. I've faced such situations many times in my career, so I hope my experience can help you avoid them.
1. High number of false positives : This might seem a bit technical, so it's important to first understand what a false positive is. In a classification model, assume we want to predict whether a customer is a responder (one who responds to a campaign) or a non-responder (one who doesn't).
Imagine you predict that a person X will be a responder, but in reality he does not respond. Person X in this case is a false positive. So how does this matter in the real world? I knew you would ask.
Let's take an example. Suppose you have been given the responsibility of building a retention campaign for 1,000 customers. Out of these 1,000, 100 customers will actually attrite (leave). You create an amazing model with a 4X lift in the top decile (the top one of 10 equally sized score bands).
This means that out of your top 100 predicted customers, 40 will actually attrite. So you recommend that the business target all 100 of these customers with an attractive shopping offer that can stop them from attriting. But here's the challenge.
For every dollar you spend on these customers, only $0.40 works toward stopping attrition. The remaining $0.60 goes to false positive customers who were never in the mood to attrite. This arithmetic can turn the campaign P&L (Profit & Loss) negative, which sometimes makes such models less likely to be implemented.
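The arithmetic above can be sketched in a few lines. The lift and population numbers come from the example; the per-customer offer cost and the framing that every dollar splits cleanly between true and false positives are simplifying assumptions for illustration.

```python
# Campaign economics sketch: how much of the retention spend reaches
# real attriters vs. false positives, under the article's numbers.

def campaign_waste(targeted, true_positives, offer_cost=1.0):
    """Return (spend on real attriters, spend wasted on false positives)."""
    false_positives = targeted - true_positives
    return true_positives * offer_cost, false_positives * offer_cost

population = 1000          # customers in scope
base_attriters = 100       # 10% base attrition rate
lift = 4                   # 4X lift in the top decile
top_decile = population // 10                                  # 100 targeted
hits = int(top_decile * (base_attriters / population) * lift)  # 40 attriters

useful, wasted = campaign_waste(top_decile, hits)
print(useful, wasted)  # → 40.0 60.0 : only 40 cents per dollar is useful
```

Whether the campaign is worth running then depends on whether the value of each retained customer outweighs the 60% of spend that lands on false positives.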
2. Low business understanding of the underlying models : Lately, there has been a rising demand for machine learning algorithms and more complex model-building techniques. In other words, companies are drifting away from traditional modeling techniques.
Using ML techniques undoubtedly yields incremental predictive power, but businesses are still not very receptive to such black-box techniques. In my experience, this leads to a much longer lead time before a predictive strategy gets implemented. And since most business applications are highly dynamic, the model becomes more and more redundant as lead time grows.
3. Not enough understanding of the business problem : Predictive models look good on the resumes of both the analyst and their business counterparts. However, that is not the purpose of the model you build. In some cases, analysts rush into the model-creation phase and cut down the time that should have been allotted to understanding the business problem.
4. Models too complex to implement : Predictive power is the soul of these exercises, but in general it comes at the cost of model complexity. We start bringing in bivariate and trivariate terms to make models stronger, even when these variables make no business sense. Such models might look amazing on paper, and hence they stay on paper and never see the light of the real world.
5. Addressing the effect of a process instead of its root cause : Why do we build models? The most important reason is to find the drivers of a particular response, and drivers are the root causes of the response rate. What happens if you bring in all the effects as input variables and these variables also come out as significant? The model will hardly be of any use, because you are not changing the things that can really bring change.
6. Training population significantly different from scoring population : In many cases, we end up building models on a population that is significantly different from the population we actually score. For instance, suppose you are building a campaign-targeting model and have no previous similar campaign to learn from. In such cases we start with the basic assumption that a population with a high response rate will also have a high incremental response rate. But this assumption is rarely true, and hence the model would hardly be used.
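One common diagnostic for this gap (not mentioned in the article, so treat it as an illustrative addition) is the population stability index (PSI), which compares the score distribution in training against the one actually being scored. A rough sketch:

```python
import numpy as np

# Illustrative sketch: the population stability index (PSI) quantifies
# how far a scoring population has drifted from the training population.
# Rules of thumb often read PSI < 0.1 as stable and > 0.25 as major drift.

def psi(train_scores, scoring_scores, buckets=10):
    """PSI over equal-width buckets of scores assumed to lie in [0, 1]."""
    edges = np.linspace(0.0, 1.0, buckets + 1)
    train_pct = np.histogram(train_scores, bins=edges)[0] / len(train_scores)
    score_pct = np.histogram(scoring_scores, bins=edges)[0] / len(scoring_scores)
    # Floor empty buckets at a small value to avoid log(0)
    train_pct = np.clip(train_pct, 1e-6, None)
    score_pct = np.clip(score_pct, 1e-6, None)
    return float(np.sum((score_pct - train_pct) * np.log(score_pct / train_pct)))

rng = np.random.default_rng(0)
same = psi(rng.uniform(size=5000), rng.uniform(size=5000))      # same population
shifted = psi(rng.uniform(size=5000), rng.beta(2, 5, size=5000))  # drifted one
print(same < 0.1, shifted > 0.25)
```

A high PSI at scoring time is a strong hint that the model's training assumptions no longer hold and the scores should not be trusted blindly.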
7. Unstable models : High-performing models are often highly unstable and do not keep performing at par over time. In such cases, the business might demand frequent model revisions, and with long lead times for model creation, it might start going back to intuition-based strategy.
8. Models dependent on highly dynamic variables : Dynamic variables are often the ones that bring real predictive power to the model. However, such a variable can also be a culprit: it may take values at scoring time that were never seen in the training window.
For instance, the number of working days might come out as a significant variable for predicting a branch's monthly sales. Let's say this variable is highly predictive. But our scoring window contains months with just 10-15 working days, and if your training data has no such month, your model might not be capable of making those predictions accurately.
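A cheap guard against this failure mode can be sketched as follows. The function and variable names here are illustrative, not from the article; the idea is simply to flag scoring rows whose dynamic variables fall outside the range observed in training, so they can be reviewed rather than scored blindly.

```python
# Guardrail sketch: detect scoring values outside the training range
# for a dynamic variable, before trusting the model's prediction.

def out_of_range_flags(train_values, scoring_values):
    """Return True for each scoring value outside the training min/max."""
    lo, hi = min(train_values), max(train_values)
    return [not (lo <= v <= hi) for v in scoring_values]

# Working days per month seen in training vs. in the scoring window
train_working_days = [20, 21, 22, 23, 19, 22, 21, 20, 22, 23, 21, 20]
score_working_days = [21, 12, 22]  # a 12-working-day month never seen in training

print(out_of_range_flags(train_working_days, score_working_days))
# → [False, True, False]
```

Flagged rows can be routed to a fallback rule or a human review instead of an extrapolated model score.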
I believe that if we understand these challenges, we can find better ways to avoid getting entangled in such traps. Knowing them also helps us see what the business is really looking for. I welcome you to add to this list and make our analysis more comprehensive, taking the team a step ahead.
Did you like reading this article? Do share your experience and suggestions in the comments section below.
Tavish Srivastava, co-founder and Chief Strategy Officer of Analytics Vidhya, is an IIT Madras graduate and a passionate data-science professional with 8+ years of diverse experience across markets including the US, India and Singapore; domains including Digital Acquisitions, Customer Servicing and Customer Management; and industries including Retail Banking, Credit Cards and Insurance. He is fascinated by the idea of artificial intelligence inspired by human intelligence, and enjoys every discussion, theory or even movie related to this idea.