Marketing Analytics through Markov Chains

By Ridhima Kumar, Jan 6

Image source: http://setosa.io/ev/markov-chains/

Imagine you are a company selling a fast-moving consumer good in the market.

Let's assume the customer follows the journey below to make the final purchase. These are the states the customer can be in at any point in the purchase journey.

Now, how do we find out which state the customer will be in after 6 months? Markov chains to the rescue…

Let's first understand what a Markov chain is.

Markov Chain:
A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.

· Markov chains are sequential events that are probabilistically related to each other.
· These events are also known as states.
· These states together form what is known as the State Space.
· The probability of the next event (or next state) depends only on the present state and not on the previous states.

This property of a Markov chain is called the Memoryless property. It doesn't care what happened in past events and uses only the present information to predict what happens in the next state.

Markov Chain — States, Probabilities and Transition Matrix
Let's delve a little deeper. A Markov chain provides:

· Information about the current state, and
· Transition probabilities of moving from one state to another

Using these two pieces of information, we can predict the next state. In mathematical terms, the current state is called the Initial State Vector. So, what we get is:

Final State = Initial State Vector * Transition Matrix

A classic example of a Markov chain is predicting the weather.

We have two different weather conditions: Sunny and Rainy.

Let’s assume today it is sunny.

We have the following probabilities:

· Probability of being sunny tomorrow (staying in the same state) given that it is sunny today: 0.9
· Probability of raining tomorrow given that it is sunny today: 0.1
· Probability of being sunny tomorrow given that it is rainy today: 0.5
· Probability of raining tomorrow (staying in the same state) given that it is rainy today: 0.5

Source: Wikipedia

Since today is sunny, the initial state vector (in the order Sunny, Rainy) is:

[1  0]

Transition Matrix =

        Sunny  Rainy
Sunny   0.9    0.1
Rainy   0.5    0.5

Weather on day 2 = [1 0] * Transition Matrix = [0.9  0.1]

Recollect Final State = Initial State Vector * Transition Matrix? The above represents exactly that.

So what is the inference? There is a 90% chance that the weather will be sunny on Day 2 and a 10% chance that it will rain.
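The weather computation above can be sketched in a few lines of NumPy. All the numbers come straight from the example; only the variable names are mine.

```python
# Minimal sketch of the weather Markov chain above, using NumPy.
# State order: [Sunny, Rainy].
import numpy as np

# Transition matrix: row = today's state, column = tomorrow's state
P = np.array([
    [0.9, 0.1],  # Sunny -> Sunny, Sunny -> Rainy
    [0.5, 0.5],  # Rainy -> Sunny, Rainy -> Rainy
])

initial = np.array([1.0, 0.0])  # today it is sunny

day2 = initial @ P
print(day2)  # [0.9 0.1] -> 90% sunny, 10% rain on day 2

# Multiplying again steps the chain one more day forward:
day3 = day2 @ P
print(day3)  # approximately [0.86 0.14]
```

Each extra matrix multiplication advances the forecast by one day, which is exactly the "Final State = Initial State Vector * Transition Matrix" rule applied repeatedly.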

Back to the Problem
Coming back to the problem: we need to know which state the customer is in 6 months after launching the product. We can assume there are 4 states in which the customer can be at any point in time:

1. Awareness
2. Consideration
3. Purchase
4. No Purchase

Given information:

· Total no. of customers = 200,000
· The no. of customers in each state/category
· The transition probabilities of moving from one state to another
· Information about a campaign run during these months (the aim of the campaign is to increase the no. of customers purchasing the product)

The Marketing Analytics objective:

· To get the no. of customers in all 4 states after 6 months
· To assess whether the campaign was effective in increasing the no. of customers purchasing the product

So, let's dive into the math part.

Note: A — Awareness, C — Consideration, P — Purchase, NP — No Purchase

Initial State Vector =

Transition Matrix =

It would be clearer to see the movements among all 4 states diagrammatically:

Final States of Customers = Initial State Vector * Transition Matrix

The final matrix shows that the no. of customers moving to the Purchase state has increased, indicating that the campaign has worked!

Markov chains have many other applications in Marketing Analytics and other fields.
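The same computation can be sketched for the 4-state customer journey. The article's actual initial vector and transition matrix appear only as images, so every number below is an illustrative placeholder; only the mechanics (Final States = Initial State Vector * Transition Matrix, applied once per month) follow the article.

```python
# Illustrative sketch of the 4-state customer computation.
# State order: [Awareness, Consideration, Purchase, No Purchase].
# NOTE: the counts and transition probabilities below are made-up
# placeholders, NOT the article's actual figures (those were images).
import numpy as np

# Assumed split of the 200,000 customers across the 4 states
customers = np.array([100_000, 60_000, 25_000, 15_000], dtype=float)

# Assumed monthly transition matrix; each row sums to 1.
P = np.array([
    [0.60, 0.25, 0.05, 0.10],  # from Awareness
    [0.00, 0.55, 0.30, 0.15],  # from Consideration
    [0.00, 0.00, 0.90, 0.10],  # from Purchase
    [0.05, 0.05, 0.05, 0.85],  # from No Purchase
])

# Apply the transition once per month, 6 times:
state = customers
for _ in range(6):
    state = state @ P

print(state.round())           # customers per state after 6 months
print(f"{state.sum():,.0f}")   # total is preserved: 200,000
```

Because every row of the transition matrix sums to 1, the total number of customers stays at 200,000; the matrix only moves customers between states.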

Stay tuned for more articles on Markov chains! You can also read my other articles on Marketing Mix Modeling posted on Medium: MMM 101, MMM 101 Part 2, MMM Elasticity, MMM — Interaction Effects, MMM — Digital Variables.

For more details (or consulting requirements), you can reach out to me:

Website: https://www.ridhimakumar.com/

Also, do check out my video on MMM.

LinkedIn | Twitter

© Copyright 2018 www.ridhimakumar.com All Rights Reserved.
