This could by far be the most useful data tutorial I have watched and very applicable to all sorts of data sets if you think about your own customer base and what could be considered their recency, frequency and monetization measures. 🙏
Hey!!! Love your comment. I am glad you found this tutorial very useful.
Business + Technical, very Well done!!!
Glad that you like it!
Excellent video!!!! Thanks!!
Glad you liked it
Thank you so much, sir. I have been looking for this analysis for so long, but none of the videos on YouTube has given me that kind of insight..❤
This is an excellent comment. Glad you liked it
Great video! Thank you!
Glad you liked it. Share it with anyone you think it will help.
Thank you so much 🙌. Small doubt: how is the average spending calculated for each segment?
Thanks
Really good insight, and very clever.
Thank you
Hello, this video is really great and I want to thank you for making it. A real time and effort saver.
I do have a question that makes me feel stupid. In my dataset I have 639 customers, and I have done exactly what you did.
I don't understand why PERCENTRANK makes my scoring come out like this: 30-27=10, 26-24=9, 23-21=8, 20-19=7, 18-17=6, 16-14=5, 13=4, 12-10=3, 9-8=2, 7-5=1, 4-1=0. Why is, for example, 14 alone with the score of 4? Is this correct?
Shouldn't there be equal intervals between the scores 30-1, with each customer's individual score then placed in an interval that covers an equal range of numbers?
I hope this makes sense. Again, great video, liked!
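The unequal bins are expected: percentile-based scoring like Excel's PERCENTRANK ranks each customer by position in the *distribution*, so each score bucket ends up with a roughly equal *count* of customers, not an equal-width range of totals. Ties and skew then make some score values cover many totals and others just one. A minimal sketch with hypothetical RFM totals (not the video's exact formula):

```python
# Hypothetical sketch: PERCENTRANK-style scoring bins by distribution
# position, so equal-width score intervals are NOT expected.
import math

totals = [30, 27, 26, 24, 23, 14, 14, 14, 13, 12, 9, 7, 4, 1]  # made-up RFM totals

def percentrank_score(values, x, buckets=10):
    """Fraction of values strictly below x (PERCENTRANK-like), scaled to 0..buckets."""
    below = sum(1 for v in values if v < x)
    pct = below / (len(values) - 1)
    return math.floor(pct * buckets)

for t in sorted(set(totals), reverse=True):
    print(t, percentrank_score(totals, t))
```

Because three customers share a total of 14 here, the totals just below it jump several percentile points, which is exactly the kind of "lonely" score value described in the question.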
You are such a great person, thank you.
So nice of you
Great
very good
Top content as usual.
Glad you think so!
Some dates are not recognized as dates, and I get the value 0 for max invoice for all of them.
I used this method on our sales dataset, but there are always blanks ("-") appearing in the monetary and RFM score columns even though the monetary, frequency, and recency scores are complete. Could you tell me why?
What is happening when I change Invoice Date to Max and all the numbers are 0 in Excel, and how do I fix this?
How did you get the $7200 you concluded with at the end, and the 10-day average?
Thank you 💯🔥❤️🇿🇦. Can I please also ask for this to be done in Python? It would help me in my data analysis journey.
Throughout the whole of YouTube I have seen most channels use a similar dataset. Can you please do the same project using a different dataset?
I noticed from your video that customers with the most recent purchase are awarded zero, which reduces their total RFM score. I believe recency should ideally reward more recent transactions with higher scores; scoring them low doesn't correlate with the term "recency". Is there a better way to do this, or a way around it?
This is addressed, and a solution given, in the video around the 14:00 mark. It's easy enough to score it by its inverse: treat the lowest value as the best and the highest as the worst.
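The inverse-scoring idea in that reply can be sketched like this, with made-up days-since-purchase figures (the customer names and bucket count are assumptions, not from the video): rank the recency values as usual, then flip the percentile so a small day count earns a high score.

```python
# Sketch of inverse recency scoring (hypothetical data): fewer days since
# the last purchase should earn a HIGHER score, so flip the percent rank.
recency_days = {"A": 2, "B": 30, "C": 365}   # days since last purchase

def recency_score(days_map, customer, buckets=10):
    values = sorted(days_map.values())
    below = sum(1 for v in values if v < days_map[customer])
    pct = below / (len(values) - 1)
    return round((1 - pct) * buckets)        # invert: most recent -> top score

for c in recency_days:
    print(c, recency_score(recency_days, c))
```

Customer A (2 days ago) now gets the top score and C (a year ago) the bottom, matching the intuition behind the term "recency".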
Hello, I have another question: how do you get the frequency? I downloaded the dataset linked in the video, but it doesn't have a frequency column. How do I get the frequency?
This is definitely covered in the video. Watch it from the beginning. You missed it
@@absentdata Yeah, I actually missed it. Anyway, thank you for the reply.
Where can I get the data that you used in the video?
It's in the description
@@absentdata One more question: I have a problem getting the recency. Say I have 12/01/2010 to 12/31/2012; using DATEDIF, the result is always a #VALUE! error. What should I do?
Can someone help me? While converting the date to Max in the pivot I'm getting 0 instead of a value, and then changing it to a date gives me a date in 1900.
Try these steps: first convert the date format in the pivot table. It will show 1900, but don't worry; go to Field Settings and select Max, and then it will be OK. It worked for me.
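Both symptoms in this thread (DATEDIF returning #VALUE!, and a pivot Max of 0 or a 1900 date) usually point to the same root cause: the dates are stored as text, so date arithmetic silently fails. A small sketch of the underlying fix, assuming month/day/year text (the format string is an assumption about the example dates above):

```python
# Sketch: a #VALUE! from DATEDIF, or a pivot Max of 0 / a 1900 date,
# usually means the "dates" are text. Parse them into real dates first,
# then the day difference works.
from datetime import datetime

start_text, end_text = "12/01/2010", "12/31/2012"   # assumed month/day/year

start = datetime.strptime(start_text, "%m/%d/%Y")
end = datetime.strptime(end_text, "%m/%d/%Y")
recency_days = (end - start).days
print(recency_days)
```

The Excel equivalent is converting the column to a true date type (e.g. with DATEVALUE or Text to Columns) before using DATEDIF or pivot Max.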
Joe Rogan now has a career as a marketer.
Lol!
Great content but please provide the dataset
It's in the description.
I just want to confirm: recency is worse if there are many (days)?
If I understand your question, the answer is yes, it's worse.