Holy cow. If you assigned this to me as a task, I could probably finish it in a day with some googling, trial and error, etc. But doing this right away during an interview? I would have freaked out completely. This guy just nailed it, and the tips are super helpful. Thanks for sharing!
Shouldn't average_purchase_value consider only those sessions where a purchase was made (rows with purchase_value > 0)? The conversion rate of each marketing_channel is creating a bias in this metric.
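For illustration only, a minimal sketch of that correction, assuming a hypothetical sessions table with marketing_channel and purchase_value columns (table and column names are guesses, not confirmed from the video):

-- Average purchase value per channel, computed two ways:
-- over all sessions (biased by each channel's conversion rate, since zeros drag it down)
-- vs. over purchasing sessions only.
SELECT
    marketing_channel,
    AVG(purchase_value) AS avg_value_all_sessions,
    AVG(CASE WHEN purchase_value > 0 THEN purchase_value END) AS avg_value_purchasing_sessions
FROM sessions
GROUP BY marketing_channel;

The CASE without an ELSE returns NULL for non-purchase rows, and AVG ignores NULLs, so the second column averages over purchases only.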
Ben is built different. No hesitation; it's like he had done this before. What a beast.
This is very helpful. Thank you for sharing.
Very helpful... Please do more Data Engineering mock interviews and sessions related to DE.
Very helpful, thank you.
12:35, can we basically get the min timestamp in the high_value subquery?
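For anyone wondering what that would look like: a rough sketch only, assuming a hypothetical events table and that high_value is a grouped subquery per user_id. None of these names or the threshold are from the video.

-- Hypothetical: compute the earliest timestamp directly inside the high_value subquery,
-- rather than joining back for it in a separate step.
SELECT
    user_id,
    SUM(purchase_value) AS total_value,
    MIN(event_timestamp) AS first_event_ts   -- min timestamp computed in the same pass
FROM events
GROUP BY user_id
HAVING SUM(purchase_value) > 1000;           -- placeholder "high value" cutoff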
Hey, could you share the interview question doc for this episode? (Maybe add a link here.)
Nice session. Very helpful.
Ben is a g
Why is the data in two separate tables? It looks like a one-to-one relationship to me.