Analysing Failed Spark Jobs Using Log Files | Log Files in Databricks | Best Practices in Spark
- Published on Mar 22, 2024
In this video, we will understand how to analyze failed Spark jobs using log files: where to find the log files, and how the Databricks log file utility comes in handy for viewing the logs of failed jobs.
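As a minimal sketch of the pattern discussed (not the instructor's actual code): wrap a Spark action and write any failure's stack trace to a log file under the DBFS FUSE mount so it can be analysed afterwards. The paths and logger name below are illustrative assumptions.

```python
# Hedged sketch: log a failed Spark job's stack trace to a file on DBFS.
# All paths below are hypothetical examples, not from the video.
import logging
import os
import traceback

from pyspark.sql import SparkSession

# On a Databricks driver, DBFS is mounted at /dbfs on the local filesystem.
log_dir = "/dbfs/tmp/job_logs"  # assumed location
os.makedirs(log_dir, exist_ok=True)

logging.basicConfig(
    filename=os.path.join(log_dir, "etl_job.log"),
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
logger = logging.getLogger("etl_job")

spark = SparkSession.builder.getOrCreate()

try:
    # Hypothetical input/output paths, for illustration only.
    df = spark.read.csv("/mnt/raw/orders.csv", header=True)
    df.write.mode("overwrite").parquet("/mnt/curated/orders")
    logger.info("Job succeeded: %d rows written", df.count())
except Exception:
    # Persist the full stack trace so the failure can be analysed
    # later from the log file.
    logger.error("Job failed:\n%s", traceback.format_exc())
    raise
```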
This is one of the most common interview questions when you are applying for data-related roles such as data analyst, data engineer, data scientist, or data manager.
Don't miss out - subscribe to the channel for more content like this.
Social Media Links :
LinkedIn - / bigdatabysumit
Twitter - / bigdatasumit
Instagram - / bigdatabysumit
Website - trendytech.in/?src=youtube&su...
#DataWarehouse #DataLake #DataLakehouse #DataManagement #TechTrends2024 #DataAnalysis #BusinessIntelligence #2024 #interview #interviewquestions #interviewpreparation
The logger library would point to the data lake or DBFS to write the logs, right? (See the sketch after these comments.)
But we see this in the Databricks shell's job tabs, right?
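On the question of where the logs end up: here is a hedged sketch, assuming the cluster's log delivery is configured to write to DBFS, of browsing the driver's log files from a notebook. The cluster-id path below is a placeholder; the real destination comes from the cluster's "Logging" settings.

```python
# Hedged sketch: browse delivered driver logs on DBFS from a notebook.
# Replace <cluster-id> with your cluster's id; path is an assumption.
log_root = "dbfs:/cluster-logs/<cluster-id>/driver"

# dbutils is only available inside Databricks notebooks.
for f in dbutils.fs.ls(log_root):
    print(f.path, f.size)

# Peek at the start of the driver log to look for the failure's stack trace.
print(dbutils.fs.head(log_root + "/log4j-active.log", 4096))
```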