Analysing the Failed Spark Jobs Using Log Files | Log Files in Databricks | Best Practices in Spark

  • Published 22 Mar 2024
    In this video, we will understand how to analyze failed Spark jobs using log files: where to find the log files, and how the Databricks log file utilities come in handy for viewing the logs of failed jobs (a small illustrative snippet is included after the description below).
    This is one of the most common interview questions when you are applying for data-based roles such as data analyst, data engineer, data scientist, or data manager.
    Don't miss out - Subscribe to the channel for more such interesting information
    Social Media Links :
    LinkedIn - / bigdatabysumit
    Twitter - / bigdatasumit
    Instagram - / bigdatabysumit
    Website - trendytech.in/?src=youtube&su...
    #DataWarehouse #DataLake #DataLakehouse #DataManagement #TechTrends2024 #DataAnalysis #BusinessIntelligence #2024 #interview #interviewquestions #interviewpreparation
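
    A minimal sketch of how such log files could be browsed from a Databricks notebook (Python), assuming cluster log delivery is enabled and writes under dbfs:/cluster-logs/; the root path and the <cluster-id> placeholder below are assumptions for illustration, not paths shown in the video:

        # Databricks notebook: dbutils is provided by the runtime, no import needed.
        # ASSUMPTION: cluster log delivery writes under dbfs:/cluster-logs/;
        # <cluster-id> is a placeholder for the failed job's cluster.
        log_root = "dbfs:/cluster-logs/<cluster-id>/driver/"

        # List the driver-side log files (stdout, stderr, log4j) for that cluster.
        for f in dbutils.fs.ls(log_root):
            print(f.name, f.size)

        # Print up to the first 64 KB of the active log4j log to inspect the error.
        print(dbutils.fs.head(log_root + "log4j-active.log", 64 * 1024))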

Comments • 2

  • @NabaKrPaul-ik2oy · 4 months ago · +1

    The logger library would point to the data lake or DBFS to write the logs, right?
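
    A minimal sketch, assuming the standard Python logging module is pointed at the /dbfs/ FUSE mount on the driver (a mounted data-lake path would work the same way); the log path and logger name below are hypothetical, not something the video confirms:

        import logging
        import os

        # ASSUMPTION: hypothetical DBFS target; /dbfs/ exposes DBFS as a local
        # filesystem on the driver, so ordinary file handlers can write there.
        log_dir = "/dbfs/tmp/app-logs"
        os.makedirs(log_dir, exist_ok=True)

        logging.basicConfig(
            filename=os.path.join(log_dir, "my_job.log"),
            level=logging.INFO,
            format="%(asctime)s %(levelname)s %(name)s - %(message)s",
        )

        logger = logging.getLogger("my_spark_job")  # hypothetical logger name
        logger.info("Job started")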

  • @rh334 · 4 months ago

    But we see this in the Databricks shell job tabs, right?