How to set up IntelliJ IDEA for Scala-Spark? Easy Setup of Scala-Spark
- Published on 27 Dec 2024
- In this video: I set up the Scala-Spark environment in IntelliJ IDEA.
Versions used:
Java version: JDK 1.8
Spark version: 3.3.0
Scala version: 2.12.19
with the sbt executor and the Scala plugin
--------------------------------------------------------------------
SBT Configuration for Spark Dependencies:
File: build.sbt
Paste the lines below into the build.sbt file in IntelliJ IDEA:
// Dependencies from mvnrepository
// Spark core
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.0"
// Spark SQL
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0"
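For reference, a complete minimal build.sbt using the versions above could look like the sketch below; the project name "ScalaSparkDemo" is only an illustrative placeholder.

name := "ScalaSparkDemo"        // illustrative project name
version := "0.1.0"
scalaVersion := "2.12.19"       // matches the Scala version listed above

// Spark dependencies from mvnrepository
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0"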
==================================================
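To confirm the setup, a small Scala object like the sketch below (the object name SparkSetupCheck and the app name are illustrative) can be run directly from IntelliJ; it starts a local SparkSession and shows a tiny DataFrame.

import org.apache.spark.sql.SparkSession

object SparkSetupCheck {
  def main(args: Array[String]): Unit = {
    // Run Spark locally inside IntelliJ, using all available cores
    val spark = SparkSession.builder()
      .appName("spark-setup-check")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // A tiny DataFrame, just to confirm spark-core and spark-sql resolved correctly
    val df = Seq(("scala", 2), ("spark", 3)).toDF("name", "major_version")
    df.show()

    spark.stop()
  }
}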
Tags:
Apache Spark tutorial
Spark programming for beginners
Big Data processing with Spark
Spark SQL tutorial
Spark streaming tutorial
Machine learning with Spark
PySpark tutorial
Spark RDD tutorial
Spark DataFrames tutorial
Spark Basic Tutorials
#intellij
#spark
#scala
#pyspark
#bigdata
#programming
#dataanalytics
#dataanalysis
#coding
Good Job bro! It works!!!
Glad to hear it's working for you!
Thank you for this session, it helps 🙌
Welcome
Nice content, please make more content like this, it was very helpful for me #techieMangesh
Sure 👍
Thank you for this session. It was really meaningful. ❤ Please upload tutorial videos on how to write more Spark and Scala code in IntelliJ.
Thank you, I will
th-cam.com/video/k-VUqndd3IU/w-d-xo.html
Watch the difference between var and val in Scala
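In short (a tiny illustrative snippet, not taken from the linked video):

val pi = 3.14       // val: immutable, cannot be reassigned
var count = 0       // var: mutable, can be reassigned
count = count + 1   // OK
// pi = 3.15        // would not compile: reassignment to val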
thank u
You're welcome!
Hi bro, it worked [Windows OS]. But when I wrote out a .csv file, it threw an error saying HADOOP_HOME is not set and asked me to install it: "java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset." How do I resolve this? Should I download winutils for Windows and create a HADOOP_HOME for it? Is this mandatory?
Yes, it is needed on Windows for reading and writing files. Download winutils, place winutils.exe under a bin folder (e.g. C:\hadoop\bin), and set HADOOP_HOME to the parent folder (e.g. HADOOP_HOME=C:\hadoop) in the environment variables. A small sketch follows at the end of this thread.
@TechMAPR Thanks!
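For anyone hitting the same error, here is a sketch of the programmatic alternative on Windows; the path C:\hadoop is an assumption and must be the folder that actually contains bin\winutils.exe on your machine.

import org.apache.spark.sql.SparkSession

object CsvWriteOnWindows {
  def main(args: Array[String]): Unit = {
    // Alternative to setting the HADOOP_HOME environment variable:
    // point hadoop.home.dir at the folder containing bin\winutils.exe
    // (C:\hadoop is an assumed path; adjust it to your setup).
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val spark = SparkSession.builder()
      .appName("csv-write-check")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Write a tiny DataFrame to CSV to confirm winutils is picked up
    Seq((1, "a"), (2, "b")).toDF("id", "value")
      .write.mode("overwrite").csv("out/sample_csv")

    spark.stop()
  }
}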