I would like to define environment variables for Spark on Scala, for use in the terminal and/or IntelliJ. What is the right way to set this up? My hidden home elements are the following:
$ ls -a
./ .Trash/ .ivy2/
../ .android/ .matplotlib/
.CFUserTextEncoding .bash_history .oracle_jre_usage/
.DS_Store .bash_sessions/ .sbt/
I found the answer: it can be done by editing (or creating, if it does not exist) the .bash_profile file in the home directory, as follows:
COMMAND
MacBook-Pro:~ User$ nano .bash_profile
TYPE
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_161.jdk/Contents/Home
export SCALA_HOME=/<SCALA-folder-path>
export HADOOP_HOME=/<HADOOP-folder-path>
export SPARK_HOME=/<Spark-folder-path>
PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$HADOOP_HOME/bin
export PATH
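After saving the file, reload the profile in the current terminal session and check that the variables are picked up (a quick sanity check; it assumes the paths above point to valid installations):

COMMAND
MacBook-Pro:~ User$ source .bash_profile
MacBook-Pro:~ User$ echo $SPARK_HOME
MacBook-Pro:~ User$ spark-shell --version

Note that IntelliJ, when launched as a GUI application, may not inherit variables from .bash_profile; if needed, the same variables can also be set per Run/Debug Configuration via its Environment variables field.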