In this video, we demonstrate how to install Apache Spark on Ubuntu 18.04.
Commands used:
apt update -y
apt install default-jdk -y
wget https://downloads.apache.org/spark/spark-3.0.1/spark-3.0.1-bin-hadoop2.7.tgz
tar -xvzf spark-*
mv spark-3.0.1-bin-hadoop2.7/ /opt/spark
echo "export SPARK_HOME=/opt/spark" >> ~/.profile
echo "export PATH=$PATH:/opt/spark/bin:/opt/spark/sbin" >>~/.profile
echo "export PYSPARK_PYTHON=/usr/bin/python3" >> ~/.profile
source ~/.profile
start-master.sh
ssh -L 8080:localhost:8080 root@ubuntu1804.awesome.com
start-slave.sh spark://ubuntu1804.awesome.com:7077
spark-shell
pyspark
stop-slave.sh
stop-master.sh
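Once start-master.sh and start-slave.sh are running, a quick way to confirm the cluster works is to run a small PySpark job against the standalone master. Below is a minimal sketch, assuming the same example hostname (ubuntu1804.awesome.com) and default port 7077 used in the video; replace the hostname with your own, then run the script with spark-submit or paste it into the pyspark shell.

# Minimal PySpark sanity check for the standalone cluster set up above.
# The master URL assumes the example hostname from the video; substitute your own.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("spark://ubuntu1804.awesome.com:7077")
    .appName("spark-install-check")
    .getOrCreate()
)

# Distribute the numbers 1..100 across the cluster and sum them.
total = spark.sparkContext.parallelize(range(1, 101)).sum()
print("Sum of 1..100 =", total)  # expected output: 5050

spark.stop()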
The related article for this video can be found here: https://www.liquidweb.com/kb/how-to-install-apache-spark-on-ubuntu/
For more information about this and other topics, visit us at https://www.liquidweb.com/kb/, or to learn more about our Managed Cloud product, visit https://www.liquidweb.com/products/cloud-servers/.
Video by: Justin Palmer
Category: Liquid Web
Tags: liquidweb, datacenter, web