spark_installation

  1. Download

Download Spark

wget --no-check-certificate https://www.apache.org/dyn/closer.lua/spark/spark-2.3.1/spark-2.3.1-bin-hadoop2.7.tgz

Download the Java JDK

wget --no-check-certificate -c --header "Cookie: oraclelicense=accept-securebackup-cookie" http://download.oracle.com/otn-pub/java/jdk/8u171-b11/512cd62ec5174c3487ac17c61aaa89e8/jdk-8u171-linux-x64.tar.gz

If extraction fails, the downloaded file is probably incomplete; delete it and download it again.
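
Before extracting, you can sanity-check the archive first. A minimal sketch (the file name matches the JDK download above; gzip -t only verifies the compressed stream, it does not extract anything):

# A truncated or corrupted download fails this test
gzip -t jdk-8u171-linux-x64.tar.gz && echo "archive looks OK"

# Optionally list the contents without extracting
tar -tzf jdk-8u171-linux-x64.tar.gz > /dev/null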

Install the JDK

Extract the downloaded JDK to /usr/java

# Run these from the directory where the JDK tarball was downloaded
mkdir -p /usr/java
tar -zxvf jdk-8u171-linux-x64.tar.gz -C /usr/java
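
The profile entries below assume the JDK ended up in /usr/java/jdk1.8.0_171; a quick check before editing /etc/profile:

ls /usr/java
# Should list jdk1.8.0_171; if the directory name differs, adjust JAVA_HOME accordingly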

Set the environment variables

Append the following to the end of /etc/profile:

#Set java environment
JAVA_HOME=/usr/java/jdk1.8.0_171
JRE_HOME=/usr/java/jdk1.8.0_171/jre
CLASS_PATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib
PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
export JAVA_HOME JRE_HOME CLASS_PATH PATH

Reload the profile so the changes take effect

source /etc/profile
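
To confirm the shell picked up the new variables (this simply echoes what the profile above exported):

echo $JAVA_HOME   # expect /usr/java/jdk1.8.0_171
echo $PATH        # should now include /usr/java/jdk1.8.0_171/bin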

The java command now responds correctly:

[root@localhost ~]# java -version
java version "1.8.0_171"
Java(TM) SE Runtime Environment (build 1.8.0_171-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.171-b11, mixed mode)
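
With the JDK working, the Spark tarball downloaded earlier can be unpacked the same way. A minimal sketch, assuming the archive is in the current directory and /usr/local is an acceptable install location (both paths are assumptions, not part of the original steps):

# Unpack the Spark release (it extracts into spark-2.3.1-bin-hadoop2.7/)
tar -zxvf spark-2.3.1-bin-hadoop2.7.tgz -C /usr/local

# Point SPARK_HOME at the unpacked directory and put its bin/ on PATH
export SPARK_HOME=/usr/local/spark-2.3.1-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin

# If everything is wired up, this prints the Spark 2.3.1 version banner
spark-submit --version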