Hadoop-Scala-Spark environment installation
For ongoing updates, please follow: /blogs/Zorkelvll/Articles/20181/02/1541172452468

This article introduces the installation process for the big data base software stack Hadoop-Scala-Spark, using macOS (with Linux and other environments to follow) as a hands-on example.

I. Background

II. Practice: environment installation (macOS)

Post-addition

(4) Configure the hdfs address and port in core-site.xml: vim /usr/local/Cellar/hadoop/3.0.0/libexec/etc/hadoop/core-site.xml => add the configuration (a sketch of what to add follows this step).

Also create the folders: mkdir /usr/local/Cellar/hadoop/hdfs && mkdir /usr/local/Cellar/hadoop/hdfs/tmp
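For reference, a minimal sketch of what the added core-site.xml configuration typically looks like; the port (8020) and the hadoop.tmp.dir value are assumptions based on the folders created above and may differ in your environment:

<configuration>
  <property>
    <!-- temporary storage directory; assumed to be the folder created above -->
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
  </property>
  <property>
    <!-- hdfs address and port; 8020 is an assumed common default, adjust as needed -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>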

Back up first: cp /usr/local/Cellar/hadoop/3.0.0/libexec/etc/hadoop/mapred-site.xml mapred-site-bak.xml

Then edit: vim /usr/local/Cellar/hadoop/3.0.0/libexec/etc/hadoop/mapred-site.xml => add the configuration.
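A minimal sketch of the mapred-site.xml addition, assuming the common setup of running MapReduce on YARN (which matches the yarn start step below); verify the value against your own configuration:

<configuration>
  <property>
    <!-- run MapReduce jobs on the YARN resource manager (assumed setup) -->
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>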

(7) Format the hdfs file system: hdfs namenode -format

(8) Start and stop the hadoop services:

/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/start-dfs.sh => starts the hdfs daemons: NameNode, DataNode, and SecondaryNameNode. Visit http://localhost:9870 in a browser; note that in hadoop 3.x the web UI port is 9870, not 50070.

/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/start-yarn.sh => starts the yarn service processes: ResourceManager and NodeManager; visit http://localhost:8088 and http://localhost:8042 in the browser.

/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/stop-yarn.sh

/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/stop-dfs.sh
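A quick way to confirm that the daemons listed above actually started is the jps command; the following is a minimal start/verify/stop sketch, with paths assuming the brew layout used throughout this article:

# start hdfs and yarn
/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/start-dfs.sh
/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/start-yarn.sh

# list running Java processes; expect NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
jps

# stop the services in reverse order
/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/stop-yarn.sh
/usr/local/Cellar/hadoop/3.0.0/libexec/sbin/stop-dfs.sh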

Note: for hadoop 3.0.0 installed via brew, the hadoop path that needs to be configured is the one under libexec; otherwise the start-dfs.sh command reports an error along the lines of "Error: cannot execute hdfs-config".
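In practice this usually means pointing the hadoop environment variables at the libexec directory; a minimal sketch for ~/.bash_profile, assuming the brew install path used above:

# point HADOOP_HOME at the libexec directory of the brew install (assumption based on the note above)
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0/libexec
# make the hadoop bin and sbin scripts available on the PATH
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin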

The above is the installation process for hadoop-scala-spark on macOS. This was my first run-through on a Mac and it worked on the first try => I hope it helps you; please follow for updates. If you have any questions or run into problems, please leave a comment at the bottom of the article!

Getting started with Spark: https://spark.apache.org/docs/latest/quick-start.html