Troubleshooting
Problem 1: An Error Is Reported When hdfs namenode -format Is Run
Symptom
The error message "Error: JAVA_HOME is not set and could not be found" is displayed when hdfs namenode -format is run.
Possible Causes
The JAVA_HOME environment variable is incorrectly set.
Procedure
Set the JAVA_HOME variable by referring to Configuring the Compilation Environment.
- Open the file.
vi ./etc/hadoop/hadoop-env.sh
- Press i to enter the insert mode and add the following environment variable to the environment variable file:
export JAVA_HOME=JDK installation path
- Press Esc, type :wq!, and press Enter to save the file and exit.
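The steps above can also be performed non-interactively; a minimal sketch, assuming the Hadoop installation directory is the current directory and the JDK lives under /usr/lib/jvm/java (substitute your real JDK path):

```shell
# Sketch: append JAVA_HOME to hadoop-env.sh without opening an editor.
# /usr/lib/jvm/java is a placeholder; substitute your real JDK path.
HADOOP_ENV=./etc/hadoop/hadoop-env.sh
# Ensure the directory exists (it already does in a real Hadoop tree).
mkdir -p "$(dirname "$HADOOP_ENV")"
echo 'export JAVA_HOME=/usr/lib/jvm/java' >> "$HADOOP_ENV"
# Confirm the line was written.
grep '^export JAVA_HOME=' "$HADOOP_ENV"
```

After this, rerunning hdfs namenode -format should no longer report that JAVA_HOME is missing.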
Problem 2: An Error Is Reported When hdfs namenode -format Is Run
Symptom
The error message "Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password)" is displayed when hdfs namenode -format is run.
Possible Causes
Passwordless SSH login to the local host is not configured correctly.
Procedure
Run the ssh-keygen -t rsa command to regenerate the key pair, and append the public key to ~/.ssh/authorized_keys to re-enable passwordless login.
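A minimal sketch of re-enabling passwordless SSH to the local host (the key paths are the OpenSSH defaults; adjust if your setup differs):

```shell
# Regenerate an RSA key pair only if one does not already exist
# (-N "" sets an empty passphrase so no prompt appears at login).
mkdir -p "$HOME/.ssh"
[ -f "$HOME/.ssh/id_rsa" ] || ssh-keygen -t rsa -N "" -f "$HOME/.ssh/id_rsa"
# Authorize the public key for the local account.
cat "$HOME/.ssh/id_rsa.pub" >> "$HOME/.ssh/authorized_keys"
# SSH refuses keys whose files are too permissive.
chmod 700 "$HOME/.ssh"
chmod 600 "$HOME/.ssh/authorized_keys"
```

You can then verify with ssh localhost; it should log in without prompting for a password.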
Problem 3: "Could not resolve hostname" Displayed When Hadoop Is Started
Symptom
The error message "ssh: Could not resolve hostname xxx" is displayed when Hadoop is started.
Procedure
You can set the Hadoop environment variables to resolve this problem.
- Press Ctrl+C to stop the startup.
- Add the following two lines to ~/.bashrc:
- Open the file.
vi ~/.bashrc
- Press i to enter the insert mode and add the following commands to ~/.bashrc:
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
The procedure is the same as for the JAVA_HOME variable. HADOOP_HOME indicates the Hadoop installation directory.
- Press Esc, type :wq!, and press Enter to save the file and exit.
- Run the following command for the setting to take effect:
source ~/.bashrc
- Run the following command to start Hadoop:
start-dfs.sh
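Before starting Hadoop, you can sanity-check that the variables added to ~/.bashrc are visible in the current shell; a quick sketch (the paths are the ones set in the steps above):

```shell
# Re-state the variables exactly as added to ~/.bashrc above.
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
# Both should print the expected paths before start-dfs.sh is run.
echo "$HADOOP_HOME"
echo "$HADOOP_COMMON_LIB_NATIVE_DIR"
```

If either echo prints an empty line in a fresh terminal, ~/.bashrc was not sourced or the export lines were not saved.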