Running and Verifying Hadoop and Spark

Procedure

  1. Use PuTTY to log in to the server as the root user.
  2. Run the following commands to create a user directory in the HDFS:
    cd path/to/HADOOP
    ./bin/hdfs dfs -mkdir -p /user/hadoop

    The directories and files created by hdfs dfs commands exist only in the HDFS namespace. They cannot be viewed by running the local ls command.
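    For example, the same path behaves differently in the two namespaces (a sketch assuming the directory was created only in HDFS; the local result depends on your filesystem):

      ./bin/hdfs dfs -ls /user/hadoop    # lists the directory in the HDFS namespace
      ls /user/hadoop                    # fails unless the path also exists locally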

  3. Run the following commands to create the input directory and copy the files in etc/hadoop to the input directory:
    hdfs dfs -mkdir -p input
    hdfs dfs -put ./etc/hadoop/*.xml input
  4. Run the following command to view the file list:
    hdfs dfs -ls input
  5. Run the following command to stop Hadoop:
    stop-dfs.sh
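
  To verify that the HDFS daemons have stopped, you can run the jps command (part of the JDK). The NameNode, DataNode, and SecondaryNameNode processes should no longer appear in its output:

    jps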