Standalone
Contents: upgrade software, install common packages, disable the firewall, set the hostname and IP address, edit the hosts file, download the JDK and Hadoop and configure environment variables, set up passwordless SSH login, edit the configuration files, initialize the cluster, edit the Windows hosts file, and test.
1. Upgrade software
yum -y update
2. Install common software
yum -y install gcc gcc-c++ autoconf automake cmake make \
  zlib zlib-devel openssl openssl-devel pcre-devel \
  rsync openssh-server vim man zip unzip net-tools tcpdump lrzsz tar wget
3. Disable the firewall
sed -i 's/SELINUX=enforcing/SELINUX=disabled/g' /etc/selinux/config
setenforce 0
systemctl stop firewalld
systemctl disable firewalld
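Before moving on, it can be worth confirming that SELinux and firewalld are actually off (a minimal check; getenforce and systemctl are standard on CentOS/RHEL-style systems):

```bash
# Should print "Permissive" (or "Disabled" after a reboot)
getenforce
# The firewalld unit should be inactive and disabled
systemctl is-active firewalld
systemctl is-enabled firewalld
```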
4. Set the hostname and IP address
hostnamectl set-hostname hadoop
vim /etc/sysconfig/network-scripts/ifcfg-ens32
Use the following as a reference:
TYPE=Ethernet
PROXY_METHOD=none
BROWSER_ONLY=no
BOOTPROTO=none
DEFROUTE=yes
IPV4_FAILURE_FATAL=no
IPV6INIT=yes
IPV6_AUTOCONF=yes
IPV6_DEFROUTE=yes
IPV6_FAILURE_FATAL=no
IPV6_ADDR_GEN_MODE=eui64
NAME=ens32
UUID=55e7ac28-39d7-4f24-b6bf-0f9fb40b7595
DEVICE=ens32
ONBOOT=yes
IPADDR=192.168.10.24
PREFIX=24
GATEWAY=192.168.10.2
DNS1=192.168.10.2
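After editing the interface file, a quick check that the hostname and address have taken effect can save confusion later (a minimal sketch; ens32 is the interface name used above, and on some setups the interface or the machine must be restarted before the new address shows up):

```bash
hostnamectl status   # hostname should now be "hadoop"
ip addr show ens32   # should list 192.168.10.24/24 once the new config is active
ip route             # default route should point at 192.168.10.2
```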
5. Edit the hosts file
vim /etc/hosts
Add the following:
192.168.10.24 hadoop
Reboot the system:
reboot
6. Download the JDK and Hadoop and configure environment variables
Create a software directory:
mkdir -p /opt/soft
Change into the software directory:
cd /opt/soft
Download the JDK.
Download Hadoop:
wget https://dlcdn.apache.org/hadoop/common/hadoop-3.3.6/hadoop-3.3.6.tar.gz
Extract the JDK and rename the directory:
tar -zxvf jdk-8u411-linux-x64.tar.gz
mv jdk1.8.0_411 jdk-8
Extract Hadoop and rename the directory:
tar -zxvf hadoop-3.3.6.tar.gz
mv hadoop-3.3.6 hadoop-3
Configure environment variables:
vim /etc/profile.d/my_env.sh
Add the following:
export JAVA_HOME=/opt/soft/jdk-8
export HDFS_NAMENODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_ZKFC_USER=root
export HDFS_JOURNALNODE_USER=root
export HADOOP_SHELL_EXECNAME=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
export HADOOP_HOME=/opt/soft/hadoop-3
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
Load the new environment variables:
source /etc/profile
7. Configure passwordless SSH login
Create a local key pair and append the public key to the authorized keys file:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
ssh-copy-id root@hadoop
8. Edit the configuration files
hadoop-env.sh, core-site.xml, hdfs-site.xml, workers, mapred-site.xml, yarn-site.xml
hadoop-env.sh: append the following at the end of the file:
export JAVA_HOME=/opt/soft/jdk-8
export HDFS_NAMENODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_ZKFC_USER=root
export HDFS_JOURNALNODE_USER=root
export HADOOP_SHELL_EXECNAME=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
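A quick sanity check that the environment variables, both installations, and the passwordless login are all working before touching the XML files (a minimal sketch; the paths match the my_env.sh written above):

```bash
source /etc/profile
echo $JAVA_HOME      # expect /opt/soft/jdk-8
echo $HADOOP_HOME    # expect /opt/soft/hadoop-3
java -version        # should report 1.8.0_411
hadoop version       # should report Hadoop 3.3.6
ssh hadoop hostname  # should print "hadoop" without asking for a password
```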
core-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property><name>fs.defaultFS</name><value>hdfs://hadoop:9000</value></property>
  <property><name>hadoop.tmp.dir</name><value>/home/hadoop_data</value></property>
  <property><name>hadoop.http.staticuser.user</name><value>root</value></property>
  <property><name>dfs.permissions.enabled</name><value>false</value></property>
  <property><name>hadoop.proxyuser.root.hosts</name><value>*</value></property>
  <property><name>hadoop.proxyuser.root.groups</name><value>*</value></property>
</configuration>
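fs.defaultFS makes hdfs://hadoop:9000 the default file system, so bare HDFS paths resolve against it; the following pair of commands illustrates the equivalence (only meaningful once the cluster from step 9 is running):

```bash
# These two commands list the same directory once the NameNode is up
hdfs dfs -ls /
hdfs dfs -ls hdfs://hadoop:9000/
```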
hdfs-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property><name>dfs.replication</name><value>1</value></property>
  <property><name>dfs.namenode.secondary.http-address</name><value>hadoop:9868</value></property>
</configuration>
workers (note: in Hadoop 2.x this file is named slaves; in Hadoop 3.x it is named workers)
hadoop
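The workers file simply lists one DataNode/NodeManager host per line; for this single-node setup it holds only hadoop. A multi-node cluster would list each worker hostname instead (the hostnames below are hypothetical, shown only for illustration):

```bash
cat $HADOOP_HOME/etc/hadoop/workers
# hadoop
#
# A hypothetical three-worker cluster would instead contain, one per line:
# hadoop1
# hadoop2
# hadoop3
```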
mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property><name>mapreduce.framework.name</name><value>yarn</value></property>
  <property><name>mapreduce.application.classpath</name><value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value></property>
</configuration>
yarn-site.xml
<?xml version="1.0"?>
<configuration>
  <property><name>yarn.nodemanager.aux-services</name><value>mapreduce_shuffle</value></property>
  <property><name>yarn.nodemanager.env-whitelist</name><value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ,HADOOP_MAPRED_HOME</value></property>
</configuration>
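Before initializing the cluster it can save a restart cycle to confirm the four edited files are still well-formed XML; a minimal sketch using xmllint (part of libxml2, which may need to be installed separately):

```bash
cd $HADOOP_HOME/etc/hadoop
for f in core-site.xml hdfs-site.xml mapred-site.xml yarn-site.xml; do
  xmllint --noout "$f" && echo "$f OK"
done
```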
9. Initialize the cluster
# Format the file system
hdfs namenode -format
# Start the NameNode, SecondaryNameNode, and DataNode
start-dfs.sh
# Check the running Java processes
jps
# Seeing the three processes DataNode, SecondaryNameNode, and NameNode means HDFS started successfully
# Start the ResourceManager and NodeManager daemons
start-yarn.sh
# Seeing all five processes (DataNode, NodeManager, SecondaryNameNode, NameNode, ResourceManager) means startup succeeded
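A small script along these lines can automate the jps check (a sketch only; it greps the jps output for the five daemon names listed above):

```bash
for p in NameNode SecondaryNameNode DataNode ResourceManager NodeManager; do
  if jps | grep -qw "$p"; then
    echo "$p is running"
  else
    echo "$p is NOT running"
  fi
done
```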
Important:
# Before shutting down, stop the services in order
stop-yarn.sh
stop-dfs.sh
# After booting, start the services in order
start-dfs.sh
start-yarn.sh
Or:
# Stop the services before shutting down
stop-all.sh
# Start the services after booting
start-all.sh
# Use jps to check that the processes are healthy after starting or stopping the services before doing anything else
10. Edit the Windows hosts file
Append the following to C:\Windows\System32\drivers\etc\hosts:
192.168.10.24 hadoop
Windows 11 note: editing this file requires elevated permissions. Search the Start menu for cmd, run the Command Prompt as administrator, then change into the C:\Windows\System32\drivers\etc directory:
cd drivers/etc
Open the hosts file:
start hosts
Append the following and save:
192.168.10.24 hadoop
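From the same administrator Command Prompt, a quick way to confirm the new hosts entry resolves before testing in a browser (the address should match the one configured on the server in step 4):

```bash
ping hadoop
# Replies should come from 192.168.10.24
```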
11. Testing
11.1 Access Hadoop from a browser
Browse to: http://hadoop:9870
Browse to: http://hadoop:9868/
Browse to: http://hadoop:8088
11.2 Test HDFS
Create a test file wcdata.txt on the local file system:
vim wcdata.txt
Spark HBaseHive Flink
Storm Hadoop HBase SparkFlinkHBase
StormHBase Hadoop Hive
FlinkHBase Flink
Hive StormHive Flink HadoopHBase
HiveHadoop Spark HBase StormHBase
Hadoop Hive FlinkHBase Flink Hive StormHive
Flink HadoopHBase Hive
... (the same block of words repeats for the remainder of wcdata.txt)
Create the directory /wordcount/input on HDFS:
hdfs dfs -mkdir -p /wordcount/input
Inspect the HDFS directory tree:
hdfs dfs -ls /
hdfs dfs -ls /wordcount
hdfs dfs -ls /wordcount/input
Upload the local test file wcdata.txt to /wordcount/input on HDFS:
hdfs dfs -put wcdata.txt /wordcount/input
Check that the file was uploaded:
hdfs dfs -ls /wordcount/input
hdfs dfs -cat /wordcount/input/wcdata.txt
11.3 Test MapReduce
Compute the value of PI:
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.6.jar pi 10 10
Word count:
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.6.jar wordcount /wordcount/input/wcdata.txt /wordcount/result
hdfs dfs -ls /wordcount/result
hdfs dfs -cat /wordcount/result/part-r-00000
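One caveat worth knowing when re-running the word-count example: the MapReduce job fails if the output directory already exists, so remove it first (a small sketch using the same paths as above):

```bash
# Remove the previous result before re-running wordcount
hdfs dfs -rm -r /wordcount/result
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.6.jar wordcount \
  /wordcount/input/wcdata.txt /wordcount/result
hdfs dfs -cat /wordcount/result/part-r-00000
```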