[Experience] Compiling Hadoop 2.4 on CentOS 7 x86_64

Posted on 2016-12-13 11:39:09
  Tools involved: hadoop-2.4.0-src.tar.gz, Ant, Maven, JDK, GCC, CMake, and OpenSSL.
  Step 1: install the build dependencies (latest versions) via yum:
  yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel
  wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz (source tarball)
  tar -zxvf hadoop-2.4.0-src.tar.gz
  wget http://apache.fayea.com/apache-mirror//ant/binaries/apache-ant-1.9.4-bin.tar.gz
  tar -xvf apache-ant-1.9.4-bin.tar.gz
  wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
  tar -xvf apache-maven-3.0.5-bin.tar.gz
  vi /etc/profile
  export JAVA_HOME=/usr/java/jdk1.7.0_55
  export JAVA_BIN=/usr/java/jdk1.7.0_55/bin
  export ANT_HOME=/home/hadoop/ant
  export MVN_HOME=/home/hadoop/maven
  export FINDBUGS_HOME=/home/hadoop/findbugs-2.0.3
  export PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MVN_HOME/bin:$FINDBUGS_HOME/bin
  Reload the profile to apply the changes:
  source /etc/profile
  Verify that everything is configured correctly:
  ant -version
  mvn -version
  findbugs -version
  If each command prints its version, the setup succeeded.
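The three version checks above can also be wrapped in one pass that reports every missing tool at once. A minimal sketch, assuming the tool names from this post's prerequisites:

```shell
# Report which of the required build tools are missing from PATH.
check_tools() {
  missing=""
  for t in "$@"; do
    # command -v succeeds only when the tool is resolvable on PATH
    command -v "$t" >/dev/null 2>&1 || missing="$missing $t"
  done
  echo "$missing" | sed 's/^ //'   # trim the leading space
}

# Empty output means everything needed for the Hadoop build is present.
check_tools gcc make cmake protoc ant mvn findbugs
```

Run it before starting the build; anything it prints needs installing or adding to PATH first.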
  Install protobuf (logged in as root):
  wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
  tar zxf protobuf-2.5.0.tar.gz
  cd protobuf-2.5.0
  ./configure
  make
  make install
  protoc --version
  Install cmake (logged in as root). Note that building with ./bootstrap requires the source tarball, not the Linux binary package:
  wget http://www.cmake.org/files/v2.8/cmake-2.8.12.2.tar.gz
  tar -zxvf cmake-2.8.12.2.tar.gz
  cd cmake-2.8.12.2
  ./bootstrap
  make
  make install
  cmake --version
  (Alternatively, just run yum install cmake.)


Compiling Hadoop

  From the extracted hadoop-2.4.0-src directory, run:
  mvn package -DskipTests -Pdist,native -Dtar
  Maven now downloads every dependency package and plugin, so this takes a while...
  When compilation finishes, check whether the native libraries were built successfully:
  [iyunv@master hadoop-2.4.1-src]# cd hadoop-dist/target/hadoop-2.4.1/lib/native/
[iyunv@master native]# file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=0xba68c7f46259525c3aae4ebd99e1faf3b6c7e7a6, not stripped
  This output means the build succeeded.
  Alternatively, output like the following also indicates success:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [5.731s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [4.215s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.122s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.548s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.271s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.020s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [7.431s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.517s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.727s]
[INFO] Apache Hadoop Common .............................. SUCCESS [2:53.800s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [16.696s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.042s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [6:07.368s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [48.810s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [20.154s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [9.709s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.049s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.138s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:00.295s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:00.256s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.076s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [21.974s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [28.986s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [6.791s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [13.558s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [28.431s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [2.644s]
[INFO] hadoop-yarn-client ................................ SUCCESS [12.729s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.102s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.878s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.103s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.055s]
[INFO] hadoop-yarn-project ............................... SUCCESS [6.390s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.211s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [39.919s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [34.197s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [5.716s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [18.761s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [17.226s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [7.617s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.211s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.571s]
[INFO] hadoop-mapreduce .................................. SUCCESS [6.483s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [10.180s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.514s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [5.243s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [12.533s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [8.247s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [6.091s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [5.339s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [13.666s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [14.356s]
[INFO] Apache Hadoop Client .............................. SUCCESS [14.354s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.145s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [39.951s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [8.662s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.035s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:45.654s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:27.228s
[INFO] Finished at: Tue Sep 16 23:09:45 HKT 2014
[INFO] Final Memory: 67M/179M
[INFO] ------------------------------------------------------------------------
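The `file` check above can be scripted so it is easy to rerun after each build. A hedged sketch, using the example path from the build tree above (adjust the version and path to your own tree):

```shell
# Return success only if the given file is a 64-bit ELF object,
# mirroring the manual `file libhadoop.so.1.0.0` check.
is_64bit_elf() {
  file "$1" 2>/dev/null | grep -q 'ELF 64-bit'
}

if is_64bit_elf hadoop-dist/target/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0; then
  echo "native library looks good"
else
  echo "native library missing or wrong architecture"
fi
```

If the freshly built distribution's bin directory is on your PATH, `hadoop checknative -a` (available since the 2.x line) gives a fuller per-library report.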

  Error 1
  [INFO] ------------------------------------------------------------------------
  [INFO] BUILD FAILURE
  [INFO] ------------------------------------------------------------------------
  [INFO] Total time: 46.796s
  [INFO] Finished at: Wed Jun 04 13:28:37 CST 2014
  [INFO] Final Memory: 36M/88M
  [INFO] ------------------------------------------------------------------------
  [ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:2.4.0: Failure to find org.apache.commons:commons-compress:jar:1.4.1 in https://repository.apache.org/content/repositories/snapshots was cached in the local repository, resolution will not be reattempted until the update interval of apache.snapshots.https has elapsed or updates are forced -> [Help 1]
  [ERROR]
  [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
  [ERROR] Re-run Maven using the -X switch to enable full debug logging.
  [ERROR]
  [ERROR] For more information about the errors and possible solutions, please read the following articles:
  [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
  [ERROR]
  [ERROR] After correcting the problems, you can resume the build with the command
  [ERROR] mvn <goals> -rf :hadoop-common
  Solution:
  The log says it cannot find "org.apache.commons:commons-compress:jar:1.4.1".
  Copying the package from a local (Windows) Maven repository over to the Linux system resolved it.
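For that manual copy, the jar has to land in the exact directory Maven's local repository expects for the group:artifact:version. A small sketch that derives the path (the GAV is the one from the error message; ~/.m2/repository is Maven's default local repository):

```shell
# Convert a "group:artifact:version" coordinate into the relative
# path Maven uses inside its local repository.
gav_to_path() {
  group=${1%%:*}; rest=${1#*:}
  artifact=${rest%%:*}; version=${rest#*:}
  # dots in the groupId become directory separators
  echo "$(echo "$group" | tr . /)/$artifact/$version/$artifact-$version.jar"
}

gav_to_path org.apache.commons:commons-compress:1.4.1
# → org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
```

Copy the jar to ~/.m2/repository/ under that relative path; alternatively, `mvn install:install-file -Dfile=commons-compress-1.4.1.jar -DgroupId=org.apache.commons -DartifactId=commons-compress -Dversion=1.4.1 -Dpackaging=jar` registers it properly.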
  Error 2
  [INFO] ------------------------------------------------------------------------
  [INFO] BUILD FAILURE
  [INFO] ------------------------------------------------------------------------
  [INFO] Total time: 2:16.693s
  [INFO] Finished at: Wed Jun 04 13:56:31 CST 2014
  [INFO] Final Memory: 48M/239M
  [INFO] ------------------------------------------------------------------------
  [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, 没有那个文件或目录
  [ERROR] around Ant part ...<exec dir="/home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:133 in /home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
  [ERROR] -> [Help 1]
  [ERROR]
  [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
  [ERROR] Re-run Maven using the -X switch to enable full debug logging.
  [ERROR]
  [ERROR] For more information about the errors and possible solutions, please read the following articles:
  [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
  [ERROR]
  [ERROR] After correcting the problems, you can resume the build with the command
  [ERROR] mvn <goals> -rf :hadoop-common
  Solution:
  cmake was not installed; install it as described in section 5.3.1 (preparing the build environment).
  Error 3
  The error message complained about missing files and a failure to create directories, and no matching report turned up online. Based on experience, change the directory permissions to 775 so files and subdirectories can be created, and make sure the build directory has roughly 2.5-4 GB of free space:
  chmod -Rf 775 ./hadoop-2.4.0-src
  main:
  [mkdir] Created dir: /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/test-dir
  [INFO] Executed tasks
  [INFO]
  [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-pipes ---
  [INFO] Executing tasks
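The free-space recommendation above can be checked up front rather than discovered mid-build. A hedged sketch (the 4 GB figure is the upper end of the range this post suggests):

```shell
# Succeed only if the filesystem holding <dir> has at least <required_kb>
# kilobytes available.  Usage: space_ok <dir> <required_kb>
space_ok() {
  avail=$(df -Pk "$1" | awk 'NR==2 {print $4}')
  [ "$avail" -ge "$2" ]
}

if space_ok . $((4 * 1024 * 1024)); then   # 4 GB expressed in KB
  echo "enough space for the build"
else
  echo "free up disk space before building"
fi
```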
  Error 4
  main:
  [mkdir] Created dir: /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native
  [exec] -- The C compiler identification is GNU 4.4.7
  [exec] -- The CXX compiler identification is GNU 4.4.7
  [exec] -- Check for working C compiler: /usr/bin/cc
  [exec] -- Check for working C compiler: /usr/bin/cc -- works
  [exec] -- Detecting C compiler ABI info
  [exec] -- Detecting C compiler ABI info - done
  [exec] -- Check for working CXX compiler: /usr/bin/c++
  [exec] -- Check for working CXX compiler: /usr/bin/c++ -- works
  [exec] -- Detecting CXX compiler ABI info
  [exec] -- Detecting CXX compiler ABI info - done
  [exec] CMake Error at /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
  [exec] Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the
  [exec] system variable OPENSSL_ROOT_DIR (missing: OPENSSL_LIBRARIES
  [exec] OPENSSL_INCLUDE_DIR)
  [exec] Call Stack (most recent call first):
  [exec] /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
  [exec] /usr/local/share/cmake-2.8/Modules/FindOpenSSL.cmake:313 (find_package_handle_standard_args)
  [exec] CMakeLists.txt:20 (find_package)
  [exec]
  [exec]
  [exec] -- Configuring incomplete, errors occurred!
  [exec] See also "/data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native/CMakeFiles/CMakeOutput.log".
  [INFO] ------------------------------------------------------------------------
  [INFO] Reactor Summary:
  [INFO]
  [INFO] Apache Hadoop Main ................................ SUCCESS [13.745s]
  [INFO] Apache Hadoop Project POM ......................... SUCCESS [5.538s]
  [INFO] Apache Hadoop Annotations ......................... SUCCESS [7.296s]
  [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.568s]
  [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [5.858s]
  [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.541s]
  [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [8.337s]
  [INFO] Apache Hadoop Auth ................................ SUCCESS [7.348s]
  [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.926s]
  [INFO] Apache Hadoop Common .............................. SUCCESS [2:35.956s]
  [INFO] Apache Hadoop NFS ................................. SUCCESS [18.680s]
  [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.059s]
  [INFO] Apache Hadoop HDFS ................................ SUCCESS [5:03.525s]
  [INFO] Apache Hadoop HttpFS .............................. SUCCESS [38.335s]
  [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [23.780s]
  [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [8.769s]
  [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.159s]
  [INFO] hadoop-yarn ....................................... SUCCESS [0.134s]
  [INFO] hadoop-yarn-api ................................... SUCCESS [2:07.657s]
  [INFO] hadoop-yarn-common ................................ SUCCESS [1:10.680s]
  [INFO] hadoop-yarn-server ................................ SUCCESS [0.165s]
  [INFO] hadoop-yarn-server-common ......................... SUCCESS [24.174s]
  [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [27.293s]
  [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [5.177s]
  [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [11.399s]
  [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [28.384s]
  [INFO] hadoop-yarn-server-tests .......................... SUCCESS [1.346s]
  [INFO] hadoop-yarn-client ................................ SUCCESS [12.937s]
  [INFO] hadoop-yarn-applications .......................... SUCCESS [0.108s]
  [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [5.303s]
  [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.212s]
  [INFO] hadoop-yarn-site .................................. SUCCESS [0.050s]
  [INFO] hadoop-yarn-project ............................... SUCCESS [8.638s]
  [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.135s]
  [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [43.622s]
  [INFO] hadoop-mapreduce-client-common .................... SUCCESS [36.329s]
  [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [6.058s]
  [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [20.058s]
  [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [16.493s]
  [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [11.685s]
  [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.222s]
  [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.656s]
  [INFO] hadoop-mapreduce .................................. SUCCESS [8.060s]
  [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [8.994s]
  [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.886s]
  [INFO] Apache Hadoop Archives ............................ SUCCESS [6.659s]
  [INFO] Apache Hadoop Rumen ............................... SUCCESS [15.722s]
  [INFO] Apache Hadoop Gridmix ............................. SUCCESS [11.778s]
  [INFO] Apache Hadoop Data Join ........................... SUCCESS [5.953s]
  [INFO] Apache Hadoop Extras .............................. SUCCESS [6.414s]
  [INFO] Apache Hadoop Pipes ............................... FAILURE [3.746s]
  [INFO] Apache Hadoop OpenStack support ................... SKIPPED
  [INFO] Apache Hadoop Client .............................. SKIPPED
  [INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
  [INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
  [INFO] Apache Hadoop Tools Dist .......................... SKIPPED
  [INFO] Apache Hadoop Tools ............................... SKIPPED
  [INFO] Apache Hadoop Distribution ........................ SKIPPED
  [INFO] ------------------------------------------------------------------------
  [INFO] BUILD FAILURE
  [INFO] ------------------------------------------------------------------------
  [INFO] Total time: 19:43.155s
  [INFO] Finished at: Wed Jun 04 17:40:17 CST 2014
  [INFO] Final Memory: 79M/239M
  [INFO] ------------------------------------------------------------------------
  [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
  [ERROR] around Ant part ...<exec dir="/data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:123 in /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
  [ERROR] -> [Help 1]
  [ERROR]
  [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
  [ERROR] Re-run Maven using the -X switch to enable full debug logging.
  [ERROR]
  [ERROR] For more information about the errors and possible solutions, please read the following articles:
  [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
  [ERROR]
  [ERROR] After correcting the problems, you can resume the build with the command
  Per a hint found online, openssl-devel must also be installed (yum install openssl-devel); skipping this step produces the following error:
  [exec] CMake Error at /usr/share/cmake/Modules/FindOpenSSL.cmake:66 (MESSAGE): 
[exec]   Could NOT find OpenSSL 
[exec] Call Stack (most recent call first): 
[exec]   CMakeLists.txt:20 (find_package) 
[exec] 
[exec] 
[exec] -- Configuring incomplete, errors occurred! 
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1 -> [Help 1] 
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 
[ERROR] Re-run Maven using the -X switch to enable full debug logging. 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles: 
[ERROR] [Help 1] http://cwiki.apache.org/confluen ... oExecutionException 
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command 
[ERROR]   mvn <goals> -rf :hadoop-pipes
  Error reference: http://f.dataguru.cn/thread-189176-1-1.html
  Cause: when originally installing openssl-devel, the package name was mistyped (one "l" short).
  Solution: reinstall openssl-devel:
  yum install openssl-devel


5.3.3 Build summary

  1. You must install the base packages (yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel).
  2. You must install the protobuf and CMake build tools.
  3. You must configure Ant, Maven, and FindBugs.
  4. Point the Maven repository at the OSChina mirror to speed up compilation, i.e. to download dependency jars faster.
  5. When the build fails, read the error log carefully, work out the cause, and then search Baidu and Google for a fix.
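Point 4 above is usually done in ~/.m2/settings.xml. A minimal sketch; the mirror URL shown was the OSChina address commonly used at the time and may no longer be online:

```xml
<settings>
  <mirrors>
    <mirror>
      <id>oschina</id>
      <mirrorOf>central</mirrorOf>
      <name>OSChina central mirror</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
    </mirror>
  </mirrors>
</settings>
```

With `mirrorOf` set to `central`, Maven fetches everything that would normally come from Maven Central through the mirror instead.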

Original thread: https://www.yunweiku.com/thread-313760-1-1.html