[Experience Share] Hadoop 2.2.0 native library errors on a 64-bit operating system

Posted on 2016-12-13 11:18:42
The problem
  [hadoop@hadoop01 input]$ hadoop dfs -put ./in
  DEPRECATED: Use of this script to execute hdfs command is deprecated.
  Instead use the hdfs command for it.
  Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
  It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
  13/10/24 04:08:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  put: `in': No such file or directory
Inspect the local file
  [hadoop@hadoop01 input]$ file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
  /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
It looks like a 32-bit vs. 64-bit mismatch
  http://mail-archives.apache.org/mod_mbox/hadoop-user/201208.mbox/%3C19AD42E3F64F0F468A305399D0DF39D92EA4521578@winops07.win.compete.com%3E
  http://www.mail-archive.com/common-issues@hadoop.apache.org/msg52576.html
  The operating system is 64-bit but the software is 32-bit. Bad luck... the freshly installed cluster is unusable.
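  A quick way to double-check the mismatch (a minimal sketch; the paths are the ones from the output above):
  uname -m                                                      # expect: x86_64
  java -version                                                 # the last line should say "64-Bit Server VM"
  file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0   # reports: ELF 32-bit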
Solution: recompile Hadoop
  The fix is to recompile the Hadoop software from source:
Download the source code
  The machine must have Internet access. If it does not, download on another machine that is online, but the build itself still fetches things as it goes, so if all else fails, do the work below on an Internet-connected machine of the same platform (a virtual machine is fine) and copy the result back.
  # svn checkout 'http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0'
  Everything is downloaded here:
  [hadoop@hadoop01 hadoop]$ ls
  BUILDING.txt       hadoop-common-project     hadoop-maven-plugins  hadoop-tools
  dev-support        hadoop-dist               hadoop-minicluster    hadoop-yarn-project
  hadoop-assemblies  hadoop-hdfs-project       hadoop-project        pom.xml
  hadoop-client      hadoop-mapreduce-project  hadoop-project-dist
Set up the build environment
1. Required packages
  [root@hadoop01 /]# yum install svn
  [root@hadoop01 ~]# yum install autoconf automake libtool cmake
  [root@hadoop01 ~]# yum install ncurses-devel
  [root@hadoop01 ~]# yum install openssl-devel
  [root@hadoop01 ~]# yum install gcc*
2. Install Maven
  Download and unpack:
  http://maven.apache.org/download.cgi
  [root@hadoop01 stable]# mv apache-maven-3.1.1 /usr/local/
  Add /usr/local/apache-maven-3.1.1/bin to the PATH environment variable.
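  For example, one way to persist that is a profile script (a sketch, assuming the /usr/local path used above; run as root, then re-login or source the file):
  echo 'export M2_HOME=/usr/local/apache-maven-3.1.1' >> /etc/profile.d/maven.sh
  echo 'export PATH=$PATH:$M2_HOME/bin' >> /etc/profile.d/maven.sh
  source /etc/profile.d/maven.sh
  mvn -version    # verify Maven resolves on the PATH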
3. Install protobuf
  Without protobuf installed, the build cannot get through; it fails like this:
  [INFO] --- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---
  [WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
  [ERROR] stdout: []
  ……………………
  [INFO] Apache Hadoop Main................................ SUCCESS [5.672s]
  [INFO] Apache Hadoop Project POM......................... SUCCESS [3.682s]
  [INFO] Apache Hadoop Annotations......................... SUCCESS [8.921s]
  [INFO] Apache Hadoop Assemblies.......................... SUCCESS [0.676s]
  [INFO] Apache Hadoop Project Dist POM.................... SUCCESS [4.590s]
  [INFO] Apache Hadoop Maven Plugins....................... SUCCESS [9.172s]
  [INFO] Apache Hadoop Auth................................ SUCCESS [10.123s]
  [INFO] Apache Hadoop Auth Examples....................... SUCCESS [5.170s]
  [INFO] Apache Hadoop Common .............................. FAILURE [1.224s]
  [INFO] Apache Hadoop NFS................................. SKIPPED
  [INFO] Apache Hadoop Common Project...................... SKIPPED
  [INFO] Apache Hadoop HDFS................................ SKIPPED
  [INFO] Apache Hadoop HttpFS.............................. SKIPPED
  [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
  [INFO] Apache Hadoop HDFS-NFS............................ SKIPPED
  [INFO] Apache Hadoop HDFS Project........................ SKIPPED
Installing protobuf
  Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
  https://code.google.com/p/protobuf/downloads/list
  [root@hadoop01 protobuf-2.5.0]# pwd
  /soft/protobuf-2.5.0
  Then run the following commands in order:
  ./configure
  make
  make check
  make install
  [root@hadoop01 protobuf-2.5.0]# protoc --version
  libprotoc 2.5.0
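  One gotcha: with the default /usr/local prefix, protoc can fail to start with "error while loading shared libraries: libprotoc.so..." if /usr/local/lib is not on the dynamic linker path. If that happens, a common fix (as root; the conf file name below is arbitrary) is:
  echo '/usr/local/lib' > /etc/ld.so.conf.d/protobuf.conf
  ldconfig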
4. CMake errors
  CMake reports an error during the build:
  main:
  [mkdir] Created dir: /soft/hadoop/hadoop-tools/hadoop-pipes/target/native
  [exec] -- The C compiler identification is GNU
  [exec] -- The CXX compiler identification is GNU
  [exec] -- Check for working C compiler: /usr/bin/gcc
  [exec] -- Check for working C compiler: /usr/bin/gcc -- works
  [exec] -- Detecting C compiler ABI info
  [exec] -- Detecting C compiler ABI info - done
  [exec] -- Check for working CXX compiler: /usr/bin/c++
  [exec] -- Check for working CXX compiler: /usr/bin/c++ -- works
  [exec] -- Detecting CXX compiler ABI info
  [exec] -- Detecting CXX compiler ABI info - done
  [exec] CMake Error at /usr/share/cmake/Modules/FindOpenSSL.cmake:66 (MESSAGE):
  [exec]   Could NOT find OpenSSL
  [exec] Call Stack (most recent call first):
  [exec]   CMakeLists.txt:20 (find_package)
  [exec]
  [exec]
  [exec] -- Configuring incomplete, errors occurred!
  [INFO] Apache Hadoop Gridmix............................. SUCCESS [12.062s]
  [INFO] Apache Hadoop Data Join........................... SUCCESS [8.694s]
  [INFO] Apache Hadoop Extras.............................. SUCCESS [6.877s]
  [INFO] Apache Hadoop Pipes .............................. FAILURE [5.295s]
  [INFO] Apache Hadoop Tools Dist.......................... SKIPPED
  [INFO] Apache Hadoop Tools............................... SKIPPED
  [INFO] Apache Hadoop Distribution........................ SKIPPED
  [INFO] Apache Hadoop Client.............................. SKIPPED
  [INFO] Apache Hadoop Mini-Cluster........................ SKIPPED
  The fix is to install:
  [root@hadoop01 ~]# yum install ncurses-devel
  [root@hadoop01 ~]# yum install openssl-devel
Compile Hadoop
  [hadoop@hadoop01 hadoop]$ pwd
  /soft/hadoop
  [hadoop@hadoop01 hadoop]$ ls
  BUILDING.txt       hadoop-client          hadoop-hdfs-project       hadoop-minicluster   hadoop-tools
  dev-support        hadoop-common-project  hadoop-mapreduce-project  hadoop-project       hadoop-yarn-project
  hadoop-assemblies  hadoop-dist            hadoop-maven-plugins      hadoop-project-dist  pom.xml
  [hadoop@hadoop01 hadoop]$ mvn package -Pdist,native -DskipTests -Dtar
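  For reference: -Pdist,native builds the native libraries along with the distribution, -DskipTests skips running the tests, and -Dtar also packages a .tar.gz, matching the usage described in the BUILDING.txt shipped with the source.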
  The source code in the current 2.2.0 tarball has a bug and needs a patch before it will compile; otherwise the hadoop-auth module fails with a missing Jetty dependency error.

The fix is as follows:

Edit the following POM file, found in the Hadoop source tree:

hadoop-common-project/hadoop-auth/pom.xml

Open that POM file and add the following dependencies at line 54 (if they are not already there):

     <dependency>
       <groupId>org.mortbay.jetty</groupId>
       <artifactId>jetty-util</artifactId>
       <scope>test</scope>
     </dependency>
     <dependency>
       <groupId>org.mortbay.jetty</groupId>
       <artifactId>jetty</artifactId>
       <scope>test</scope>
     </dependency>
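Note there is no <version> element here; the jetty and jetty-util versions should be inherited from the dependencyManagement section of the parent hadoop-project POM.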
  Compiling is a time-consuming job...
  Below is the output of a successful run:
  [INFO] Reactor Summary:
  [INFO]
  [INFO] Apache Hadoop Main................................ SUCCESS [6.600s]
  [INFO] Apache Hadoop Project POM......................... SUCCESS [3.974s]
  [INFO] Apache Hadoop Annotations......................... SUCCESS [9.878s]
  [INFO] Apache Hadoop Assemblies.......................... SUCCESS [0.856s]
  [INFO] Apache Hadoop Project Dist POM.................... SUCCESS [4.750s]
  [INFO] Apache Hadoop Maven Plugins....................... SUCCESS [8.720s]
  [INFO] Apache Hadoop Auth................................ SUCCESS [10.107s]
  [INFO] Apache Hadoop Auth Examples....................... SUCCESS [5.734s]
  [INFO] Apache Hadoop Common.............................. SUCCESS [4:32.636s]
  [INFO] Apache Hadoop NFS................................. SUCCESS [29.700s]
  [INFO] Apache Hadoop Common Project...................... SUCCESS [0.090s]
  [INFO] Apache Hadoop HDFS................................ SUCCESS [6:15.394s]
  [INFO] Apache Hadoop HttpFS.............................. SUCCESS [1:09.238s]
  [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [27.676s]
  [INFO] Apache Hadoop HDFS-NFS............................ SUCCESS [13.954s]
  [INFO] Apache Hadoop HDFS Project........................ SUCCESS [0.212s]
  [INFO] hadoop-yarn....................................... SUCCESS [0.962s]
  [INFO] hadoop-yarn-api................................... SUCCESS [1:48.066s]
  [INFO] hadoop-yarn-common................................ SUCCESS [1:37.543s]
  [INFO] hadoop-yarn-server................................ SUCCESS [4.301s]
  [INFO] hadoop-yarn-server-common......................... SUCCESS [29.502s]
  [INFO] hadoop-yarn-server-nodemanager.................... SUCCESS [36.593s]
  [INFO] hadoop-yarn-server-web-proxy...................... SUCCESS [13.273s]
  [INFO] hadoop-yarn-server-resourcemanager................ SUCCESS [30.612s]
  [INFO] hadoop-yarn-server-tests.......................... SUCCESS [4.374s]
  [INFO] hadoop-yarn-client................................ SUCCESS [14.115s]
  [INFO] hadoop-yarn-applications.......................... SUCCESS [0.218s]
  [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [9.871s]
  [INFO] hadoop-mapreduce-client........................... SUCCESS [1.095s]
  [INFO] hadoop-mapreduce-client-core...................... SUCCESS [1:30.650s]
  [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [15.089s]
  [INFO] hadoop-yarn-site.................................. SUCCESS [0.637s]
  [INFO] hadoop-yarn-project............................... SUCCESS [25.809s]
  [INFO] hadoop-mapreduce-client-common.................... SUCCESS [45.919s]
  [INFO] hadoop-mapreduce-client-shuffle................... SUCCESS [14.693s]
  [INFO] hadoop-mapreduce-client-app....................... SUCCESS [39.562s]
  [INFO] hadoop-mapreduce-client-hs........................ SUCCESS [19.299s]
  [INFO] hadoop-mapreduce-client-jobclient................. SUCCESS [18.549s]
  [INFO] hadoop-mapreduce-client-hs-plugins................ SUCCESS [5.134s]
  [INFO] Apache Hadoop MapReduce Examples.................. SUCCESS [17.823s]
  [INFO] hadoop-mapreduce.................................. SUCCESS [12.726s]
  [INFO] Apache Hadoop MapReduce Streaming................. SUCCESS [19.760s]
  [INFO] Apache Hadoop Distributed Copy.................... SUCCESS [33.332s]
  [INFO] Apache Hadoop Archives............................ SUCCESS [9.522s]
  [INFO] Apache Hadoop Rumen............................... SUCCESS [15.141s]
  [INFO] Apache Hadoop Gridmix............................. SUCCESS [15.052s]
  [INFO] Apache Hadoop Data Join........................... SUCCESS [8.621s]
  [INFO] Apache Hadoop Extras.............................. SUCCESS [8.744s]
  [INFO] Apache Hadoop Pipes............................... SUCCESS [28.645s]
  [INFO] Apache Hadoop Tools Dist.......................... SUCCESS [6.238s]
  [INFO] Apache Hadoop Tools............................... SUCCESS [0.126s]
  [INFO] Apache Hadoop Distribution........................ SUCCESS [1:20.132s]
  [INFO] Apache Hadoop Client.............................. SUCCESS [18.820s]
  [INFO] Apache Hadoop Mini-Cluster........................ SUCCESS [2.151s]
  [INFO] ------------------------------------------------------------------------
  [INFO] BUILD SUCCESS
  [INFO] ------------------------------------------------------------------------
  [INFO] Total time: 29:07.811s
  [INFO] Finished at: Thu Oct 24 09:43:18 CST 2013
  [INFO] Final Memory: 78M/239M
  [INFO] ------------------------------------------------------------------------
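  After BUILD SUCCESS, the 64-bit build should land under hadoop-dist/target (the standard layout for a -Pdist,native -Dtar build; the /soft and /app paths below are the ones used earlier in this post):
  ls /soft/hadoop/hadoop-dist/target/hadoop-2.2.0/lib/native/   # the rebuilt native libraries
  # replace the 32-bit native libraries on the cluster with the freshly built 64-bit ones
  cp /soft/hadoop/hadoop-dist/target/hadoop-2.2.0/lib/native/* /app/hadoop/hadoop-2.2.0/lib/native/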
Run it again with the newly compiled software
  [hadoop@hadoop01 input]$ hadoop dfs -put ./in
  DEPRECATED: Use of this script to execute hdfs command is deprecated.
  Instead use the hdfs command for it.
  13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  put: `in': No such file or directory
  Add environment variables:
  export HADOOP_HOME=/usr/hadoop/hadoop-2.2.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
  Hadoop is 2.2.0; the operating system is Oracle Linux 6.3, 64-bit.
  [hadoop@hadoop01 input]$ hadoop dfs -put ./in
  DEPRECATED: Use of this script to execute hdfs command is deprecated.
  Instead use the hdfs command for it.
  13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  put: `in': No such file or directory
  Ignore the last line "put: `in': No such file or directory" for now; that is just a problem with the command syntax.
  First fix the warning "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable".
  Note: my Hadoop install is self-compiled, because the stock hadoop-2.2.0 release apparently ships only 32-bit native libraries while the operating system is 64-bit.
Troubleshooting
1. Enable debug logging
  [hadoop@hadoop01 input]$ export HADOOP_ROOT_LOGGER=DEBUG,console
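  (HADOOP_ROOT_LOGGER only affects the current shell; run unset HADOOP_ROOT_LOGGER afterwards to turn the debug output back off.)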
  [hadoop@hadoop01 input]$ hadoop dfs -put ./in
  DEPRECATED: Use of this script to execute hdfs command is deprecated.
  Instead use the hdfs command for it.
  13/10/24 16:11:31 DEBUG util.Shell: setsid exited with exit code 0
  13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
  13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
  13/10/24 16:11:31 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
  13/10/24 16:11:32 DEBUG security.Groups: Creating new Groups object
  13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
  13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
  13/10/24 16:11:32 DEBUG util.NativeCodeLoader: java.library.path=/usr/java/jdk1.7.0_45/lib:/app/hadoop/hadoop-2.2.0/lib/native:/app/hadoop/hadoop-2.2.0/lib/native
  13/10/24 16:11:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
  13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
  13/10/24 16:11:32 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
  13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login
  13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login commit
  13/10/24 16:11:32 DEBUG security.UserGroupInformation: using local user: UnixPrincipal: hadoop
  13/10/24 16:11:32 DEBUG security.UserGroupInformation: UGI loginUser: hadoop (auth:SIMPLE)
  13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
  13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
  13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
  13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
  13/10/24 16:11:33 DEBUG impl.MetricsSystemImpl: StartupProgress, NameNode startup progress
  13/10/24 16:11:33 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
  13/10/24 16:11:33 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@2e41d9a2
  13/10/24 16:11:34 DEBUG hdfs.BlockReaderLocal: Both short-circuit local reads and UNIX domain socket are disabled.
  13/10/24 16:11:34 DEBUG ipc.Client: The ping interval is 60000 ms.
  13/10/24 16:11:34 DEBUG ipc.Client: Connecting to localhost/127.0.0.1:8020
  13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: starting, having connections 1
  13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #0
  13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #0
  13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 82ms
  13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #1
  13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #1
  13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
  put: `.': No such file or directory
  13/10/24 16:11:34 DEBUG ipc.Client: Stopping client
  13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: closed
  13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: stopped, remaining connections 0
  The error in the debug output above:
  Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
  I tried many ways to fix this error, most of them changes to environment variables, and got nowhere...
  For example, adding environment variables:
  export HADOOP_HOME=/usr/hadoop/hadoop-2.2.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
2. The solution
  Finally, I read the official Native Libraries documentation carefully:
  http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
  Either download a hadoop release, which will include a pre-built version of the native hadoop library, or build your own version of the native hadoop library. Whether you download or build, the name for the library is the same: libhadoop.so
  So the name it expects is libhadoop.so. Checking my directory, I have libhadoop.so.1.0.0. The officially built release does contain a libhadoop.so file, but it is just a symlink, so I did the same:
  [hadoop@hadoop01 native]$ ln -s libhadoop.so.1.0.0 libhadoop.so
  [hadoop@hadoop01 native]$ ln -s libhdfs.so.0.0.0 libhdfs.so
  Problem solved.
  [hadoop@hadoop01 hadoop]$ hadoop dfs -put ./in
  DEPRECATED: Use of this script to execute hdfs command is deprecated.
  Instead use the hdfs command for it.
  put: `.': No such file or directory
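  To double-check that the links took effect (a small sketch; paths as above):
  ls -l /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so       # should point at libhadoop.so.1.0.0
  file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0  # should now report ELF 64-bit
  # with HADOOP_ROOT_LOGGER=DEBUG,console set, NativeCodeLoader should now log
  # "Loaded the native-hadoop library" instead of the UnsatisfiedLinkError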
