
[HDFS-16022] MATLAB MapReduce v95 demos can't run on Hadoop 3.2.2


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.2.2
    • Fix Version/s: None
    • Component/s: dfsclient
    • Labels: None

    Description

hadoop jar /usr/local/MATLAB/MATLAB_Runtime/v95/toolbox/mlhadoop/jar/a2.2.0/mwmapreduce.jar \
    com.mathworks.hadoop.MWMapReduceDriver \
    -D mw.mcrroot=/usr/local/MATLAB/MATLAB_Runtime/v95 \
    /usr/local/MATLAB/MATLAB_Runtime/v95/maxArrivalDelay.ctf \
    hdfs://hadoop.namenode:50070/user/matlab/datasets/airlinesmall.csv \
    hdfs://hadoop.namenode:50070/user/matlab/results

java.library.path: /usr/local/hadoop-3.2.2/lib/native
HDFSCTFPath=hdfs://hadoop.namenode:8020/user/root/maxArrivalDelay/maxArrivalDelay.ctf
Uploading CTF into distributed cache completed.
mapred.child.env: MCR_CACHE_ROOT=/tmp,LD_LIBRARY_PATH=/usr/local/MATLAB/MATLAB_Runtime/v95/runtime/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/bin/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/sys/os/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/sys/opengl/lib/glnxa64
mapred.child.java.opts: -Djava.library.path=/usr/local/MATLAB/MATLAB_Runtime/v95/runtime/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/bin/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/sys/os/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/sys/opengl/lib/glnxa64
New java.library.path: /usr/local/hadoop-3.2.2/lib/native:/usr/local/MATLAB/MATLAB_Runtime/v95/runtime/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/bin/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/sys/os/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v95/sys/opengl/lib/glnxa64
Using MATLAB mapper.
Set input format class to: ChunkFileRecordReader.
Using MATLAB reducer.
Set outputformat class to: class org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat
Set map output key class to: class com.mathworks.hadoop.MxArrayWritable2
Set map output value class to: class com.mathworks.hadoop.MxArrayWritable2
Set reduce output key class to: class com.mathworks.hadoop.MxArrayWritable2
Set reduce output value class to: class com.mathworks.hadoop.MxArrayWritable2
*************** run ******************
2021-05-11 14:58:47,043 INFO client.RMProxy: Connecting to ResourceManager at hadoop.namenode/192.168.0.25:8032
2021-05-11 14:58:47,139 WARN net.NetUtils: Unable to wrap exception of type class org.apache.hadoop.ipc.RpcException: it has no (String) constructor
java.lang.NoSuchMethodException: org.apache.hadoop.ipc.RpcException.<init>(java.lang.String)
    at java.lang.Class.getConstructor0(Class.java:3082)
    at java.lang.Class.getConstructor(Class.java:1825)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:835)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:811)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1566)
    at org.apache.hadoop.ipc.Client.call(Client.java:1508)
    at org.apache.hadoop.ipc.Client.call(Client.java:1405)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:910)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1671)
    at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1602)
    at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1599)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1614)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1690)
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:163)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:277)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1565)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1562)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1562)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1583)
    at com.mathworks.hadoop.MWMapReduceDriver.run(MWMapReduceDriver.java:1438)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at com.mathworks.hadoop.MWMapReduceDriver.main(MWMapReduceDriver.java:1449)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Exception in thread "main" java.io.IOException: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "hadoop.namenode/192.168.0.25"; destination host is: "hadoop.namenode":50070;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:821)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1566)
    at org.apache.hadoop.ipc.Client.call(Client.java:1508)
    at org.apache.hadoop.ipc.Client.call(Client.java:1405)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:910)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1671)
    at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1602)
    at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1599)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1614)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1690)
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:163)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:277)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1565)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1562)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1562)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1583)
    at com.mathworks.hadoop.MWMapReduceDriver.run(MWMapReduceDriver.java:1438)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at com.mathworks.hadoop.MWMapReduceDriver.main(MWMapReduceDriver.java:1449)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length
    at org.apache.hadoop.ipc.Client$IpcStreams.readResponse(Client.java:1894)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1191)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1087)
[root@hadoop sbin]
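
The submission above points the input and output URIs at hadoop.namenode:50070, while the client's own CTF upload (HDFSCTFPath) goes through hadoop.namenode:8020. "RPC response exceeds maximum data length" is the usual symptom of an HDFS RPC client talking to a port that is not serving the NameNode ClientProtocol (for example the web UI port): the client reads the start of a non-RPC reply as a length prefix, which then exceeds ipc.maximum.data.length (64 MB by default). As a sketch only, assuming 8020 is the fs.defaultFS RPC port for this cluster as the HDFSCTFPath line suggests, the same job submitted against the RPC port would look like:

    # Sketch, not verified on this cluster: same submission with the HDFS URIs
    # pointing at the NameNode RPC port (8020 is assumed from the HDFSCTFPath
    # line above; use whatever port fs.defaultFS is configured with).
    hadoop jar /usr/local/MATLAB/MATLAB_Runtime/v95/toolbox/mlhadoop/jar/a2.2.0/mwmapreduce.jar \
        com.mathworks.hadoop.MWMapReduceDriver \
        -D mw.mcrroot=/usr/local/MATLAB/MATLAB_Runtime/v95 \
        /usr/local/MATLAB/MATLAB_Runtime/v95/maxArrivalDelay.ctf \
        hdfs://hadoop.namenode:8020/user/matlab/datasets/airlinesmall.csv \
        hdfs://hadoop.namenode:8020/user/matlab/results

If 8020 is already the port configured in fs.defaultFS, the scheme and authority can be dropped from both paths (plain /user/matlab/... paths) so no port is hard-coded in the command; raising ipc.maximum.data.length would not help if the client is simply pointed at a non-RPC port.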

      Attachments

        1. matlab_errorlog (9 kB, cathonxiong)


          People

            Assignee: Unassigned
            Reporter: cathonxiong (cathonsh)
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated: