Details
Type: Bug
Status: Resolved
Priority: Minor
Resolution: Invalid
Affects Version/s: 2.6.0
Fix Version/s: None
Component/s: None
Environment: Eclipse on Windows 7 (client) running the MapReduce job against a Hortonworks HDP 2.2 host (Hortonworks runs on VMware, version 6.0.2 build-1744117)
Description
Hello,
1. I want to run the simple MapReduce example job (through the YARN applications REST API in 2.6) and calculate PI… for now it does not work.
When I run the command in the Hortonworks terminal, it works: "hadoop jar /usr/hdp/2.2.0.0-2041/hadoop-mapreduce/hadoop-mapreduce-examples-2.6.0.2.2.0.0-2041.jar pi 10 10".
But I want to submit the job through the REST API rather than from the terminal as a command line: http://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Applications_APISubmit_Application
2. I do succeed with other REST API requests: get state, get new application id, and even kill (change state); a minimal sketch of such a call follows the response dump below. But when I try to submit my example, the response is:
--------------------------------------------------
--------------------------------------------------
The Response Header:
Key : null ,Value : [HTTP/1.1 202 Accepted]
Key : Date ,Value : [Thu, 22 Jan 2015 07:47:24 GMT, Thu, 22 Jan 2015 07:47:24 GMT]
Key : Content-Length ,Value : [0]
Key : Expires ,Value : [Thu, 22 Jan 2015 07:47:24 GMT, Thu, 22 Jan 2015 07:47:24 GMT]
Key : Location ,Value : http://[my port]:8088/ws/v1/cluster/apps/application_1421661392788_0038
Key : Content-Type ,Value : [application/json]
Key : Server ,Value : [Jetty(6.1.26.hwx)]
Key : Pragma ,Value : [no-cache, no-cache]
Key : Cache-Control ,Value : [no-cache]
The Response Body:
Null (No Response)
--------------------------------------------------
--------------------------------------------------
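For reference, here is a minimal sketch of how I call the simpler endpoints, e.g. getting a new application id (not my exact code; the ResourceManager address is a placeholder):
--------------------------------------------------
// Minimal sketch: request a new application id from the ResourceManager REST API.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class NewApplicationRequest {
    public static void main(String[] args) throws Exception {
        // Placeholder address; I use my HDP host here, port 8088.
        URL url = new URL("http://localhost:8088/ws/v1/cluster/apps/new-application");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");              // new-application is a POST with an empty body
        conn.setRequestProperty("Accept", "application/json");
        conn.setDoOutput(true);
        conn.getOutputStream().close();             // send the empty body

        System.out.println("Response code: " + conn.getResponseCode());
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                // The JSON body contains the new application-id and maximum-resource-capability.
                System.out.println(line);
            }
        }
        conn.disconnect();
    }
}
--------------------------------------------------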
3. I need help with filling in the HTTP request body. I am doing a POST HTTP request, and I know I am doing that part correctly (in Java); a simplified sketch of the POST appears after the XML at the end.
4. I think the problem is in the request body.
5. I used this answer to help me build my MapReduce example XML, but it does not work: http://hadoop-forum.org/forum/general-hadoop-discussion/miscellaneous/2136-how-can-i-run-mapreduce-job-by-rest-api
6. What am I missing? (The Submit Application section of the REST API 2.6 documentation is not clear to me.)
7. Does someone have an XML example for submitting a simple MR job?
8. Thanks! Here is the XML file I am using for the request body:
--------------------------------------------------
--------------------------------------------------
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<application-submission-context>
  <application-id>application_1421661392788_0038</application-id>
  <application-name>test_21_1</application-name>
  <queue>default</queue>
  <priority>3</priority>
  <am-container-spec>
    <environment>
      <entry>
        <key>CLASSPATH</key>
        <value>/usr/hdp/2.2.0.0-2041/hadoop/conf<CPS>/usr/hdp/2.2.0.0-2041/hadoop/lib/<CPS>/usr/hdp/2.2.0.0-2041/hadoop/.//<CPS>/usr/hdp/2.2.0.0-2041/hadoop-hdfs/./<CPS>/usr/hdp/2.2.0.0-2041/hadoop-hdfs/lib/<CPS>/usr/hdp/2.2.0.0-2041/hadoop-hdfs/.//<CPS>/usr/hdp/2.2.0.0-2041/hadoop-yarn/lib/<CPS>/usr/hdp/2.2.0.0-2041/hadoop-yarn/.//<CPS>/usr/hdp/2.2.0.0-2041/hadoop-mapreduce/lib/<CPS>/usr/hdp/2.2.0.0-2041/hadoop-mapreduce/.//<CPS><CPS>/usr/share/java/mysql-connector-java-5.1.17.jar<CPS>/usr/share/java/mysql-connector-java.jar<CPS>/usr/hdp/current/hadoop-mapreduce-client/<CPS>/usr/hdp/current/tez-client/<CPS>/usr/hdp/current/tez-client/lib/<CPS>/etc/tez/conf/<CPS>/usr/hdp/2.2.0.0-2041/tez/<CPS>/usr/hdp/2.2.0.0-2041/tez/lib/*<CPS>/etc/tez/conf</value>
      </entry>
    </environment>
    <commands>
      <command>hadoop jar /usr/hdp/2.2.0.0-2041/hadoop-mapreduce/hadoop-mapreduce-examples-2.6.0.2.2.0.0-2041.jar pi 10 10</command>
    </commands>
  </am-container-spec>
  <unmanaged-AM>false</unmanaged-AM>
  <max-app-attempts>2</max-app-attempts>
  <resource>
    <memory>1024</memory>
    <vCores>1</vCores>
  </resource>
  <application-type>MAPREDUCE</application-type>
  <keep-containers-across-application-attempts>false</keep-containers-across-application-attempts>
  <application-tags>
    <tag>Michael</tag>
    <tag>PI example</tag>
  </application-tags>
</application-submission-context>
--------------------------------------------------
--------------------------------------------------
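And here is a simplified sketch of the submit POST itself (again not my exact code; the ResourceManager address and the XML file name are placeholders, and the file contains the request body shown above):
--------------------------------------------------
// Simplified sketch: POST the application-submission-context XML to the submit endpoint.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class SubmitApplicationRequest {
    public static void main(String[] args) throws Exception {
        // Placeholder ResourceManager address and file name.
        URL url = new URL("http://localhost:8088/ws/v1/cluster/apps");
        byte[] body = Files.readAllBytes(Paths.get("submit-app.xml"));  // the XML shown above

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/xml");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }

        // This is the call that returns 202 Accepted with an empty body (see the response dump above).
        System.out.println("Response code: " + conn.getResponseCode());
        System.out.println("Location: " + conn.getHeaderField("Location"));
        conn.disconnect();
    }
}
--------------------------------------------------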