Index: hcatalog/src/test/e2e/templeton/resource/default.res
IDEA additional info:
Subsystem: com.intellij.openapi.diff.impl.patch.CharsetEP
<+>UTF-8
===================================================================
--- hcatalog/src/test/e2e/templeton/resource/default.res (revision )
+++ hcatalog/src/test/e2e/templeton/resource/default.res (revision )
@@ -0,0 +1,19 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+$resources = {
+};
Index: hcatalog/src/test/e2e/templeton/README.txt
IDEA additional info:
Subsystem: com.intellij.openapi.diff.impl.patch.CharsetEP
<+>UTF-8
===================================================================
--- hcatalog/src/test/e2e/templeton/README.txt (revision 1e79af5c56adc285e03908c32fb01c5c46fe2103)
+++ hcatalog/src/test/e2e/templeton/README.txt (revision )
@@ -20,10 +20,36 @@
End to end tests in templeton run against an existing templeton server.
They run hcat, mapreduce, streaming, hive and pig tests.
+It's a good idea to look at current versions of
+http://hive.apache.org/docs/hcat_r0.5.0/rest_server_install.html and
+http://hive.apache.org/docs/hcat_r0.5.0/configuration.html before proceeding.
+
+
+start hive metastore: ./bin/hive --service metastore
+(make sure templeton.hive.properties specifies the URL for this metastore; when
+WebHCat calls Hive it will start it with these properties)
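+(for example, webhcat-site.xml might carry something like
+templeton.hive.properties=hive.metastore.uris=thrift://localhost:9083,hive.metastore.sasl.enabled=false
+where the thrift host/port shown here are only illustrative and must match the metastore started above)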
+
+launch templeton server: ./hcatalog/sbin/webhcat_server.sh start
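+(to sanity-check that the server came up, assuming the default WebHCat port 50111, something like
+curl -s 'http://localhost:50111/templeton/v1/status'
+should return a small JSON status document)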
+
+to control which DB the metastore uses, put something like the following
+in hive-site.xml:
+
+<property>
+  <name>javax.jdo.option.ConnectionURL</name>
+  <value>jdbc:derby:;databaseName=/Users/ekoifman/dev/data/tmp/metastore_db_e2e;create=true</value>
+  <description>Controls which DB engine the metastore will use for persistence. In particular,
+  where Derby will create its data files.</description>
+</property>
+
+
!!!! NOTE !!!!
--------------
USE SVN TO CHECKOUT CODE FOR RUNNING TESTS AS THE TEST
HARNESS IS EXTERNED FROM PIG. GIT WILL NOT IMPORT IT
+ (if you are using git, check out http://svn.apache.org/repos/asf/hive/trunk (or whichever
+ branch you need; see http://hive.apache.org/version_control.html) with SVN and symlink
+ hcatalog/src/test/e2e/harness/ to the corresponding harness/ in the SVN tree)
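+ For example, with an illustrative checkout location:
+ svn checkout http://svn.apache.org/repos/asf/hive/trunk ~/src/hive-svn
+ ln -s ~/src/hive-svn/hcatalog/src/test/e2e/harness <path to your git clone>/hcatalog/src/test/e2e/harness
+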
Test cases
----------
@@ -57,12 +83,20 @@
3. Copy contents of src/test/e2e/templeton/inpdir to hdfs
+(e.g. ./bin/hadoop fs -put ~/dev/hive/hcatalog/src/test/e2e/templeton/inpdir/ webhcate2e)
4. You will need two jars in the same HDFS directory as the contents of inpdir. The first is piggybank.jar, which can
be obtained from Pig. The second is the hadoop-examples.jar, which can be obtained from your Hadoop distribution.
This should be called hexamples.jar when it is uploaded to HDFS.
+Also see http://hive.apache.org/docs/hcat_r0.5.0/rest_server_install.html#Hadoop+Distributed+Cache for notes on
+additional JAR files to copy to HDFS.
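+(e.g., with illustrative local paths; locate piggybank.jar in your Pig distribution and the examples jar in your Hadoop distribution:
+ ./bin/hadoop fs -put ~/dev/pig/contrib/piggybank/java/piggybank.jar webhcate2e/piggybank.jar
+ ./bin/hadoop fs -put ~/dev/hadoop/hadoop-examples-*.jar webhcate2e/hexamples.jar
+)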
+5. Make sure the TEMPLETON_HOME environment variable is set
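+(e.g. export TEMPLETON_HOME=~/dev/hive/hcatalog ; the path is illustrative and should point at your WebHCat/HCatalog installation)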
+
+6. hadoop/conf/core-site.xml should have items described in
+http://hive.apache.org/docs/hcat_r0.5.0/rest_server_install.html#Permissions
+
Running the tests
-----------------
Use the following command to run tests -
@@ -73,6 +107,7 @@
If you want to run a specific test group you can specify the group, for example: -Dtests.to.run='-t TestHive'
If you want to run a specific test in a group you can specify the test, for example: -Dtests.to.run='-t TestHive_1'
+For example, tests/ddl.conf defines several groups, each with an entry such as 'name' => 'REST_DDL_TABLE_BASIC'; use REST_DDL_TABLE_BASIC as the group name.
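+(so a full invocation might look something like ant test -Dtests.to.run='-t REST_DDL_TABLE_BASIC' together with the other -D properties shown under "Running the tests" above)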
Running the hcat authorization tests
@@ -110,3 +145,7 @@
This assumes you've got webhdfs at the address above, the inpdir info in /user/templeton, and templeton running on the default port. You can change any of those properties in the build file.
It's best to set HADOOP_HOME_WARN_SUPPRESS=true everywhere you can.
+It is also useful to add the following to conf/hadoop-env.sh:
+export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
+to prevent warnings about SCDynamicStore (seen with Hadoop on OS X), which may throw some tests off
+(http://stackoverflow.com/questions/7134723/hadoop-on-osx-unable-to-load-realm-info-from-scdynamicstore)
Index: hcatalog/src/test/e2e/templeton/build.xml
IDEA additional info:
Subsystem: com.intellij.openapi.diff.impl.patch.CharsetEP
<+>UTF-8
===================================================================
--- hcatalog/src/test/e2e/templeton/build.xml (revision 1e79af5c56adc285e03908c32fb01c5c46fe2103)
+++ hcatalog/src/test/e2e/templeton/build.xml (revision )
@@ -24,6 +24,7 @@
+
@@ -51,6 +52,7 @@
+
@@ -60,6 +62,9 @@
+
+
+
Index: hcatalog/src/test/e2e/templeton/resource/windows.res
IDEA additional info:
Subsystem: com.intellij.openapi.diff.impl.patch.CharsetEP
<+>UTF-8
===================================================================
--- hcatalog/src/test/e2e/templeton/resource/windows.res (revision )
+++ hcatalog/src/test/e2e/templeton/resource/windows.res (revision )
@@ -0,0 +1,19 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+$resources = {
+};