Details
- Type: Improvement
- Status: Open
- Priority: Major
- Resolution: Unresolved
Description
The Spark community has done a great job integrating GitHub into their workflow. Adopting this will be tricky: we need to make sure the script works. So let's have this JIRA both create the script to commit from GitHub and confirm that it works, by creating a pull request and using the script to actually merge that pull request in.
Here's how the Spark process works. We've confirmed on an email thread that some of us agree this would be good to do. From Spark's committer guidelines wiki page (https://cwiki.apache.org/confluence/display/SPARK/Reviewing+and+Merging+Patches):
Once a patch is in good shape to merge, you can use the built-in developer script (./dev/merge_spark_pr.py) to merge it.
This will also allow you to back-port the patch into earlier branches if required. Make sure to close the associated JIRA after you merge as well as indicating the fix versions.
1) Create and test a script similar to dev/merge_spark_pr.py on a mock patch.
2) Upload that script here, as a patch, for review.
3) After review is complete, take the reviewed code and create a pull request against a fork of apache/bigtop.git.
4) Use the script in your fork to merge that pull request. The script should:
- commit the correct patch to the Apache git repository
- update this JIRA with "Resolved" status
If either step fails but the commit goes through, we will create a follow-up JIRA to fix the issue.
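The two duties of the script above could be sketched roughly as follows. This is a minimal illustration only, not the actual Bigtop script: the function names, remote URLs, branch naming, and JIRA transition id are all assumptions, and the real tool would execute these commands and handle errors interactively like merge_spark_pr.py does.

```python
# Sketch of what a Bigtop merge script might do (hypothetical names throughout).

GITHUB_REMOTE = "https://github.com/apache/bigtop.git"   # assumed read-only mirror
APACHE_REMOTE = "apache"                                  # assumed name of the push remote

def merge_commands(pr_num, target_branch="master"):
    """Build the git commands a merge script would run for PR #pr_num."""
    # GitHub publishes every pull request under refs/pull/<num>/head,
    # so the script can fetch the PR without the contributor's fork URL.
    pr_ref = "refs/pull/%d/head" % pr_num
    local = "PR_TOOL_MERGE_PR_%d" % pr_num   # throwaway local branch name
    return [
        ["git", "fetch", GITHUB_REMOTE, "%s:%s" % (pr_ref, local)],
        ["git", "checkout", target_branch],
        ["git", "merge", "--no-ff", local,
         "-m", "Merge pull request #%d" % pr_num],
        ["git", "push", APACHE_REMOTE, "%s:%s" % (target_branch, target_branch)],
    ]

def resolve_jira_request(issue_key):
    """Build the JIRA REST call that would transition the issue to Resolved."""
    url = ("https://issues.apache.org/jira/rest/api/2/issue/%s/transitions"
           % issue_key)
    # The transition id for "Resolve Issue" varies per JIRA workflow;
    # "5" is only a common default and would need to be confirmed.
    payload = {"transition": {"id": "5"},
               "fields": {"resolution": {"name": "Fixed"}}}
    return url, payload
```

Building the command lists separately from executing them also makes step 1 ("test on a mock patch") easy, since the dry-run output can be inspected before anything touches the Apache repository.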
Does this sound like a plan we can agree on?