KAFKA-139

cross-compile multiple Scala versions and upgrade to SBT 0.12.1

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.8.0
    • Fix Version/s: 0.8.0
    • Component/s: packaging
    • Labels:

      Description

      Since Scala does not maintain binary compatibility between versions, organizations tend to have to move all of their code at the same time. It would thus be very helpful if we could cross-build for multiple Scala versions.

      http://code.google.com/p/simple-build-tool/wiki/CrossBuild

      Unclear if this would require KAFKA-134 or just work.
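      For context, this is what cross-published artifacts buy downstream users: sbt's %% operator appends the project's Scala version to the artifact name, so each consumer resolves a jar built against its own Scala release. A minimal hypothetical downstream build.sbt (the group id and version numbers are made up for illustration):

        // Resolves kafka-core_2.9.1 for a 2.9.1 project, kafka-core_2.8.0 for a 2.8.0 project, etc.
        scalaVersion := "2.9.1"

        libraryDependencies += "org.apache" %% "kafka-core" % "0.8.0"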

      1. kafka-sbt0-11-3.patch
        23 kB
        derek
      2. kafka-sbt0-11-3-0.8.patch
        23 kB
        derek
      3. kafka-sbt0-11-3-0.8-v2.patch
        23 kB
        derek
      4. kafka-sbt0-11-3-0.8-v3.patch
        25 kB
        derek
      5. kafka-sbt0-11-3-0.8-v4.patch
        23 kB
        derek
      6. kafka-sbt0-11-3-0.8-v5-smeder.patch
        24 kB
        Sam Meder
      7. kafka-sbt0-11-3-0.8-v6-smeder.patch
        24 kB
        Sam Meder

        Activity

        Neha Narkhede added a comment -

        Assigning to Josh Hartman

        Joshua Hartman added a comment -

        This is actually a very important issue that needs to be thought about and is closely related to mavenizing the project. First some background info:

        Scala is not binary backwards compatible, except for bug-fix releases like scala 2.8.0 -> scala 2.8.1. Furthermore, the Scala library jar at runtime needs to match the one the code was compiled against. Fortunately, source code has been forwards compatible since scala 2.8, which means that if it builds against 2.8, it should also build against 2.9.

        The standard way to handle this issue in the scala world is to publish separate artifacts for their jars. So for instance, we could publish kafka-core_2.8.0, kafka-core_2.8.1, kafka-core_2.9.0, and kafka-core_2.9.1. You can configure sbt to do this with one line of code. It's a bit gross, but it's the only way I know of to solve this problem. Without doing this, we'll force users of the library to depend on a specific version of scala. Please read https://github.com/harrah/xsbt/wiki/Cross-Build to learn a bit more.

        I'd really appreciate input from the developer team about this. Kafka is on a really old version of scala and it will probably be an issue for non-linkedin users very soon. (Plus if I have artifacts built against 2.9.1 for all LinkedIn scala jars, I can upgrade LinkedIn to 2.9.1 in about a day).
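        A sketch of the cross-build configuration Joshua describes, in sbt 0.11+ Build.scala syntax; the project id and version list here are illustrative assumptions, not the settings the eventual patch used:

          import sbt._
          import Keys._

          object KafkaBuild extends Build {
            lazy val core = Project("core-kafka", file("core")).settings(
              scalaVersion       := "2.8.0",                        // what plain `sbt package` builds
              crossScalaVersions := Seq("2.8.0", "2.9.1", "2.9.2")  // what `sbt +package` builds
              // crossPaths is on by default, so published artifacts get a
              // _<scalaVersion> suffix: kafka-core_2.8.0, kafka-core_2.9.1, ...
            )
          }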

        Chris Burroughs added a comment -

        Agreed that the need for project_scalaVersion suffixes for everything is obnoxious, but it is the least bad option and would be appreciated by everyone. Internally at Clearspring we wrapped kafka with a pom with a dep on 2.8.1, since that was the version we were using. But several projects will soon want to use 2.9.x and kafka, and it would be nice to have the right fix upstream.

        So as far as input goes, +1!

        Blake Matheny added a comment -

        +1 from me as well. In the interim I've created some custom builds, but that's not ideal.

        yair ogen added a comment -

        It's been a while. When can we expect a Scala 2.9.x supported Kafka version?

        Some of us need this urgently.

        Neha Narkhede added a comment -

        Sure. Would you be up for describing the set of changes required to make this possible ? We are open to accepting patches.

        Evan Chan added a comment -

        I pulled the latest from trunk and even before attempting to build against 2.9, there is one test failing. Is this expected?

        [info]
        [info] == core-kafka / test-finish ==
        [error] Failed: : Total 137, Failed 1, Errors 0, Passed 136, Skipped 0
        [info] == core-kafka / test-finish ==
        [info]
        [info] == core-kafka / Test cleanup 1 ==
        [info] Deleting directory /var/folders/s4/5vy31m2n0ng7h5qbn4575myr00021m/T/sbt_bd845ec9
        [info] == core-kafka / Test cleanup 1 ==
        [info]
        [info] == core-kafka / test-cleanup ==
        [info] == core-kafka / test-cleanup ==
        [error] Error running kafka.zk.ZKLoadBalanceTest: Test FAILED
        [error] Error running test: One or more subtasks failed
        [info]

        Also, Kafka is using a really old version of sbt (0.7.x), but upgrading to xsbt is really involved.

        derek added a comment -

        This is a preliminary patch to upgrade the build to SBT 0.11.3 and cross-compile for 2.8.0, 2.9.1 and 2.9.2. Minor tweak required to AsyncProducerTest.scala to ignore order of Map pair traversal, but otherwise it's a generally straightforward translation of the existing build. Haven't had time to fix/update the release-zip task, but hopefully I can get that done in the next few weeks.

        Jun Rao added a comment -

        Thanks for the patch. Can't apply the patch to 0.8 cleanly. Could you rebase?

        derek added a comment -

        This patch is against trunk, not 0.8. Let me work out what changes need to be made

        derek added a comment -

        I have the build seemingly working, but in my branch (and 0.8 HEAD) some of the tests just hang forever. Is 0.8 HEAD supposed to be stable?

        Joel Koshy added a comment -

        0.8 is under development, but the unit tests should not hang, though there are intermittent test failures that need to be fixed. Which tests do you see hanging? Can you add a comment in KAFKA-384 with your list?

        derek added a comment -

        OK, the Map traversal ordering issue I hit before is causing the same issue on the 0.8 branch, but whereas before I could ignore it with a special EasyMock matcher (because it was against an array), now the order of results from DefaultEventHandler.partitionAndCollate is reflected by the order of calls against the EasyMock, something that I cannot work around as far as I can tell. A simple fix would be to have partitionAndCollate use a SortedMap (I've tested this and it works). I can include this change as part of the patch if that's OK.
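        A small illustration of the ordering difference being worked around here (keys and values are made up; only the traversal-order behaviour matters):

          import scala.collection.SortedMap

          object MapOrderDemo extends App {
            val pairs = Seq(5 -> "e", 3 -> "c", 1 -> "a", 4 -> "d", 2 -> "b")

            // A plain immutable Map makes no promise about traversal order; as noted
            // above, that order changed between Scala 2.8.0 and 2.9.x, which is what
            // trips up order-sensitive EasyMock expectations.
            println(Map(pairs: _*).keys.mkString(","))        // implementation-dependent order

            // SortedMap always traverses in key order, on every Scala version.
            println(SortedMap(pairs: _*).keys.mkString(","))  // 1,2,3,4,5
          }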

        Jun Rao added a comment -

        Derek,

        For the 0.8 branch, is AsyncProducerTest.testPartitionAndCollateEvents the only test that fails? Will it help if we create expectedResult using HashMap instead of Map (which is what partitionAndCollate does)?

        derek added a comment -

        Yes, that's the only test that fails. I don't think that using HashMap is going to help, either, because IIRC between Scala 2.8.0 and 2.9.x the implementations for Map and HashMap both changed such that ordering of pair traversal differs, which is the underlying problem. Because testPartitionAndCollateEvents doesn't look at the order but rather a higher-level usage of that ordering (e.g. calls to getAnyProducer, send, etc), I think it may be difficult to thread that logic through to allow both the 2.8.0 and 2.9.x behaviors. Having said that, I'm only marginally familiar with the internals, so there may be some approach other than using a SortedMap.

        Jun Rao added a comment -

        Also, I tried the patch for 0.7. What's the right way to build now? If I do "sbt update", it prompts me for Name:.

        Jun Rao added a comment -

        Ok, maybe you can submit a patch with the SortedMap change first and we can see if there are other ways of addressing the unit test issue.

        derek added a comment -

        Here's the patch against 0.8 with the included change to use SortedMap. This will require a new version of SBT. Here we use PaulP's excellent sbt-extras script (https://github.com/paulp/sbt-extras) instead of bundling the sbt-launcher.jar in the project, but you could also just update that JAR. To build, simply run "sbt +clean +package"

        derek added a comment -

        Sorry, submit patch didn't do what I thought it would. Here's the patch against 0.8

        Jun Rao added a comment -

        I applied the patch and did ./sbt +clean +package. It prompted me for Name:. What should I put in and how do I find out what names are valid?

        derek added a comment -

        I'm not even sure where "Name:" is coming from. I definitely don't get that locally building with sbt 0.11.3. Are you sure you're not using an older version of SBT?

        Jun Rao added a comment -

        Derek, thanks for the patch. Some comments:

        1. It would be good if we can use the new sbt out of a fresh kafka checkout without requiring any external download. Can we update both the sbt script and the sbt-launch jar to include the new sbt?
        2. ./sbt package seems to default to scala 2.9.1. Could we default it to 2.8? Also, the -28 option uses scala 2.8.2. Is it possible to support 2.8.0?
        3. ./sbt idea stopped working.

        derek added a comment -

        1. You should be able to simply update the sbt-launch.jar under lib/ to get SBT working on a fresh kafka checkout without having to use PaulP's script.
        2. I've fixed it so the build defaults to 2.8.0. The -28 option isn't what you want; I believe that runs SBT itself under 2.8.2 rather than changing what the project compiles against.
        3. The idea plugin changed a little. The new syntax is "./sbt gen-idea"

        I'm attaching a revised patch with some minor tweaks to fix a bug in the cross build and to make 2.8.0 the default. Please let me know if you have further concerns/questions

        derek added a comment -

        Revised patch against 0.8

        Jun Rao added a comment -

        Derek, thanks for the new patch. A few more comments:

        10. ./sbt gen-idea doesn't seem to work for me. Got the following:
        [error] Not a valid command: gen-idea
        [error] Not a valid project ID: gen-idea
        [error] Not a valid configuration: gen-idea
        [error] Not a valid key: gen-idea
        [error] gen-idea

        11. What's the syntax for building other versions of scala, say 2.9? Could you update the readme file?

        derek added a comment -

        10. sbt gen-idea works fine for me when I update lib/ with the sbt-launch.jar from http://typesafe.artifactoryonline.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.11.3-2/sbt-launch.jar. Can you try again and email me the full log?

        11. To select a specific version of scala, use the "++<version>" syntax. For example:

        ./sbt "++ 2.8.0 package"

        I'm attaching a 3rd revision of the patch with an updated README.md.

        derek added a comment -

        Revised patch (updated README)

        Jun Rao added a comment -

        I did a new checkout of 0.8 branch, applied patch v3 and copied the sbt-launch.jar to lib. I still got the following error:

        [jrao@jrao-ld kafka_0.8_temp]$ ./sbt gen-idea
        [warn] Using project/plugins/ is deprecated for plugin configuration (/home/jrao/Intellij_workspace/kafka_0.8_temp/project/plugins).
        [warn] Put .sbt plugin definitions directly in project/,
        [warn] .scala plugin definitions in project/project/,
        [warn] and remove the project/plugins/ directory.
        [info] Loading project definition from /home/jrao/Intellij_workspace/kafka_0.8_temp/project/plugins
        [info] Set current project to Kafka (in build file:/home/jrao/Intellij_workspace/kafka_0.8_temp/)
        [error] Not a valid command: gen-idea
        [error] Not a valid project ID: gen-idea
        [error] Not a valid configuration: gen-idea
        [error] Not a valid key: gen-idea
        [error] gen-idea
        [error] ^

        Jun Rao added a comment -

        Also, does DefaultEventHandler.partitionAndCollate fail at the following test?
        assertEquals(expectedResult, actualResult)
        If so, we can probably write a customized equal test that sorts the sequence of ProducerData by value first and then compare. This way, we may not need to change HashMap to SortedMap in DefaultEventHandler.partitionAndCollate().
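        A sketch of the order-insensitive comparison suggested here (the helper name and the generic types are hypothetical; the real assertion compares the result of partitionAndCollate):

          import org.junit.Assert.assertEquals

          object OrderInsensitiveAssertions {
            // Compare two map-of-sequences results while ignoring the order within each
            // value sequence, so the assertion no longer depends on Map iteration order.
            def assertEqualsIgnoringValueOrder[K, V: Ordering](expected: Map[K, Seq[V]],
                                                               actual: Map[K, Seq[V]]): Unit = {
              assertEquals(expected.keySet, actual.keySet)
              expected.keys.foreach { k =>
                assertEquals(expected(k).sorted, actual(k).sorted)
              }
            }
          }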

        derek added a comment -

        No, unfortunately. It used to do that (on trunk), but in 0.8 the failure is within AsyncProducerTest.testFailedSendRetryLogic due to a change in ordering of underlying send calls:

        [2012-07-09 17:53:25,663] ERROR
        Unexpected method call getAnyProducer():
        getAnyProducer(): expected: 3, actual: 4
        close(): expected: 1, actual: 0 (kafka.utils.Utils$:102)
        java.lang.AssertionError:
        Unexpected method call getAnyProducer():
        getAnyProducer(): expected: 3, actual: 4
        close(): expected: 1, actual: 0
        at org.easymock.internal.MockInvocationHandler.invoke(MockInvocationHandler.java:45)
        at org.easymock.internal.ObjectMethodsFilter.invoke(ObjectMethodsFilter.java:73)
        at org.easymock.internal.ClassProxyFactory$MockMethodInterceptor.intercept(ClassProxyFactory.java:92)
        at kafka.producer.ProducerPool$$EnhancerByCGLIB$$5e7236c2.getAnyProducer(<generated>)
        at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:73)
        at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:53)
        at kafka.utils.Utils$.swallow(Utils.scala:401)
        at kafka.utils.Logging$class.swallowError(Logging.scala:102)
        at kafka.utils.Utils$.swallowError(Utils.scala:38)
        at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:53)
        at kafka.producer.AsyncProducerTest.testFailedSendRetryLogic(AsyncProducerTest.scala:475)

        Based on a brief review of the code involved I don't see a simple way around this for the 0.8 branch, but I'm open to suggestions.

        derek added a comment -

        Strange, gen-idea works for me. I get some warnings about NOT FOUND jars, but otherwise it creates the project.iml file. The SHA hash on my sbt-launch.jar file is:

        cfdb2c9d63776a5faaf681b06b9cf5c077375d43 lib/sbt-launch.jar

        And I haven't modified the "sbt" script in the project root at all.

        derek added a comment -

        Any update on status? I can't reproduce the gen-idea issue.

        derek added a comment -

        Actually, I think I may know what's going on. The patch may not be deleting some of the dirs under project/. In particular, there should no longer be a project/plugins/ dir, but from your log above it appears that you still have it there:

        [jrao@jrao-ld kafka_0.8_temp]$ ./sbt gen-idea
        [warn] Using project/plugins/ is deprecated for plugin configuration (/home/jrao/Intellij_workspace/kafka_0.8_temp/project/plugins).
        [warn] Put .sbt plugin definitions directly in project/,
        [warn] .scala plugin definitions in project/project/,
        [warn] and remove the project/plugins/ directory.
        ...

        Can you make sure that project/plugins/ is gone and try again?

        Neha Narkhede added a comment -

        I checked out 0.8 code, applied v3 patch, downloaded the sbt jar (link above) in the lib directory and ran ./sbt gen-idea. It fails and here is the complete trace - http://pastebin.com/suUZ9wnw

        derek added a comment -

        Can you remove your project/plugins/ dir and see if that does it? For some reason the patch doesn't seem to delete that directory

        Neha Narkhede added a comment (edited) -

        After deleting the project/plugins/ directory, ./sbt gen-idea works, so does ./sbt package. Please can you include this change and upload a patch ?

        derek added a comment -

        As far as I know patch can't remove directories, but if you know otherwise please point me to docs

        Neha Narkhede added a comment (edited) -

        Ok. A couple of comments -

        1. What is the difference between ./sbt +package and ./sbt package ?
        2. Please remove Release.scala since the release-zip target does not work and is not useful until fixed
        3. Why do some unit tests have their enclosing test class name printed in the test output, while others don't? See below -

        [info] Test Starting: testReachableServer(kafka.producer.SyncProducerTest)
        [info] Test Passed: testReachableServer(kafka.producer.SyncProducerTest)
        [info] Test Starting: testSingleMessageSizeTooLarge(kafka.producer.SyncProducerTest)
        [info] Test Passed: testSingleMessageSizeTooLarge(kafka.producer.SyncProducerTest)
        [info] Test Starting: testCompressedMessageSizeTooLarge(kafka.producer.SyncProducerTest)
        [info] Test Passed: testCompressedMessageSizeTooLarge(kafka.producer.SyncProducerTest)
        [info] Test Starting: testProduceCorrectlyReceivesResponse(kafka.producer.SyncProducerTest)
        [info] Test Passed: testProduceCorrectlyReceivesResponse(kafka.producer.SyncProducerTest)
        [info] Test Starting: testProducerCanTimeout(kafka.producer.SyncProducerTest)
        [info] Test Passed: testProducerCanTimeout(kafka.producer.SyncProducerTest)
        [info] Test Starting: testFieldValues
        [info] Test Passed: testFieldValues
        [info] Test Starting: testChecksum
        [info] Test Passed: testChecksum
        [info] Test Starting: testEquality
        [info] Test Passed: testEquality
        [info] Test Starting: testIsHashable
        [info] Test Passed: testIsHashable

        4. Why does test-only testName try to run test-only in other projects? Earlier, after you selected a project, you could run test-only and it just executed that one test, without the extra delay -

        > projects
        [info] In file:/home/nnarkhed/Projects/kafka-139/
        [info] * Kafka
        [info] contrib
        [info] core
        [info] hadoop consumer
        [info] hadoop producer
        [info] java-examples
        [info] perf
        > test-only kafka.integration.LazyInitProducerTest
        [info] No tests to run for contrib/test:test-only
        [info] No tests to run for Kafka/test:test-only
        [info] No tests to run for java-examples/test:test-only
        [info] No tests to run for hadoop producer/test:test-only
        [info] No tests to run for hadoop consumer/test:test-only
        [info] Test Starting: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
        [info] Test Passed: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
        [info] Test Starting: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
        [info] Test Passed: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
        [info] Test Starting: testMultiProduce(kafka.integration.LazyInitProducerTest)
        [info] Test Passed: testMultiProduce(kafka.integration.LazyInitProducerTest)
        [info] Test Starting: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
        [info] Test Passed: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
        [info] Passed: : Total 4, Failed 0, Errors 0, Passed 4, Skipped 0
        [success] Total time: 11 s, completed Jul 20, 2012 5:51:06 PM

        derek added a comment -

        1. +package will compile, test, and package for all cross-build Scala versions (2.8.0, 2.9.1), while package simply runs it for the default Scala version (2.8.0). See https://github.com/harrah/xsbt/wiki/Cross-Build for more details.

        2. I'll upload a new patch with Release.scala removed

        3. I'm not sure. As part of the patch I had to upgrade scalatest, so this may be some change in behavior there

        4. With a multi-module project, SBT 0.10+ will run a given command for all subprojects. If you want to run a command for a particular sub-project you can prefix it with "projectname/". For example, to run the test above, just do:

        core/test-only kafka.integration.LazyInitProducerTest

        derek added a comment -

        Patch with "Release" task removed

        derek added a comment -

        Just checking to see what I can do to move this ticket along, thanks!

        Neha Narkhede added a comment -

        Sorry for reviewing this late and thanks for the patch. I applied it and tried ./sbt idea, but it fails with the following error message -

        nnarkhed-ld:kafka-139 nnarkhed$ ./sbt idea
        Getting org.scala-tools.sbt sbt_2.7.7 0.11.3 ...

        :: problems summary ::
        :::: WARNINGS
        module not found: org.scala-tools.sbt#sbt_2.7.7;0.11.3

        ==== local: tried

        /home/nnarkhed/.ivy2/local/org.scala-tools.sbt/sbt_2.7.7/0.11.3/ivys/ivy.xml

        – artifact org.scala-tools.sbt#sbt_2.7.7;0.11.3!sbt_2.7.7.jar:

        /home/nnarkhed/.ivy2/local/org.scala-tools.sbt/sbt_2.7.7/0.11.3/jars/sbt_2.7.7.jar

        ==== Maven2 Local: tried

        file:///home/nnarkhed/.m2/repository/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.pom

        – artifact org.scala-tools.sbt#sbt_2.7.7;0.11.3!sbt_2.7.7.jar:

        file:///home/nnarkhed/.m2/repository/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.jar

        ==== sbt-db: tried

        http://databinder.net/repo/org.scala-tools.sbt/sbt_2.7.7/0.11.3/ivys/ivy.xml

        – artifact org.scala-tools.sbt#sbt_2.7.7;0.11.3!sbt_2.7.7.jar:

        http://databinder.net/repo/org.scala-tools.sbt/sbt_2.7.7/0.11.3/jars/sbt_2.7.7.jar

        ==== Maven Central: tried

        http://repo1.maven.org/maven2/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.pom

        – artifact org.scala-tools.sbt#sbt_2.7.7;0.11.3!sbt_2.7.7.jar:

        http://repo1.maven.org/maven2/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.jar

        ==== Scala-Tools Maven2 Repository: tried

        http://scala-tools.org/repo-releases/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.pom

        – artifact org.scala-tools.sbt#sbt_2.7.7;0.11.3!sbt_2.7.7.jar:

        http://scala-tools.org/repo-releases/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.jar

        ==== Scala-Tools Maven2 Snapshots Repository: tried

        http://scala-tools.org/repo-snapshots/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.pom

        – artifact org.scala-tools.sbt#sbt_2.7.7;0.11.3!sbt_2.7.7.jar:

        http://scala-tools.org/repo-snapshots/org/scala-tools/sbt/sbt_2.7.7/0.11.3/sbt_2.7.7-0.11.3.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: UNRESOLVED DEPENDENCIES ::

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: org.scala-tools.sbt#sbt_2.7.7;0.11.3: not found

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
        unresolved dependency: org.scala-tools.sbt#sbt_2.7.7;0.11.3: not found
        Error during sbt execution: Error retrieving required libraries
        (see /home/nnarkhed/Projects/kafka-139/project/boot/update.log for complete log)
        Error: Could not retrieve sbt 0.11.3

        derek added a comment -

        I replied via email but it doesn't look like it went through.

        This has been somewhat of a complicated patch to submit. Besides the obvious application of the patch file itself, you'll need to download the latest SBT jar and replace the existing one in lib/, and you'll also need to remove the project/plugins/ dir. "Patch", per se, doesn't support either of these operations, so I apologize for any confusion. The error you show above appears to be an issue with the SBT jar not being updated.

        I saw rumblings on the list about moving to Git from SVN. If that happens I can host my own repo, make the changes there, and then just submit a pull request if that would be simpler. Alternatively, I could submit a script that applies the patch, downloads the latest SBT, and removes the extraneous directories. Which would be preferable from your standpoint?

        Neha Narkhede added a comment -

        Derek,

        I understand that this patch is pretty complicated to submit. I'd like to help out to expedite this. I don't mind downloading the sbt JAR and removing the project/plugins directory, just missed it last time I reviewed the patch. Even after doing that, I see the following problems, some I resolved already, some I'm not so sure about -

        1. the perf sub-project is missing a build.sbt. So I added one
        2. The scripts are pointing to */target/scala_2.8.0 while the new path is */target/scala-2.8.0. I changed all scripts to incorporate this new path

        I need some help with the following -

        3. ./sbt package doesn't always re-create the jars for all sub-projects.
        4. I couldn't find the path where all the downloaded dependency jars are stored. Our scripts need to have that in the classpath in order to run properly.

        For this patch to be checked in, we need to ensure that unit tests as well as system tests pass. Right now, unit test pass, but system tests fail due to (4).

        I think we are pretty close to accepting this patch. Appreciate your help in getting this completed!

        derek added a comment -

        Thanks for looking into it! For what it's worth, my comment on this being a complicated patch was meant to convey my understanding that I'm probably doing some things here well outside the norm for submitted patches, not a complaint about the process

        For #3 do you mean that changes in code don't result in a newly built JAR for certain subprojects? If that's the case I'm probably missing a dependency setting on those subprojects. For #4, this is a change in the behavior of SBT for 0.10+. Now, SBT uses Ivy, so none of the downloaded deps will be in the project dir. Rather, they will be in the user's Ivy home (~/.ivy2/cache by default). In the case of packaging, this is something that may (and I stress may) be simpler within SBT, as in the Release.scala I had originally, or by using the sbt-assembly plugin (https://github.com/sbt/sbt-assembly/) to bundle all JARs together into one. Please let me know if I can help with that. If you'd like me to take a look at it, would it be possible to check in what you do have working on a branch so that I can simply point git-svn at it and work on your exact codebase?

        Thanks,

        Derek

        Neha Narkhede added a comment -

        3. In the earlier SBT version, if you run ./sbt package over and over, it re-created the jars for all subprojects each time. Now, I'm not so sure it does that ?
        4. Like I said, all our scripts (*.sh) need to add the dependencies to the Java CLASSPATH. Right now, they point to the lib_managed directory and hence the system tests fail.
        5. You might want to run system tests before submitting the patch -
        ./system_test/*/bin/run-test.sh
        6. Why does the patch change DefaultEventHandler to use a SortedMap ?

        Would you be up for incorporating the changes suggested in review comments 1-6 above and submitting a patch ? Rest of the patch looks good.

        derek added a comment -

        3. SBT now does dependency management all the way to the produced artifacts. If nothing has changed, it won't rebuild the JAR. There may be a setting to override this behavior, but I would have to look into it if that's a requirement.

        4. Is the intention that the system tests are run outside of SBT, or would it be OK to execute them as part of the test phase on the master project? If the latter is OK, I think this is done fairly easily. If we need it outside there may be a way to run SBT to interrogate the classpath for a given project. I'll look at the scripts.

        5. Let me know if running #4 is OK within SBT

        6. This was discussed earlier in the ticket. Due to a change in Scala's Map iteration ordering between 2.8.0 and 2.9.x, EasyMock fails in Scala > 2.8.x. Changing to a SortedMap preserves the ordering, allowing the test to pass. The new tests use an indirect measure via EasyMock of call order, and as far as I can tell it would be difficult to work around it because the change required is several layers below where EasyMock interfaces.
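        Regarding point 4 above (the scripts' classpath), one possible direction, sketched here as an assumption rather than what the committed build ended up doing, is to have SBT copy the Ivy-managed jars back into lib_managed/ so the existing bin/*.sh scripts keep working:

          // Added to the sbt 0.11+ build settings: retrieve managed dependencies
          // into <module>/lib_managed/ (in addition to ~/.ivy2/cache), so shell
          // scripts can keep globbing lib_managed onto the CLASSPATH. The exact
          // subdirectory layout is governed by sbt's retrievePattern setting.
          retrieveManaged := true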

        derek added a comment -

        Sorry, to answer your last question I'd be happy to work out the remaining issues to get this in place.

        Neha Narkhede added a comment -

        3. Sounds good, let's not change anything here. It is fine to not re-create the jars, if nothing has changed.
        4. Yes, the system tests are just some shell scripts that run outside of SBT. It will be good to change them to include the dependent jars on the classpath.
        5. No, it's not. The way to run system tests is as shown above.
        6. Thanks for the explanation, makes sense.

        Sam Meder added a comment -

        Updated the patch for current 0.8. Just added the yammer metrics dependency and fixed merge conflicts. That said, I am not familiar with Scala or SBT so you may also just want to disregard the patch update...

        Sam Meder added a comment -

        Noticed I had screwed up the build.properties part of the patch. Again, feel free to disregard...

        derek added a comment -

        Since we're on Git now, I've pushed a branch with the changes here:

        https://github.com/dchenbecker/kafka-sbt

        I haven't had a chance to look at Sam's patches yet, but I'll get those applied as well. The branch is taken from the latest 0.8 head, using SBT 0.12.1.

        Joe Stein added a comment -

        FYI

        I have been reviewing

        git remote add dchenbecker git://github.com/dchenbecker/kafka-sbt.git
        git fetch dchenbecker
        git merge dchenbecker/topic/0.8-sbt-0.12

        So far the only changes I have made were minor ones in Build.scala, to reflect the recent updates for mavenization:

        - version := "0.8.0",
        - organization := "org.apache.kafka",
        + version := "0.8-SNAPSHOT",
        + organization := "org.apache",

        Still working on reviewing a few more areas of change, but this is looking really good. So far I am a +1 on this.

        Unless I find anything, or anyone else hacking on this ticket does, I will commit the changes tomorrow.
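
        For anyone following the diff above, those values sit in the shared settings of the SBT build definition. A minimal sketch of the relevant fragment after the change (the object and value names here are illustrative, not the exact ones in the repository):

        // project/Build.scala (sketch)
        import sbt._
        import Keys._

        object KafkaBuild extends Build {
          // Settings shared by every sub-project; only the fields touched in
          // this review are shown.
          val commonSettings = Defaults.defaultSettings ++ Seq(
            organization := "org.apache",    // was "org.apache.kafka" on the branch
            version      := "0.8-SNAPSHOT",  // was "0.8.0" on the branch
            scalaVersion := "2.8.0"          // build-time default; cross versions discussed below
          )
        }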

        derek added a comment -

        Joe, that's great! Let me know if you run into any issues beyond the minor ones you found. I was also thinking that perhaps the cross-build should include 2.8.2 (the latest in that line) and 2.10.0, since it just came out. I'm running a test build tonight across those versions, and if all looks well I'll push the update to my branch.

        derek added a comment -

        OK, 2.10.0 is going to be a non-starter at this point due to a number of methods removed from the Scala collections libraries (they had been deprecated since 2.8.1/2.9.0), but 2.8.2 seems fine.
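
        For reference, the set of versions we cross-build against is a single setting in the build definition; a sketch under SBT 0.12, with the exact list still an open question per the above:

        // Build.scala fragment (sketch)
        crossScalaVersions := Seq("2.8.0", "2.8.2", "2.9.1", "2.9.2"),
        // "./sbt +package" then compiles and packages once per listed Scala
        // version, producing version-suffixed artifacts,
        // e.g. kafka_2.9.2-0.8-SNAPSHOT.jar.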

        Joe Stein added a comment -

        Added 2.8.2 and committed the changes to the 0.8 branch.

        Jun Rao added a comment -

        Joe, after your check-in I saw a couple of issues: (1) ./sbt package produces a bunch of warnings during compilation; (2) bin/kafka-server-start.sh config/server.properties stopped working with the following error:

        bin/kafka-server-start.sh config/server.properties
        Exception in thread "main" java.lang.NoClassDefFoundError: kafka/Kafka
        Caused by: java.lang.ClassNotFoundException: kafka.Kafka
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        Could not find the main class: kafka.Kafka. Program will exit.

        It seems that kafka/Kafka.class is no longer in the Kafka jar.

        Joe Stein added a comment - - edited

        There are two issues I am fixing now in kafka-run-class.sh to get this working again on the 0.8 branch.

        The first is minor: one directory name now uses "-" instead of "_".

        The other issue is that, since SBT 0.10.0, the dependency jars that used to live under project/boot are now resolved into ~/.ivy2 and are no longer local to the project.

        I am changing the bash script now to look in the ~/.ivy2 directory. We could (could) also consider using assembly to build a standalone jar.

        The first step is getting it working again; then we have to decide whether the fix I am making right now, pointing at ~/.ivy2, is what we want, or whether we want to start using assembly for a standalone jar instead of looping through the dependency directories.
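
        If we do go the assembly route, a rough sketch of wiring sbt-assembly into an SBT 0.12 build follows; the plugin version, project name, and jar name are assumptions, not what is on the branch today:

        // project/plugins.sbt (sketch)
        addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.8.8")

        // Build.scala fragment (sketch)
        import sbtassembly.Plugin._
        import AssemblyKeys._

        lazy val core = Project("core", file("core"))
          .settings(assemblySettings: _*)
          .settings(
            jarName in assembly := "kafka-core-assembly.jar"  // hypothetical name
          )
        // "./sbt assembly" would then emit one self-contained jar, so the run
        // scripts no longer need to walk ~/.ivy2 to assemble a classpath.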

        Joe Stein added a comment -

        I committed the changes so that the start script looks in the right ~/.ivy2 directories, along with related changes so that the jars match up.

        Tested the quick start script and it looks good now on the 0.8 branch.

        Maxime Brugidou added a comment -

        I checked out 0.8 this morning and this breaks the mavenization from KAFKA-133.
        The dependency on com.yammer.metrics is nested in the resulting pom.xml, which is not valid.
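
        For context, a plain top-level libraryDependencies entry in the SBT build is what should surface as a direct <dependency> element in the generated pom; a sketch (the metrics version here is illustrative):

        // Build.scala fragment (sketch)
        libraryDependencies += "com.yammer.metrics" % "metrics-core" % "2.2.0"  // version illustrative
        // "./sbt make-pom" should emit this as a direct child of <dependencies>;
        // a <dependency> nested inside another <dependency> is what Maven rejects.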

        Jun Rao added a comment -

        Also, when running "bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test", I saw the following:
        SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
        SLF4J: Defaulting to no-operation (NOP) logger implementation
        SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

        This seems to be new and I am not sure how critical it is.
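
        For what it's worth, that warning just means no SLF4J backend is on the runtime classpath, so anything the dependencies log through SLF4J is silently dropped rather than fatal. If we want those messages, one option is to add a binding next to the existing log4j dependency; a sketch, with the version an assumption:

        // Build.scala fragment (sketch)
        libraryDependencies += "org.slf4j" % "slf4j-log4j12" % "1.6.4"  // version is an assumption
        // Routes SLF4J output through the log4j setup Kafka already uses, so it
        // lands in the normal log4j appenders instead of being discarded.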

        Jay Kreps added a comment -

        It also looks like package no longer builds a test jar (KAFKA-676)? How do we run the various non-unit tests? Previously you did
        $ ./sbt
        > package
        and then from the command line you could do
        $ ./bin/kafka-run-class.sh kafka.TestLinearWriteSpeed

        But now I get class not found.
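
        One way to get a test artifact back under SBT 0.12 is to mark the test jar as published; whether that is how KAFKA-676 ends up being fixed is open, but as a sketch:

        // Build.scala fragment (sketch)
        publishArtifact in Test := true,
        // Adds kafka_<scalaVersion>-<version>-test.jar to the published artifacts;
        // it can also be built directly with "./sbt test:package", after which
        // kafka-run-class.sh can put it on the classpath for tools such as
        // kafka.TestLinearWriteSpeed.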

        Jay Kreps added a comment -

        Also, is it right that kafka-run-class.sh is hardcoding ~/.ivy2/cache as the lib directory? Not sure whether that came with this patch, but can't that directory be anywhere?

        Joe Stein added a comment -

        OK, so first let me review KAFKA-728 and commit it, because that was also caused by this change.

        I will open a sub-ticket to make ~/.ivy2/cache a variable in the bash script and upload a patch after that commit.

        Let me look into what is causing the KAFKA-133 mavenization to break and create a sub-ticket; I will take a look to see what might be causing it and upload a patch or holler for assistance.

        Let me look into what is causing the KAFKA-676 test jar issue and create a sub-ticket; I will take a look to see what might be causing it and upload a patch or holler for assistance.

        Let me also reproduce the SLF4J error. I could have sworn I had run into that and resolved it, but this one is probably different from the one I already fixed; I will open a sub-ticket with a patch for that too, it should be trivial.

        I am on it.

          People

          • Assignee: Unassigned
          • Reporter: Chris Burroughs
          • Votes: 6
          • Watchers: 12
