Spark / SPARK-1879

Default PermGen size too small when using Hadoop2 and Hive


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.0.0
    • Component/s: Spark Core
    • Labels: None

    Description

      If you launch a spark-shell with Hadoop 2 and Hive on the classpath and try to use Hive from it, PermGen usage quickly reaches 85 MB after a few commands, at which point the JVM gives up and freezes. We should pass -XX:MaxPermSize to prevent this. Unfortunately, passing it produces a warning on Java 8, but that's still better than not passing it.

      I don't think this affects anything other than the shell; it's just the combination of the Scala compiler + Hive + Hadoop 2 that pushes things over the edge.
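      As a sketch of the workaround described above (raising the PermGen cap when launching the shell), one could pass the flag through the JVM options environment variable; the variable name and the 256m value here are illustrative assumptions, not the actual fix applied in Spark:

      ```shell
      # Hypothetical workaround: raise the PermGen cap for the spark-shell JVM.
      # SPARK_JAVA_OPTS was the era-appropriate way to pass extra driver JVM flags;
      # on Java 8, -XX:MaxPermSize is ignored with a warning since PermGen was removed.
      SPARK_JAVA_OPTS="-XX:MaxPermSize=256m" ./bin/spark-shell
      ```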


          People

            Assignee: Matei Alexandru Zaharia (matei)
            Reporter: Matei Alexandru Zaharia (matei)
            Votes: 0
            Watchers: 1
