Spark / SPARK-17193

HadoopRDD NPE at DEBUG log level when getLocationInfo == null


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 2.0.0
    • Fix Version/s: 2.0.1, 2.1.0
    • Component/s: Spark Core
    • Labels: None

    Description

      When I set the log level to "DEBUG" in one of my apps that reads from Parquet, I notice several NullPointerExceptions logged from HadoopRDD.getPreferredLocations.

      It doesn't affect execution, as it just results in "no preferred locations". It happens when InputSplitWithLocationInfo.getLocationInfo returns null, which it may. The code dereferences the result anyway.

      It's cleaner to check for null directly (and maybe tighten up the code slightly) and avoid polluting the log, even though it's only at debug level. No big deal, but enough of an annoyance while debugging something that it's probably worth zapping.
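      The null-safe pattern described above can be sketched as follows. This is a minimal illustration, not the actual patch; the object and method names here are hypothetical stand-ins for the real HadoopRDD/InputSplitWithLocationInfo code, and it simply shows wrapping a possibly-null Java result in Option instead of dereferencing it:

```scala
// Hypothetical sketch of guarding a possibly-null getLocationInfo result.
object LocationInfoGuard {
  // Stand-in for InputSplitWithLocationInfo.getLocationInfo, which may
  // return null per the Hadoop API.
  def getLocationInfo: Array[String] = null

  // Option(x) maps a null reference to None, so a null result yields an
  // empty list of preferred locations instead of an NPE.
  def preferredLocations: Seq[String] =
    Option(getLocationInfo).map(_.toSeq).getOrElse(Nil)
}
```

      With this guard, a null from the underlying split just produces "no preferred locations" without logging a NullPointerException.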

      Attachments

        Activity

          People

            Assignee: srowen Sean R. Owen
            Reporter: srowen Sean R. Owen
            Votes: 0
            Watchers: 1

            Dates

              Created:
              Updated:
              Resolved: