Apache Arrow / ARROW-1130

io-hdfs-test failure


Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Not A Problem
    • Affects Version/s: None
    • Fix Version/s: 0.5.0
    • Component/s: C++
    • Labels: None
    • Environment: Ubuntu 16.04, GCC 4.8, Parquet-cpp

    Description

      Hi,

      I have noticed that arrow-cpp's io-hdfs-test fails when built with GCC 4.8 but passes when built with GCC 5.4 (the GCC 5.4 build simply skips all the tests, since it never connects to the HDFS client).

      Looking at the test output log, it asked me to set the variable ARROW_HDFS_TEST_USER, so I set it to 'root' and set ARROW_HDFS_TEST_PORT to '9000' (the port I use to connect to my local HDFS), and the test now passes.

      Do I need to configure the environment and these variables in a specific way for the test to work?

      I'm mainly asking because I am trying to use the Arrow and Parquet C++ libraries in an external project, and I keep running into segfaults in libhdfs's jni_helper even though I can successfully connect to HDFS on my local Hadoop cluster and even read a single Parquet file. I'm hoping this will help me figure out the issue in my external project as well.

      Thank you in advance for your help.

      Attachments

        Activity

          People

            Assignee: wesm Wes McKinney
            Reporter: cyp Young Park
            Votes: 0
            Watchers: 3
