Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version: 0.16.0
Description
I have my environment variables set up correctly according to the pyarrow README:
$ ls $HADOOP_HOME/lib/native
libhadoop.a libhadooppipes.a libhadoop.so libhadoop.so.1.0.0 libhadooputils.a libhdfs.a libhdfs.so libhdfs.so.0.0.0
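For context, a minimal sketch of the environment setup this assumes; the variable names come from the pyarrow HDFS docs, and the /opt/hadoop/latest path is inferred from the traceback below:

$ export HADOOP_HOME=/opt/hadoop/latest
$ export CLASSPATH=$($HADOOP_HOME/bin/hdfs classpath --glob)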
Use the following script to reproduce:

import pyarrow
pyarrow.hdfs.connect('hdfs://localhost')
With pyarrow version 0.15.1 this works fine. However, version 0.16.0 gives the following error:
Traceback (most recent call last):
  File "<string>", line 2, in <module>
  File "/home/jackwindows/anaconda2/lib/python2.7/site-packages/pyarrow/hdfs.py", line 215, in connect
    extra_conf=extra_conf)
  File "/home/jackwindows/anaconda2/lib/python2.7/site-packages/pyarrow/hdfs.py", line 40, in __init__
    self._connect(host, port, user, kerb_ticket, driver, extra_conf)
  File "pyarrow/io-hdfs.pxi", line 89, in pyarrow.lib.HadoopFileSystem._connect
  File "pyarrow/error.pxi", line 99, in pyarrow.lib.check_status
IOError: Unable to load libhdfs: /opt/hadoop/latest/libhdfs.so: cannot open shared object file: No such file or directory
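Note that the loader is probing for libhdfs.so directly under $HADOOP_HOME (/opt/hadoop/latest) rather than under $HADOOP_HOME/lib/native, where the ls output above shows the library actually lives. A possible workaround sketch is to point pyarrow at the directory explicitly via ARROW_LIBHDFS_DIR, the documented override for the libhdfs location, assuming 0.16.0 still honors that variable:

$ export ARROW_LIBHDFS_DIR=$HADOOP_HOME/lib/native

With this exported, the reproduce script above should connect as it did under 0.15.1.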
Issue Links
- duplicates: ARROW-8154 [Python] HDFS Filesystem does not set environment variables in pyarrow 0.16.0 release (Closed)