Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version: 0.6.0
- Environment: Ubuntu 16.04, Python 3.6
Description
I have a locally saved Parquet dataset created in Spark by querying a SQL database. When I run:

```python
import pyarrow.parquet as pq

path = "path/to/parquet/dataset"
dataset = pq.ParquetDataset(path)
dataset.read()
```

an error is raised indicating that there is no support for reading columns of type decimal(19,4). It's quite a common type in SQL databases, and I saw in the source code that there is an implementation for decimals. I'm stuck trying to figure out a solution. Is there a workaround (e.g. converting decimals to integers during reading)?
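Until Parquet decimal reads landed (tracked by the linked PARQUET-1095 and ARROW-1839), the integer conversion the reporter asks about is a workable interim approach: a decimal(19,4) column has a fixed scale of 4, so each value can be stored losslessly as a scaled int64 (value × 10^4) at write time and rescaled after reading. A minimal sketch of that conversion, assuming a fixed scale of 4; the function names are illustrative, not part of any library:

```python
from decimal import Decimal

SCALE = 4  # scale of a decimal(19,4) column

def decimal_to_scaled_int(value: Decimal, scale: int = SCALE) -> int:
    # Shift the decimal point right by `scale` digits and store as an
    # integer, e.g. Decimal("12.3456") -> 123456. An int64 column like
    # this is readable by pyarrow versions lacking decimal support.
    return int(value.scaleb(scale))

def scaled_int_to_decimal(value: int, scale: int = SCALE) -> Decimal:
    # Reverse the shift after reading: 123456 -> Decimal("12.3456").
    return Decimal(value).scaleb(-scale)
```

On the Spark side the same idea can be applied before writing, e.g. `df.withColumn("amount", (df["amount"] * 10**4).cast("long"))` (column name here is hypothetical), so the saved Parquet files never contain a decimal physical type.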
Issue Links
- depends upon
  - PARQUET-1095 [C++] Read and write Arrow decimal values (Resolved)
  - ARROW-1839 [C++/Python] Add Decimal Parquet Read/Write Tests (Resolved)