Description
Not quite able to figure this out, but here is a JUnit test (added to JavaAPISuite.java) that reproduces the problem:
DecimalBug.java
@Test
public void decimalQueryTest() {
    List<Row> decimalTable = new ArrayList<Row>();
    decimalTable.add(RowFactory.create(new BigDecimal("1"), new BigDecimal("2")));
    decimalTable.add(RowFactory.create(new BigDecimal("3"), new BigDecimal("4")));
    JavaRDD<Row> rows = sc.parallelize(decimalTable);

    List<StructField> fields = new ArrayList<StructField>(7);
    fields.add(DataTypes.createStructField("a", DataTypes.createDecimalType(), true));
    fields.add(DataTypes.createStructField("b", DataTypes.createDecimalType(), true));

    sqlContext.applySchema(rows.rdd(), DataTypes.createStructType(fields))
        .registerTempTable("foo");

    Assert.assertEquals(
        sqlContext.sql("select * from foo where a > 0").collectAsList(),
        decimalTable);
}
It fails with:
java.lang.ClassCastException: java.math.BigDecimal cannot be cast to org.apache.spark.sql.types.Decimal
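The exception pattern above can be illustrated in plain Java, outside of Spark. The sketch below is only a conceptual stand-in: the `InternalDecimal` class is hypothetical and merely plays the role of Spark SQL's internal `org.apache.spark.sql.types.Decimal` wrapper. The point is that a row value stored as `java.math.BigDecimal` blows up the moment downstream code casts it to the unrelated internal type, which is presumably what happens during query evaluation here.

```java
import java.math.BigDecimal;

public class CastIllustration {
    // Hypothetical stand-in for Spark's internal Decimal type;
    // it has no subtype relationship with java.math.BigDecimal.
    static class InternalDecimal {}

    public static void main(String[] args) {
        // The row value is kept as a java.math.BigDecimal, as in the repro test.
        Object rowValue = new BigDecimal("1");

        boolean threw = false;
        try {
            // Downstream code expects the internal type and casts blindly.
            InternalDecimal d = (InternalDecimal) rowValue;
        } catch (ClassCastException e) {
            // BigDecimal cannot be cast to InternalDecimal.
            threw = true;
        }
        System.out.println("ClassCastException thrown: " + threw);
    }
}
```

This suggests the fix belongs wherever external row values are converted to Spark's internal representation: `BigDecimal` inputs need an explicit conversion step rather than a cast.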
Issue Links
- Is contained by SPARK-6784: Make sure values of partitioning columns are correctly converted based on their data types (Resolved)