Description
In PySpark, _acceptable_types defines the accepted Python types for each Spark SQL data type. The current mapping is:
_acceptable_types = {
    BooleanType: (bool,),
    ByteType: (int, long),
    ShortType: (int, long),
    IntegerType: (int, long),
    LongType: (int, long),
    FloatType: (float,),
    DoubleType: (float,),
    DecimalType: (decimal.Decimal,),
    StringType: (str, unicode),
    TimestampType: (datetime.datetime, datetime.time, datetime.date),
    ArrayType: (list, tuple, array),
    MapType: (dict,),
    StructType: (tuple, list),
}
Let's double-check this mapping before the 1.1 release.
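For reference, a minimal sketch of how such a mapping is used for type verification: given a row value and its declared Spark SQL type, check the value's Python type against the accepted tuple. The class names below are plain strings standing in for the pyspark.sql.types classes, and the table is a Python 3 subset of the one above (long and unicode do not exist in Python 3); verify_acceptable is a hypothetical helper, not PySpark's actual API.

```python
import datetime
import decimal

# Illustrative Python 3 subset of the _acceptable_types table above.
# Keys are strings standing in for the pyspark.sql.types classes.
_acceptable_types = {
    "BooleanType": (bool,),
    "IntegerType": (int,),
    "DoubleType": (float,),
    "DecimalType": (decimal.Decimal,),
    "StringType": (str,),
    "TimestampType": (datetime.datetime,),
    "ArrayType": (list, tuple),
    "MapType": (dict,),
}

def verify_acceptable(spark_type, value):
    """Return True if value's Python type is accepted for spark_type."""
    accepted = _acceptable_types.get(spark_type)
    if accepted is None:
        return True  # types not listed in the table are not checked
    return isinstance(value, accepted)
```

For example, verify_acceptable("IntegerType", 1) passes while verify_acceptable("StringType", 1) fails, which is the kind of mismatch this issue asks us to audit before release.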