Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 3.1.0
- Fix Version/s: None
- Labels: None
Description
Spark SQL and PostgreSQL have quite different default cast behaviors between types. We should make Spark SQL's cast behavior consistent with PostgreSQL when spark.sql.dialect is configured as PostgreSQL.
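As a concrete illustration of the kind of gap the sub-tasks below track, consider casting an unparsable string to boolean: Spark returns NULL by default, while PostgreSQL raises an error. The Scala sketch below shows the default Spark behavior and where the proposed dialect switch would change it; the spark.sql.dialect setting is the one proposed in this issue and is hypothetical here, not a shipped configuration.

```scala
// Minimal sketch of the cast-behavior gap, assuming the proposed
// spark.sql.dialect configuration from this issue (value "postgresql").
import org.apache.spark.sql.SparkSession

object CastDialectSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("pg-cast-dialect-sketch")
      .master("local[*]")
      .getOrCreate()

    // Default Spark behavior: an unparsable string cast to boolean yields NULL.
    spark.sql("SELECT CAST('abc' AS boolean) AS b").show()
    // +----+
    // |   b|
    // +----+
    // |null|
    // +----+

    // PostgreSQL rejects the same cast instead of returning NULL:
    //   postgres=# SELECT CAST('abc' AS boolean);
    //   ERROR:  invalid input syntax for type boolean: "abc"

    // Under the proposed dialect switch, the same query would be expected to
    // raise an error rather than return NULL. The config below is hypothetical.
    spark.sql("SET spark.sql.dialect=postgresql")
    // spark.sql("SELECT CAST('abc' AS boolean)")  // expected: error, not NULL

    spark.stop()
  }
}
```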
Attachments
Issue Links
- relates to: SPARK-27764 Feature Parity between PostgreSQL and Spark (Open)
Sub-Tasks
1. PostgreSQL dialect: cast to boolean | Resolved | wuyi
2. PostgreSQL dialect: cast to timestamp | Open | Unassigned
3. PostgreSQL dialect: cast to integer | Open | Unassigned
4. PostgreSQL dialect: cast to date | Open | Unassigned
5. PostgreSQL dialect: cast to double | Open | Unassigned
6. PostgreSQL dialect: cast to float | Open | Unassigned
7. PostgreSQL dialect: cast to decimal | Open | Unassigned
8. PostgreSQL dialect: cast to char | Closed | Unassigned
9. PostgreSQL dialect: cast to varchar | Open | Unassigned
10. PostgreSQL dialect: cast to bigint | Open | Unassigned
11. PostgreSQL dialect: cast to smallint | Open | Unassigned