Spark / SPARK-26051

Can't create table with column name '22222d'


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Not A Problem
    • Affects Version/s: 2.3.1
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

  I can't create a table with the column name '22222d' using spark-sql. It looks like a SQL parser bug, because creating a table with the column name '22222m' works fine.

      spark-sql> create table t1(22222d int);
      Error in query:
      no viable alternative at input 'create table t1(22222d'(line 1, pos 16)
      
      == SQL ==
      create table t1(22222d int)
      ----------------^^^
      
      spark-sql> create table t1(22222m int);
      18/11/14 09:13:53 INFO HiveMetaStore: 0: get_database: global_temp
      18/11/14 09:13:53 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: global_temp
      18/11/14 09:13:53 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
      18/11/14 09:13:55 INFO HiveMetaStore: 0: get_database: default
      18/11/14 09:13:55 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
      18/11/14 09:13:55 INFO HiveMetaStore: 0: get_database: default
      18/11/14 09:13:55 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
      18/11/14 09:13:55 INFO HiveMetaStore: 0: get_table : db=default tbl=t1
      18/11/14 09:13:55 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=default tbl=t1
      18/11/14 09:13:55 INFO HiveMetaStore: 0: get_database: default
      18/11/14 09:13:55 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
      18/11/14 09:13:55 INFO HiveMetaStore: 0: create_table: Table(tableName:t1, dbName:default, owner:root, createTime:1542158033, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:22222m, type:int, comment:null)], location:file:/opt/UQuery/spark_/spark-2.3.1-bin-hadoop2.7/spark-warehouse/t1, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"22222m","type":"integer","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.create.version=2.3.1}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
      18/11/14 09:13:55 INFO audit: ugi=root ip=unknown-ip-addr cmd=create_table: Table(tableName:t1, dbName:default, owner:root, createTime:1542158033, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:22222m, type:int, comment:null)], location:file:/opt/UQuery/spark_/spark-2.3.1-bin-hadoop2.7/spark-warehouse/t1, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"22222m","type":"integer","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.create.version=2.3.1}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
      18/11/14 09:13:55 WARN HiveMetaStore: Location: file:/opt/UQuery/spark_/spark-2.3.1-bin-hadoop2.7/spark-warehouse/t1 specified for non-external table:t1
      18/11/14 09:13:55 INFO FileUtils: Creating directory if it doesn't exist: file:/opt/UQuery/spark_/spark-2.3.1-bin-hadoop2.7/spark-warehouse/t1
      Time taken: 2.15 seconds
      18/11/14 09:13:56 INFO SparkSQLCLIDriver: Time taken: 2.15 seconds
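
      The likely explanation for the "Not A Problem" resolution: in Spark SQL's grammar, a trailing D on a digit sequence is the double-literal suffix (22222D is the double 22222.0), so the lexer consumes 22222d as a numeric literal rather than an identifier; m is not a literal suffix, which is why 22222m happens to lex as an identifier. Assuming standard Spark SQL backtick quoting for delimited identifiers, the column name can be created either way by quoting it (a sketch, not part of the original report):

      ```sql
      -- Unquoted, 22222d lexes as a double literal (D suffix) and the
      -- CREATE TABLE fails to parse. Backticks force it to be read as
      -- a delimited identifier instead:
      create table t1(`22222d` int);

      -- Referencing the column later also requires backticks:
      select `22222d` from t1;
      ```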


      People

        Assignee: Unassigned
        Reporter: Xie Juntao (xiejuntao1002@163.com)
        Votes: 0
        Watchers: 4
