Spark / SPARK-44111 Prepare Apache Spark 4.0.0 / SPARK-46461

The `sbt console` command is not available


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 4.0.0
    • Fix Version/s: 4.0.0
    • Component/s: Build

    Description

      1. Unable to define expressions after executing the `build/sbt console` command
      scala> val i = 1 // show
      package $line3 {
        sealed class $read extends _root_.scala.Serializable {
          def <init>() = {
            super.<init>;
            ()
          };
          sealed class $iw extends _root_.java.io.Serializable {
            def <init>() = {
              super.<init>;
              ()
            };
            val i = 1
          };
          val $iw = new $iw.<init>
        };
        object $read extends scala.AnyRef {
          def <init>() = {
            super.<init>;
            ()
          };
          val INSTANCE = new $read.<init>
        }
      }
      warning: -target is deprecated: Use -release instead to compile against the correct platform API.
      Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
             ^
             error: expected class or object definition 

      2. Due to the default unused-imports check, "Unused import" errors are reported after executing the `build/sbt sql/console` command (a minimal standalone reproduction is sketched after the log below):

      Welcome to Scala 2.13.12 (OpenJDK 64-Bit Server VM, Java 17.0.9).
      Type in expressions for evaluation. Or try :help.
      warning: -target is deprecated: Use -release instead to compile against the correct platform API.
      Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
             import org.apache.spark.sql.catalyst.errors._
                                                  ^
      On line 6: error: object errors is not a member of package org.apache.spark.sql.catalyst
             import org.apache.spark.sql.catalyst.analysis._
                                                           ^
      On line 4: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.catalyst.dsl._
                                                      ^
      On line 5: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.catalyst.errors._
                                                         ^
      On line 6: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.catalyst.expressions._
                                                              ^
      On line 7: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.catalyst.plans.logical._
                                                                ^
      On line 8: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.catalyst.rules._
                                                        ^
      On line 9: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.catalyst.util._
                                                       ^
      On line 10: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.execution
                                         ^
      On line 11: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.functions._
                                                   ^
      On line 12: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import org.apache.spark.sql.types._
                                               ^
      On line 13: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import sqlContext.implicits._
                                         ^
      On line 17: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
             import sqlContext._
                               ^
      On line 18: error: Unused import
             Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
      
      
      scala>  
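      The failing imports above appear to be injected by the build's console setup (sbt's initialCommands) rather than typed by the user, so they are inevitably "unused" when the REPL starts. A minimal standalone sbt sketch (a hypothetical project, not Spark's build) that should reproduce the same class of failure:

      // build.sbt -- hypothetical standalone project, not Spark's SparkBuild.scala.
      // Imports injected via initialCommands meet an unused-import check that is
      // escalated to an error, so the console fails before accepting any input.
      Compile / console / initialCommands := "import scala.collection.mutable._"
      scalacOptions ++= Seq(
        "-Wunused:imports",            // warn on unused imports
        "-Wconf:cat=unused-imports:e"  // escalate that warning category to an error
      )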

      To avoid these errors, `-Wunused:imports` needs to be deleted from both SparkBuild.scala and pom.xml; the sketch below shows a possible console-only alternative.
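
      As an illustration only (an assumed approach, not necessarily the change that was made), the strict options could instead be filtered out just for the `console` task, leaving regular compilation untouched:

      // Illustrative sbt sketch -- the option names are real scalac/sbt flags, but this
      // is not Spark's actual SparkBuild.scala change. Drop REPL-unfriendly options for
      // the console task only, so normal compilation keeps the strict settings.
      Compile / console / scalacOptions := (Compile / scalacOptions).value.filterNot { opt =>
        opt == "-Wunused:imports" ||       // the console's injected imports are "unused" by design
        opt.startsWith("-Wconf") ||        // -Wconf rules that escalate warnings to errors
        opt == "-Werror" || opt == "-Xfatal-warnings"
      }

      Scoping the filter to `Compile / console / scalacOptions` keeps the unused-import check active for production sources while relaxing it for the REPL.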


            People

              Assignee: Yang Jie (LuciferYang)
              Reporter: Yang Jie (LuciferYang)
              Votes: 0
              Watchers: 2
