FLINK-14955

Not able to write to swift via StreamingFileSink.forBulkFormat


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 1.8.1, 1.9.1
    • Fix Version/s: None
    • Component/s: None

    Description

      Not able to use StreamingFileSink to write to Swift file storage.

       

      Code:

      Flink version: 1.9.1 (tried with 1.8.1 as well, same exception)

      Scala 2.11

      Build tool: Maven

      Main part of the code:

      import org.apache.flink.api.common.typeinfo.TypeInformation
      import org.apache.flink.core.fs.Path
      import org.apache.flink.formats.parquet.ParquetWriterFactory
      import org.apache.flink.formats.parquet.avro.ParquetAvroWriters
      import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink
      import org.apache.flink.streaming.api.scala._

      // env is the StreamExecutionEnvironment; EligibleItem is a simple case class;
      // capHadoopPath holds the sink URI (see the scenario below).
      val eligibleItems: DataStream[EligibleItem] = env.fromCollection(Seq(
        EligibleItem("pencil"),
        EligibleItem("rubber"),
        EligibleItem("beer")))(TypeInformation.of(classOf[EligibleItem]))

      // Bulk-format Parquet writer derived from the case class via reflection.
      val factory2: ParquetWriterFactory[EligibleItem] = ParquetAvroWriters.forReflectRecord(classOf[EligibleItem])
      val sink: StreamingFileSink[EligibleItem] = StreamingFileSink
        .forBulkFormat(new Path(capHadoopPath), factory2)
        .build()

      eligibleItems.addSink(sink)
        .setParallelism(1)
        .uid("TEST_1")
        .name("TEST")

      Scenario: when the path is set to point to Swift (capHadoopPath = "swift://<path>"), the job fails with:

      java.lang.UnsupportedOperationException: Recoverable writers on Hadoop are only supported for HDFS and for Hadoop version 2.7 or newer
          at org.apache.flink.fs.openstackhadoop.shaded.org.apache.flink.runtime.fs.hdfs.HadoopRecoverableWriter.<init>(HadoopRecoverableWriter.java:57)
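      For context, the trace points at the constructor of Flink's HadoopRecoverableWriter (HadoopRecoverableWriter.java:57): as of 1.9 it eagerly rejects any Hadoop FileSystem whose scheme is not "hdfs" (or whose Hadoop version is below 2.7), because StreamingFileSink relies on a RecoverableWriter for its exactly-once commit protocol and the Swift filesystem, which goes through that Hadoop code path, does not provide one. As a sanity check, the same pipeline should run against a file system that does ship a recoverable writer; a minimal sketch, assuming a hypothetical local output directory (Flink's LocalFileSystem supplies a RecoverableWriter):

      // Hypothetical sanity check: the identical sink built over a local
      // file:// path (illustrative path, not from the report) should work,
      // since LocalFileSystem provides a RecoverableWriter; only the
      // swift:// scheme is rejected by HadoopRecoverableWriter.
      val localSink: StreamingFileSink[EligibleItem] = StreamingFileSink
        .forBulkFormat(new Path("file:///tmp/eligible-items"), factory2)
        .build()

      eligibleItems.addSink(localSink).setParallelism(1)

      If that variant succeeds, it narrows the failure to the swift:// scheme rather than to the Parquet bulk writer itself.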

       

       

      Attachments

        1. pom.xml (211 kB, uploaded by Simya Jose)


      People

        Assignee: Unassigned
        Reporter: Simya Jose (simya.jose)
        Votes: 0
        Watchers: 2
