Hadoop Common / HADOOP-13377: AliyunOSS: improvements for stabilization and optimization / HADOOP-15063

IOException may be thrown when reading from Aliyun OSS in some cases


    Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 3.0.0-alpha2, 3.0.0-beta1
    • Fix Version/s: None
    • Component/s: fs/oss
    • Labels: None

      Description

      An IOException will be thrown in the following case:
      1. set part size n = 102400
      2. assume the current position = 0, so partRemaining = 102400
      3. call seek(pos = 101802); since pos > position && pos < position + partRemaining, the stream skips pos - position bytes in the wrapped stream, but partRemaining is left unchanged
      4. if we then read more than n - pos bytes, the wrapped stream is exhausted while partRemaining still claims data is left, and an IOException is thrown (see the sketch after this list)
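
      The following standalone sketch (hypothetical local variables, not the actual AliyunOSSInputStream fields) walks through the numbers above: after the in-part seek the wrapped stream holds only n - pos = 598 bytes, yet partRemaining still reports a full part, so a larger read over-runs the stream.

      public class SeekAccountingDemo {
        public static void main(String[] args) {
          long n = 102400L;            // part size
          long position = 0L;
          long partRemaining = n;      // bytes the input stream *thinks* are left in the part
          long wrappedRemaining = n;   // bytes actually left in the wrapped HTTP stream

          long pos = 101802L;                    // seek target inside the current part
          wrappedRemaining -= (pos - position);  // skipFully() consumes 101802 bytes -> 598 left
          position = pos;                        // bug: partRemaining is not reduced as well

          int len = 1024;  // a subsequent 1024-byte read
          System.out.println("accounting allows the read: " + (len <= partRemaining));       // true
          System.out.println("wrapped stream can satisfy it: " + (len <= wrappedRemaining)); // false -> IOException in practice
        }
      }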

      Current code:

      @Override
      public synchronized void seek(long pos) throws IOException {
        checkNotClosed();
        if (position == pos) {
          return;
        } else if (pos > position && pos < position + partRemaining) {
          AliyunOSSUtils.skipFully(wrappedStream, pos - position);
          // we need to update partRemaining here
          position = pos;
        } else {
          reopen(pos);
        }
      }
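
      This issue was resolved as a duplicate, so the snippet below is only a sketch of the adjustment the comment above suggests (decrementing partRemaining by the skipped byte count), not the committed fix:

      @Override
      public synchronized void seek(long pos) throws IOException {
        checkNotClosed();
        if (position == pos) {
          return;
        } else if (pos > position && pos < position + partRemaining) {
          long skipBytes = pos - position;
          AliyunOSSUtils.skipFully(wrappedStream, skipBytes);
          position = pos;
          partRemaining -= skipBytes;  // keep the part accounting in sync with the wrapped stream
        } else {
          reopen(pos);
        }
      }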
      

      Logs:
      java.io.IOException: Failed to read from stream. Remaining:101802
        at org.apache.hadoop.fs.aliyun.oss.AliyunOSSInputStream.read(AliyunOSSInputStream.java:182)
        at org.apache.hadoop.fs.FSInputStream.read(FSInputStream.java:75)
        at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:92)

      How to reproduce:
      1. create a file with 10 MB size
      2. run the following positional-read loop against it (size is the file length in bytes, instream is an FSDataInputStream opened on the file):

      int seekTimes = 150;
      for (int i = 0; i < seekTimes; i++) {
        long pos = size / (seekTimes - i) - 1;
        LOG.info("begin seeking for pos: " + pos);
        byte[] buf = new byte[1024];
        instream.read(pos, buf, 0, 1024);
      }
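
      A more self-contained version of the reproduction, assuming an Aliyun OSS filesystem is already configured (the bucket name and path below are placeholders):

      import java.net.URI;
      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.fs.FSDataInputStream;
      import org.apache.hadoop.fs.FSDataOutputStream;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;

      public class OssSeekReadRepro {
        public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();  // assumes fs.oss.* credentials are set
          FileSystem fs = FileSystem.get(URI.create("oss://my-bucket/"), conf);  // placeholder bucket
          Path file = new Path("/tmp/seek-read-repro");

          long size = 10L * 1024 * 1024;  // 1. create a file with 10 MB size
          try (FSDataOutputStream out = fs.create(file, true)) {
            out.write(new byte[(int) size]);
          }

          int seekTimes = 150;
          byte[] buf = new byte[1024];
          try (FSDataInputStream instream = fs.open(file)) {
            for (int i = 0; i < seekTimes; i++) {
              long pos = size / (seekTimes - i) - 1;
              // positional read: seeks within the current part, then reads 1024 bytes
              instream.read(pos, buf, 0, 1024);
            }
          }
        }
      }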
      

        Attachments

        1. HADOOP-15063.001.patch (3 kB, wujinhu)

            People

            • Assignee: wujinhu
            • Reporter: wujinhu
            • Votes: 0
            • Watchers: 2
