Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Implemented
    • Affects Version/s: 1.3.0
    • Fix Version/s: 1.3.0, 1.4.0
    • Component/s: Documentation
    • Labels: None

      Description

      I want to propose small changes to several pages to fix some typos and grammatical flaws.

          Activity

          greghogan Greg Hogan added a comment -

          master: 71d76731dc6f611a6f8772cb06a59f5c642ec6cc
          release-1.3: 81f58bae1a6730e9e6e06185e1dc6cac36e82892

          githubbot ASF GitHub Bot added a comment -

          Github user asfgit closed the pull request at:

          https://github.com/apache/flink/pull/3858

          githubbot ASF GitHub Bot added a comment -

          Github user greghogan commented on the issue:

          https://github.com/apache/flink/pull/3858

          Merging ...

          githubbot ASF GitHub Bot added a comment -

          Github user alpinegizmo commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3858#discussion_r115699196

          — Diff: docs/dev/best_practices.md —
          @@ -284,30 +284,30 @@ Change your projects `pom.xml` file like this:

          The following changes were done in the `<dependencies>` section:

          - * Exclude all `log4j` dependencies from all Flink dependencies: This causes Maven to ignore Flink's transitive dependencies to log4j.
          - * Exclude the `slf4j-log4j12` artifact from Flink's dependencies: Since we are going to use the slf4j to logback binding, we have to remove the slf4j to log4j binding.
          + * Exclude all `log4j` dependencies from all Flink dependencies: this causes Maven to ignore Flink's transitive dependencies to log4j.
          + * Exclude the `slf4j-log4j12` artifact from Flink's dependencies: since we are going to use the slf4j to logback binding, we have to remove the slf4j to log4j binding.
            * Add the Logback dependencies: `logback-core` and `logback-classic`
          - * Add dependencies for `log4j-over-slf4j`. `log4j-over-slf4j` is a tool which allows legacy applications which are directly using the Log4j APIs to use the Slf4j interface. Flink depends on Hadoop which is directly using Log4j for logging. Therefore, we need to redirect all logger calls from Log4j to Slf4j which is in turn logging to Logback.
          + * Add dependencies for `log4j-over-slf4j`. `log4j-over-slf4j` is a tool which allows legacy applications which are directly using the Log4j APIs to use the Slf4j interface. Flink depends on Hadoop which is directly using Log4j for logging. Therefore we need to redirect all logger calls from Log4j to Slf4j which is in turn logging to Logback.
          — End diff —

          That was a mistake.
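
          The dependency changes discussed in this diff could be sketched as a `pom.xml` fragment. This is an illustration only, not the exact snippet from `best_practices.md`; the artifact chosen (`flink-java`) and the version numbers are assumptions:

          ```xml
          <dependencies>
            <!-- Exclude Flink's transitive log4j dependencies and the
                 slf4j-to-log4j binding, since Logback will be the backend. -->
            <dependency>
              <groupId>org.apache.flink</groupId>
              <artifactId>flink-java</artifactId>
              <version>1.3.0</version>
              <exclusions>
                <exclusion>
                  <groupId>log4j</groupId>
                  <artifactId>*</artifactId>
                </exclusion>
                <exclusion>
                  <groupId>org.slf4j</groupId>
                  <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
              </exclusions>
            </dependency>
            <!-- Logback as the logging backend. -->
            <dependency>
              <groupId>ch.qos.logback</groupId>
              <artifactId>logback-core</artifactId>
              <version>1.2.3</version>
            </dependency>
            <dependency>
              <groupId>ch.qos.logback</groupId>
              <artifactId>logback-classic</artifactId>
              <version>1.2.3</version>
            </dependency>
            <!-- Redirect legacy Log4j calls (e.g. from Hadoop) to slf4j,
                 which in turn logs to Logback. -->
            <dependency>
              <groupId>org.slf4j</groupId>
              <artifactId>log4j-over-slf4j</artifactId>
              <version>1.7.25</version>
            </dependency>
          </dependencies>
          ```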

          githubbot ASF GitHub Bot added a comment -

          Github user greghogan commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3858#discussion_r115543214

          — Diff: docs/dev/best_practices.md —
          @@ -284,30 +284,30 @@ Change your projects `pom.xml` file like this:

          The following changes were done in the `<dependencies>` section:

          - * Exclude all `log4j` dependencies from all Flink dependencies: This causes Maven to ignore Flink's transitive dependencies to log4j.
          - * Exclude the `slf4j-log4j12` artifact from Flink's dependencies: Since we are going to use the slf4j to logback binding, we have to remove the slf4j to log4j binding.
          + * Exclude all `log4j` dependencies from all Flink dependencies: this causes Maven to ignore Flink's transitive dependencies to log4j.
          + * Exclude the `slf4j-log4j12` artifact from Flink's dependencies: since we are going to use the slf4j to logback binding, we have to remove the slf4j to log4j binding.
            * Add the Logback dependencies: `logback-core` and `logback-classic`
          - * Add dependencies for `log4j-over-slf4j`. `log4j-over-slf4j` is a tool which allows legacy applications which are directly using the Log4j APIs to use the Slf4j interface. Flink depends on Hadoop which is directly using Log4j for logging. Therefore, we need to redirect all logger calls from Log4j to Slf4j which is in turn logging to Logback.
          + * Add dependencies for `log4j-over-slf4j`. `log4j-over-slf4j` is a tool which allows legacy applications which are directly using the Log4j APIs to use the Slf4j interface. Flink depends on Hadoop which is directly using Log4j for logging. Therefore we need to redirect all logger calls from Log4j to Slf4j which is in turn logging to Logback.
          — End diff —

          Why remove the comma?

          githubbot ASF GitHub Bot added a comment -

          Github user greghogan commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3858#discussion_r115541319

          — Diff: docs/dev/best_practices.md —
          @@ -31,13 +31,13 @@ This page contains a collection of best practices for Flink programmers on how t

            1. Parsing command line arguments and passing them around in your Flink application

          -Almost all Flink applications, both batch and streaming rely on external configuration parameters.
          -For example for specifying input and output sources (like paths or addresses), also system parameters (parallelism, runtime configuration) and application specific parameters (often used within the user functions).
          +Almost all Flink applications, both batch and streaming, rely on external configuration parameters.
          +For example, they are used to specify input and output sources (like paths or addresses), system parameters (parallelism, runtime configuration), and application specific parameters (typically used within user functions).
          — End diff –

          Do we need to say "for example"?

          githubbot ASF GitHub Bot added a comment -

          Github user greghogan commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3858#discussion_r115544603

          — Diff: docs/dev/stream/checkpointing.md —
          @@ -124,8 +124,8 @@ env.getCheckpointConfig.setMaxConcurrentCheckpoints(1)

            1. Selecting a State Backend

          -The checkpointing mechanism stores the progress in the data sources and data sinks, the state of windows, as well as the [user-defined state](state.html) consistently to
          -provide exactly once processing semantics. Where the checkpoints are stored (e.g., JobManager memory, file system, database) depends on the configured
          +The checkpointing mechanism stores consistent snapshots of the progress in the data sources and data sinks, the state of windows and timers, as well as any [user-defined state](state.html), thereby
          — End diff –

          Perhaps this first sentence could use a little more work. Can we better explain "progress"? "as well as" -> "and"? Could we move "providing exactly once processing semantics" to the beginning: "The checkpointing mechanism provides ..."?
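
          The passage under review is documenting checkpointing plus a configurable state backend. As a sketch of the API being described (Flink 1.3-era names, shown in Java; the HDFS path is a placeholder, not taken from the docs):

          ```java
          // Sketch: enable checkpointing and choose where checkpoints are stored.
          StreamExecutionEnvironment env =
                  StreamExecutionEnvironment.getExecutionEnvironment();

          env.enableCheckpointing(60_000); // checkpoint every 60 seconds
          env.getCheckpointConfig().setCheckpointingMode(CheckpointingMode.EXACTLY_ONCE);
          env.getCheckpointConfig().setMaxConcurrentCheckpoints(1);

          // Where checkpoints go (JobManager memory, file system, ...) depends on
          // the configured state backend; here, a file-system backend on HDFS.
          env.setStateBackend(new FsStateBackend("hdfs:///flink/checkpoints"));
          ```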

          githubbot ASF GitHub Bot added a comment -

          Github user greghogan commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3858#discussion_r115538538

          — Diff: docs/dev/best_practices.md —
          @@ -31,13 +31,13 @@ This page contains a collection of best practices for Flink programmers on how t

            1. Parsing command line arguments and passing them around in your Flink application

          -Almost all Flink applications, both batch and streaming rely on external configuration parameters.
          -For example for specifying input and output sources (like paths or addresses), also system parameters (parallelism, runtime configuration) and application specific parameters (often used within the user functions).
          +Almost all Flink applications, both batch and streaming, rely on external configuration parameters.
          +For example, they are used to specify input and output sources (like paths or addresses), system parameters (parallelism, runtime configuration), and application specific parameters (typically used within user functions).

          Since version 0.9 we are providing a simple utility called `ParameterTool` to provide at least some basic tooling for solving these problems.
          — End diff –

          I expect it has been long enough to remove this notice.
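
          The pattern `ParameterTool` supports (parse `--key value` arguments once, then pass the parameters around the application) can be illustrated with a minimal self-contained sketch. The `fromArgs` method below is a simplified stand-in written for this example, not Flink's actual `ParameterTool` implementation:

          ```java
          import java.util.HashMap;
          import java.util.Map;

          public class ArgsSketch {
              // Simplified stand-in for ParameterTool.fromArgs:
              // collects "--key value" pairs into a map.
              static Map<String, String> fromArgs(String[] args) {
                  Map<String, String> params = new HashMap<>();
                  for (int i = 0; i + 1 < args.length; i += 2) {
                      if (args[i].startsWith("--")) {
                          params.put(args[i].substring(2), args[i + 1]);
                      }
                  }
                  return params;
              }

              public static void main(String[] args) {
                  Map<String, String> params = fromArgs(
                          new String[] {"--input", "hdfs:///in", "--parallelism", "4"});
                  // The parsed map is then passed into user functions
                  // instead of threading raw args through the program.
                  System.out.println(params.get("input"));       // hdfs:///in
                  System.out.println(params.get("parallelism")); // 4
              }
          }
          ```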

          githubbot ASF GitHub Bot added a comment -

          Github user zentol commented on the issue:

          https://github.com/apache/flink/pull/3858

          +1 to merge.

          githubbot ASF GitHub Bot added a comment -

          GitHub user alpinegizmo opened a pull request:

          https://github.com/apache/flink/pull/3858

          FLINK-6513 [docs] cleaned up some typos and grammatical flaws

          Thanks for contributing to Apache Flink. Before you open your pull request, please take the following check list into consideration.
          If your changes take all of the items into account, feel free to open your pull request. For more information and/or questions please refer to the [How To Contribute guide](http://flink.apache.org/how-to-contribute.html).
          In addition to going through the list, please provide a meaningful description of your changes.

          • [x] General
          • The pull request references the related JIRA issue ("[FLINK-XXX] Jira title text")
          • The pull request addresses only one issue
          • Each commit in the PR has a meaningful commit message (including the JIRA id)

          You can merge this pull request into a Git repository by running:

          $ git pull https://github.com/alpinegizmo/flink 6513-small-changes-to-docs

          Alternatively you can review and apply these changes as the patch at:

          https://github.com/apache/flink/pull/3858.patch

          To close this pull request, make a commit to your master/trunk branch
          with (at least) the following in the commit message:

          This closes #3858


          commit e3f712c7de767bb2464f52b6d19c604a5b123135
          Author: David Anderson <david@alpinegizmo.com>
          Date: 2017-05-09T14:50:53Z

          FLINK-6513 [docs] cleaned up some typos and grammatical flaws



            People

            • Assignee: Unassigned
            • Reporter: alpinegizmo David Anderson
            • Votes: 0
            • Watchers: 3
