Guacamole / GUACAMOLE-250

Implement support for in-browser playback of screen recordings

    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Minor
    • Resolution: Done
    • Affects Version/s: None
    • Fix Version/s: 0.9.13-incubating
    • Component/s: None
    • Labels: None

      Description

      Screen recordings of Guacamole sessions are simply dumps of the Guacamole protocol data from one side of the connection. Currently, these recordings are played back after being converted to video using guacenc, but they could just as easily be played back directly by Guacamole itself. Direct playback would remove the need to translate recordings to video (an expensive operation) and would allow playback within any browser which Guacamole already supports.

      Since the Guacamole.Client object accepts any implementation of the Guacamole.Tunnel interface, implementing some sort of "playback tunnel" which would parse Guacamole protocol from a static file (rather than connect through a Guacamole server tunnel) might be easy.
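
      As an illustration of how small such a tunnel could be, the following is a minimal sketch, assuming only the documented Guacamole.Tunnel shape from guacamole-common-js (connect()/disconnect()/sendMessage() plus an oninstruction(opcode, args) callback) and a recording already loaded as a string; StaticRecordingTunnel and its internals are hypothetical names, not part of the library:

      // Minimal sketch of a "playback tunnel": an object with the same shape as
      // Guacamole.Tunnel which, instead of connecting anywhere, parses a raw
      // Guacamole protocol dump ("LENGTH.VALUE,LENGTH.VALUE,...;") and replays
      // each instruction through the standard oninstruction callback.
      function StaticRecordingTunnel(recordingText) {

          var tunnel = this;

          // Callback, as on a normal Guacamole.Tunnel
          this.oninstruction = null;

          // Parse the raw protocol text into an array of
          // [opcode, arg, arg, ...] instructions
          var parseInstructions = function parseInstructions(text) {

              var instructions = [];
              var elements = [];
              var offset = 0;

              while (offset < text.length) {

                  // Element length is the run of digits before the "."
                  var lengthEnd = text.indexOf('.', offset);

                  // Stop at truncated / trailing data
                  if (lengthEnd === -1)
                      break;

                  var length = parseInt(text.substring(offset, lengthEnd), 10);

                  // Element value follows the ".", terminated by "," or ";"
                  var value = text.substr(lengthEnd + 1, length);
                  var terminator = text.charAt(lengthEnd + 1 + length);
                  elements.push(value);

                  // ";" ends the current instruction
                  if (terminator === ';') {
                      instructions.push(elements);
                      elements = [];
                  }

                  offset = lengthEnd + 2 + length;

              }

              return instructions;

          };

          // "Connecting" simply replays every parsed instruction, exactly as a
          // live tunnel would deliver them
          this.connect = function connect() {
              parseInstructions(recordingText).forEach(function replay(instruction) {
                  if (tunnel.oninstruction)
                      tunnel.oninstruction(instruction[0], instruction.slice(1));
              });
          };

          // There is no server connection, so these are no-ops
          this.disconnect = function disconnect() {};
          this.sendMessage = function sendMessage() {};

      }

      A Guacamole.Client constructed around such a tunnel (new Guacamole.Client(new StaticRecordingTunnel(text))) would then render the recording just as it renders a live connection; timing and seeking are a separate concern, discussed in the comments below.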


          Activity

          Michael Jumper added a comment:

          Given the above, probably better to use Guacamole.Recording as both the object representing the recording and the object producing the recording from instructions provided via a tunnel, rather than Guacamole.Player (which is overly specific).
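
          Purely to illustrate that dual role (the names and methods below are hypothetical, not a committed API), such a Guacamole.Recording might both consume instructions from any tunnel and later replay them:

          // Hypothetical sketch only: a Recording that is produced from
          // instructions provided via an arbitrary Guacamole.Tunnel, and that can
          // later feed those instructions to a tunnel-like sink for playback.
          function Recording(sourceTunnel) {

              // Instructions captured so far, in order of receipt
              var instructions = [];

              // Produce the recording from whatever the source tunnel provides
              sourceTunnel.oninstruction = function record(opcode, args) {
                  instructions.push([opcode].concat(args));
              };

              // Replay the captured instructions into another oninstruction
              // handler (for example, the internal tunnel of a playback client)
              this.replay = function replay(sink) {
                  instructions.forEach(function feed(instruction) {
                      if (sink.oninstruction)
                          sink.oninstruction(instruction[0], instruction.slice(1));
                  });
              };

          }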

          Michael Jumper added a comment:

          When building the array of syncs (during processing of the received video), could represent each sync as a "frame" object, with that frame either being incremental (a set of instructions between itself and the previous frame) or absolute (a snapshot of client state).

          This would allow the transport receiving instructions to be abstracted away (no need to use byte offsets, nor any need to maintain the original XHR as a buffer), such that any tunnel implementation could be used as a source of instructions. Besides being cleaner and easier to implement, this would allow things like client-side session recording (or even DVR-like rewinding of an active session) to be done using the same core pieces.
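
          In code, such a frame might be represented along these lines (field names are illustrative only):

          // Illustrative only: one frame per sync instruction, either incremental
          // (carries the instructions since the previous sync) or absolute
          // (carries a snapshot of client state)
          function Frame(timestamp, instructions, state) {

              // Timestamp of the sync instruction that ended this frame
              this.timestamp = timestamp;

              // Incremental frame: instructions between the previous sync and
              // this one (empty for an absolute frame)
              this.instructions = instructions || [];

              // Absolute frame: exported client state (null for an incremental
              // frame)
              this.state = state || null;

          }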

          Michael Jumper added a comment:

          Can split this up into roughly four stages:

          1. Core support for importing/exporting client state.
          2. Skeleton player which just downloads the stream and generates keyframes using state export, while recording the byte locations of sync instructions within the stream on a per-timestamp basis (for later reference during seeking).
          3. Player plays back in realtime via a visible client while keyframes are being generated. Support for start/pause.
          4. Seek function which pulls the appropriate keyframe and begins playback at the desired point (after replaying additional instructions as necessary). This would need to do a binary search through the array of syncs to determine the appropriate keyframe and byte starting point (a minimal sketch of that search follows this list).
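
          A minimal sketch of that binary search, assuming the syncs array is sorted by timestamp and each entry carries a timestamp property in milliseconds:

          // Returns the index of the latest sync whose timestamp does not exceed
          // the requested playback position, or -1 if the position precedes the
          // first sync. Assumes syncs is sorted by timestamp, ascending.
          function findSync(syncs, position) {

              var low = 0;
              var high = syncs.length - 1;
              var best = -1;

              while (low <= high) {

                  var mid = (low + high) >>> 1;

                  // Candidate found; keep looking for a later one
                  if (syncs[mid].timestamp <= position) {
                      best = mid;
                      low = mid + 1;
                  }

                  // Too late; search earlier entries
                  else
                      high = mid - 1;

              }

              return best;

          }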
          Michael Jumper added a comment:

          Looking into what seeking requires (automatic generation of keyframes from recordings): we can't just take raw snapshots of client state, since uncompressed state would be enormous (100 keyframes could easily approach 1 GB for large screens). Compressing the state would work, though (i.e. saving layers and buffers as PNG data). I've verified with a basic POC that encoding PNGs for the visible layers at regular intervals, while playing back a recording as fast as possible, does not add too much overhead.
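
          The POC boiled down to something like the following, assuming each visible layer's backing canvas is reachable through a getCanvas()-style accessor (compressLayer and the returned object shape are illustrative; toDataURL() is standard canvas API):

          // Sketch of the keyframe-compression idea: instead of keeping raw pixel
          // data, encode a layer's backing canvas as a PNG data URL.
          function compressLayer(layer) {

              var canvas = layer.getCanvas();

              return {

                  // Dimensions are needed to restore the layer later
                  width  : canvas.width,
                  height : canvas.height,

                  // PNG-encoded contents, typically far smaller than raw RGBA
                  url : canvas.toDataURL('image/png')

              };

          }

          Restoring a keyframe is the reverse: draw each stored data URL back onto the corresponding layer once the image has loaded.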

          Overall approach:

          1. Add support to Guacamole.Client for (1) exporting its current state as an object and (2) importing its state from an object. This object will form the foundation of Guacamole's auto-generated keyframes, and will need to persist the contents of its layers in a compressed form (PNG).
          2. Implement a new object, Guacamole.Player, which downloads static files using normal XMLHttpRequest, feeding its own internal Guacamole.Client through an internal implementation of Guacamole.Tunnel. This particular client instance runs solely for the purpose of generating keyframes and decodes the stream as quickly as possible. Simultaneously, another Guacamole.Client (also contained within Guacamole.Player) will be used for playback, and will be fed data in realtime (delayed as necessary to match the timing of the sync instructions in the stream) through yet another internal Guacamole.Tunnel. When seeking needs to occur, the keyframe nearest the desired point in time will be used to reinitialize the playback client, instructions after the keyframe and up to the desired point in time will be fed to the playback client, and normally-timed playback of the remaining instructions will then resume. (A rough sketch of the keyframe/seek flow follows this list.)
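
          A rough sketch of that keyframe/seek flow, assuming the proposed exportState()/importState() additions from point 1 and treating every other name (recordKeyframe, seek, feedInstruction) as illustrative:

          // During the fast, keyframe-generating decode: snapshot client state at
          // each sync (exportState() is the proposed Guacamole.Client addition)
          function recordKeyframe(keyframes, keyframeClient, timestamp) {
              keyframes.push({
                  timestamp : timestamp,
                  state     : keyframeClient.exportState()
              });
          }

          // During seeking: restore the nearest earlier keyframe, then replay the
          // instructions between that keyframe and the requested position.
          // "instructions" is an array of { timestamp, opcode, args } in stream
          // order; feedInstruction() stands in for however data is actually
          // pushed through the playback client's internal tunnel.
          function seek(playbackClient, keyframes, instructions, position) {

              // Latest keyframe at or before the requested position (a binary
              // search, as in the earlier comment, would do; a linear scan is
              // shown for brevity, and a keyframe is assumed to exist at the very
              // start of the recording)
              var keyframe = keyframes[0];
              keyframes.forEach(function latest(candidate) {
                  if (candidate.timestamp <= position)
                      keyframe = candidate;
              });

              // importState() (the other proposed addition) reinitializes the
              // playback client from the keyframe
              playbackClient.importState(keyframe.state);

              // Catch up, as fast as possible, from the keyframe to the
              // requested position
              instructions.forEach(function feed(instruction) {
                  if (instruction.timestamp > keyframe.timestamp
                          && instruction.timestamp <= position)
                      playbackClient.feedInstruction(instruction);
              });

              // The remaining instructions then play back with normal timing,
              // delayed to match the sync timestamps in the stream (not shown)

          }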
          Michael Jumper added a comment:

          Copying downstream GUAC-1585, which seems to cover the original request and then some.

          Michael Jumper added a comment:

          I used "guacenc" to convert guac protocol recording file to m4v, but after converted, the target m4v file is too large!!

          There are other reasons support for direct playback of recordings might be desirable, including removing the need to process the recording at all. Leveraging the existing client object, this playback could be done in browser and without reimplementing things.


            People

            • Assignee: Michael Jumper
            • Reporter: yalinliu
            • Votes: 0
            • Watchers: 2

