Game Streaming on Tizen TV with Wasm

This article provides an overview of the most important steps in using Samsung Wasm Extension APIs to create a game streaming application for Tizen TVs.

Overview

Developing a game streaming application requires tremendous effort and involves many different technologies on both the server and the client side. This article shows how to use Samsung-specific WebAssembly APIs to deploy an existing game streaming solution as a Tizen TV Web application.

As an example, Moonlight for ChromeOS, a third-party open source NVIDIA GameStream client, has been ported and published on github.com.

As an implementation of the GameStream protocol, Moonlight requires access to raw TCP and UDP sockets and a way to play Elementary Stream packets, i.e.:

  • H264/H265 NAL Units with either I- or P- frames
  • Opus packets

Currently (as of 2021) no standard Web API provides access to raw TCP and UDP sockets, so the Tizen Sockets Extension has been developed to fill this gap. Secondly, as TVs are embedded devices without enough CPU power to smoothly decode H264/H265 video in software, the Tizen WASM Player has been provided, which allows the use of specialized hardware decoders for video decoding and playback.

Adapting Moonlight required changes related to the following areas:

  • Changes to the networking code in the moonlight-common-c and enet submodules,
  • Using Wasm Player APIs to decode and play received multimedia data,
  • Changing the NaCl module to a WebAssembly one.

Each of these aspects is described in the following sections.

Adapting moonlight-chrome to run on Tizen TV

Adapting networking code

The major differences between the POSIX APIs and the Samsung Sockets Extension that were addressed during porting are:

  • Differences in errno constants: __WASI_ERRNO_AGAIN is used instead of EAGAIN,
  • Lack of the fcntl() call: to create a non-blocking socket, the SOCK_NONBLOCK flag is passed to socket() during socket creation.
  • As the POSIX sockets API is synchronous, exposing it to the JavaScript main thread would block various JavaScript activities, including page rendering and garbage collection. This is why these APIs are restricted to side (Worker) threads.
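The first two differences can be sketched as follows (the helper names are hypothetical and not part of the extension; the `__wasi__` guard lets the sketch also build with a regular POSIX toolchain):

```cpp
#include <sys/socket.h>
#include <netinet/in.h>
#include <cerrno>
#include <unistd.h>

// With no fcntl() available, the SOCK_NONBLOCK flag is passed
// straight to socket() to create a non-blocking socket.
int CreateNonBlockingUdpSocket() {
  return socket(AF_INET, SOCK_DGRAM | SOCK_NONBLOCK, IPPROTO_UDP);
}

// The "operation would block" errno constant differs between toolchains.
bool WouldBlock(int err) {
#ifdef __wasi__
  return err == __WASI_ERRNO_AGAIN;
#else
  return err == EAGAIN || err == EWOULDBLOCK;
#endif
}
```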

The full list of changes can be found here:

More details regarding the Tizen Sockets Extension can be found in the article:

Using Tizen Wasm Player APIs

The multimedia data which Moonlight receives from the server are Elementary Stream packets, i.e.:

  • H264/H265 NAL Units with either I- or P- frames
  • Opus packets

so their format matches what is expected by the Tizen Wasm Player.
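For H264/H265, these NAL units arrive in Annex B byte-stream format, where each unit is prefixed with a 4-byte (00 00 00 01) or 3-byte (00 00 01) start code. A minimal, hypothetical sanity check for such packets:

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical validation helper: Annex B elementary stream packets
// begin with a 00 00 00 01 or 00 00 01 start code.
bool HasAnnexBStartCode(const uint8_t* data, size_t len) {
  if (len >= 4 && data[0] == 0 && data[1] == 0 && data[2] == 0 && data[3] == 1)
    return true;
  return len >= 3 && data[0] == 0 && data[1] == 0 && data[2] == 1;
}
```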

Player initialization

The steps required to play content with the configuration mentioned above at minimal latency are:

  1. Create an HTMLMediaElement object. In Moonlight there is an m_MediaElement field in the MoonlightInstance class, and nacl_module is passed to its constructor. The nacl_module is the id of a <video> element.

  2. Create an ElementaryMediaStreamSource object. In Moonlight there is an m_Source field in the MoonlightInstance class, and the following arguments are passed to its constructor:

    • kUltraLow as ElementaryMediaStreamSource::LatencyMode. For game streaming it is necessary to use the kLow or kUltraLow latency mode. Otherwise the Tizen Platform will perform content buffering suitable for VOD scenarios, which would increase latency far beyond what is acceptable for game streaming.
    • kMediaElement as ElementaryMediaStreamSource::RenderingMode. This informs the Tizen Platform that it will do the decoding and that the content will be rendered on the associated HTMLMediaElement. This mode is advised as it offers lower latency than the Video Texture rendering mode.
  3. Create needed listeners. In moonlight-chrome the following three listeners are required:

    • SourceListener - MoonlightInstance::m_SourceListener
    • AudioTrackListener - MoonlightInstance::m_AudioTrackListener
    • VideoTrackListener - MoonlightInstance::m_VideoTrackListener
  4. Attach source listener to the source:

    m_Source.SetListener(&m_SourceListener);

  5. Initialize media pipeline. An example of the pipeline initialization sequence can be found in MoonlightInstance::StartupVidDecSetup:

    1. Attach ElementaryMediaStreamSource to HTMLMediaElement:

      g_Instance->m_MediaElement.SetSrc(&g_Instance->m_Source);
      
    2. Create audio track and attach listener:

      auto add_track_result = g_Instance->m_Source.AddTrack(
          samsung::wasm::ElementaryAudioTrackConfig {
             "audio/webm; codecs=\"pcm\"",  // mimeType
             {},  // extradata (empty)
             samsung::wasm::SampleFormat::kS16,
             samsung::wasm::ChannelLayout::kStereo,
             kSampleRate
          });
      if (add_track_result) {
        g_Instance->m_AudioTrack = std::move(*add_track_result);
        g_Instance->m_AudioTrack.SetListener(&g_Instance->m_AudioTrackListener);
      }
      

      On some older Tizen TV models there was a problem with H/W decoding of Opus content in low-latency modes, so audio decoding was done in software.

    3. Create video track and attach listener:

      auto add_track_result = g_Instance->m_Source.AddTrack(
          samsung::wasm::ElementaryVideoTrackConfig{
            "video/mp4; codecs=\"hev1.1.6.L93.B0\"",  // h265 mimeType
            {},                                   // extradata (empty)
            static_cast<uint32_t>(width),
            static_cast<uint32_t>(height),
            static_cast<uint32_t>(redrawRate),  // framerateNum
            1,                                  // framerateDen
        });
      if (add_track_result) {
        g_Instance->m_VideoTrack = std::move(*add_track_result);
        g_Instance->m_VideoTrack.SetListener(&g_Instance->m_VideoTrackListener);
      }
      
    4. Complete Media Player initialization:

      g_Instance->m_Source.Open([](EmssOperationResult){});
      
    5. Start playback:

      g_Instance->m_MediaElement.Play([](EmssOperationResult err) {
        if (err != EmssOperationResult::kSuccess) {
          ClLogMessage("Play error\n");
        } else {
          ClLogMessage("Play success\n");
        }
      });
      

After completing these steps, the created pipeline is ready to play media packets.

Video playback

Due to the GameStream protocol specification, the video playback logic boils down to constructing a single packet from data chunks and appending that packet directly through the Tizen Wasm Player API:

int MoonlightInstance::VidDecSubmitDecodeUnit(PDECODE_UNIT decodeUnit) {
  // Code abbreviated for clarity

  // Build one packet from multiple data chunks:
  totalLength = decodeUnit->fullLength;
  // Resize the decode buffer if needed
  if (totalLength > s_DecodeBuffer.size()) {
    s_DecodeBuffer.resize(totalLength);
  }

  entry = decodeUnit->bufferList;
  offset = 0;
  while (entry != NULL) {
    memcpy(&s_DecodeBuffer[offset], entry->data, entry->length);
    offset += entry->length;
    entry = entry->next;
  }

  // Start the decoding
  samsung::wasm::ElementaryMediaPacket pkt{
      s_pktPts,
      s_pktPts,
      s_frameDuration,
      decodeUnit->frameType == FRAME_TYPE_IDR,
      offset,
      s_DecodeBuffer.data(),
      s_Width,
      s_Height,
      s_Framerate,
      1,
      g_Instance->m_VideoSessionId.load()
  };

  if (g_Instance->m_VideoTrack.AppendPacket(pkt)) {
    s_pktPts += s_frameDuration;
  } else {
    ClLogMessage("Append video packet failed\n");
    return DR_NEED_IDR;
  }

  return DR_OK;
}
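In the snippet above, s_pktPts advances by a fixed s_frameDuration derived from the stream's redraw rate. A minimal sketch of that timestamp bookkeeping (names hypothetical, duration kept in microseconds for illustration):

```cpp
#include <cstdint>

constexpr int64_t kMicrosPerSecond = 1000000;

// Duration of one video frame at the given redraw rate.
int64_t FrameDurationUs(int redrawRate) {
  return kMicrosPerSecond / redrawRate;
}

// Each successfully appended packet advances the presentation timestamp
// by exactly one frame duration.
int64_t NextPts(int64_t pts, int redrawRate) {
  return pts + FrameDurationUs(redrawRate);
}
```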

Audio playback

In the GameStream protocol, audio data is transported in network packets as Opus frames. However, some older Tizen TVs were not able to decode Opus in hardware while using low-latency playback modes. Thus the audio has to be decoded using libopus and passed on as a raw audio stream to the Tizen Wasm Player API:

static void DecodeAndAppendPacket(samsung::wasm::ElementaryMediaTrack* track,
                                  samsung::wasm::SessionId session_id,
                                  OpusMSDecoder* decoder,
                                  const unsigned char* sampleData,
                                  int sampleLength) {
  int decodeLen = opus_multistream_decode(
      decoder, sampleData, sampleLength,
      s_DecodeBuffer.data(), FRAME_SIZE, 0);

  if (decodeLen <= 0)
    s_DecodeBuffer.assign(s_DecodeBuffer.size(), 0);

  samsung::wasm::ElementaryMediaPacket pkt{
     s_pktPts,
     s_pktPts,
     s_frameDuration,
     true,
     s_DecodeBuffer.size() * sizeof(opus_int16),
     s_DecodeBuffer.data(),
     0,  // frame width - not needed for audio
     0,  // frame height - not needed for audio
     0,  // framerate numerator - not needed for audio
     1,  // framerate denominator - 1 to avoid divide by zero error
     session_id
  };

  if (track->AppendPacket(pkt)) {
    s_pktPts += s_frameDuration;
  } else {
    MoonlightInstance::ClLogMessage("Append audio packet failed\n");
  }
}
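The audio timestamps follow the same pattern as the video ones. Assuming fixed-size Opus frames of FRAME_SIZE samples per channel at a 48 kHz sample rate (an assumption for illustration; GameStream commonly uses 5 ms frames), the per-packet duration can be derived as:

```cpp
#include <cstdint>

// Hypothetical helper: duration of one audio packet in microseconds,
// given the samples per channel it carries and the sample rate.
int64_t AudioFrameDurationUs(int64_t samplesPerChannel, int64_t sampleRate) {
  return samplesPerChannel * 1000000 / sampleRate;
}
```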

Converting NaCl module to WebAssembly

For other functionalities not mentioned above, like gamepads, we followed the WebAssembly Migration Guide.

The changes were mostly related to:

  • Replacing NaCl module with a WebAssembly one
  • Using a <video> element with the nacl_module id. The original Moonlight Chrome client performed video decoding and rendering in a Native Client module with the same id. Reusing this id allowed us to keep the original code layout.
  • Adapting keyboard and gamepad input to APIs provided by Emscripten.
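As an illustration of the last point, gamepad state can be polled through Emscripten's HTML5 API. This is only a sketch: the preprocessor guard lets it also compile outside Emscripten, where the stub simply reports no connected gamepads.

```cpp
#ifdef __EMSCRIPTEN__
#include <emscripten/html5.h>
#endif

// Sample current gamepad data and return the number of gamepads seen by
// the browser; native builds fall back to a stub reporting none.
int ConnectedGamepads() {
#ifdef __EMSCRIPTEN__
  if (emscripten_sample_gamepad_data() != EMSCRIPTEN_RESULT_SUCCESS)
    return 0;
  return emscripten_get_num_gamepads();
#else
  return 0;
#endif
}
```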