Revert of Split IncomingVideoStream into two implementations, with smoothing and without. (patchset #4 id:290001 of https://codereview.webrtc.org/2071473002/ )

Reason for revert:
Reverting again.  The perf regression does not seem to be related to dropping frames.

Original issue's description:
> Reland of Split IncomingVideoStream into two implementations, with smoothing and without.
>
> Original issue's description:
>
> Split IncomingVideoStream into two implementations, with smoothing and without.
>
> This CL fixes an issue with the non-smoothing implementation where frames were delivered on the decoder thread.  Non-smoothed delivery is now done in a separate class that uses a TaskQueue.  The implementation may drop frames if the renderer doesn't keep up, and it does not block the decoder thread.
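The hand-off described above can be sketched standalone. This is an illustrative analogue only: `rtc::TaskQueue` is approximated with a plain worker thread, and the `SimpleTaskQueue`/`DeliverFrames` names are hypothetical, not WebRTC APIs.

```cpp
#include <condition_variable>
#include <deque>
#include <functional>
#include <mutex>
#include <thread>

// Single-threaded task queue: PostTask() only takes a brief lock, so the
// posting (decoder) thread is never blocked on the renderer.
class SimpleTaskQueue {
 public:
  SimpleTaskQueue() : worker_([this] { Run(); }) {}
  ~SimpleTaskQueue() {
    {
      std::lock_guard<std::mutex> lock(mutex_);
      done_ = true;
    }
    wake_.notify_one();
    worker_.join();  // Drains any queued tasks before returning.
  }
  void PostTask(std::function<void()> task) {
    {
      std::lock_guard<std::mutex> lock(mutex_);
      tasks_.push_back(std::move(task));
    }
    wake_.notify_one();
  }

 private:
  void Run() {
    std::unique_lock<std::mutex> lock(mutex_);
    for (;;) {
      wake_.wait(lock, [this] { return done_ || !tasks_.empty(); });
      if (tasks_.empty())
        return;  // Queue is empty, so done_ must be true: shut down.
      std::function<void()> task = std::move(tasks_.front());
      tasks_.pop_front();
      lock.unlock();
      task();  // Run the task without holding the lock.
      lock.lock();
    }
  }
  std::mutex mutex_;
  std::condition_variable wake_;
  std::deque<std::function<void()>> tasks_;
  bool done_ = false;
  std::thread worker_;  // Declared last so it starts after the members it uses.
};

// The "decoder thread" posts frames; delivery happens on the queue's thread.
int DeliverFrames(int frame_count) {
  int delivered = 0;
  {
    SimpleTaskQueue queue;
    for (int i = 0; i < frame_count; ++i)
      queue.PostTask([&delivered] { ++delivered; });
  }  // Destructor joins the worker, so all deliveries have completed.
  return delivered;
}
```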
>
> Further work done:
>
> * I added TODOs and documentation for VideoReceiveStream::OnFrame, where we currently grab 5 locks.
>
> * I removed the Start/Stop methods from the IncomingVideoStream implementations.  Now, when an instance is created, it should be considered "running", and when it is deleted, "not running".  This saves resources, reduces the amount of locking required, and let me remove one critical section altogether.
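The "constructed means running" pattern can be sketched as follows; a minimal sketch with a hypothetical `FrameStream` class, not the actual IncomingVideoStream code:

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// "Alive" == "running": the thread starts in the constructor and is joined in
// the destructor, so no running_ state needs to be checked under a lock from
// public Start()/Stop() entry points.
class FrameStream {
 public:
  FrameStream() : thread_([this] { Loop(); }) {}
  ~FrameStream() {
    running_ = false;  // Deleting the object is the only "Stop()".
    thread_.join();
  }
  int ticks() const { return ticks_.load(); }

 private:
  void Loop() {
    while (running_) {
      ++ticks_;
      std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
  }
  std::atomic<bool> running_{true};
  std::atomic<int> ticks_{0};
  std::thread thread_;  // Declared last so it starts after the flags exist.
};

bool StartsOnConstruction() {
  FrameStream stream;  // No Start() call needed.
  // Wait (bounded) for evidence that the worker thread is running.
  for (int i = 0; i < 1000 && stream.ticks() == 0; ++i)
    std::this_thread::sleep_for(std::chrono::milliseconds(1));
  return stream.ticks() > 0;
}  // No Stop() call needed either; the destructor shuts the thread down.
```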
>
> * I changed the VideoStreamDecoder class to not depend on IncomingVideoStream but rather on the generic rtc::VideoSinkInterface<VideoFrame> interface.  This means that any implementation of that interface can be used and the decoder can be made to just use the 'renderer' from the config.  Once we do that, we can decouple the IncomingVideoStream implementations from the decoder and VideoReceiveStream implementations and leave it up to the application how to do smoothing.  The app can choose to use the Incoming* classes or roll its own (which may be preferable, since applications often have their own scheduling mechanisms).
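The decoupling above hinges on the decoder seeing only the sink interface. A minimal sketch, with a placeholder `Frame` type and a hypothetical `RecordingSink` standing in for an application renderer:

```cpp
#include <vector>

// Placeholder frame type; the real interface is instantiated with
// webrtc::VideoFrame.
struct Frame { int id; };

// Minimal stand-in for rtc::VideoSinkInterface<VideoFrameT>.
template <typename FrameT>
class VideoSinkInterface {
 public:
  virtual ~VideoSinkInterface() {}
  virtual void OnFrame(const FrameT& frame) = 0;
};

// An application-supplied renderer.  Because the decoder depends only on the
// interface, no IncomingVideoStream needs to sit in between; the app can do
// its own smoothing (or none).
class RecordingSink : public VideoSinkInterface<Frame> {
 public:
  void OnFrame(const Frame& frame) override { ids_.push_back(frame.id); }
  const std::vector<int>& ids() const { return ids_; }

 private:
  std::vector<int> ids_;
};

// Plays the decoder's role: it only ever sees the generic sink type.
void DecodeInto(VideoSinkInterface<Frame>* sink, int frame_count) {
  for (int i = 0; i < frame_count; ++i)
    sink->OnFrame(Frame{i});
}
```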
>
> * The non-smoothing IncomingVideoStream implementation currently allows only one pending frame.  If we exceed that, the current frame is not delivered to the renderer; instead we deliver the next one (when this happens, the renderer is falling behind).
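The single-pending-frame policy can be sketched as a one-slot mailbox; an illustrative sketch (frames reduced to `int` ids, `SingleFrameMailbox` is a made-up name), not the actual implementation:

```cpp
#include <mutex>
#include <optional>

// At most one undelivered frame: a newer frame replaces (drops) an older
// undelivered one, so the producer never blocks and the renderer always gets
// the most recent frame when it catches up.
class SingleFrameMailbox {
 public:
  void Post(int frame_id) {
    std::lock_guard<std::mutex> lock(mutex_);
    if (pending_)
      ++dropped_;          // Renderer fell behind: drop the stale frame.
    pending_ = frame_id;   // Always keep the most recent frame.
  }
  // Renderer side: take the pending frame, if any.
  std::optional<int> Take() {
    std::lock_guard<std::mutex> lock(mutex_);
    std::optional<int> frame = pending_;
    pending_.reset();
    return frame;
  }
  int dropped() const {
    std::lock_guard<std::mutex> lock(mutex_);
    return dropped_;
  }

 private:
  mutable std::mutex mutex_;
  std::optional<int> pending_;
  int dropped_ = 0;
};
```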
>
> * The lifetime of the VideoStreamDecoder instance is now bound to Start/Stop in VideoReceiveStream rather than to the entire lifetime of VideoReceiveStream.
>
> * Fixed VideoStreamDecoder to unregister, in the dtor, the callbacks that were registered in the ctor (previously this was open to a use-after-free regression).
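The ctor/dtor symmetry that closes the use-after-free window can be sketched like this; `Receiver` and `Decoder` are illustrative names, not the WebRTC classes:

```cpp
// Every callback registered in the constructor is unregistered in the
// destructor, so the receiver can never call back into a destroyed object.
class Receiver {
 public:
  void RegisterCallback(void* cb) { callback_ = cb; }
  void* callback() const { return callback_; }

 private:
  void* callback_ = nullptr;
};

class Decoder {
 public:
  explicit Decoder(Receiver* receiver) : receiver_(receiver) {
    receiver_->RegisterCallback(this);
  }
  ~Decoder() {
    receiver_->RegisterCallback(nullptr);  // Undo exactly what the ctor did.
  }

 private:
  Receiver* const receiver_;
};

bool CallbackClearedAfterDestruction() {
  Receiver receiver;
  {
    Decoder decoder(&receiver);
    if (receiver.callback() == nullptr)
      return false;  // Ctor should have registered us.
  }
  // Dangling-pointer window closed: nothing left pointing at the decoder.
  return receiver.callback() == nullptr;
}
```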
>
> * Delay and callback pointers are now passed via the ctors to the IncomingVideoStream classes.  The thread primitives in the IncomingVideoStream classes are also constructed/destructed at the same time as the owning object, which allowed me to remove one more lock.
>
> * Removed code in the VideoStreamDecoder that could overwrite the VideoReceiveStream render delay with a fixed value of 10ms on construction.  This wasn't a problem with the previous implementation (it would be now though) but seemed to me like the wrong place to be setting that value.
>
> * Made the render delay value in VideoRenderFrames const.
>
> BUG=chromium:620232
> TBR=mflodman
>
> Committed: https://crrev.com/e03f8787377bbc03a4e00184bb14b7561b108cbb
> Cr-Commit-Position: refs/heads/master@{#13175}

TBR=mflodman@webrtc.org
# Skipping CQ checks because original CL landed less than 1 day ago.
NOPRESUBMIT=true
NOTREECHECKS=true
NOTRY=true
BUG=chromium:620232

Review-Url: https://codereview.webrtc.org/2071093002
Cr-Commit-Position: refs/heads/master@{#13176}
Author: tommi
Date: 2016-06-16 15:44:04 -07:00 (committed by Commit bot)
Parent: e03f878737
Commit: 8e8222d0d2
15 changed files with 212 additions and 278 deletions


@ -58,7 +58,6 @@ source_set("common_video") {
deps = [
"..:webrtc_common",
"../base:rtc_task_queue",
"../system_wrappers",
]
@ -92,7 +91,6 @@ if (rtc_include_tests) {
"h264/sps_vui_rewriter_unittest.cc",
"i420_buffer_pool_unittest.cc",
"i420_video_frame_unittest.cc",
"incoming_video_stream_unittest.cc",
"libyuv/libyuv_unittest.cc",
]


@ -19,7 +19,6 @@
],
'dependencies': [
'<(webrtc_root)/common.gyp:webrtc_common',
'<(webrtc_root)/base/base.gyp:rtc_task_queue',
'<(webrtc_root)/system_wrappers/system_wrappers.gyp:system_wrappers',
],
'direct_dependent_settings': {


@ -26,7 +26,6 @@
'h264/sps_vui_rewriter_unittest.cc',
'i420_buffer_pool_unittest.cc',
'i420_video_frame_unittest.cc',
'incoming_video_stream_unittest.cc',
'libyuv/libyuv_unittest.cc',
],
# Disable warnings to enable Win64 build, issue 1323.


@ -15,20 +15,30 @@
#include "webrtc/base/criticalsection.h"
#include "webrtc/base/platform_thread.h"
#include "webrtc/base/task_queue.h"
#include "webrtc/base/thread_annotations.h"
#include "webrtc/base/thread_checker.h"
#include "webrtc/common_video/video_render_frames.h"
#include "webrtc/media/base/videosinkinterface.h"
namespace webrtc {
class EventTimerWrapper;
class IncomingVideoStream : public rtc::VideoSinkInterface<VideoFrame> {
public:
IncomingVideoStream(int32_t delay_ms,
rtc::VideoSinkInterface<VideoFrame>* callback);
~IncomingVideoStream() override;
explicit IncomingVideoStream(bool disable_prerenderer_smoothing);
~IncomingVideoStream();
// Overrides VideoSinkInterface
void OnFrame(const VideoFrame& video_frame) override;
// Callback for file recording, snapshot, ...
void SetExternalCallback(rtc::VideoSinkInterface<VideoFrame>* render_object);
// Start/Stop.
int32_t Start();
int32_t Stop();
int32_t SetExpectedRenderDelay(int32_t delay_ms);
protected:
static bool IncomingVideoStreamThreadFun(void* obj);
@ -39,36 +49,26 @@ class IncomingVideoStream : public rtc::VideoSinkInterface<VideoFrame> {
enum { kEventMaxWaitTimeMs = 100 };
enum { kFrameRatePeriodMs = 1000 };
void OnFrame(const VideoFrame& video_frame) override;
rtc::ThreadChecker main_thread_checker_;
rtc::ThreadChecker render_thread_checker_;
rtc::ThreadChecker decoder_thread_checker_;
void DeliverFrame(const VideoFrame& video_frame);
const bool disable_prerenderer_smoothing_;
// Critsects in allowed to enter order.
rtc::CriticalSection stream_critsect_;
rtc::CriticalSection thread_critsect_;
rtc::CriticalSection buffer_critsect_;
rtc::PlatformThread incoming_render_thread_;
// TODO(pbos): Make plain member and stop resetting this thread, just
// start/stoping it is enough.
std::unique_ptr<rtc::PlatformThread> incoming_render_thread_
GUARDED_BY(thread_critsect_);
std::unique_ptr<EventTimerWrapper> deliver_buffer_event_;
rtc::VideoSinkInterface<VideoFrame>* const external_callback_;
std::unique_ptr<VideoRenderFrames> render_buffers_
bool running_ GUARDED_BY(stream_critsect_);
rtc::VideoSinkInterface<VideoFrame>* external_callback_
GUARDED_BY(thread_critsect_);
const std::unique_ptr<VideoRenderFrames> render_buffers_
GUARDED_BY(buffer_critsect_);
};
class IncomingVideoStreamNoSmoothing
: public rtc::VideoSinkInterface<VideoFrame> {
public:
explicit IncomingVideoStreamNoSmoothing(
rtc::VideoSinkInterface<VideoFrame>* callback);
private:
// Overrides VideoSinkInterface
void OnFrame(const VideoFrame& video_frame) override;
rtc::ThreadChecker decoder_thread_checker_;
rtc::VideoSinkInterface<VideoFrame>* const callback_;
rtc::TaskQueue queue_;
};
} // namespace webrtc
#endif // WEBRTC_COMMON_VIDEO_INCLUDE_INCOMING_VIDEO_STREAM_H_


@ -10,7 +10,8 @@
#include "webrtc/common_video/include/incoming_video_stream.h"
#include "webrtc/base/atomicops.h"
#include <assert.h>
#include "webrtc/base/platform_thread.h"
#include "webrtc/base/timeutils.h"
#include "webrtc/common_video/video_render_frames.h"
@ -19,65 +20,122 @@
namespace webrtc {
IncomingVideoStream::IncomingVideoStream(
int32_t delay_ms,
rtc::VideoSinkInterface<VideoFrame>* callback)
: incoming_render_thread_(&IncomingVideoStreamThreadFun,
this,
"IncomingVideoStreamThread"),
IncomingVideoStream::IncomingVideoStream(bool disable_prerenderer_smoothing)
: disable_prerenderer_smoothing_(disable_prerenderer_smoothing),
incoming_render_thread_(),
deliver_buffer_event_(EventTimerWrapper::Create()),
external_callback_(callback),
render_buffers_(new VideoRenderFrames(delay_ms)) {
RTC_DCHECK(external_callback_);
render_thread_checker_.DetachFromThread();
decoder_thread_checker_.DetachFromThread();
incoming_render_thread_.Start();
incoming_render_thread_.SetPriority(rtc::kRealtimePriority);
deliver_buffer_event_->StartTimer(false, kEventStartupTimeMs);
}
running_(false),
external_callback_(nullptr),
render_buffers_(new VideoRenderFrames()) {}
IncomingVideoStream::~IncomingVideoStream() {
RTC_DCHECK(main_thread_checker_.CalledOnValidThread());
{
rtc::CritScope cs(&buffer_critsect_);
render_buffers_.reset();
}
deliver_buffer_event_->Set();
incoming_render_thread_.Stop();
deliver_buffer_event_->StopTimer();
Stop();
}
void IncomingVideoStream::OnFrame(const VideoFrame& video_frame) {
RTC_DCHECK_RUN_ON(&decoder_thread_checker_);
rtc::CritScope csS(&stream_critsect_);
if (!running_) {
return;
}
// Hand over or insert frame.
rtc::CritScope csB(&buffer_critsect_);
if (render_buffers_->AddFrame(video_frame) == 1) {
deliver_buffer_event_->Set();
if (disable_prerenderer_smoothing_) {
DeliverFrame(video_frame);
} else {
rtc::CritScope csB(&buffer_critsect_);
if (render_buffers_->AddFrame(video_frame) == 1) {
deliver_buffer_event_->Set();
}
}
}
int32_t IncomingVideoStream::SetExpectedRenderDelay(
int32_t delay_ms) {
rtc::CritScope csS(&stream_critsect_);
if (running_) {
return -1;
}
rtc::CritScope cs(&buffer_critsect_);
return render_buffers_->SetRenderDelay(delay_ms);
}
void IncomingVideoStream::SetExternalCallback(
rtc::VideoSinkInterface<VideoFrame>* external_callback) {
rtc::CritScope cs(&thread_critsect_);
external_callback_ = external_callback;
}
int32_t IncomingVideoStream::Start() {
rtc::CritScope csS(&stream_critsect_);
if (running_) {
return 0;
}
if (!disable_prerenderer_smoothing_) {
rtc::CritScope csT(&thread_critsect_);
assert(incoming_render_thread_ == NULL);
incoming_render_thread_.reset(new rtc::PlatformThread(
IncomingVideoStreamThreadFun, this, "IncomingVideoStreamThread"));
if (!incoming_render_thread_) {
return -1;
}
incoming_render_thread_->Start();
incoming_render_thread_->SetPriority(rtc::kRealtimePriority);
deliver_buffer_event_->StartTimer(false, kEventStartupTimeMs);
}
running_ = true;
return 0;
}
int32_t IncomingVideoStream::Stop() {
rtc::CritScope cs_stream(&stream_critsect_);
if (!running_) {
return 0;
}
rtc::PlatformThread* thread = NULL;
{
rtc::CritScope cs_thread(&thread_critsect_);
if (incoming_render_thread_) {
// Setting the incoming render thread to NULL marks that we're performing
// a shutdown and will make IncomingVideoStreamProcess abort after wakeup.
thread = incoming_render_thread_.release();
deliver_buffer_event_->StopTimer();
// Set the event to allow the thread to wake up and shut down without
// waiting for a timeout.
deliver_buffer_event_->Set();
}
}
if (thread) {
thread->Stop();
delete thread;
}
running_ = false;
return 0;
}
bool IncomingVideoStream::IncomingVideoStreamThreadFun(void* obj) {
return static_cast<IncomingVideoStream*>(obj)->IncomingVideoStreamProcess();
}
bool IncomingVideoStream::IncomingVideoStreamProcess() {
RTC_DCHECK_RUN_ON(&render_thread_checker_);
if (kEventError != deliver_buffer_event_->Wait(kEventMaxWaitTimeMs)) {
rtc::CritScope cs(&thread_critsect_);
if (incoming_render_thread_ == NULL) {
// Terminating
return false;
}
// Get a new frame to render and the time for the frame after this one.
rtc::Optional<VideoFrame> frame_to_render;
uint32_t wait_time;
{
rtc::CritScope cs(&buffer_critsect_);
if (!render_buffers_.get()) {
// Terminating
return false;
}
frame_to_render = render_buffers_->FrameToRender();
wait_time = render_buffers_->TimeToNextFrameRelease();
}
@ -86,26 +144,21 @@ bool IncomingVideoStream::IncomingVideoStreamProcess() {
if (wait_time > kEventMaxWaitTimeMs) {
wait_time = kEventMaxWaitTimeMs;
}
deliver_buffer_event_->StartTimer(false, wait_time);
if (frame_to_render) {
external_callback_->OnFrame(*frame_to_render);
}
if (frame_to_render)
DeliverFrame(*frame_to_render);
}
return true;
}
////////////////////////////////////////////////////////////////////////////////
IncomingVideoStreamNoSmoothing::IncomingVideoStreamNoSmoothing(
rtc::VideoSinkInterface<VideoFrame>* callback)
: callback_(callback), queue_("InVideoStream") {
decoder_thread_checker_.DetachFromThread();
}
void IncomingVideoStream::DeliverFrame(const VideoFrame& video_frame) {
rtc::CritScope cs(&thread_critsect_);
void IncomingVideoStreamNoSmoothing::OnFrame(const VideoFrame& video_frame) {
RTC_DCHECK_RUN_ON(&decoder_thread_checker_);
queue_.PostTask([video_frame, this]() { callback_->OnFrame(video_frame); });
// Send frame for rendering.
if (external_callback_) {
external_callback_->OnFrame(video_frame);
}
}
} // namespace webrtc


@ -1,88 +0,0 @@
/*
* Copyright (c) 2016 The WebRTC project authors. All Rights Reserved.
*
* Use of this source code is governed by a BSD-style license
* that can be found in the LICENSE file in the root of the source
* tree. An additional intellectual property rights grant can be found
* in the file PATENTS. All contributing project authors may
* be found in the AUTHORS file in the root of the source tree.
*/
#include "testing/gtest/include/gtest/gtest.h"
#include "webrtc/base/event.h"
#include "webrtc/common_video/include/incoming_video_stream.h"
#include "webrtc/media/base/videosinkinterface.h"
#include "webrtc/video_frame.h"
namespace webrtc {
// Basic test that checks if the no-smoothing implementation delivers a frame.
TEST(IncomingVideoStreamTest, NoSmoothingOneFrame) {
class TestCallback : public rtc::VideoSinkInterface<VideoFrame> {
public:
TestCallback() : event_(false, false) {}
~TestCallback() override {}
bool WaitForFrame(int milliseconds) { return event_.Wait(milliseconds); }
private:
void OnFrame(const VideoFrame& frame) override { event_.Set(); }
rtc::Event event_;
} callback;
IncomingVideoStreamNoSmoothing stream(&callback);
rtc::VideoSinkInterface<VideoFrame>* stream_sink = &stream;
stream_sink->OnFrame(VideoFrame());
EXPECT_TRUE(callback.WaitForFrame(500));
}
// Tests that if the renderer is too slow, that frames will not be dropped and
// still the "producer thread" (main test thread), will not be blocked from
// delivering frames.
TEST(IncomingVideoStreamTest, NoSmoothingManyFrames) {
rtc::Event done(false, false);
class TestCallback : public rtc::VideoSinkInterface<VideoFrame> {
public:
explicit TestCallback(rtc::Event* done)
: event_(false, false), done_(done) {}
~TestCallback() override {}
void Continue() { event_.Set(); }
int frame_count() const { return frame_count_; }
private:
void OnFrame(const VideoFrame& frame) override {
++frame_count_;
if (frame_count_ == 1) {
// Block delivery of frames until we're allowed to continue.
event_.Wait(rtc::Event::kForever);
} else if (frame_count_ == 100) {
done_->Set();
}
}
rtc::Event event_;
rtc::Event* done_;
int frame_count_ = 0;
} callback(&done);
{
IncomingVideoStreamNoSmoothing stream(&callback);
rtc::VideoSinkInterface<VideoFrame>* stream_sink = &stream;
for (int i = 0; i < 100; ++i)
stream_sink->OnFrame(VideoFrame());
// Unblock the 'processing' of the first frame and wait for all the frames
// to be processed.
EXPECT_LE(callback.frame_count(), 1);
callback.Continue();
done.Wait(rtc::Event::kForever);
}
EXPECT_EQ(callback.frame_count(), 100);
}
} // namespace webrtc


@ -17,21 +17,14 @@
#include "webrtc/system_wrappers/include/trace.h"
namespace webrtc {
namespace {
const uint32_t kEventMaxWaitTimeMs = 200;
const uint32_t KEventMaxWaitTimeMs = 200;
const uint32_t kMinRenderDelayMs = 10;
const uint32_t kMaxRenderDelayMs = 500;
uint32_t EnsureValidRenderDelay(uint32_t render_delay) {
return (render_delay < kMinRenderDelayMs || render_delay > kMaxRenderDelayMs)
? kMinRenderDelayMs
: render_delay;
VideoRenderFrames::VideoRenderFrames()
: render_delay_ms_(10) {
}
} // namespace
VideoRenderFrames::VideoRenderFrames(uint32_t render_delay_ms)
: render_delay_ms_(EnsureValidRenderDelay(render_delay_ms)) {}
int32_t VideoRenderFrames::AddFrame(const VideoFrame& new_frame) {
const int64_t time_now = rtc::TimeMillis();
@ -70,9 +63,14 @@ rtc::Optional<VideoFrame> VideoRenderFrames::FrameToRender() {
return render_frame;
}
int32_t VideoRenderFrames::ReleaseAllFrames() {
incoming_frames_.clear();
return 0;
}
uint32_t VideoRenderFrames::TimeToNextFrameRelease() {
if (incoming_frames_.empty()) {
return kEventMaxWaitTimeMs;
return KEventMaxWaitTimeMs;
}
const int64_t time_to_release = incoming_frames_.front().render_time_ms() -
render_delay_ms_ -
@ -80,4 +78,18 @@ uint32_t VideoRenderFrames::TimeToNextFrameRelease() {
return time_to_release < 0 ? 0u : static_cast<uint32_t>(time_to_release);
}
int32_t VideoRenderFrames::SetRenderDelay(
const uint32_t render_delay) {
if (render_delay < kMinRenderDelayMs ||
render_delay > kMaxRenderDelayMs) {
WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer,
-1, "%s(%d): Invalid argument.", __FUNCTION__,
render_delay);
return -1;
}
render_delay_ms_ = render_delay;
return 0;
}
} // namespace webrtc
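The release-time arithmetic restored in this file (next frame's render time, minus the render delay, minus "now", clamped below at zero; the caller additionally clamps the wait above at its own maximum) can be sketched standalone. This is an illustrative reimplementation with free-function parameters, not the webrtc member function:

```cpp
#include <cstdint>

constexpr uint32_t kEventMaxWaitTimeMs = 200;

// How long the render thread should sleep before the next frame is due.
uint32_t TimeToNextFrameRelease(int64_t front_render_time_ms,
                                uint32_t render_delay_ms,
                                int64_t now_ms,
                                bool queue_empty) {
  if (queue_empty)
    return kEventMaxWaitTimeMs;  // Nothing queued: wake up periodically.
  // Release the frame render_delay_ms ahead of its target render time.
  const int64_t time_to_release =
      front_render_time_ms - render_delay_ms - now_ms;
  // Already overdue frames are released immediately.
  return time_to_release < 0 ? 0u : static_cast<uint32_t>(time_to_release);
}
```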


@ -23,8 +23,7 @@ namespace webrtc {
// Class definitions
class VideoRenderFrames {
public:
explicit VideoRenderFrames(uint32_t render_delay_ms);
VideoRenderFrames(const VideoRenderFrames&) = delete;
VideoRenderFrames();
// Add a frame to the render queue
int32_t AddFrame(const VideoFrame& new_frame);
@ -32,9 +31,15 @@ class VideoRenderFrames {
// Get a frame for rendering, or false if it's not time to render.
rtc::Optional<VideoFrame> FrameToRender();
// Releases all frames
int32_t ReleaseAllFrames();
// Returns the number of ms to next frame to render
uint32_t TimeToNextFrameRelease();
// Sets estimates delay in renderer
int32_t SetRenderDelay(const uint32_t render_delay);
private:
// 10 seconds for 30 fps.
enum { KMaxNumberOfFrames = 300 };
@ -47,7 +52,7 @@ class VideoRenderFrames {
std::list<VideoFrame> incoming_frames_;
// Estimated delay from a frame is released until it's rendered.
const uint32_t render_delay_ms_;
uint32_t render_delay_ms_;
};
} // namespace webrtc


@ -19,9 +19,9 @@ namespace rtc {
template <typename VideoFrameT>
class VideoSinkInterface {
public:
virtual ~VideoSinkInterface() {}
virtual void OnFrame(const VideoFrameT& frame) = 0;
protected:
virtual ~VideoSinkInterface() {}
};
} // namespace rtc


@ -136,6 +136,8 @@ RemoteBitrateEstimatorAbsSendTime::FindBestProbe(
++it) {
if (it->send_mean_ms == 0 || it->recv_mean_ms == 0)
continue;
int send_bitrate_bps = it->mean_size * 8 * 1000 / it->send_mean_ms;
int recv_bitrate_bps = it->mean_size * 8 * 1000 / it->recv_mean_ms;
if (it->num_above_min_delta > it->count / 2 &&
(it->recv_mean_ms - it->send_mean_ms <= 2.0f &&
it->send_mean_ms - it->recv_mean_ms <= 5.0f)) {
@ -146,8 +148,6 @@ RemoteBitrateEstimatorAbsSendTime::FindBestProbe(
best_it = it;
}
} else {
int send_bitrate_bps = it->mean_size * 8 * 1000 / it->send_mean_ms;
int recv_bitrate_bps = it->mean_size * 8 * 1000 / it->recv_mean_ms;
LOG(LS_INFO) << "Probe failed, sent at " << send_bitrate_bps
<< " bps, received at " << recv_bitrate_bps
<< " bps. Mean send delta: " << it->send_mean_ms


@ -2027,8 +2027,7 @@ TEST_F(EndToEndTest, VerifyNackStats) {
void EndToEndTest::VerifyHistogramStats(bool use_rtx,
bool use_red,
bool screenshare) {
class StatsObserver : public test::EndToEndTest,
public rtc::VideoSinkInterface<VideoFrame> {
class StatsObserver : public test::EndToEndTest {
public:
StatsObserver(bool use_rtx, bool use_red, bool screenshare)
: EndToEndTest(kLongTimeoutMs),
@ -2044,8 +2043,6 @@ void EndToEndTest::VerifyHistogramStats(bool use_rtx,
start_runtime_ms_(-1) {}
private:
void OnFrame(const VideoFrame& video_frame) override {}
Action OnSendRtp(const uint8_t* packet, size_t length) override {
if (MinMetricRunTimePassed())
observation_complete_.Set();
@ -2070,7 +2067,6 @@ void EndToEndTest::VerifyHistogramStats(bool use_rtx,
// NACK
send_config->rtp.nack.rtp_history_ms = kNackRtpHistoryMs;
(*receive_configs)[0].rtp.nack.rtp_history_ms = kNackRtpHistoryMs;
(*receive_configs)[0].renderer = this;
// FEC
if (use_red_) {
send_config->rtp.fec.ulpfec_payload_type = kUlpfecPayloadType;
@ -2494,15 +2490,6 @@ TEST_F(EndToEndTest, ReportsSetEncoderRates) {
TEST_F(EndToEndTest, GetStats) {
static const int kStartBitrateBps = 3000000;
static const int kExpectedRenderDelayMs = 20;
class ReceiveStreamRenderer : public rtc::VideoSinkInterface<VideoFrame> {
public:
ReceiveStreamRenderer() {}
private:
void OnFrame(const VideoFrame& video_frame) override {}
};
class StatsObserver : public test::EndToEndTest,
public rtc::VideoSinkInterface<VideoFrame> {
public:
@ -2703,7 +2690,6 @@ TEST_F(EndToEndTest, GetStats) {
expected_receive_ssrcs_.push_back(
(*receive_configs)[i].rtp.remote_ssrc);
(*receive_configs)[i].render_delay_ms = kExpectedRenderDelayMs;
(*receive_configs)[i].renderer = &receive_stream_renderer_;
}
// Use a delayed encoder to make sure we see CpuOveruseMetrics stats that
// are non-zero.
@ -2773,7 +2759,6 @@ TEST_F(EndToEndTest, GetStats) {
std::string expected_cname_;
rtc::Event check_stats_event_;
ReceiveStreamRenderer receive_stream_renderer_;
} test;
RunBaseTest(&test);


@ -144,32 +144,6 @@ VideoCodec CreateDecoderVideoCodec(const VideoReceiveStream::Decoder& decoder) {
} // namespace
namespace internal {
// Since an IncomingVideoStream instance will create a thread/queue, we don't
// instantiate one unless we know we're going to be delivering the frames
// to a renderer.
//
// TODO(tommi): Consider making the functionality provided by the
// IncomingVideoStream classes, tied to the Config class instead or even higher
// level. That will make it an optional feature that will be up to the
// application to use or not use. Right now, we have two classes available,
// they both have threads involved which uses WebRTC's threading classes and
// that might not suit what the application wants to do. If one of them or
// neither, suits, the app can get some code size back as well as more control.
std::unique_ptr<rtc::VideoSinkInterface<VideoFrame>>
MaybeCreateIncomingVideoStream(const VideoReceiveStream::Config& config,
rtc::VideoSinkInterface<VideoFrame>* callback) {
std::unique_ptr<rtc::VideoSinkInterface<VideoFrame>> ret;
if (config.renderer) {
if (config.disable_prerenderer_smoothing) {
ret.reset(new IncomingVideoStreamNoSmoothing(callback));
} else {
ret.reset(new IncomingVideoStream(config.render_delay_ms, callback));
}
}
return ret;
}
VideoReceiveStream::VideoReceiveStream(
int num_cpu_cores,
CongestionController* congestion_controller,
@ -186,6 +160,7 @@ VideoReceiveStream::VideoReceiveStream(
congestion_controller_(congestion_controller),
call_stats_(call_stats),
video_receiver_(clock_, nullptr, this, this, this),
incoming_video_stream_(config_.disable_prerenderer_smoothing),
stats_proxy_(&config_, clock_),
rtp_stream_receiver_(&video_receiver_,
congestion_controller_->GetRemoteBitrateEstimator(
@ -198,6 +173,14 @@ VideoReceiveStream::VideoReceiveStream(
&config_,
&stats_proxy_,
process_thread_),
video_stream_decoder_(&video_receiver_,
&rtp_stream_receiver_,
&rtp_stream_receiver_,
rtp_stream_receiver_.IsRetransmissionsEnabled(),
rtp_stream_receiver_.IsFecEnabled(),
&stats_proxy_,
&incoming_video_stream_,
config.pre_render_callback),
vie_sync_(&video_receiver_) {
LOG(LS_INFO) << "VideoReceiveStream: " << config_.ToString();
@ -205,6 +188,9 @@ VideoReceiveStream::VideoReceiveStream(
RTC_DCHECK(congestion_controller_);
RTC_DCHECK(call_stats_);
// Register the channel to receive stats updates.
call_stats_->RegisterStatsObserver(&video_stream_decoder_);
RTC_DCHECK(!config_.decoders.empty());
std::set<int> decoder_payload_types;
for (const Decoder& decoder : config_.decoders) {
@ -224,6 +210,8 @@ VideoReceiveStream::VideoReceiveStream(
}
video_receiver_.SetRenderDelay(config.render_delay_ms);
incoming_video_stream_.SetExpectedRenderDelay(config.render_delay_ms);
incoming_video_stream_.SetExternalCallback(this);
process_thread_->RegisterModule(&video_receiver_);
process_thread_->RegisterModule(&vie_sync_);
@ -243,6 +231,8 @@ VideoReceiveStream::~VideoReceiveStream() {
for (const Decoder& decoder : config_.decoders)
video_receiver_.RegisterExternalDecoder(nullptr, decoder.payload_type);
call_stats_->DeregisterStatsObserver(&video_stream_decoder_);
congestion_controller_->GetRemoteBitrateEstimator(UseSendSideBwe(config_))
->RemoveStream(rtp_stream_receiver_.GetRemoteSsrc());
}
@ -266,14 +256,7 @@ void VideoReceiveStream::Start() {
if (decode_thread_.IsRunning())
return;
transport_adapter_.Enable();
incoming_video_stream_ = MaybeCreateIncomingVideoStream(config_, this);
video_stream_decoder_.reset(new VideoStreamDecoder(
&video_receiver_, &rtp_stream_receiver_, &rtp_stream_receiver_,
rtp_stream_receiver_.IsRetransmissionsEnabled(),
rtp_stream_receiver_.IsFecEnabled(), &stats_proxy_,
incoming_video_stream_.get(), config_.pre_render_callback));
// Register the channel to receive stats updates.
call_stats_->RegisterStatsObserver(video_stream_decoder_.get());
incoming_video_stream_.Start();
// Start the decode thread
decode_thread_.Start();
decode_thread_.SetPriority(rtc::kHighestPriority);
@ -281,12 +264,10 @@ void VideoReceiveStream::Start() {
}
void VideoReceiveStream::Stop() {
incoming_video_stream_.Stop();
rtp_stream_receiver_.StopReceive();
video_receiver_.TriggerDecoderShutdown();
decode_thread_.Stop();
call_stats_->DeregisterStatsObserver(video_stream_decoder_.get());
video_stream_decoder_.reset();
incoming_video_stream_.reset();
transport_adapter_.Disable();
}
@ -308,26 +289,16 @@ VideoReceiveStream::Stats VideoReceiveStream::GetStats() const {
return stats_proxy_.GetStats();
}
// TODO(tommi): This method grabs a lock 6 times.
void VideoReceiveStream::OnFrame(const VideoFrame& video_frame) {
// TODO(tommi): OnDecodedFrame grabs a lock, incidentally the same lock
// that OnSyncOffsetUpdated() and OnRenderedFrame() below grab.
stats_proxy_.OnDecodedFrame();
int64_t sync_offset_ms;
// TODO(tommi): GetStreamSyncOffsetInMs grabs three locks. One inside the
// function itself, another in GetChannel() and a third in
// GetPlayoutTimestamp. Seems excessive. Anyhow, I'm assuming the function
// succeeds most of the time, which leads to grabbing a fourth lock.
if (vie_sync_.GetStreamSyncOffsetInMs(video_frame, &sync_offset_ms)) {
// TODO(tommi): OnSyncOffsetUpdated grabs a lock.
if (vie_sync_.GetStreamSyncOffsetInMs(video_frame, &sync_offset_ms))
stats_proxy_.OnSyncOffsetUpdated(sync_offset_ms);
}
// config_.renderer must never be null if we're getting this callback.
config_.renderer->OnFrame(video_frame);
if (config_.renderer)
config_.renderer->OnFrame(video_frame);
// TODO(tommi): OnRenderFrame grabs a lock too.
stats_proxy_.OnRenderedFrame(video_frame.width(), video_frame.height());
}


@ -96,10 +96,10 @@ class VideoReceiveStream : public webrtc::VideoReceiveStream,
CallStats* const call_stats_;
vcm::VideoReceiver video_receiver_;
std::unique_ptr<rtc::VideoSinkInterface<VideoFrame>> incoming_video_stream_;
IncomingVideoStream incoming_video_stream_;
ReceiveStatisticsProxy stats_proxy_;
RtpStreamReceiver rtp_stream_receiver_;
std::unique_ptr<VideoStreamDecoder> video_stream_decoder_;
VideoStreamDecoder video_stream_decoder_;
ViESyncModule vie_sync_;
std::unique_ptr<IvfFileWriter> ivf_writer_;


@ -17,6 +17,7 @@
#include "webrtc/base/checks.h"
#include "webrtc/base/logging.h"
#include "webrtc/common_video/include/frame_callback.h"
#include "webrtc/common_video/include/incoming_video_stream.h"
#include "webrtc/modules/video_coding/video_coding_impl.h"
#include "webrtc/modules/video_processing/include/video_processing.h"
#include "webrtc/system_wrappers/include/metrics.h"
@ -31,9 +32,9 @@ VideoStreamDecoder::VideoStreamDecoder(
VCMFrameTypeCallback* vcm_frame_type_callback,
VCMPacketRequestCallback* vcm_packet_request_callback,
bool enable_nack,
bool enable_fec,
bool enable_fec, // TODO(philipel): Actually use this.
ReceiveStatisticsProxy* receive_statistics_proxy,
rtc::VideoSinkInterface<VideoFrame>* incoming_video_stream,
IncomingVideoStream* incoming_video_stream,
I420FrameCallback* pre_render_callback)
: video_receiver_(video_receiver),
receive_stats_callback_(receive_statistics_proxy),
@ -50,10 +51,16 @@ VideoStreamDecoder::VideoStreamDecoder(
video_receiver_->RegisterFrameTypeCallback(vcm_frame_type_callback);
video_receiver_->RegisterReceiveStatisticsCallback(this);
video_receiver_->RegisterDecoderTimingCallback(this);
static const int kDefaultRenderDelayMs = 10;
video_receiver_->SetRenderDelay(kDefaultRenderDelayMs);
VCMVideoProtection video_protection =
enable_nack ? (enable_fec ? kProtectionNackFEC : kProtectionNack)
: kProtectionNone;
VCMVideoProtection video_protection = kProtectionNone;
if (enable_nack) {
if (enable_fec)
video_protection = kProtectionNackFEC;
else
video_protection = kProtectionNack;
}
VCMDecodeErrorMode decode_error_mode = enable_nack ? kNoErrors : kWithErrors;
video_receiver_->SetVideoProtection(video_protection, true);
@ -63,14 +70,7 @@ VideoStreamDecoder::VideoStreamDecoder(
video_receiver_->RegisterPacketRequestCallback(packet_request_callback);
}
VideoStreamDecoder::~VideoStreamDecoder() {
// Unset all the callback pointers that we set in the ctor.
video_receiver_->RegisterPacketRequestCallback(nullptr);
video_receiver_->RegisterDecoderTimingCallback(nullptr);
video_receiver_->RegisterReceiveStatisticsCallback(nullptr);
video_receiver_->RegisterFrameTypeCallback(nullptr);
video_receiver_->RegisterReceiveCallback(nullptr);
}
VideoStreamDecoder::~VideoStreamDecoder() {}
// Do not acquire the lock of |video_receiver_| in this function. Decode
// callback won't necessarily be called from the decoding thread. The decoding
@ -84,9 +84,7 @@ int32_t VideoStreamDecoder::FrameToRender(VideoFrame& video_frame) { // NOLINT
}
}
if (incoming_video_stream_)
incoming_video_stream_->OnFrame(video_frame);
incoming_video_stream_->OnFrame(video_frame);
return 0;
}


@ -19,7 +19,6 @@
#include "webrtc/base/criticalsection.h"
#include "webrtc/base/platform_thread.h"
#include "webrtc/base/scoped_ref_ptr.h"
#include "webrtc/media/base/videosinkinterface.h"
#include "webrtc/modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h"
#include "webrtc/modules/video_coding/include/video_coding_defines.h"
#include "webrtc/typedefs.h"
@ -32,6 +31,7 @@ class ChannelStatsObserver;
class Config;
class EncodedImageCallback;
class I420FrameCallback;
class IncomingVideoStream;
class ReceiveStatisticsProxy;
class VideoRenderCallback;
class VoEVideoSync;
@ -58,7 +58,7 @@ class VideoStreamDecoder : public VCMReceiveCallback,
bool enable_nack,
bool enable_fec,
ReceiveStatisticsProxy* receive_statistics_proxy,
rtc::VideoSinkInterface<VideoFrame>* incoming_video_stream,
IncomingVideoStream* incoming_video_stream,
I420FrameCallback* pre_render_callback);
~VideoStreamDecoder();
@ -89,16 +89,18 @@ class VideoStreamDecoder : public VCMReceiveCallback,
void OnRttUpdate(int64_t avg_rtt_ms, int64_t max_rtt_ms) override;
private:
// Assumed to be protected.
void StartDecodeThread();
void StopDecodeThread();
// Used for all registered callbacks except rendering.
rtc::CriticalSection crit_;
vcm::VideoReceiver* const video_receiver_;
ReceiveStatisticsProxy* const receive_stats_callback_;
rtc::VideoSinkInterface<VideoFrame>* const incoming_video_stream_;
IncomingVideoStream* const incoming_video_stream_;
// TODO(tommi): This callback is basically the same thing as the one above.
// We shouldn't need to support both.
I420FrameCallback* const pre_render_callback_;
int64_t last_rtt_ms_ GUARDED_BY(crit_);