REEFT-360: Real-time Emulation and Evaluation Framework for Tile-based 360° Streaming under Time-varying Conditions
Eric Lindskog and Niklas Carlsson
Paper:
Eric Lindskog and Niklas Carlsson.
"REEFT-360: Real-time Emulation and Evaluation Framework for Tile-based 360° Streaming under Time-varying Conditions",
Proc. ACM MMSys,
June 2021.
(pdf)
Abstract:
With 360° video streaming, the user's field of view (a.k.a. viewport)
is at all times determined by the user's current viewing direction. Since any two
users are unlikely to look in the exact same direction as each other throughout the
viewing of a video, the frame-by-frame video sequence displayed during a playback
session is typically unique. This complicates the direct comparison of the perceived
Quality of Experience (QoE) using popular metrics such as the Multiscale-Structural
Similarity (MS-SSIM). Furthermore, there is an absence of light-weight emulation
frameworks for tile-based 360° video streaming that allow easy testing of
different algorithm designs and tile sizes. To address these challenges, we present
REEFT-360, which consists of (1) a real-time emulation framework that captures
tile-quality adaptation under time-varying bandwidth conditions and
(2) a multi-step evaluation process that allows the calculation of MS-SSIM
scores and other frame-based metrics, while accounting for the user's head movements.
Importantly, the framework allows speedy implementation and testing of alternative
head-movement prediction and tile-based prefetching solutions, allows testing under
a wide range of network conditions, and can be used either with a human user or
head-movement traces. The developed software tool is shared with the paper.
We also present proof-of-concept evaluation results that highlight the
importance of including a human subject in the evaluation.
Software and datasets
Our software tool and example datasets are made available here
and are shared with the community.
The .zip file includes two folders.
-
360-video-emulator:
This folder contains the emulator code.
The included
readme.md
file provides instructions
for how to set up and use the emulator.
-
experimentdata:
This folder contains the experiment data.
The
readme.md
file in this folder summarizes
the example experiments that the folder contains
(each in its own subdirectory) and explains
what the files found in these folders contain.
Multi-step evaluations
The setup (using the 360-video-emulator above)
allows many types of evaluations using any of the four
available evaluation modes (some with a human in the loop and others trace-based),
as well as any chunk size, number of tiles, maximum buffer size, video, and network trace.
Please note that a full evaluation using our multi-step evaluation
methodology (which we recommend) requires following the steps outlined
in the accompanying paper. This involves four steps, where the first three
steps involve carefully running three consecutive experiments in the
(1) evaluation mode, (2) recording mode, and (3) baseline mode, respectively,
and then (4) applying the
video quality measurement tool (VQMT),
installed separately, to calculate different per-frame statistics using the outputs from steps
(2) and (3). In the "experimentdata" directory also shared with this paper, we include such
per-frame traces for six metrics (MS-SSIM, PSNR, PSNR-HVS, PSNR-HVS-M, SSIM, VIFP) for each of
the 28 experiments discussed in the paper. In the paper itself, we show only example results
using MS-SSIM and some other metrics calculated from the output files of steps (2) and (3).
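As an illustration of how such per-frame traces can be used, the sketch below aggregates a per-frame metric trace (e.g., MS-SSIM from step (4)) into session-level statistics. The one-value-per-line file format assumed here is for illustration only and may differ from the actual trace files in the experimentdata directory:

```python
# Sketch: aggregate per-frame quality scores (e.g., MS-SSIM) into
# session-level statistics. Assumes a simple one-value-per-line trace
# format; the real trace files may differ.
from statistics import mean

def load_trace(path):
    """Read one per-frame metric value per line, skipping blank lines."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def summarize(values):
    """Return (mean, min, max) over all frames of a session."""
    return mean(values), min(values), max(values)
```

For example, `summarize(load_trace("msssim_per_frame.txt"))` (a hypothetical file name) would yield the mean, minimum, and maximum per-frame MS-SSIM for one playback session, which is useful for comparing sessions whose frame-by-frame viewport sequences differ.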
Example results using multi-step evaluation methodology
The experimentdata folder contains example results
from the 28 example experiments discussed in the paper.
We have applied the multi-step evaluation methodology to each experiment,
with the results of each experiment stored in its own directory.
At the bottom of the corresponding readme.md
file,
we include a table that summarizes each experiment.
Here, the columns give (1) the algorithm used, (2) the video watched,
(3) the buffer size used, (4) the network conditions experienced (based on a trace),
(5) the A parameter value used, and (6) the folder in which the results for that
experiment can be found.
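A small script in this spirit can sweep the per-experiment result directories and tabulate a mean score per experiment. The subdirectory layout and trace file name used below are assumptions for illustration, not the actual layout of the experimentdata folder:

```python
# Sketch: walk per-experiment result directories and report a mean
# per-frame score for each experiment. The layout (one subdirectory per
# experiment) matches the description above, but the trace file name
# ("msssim.txt", one value per line) is an illustrative assumption.
import os
from statistics import mean

def summarize_experiments(root, trace_name="msssim.txt"):
    """Map experiment-directory name -> mean per-frame metric value."""
    results = {}
    for entry in sorted(os.listdir(root)):
        trace = os.path.join(root, entry, trace_name)
        if os.path.isfile(trace):
            with open(trace) as f:
                values = [float(line) for line in f if line.strip()]
            if values:
                results[entry] = mean(values)
    return results
```

Running this over the experimentdata directory (after adjusting the file name to the real traces) would give one summary number per experiment, mirroring the per-experiment table described above.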
Citing our work
Finally, if you use our software, code, or data files in your research,
please include a reference to our paper
(pdf).
-
Cite as:
Eric Lindskog and Niklas Carlsson.
"REEFT-360: Real-time Emulation and Evaluation Framework for Tile-based 360° Streaming under Time-varying Conditions",
Proc. ACM MMSys,
June 2021.