Media Performance Data

The SBC Core system monitors packet loss and jitter that exceeds the jitter buffer's capacity using the playout time series. The playout time series consists of 31 quality measurements, each representing a consecutive time period. Taken as a whole, the measurements show how the playout buffer experienced jitter and packet loss over consecutive time periods. Within each time period, the quality is classified into one of four values:

  • Good
  • Acceptable
  • Poor
  • Unacceptable

Whenever the playout buffer has no data to play because of packet loss or excessive jitter, the SBC tracks the duration of the missing data over a time period. The total duration of missing data in a period is compared against three programmable thresholds (THRESHOLD0, THRESHOLD1, and THRESHOLD2) to classify the performance during that period. The threshold comparison is listed in the table below.

Table : Threshold Comparison

If the duration of the missing data is              Quality is considered
Less than or equal to THRESHOLD0                    Good
Greater than THRESHOLD0 and less than THRESHOLD1    Acceptable
Greater than THRESHOLD1 and less than THRESHOLD2    Poor
Greater than THRESHOLD2                             Unacceptable
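A minimal sketch (Python, not SBC code) of the classification in the table above. The threshold values used here are the 200 ms / 400 ms examples from later in this section; the table leaves durations exactly equal to THRESHOLD1 or THRESHOLD2 unspecified, and this sketch assigns them to the worse class.

```python
# Illustrative thresholds in milliseconds (the SBC's programmable
# playoutTimeseriesThreshold0/1/2 parameters; values are examples).
THRESHOLD0 = 0
THRESHOLD1 = 200
THRESHOLD2 = 400

def classify_period(missing_ms: float) -> str:
    """Map a period's total missing-data duration to a quality label.

    Boundary values equal to THRESHOLD1/THRESHOLD2 fall into the worse
    class here; the table does not specify that case.
    """
    if missing_ms <= THRESHOLD0:
        return "Good"
    if missing_ms < THRESHOLD1:
        return "Acceptable"
    if missing_ms < THRESHOLD2:
        return "Poor"
    return "Unacceptable"
```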

The time series provides an approximate indication of where in the call the packet problems occurred, which helps determine their exact nature. For example, it can distinguish a single large outage from a continuous series of packet issues distributed throughout the call.

Since the time period is fixed, the duration of a call affects the number of time-period intervals used for collecting data. With the default time period of 20 seconds, a short call of 1-30 seconds produces data for one or two time periods, whereas a longer call of 10 minutes has data for 30 time periods. Calls lasting more than 31 time periods retain data only for the last 31 time periods of the call (old data is discarded). If you wish to obtain data at a more granular level, you can configure a shorter time period; however, this precludes you from monitoring longer calls, since only the last 31 time periods are recorded.
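The "last 31 periods" behavior described above can be sketched as a fixed-length window that discards the oldest period once 31 are recorded. This is an illustrative model, not SBC code; the period count and durations are the examples from the paragraph above.

```python
from collections import deque

SERIES_LENGTH = 31     # the playout time series holds 31 measurements
PERIOD_MS = 20_000     # default 20-second time period

# A 10-minute call fills exactly 30 periods, so nothing is discarded.
call_duration_ms = 10 * 60 * 1000
num_periods = -(-call_duration_ms // PERIOD_MS)  # ceiling division -> 30
series = deque(maxlen=SERIES_LENGTH)
for i in range(num_periods):
    series.append(i)

# A 45-minute call spans 135 periods; only the last 31 are retained.
long_periods = (45 * 60 * 1000) // PERIOD_MS     # 135
series2 = deque(maxlen=SERIES_LENGTH)
for i in range(long_periods):
    series2.append(i)
# series2 now holds periods 104..134, the final 31 of the call.
```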

Configuring the Playout Time Series Period and Thresholds

Table : Playout Buffer Sizing Chart

Codec          Playout Buffer Length (ms)   Number of Frames   Frame Size (bytes)   Total Size (bytes)
G.711          200                          20                 80                   1600
G.711 Side B   100                          10                 80                   800
G.726          400                          40                 60                   2400
G.729          500                          50                 10                   500
G.723.1        500                          50                 24                   1200
iLBC 20ms      500                          20                 50                   950
iLBC 30ms      600                          25                 32                   1000
AMR/EFR        500                          25                 32                   875
EVRC/EVRC-B    500                          25                 22                   550
G.722          400                          20                 80                   1600
G.722.1        400                          20                 80                   1600
G.722.2        400                          20                 62                   1230
Opus           400                          40                 300                  12000
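For most rows in the sizing chart, the total buffer size is simply the number of frames multiplied by the frame size; a few rows (e.g., iLBC, AMR/EFR) differ slightly from that product. The following sketch is an assumption-based illustration of that relationship, not SBC code.

```python
def playout_total_bytes(num_frames: int, frame_bytes: int) -> int:
    """Total playout buffer size implied by frames x frame size."""
    return num_frames * frame_bytes

# Spot checks against the chart (these rows follow the product exactly):
g711_total = playout_total_bytes(20, 80)    # G.711 row
g729_total = playout_total_bytes(50, 10)    # G.729 row
opus_total = playout_total_bytes(40, 300)   # Opus row
```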

To configure the playout time series parameters, you set the thresholds to detect a certain percentage of missing data within a time period. For example, to configure a 20-second time period where between one and two percent of missing data is considered Poor quality, and more than two percent of missing data is considered Unacceptable:

  • Calculate the duration of the percentages of the 20-second period:
    • 1 percent of 20 seconds = 0.2 seconds (200msec)
    • 2 percent of 20 seconds = 0.4 seconds (400msec)
  • Assign these values (in milliseconds) to playoutTimeseriesThreshold1 and playoutTimeseriesThreshold2. The playoutTimeseriesThreshold0 parameter is generally left at 0 (the default).
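The arithmetic in the steps above can be sketched as a small helper that converts a target percentage of a period into a threshold in milliseconds (an illustration only; the resulting values are applied with the CLI commands shown below):

```python
def threshold_ms(period_ms: int, percent: float) -> int:
    """Duration (ms) corresponding to a percentage of the time period."""
    return round(period_ms * percent / 100)

period_ms = 20_000                          # 20-second playoutTimeseriesPeriod
threshold1 = threshold_ms(period_ms, 1)     # 1% of 20 s -> 200 ms
threshold2 = threshold_ms(period_ms, 2)     # 2% of 20 s -> 400 ms
```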

The following CLI commands illustrate the example above.

% set system dspPad playoutTimeseriesPeriod 20000
% set system dspPad playoutTimeseriesThreshold1 200
% set system dspPad playoutTimeseriesThreshold2 400