Audio Performance - RT Test

A/V conferencing enables real-time audio and video communications between Microsoft Teams users. In environments where users rely on audio streams for communication, administrators must keep track of the quality of the audio conferencing experience and the load on Microsoft Teams, so that they are proactively alerted to abnormalities/technical glitches in the conferences. This is where the Audio Performance - RT test helps.

Using this test, you can determine the total number of audio streams and how many of them were classified as poor, good, or unclassified. Administrators can also figure out why audio streams were classified as poor: was it due to high roundtrip, high packet loss, jitter, or high degradation?

Target of the test: Microsoft Teams

Agent deploying the test: A remote agent

Outputs of the test: One set of results for the monitored Microsoft Teams service

Configurable parameters for the test
Parameters Description

Test period

How often should the test be executed

Host

The host for which the test is to be configured. By default, this is portal.office.com

Tenant Name

This parameter applies only if you want the eG agent to use Azure AD Certificate-based Authentication for accessing and monitoring an O365 tenant and its resources.

Azure AD certificate-based authentication (CBA) enables customers to allow or require users to authenticate with X.509 certificates against their Azure Active Directory (Azure AD) for applications and browser sign-in. When monitoring highly secure Office 365 environments, you can configure the eG agent to identify itself to a tenant using a valid X.509 certificate, so that it is allowed secure access to the tenant and its resources.

By default, the value of this parameter is none. This means that, by default, the eG agent does not use certificate-based authentication to connect to an O365 tenant.

On the other hand, if you want the eG agent to use this modern authentication technique to securely access a tenant's resources, you should do the following:

  1. Enable Azure AD Certificate-based authentication for the target O365 tenant; this can be achieved manually, via the Office 365 portal, or automatically, using the PowerShell scripts we provide. For the manual procedure, refer to Manually Enabling Certificate-based Authentication For an Office 365 Tenant under Microsoft Office 365. For the automatic procedure, refer to Automatically Fulfilling Pre-requisites in a Modern Authentication-Enabled Environment under Microsoft Office 365.

    When enabling certificate-based authentication, an X.509 certificate will be generated for the target tenant.

  2. Configure the Tenant Name parameter with the name of the tenant for which certificate-based authentication is enabled. Using the tenant name, the eG agent will be able to read the details of the X.509 certificate that is generated for that tenant, and use that certificate to access that tenant's resources. To determine the tenant name, do the following:

    • Log in to the Microsoft 365 Admin Center as an administrator.

    • Under Setup, click on Domains.

    • Find a domain that ends with .onmicrosoft.com - this is your Microsoft O365 tenant name.
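
The eG agent performs this certificate lookup and token acquisition internally. Purely as an illustration of certificate-based authentication against an Azure AD tenant, the Python sketch below uses the MSAL library; the tenant name, client ID, certificate file, and thumbprint are placeholders, not values taken from this test.

    # Illustrative only: app-only, certificate-based authentication against an
    # Azure AD tenant using MSAL for Python. All identifiers are placeholders.
    import msal

    TENANT_NAME = "contoso.onmicrosoft.com"             # tenant name found above (placeholder)
    CLIENT_ID = "00000000-0000-0000-0000-000000000000"  # app registration ID (placeholder)

    with open("agent_cert_private_key.pem") as f:       # certificate private key (placeholder path)
        private_key = f.read()

    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_NAME}",
        client_credential={
            "private_key": private_key,
            "thumbprint": "ABCDEF0123456789ABCDEF0123456789ABCDEF01",  # certificate thumbprint (placeholder)
        },
    )

    # Acquire an app-only token; monitoring calls can then present it in the
    # Authorization header of their requests.
    result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    if "access_token" in result:
        print("Authenticated to tenant", TENANT_NAME)
    else:
        print("Authentication failed:", result.get("error_description"))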

O365 User Name, O365 Password, and Confirm Password

These parameters need to be configured only if the Tenant Name parameter is set to none. On the other hand, if a valid Tenant Name is configured, then you should set these parameters to none.

For execution, this test requires the privileges of an O365 user who has been assigned the Service support admin role and is vested with the View-Only Audit Logs and Teams administrator permissions. Configure the credentials of such a user in the O365 User Name and O365 Password text boxes. Confirm the password by retyping it in the Confirm Password text box.

Domain Name, Domain User Name, Domain Password, and Confirm Password

These parameters are applicable only if the eG agent needs to communicate with the Office 365 portal via a Proxy server.

In this case, in the Domain Name text box, specify the name of the Windows domain to which the eG agent host belongs. In the Domain User Name text box, mention the name of a valid domain user with login rights to the eG agent host. Provide the password of that user in the Domain Password text box and confirm that password by retyping it in the Confirm Password text box.

On the other hand, if the eG agent is not behind a Proxy server, then you need not disturb the default setting of these parameters. By default, these parameters are set to none.

Proxy Host, Proxy Port, Proxy User Name, Proxy Password and Confirm Password

These parameters are applicable only if the eG agent needs to communicate with the Office 365 portal via a Proxy server.

In this case, provide the IP/host name and port number of the Proxy server that the eG agent should use in the Proxy Host and Proxy Port parameters, respectively.

If the Proxy server requires authentication, then specify the credentials of a valid Proxy user against the Proxy User Name and Proxy Password text boxes. Confirm that password by retyping it in the Confirm Password text box. If the Proxy server does not require authentication, then specify none against the Proxy User Name, Proxy Password, and Confirm Password text boxes.

On the other hand, if the eG agent is not behind a Proxy server, then you need not disturb the default setting of any of the Proxy-related parameters. By default, these parameters are set to none.
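
The Proxy-related parameters simply tell the eG agent how to reach the Office 365 portal through the proxy. Purely as an illustration (this is not the agent's actual implementation), the Python sketch below routes an HTTPS request to portal.office.com through an authenticated proxy; the proxy host, port, and credentials are placeholders.

    # Illustrative only: reaching the Office 365 portal through an authenticated
    # proxy. Host, port, and credentials are placeholders.
    import requests

    PROXY_HOST = "proxy.example.local"   # Proxy Host (placeholder)
    PROXY_PORT = 8080                    # Proxy Port (placeholder)
    PROXY_USER = "proxyuser"             # Proxy User Name (placeholder)
    PROXY_PASS = "proxypass"             # Proxy Password (placeholder)

    # If the proxy does not require authentication, drop the user:password@ part.
    proxies = {
        "https": f"http://{PROXY_USER}:{PROXY_PASS}@{PROXY_HOST}:{PROXY_PORT}",
    }

    response = requests.get("https://portal.office.com", proxies=proxies, timeout=30)
    print(response.status_code)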

Show N Good Streams

By default, this parameter is set to all, indicating that the detailed diagnosis will report the details of all streams that performed well. To restrict the report to a specific number of good streams, replace all with a number of your choice in this text box.

Show N Poor Streams

By default, this parameter is set to all, indicating that the detailed diagnosis will report the details of all streams that performed poorly. To restrict the report to a specific number of poor streams, replace all with a number of your choice in this text box.

DD Frequency

Refers to the frequency with which detailed diagnosis measures are to be generated for this test. The default is 1:1. This indicates that, by default, detailed measures will be generated every time the test runs, and also every time the test detects a problem. You can modify this frequency, if you so desire. Also, if you intend to disable the detailed diagnosis capability for this test, you can do so by specifying none against DD Frequency.

Detailed Diagnosis

To make diagnosis more efficient and accurate, eG Enterprise embeds an optional detailed diagnostic capability. With this capability, the eG agents can be configured to run detailed, more elaborate tests as and when specific problems are detected. To enable the detailed diagnosis capability of this test for a particular server, choose the On option. To disable the capability, choose the Off option. The option to selectively enable/disable the detailed diagnosis capability will be available only if the following conditions are fulfilled:

  • The eG manager license should allow the detailed diagnosis capability.
  • Both the normal and abnormal frequencies configured for the detailed diagnosis measures should not be 0.
Measurements made by the test
Measurement Description Measurement Unit Interpretation

Unclassified streams

Indicates the number of streams that were marked as Unclassified.

Number

A stream is marked Unclassified when Interactive Connectivity Establishment (ICE) connectivity fails or when all the metrics required to compute the stream classification are not reported.

If ICE connectivity succeeded for an Unclassified stream, the stream is likely considered Unclassified because key stream metrics were not reported. There are a few reasons these metrics may not be reported:

  • QoE reports were not received
  • Short calls
  • Low packet utilization

Use the detailed diagnosis of this measure to know which audio streams were marked as unclassified. The start time, end time, first UPN, second UPN, first and second IP addresses, conference ID, segment ID, call type, participants, and caller and callee of each unclassified stream are reported as part of detailed diagnosis, along with the number of times each stream was marked as unclassified.

Total audio streams

Indicates the total number of audio streams.

Number

 

Poor streams

Indicates the total number of audio streams that were classified as poor.

Number

The detailed diagnosis of this measure lists the start time, end time, first UPN, second UPN, first IP address, second IP address, conference ID, the list of participants, caller and callee ID, and the count of times that each stream was classified as 'poor'.

Percentage of poor audio streams

Indicates the percentage of audio streams that were classified as Poor.

Percentage
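
For example, if 4 of the 80 audio streams seen during a measurement period are classified as Poor, this measure reports (4 / 80) x 100 = 5 percent.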

 

Poor audio with high roundtrip

Indicates the number of times the audio was classified as poor due to high roundtrip.

Number

The detailed diagnosis of this measure lists the start time, end time, first UPN, second UPN, first IP address, second IP address, conference ID, the list of participants, caller and callee ID, and the count of times each stream was classified as 'poor' owing to high roundtrip time.

Poor audio due to high packet loss

Indicates the number of times audio was classified as poor due to a large number of packets being lost.

Number

The detailed diagnosis of this measure lists the start time, end time, first UPN, second UPN, first IP address, second IP address, conference ID, the list of participants, caller and callee ID, and the count of times each stream was classified as 'poor' owing to high packet loss.

Poor audio due to jitter

Indicates the number of times audio was classified as poor due to jitter.

Number

The detailed diagnosis of this measure lists the start time, end time, first UPN, second UPN, first IP address, second IP address, conference ID, the list of participants, caller and callee ID, and the count of times each stream was classified as 'poor' owing to jitter.

Poor audio due to high degradation

Indicates the number of times audio was classified as poor due to high average network mean opinion score (MOS) degradation.

Number

This measure indicates how much network loss and jitter have impacted the quality of received audio.

The detailed diagnosis of this measure lists the start time, end time, first UPN, second UPN, first IP address, second IP address, conference ID, the list of participants, caller and callee ID, and the count of times each stream was classified as 'poor' due to average network mean opinion score degradation.

Poor audio with high concealed ratio

Indicates the number of times audio was classified as poor due to a high average ratio of audio frames with concealed samples (generated by packet loss healing) to the total number of audio frames.

Number

The detailed diagnosis of this measure lists the start time, end time, first UPN, second UPN, first IP address, second IP address, conference ID, the list of participants, caller and callee ID, and the count of times each stream was classified as 'poor' owing to high concealed ratio.
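
The five poor-audio measures above each map to one network-quality metric on the stream. Purely as an illustration, the Python sketch below classifies a stream as poor when any such metric crosses a threshold; the threshold values are assumptions patterned on Microsoft's published audio-stream classification and are not defined by this test.

    # Illustrative only: flagging an audio stream as poor when any per-stream
    # network metric crosses a threshold. Thresholds are assumed values, not
    # values taken from this test.
    from dataclasses import dataclass

    @dataclass
    class AudioStreamMetrics:
        round_trip_ms: float       # average round-trip time
        packet_loss_rate: float    # fraction of packets lost (0.0 - 1.0)
        jitter_ms: float           # average inter-arrival jitter
        degradation_avg: float     # network mean opinion score degradation
        concealed_ratio: float     # concealed-samples ratio

    def poor_reasons(m: AudioStreamMetrics) -> list:
        """Return the reasons, if any, for which this stream counts as poor."""
        reasons = []
        if m.round_trip_ms > 500:        # assumed threshold
            reasons.append("high roundtrip")
        if m.packet_loss_rate > 0.10:    # assumed threshold
            reasons.append("high packet loss")
        if m.jitter_ms > 30:             # assumed threshold
            reasons.append("jitter")
        if m.degradation_avg > 1.0:      # assumed threshold
            reasons.append("high degradation")
        if m.concealed_ratio > 0.07:     # assumed threshold
            reasons.append("high concealed ratio")
        return reasons

    stream = AudioStreamMetrics(620.0, 0.02, 12.0, 0.4, 0.01)
    print(poor_reasons(stream) or ["good"])   # -> ['high roundtrip']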

Good streams

Indicates the total number of audio streams that were classified as Good.

Number

The detailed diagnosis of this measure lists the start time, end time, first UPN, second UPN, first IP address, second IP address, conference ID, the list of participants, caller and callee ID, and the count of times each stream was classified as 'good'.