Terminal Server Input Delay Test

One of the most common issues users face during RDP sessions is slow response from applications. This slowness is not always caused by network latency, workload, a hardware configuration issue, or an operating system issue. In some cases, users may experience slowness due to input device lag, that is, the time delay between the user's input via keyboard or mouse and the time at which the application responds to that input. Such input delay can slow access to applications and significantly degrade the overall user experience during RDP sessions. Whenever users experience slowness in accessing applications, administrators need to figure out whether the delay is caused by network latency, workload, a hardware configuration issue, an operating system issue, or input device lag. By knowing what is causing the delay, administrators can effectively troubleshoot the problem condition and resolve it in time, so that the user experience during RDP sessions is not impacted. To help administrators in this regard, eG offers the Terminal Server Input Delay test.

This test monitors the RDS server and reports the maximum and average time taken by applications to respond to user input across all RDP sessions. Using these metrics, administrators can easily figure out whether the slowness is caused by input device lag and, if so, take remedial action to resolve the issue before it seriously affects the user experience.
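As a rough illustration of what these two measures represent (a sketch, not eG's actual implementation), the aggregation of per-session input-delay samples into a maximum and an average can be expressed in Python. The sample values and the millisecond unit below are assumptions for the example:

```python
def summarize_input_delay(samples_ms):
    """Aggregate per-session input-delay samples (milliseconds) into
    the two measures this test reports, expressed in seconds."""
    if not samples_ms:
        # No active RDP sessions sampled: both measures are zero.
        return {"max_s": 0.0, "avg_s": 0.0}
    return {
        "max_s": max(samples_ms) / 1000.0,
        "avg_s": sum(samples_ms) / len(samples_ms) / 1000.0,
    }

# Hypothetical delay samples from three RDP sessions
print(summarize_input_delay([120, 45, 300]))
```

A value of 0.3 s for the maximum here would come from a single laggy session, even if the average across sessions stays low, which is why the test reports both measures.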

Note:

This test will report measures only for the Windows 2019 desktops on the Microsoft RDS server.

Target of the test : A Microsoft RDS server

Agent deploying the test : An internal agent

Outputs of the test : One set of results for the target RDS server being monitored.

Configurable parameters for the test
  1. TEST PERIOD – How often should the test be executed
  2. Host – The host for which the test is to be configured
  3. Port – The port used by the Microsoft RDS server
Measurements made by the test

Measurement: Maximum input delay for sessions

Description: Indicates the maximum time lag detected between the user's input through any input device (e.g., mouse, keyboard) and the time at which the application responds to the input.

Unit: Seconds

Interpretation: Ideally, the values of these measures should be zero or very low. High values for these measures can slow access to applications in the environment, which in turn seriously degrades the overall user experience. To know exactly which user/application experiences the maximum input lag, you can refer to the Terminal Users and Terminal Applications tests.

Measurement: Average input delay for sessions

Description: Indicates the average time lag detected between the user's input through any input device (e.g., mouse, keyboard) and the time at which the application responds to the input.

Unit: Seconds
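Since the interpretation above notes that these measures should ideally be zero or very low, an administrator could flag problem conditions with a simple threshold check. The following sketch assumes a 0.1-second threshold, which is an illustrative value, not an eG default:

```python
def flag_input_lag(max_delay_s, avg_delay_s, threshold_s=0.1):
    """Flag a potential input-lag problem when either the maximum or the
    average input delay (in seconds) exceeds the assumed threshold."""
    return max_delay_s > threshold_s or avg_delay_s > threshold_s

# Hypothetical readings: 0.3 s max, 0.155 s average
print(flag_input_lag(0.3, 0.155))  # lagging sessions detected
print(flag_input_lag(0.0, 0.0))    # healthy
```

When such a check fires, the Terminal Users and Terminal Applications tests referenced above can then narrow down which user or application is responsible.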