The Citrix Logon Simulator Test

This test emulates a user logging into a Citrix farm and launching an application/desktop. In the process, the test reports the total duration of the simulation, the time taken for the login to be authenticated, the time taken for application/desktop enumeration, the duration of the application/desktop launch, and the logout duration. Additionally, the test captures failures (if any) at each step of the simulation. Using the insights provided by this test, Citrix administrators can proactively detect logon slowness/failures and precisely pinpoint the root cause of the anomaly - is it login authentication? enumeration? application/desktop launch? or logout? This way, Citrix administrators can isolate the probable pain-points of their Citrix delivery infrastructure, even before users begin to actively use applications/desktops.
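The staged timing that the test reports can be pictured with a minimal sketch in Python. The helper and the four stage callables below are hypothetical illustrations of the measurement idea, not the eG agent's actual implementation:

```python
import time

def timed(step):
    """Run a step (a zero-argument callable) and return (result, seconds taken)."""
    start = time.perf_counter()
    result = step()
    return result, time.perf_counter() - start

def simulate(login, enumerate_resources, launch, logoff):
    """Time each stage of a simulated logon; all four callables are hypothetical."""
    timings = {}
    _, timings["logon_duration"] = timed(login)
    _, timings["enumeration_duration"] = timed(enumerate_resources)
    _, timings["launch_duration"] = timed(launch)
    _, timings["logoff_duration"] = timed(logoff)
    # The total is the sum of the four stage durations measured above.
    timings["total_simulation_duration"] = sum(timings.values())
    return timings
```

Because each stage is timed independently, a threshold breach on the total can be traced back to the single stage that contributes most to it - which is exactly how the duration measures of this test are meant to be read together.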

Target of the test : Citrix XenApp / XenDesktop 6.x or 7.x

Agent deploying the test : An external agent

Outputs of the test : One set of results for every published application and/or virtual desktop that the simulator is configured to launch

Configurable parameters for the test

  1. TEST PERIOD - How often should the test be executed? The default is 15 minutes.

    Note:

    Some parameter changes can impact the simulation duration. Most often, this happens in the following situations:

    • If multiple applications/desktops are configured for launching against published resources: In this case, the test will repeat the entire sequence of steps for every configured application/desktop - i.e., after an application is launched, the test will logoff and then log in again to attempt the launch of the next application. This can increase the duration of the simulation.
    • If the value of the Launch Timeout and/or the Web Logoff Delay parameters of the test is significantly increased: If this is done, then the simulator will wait that much longer for the application launch or logoff to happen, thereby increasing simulation duration.
    • If the Prompt flag of the test is set to Yes: If this is done, then the simulator will be forced to respond to each message prompt that appears during its interaction with the application. This in turn will increase simulation duration.

    Sometimes, these changes can cause the simulation to take more time than the configured Test Period. If this happens, the test will fail after logging an error to that effect in the <EG_AGENT_INSTALL_DIR>\agent\error_log file. To avoid this, it is good practice to revisit the TEST PERIOD configuration every time one of the parameters mentioned above is modified, and increase it if required.

  2. Host - The host for which the test is to be configured.
  3. Port - The port used by the Citrix server.
  4. site url - Specify the URL for connecting to StoreFront / NetScaler. You can provide an HTTP or an HTTPS URL here. Before specifying the URL, note the following:

    • Only StoreFront 2.0 (or above) and NetScaler Gateway v9.3 (or above) are supported.
  5. published resources - To know how to configure the resources to be monitored, refer to the How to Configure Published Resources for Monitoring? topic.

  6. Receiver Console Username - The simulator needs to run in the account of a user who has local administrator rights on the simulation end point - i.e., the system on which the external agent and the Citrix Workspace App have been installed. Specify the name of this user here. This user should also be logged in at all times for the simulator to run continuously.
  7. RECEIVER CONSOLE DOMAIN - If the user specified in RECEIVER CONSOLE USERNAME belongs to a domain, specify the name of the domain in the RECEIVER CONSOLE DOMAIN text box. By default, none is specified in this text box.
  8. launch timeout - By default, this parameter is set to 90 seconds. This implies that the simulator will wait for a maximum of 90 seconds (by default) for an application/desktop to launch. If the application/desktop does not launch even after the 90 seconds have elapsed, then the simulation will be automatically terminated, and the simulator will mark that application/desktop launch as 'failed'. Accordingly, the Application launch availability measure for that published resource (i.e., application/desktop) will report the value 0, and no launch duration will be reported for the same.

    In some environments, one or more published applications may take a little longer to launch than the rest. In such environments, you can instruct the simulator to wait longer for each of the configured published resources to launch, by increasing the launch timeout. A higher timeout setting ensures that the simulator captures and reports only genuine launch failures, and does not treat a launch delay as a failure.

  9. web logoff delay - By default, this parameter is set to 30 seconds. This implies that the simulator will wait for a maximum of 30 seconds (by default) after each resource launch, for the logoff to occur. If the logoff does not happen even after 30 seconds have elapsed, then the simulation will be automatically terminated, and the simulator will mark the logoff attempt as 'failed'. A logoff duration will hence not be computed or reported in this case.

    In some environments, logoff may take longer even during normal operation. In such environments, you can instruct the simulator to wait longer for the logoff to occur, by increasing the web logoff delay. A higher timeout setting ensures that the simulator waits for the logoff to complete, and captures and reports an accurate logoff duration.

  10. Prompt - By default, this flag is set to No. This means that, by default, the simulator suppresses all message prompts that may appear during the simulation. If, for some reason, you want the simulator to view and handle these message prompts, set this flag to Yes.
  11. use ica - By default, this flag is set to Yes, indicating that the eG agent itself automatically logs off the simulated sessions. Sometimes however, the eG agent may not be able to perform clean session logoffs. When this happens, simulated sessions may continue to linger on the server in a disconnected state. In simulations that are performed on-premises, where you have control over the target Citrix infrastructure, you can avoid such disconnected sessions and ensure clean application/desktop logoffs by deploying the lightweight eG Logoff Helper software. To know how to install the software, refer to The eG Logoff Helper topic. Once the helper is installed, set the USE ICA flag to No, so that the logoff helper is automatically used for performing session logoffs.

    Note:

    If the simulation is to be performed on Citrix Cloud or Citrix Workspace, where you have no control over the Citrix infrastructure, make sure that the USE ICA flag is set to Yes.

  12. DD FREQUENCY - Refers to the frequency with which detailed diagnosis measures are to be generated for this test. The default is 1:1. This indicates that, by default, detailed measures will be generated every time this test runs, and also every time the test detects a problem. You can modify this frequency, if you so desire. Also, if you intend to disable the detailed diagnosis capability for this test, you can do so by specifying none against dd frequency.
  13. DETAILED DIAGNOSIS - To make diagnosis more efficient and accurate, eG Enterprise embeds an optional detailed diagnostic capability. With this capability, the eG agents can be configured to run detailed, more elaborate tests as and when specific problems are detected. To enable the detailed diagnosis capability of this test for a particular server, choose the On option. To disable the capability, choose the Off option.

    The option to selectively enable/disable the detailed diagnosis capability will be available only if the following conditions are fulfilled:

    • The eG manager license should allow the detailed diagnosis capability
    • Both the normal and abnormal frequencies configured for the detailed diagnosis measures should not be 0.
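The note on TEST PERIOD above can be made concrete with a rough back-of-the-envelope estimate. The formula and the per-resource overhead figure below are illustrative assumptions for planning purposes, not eG's actual scheduling logic:

```python
def worst_case_simulation_secs(num_resources, launch_timeout=90, web_logoff_delay=30,
                               per_resource_overhead=20):
    """Rough upper bound, in seconds, on one full simulation run.

    The test repeats login -> enumeration -> launch -> logoff for every
    configured application/desktop, so the worst case scales with the
    number of published resources. per_resource_overhead is an assumed
    allowance for login and enumeration; tune it for your environment.
    """
    return num_resources * (launch_timeout + web_logoff_delay + per_resource_overhead)

# With 5 published resources and default timeouts, the worst case is
# 5 * (90 + 30 + 20) = 700 seconds - within the default 15-minute
# (900-second) TEST PERIOD. Doubling the launch timeout would not be:
# 5 * (180 + 30 + 20) = 1150 seconds > 900 seconds.
```

Whenever such an estimate approaches or exceeds the configured TEST PERIOD, increase the TEST PERIOD accordingly, as recommended in the note above.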

Measurements made by the test

Figure 1 : The measures reported by the Citrix Logon Simulator test

Measurement | Description | Measurement Unit | Interpretation

Logon availability

Indicates whether/not the simulator logged into the web store successfully, when attempting to launch this application/desktop.

Percent

The value 100 for this measure indicates that logon was successful, and the value 0 indicates that logon failed.

If this measure reports the value 0, then no other measures will be reported for that application/desktop.

You can also use the detailed diagnosis of this measure to view the output of the simulation script, scrutinize it, and isolate the failure and problem points of the Citrix delivery infrastructure at first glance.

Logon duration

Indicates the time taken by the simulator to log in to StoreFront/NetScaler, when attempting to launch this application/desktop.

Secs

If the Total simulation duration for an application/desktop exceeds its threshold, compare the value of this measure with that of the other duration values reported by the test to know where the bottleneck lies - in login authentication? application enumeration? application launch? or log out?

Application/desktop enumeration availability

Indicates whether/not applications/desktops were successfully enumerated on the StoreFront / NetScaler console, when the simulator attempted to launch this application/desktop.

Percent

The value 100 for this measure indicates that application/desktop enumeration was successful, and the value 0 indicates that enumeration failed.

Application/desktop enumeration duration

Indicates the time taken for application/desktop enumeration to complete, when the simulator attempted to launch this application/desktop.

Secs

If the Total simulation duration for an application/desktop exceeds its threshold, compare the value of this measure with that of the other duration values reported by the test to know where the bottleneck lies - in login authentication? application enumeration? application launch? or log out?

Application/desktop launch availability

Indicates whether/not the simulator launched this application/desktop successfully.

Percent

The value 100 for this measure indicates that application/desktop launch was successful, and the value 0 indicates that the launch failed.

By comparing the value of this measure across applications/desktops, you can quickly identify which application/desktop could not be launched.

Application/desktop launch duration

Indicates the time taken by the simulator to launch this application/desktop.

Secs

If the Total simulation duration for an application/desktop exceeds its threshold, compare the value of this measure with that of the other duration values reported by the test to know where the bottleneck lies - in login authentication? application enumeration? application launch? or log out?

Logoff duration

Indicates the time taken by the simulator to log out of StoreFront / NetScaler.

Secs

If the Total simulation duration for an application/desktop exceeds its threshold, compare the value of this measure with that of the other duration values reported by the test to know where the bottleneck lies - in login authentication? application enumeration? application launch? or log out?

Total simulation duration

Indicates the total time taken by the simulator to simulate the launch of this application/desktop.

Secs

An abnormally high value for this measure could indicate a logon slowness. In such a case, compare the value of all the duration values reported by the test to know where the bottleneck lies - in login authentication? application enumeration? application launch? or log out?

Screen refresh latency - avg

Indicates the average time interval measured at the client between the first step (user action) and the last step (graphical response displayed) of this application /desktop's interactions with the endpoint.

Secs

This is a measurement of the screen lag experienced during the simulation - i.e., the latency from when the simulator presses a key until the corresponding response is displayed.

Comparing the value of this measure across simulations will enable administrators to quickly and accurately identify the simulations that are experiencing higher latency.

Data received

Indicates the amount of data received by the simulator during the simulated launch of this application/desktop.

KB

Comparing the value of these measures across simulations will enable administrators to identify the simulation that had sent/received the maximum amount of data.

 

Data sent

Indicates the amount of data sent by the simulator during the simulated launch of this application/desktop.

KB

Frames received

Indicates the number of frames received by the simulator during the simulated launch of this application/desktop.

Number

Comparing the value of these measures across simulations will reveal the simulation that had sent/received the maximum number of frames.

 

Frames sent

Indicates the number of frames sent by the simulator during the simulated launch of this application/desktop.

Number

Use the detailed diagnosis of the Logon availability measure to view the output of the simulation script, scrutinize it, and isolate the failure and problem points of the Citrix delivery infrastructure at a glance. A summary of the simulation is also provided as part of the detailed diagnosis. This includes the Site URL configured for monitoring, the user name used for the simulation, the exact time at which the simulated user logged into the site, and the published resource that was accessed as part of the simulation.

Figure 2 : The detailed diagnosis of the Logon availability measure
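The interpretation guidance repeated throughout the table above - when Total simulation duration breaches its threshold, compare the individual duration measures - amounts to picking the dominant stage. A minimal sketch, with invented sample values and stage names mirroring the measures on this page:

```python
def find_bottleneck(stage_durations):
    """Return the name of the slowest stage from a dict of durations (in seconds)."""
    return max(stage_durations, key=stage_durations.get)

# Sample values are invented for illustration only.
stages = {
    "Logon duration": 4.2,
    "Application/desktop enumeration duration": 1.1,
    "Application/desktop launch duration": 22.7,
    "Logoff duration": 2.0,
}
bottleneck = find_bottleneck(stages)  # -> "Application/desktop launch duration"
```

In this example, the launch stage dominates the total, so the investigation would start with the delivery of the published resource rather than with authentication or enumeration.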