Adding a Descriptor-based REST API Test

To illustrate a descriptor-based REST API test, consider another example. Say you have an environment where multiple servers/desktops are installed. This example involves the creation of a CPUPerformance_ex test, which will execute a REST API in the target environment and report the CPU utilization of each server/desktop.

Begin adding this test by selecting the Test option from the Integration Console tile. In the integration console - test page that appears next, click the Add New Test button. The new test details page (see Figure 1) appears, wherein the following details need to be provided for our example:

  • Test type - Performance

  • Test name - CPUPerformance_ex

    Note:

    While adding a new test using the Integration Console, ensure that the Test name always ends with _ex. If it does not, an error message will appear upon clicking the Add button in Figure 1.

  • Duplicate - Since the new test is not a duplicate of any existing test, set the Duplicate flag to No.
  • Execution - Internal, as an internal agent will be executing the test

    Note:

    Using the Integration Console plugin, you can add an internal or an external test, but you cannot add tests that need to be run by a remote agent, i.e., tests that are executed in an agentless manner.

  • Port based - Specify whether the target server listens on a port or not.
  • Category - REST

Figure 1 : Adding a descriptor-based test of type REST

Then, click the Add button in Figure 1 to add the new test to the eG Enterprise system. The API tab page then opens automatically (see Figure 2).

Figure 2 : The API tab

Click the Configure API button in Figure 2 to configure a REST API of your choice. Figure 3 then appears.

Figure 3 : Configuring the REST API

In Figure 3, specify the following:

API Name - Specify the name of the API that you wish to configure.

REST URL - Specify the exact REST URL to which this test should connect in order to collect the required metrics.

HTTP Method - Pick the required HTTP method from this list. By default, this is set to GET.

User Authentication Required - By default, the User Authentication Required slider is turned off. This indicates that any user can collect the data from the REST URL. In some environments, however, only a specific user will be authorized to execute the REST API and collect the required data. In such environments, administrators can turn on the User Authentication Required slider (see Figure 3). This will enable the User name and Password text boxes. Specify the User name and Password of the user who is authorized to execute the REST API.
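
The fields in Figure 3 essentially describe the HTTP request that the test will issue. As a point of reference only, the sketch below (in Python, using the requests library) shows the kind of call a GET-based configuration with user authentication would correspond to; the URL, user name, and password are hypothetical placeholders, not values from the product.

    # Minimal sketch of a GET call with basic authentication; the URL and
    # credentials are hypothetical stand-ins for the REST URL, User name,
    # and Password fields in Figure 3.
    import requests

    response = requests.get(
        "https://target.example.com/api/v1/cpu",   # hypothetical REST URL
        auth=("monitor_user", "secret"),           # hypothetical User name / Password
        timeout=30,
    )
    print(response.status_code, response.json())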

Header Required - By default, the Header Required slider is turned off (see Figure 3), indicating that no additional information is provided in the HTTP header while executing the REST API. If you wish to provide additional information in the HTTP header, turn on the Header Required slider as shown in Figure 4. The Name and Value text boxes will then appear. Specify the Name and Value parameters that you need to include in the header of the REST API and click the Add Header button. The NAME and VALUE columns in the section below will then be populated. To delete a Name and Value pair, click the icon available in that row. To delete all the Name and Value pairs, select the check box preceding the NAME column and click the icon.

Figure 4 : Adding Headers
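
A header Name/Value pair added here simply travels as an extra HTTP header on that request. A minimal sketch, with a hypothetical header name and value:

    # Sketch of the same hypothetical GET call carrying one additional HTTP
    # header, equivalent to a Name/Value pair added under Header Required.
    import requests

    headers = {"X-Api-Key": "abc123"}              # hypothetical header Name/Value
    response = requests.get(
        "https://target.example.com/api/v1/cpu",
        headers=headers,
        timeout=30,
    )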

Parameter Required - By default, the Parameter Required slider is turned off (see Figure 3), indicating that no additional parameters are specified as variable parts of your REST API resources, i.e., the data that you are working on. If you wish to provide additional parameters, turn on the Parameter Required slider as shown in Figure 5. The Name and Value text boxes will then appear. Specify the Name and Value parameters that you need to include as the parameters of the REST API and click the Add Param button. The NAME and VALUE columns in the section below will then be populated. To delete a Name and Value pair, simply click the icon available in that row. To delete all the Name and Value pairs, select the check box preceding the NAME column and click the icon.

Figure 5 : Adding Parameters
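
A parameter Name/Value pair, in contrast, is passed as a variable part of the request, much like a query parameter appended to the REST URL. A minimal sketch, with a hypothetical parameter:

    # Sketch of the same hypothetical GET call with one query parameter,
    # equivalent to a Name/Value pair added under Parameter Required.
    import requests

    params = {"datacenter": "DC1"}                 # hypothetical parameter Name/Value
    response = requests.get(
        "https://target.example.com/api/v1/cpu",
        params=params,                             # sent as ...?datacenter=DC1
        timeout=30,
    )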

Request Body - If POST is chosen from the HTTP Method list, then an additional Request Body section will appear as shown in Figure 3. By default, the x-www-form-urlencoded flag will be chosen, indicating that the names and values are encoded as name-value tuples separated by '&', with an '=' between each name and value. Specify the Name and Value parameters that you need to include in the body of the REST API and click the Add Body button. The NAME and VALUE columns in the section below will then be populated. To delete a Name and Value pair, simply click the icon available in that row. To delete all the Name and Value pairs, select the check box preceding the NAME column and click the icon. If you wish to post raw data instead, choose the raw flag.
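
The two Request Body flags differ only in how the body is encoded. The sketch below contrasts them for a hypothetical POST endpoint; the field names and values are purely illustrative.

    # Sketch contrasting the two Request Body encodings for a hypothetical POST.
    import requests

    # x-www-form-urlencoded: name-value tuples joined by '&', with '=' between
    # each name and value (e.g. scope=all&unit=percent)
    requests.post(
        "https://target.example.com/api/v1/cpu",
        data={"scope": "all", "unit": "percent"},  # hypothetical Name/Value pairs
        timeout=30,
    )

    # raw: the body is posted as a raw document, here a JSON payload
    requests.post(
        "https://target.example.com/api/v1/cpu",
        json={"scope": "all", "unit": "percent"},
        timeout=30,
    )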

Clicking the Save button will invoke Figure 6.

Figure 6 : Validating REST API configuration

If you wish to validate your REST API configuration, choose the Yes button in Figure 6. If the validation is successful, a message as shown in Figure 7 will appear.

Figure 7 : REST API configuration validated successfully

If the REST API configuration validation fails, Figure 8 will appear.

Figure 8 : REST API configuration validation failure

Once your REST API is validated successfully, your configuration will be displayed in the API tab, as shown in Figure 9.

Figure 9 : The REST API configuration

Now, click the Measure tab in Figure 9 to configure the measures for the test. Figure 10 then appears.

Figure 10 : The Measure tab

Clicking the Configure Metrics button in Figure 10 will invoke the Metrics Configuration pop-up window, as depicted in Figure 11.

Figure 11 : The Metrics Configuration window with the API output

By default, the Data Field Configuration section in Figure 11 will list the data obtained as a result of executing the REST API in the target environment. Each set of data obtained will be listed in a separate column. By default, all the columns will be chosen, indicating that each column label will be a measure of the test. If you wish to choose only a few columns, you can do so by deselecting the check box preceding the column name.

Note:

If the data returned upon execution of the REST API in the target environment is in JSON format and comprises only numeric values, then the test is considered a non-descriptor-based test. If multiple rows of data are returned as the output of the executed REST API and a column contains alpha-numeric values, then you can consider the test to be a descriptor-based test. The column that contains the alpha-numeric values will be the "descriptor" of the test. If multiple columns with alpha-numeric values are returned as the output by the REST API, ensure that you select only one such column in Figure 11.
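
To make the distinction concrete, here is a hypothetical pair of outputs, shown as Python literals; the field names and values are invented purely for illustration.

    # Hypothetical non-descriptor output: a single set of purely numeric values,
    # so each field simply becomes a measure of the test.
    non_descriptor_output = {"cpu_util": 42.5, "mem_util": 61.2}

    # Hypothetical descriptor-based output: multiple rows, with an alpha-numeric
    # SERVER column whose values become the descriptors of the test.
    descriptor_output = [
        {"SERVER": "web-server-01", "CPU_UTIL": 42.5},
        {"SERVER": "db-server-01",  "CPU_UTIL": 73.1},
    ]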

Once you have chosen the columns that need to be displayed as measures, click the Next button in Figure 11. Figure 12 will then appear, listing all the measures chosen for the test.

Figure 12 : Measures listed for the CPUPerformance_ex test

From Figure 11, we can see that the JSON output obtained upon execution of the REST API contains multiple rows of data, as well as alpha-numeric values in the SERVER column. Therefore, we can conclude that this is a descriptor-based test and that the values in the SERVER column are the descriptors of the test.

To this end, Figure 12 displays a Descriptor based test slider, which is turned off by default. If you wish to develop this test as a descriptor-based test, turn on the slider. The Descriptor list will then appear, listing the column that contained non-numeric values. In our example, this is server. Choosing the server option from this list ensures that the descriptors of this test will be the values displayed in the SERVER column of Figure 11.
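
Conceptually, choosing a descriptor column splits the returned rows so that every remaining column is reported once per descriptor value. A rough sketch of that idea, reusing the hypothetical rows from the earlier illustration:

    # Rough sketch: group the remaining columns by the chosen descriptor column
    # (SERVER). The rows are the hypothetical descriptor-based output shown above.
    rows = [
        {"SERVER": "web-server-01", "CPU_UTIL": 42.5},
        {"SERVER": "db-server-01",  "CPU_UTIL": 73.1},
    ]

    measures_by_descriptor = {
        row["SERVER"]: {k: v for k, v in row.items() if k != "SERVER"}
        for row in rows
    }
    # -> {"web-server-01": {"CPU_UTIL": 42.5}, "db-server-01": {"CPU_UTIL": 73.1}}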

Click the Generate button in Figure 12 to integrate the test’s implementation into the eG Enterprise system.

When a test’s measurements are successfully configured, the eG Enterprise system prompts the user to specify the default threshold settings for each of the measurements made by the newly added test as shown in Figure 13. You can set your own thresholds by clicking on the measure.

Figure 13 : Configuring thresholds for the newly created descriptor-based REST API test

Clicking the Finish button in Figure 13 will complete the creation of the descriptor-based REST API test.

In some cases, the status of the descriptors may be returned by the REST API as alpha-numeric values. Let us take the example of the InterfaceTest_ex test. The names of the Interfaces, their availability, and their bandwidth are returned as the output (see Figure 14).

Figure 14 : The JSON output of the InterfaceTest_ex

The Availability column displays unique alpha-numeric values corresponding to each Interface. If you choose the Interface name as the Descriptor (see Figure 15), the Availability and Bandwidth measures will be reported for each Interface. To plot the measure graph for the Availability measure, which takes the values UP/Down, you can use the Measure internal value configuration Details section. Specify the value obtained as the output of the REST API in the Measure Display Value text box (in our example, the value obtained in the Availability column of Figure 14) and a corresponding numeric value in the Measure Numeric value text box. Click the Add button to add those values. Similarly, you can specify any number of numeric values corresponding to the status of the measures. The values displayed in the Measure Numeric value text box are the values that will be used to plot the measure graph of the corresponding measure displayed in the eG monitor console for this test.

Figure 15 : Specifying the numeric values corresponding to the status of the chosen descriptor
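
In effect, the Measure Display Value / Measure Numeric value pairs form a small lookup table that translates the textual status returned by the API into numbers for graphing. A minimal sketch of that idea, with hypothetical values:

    # Hypothetical display-value-to-numeric-value mapping for the Availability
    # measure, mirroring the Measure internal value configuration Details section.
    availability_map = {"Up": 1, "Down": 0}

    def to_numeric(display_value: str) -> int:
        """Translate the textual status returned by the REST API into the numeric
        value used to plot the measure graph in the eG monitor console."""
        return availability_map[display_value]

    print(to_numeric("Up"))    # -> 1
    print(to_numeric("Down"))  # -> 0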

Click the Generate button in Figure 15 to integrate the test’s implementation into the eG Enterprise system.

When a test’s measurements are successfully configured, the eG Enterprise system prompts the user to specify the default threshold settings for each of the measurements made by the newly added test as shown in Figure 16. You can set your own thresholds by clicking on the measure.

Figure 16 : Configuring thresholds for the newly created InterfaceTest_ex