Comparing Performance across Descriptors

This page appears when you click the button that is displayed alongside any test that reports measurements for more than one descriptor. Using the Measures to compare list in this page, you can select the parameters to be compared across descriptors. If too many descriptors are available for comparison, you may also want to indicate which descriptors to compare, just as you chose the measures for comparison, so that root-cause analysis and identification become easier; the Search for Descriptors text box in this page serves that purpose. To compare the performance of only those descriptors that contain a particular string, specify that string in the Search for Descriptors text box. For instance, to compare the disk space usage of only those disk partitions containing the string 'data', specify data in the Search for Descriptors text box.
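The descriptor filtering described above amounts to a simple substring match. The following is a minimal illustrative sketch (the function name and sample partition names are hypothetical, not part of the product):

```python
# Hypothetical sketch of the Search for Descriptors behaviour:
# keep only those descriptors whose names contain the search string.
def filter_descriptors(descriptors, search_string):
    """Return the descriptors whose names contain search_string."""
    return [d for d in descriptors if search_string in d]

# Example: disk partitions, filtered on the string 'data'
partitions = ["/", "/data1", "/data2", "/logs"]
print(filter_descriptors(partitions, "data"))  # → ['/data1', '/data2']
```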
In some environments, multiple external agents can be configured to monitor the same component, so that more than one external perspective of its performance is available. For instance, you can configure external agents in multiple locations to execute the NetworkTest for a component, so that the health of the network connection to that component from each location is monitored. In this case, eG Enterprise treats every external agent executing the test as a 'measurement host' and reports a set of measures for every measurement host. Just as it allows comparison of performance across descriptors, eG Enterprise also allows you to compare performance across measurement hosts, enabling you to instantly identify "hot-spots" in the monitored environment. Take the NetworkInterfaceTest, for instance. If executed by multiple external agents, this test will report one set of measures for every network interface supported by a device, from the point of view of every measurement host executing the test. To compare performance across measurement hosts and across descriptors at the same time, simply click the button alongside the NetworkInterfaceTest listing. The same page appears, from which you can select not only the measures to be compared, but also the measurement hosts to be compared.
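Conceptually, per-measurement-host monitoring means one set of measures exists for every (measurement host, descriptor) pair, and comparing across hosts surfaces hot-spots. A minimal sketch of that idea, using entirely hypothetical host names, descriptor names, and measure values:

```python
# Hypothetical data shape: one set of measures per
# (measurement host, descriptor) pair. All names and values
# below are illustrative, not the product's actual schema.
measures = {
    ("agent-newyork", "eth0"): {"Bandwidth usage (%)": 42.5},
    ("agent-london",  "eth0"): {"Bandwidth usage (%)": 88.1},
}

# Comparing the same descriptor across measurement hosts
# reveals the hot-spot: the host reporting the worst value.
hotspot = max(measures.items(), key=lambda kv: kv[1]["Bandwidth usage (%)"])
print(hotspot[0])  # → ('agent-london', 'eth0')
```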
Finally, clicking the Show button in this page reveals the MEASURES COMPARISON page, which reports, in the form of a table, the values of the chosen measures for every descriptor and every measurement host that has been selected. This table also indicates the current state of each descriptor, along with pointers to the measures and measurement hosts responsible for abnormalities (if any). You can even view an instant graph of a measure across descriptors by clicking the GRAPH button that appears alongside the column header. This graph plots the values reported for that measure during the last 1 hour; the graphical representation enables you to time-correlate descriptor performance and easily identify which descriptor is misbehaving and when exactly the non-conformance began. You can also save the comparison table as a CSV file by clicking the CSV button at the top right corner of the page, or print the table by clicking the PRINT button. By default, the details displayed in the table are sorted in the ascending order of the values of the first measure being compared; this is indicated by the 'arrow' symbol adjacent to the first measure name. If need be, you can sort the comparison table in the descending order of the values of the same measure by clicking on the 'arrow' next to the measure name. Similarly, you can sort the table in the ascending/descending order of any of the other measures being compared by clicking on the corresponding measure name in the table; doing so shifts the 'arrow' mark to that measure.
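The sorting behaviour described above can be sketched as a stable sort on the selected measure's column, toggled between ascending and descending. The row contents and function name below are hypothetical examples, not the product's actual data:

```python
# Hypothetical sketch of the MEASURES COMPARISON table sorting:
# rows are ordered by the chosen measure, ascending by default,
# descending when the column's 'arrow' is clicked again.
rows = [
    {"descriptor": "/data1", "Used space (%)": 71.0},
    {"descriptor": "/data2", "Used space (%)": 35.5},
    {"descriptor": "/logs",  "Used space (%)": 90.2},
]

def sort_rows(rows, measure, descending=False):
    """Return rows sorted by the given measure's values."""
    return sorted(rows, key=lambda r: r[measure], reverse=descending)

asc = sort_rows(rows, "Used space (%)")
print([r["descriptor"] for r in asc])  # → ['/data2', '/data1', '/logs']
```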