AlertSite Dashboard is the default home page for logged-in users and is also accessible from the top menu.
The Dashboard shows the status of all your monitors and lets you create new ones. Here you can see whether monitors have found any problems on your monitored websites over the selected period (12 hours, 24 hours, 7 days or 30 days). If a monitor finds an error, its tile moves to the top of the dashboard and appears with a red icon.
The dashboard automatically refreshes for the latest status updates. You can pause and resume auto-refresh by clicking the refresh timer.
Click the big + sign, then click New Monitor to create a new monitor, or New Group to combine existing monitors into a monitor group to track their performance as a whole. For more information, see:
Note: You must be an Admin, Co-Admin or Power User to create monitors.
The dashboard shows various metrics for individual monitors and monitor groups. Groups show aggregate metrics for all monitors in the group.
The main metrics are:
Performance – your website's response time. For multi-step monitors, this is the total response time of all test steps. Note that only successful runs (status 0) are included in response time calculations.
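The "successful runs only" rule above matters when reading the Performance metric: slow or timed-out failures do not inflate the average. A minimal sketch of that filtering, using a hypothetical list of (status, response time) records (the data shape is illustrative, not AlertSite's actual export format):

```python
# Hypothetical run records: (status, response_time_seconds).
# Status 0 means a successful run; any other status is an error.
runs = [
    (0, 1.24),
    (0, 1.31),
    (1, 7.80),   # failed run: excluded from the response time metric
    (0, 1.27),
]

# Only successful runs (status 0) count toward the Performance metric.
successful = [t for status, t in runs if status == 0]
avg_response_time = sum(successful) / len(successful)
print(round(avg_response_time, 2))  # average over the three successful runs
```

Note that including the failed 7.80-second run would nearly double the average, which is why excluding non-zero statuses gives a truer picture of normal response times.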
Click a tile (in the tile view) or hover over a row (in the list view) to display a toolbar with monitor-specific or group-specific actions. The tile toolbar can be expanded by clicking the ellipsis button.
The toolbar actions are:

- View monitor summary or group summary.
- Run a test on demand from the monitor's primary location. Results are displayed on the screen.
- View monitor status by locations and steps, or compare all monitors in the group. See Comparing Monitors, Steps, and Locations.
- Edit monitor configuration or group configuration.
- Open the diagnostic viewer to drill into charts and run results without leaving the dashboard.
- Add or edit monitor notes.
Use the items on the Dashboard toolbar to change the way the data is displayed.
Filter the dashboard to only show the monitors and monitor groups you are interested in. Optionally, save your filter as a view for later use. Saved filters appear under the Select a view menu on the left of the toolbar.
Click Compare on the dashboard toolbar to enter comparison mode. Here, you can compare multiple monitors using a grid view or chart view. For details, see the following topics:
You can drill into monitor run results without leaving the dashboard. To do that, click a time segment on a chart, or click the diagnostic viewer button on the monitor toolbar. This opens the diagnostic viewer with the run results for that period (hour or day). Here, you can do the following:
By default, the viewer shows the most recent failed run in the selected time segment, or the most recent run if there are no failures. You can change the displayed run by selecting another run on the scatter plot.
The diagnostic viewer contains two parts:

Scatter plot (on the left) – The individual runs in the selected time segment. Select a run to display its results.

Run results (on the right) – The step-level results, response times broken down into color-coded components, waterfall charts, and other details. This is the same data you can see on the Monitor Runs dashboard, excluding the event-level details. For the event details, navigate to the Monitor Runs dashboard.
The diagnostic viewer thus lets you quickly get the information needed to troubleshoot "red" statuses or high response times without navigating away from the dashboard.
You can switch from the tile view to the list view by clicking the list button on the Dashboard toolbar. In the list view, you can compare the monitor metrics side by side.
The list view contains the following columns:
|Monitor Name||The name of the monitor or group.|
|Performance||A chart that shows the response time (for monitors) or Apdex (for groups) over the selected period (12 hours, 24 hours, 7 days or 30 days).|
|Availability||A chart that shows the run status (Success / Error) of the monitor or all monitors in a group throughout the selected period (12 hours, 24 hours, 7 days or 30 days).|
|Status||For monitors – the status of the latest run (Success / Error). For groups – the current Apdex rating: Excellent, Good, Fair, Poor, Bad.|
|Monitor Type||The monitor type.|
|Last Response Time||The response time during the latest test (for monitors only).|
|Alerting Enabled||The bell icon means the monitor has availability alerts enabled. No icon means availability alerts are disabled for this monitor.|
|Last Run Time||The date and time of the latest run of the monitor or the group’s monitors.|
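The group ratings in the Status column (Excellent, Good, Fair, Poor, Bad) come from Apdex, a standard user-satisfaction metric: runs at or below a target response time T count as satisfied, runs up to 4T count half as tolerating, and slower runs count as zero. A minimal sketch of the calculation; the rating thresholds below are illustrative assumptions, not AlertSite's documented boundaries:

```python
def apdex(response_times, t):
    """Standard Apdex score: satisfied runs (<= t) count fully,
    tolerating runs (<= 4t) count half, frustrated runs count zero."""
    satisfied = sum(1 for r in response_times if r <= t)
    tolerating = sum(1 for r in response_times if t < r <= 4 * t)
    return (satisfied + tolerating / 2) / len(response_times)

def rating(score):
    # Illustrative thresholds; AlertSite's exact boundaries may differ.
    if score >= 0.94:
        return "Excellent"
    if score >= 0.85:
        return "Good"
    if score >= 0.70:
        return "Fair"
    if score >= 0.50:
        return "Poor"
    return "Bad"

# Example: response times (seconds) across a group's monitors,
# scored against a target threshold T of 2 seconds.
times = [0.8, 1.1, 2.5, 9.0]
score = apdex(times, t=2.0)
print(score, rating(score))  # 0.625 Poor
```

Here two runs are satisfied, one is tolerating (2.5 s is under 4T = 8 s), and one is frustrated (9.0 s), giving (2 + 0.5) / 4 = 0.625, a Poor rating.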