Operations Management Suite (OMS): Extending OMS Using Log Search

Introduction

OMS Log Analytics offers many capabilities built on top of its rich log search functionality, and these extensions have become an essential part of OMS. When working with OMS, you will regularly come across situations where mastering these features pays off.

In this chapter, we will cover the following topics within OMS Log Analytics:

  • Saved Searches
  • OMS Computer Groups
  • Custom Fields
  • Custom Logs
  • Power BI
  • Solution Targeting

Note: OMS Alerting is also a feature that utilizes log search capability. This feature will be discussed in its own dedicated chapter "Alert Management", later in this book.

Saved Searches

As we discussed in Chapter 2, OMS search queries can become very complex. Rather than manually typing these queries every time you want to search data in OMS, you can save them as favorites so that you and your colleagues can reuse them later.

Accessing Existing Saved Search Queries

In the OMS Log Search page shown in Figure 1, you will see a "Favorite" icon. Clicking this icon displays the list of existing saved searches. Even before you create your first saved search, this list is already populated, because most OMS solutions ship with saved searches that are added to your workspace when you enable the solutions.

FIGURE 1. ACCESSING OMS SAVED SEARCHES

Using a saved search is very easy: all you need to do is click on it. As shown in Figure 2, when you click on a saved search, OMS automatically performs the search based on the query behind the saved search.

FIGURE 2. INVOKING A SAVED SEARCH

Creating Saved Searches

You may have developed some complex search queries that you would like to reuse in the future. For example, the query shown below displays the memory "% Committed Bytes In Use" counter for Windows computers. It displays the 15-minute aggregated counter value in a line chart, as shown in Figure 3.

Type=Perf ObjectName=Memory CounterName="% Committed Bytes In Use" | measure avg(CounterValue) As AvgCounterValue by CounterName,Computer,CounterPath INTERVAL 15MINUTES | Display LineChart

You can save this search query as a saved search so you and everyone who has access to your OMS workspace can reuse it in the future. You can use the following steps to create the saved search:

  1. Perform a log search in OMS portal using the search query that you'd like to save.
  2. Once you are satisfied with the search result, click on the Save icon.
  3. Enter the name and category and make sure the "Save this query as a computer group" option is set to the default value of "No", then click Save to create the saved search.

FIGURE 3. CREATING A SAVED SEARCH

The Category field in the Save Search area shown in Figure 3 supports auto-completion. This means that as you start typing the category name, existing categories that match what you type will appear on the list, and you may choose one from the list.

Once the saved search is created, you will find it in the Saved Searches list (shown in Figure 4) and it can be used by everyone who has access to the OMS workspace.

FIGURE 4. LOCATING THE SAVED SEARCH YOU JUST CREATED

Managing Saved Searches using PowerShell

The OMS PowerShell module AzureRM.OperationalInsights provides cmdlets to manage CRUD (Create, Read, Update, Delete) operations for saved searches. These cmdlets are:

  • Create new Saved Search: New-AzureRmOperationalInsightsSavedSearch
  • Get Saved Search Properties: Get-AzureRmOperationalInsightsSavedSearch
  • Get Saved Search result: Get-AzureRmOperationalInsightsSavedSearchResults
  • Remove Saved Search: Remove-AzureRmOperationalInsightsSavedSearch
  • Update existing Saved Search: Set-AzureRmOperationalInsightsSavedSearch

To execute these cmdlets, you first need to sign in to Azure and select the subscription in which the OMS workspace is located. The script example below demonstrates how to use these cmdlets to manage your OMS saved searches.

Note: You can download this script from the book's GitHub repository at https://Github.com/insidemscloud/OMSBookV2 in the \Chapter 6 directory; the file name is New-OMSSavedSearch.ps1. The required modules are listed on line 1 of the script. You can get these modules from the PowerShell Gallery: https://www.powershellgallery.com/
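For reference, here is a minimal sketch of how these cmdlets can be used to create and read a saved search. The resource group, workspace, and query values below are illustrative placeholders (not values from the book's script), so replace them with your own:

# Sign in to Azure and select the subscription that hosts the OMS workspace
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId '<your-subscription-id>'

# Placeholder names - replace with your own resource group and workspace
$resourceGroup = 'oms-rg'
$workspaceName = 'oms-workspace'

# Create a saved search, using the "Category|DisplayName" convention for the Id
New-AzureRmOperationalInsightsSavedSearch `
  -ResourceGroupName $resourceGroup `
  -WorkspaceName $workspaceName `
  -SavedSearchId 'Memory Counters|Pct Committed Bytes In Use' `
  -DisplayName 'Pct Committed Bytes In Use' `
  -Category 'Memory Counters' `
  -Query 'Type=Perf ObjectName=Memory CounterName="% Committed Bytes In Use" | measure avg(CounterValue) As AvgCounterValue by CounterName,Computer,CounterPath INTERVAL 15MINUTES' `
  -Version 1

# List all saved searches in the workspace
Get-AzureRmOperationalInsightsSavedSearch -ResourceGroupName $resourceGroup -WorkspaceName $workspaceName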

There are a few things you need to keep in mind when using these cmdlets:

  1. Saved search names

The saved searches that ship with solutions typically follow the naming standard "Category|DisplayName". When you create saved searches from the OMS portal, a random GUID is used for the actual name (Id) of the saved search. When you create saved searches in PowerShell, it's best to follow one of these two conventions.

  2. Using the Set-AzureRmOperationalInsightsSavedSearch cmdlet

When using the Set-AzureRmOperationalInsightsSavedSearch cmdlet to update a saved search, make sure you ALWAYS include the ETag parameter ("-ETag *"). The cmdlet will fail if this parameter is not included (see the sketch after this list).

You also need to make sure the value you pass to the SavedSearchId parameter matches the Id (name) of the original query. If they do not match, a duplicate saved search will be created.

Based on the authors' testing, the version number value cannot be changed.
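To illustrate the points above, a minimal update call (using the same placeholder resource group and workspace variables from the earlier sketch) might look like the following; note the -ETag parameter and the SavedSearchId that matches the original saved search:

# Update the query behind the existing saved search.
# SavedSearchId must match the Id (name) of the original saved search,
# otherwise a duplicate saved search will be created.
Set-AzureRmOperationalInsightsSavedSearch `
  -ResourceGroupName $resourceGroup `
  -WorkspaceName $workspaceName `
  -SavedSearchId 'Memory Counters|Pct Committed Bytes In Use' `
  -DisplayName 'Pct Committed Bytes In Use' `
  -Category 'Memory Counters' `
  -Query 'Type=Perf ObjectName=Memory CounterName="% Committed Bytes In Use" | measure avg(CounterValue) As AvgCounterValue by Computer INTERVAL 30MINUTES' `
  -Version 1 `
  -ETag '*'    # always include -ETag ('*' accepts any current version); the cmdlet fails without it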

OMS Computer Groups

OMS supports computer groups in the following scenarios:

  • Scoping search queries
  • Targeting update deployments

Groups can either be created manually within OMS using search queries or imported from any of the following systems:

  • Active Directory
  • Windows Server Update Services (WSUS)
  • System Center Configuration Manager (ConfigMgr)

Note: In OMS, not all data types are billable. Some data types are not counted towards your data usage. Computer Groups data are free of charge. You can use the following query to get a list of nonbillable data types: Type=Usage IsBillable=false | measure count() by DataType

Creating OMS Groups using Search Queries

You can use any search query that returns a list of computers, such as a list of computers with names starting with "OM":

Computer=OM* | Distinct Computer

Or a list of computers that have any SQL performance counters collected:

Type=Perf ObjectName=SQL* | measure count() by Computer

You can use the following steps to create a group:

  1. Perform a log search in OMS portal using the search query that returns a list of computers you want to save.
  2. Once you are satisfied with the search result, click on the Save icon.
  3. As shown in Figure 5, enter the name and category and make sure "Save this query as a computer group" option is set to "Yes", then click Save to create the group.

FIGURE 5. CREATING COMPUTER GROUPS

Once the group is created, you can find the group by going to the Settings page, then Computer Groups > Saved Groups. In Figure 6, you can see the SQL Servers group has been added to the Saved Groups view.

FIGURE 6. LOCATING EXISTING COMPUTER GROUPS

You can click on the magnifying glass icon next to the group name to invoke the search and view the group membership. Clicking the 'X' icon will delete the group.

Importing Groups from Active Directory

If any of your Active Directory domain controllers are managed by your OMS workspace (either via a direct agent or enrolled via SCOM management group), you can configure OMS to import Active Directory groups.

To enable the Active Directory group import, go to the Settings page, then browse to Computer Groups > Active Directory and select Import Active Directory group memberships from computers as shown in Figure 7. Click Save if prompted to confirm the setting.

FIGURE 7. ENABLING ACTIVE DIRECTORY GROUP IMPORT

Once enabled, the import rule will run on an interval. Any domain-joined computer connected to Operations Management Suite which also belongs to an Active Directory security group will have its group membership information imported and available for use in your workspace. As shown in Figure 8, you can verify the AD group information has been imported by using the following log search query:

Type=ComputerGroup GroupSource=ActiveDirectory

FIGURE 8. ACCESS RAW ACTIVE DIRECTORY GROUP DATA

Importing Groups from WSUS Servers

If your WSUS servers are managed by OMS either via the direct agent or via SCOM management group, you can import WSUS groups into OMS.

To enable WSUS group import, as shown in Figure 9, you need to go to the Settings page, then browse to Computer Groups > WSUS and select Import WSUS group memberships. Click Save if prompted to confirm the setting.

FIGURE 9. ENABLING WSUS GROUP IMPORT

Once enabled, if client-side targeting is configured on your WSUS server, any computers connected to OMS which also belong to a WSUS group will have their WSUS group membership imported and available to use in OMS. To verify the WSUS group information has been imported, you can use OMS search query:

Type=ComputerGroup GroupSource=WSUS

Importing groups from ConfigMgr

If you are running System Center Configuration Manager (ConfigMgr or SCCM), you will have the option to import ConfigMgr collections into OMS as computer groups.

To enable this feature, the following requirements must be met:

  • Your ConfigMgr site must be on version 1606 or later.
  • Your ConfigMgr site must have a Service Connection Point configured for online mode.
  • The computer running the ConfigMgr Service Connection Point must be managed by OMS either via a direct agent or enrolled into OMS via SCOM.
  • You must connect OMS to the top of your ConfigMgr hierarchy. You can make the connection either from your Standalone primary site, or if you are using a Central Admin Site (CAS), you must connect OMS to the CAS.
  • You need to have administrative rights to the Azure Active Directory for the Azure subscription that your OMS workspace is using. This is because you will need to create a Service Principal credential in Azure AD for the ConfigMgr connector.

If your environment meets the above-mentioned pre-requisites, you can then create the connector in the ConfigMgr console.

Creating an Azure Service Principal for the ConfigMgr Connector

Before you create the connection, you will first need to create an Azure AD Application and an Azure Service Principal for the connection.

If you prefer to create the Azure Service Principal manually from the Azure portal, you can follow the instructions on the Azure documentation site: https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-create-service-principal-portal

To simplify the process, the authors have developed a PowerShell script for creating the required Azure AD application and service principal for the ConfigMgr Connector. This script can be downloaded from the book's GitHub repository at https://Github.com/insidemscloud/OMSBookV2 in the \Chapter 6 directory; the file name is New-SCCM-OMS-App.ps1.

You must run this script interactively because it prompts you with several menus that require your input. To run this script, you must have administrative rights to your Azure subscription and Azure Active Directory. You will first be prompted with an Azure AD login page, where you will need to enter your admin credentials. The script will then prompt you to select the Azure subscription that hosts your OMS workspace, followed by the OMS workspace that you wish to connect your ConfigMgr site to. You will also be asked to enter a name for the Azure AD application it is about to create if you do not wish to use the default name.

Once all the information is gathered, the script will create the Azure AD application, the Azure AD Service Principal for the application, and assign the Contributor role to the OMS workspace's resource group. The information you need to create the ConfigMgr connector is displayed in the PowerShell window shown in Figure 10.

FIGURE 10. CREATING AAD SERVICE PRINCIPAL FOR CONFIGMGR CONNECTOR USING POWERSHELL SCRIPT
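If you would like to understand what the script does under the hood, the sketch below shows the typical sequence using standard AzureRM cmdlets. It is not a copy of New-SCCM-OMS-App.ps1; the application name, secret, and resource group are placeholders, and parameter types (for example, whether -Password expects a plain string or a SecureString) vary between AzureRM module versions:

# Sign in and select the subscription that hosts the OMS workspace
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId '<your-subscription-id>'

# Placeholder values - replace with your own
$appName       = 'ConfigMgr-OMS-Connector'
$appSecret     = '<a-strong-generated-secret>'
$resourceGroup = 'oms-rg'   # resource group containing the OMS workspace

# Create the Azure AD application and a service principal for it
$app = New-AzureRmADApplication -DisplayName $appName `
         -HomePage "https://$appName" `
         -IdentifierUris "https://$appName" `
         -Password $appSecret
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

# Allow the new service principal a moment to replicate before assigning the role
Start-Sleep -Seconds 20

# Grant the service principal Contributor rights on the workspace's resource group
New-AzureRmRoleAssignment -RoleDefinitionName 'Contributor' `
         -ServicePrincipalName $app.ApplicationId `
         -ResourceGroupName $resourceGroup

# The Client ID for the ConfigMgr connector is the ApplicationId below;
# the Client secret key is the password you supplied, and the Tenant is your
# Azure AD tenant (shown in the Login-AzureRmAccount output).
$app.ApplicationId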

Creating the ConfigMgr OMS Connector

Once the Azure AD Service Principal is created and configured, we can then create the ConfigMgr OMS connector from the ConfigMgr console following the steps below:

  1. In the ConfigMgr console, go to Administration pane and then click OMS Connector under the Cloud Services folder as shown in Figure 11.

FIGURE 11. LOCATING THE OMS CONNECTOR IN CONFIGMGR CONSOLE

  2. Now click the Create connection to Operations Management Suite button from the top ribbon.
  3. In the "Connection to Operations Management Suite Wizard", on the General tab, click Next.
  4. At the Azure Active Directory tab, enter the Tenant, Client ID and Client secret key for the Service Principal you created in the previous section. If you created the Service Principal using the New-SCCM-OMS-App.ps1 script provided with this book, the information you need for this step is displayed as the output of the script (as shown earlier in Figure 10).
  5. Now click the Verify button and you should get a Successfully validated message similar to our example in Figure 12. Click Next to move on when the information has been verified.

FIGURE 12. ENTERING AAD SERVICE PRINCIPAL DETAILS FOR CONFIGMGR OMS CONNECTOR

  6. In the OMS Connection tab shown in Figure 13, make sure the correct Azure subscription, resource group, and OMS workspace are selected from the dropdown menus, then click the Add button to add your ConfigMgr collections.

FIGURE 13. CONFIGURE OMS CONNECTION FOR THE CONFIGMGR OMS CONNECTOR

  7. At the Select Collections window shown in Figure 14, select all the ConfigMgr collections that you would like to import into OMS, then click OK to confirm.

FIGURE 14. SELECT COLLECTIONS FOR THE CONFIGMGR OMS CONNECTOR

Note: Since you are only importing the computers into OMS, you can only select device collections. User collections are not available.

  1. At the "Connection to OMS Management Suite Wizard" window, in the OMS connection tab, click Next, and then click Next again on the Summary tab. Once the connector is created, at the Completion tab, click Close.

Similar to Figure 15, you should see the new connector in the ConfigMgr console and after a short wait, the data will begin to appear in your OMS workspace.

FIGURE 15. OMS CONNECTOR IN CONFIGMGR CONSOLE

You can verify the ConfigMgr collection data is imported by searching OMS using the search query Type=ComputerGroup GroupSource=SCCM. If you have multiple ConfigMgr sites importing into your OMS workspace, as shown in Figure 16, you can also specify the ConfigMgr site code in the search query, i.e. Type=ComputerGroup GroupSource=SCCM GroupSourceName=<Your ConfigMgr Site Code>

FIGURE 16. SEARCHING CONFIGMGR COLLECTION DATA IN OMS

Note: ConfigMgr collection data is imported into OMS on a schedule. Currently, it is scheduled to run once every hour. Therefore, any collection membership updates will not immediately be reflected in OMS.

Using OMS Computer Groups in Search Queries

OMS Computer Groups can be used to scope search results. When you include a group in a log search, the search results are limited to records that match the computers in the group.

Computer groups can be used with the IN clause in the log search query. For example, using the "SQL Servers" group that we created from a saved search in the "Creating OMS Groups using Search Queries" section earlier, we can retrieve all critical events from the System event log on all computers that are members of that group using the following query:

Type=Event EventLog=System EventLevelName=error Computer IN $ComputerGroups[SQL Servers]

Since it is possible to create computer groups with the same name but stored under different categories, you can also optionally specify the category name in the search query. For example, the "SQL Servers" group is saved under the category "Computer Groups". You can specify the category name in the following syntax:

$ComputerGroups[Category: Name]

As shown in Figure 17, the complete search query that includes the group category name would be:

Type=Event EventLog=System EventLevelName=error Computer IN $ComputerGroups[Computer Groups: SQL Servers]

FIGURE 17. PERFORMING OMS LOG SEARCH USING COMPUTER GROUP

Note: If there are two or more computer groups that share the same name and are saved in different categories, the category name is required in the search query.

Using OMS Computer Groups in Update Deployments

In addition to using OMS computer groups in log search, you can also use computer groups when creating update deployments.

When creating a new update deployment run in the OMS Update Management solution, you have the option to target individual computers as well as groups.

In the New Update Deployment page, when you start typing the name in the COMPUTERS field, all the computers and groups with a name matching what you have typed will be displayed in the list for you to select. For computer groups, the group source system (Type) is also displayed next to the computer group name as shown in Figure 18.

FIGURE 18. USING COMPUTER GROUPS IN UPDATE DEPLOYMENTS

Note: Update Deployment is discussed in detail in Chapter 4.

Custom Fields

Every log entry that has been injected into OMS belongs to a data type. Each data type has its own data schema. The schema defines the fields that the data type is allowed to have along with the data format that each field should contain (i.e. string, date-time, integer, Boolean, etc.). You may also use OMS to collect unstructured data using the Custom Logs feature (discussed later in this chapter). Sometimes, when you search for log entries in OMS, you may find the specific data you are looking for is not structured in a way that is easily searchable. It may be just a phrase embedded in a string.

For example, when a service is stopped or started on a Windows computer, an event with Event ID 7036 is generated in the System event log. The description of the event log entry contains a sentence like "The <service name> entered the <running/stopped> state".

Examples of these events are shown in Figure 19 and Figure 20.

FIGURE 19. WINDOWS SERVICE RUNNING EVENT (EVENT ID 7036)

FIGURE 20. WINDOWS SERVICE STOPPED EVENT (EVENT ID 7036)

In Windows Event Logs, these event log entries are structured, and the two key pieces of information (Service Name and Service State) are stored as two separate parameters. You can check this if you switch to the XML view, as shown in Figure 21.

FIGURE 21. EVENT ID 7036 XML VIEW

However, when these log entries are collected by OMS, the event log entry in OMS does not contain separate fields for Service Name and Service State. Instead, the raw XML segment is injected into a single field as shown in Figure 22.

FIGURE 22. EVENT ID 7036 IN OMS

In this figure, we can clearly see that in OMS, the Service Name and Service State are not stored in dedicated fields, therefore making it hard to search for specific service start or stop events.

Luckily, OMS is capable of extracting data from existing log entries and creating new searchable fields to store extracted data. This feature is called Custom Fields. The Custom Fields feature uses a technology called FlashExtract to extract fields and it trains OMS to recognize these fields.

Note: Interested in learning more about the FlashExtract framework for data extraction? Read more about the technology on the Microsoft Research web page: http://research.microsoft.com/en-us/um/people/sumitg/pubs/pldi14-flashextract.pdf

Using the Custom Fields feature, it is possible to extract the service name and service state from the log entry once it is injected into your OMS workspace. Figure 23 shows an example of what this process would look like.

FIGURE 23. OMS CUSTOM FIELDS

To create the custom fields in OMS, follow these steps:

  1. In the OMS portal, use an appropriate search query to locate the log entry that you wish to create the custom fields from (we will use Type=Event EventLog=System EventID=7036 as an example and if you want to use this query too, make sure to enable OMS collection of Information type events from the Windows System event log).
  1. Click on the "…" icon in front of any of the text-based fields, and select the Extract fields from option as shown in Figure 24.

FIGURE 24. EXTRACTING FIELDS FOR OMS CUSTOM FIELDS

  3. Highlight the exact phrase (in this case, the Windows service name) that you want to capture for the custom field. Once it is highlighted, the Field Extraction Wizard is launched in a pop-up window, and you will need to enter a name for the custom field and choose a field data type from the drop-down list.
  4. To make the future data extraction for the custom field more accurate, you may choose additional existing fields as filters. In Figure 25 you can see that, in addition to the pre-selected fields, we have also selected the SourceSystem, Source and EventLevelName fields as filters.

FIGURE 25. CREATING CUSTOM FIELD

  5. Click the Extract button from the pop-up window.

Note: Based on the author's experience, the pop-up window for the field definition does not appear when using the Firefox browser. If this is the case, try another browser (i.e. Internet Explorer).

  6. OMS will perform a search based on the custom field that you have created and it will display the result in the Search Results section. If you are happy with the result, click the Save extraction button located in the Summary section at the bottom right corner of the page as shown in Figure 26.

FIGURE 26. SAVING CUSTOM FIELD EXTRACTION

  7. Repeat steps 3-6 to create another custom field for the Windows service state.

Note: By design, the name of the custom field you have created will have the suffix "_CF" appended to the name you have entered. "CF" indicates it is a Custom Field.

Once the custom field is created, it will be applied to any new records injected into OMS. It will NOT affect the existing records. In Figure 27 you can see two new custom fields referencing the state (WindowsServiceState_CF) and name (WindowsServiceName_CF) from our Event ID 7036 query.

FIGURE 27. CUSTOM FIELDS IN NEWLY INJECTED LOG RECORDS

Now that the required custom fields are created, it becomes very easy to search for specific service stop and start events using log search. For example, you can use the following query to detect the Remote Registry service stopped event based on the two custom fields we just created (Figure 28 shows the output of this query):

WindowsServiceName_CF="Remote Registry" WindowsServiceState_CF=stopped

FIGURE 28. LOG SEARCH USING CUSTOM FIELDS

You can access the list of existing Custom Fields in your OMS workspace by going to the Settings page, then to Data > Custom Fields. As shown in Figure 29, you can choose either to run a search on a custom field directly or to remove it from OMS.

FIGURE 29. MANAGING EXISTING CUSTOM FIELDS

Alternatively, you can also remove the custom fields from the log search result by selecting the custom field that you wish to delete, and selecting the Remove custom field option as shown in Figure 30.

FIGURE 30. REMOVING CUSTOM FIELD FROM LOG SEARCH RESULT

Custom Logs

In addition to having the ability to import Windows Event Logs and Linux Syslog into OMS, you can also configure OMS to collect data from text-based log files on both Windows and Linux agents. This feature is called Custom Logs.

If the application that you wish to collect logs from stores its logs in text file format (as opposed to using Windows Event Log or Syslog), you may use the Custom Logs feature to capture the log data for the application.

Note: At the time of writing this book, the OMS Custom Logs feature is still in Public Preview. This feature may be different than what we have discussed in this chapter when it becomes generally available.

The log file must meet the following requirements to be collected by OMS:

  • The log must either have a single entry per line or use a timestamp matching one of the following formats at the start of each entry.
    • YYYY-MM-DD HH:MM:SS
    • M/D/YYYY HH:MM:SS AM/PM
    • Mon DD,YYYY HH:MM:SS
  • The log file must not allow circular updates where the file is overwritten with new entries.
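For example, the following lines from a hypothetical application log would satisfy the first timestamp format (YYYY-MM-DD HH:MM:SS), allowing OMS to treat each timestamped line as the start of a new record:

2017-05-01 13:45:21 INFO  Application service started
2017-05-01 13:46:02 WARN  Connection retry attempt 1
2017-05-01 13:46:35 ERROR Failed to connect to database DB01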

Since the Custom Logs feature is still in preview, you will need to manually enable it by browsing to the Settings > Preview Features area of the OMS console, and then clicking the Enable button beside the Custom Logs entry as shown in Figure 31.

FIGURE 31. ENABLING CUSTOM LOGS PREVIEW FEATURE

Once the Custom Logs feature is enabled, you can follow the steps below to create a custom logs collection:

  1. Browse to the Settings page and then go to Data > Custom Logs. From here, click the Add+ button shown in Figure 32 to launch the custom log wizard.

FIGURE 32. ADDING CUSTOM LOGS COLLECTIONS

Note: By default, all configuration changes are automatically pushed to all agents. For Linux agents, a configuration file is sent to the Fluentd data collector. If you wish to modify this file manually on each Linux agent, then uncheck the Apply below configuration to my Linux machines box.

  2. From the first page of the custom log wizard shown in Figure 33, specify a sample log file and click Next to continue.

FIGURE 33. SPECIFYING A SAMPLE LOGFILE

  3. At the next screen, select the record delimiter, either by new lines or by timestamp. In Figure 34, you can see how the entries in your log file are being broken into separate records based on your selection. Click Next to move on.

FIGURE 34. SELECT THE RECORD DELIMITER

  4. From the Add log collection paths screen shown in Figure 35, add one or more paths of where the logs are located on your OMS agent computers (you can choose between Windows and Linux here). Click Next when you're ready to continue.

FIGURE 35. SPECIFY CUSTOM LOG PATH

  5. Specify the log type and description. As shown in Figure 36, the log type name is given a suffix of "_CL".

FIGURE 36. SPECIFY CUSTOM LOG TYPE AND DESCRIPTION

Once the log collection is created, you can also edit it as shown in Figure 37.

FIGURE 37. EDITING CUSTOM LOGS COLLECTION

After a few minutes, you should be able to search for your custom log records in OMS similar to our search shown in Figure 38.

FIGURE 38. SEARCHING CUSTOM LOG

At this stage, only the raw data is available in OMS. You can now extract fields from the raw data using the Custom Fields feature.

Note: Refer to the Custom Fields section in this chapter for more information on how to extract fields from the raw data in the custom logs.

Figure 39 shows an example of how we have created some custom fields using the raw data from our custom logs.

FIGURE 39. EXTRACTING CUSTOM FIELDS FROM CUSTOM LOGS

Coupled with the Custom Fields feature, the OMS Custom Logs feature provides a useful method of collecting data from text-based log files. This fills the gap for applications that do not log structured log events to the native event management framework provided by the operating system (i.e. Windows Event Log and Syslog). OMS also provides another API based log injection feature. This API is called the OMS HTTP Data Collector API and we discuss this feature in greater detail in the "Custom Solutions" chapter later in this book.

Power BI Integration

Based on what you have learned so far, you will probably agree that OMS is great at collecting and analyzing data, but the native dashboards lack rich visualization and reporting capabilities. To fill this gap, OMS offers a feature that allows you to push user-defined datasets to Power BI so you can utilize its rich reporting and dashboarding capabilities.

Power BI is a cloud-based service, hosted in Azure, where you can create reports and dashboards, share reports, and connect directly to various data sources. When working with Power BI, there are three primary components: dashboards, reports, and datasets. A dashboard is a single pane of glass for all important and relevant data: a central view that delivers the information you need at a glance. Each dashboard comprises several tiles, and these tiles are based on reports which, in turn, glean their information from datasets.

In our scenario, we will use OMS to push datasets to Power BI.

Why should you consider using OMS and Power BI together? The Power BI integration addresses the following issues and limitations:

  1. OMS dashboards lack rich visuals.
  2. OMS does not have built-in reporting capabilities.
  3. Currently, there is no Role-Based Access Control (RBAC) in OMS. You cannot limit users' access to a subset of data.
  4. Searching data in OMS may be difficult for some users; you need to learn and master the OMS search syntax.
  5. Data retention is limited in OMS.

The process of connecting OMS and Power BI is divided into four major steps:

  1. Connect OMS to a Power BI account.
  2. Create Power BI injection rules in OMS.
  3. Build a report in Power BI.
  4. Publish the report to a dashboard.

Once you have built Power BI dashboards for your OMS data, you also have the option to embed Power BI reports into other 3rd party applications that support IFrame (i.e. SharePoint). This feature offers great flexibility to share OMS data with people who do not have access to your OMS workspace and the Power BI assets you have just created.

Note: At the time of writing this book, the OMS Power BI feature is still in Public Preview. This feature may be different than what we have discussed in this chapter when it becomes generally available.

Power BI Overview

Power BI is the successor of Power Pivot, Power Query and Power View, which were add-ins for Microsoft Excel. Power BI became Generally Available (GA) in July 2015 as part of the Office 365 product family. Soon after the release, Gartner identified Power BI as a market leader in BI and Analytics Platforms: https://www.gartner.com/doc/reprints?ct=160204&id=1-2XXET8P.

Power BI consists of the following components from the user's perspective:

  • Power BI (Online)
  • Power BI Desktop for Windows
  • Power BI Mobile App (UWP, Android, and iOS)
  • Microsoft On-Premises Data Gateway

In addition to the cloud-based online version of Power BI, you can use a desktop version called Power BI Desktop for Windows. Although the user interfaces of Power BI and Power BI Desktop are very similar, the usage scenarios are fundamentally different. Once reports and dashboards are published in Power BI online, users also have the option to access them using the mobile apps.

Note: The OMS Power BI integration is only available for the online version of Power BI. Power BI Desktop for Windows cannot be used in this scenario.

Users have the option to connect Power BI Online to on-premises data sources. This is achieved using the Microsoft On-Premises Data Gateway, a component that you install on a server located in your on-premises network; it allows Microsoft's SaaS applications such as Power BI, Azure Logic Apps, Microsoft Flow and Power Apps to access on-premises data sources.

Power BI comes with two pricing tiers: Free and paid (called "Power BI Pro"). The Power BI Pro license can be purchased within your Office 365 subscription and it is also included in Office 365 E5 licenses.

Note: Power BI is a service not included in the OMS license as it comes with its own free plan and a paid (Pro) plan. However, for the OMS Power BI integration feature, a Power BI Pro account is strongly recommended. For more information about Power BI pricing, see this Microsoft web page - https://powerbi.microsoft.com/en-us/pricing

Power BI comes with a set of default visuals you can use to build your reports. In addition to the default visuals, you can also download custom visuals from the Power BI Custom Visuals gallery: https://app.powerbi.com/visuals/. Figure 40 shows an example of some of the options in the visuals library.

FIGURE 40. POWER BI CUSTOM VISUALS LIBRARY

If you are not satisfied with any of these visuals, then it's also possible to develop your own custom visual using NodeJS. Power BI offers a custom visual development tool for Power BI and to learn more about this tool, see the online documentation here - https://powerbi.microsoft.com/en-us/documentation/powerbi-custom-visuals-getting-started-with-developer-tools/

Configuring the OMS Power BI integration

Before OMS can start injecting data into Power BI, you need to configure the Power BI integration in the OMS portal. To configure the integration, you will first need a Power BI account.

The Power BI account that you are going to use should meet the following requirements:

  1. A Power BI Pro account is STRONGLY recommended

The data capacity and injection limitation for the Pro account is significantly higher than the Free account. If you are managing a very large environment with OMS, a free Power BI account will definitely not meet your requirements.

  2. Use a service account instead of a personal account

OMS injects data into the personal workspace that belongs to the Power BI account. If you have multiple people managing the OMS dataset in Power BI, you need to share the Power BI account credential with multiple people. Therefore, it's better to use a service account.

  3. The service account's password must be set to never expire

At the time of writing this book, it is not possible to update the Power BI account password within OMS. To ensure the Power BI injection rules continue to function, you need to make sure the password for the Power BI account never expires. You may use the script below to set the Power BI account's password to never expire within your Office 365 subscription.

Note: This script can be downloaded from the book's GitHub repository at https://Github.com/insidemscloud/OMSBookV2 in the \Chapter 6 directory; the file name is Set-PowerBIAccountPasswordPolicy.ps1. The required modules are listed on line 1 of the script. You can get these modules from the PowerShell Gallery: https://www.powershellgallery.com/
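As a reference, a minimal sketch of what such a script does is shown below. It assumes the MSOnline module is installed and uses a placeholder user principal name; refer to the downloadable script for the version used by the authors:

# Requires the MSOnline module and an Office 365 / Azure AD admin account
Connect-MsolService

# Placeholder account - replace with your Power BI service account UPN
$powerBIAccount = 'svc-omspowerbi@yourtenant.onmicrosoft.com'

# Set the password to never expire
Set-MsolUser -UserPrincipalName $powerBIAccount -PasswordNeverExpires $true

# Verify the setting
Get-MsolUser -UserPrincipalName $powerBIAccount |
  Select-Object UserPrincipalName, PasswordNeverExpires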

Once you have configured the Power BI account, you can configure the Power BI integration in OMS portal using the steps listed below:

  1. Browse to the Settings page, go to the Preview Features tab, and enable Power BI Integration as shown in Figure 41.

FIGURE 41. ENABLING POWER BI INTEGRATION PREVIEW FEATURE

  2. Browse to the Settings page and, as shown in Figure 42, go to Accounts > Workspace Information and click on Connect to Power BI Account.

FIGURE 42. CONNECTING POWER BI ACCOUNT

  1. Enter your Power BI account credentials in the pop-up window and click Sign In to confirm.

Once it is connected, the Power BI account username is displayed on the Workspace Information page, and you are now able to create Power BI injection rules in OMS.

Creating Power BI Injection Rules in OMS

Power BI injection rules are based on OMS search queries. In this example, we use the Agent Heartbeat data (Type=Heartbeat), which is generated by the Agent Health solution.

To create a Power BI injection rule for Agent heartbeat data, you can follow the steps listed below:

  • Go to the OMS log search page, and then perform log search using a query Type=Heartbeat.

Note: It is recommended to inject raw data into Power BI and perform data aggregation within Power BI. However, with some large volume data types such as Near Real-Time Performance data (Type=Perf) and Wire Data (Type=WireData), it is not possible to inject raw data into Power BI without running into problems. With large volume data types, you will have no choice but to inject aggregated data into Power BI. This will be discussed in more detail later in this chapter.

  • Once you are satisfied with the search result, click the Power BI icon from the ribbon at the top of the search page.
  • As shown in Figure 43, enter the following information for the Power BI injection rule and click Save:
    • Name: Power BI – Agent Heartbeat
    • Saved Search: Use current search query
    • Schedule: 15 Minutes
    • Dataset name: OMS – Agent Heartbeat

FIGURE 43. CREATING POWER BI INJECTION RULES

Once the Power BI injection rule has been created, you can verify that the OMS dataset with the name you specified in the injection rule appears in the Power BI "My Workspace" area by signing in at https://powerbi.microsoft.com/en-us/ using your Power BI service account. In Figure 44 you can see the OMS dataset has been added to our Power BI workspace.

FIGURE 44. POWER BI DATASET CREATED BY OMS

As shown in Figure 45, you can also find the Power BI injection rule in the OMS portal (Settings > Power BI).

FIGURE 45. OMS POWER BI INJECTION RULE

Power BI REST API Limitations

Before we start creating reports in Power BI, let us first discuss the limitations of the Power BI REST API, as these limitations can have a significant impact on your OMS data injection. You need to carefully consider how and what to inject into Power BI based on the size of your OMS workspace.

The Power BI REST API is a RESTful API that provides programmatic access to Power BI resources. OMS injects data into Power BI via the Power BI REST API.

The Power BI REST API has several limitations. At the time of writing this book, these limitations are:

To POST Rows:

  • 75 max columns
  • 75 max tables
  • 10,000 max rows per single POST rows request
  • 1,000,000 rows added per hour per dataset
  • 5 max pending POST rows requests per dataset
  • 120 POST rows requests per minute per dataset
  • If table has 250,000 or more rows, 120 POST rows requests per hour per dataset
  • 200,000 max rows stored per table in FIFO dataset
  • 5,000,000 max rows stored per table in 'none retention policy' dataset
  • 4,000 characters per value for a string column in a POST rows operation

POST Rows operations per Power BI plan:

  • Dataset created by user with free service plan: 10,000 rows added per hour per dataset
  • Dataset created by user with paid service plan: 1,000,000 rows added per hour per dataset
  • If a user exceeds this limit, the service will fail subsequent API calls with the following details:
    • HTTP Status Code: 429 Too Many Requests
    • Response header: Retry-After + {Number of seconds until their quota resets}
    • An OData error with the following message: "You are over your rows per hour limit for this dataset. To push more rows per hour, upgrade your account to Power BI Pro or retry your push later."

Note: Always refer to the Power BI documentation for the most up-to-date details on the Power BI REST API limitations: https://msdn.microsoft.com/en-us/library/dn950053.aspx

Several of these limitations can directly impact your OMS data injection. Therefore, when you are creating the Power BI injection rules in OMS, you need to make sure your planned data injection does not breach any of these thresholds:

  1. The search result returned from the OMS search query for the Power BI injection rule does not have more than 75 fields.

Most of the data types in OMS should not have more than 75 data fields. However, this also means that if your search result returns more than one data type (i.e. using a query such as "Type=Perf OR Type=Event OR Type=W3CIISLog", or simply using "*" as the query to return everything), you will hit this limitation. Therefore, make sure you do not mix multiple data types into the search query so the Results table in the Power BI dataset does not have more than 75 fields.

  2. Check the number of records returned from the search result within a time window equal to the Power BI injection rule frequency.

If you are planning to configure the Power BI injection rule to run every 15 minutes, before creating the rule you need to make sure the number of records returned from the search query within a 15-minute search scope does not exceed the maximum of 10,000 rows per single POST rows request. You can test run the query and set the scope to a 15-minute time window in the log search page, as shown in Figure 46, or check the count programmatically using the PowerShell sketch that follows Figure 46.

FIGURE 46. SCOPING SEARCH TIME RANGE
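If you prefer to check this programmatically rather than in the portal, the following sketch uses the Get-AzureRmOperationalInsightsSearchResults cmdlet to count the records returned for a 15-minute window. The query, resource group, and workspace names are placeholders, and the exact shape of the returned object may vary slightly between AzureRM module versions:

# Placeholder values - replace with your own
$resourceGroup = 'oms-rg'
$workspaceName = 'oms-workspace'
$query         = 'Type=Heartbeat'   # the query used by your Power BI injection rule

# Search the last 15 minutes and count the rows returned
$end    = (Get-Date).ToUniversalTime()
$start  = $end.AddMinutes(-15)
$result = Get-AzureRmOperationalInsightsSearchResults `
            -ResourceGroupName $resourceGroup `
            -WorkspaceName $workspaceName `
            -Query $query `
            -Start $start `
            -End $end `
            -Top 15000

# Note: the search API may cap the number of rows returned in a single call;
# if the count equals the cap, narrow the time window or use an aggregating
# (measure count()) query instead.
$rowCount = ($result.Value | Measure-Object).Count
"Rows returned for the last 15 minutes: $rowCount (limit per POST is 10,000)"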

  3. Check the number of records returned from the search result within a one-hour period.

Since each dataset can only take 1,000,000 (one million) rows per hour via the REST API if you have a Power BI Pro license, you need to make sure that, within a one-hour period, the number of records returned from your search result does not exceed this figure. If your Power BI account is on the free plan, the figure for this limitation is only 10,000 (ten thousand).

  4. Make sure none of the values of any fields exceed 4,000 (four thousand) characters.

You need to pay attention to this limitation if you are working with events or custom logs (custom logs contain the raw data as well as the custom fields you have defined). Similarly, the Windows Event data type contains the raw event data in XML, as well as the rendered description. There is probably no point injecting this type of raw data into Power BI if the information is already categorized in other fields. You can use the Select statement in your search query to select only the fields that you are interested in, i.e. for the Windows Event data, use a query such as:

Type=Event | select Computer,EventLog,EventId,EventLevelName,EventSource,EventCategory,RenderedDescription

Additionally, although it is not documented, based on the authors' testing, the dataset that OMS creates appears to use the 'none retention policy', because the test injection rule promptly stopped working once the number of records in the Results table of the dataset reached 5,000,000 (five million) rows. We will provide a number of workarounds for this limitation later in this chapter.

Creating Power BI Reports and Dashboards

Once you have created the Power BI injection rules and verified your data has been injected into the Power BI dataset, you now have the ability to create the Power BI report. In this section, we will create a report based on the OMS Agent Heartbeat data. We will create a high-level statistics report for OMS agents. This report is made up using both built-in visuals and custom visuals from the Power BI Visuals Gallery. This report also contains an animated visual.

When completed, the report will look similar to our example in Figure 47.

FIGURE 47. OMS AGENT STATISTICS REPORT

This report contains the following parts:

  • A heat map based on the country where the agent is located (Agent Location by Country). The color highlighting the country changes based on the agent count.
  • An interactive "fish tank" visual. In this visual, each fish represents an OMS agent. the size of the fish presents the number of heartbeats generated by the agent. So, the older the agent (fish) is, the more heartbeat will be generated to the OMS workspace (fish tank), and the bigger the fish will become.
  • A Brick chart shows the percentage (this chart contains 100 tiles) of the agent by OS type (Linux vs Windows).
  • A tornado chart shows agent distribution by country. Agent OS type is also separated in different colors.
  • A Pie Chart shows agent distribution by management groups (SCOM attached vs direct attached vs Linux agents)
  • Agent version Donut chart that separates agent counts by agent version numbers (both Windows agents and Linux agents).

This report is made up of 6 visuals:

  • Filled Map
  • Enlighten Aquarium (Custom Visual)
  • Brick Chart (Custom Visual)
  • Tornado Chart (Custom Visual)
  • Pie Chart
  • Donut Chart GMO (Custom Visual)

You must download the custom visuals from the Power BI Visuals Gallery at https://app.powerbi.com/visuals/ before you start creating the report.

You can either browse the gallery and locate each custom visual manually, or you can use the links below to download these visuals:

Once all the visuals are downloaded, you can follow the steps listed below to create the OMS Agent Statistics report:

Note: The steps listed below are based on the Power BI portal with the "Early access to features" setting turned off. If you have turned on this setting in the General tab of the Settings page (shown in Figure 48), the portal UI will look different than the steps below. However, it does not prevent you from creating reports.

FIGURE 48. POWER BI "EARLY ACCESS TO FEATURES" SETTING

  • Click on the OMS – Agent Heartbeat dataset from the left navigation pane of the Power BI portal. This dataset was created by the OMS Power BI injection rule that we created earlier. You will be presented with a blank canvas. As you can see from the Fields column in Figure 49, there are two tables within the dataset - Count and Results. The Results table should contain all the fields you included in your OMS Power BI injection rule.

FIGURE 49. ACCESSING DATASET IN POWER BI PORTAL

  • Import the custom visuals that you downloaded earlier by clicking on the "…" icon in the Visualizations section and choosing the Import a custom visual option as shown in Figure 50. You can only import one visual at a time.

FIGURE 50. IMPORTING CUSTOM VISUALS

  • As shown in Figure 51, add a title for the report by adding a text box to the upper left corner of the canvas, then add the text "Agent Statistics". Format the text to your liking.

FIGURE 51. ADDING A REPORT TITLE

  • Add the Filled Map visual to the canvas and position it below the title text box.

Tip: Hovering your mouse over each of the visualization icons in the Visualizations section will reveal the name of the visual in Power BI.

  • Configure the Filled Map visual as shown in Figure 52 by dragging the fields listed below from the Fields column on the right to the relevant field in the visual.
    • Location: RemoteIPCountry
    • Color Saturation: Count of Computer (Distinct)

FIGURE 52. CONFIGURING FILLED MAP VISUAL FIELDS

  • Click on the Format tab of the Filled Map visual (shown in Figure 53) and add the title "Agent Location by Country". Once completed, the report will look similar to Figure 54.

FIGURE 53. FORMATTING FILLED MAP VISUAL

FIGURE 54. FILLED MAP VISUAL ON THE WIP REPORT

  • Now add the Enlighten Aquarium visual to the top right corner of the canvas.
  • Configure the aquarium visual with the mapped field information listed below and as shown in Figure 55.
    • Fish: Computer
    • Size: Count of Id

FIGURE 55. CONFIGURING THE ENLIGHTEN AQUARIUM VISUAL

  • Go to the format tab, and configure the aquarium visual title as "OMS Agent by Heartbeat Count".
  • Next, add the Brick Chart visual to the bottom left corner of the report canvas.
  • Configure the Brick Chart with the following field mappings:
    • Category: OSType
    • Value: Count of Computer (Distinct)
    • Title: Agent OS Type
  • Add the Tornado Chart visual to the right-hand side of the Brick Chart.
  • Configure the Tornado Chart as the following:
    • Group: RemoteIPCountry
    • Legend: OSType
    • Values: Count of Computer (Distinct)
    • Title: Agent Distribution by Country
  • Add the Pie Chart visual to the right-hand side of the Tornado Chart.
  • Configure the Pie Chart visual as the following:
    • Legend: ManagementGroupName
    • Details: ManagementGroupName
    • Values: Count of Computer (Distinct)
    • Title: Agent Distribution by Management Groups
  • Add the Donut Chart GMO visual to the right-hand side of the Pie Chart.
  • Configure the Donut Chart GMO visual as the following:
    • Legend: Version
    • Primary Measure: Count of Computer (Distinct)
  • Go to the Format tab of the Donut Chart GMO visual and configure the following:
    • Under Legend, set the Primary Measure to "Value"
    • Set the title to Agent Version
  • Use the Save icon located in the right corner to save the report. The report is now created and should look similar to our example shown earlier in Figure 47.

In the Agent Statistics report you just created, the Filled Map visual shows the countries of the agents' physical locations. The Enlighten Aquarium is an animated visual where the fish tank represents the OMS workspace, and each fish represents an agent computer. The size of the fish in the tank represents the number of heartbeats the agent has sent to OMS. Therefore, the bigger the fish is, the longer the agent has been managed by your OMS workspace! The bottom four visuals show various statistics of the OMS agents.

Now that the report is created, you can pin either individual visuals or the entire report page to Power BI Dashboards. In our example shown in Figure 56, we will pin the entire page to a dashboard. You can do so by clicking the Pin Live Page button from the top menu of the report, giving the dashboard a name and then clicking the Pin Live button to complete the process.

FIGURE 56. PIN LIVE REPORT PAGE TO DASHBOARD

Once the report page is pinned as a live dashboard, you will see a search box on the top left corner called "Ask a question about your data" (shown in Figure 57).

FIGURE 57. POWER BI DASHBOARD SEARCH BOX

In this search box, you can search data within your report and dataset using natural language such as English. For example, you can "ask" Power BI to "show me agents from Australia". The dashboard page will automatically be refreshed to display the search result (in Figure 58, this question filtered the report to only show agents located in Australia).

FIGURE 58. POWER BI DASHBOARD SEARCH RESULT

Publishing Power BI Reports to Third Party Systems

Power BI allows you to embed reports into third-party systems using an IFrame. By doing so, users are not required to have Power BI accounts to access the report. For example, you may choose to embed your OMS Power BI reports into SharePoint and grant user access within the SharePoint role-based access control (RBAC) environment. By doing so, you overcome the lack of RBAC capability in OMS.

From the top menu of your Power BI report, you can access the embed code by selecting File > Publish to the web as shown in Figure 59.

FIGURE 59. PUBLISHING A POWER BI REPORT TO WEB

The IFrame embed code will display in the pop-up message box shown in Figure 60.

FIGURE 60. POWER BI IFRAME EMBED CODE

Depending on the target system, you can choose either the URL or the IFrame tag and use it on the target system. For example, as you can see in Figure 61, we can embed this report into a page in a SharePoint online site using a web part.

FIGURE 61. EMBEDDING OMS POWER BI REPORT IN SHAREPOINT ONLINE

Building Interactive Power BI Reports for OMS

In the previous example, we created a high-level OMS agent statistics report. This is not the only way we can consume OMS data in Power BI. With the right visuals, we can also build more interactive reports that allow users to consume OMS data without having to learn the OMS search syntax.

Filter and Slicer Visuals

To make reports more interactive, you will often need to use various filter and slicer visuals to scope the data on the report. You will usually need a time slicer to narrow a time window, as well as some filters to filter other fields/attributes (such as computer names, etc.). There are several time slicer and attribute filter visuals you can choose from.

  1. Native Slicer visual: Power BI offers a native slicer visual that you can use for every report. You can use this slicer for not only filtering a single field/attribute but also as a time slicer. Figure 62 shows how you can use the native slicer for both text fields and date time fields.

FIGURE 62. THE POWER BI NATIVE SLICER

  2. Hierarchy Slicer: The hierarchy slicer is a custom visual that you can download from the Power BI visuals gallery:

    https://app.powerbi.com/visuals/show/HierarchySlicer1458836712039. It can be used to filter multiple fields in a hierarchical fashion and you have the option to set the hierarchy order among the fields. Figure 63 shows how you can use the Hierarchy Filter.

    FIGURE 63. HIERARCHY SLICER

  3. Time Brush: The Time Brush visual is a time slicer you can download from the gallery: https://app.powerbi.com/visuals/show/TimeBrush1450434005853. Compared with the native slicer, Time Brush uses a vertical bar chart to display an additional value that you can define. The vertical bar represents the number of rows that have been injected into the table within the time period. Figure 64 shows how you can use it to select a time period.

FIGURE 64. THE TIME BRUSH CUSTOM VISUAL

  4. Timeline: Timeline is another custom time slicer that you can download from the gallery: https://app.powerbi.com/visuals/show/Timeline1447991079100. Unlike the native slicer and the Time Brush slicer, where you need to drag the cursor to highlight a time window, with the Timeline visual, you can select a precise time window of your choice (i.e. a day, a week, a month, a quarter, etc.). Figure 65 shows what it will look like on your report.

FIGURE 65. THE TIMELINE CUSTOM VISUAL

  5. Smart Filter by SQLBI: The Smart Filter is a custom visual that allows you to manually enter the value of the field that you wish to filter. You can download this visual from the gallery here:

    https://app.powerbi.com/visuals/show/SmartFilterBySQLBI1458262140625. You may decide to use this one instead of the native slicer or the hierarchy slicer if the number of possible values for the field is too large (i.e. filtering computers in a large OMS workspace). This visual also supports multi-select. Figure 66 shows you how to use the Smart Filter visual in your report.

FIGURE 66. THE SMART FILTER VISUAL

There are many other visuals that you can find in the custom visuals gallery, and there will be more and more new visuals published over time. It is impossible to go through them all in this book. However, the five visuals listed in this section are the most common visuals used by the author.

To demonstrate how you can use a Power BI report to search and consume OMS data, we will build two more reports in this section:

  • Interactive Log Searching report based on OMS Windows Event data
  • Capacity Forecasting report based on OMS performance data

Report Sample: Windows Event Log Report

The Windows event log report we are going to build is created using the following four visuals:

  • Smart Filter (custom visual)
  • WordCloud (custom visual)
  • Table
  • Slicer

This report allows users to search Windows Event log data by computer and the words within the event source. Figure 67 shows what the report looks like when it's finished.

FIGURE 67. WINDOWS EVENT LOG REPORT

To build this report, follow the steps listed below:

  • Create an OMS Power BI injection rule using the search query "Type=Event", and configure the rule to run every 15 minutes.
  • In the Power BI portal, create a new report based on the dataset used by the OMS Power BI Injection rule you created in the previous step.
  • Download and import the Smart Filter and WordCloud custom visuals to the report.
  • Add a Smart Filter visual to the report canvas and configure it by dragging the "Computer" field from the Results table of the dataset over to the "Field" field of the Smart Filter visual. Optionally, you can also format the Smart Filter visual the way you like.
  • Add a WordCloud visual to the canvas and configure the visual by:
    • Adding the "Source" field from the Results table of the dataset to the "Category" field of the visual.
    • Adding the "Id" field from the Results table of the dataset to the "Values" field of the visual. Configure "Values" field to use "Count" id.
  • Now add a Table visual to the main area of the canvas, and configure (drag and drop) all of the following fields from the Results table of the dataset on to the "Values" field of the visual:
    • TimeGenerated
    • Computer
    • EventLog
    • Source
    • EventLevelName
    • EventID
    • RenderedDescription
  • The final visual we need to add is a Slicer and we will position this one at the bottom of the report canvas. Drag and drop the "TimeGenerated" field from the Results table of the dataset to the "Field" field of the visual.
  • Save the report when you have added and configured all required visuals. To use the report, users can use the Smart Filter visual to select one or more computers from a drop-down list. The report will automatically refresh to only display information related to the selected computers.

The WordCloud visual displays the phrases extracted from the event source of the event log. The font size represents the number of records that have the phrase within the event source. The bigger the font is, the more frequently it has appeared in the event source. Users can click on a word in the WordCloud visual, and the report will be updated to only show the event log entries that have the search phrase in the event source. For example, if you want to check all Kerberos-related events on computer AD02 within a particular time period, you can select AD02 from the Smart Filter visual, click on the word "Kerberos" in the WordCloud visual, and then use the slicer visual to select the time window you are interested in (as shown in Figure 68).

FIGURE 68. SEARCHING WINDOWS EVENT LOG ENTRIES USING POWER BI REPORT

As you can see, when using this report to search event log entries, users do not need to understand the OMS search syntax, which makes it a preferred method for less-experienced users to consume OMS data.
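
To put this in perspective, a user performing the same Kerberos search manually in the OMS portal would need to type a query along the lines of the following sketch (the computer name and the keyword are purely illustrative, and the keyword filter searches the whole record rather than just the event source field):

Type=Event Computer=AD02 "Kerberos"

With the report, the same result is achieved with a few clicks on the Smart Filter, WordCloud and Slicer visuals.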

Report Sample: Performance Forecasting Report

OMS does a great job collecting performance data from both Windows and Linux computers and can collect data as often as every ten seconds. Although you can display performance graphs of the performance data you are collecting on the OMS portal, the performance graphs in OMS do not perform forecasting and trend analysis.

However, support for forecasting has recently been announced in Power BI. Now you can easily produce a report based on OMS performance data in Power BI and configure it to perform forecasting as well as trend analysis.

Disk space forecasting has always been a common use case scenario and in this section, we will build a Power BI report that forecasts logical disk free space using OMS performance data (as shown in Figure 69).

Note: at the time of writing this book, there are some limitations when injecting OMS performance data into Power BI. These limitations will be discussed in detail in the next section of this chapter.

FIGURE 69. LOGICAL DISK FREE SPACE FORECASTING REPORT

You can follow the steps listed below to create the OMS Power BI injection rule, and then the Power BI report:

  • Configure your OMS workspace to collect the Logical Disk % Free Space counter (LogicalDisk(*)\% Free Space).
  • Create a Power BI injection rule in OMS with the following inputs:
    • Query: Type=Perf ObjectName=LogicalDisk | measure avg(CounterValue) As AvgCounterValue by CounterName,Computer,InstanceName
    • Name: Power BI – Logical Disk Perf
    • Schedule: 15 Minutes
    • Dataset Name: OMS Logical Disk Perf
  • Sign in to the Power BI console using the Power BI service account
  • Click on the newly created Power BI – Logical Disk Perf dataset to start creating a new report.
  • Download and import the following custom visuals to the report:
    • Hierarchy Slicer
    • Timeline
  • Drag and drop the CounterName field from the "Results" table to the Page level filters field and select the % Free Space box from the filter list as shown in Figure 70.

FIGURE 70. PAGE LEVEL FILTER SETTING

  • Add a text box located at the top of the report for the report title.
  • Add the Hierarchy Slicer visual to the report and drag and drop the following two fields from the Results table to the "Fields" field of the Hierarchy Slicer (in the correct order):
    • Computer
    • InstanceName
  • Format the Hierarchy Slicer the way you want, and make sure the Single Select option is left at its default value of "On", as shown in Figure 71.

FIGURE 71. HIERARCHY SLICER SINGLE SELECT SETTING

  • Add a Line Chart visual to the main section of the report canvas.
  • Drag and drop the "TimeRead" field from the "Results" table to the Axis field of the Line Chart visual.
  • Add the "AvgCounterValue" field from the "Results" table to the Values field of the Line Chart, and configure it to use the Average value using the drop-down menu as shown in Figure 72.

FIGURE 72. CONFIGURING LINE CHART VISUAL

  • Now click the icon for the Analytics tab of the Line Chart visual and add a trend line as shown in Figure 73.

FIGURE 73. ADDING TREND LINE TO THE LINE CHART

  • Staying in the Analytics tab, scroll down to the Forecast menu and add a forecast to suit your requirements similar to our example in Figure 74.

FIGURE 74. CONFIGURING FORECAST IN THE LINE CHART

  • Use the Format tab to configure the Line Chart visual as per your requirements.
  • The final visual that we need to add to this report is the Timeline visual. Place this visual at the bottom of the report canvas.
  • Add the "TimeRead" field from the "Results" table to the "Time" field of the TimeLine visual.
  • Use the Format tab of the TimeLine visual to configure it to suit your requirements and a good tip here might be to disable the Labels setting to save some space on the report canvas.
  • Click "Save" to save the report and the report is now created.

To use the report, select a performance counter instance (a logical disk) with the Hierarchy Slicer, and then select a time window using the Timeline visual.

As shown earlier in Figure 69, the performance graph displays a trend line (black dotted line) and a forecast line (solid black line). The forecasting confidence interval is represented by the gray area on the graph.

Optionally, you can also add other lines such as Percentile, Min, Max, Average, Median and Constant lines to the line graph.

Hint: the longer the time window you have selected, the more accurate the forecasting data is likely to be.

OMS Performance Data Injection Limitations

As mentioned previously, OMS uses the Power BI REST API to inject data into your Power BI workspace. The Power BI REST API has some limitations that affect injecting a large volume of data, such as OMS performance data, into Power BI. In particular, the following three limitations will impact injecting performance data into Power BI:

  • 10,000 max rows per single POST rows request
  • 1,000,000 rows added per hour per dataset
  • 5,000,000 max rows stored per table in a dataset with the "none" retention policy

In OMS, when you search for performance data using the query Type=Perf, unaggregated raw performance data is returned in the search results. There is no need to store raw performance data in Power BI. Therefore, to overcome these limitations, you should always inject aggregated performance data into Power BI. You can perform the data aggregation in the OMS search query that you use for the Power BI injection rules. To produce the aggregated value, use measure avg(CounterValue) As AvgCounterValue in the search query (as shown in the Performance Forecasting report example previously).
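
If you need to reduce the row count even further, you can also aggregate the counter values into fixed time intervals with the INTERVAL keyword. The query below is only a sketch; the 15-minute interval is an arbitrary choice that should be tuned to your reporting needs:

Type=Perf ObjectName=LogicalDisk | measure avg(CounterValue) As AvgCounterValue by CounterName,Computer,InstanceName INTERVAL 15MINUTES

With this aggregation, each counter instance contributes at most four rows per hour to the dataset, which keeps you well clear of the per-request and per-hour limits.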

However, this brings us to another challenge. If we are to produce a generic Power BI report for all OMS performance data, the Results table in the dataset must contain the following 5 fields:

  • ObjectName - The object that you are collecting the performance data for (such as LogicalDisk).
  • CounterName - The performance counter that you are collecting (such as % Free Space).
  • InstanceName - Some counters have more than one instance on a computer; InstanceName differentiates the instances on the same computer (e.g. C:, D:, and _Total).
  • Computer - Identifies the name of the computer the performance data is collected from.
  • CounterPath - Contains the full path to the performance counter. It is made up of the computer name, the ObjectName, the InstanceName and the CounterName, for example:

    \\server01.yourcompany.com\LogicalDisk(_Total)\Disk Writes/sec

    This field is the only unique field when it comes to performance data. We can use it to identify individual counter instances in a Power BI performance report.

Currently, there is a limitation within the OMS search language whereby, when you aggregate data using the measure command, you can only specify up to three fields in the by clause of the search query. The following query is an example of this limitation:

Type=Perf ObjectName=LogicalDisk | measure avg(CounterValue) As AvgCounterValue by CounterName,Computer,InstanceName

This limitation prevents you from using a common dataset for OMS performance data. As a result, you will not be able to produce a generic report that works for all performance counters collected by OMS.

To work around the Power BI REST API limitation, as well as the OMS search syntax limitation, you will need to create multiple OMS Power BI injection rules for multiple performance counter object names, and use multiple Power BI datasets. Obviously, with this workaround, you may have to produce multiple similar Power BI reports, since a report can only be based on one dataset in Power BI. You will need to decide, case by case, which two of the five fields mentioned above can be excluded from the OMS search query.

If you followed the previous example and created the logical disk free space forecasting report, you have created a dataset just for the performance data for the LogicalDisk objects. You can follow this pattern and use separate Power BI datasets for each object type in OMS performance data.
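
For example, to cover processor utilization alongside the logical disk data, you could create a second injection rule with its own dataset using a query sketched like the one below (the object name and the grouping fields are illustrative; adjust them to whichever two fields you decided to drop):

Type=Perf ObjectName=Processor | measure avg(CounterValue) As AvgCounterValue by CounterName,Computer,InstanceName

A separate Power BI report is then built on top of the Processor dataset, mirroring the structure of the logical disk report.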

Normally, since CounterPath is unique, in a Power BI Line Chart you can use CounterPath as the legend so that you can display more than one performance line in the visual. However, if you are building reports purely for forecasting purposes, CounterPath will not be used, because you must leave the "Legend" field empty in the Line Chart if you are going to configure forecasting (as shown in Figure 75). Without a Legend field, the Line Chart can only display a single line.

FIGURE 75. LINE CHART CONFIGURATION IN AN IDEAL SITUATION

Finally, many performance counters only have a single instance on a computer, so these counters do not have the InstanceName field (for example, Processor % Processor Time). When working with these counters, you do not have to include InstanceName in the search result.
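
For such single-instance counters, a sketch of the injection rule query can simply drop InstanceName from the by clause and group on two fields only (the Memory object is used here purely as an illustration):

Type=Perf ObjectName=Memory | measure avg(CounterValue) As AvgCounterValue by CounterName,Computer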

The bottom line is that, given these limitations, you need to design your OMS search queries, Power BI injection rules, and Power BI datasets carefully so you can produce meaningful and accurate Power BI reports using only three fields from the OMS performance data.

Note: The author has reported this issue on OMS Log Analytics User Voice and suggested increasing the OMS search syntax limit from three fields to five fields. At the time of writing this book, there has not been any progress. If you agree with the author and would like to see a fix for this issue, please cast your vote at https://feedback.azure.com/forums/267889-log-analytics/suggestions/17396725-increase-the-supported-number-of-grouping-fields-f

Monitoring Power BI Injection Results

Natively in OMS, when a Power BI injection rule is executed, an alert containing the injection result is logged in your workspace. You can access all Power BI injection rule results using the following search query:

Type=Alert SourceDisplayName=PowerBIConnector

In each search result, as shown in Figure 76, the alert description shows the number of rows injected to Power BI.

FIGURE 76. POWER BI RULE ALERT DESCRIPTION

Since the number of rows is exposed in the alert description, you can extract this value as a Custom Field. You can create a Custom Field based on the number in the Alert Description (the number 245 in our example shown in Figure 77) and name it "PowerBIUploadRowCount_CF". Once it is created, any new entries will have this field populated:

FIGURE 77. CREATING CUSTOM FIELD FOR POWER BI UPLOAD ROW COUNT

When the Power BI injection fails, it often means that you have been impacted by the Power BI REST API limitations that we discussed previously. You can create an OMS Alert rule to notify you when the upload count for any job is 0. The alert rule can use this query:

Type=Alert SourceDisplayName=PowerBIConnector PowerBIUploadRowCount_CF=0

Note: You will need to create the custom field before creating the alert rule. Refer to the Alert Management chapter of this book if you need assistance creating this OMS Alert rule.

Note: Custom Fields are discussed earlier in this chapter. You can refer to the Custom Fields section of this chapter if you need help creating this Custom Field.
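
If you just want a quick overview of which injection rules are returning zero rows, without creating an alert rule, a search along the lines of the following sketch groups the failed executions by rule name (the assumption here is that the rule name is carried in the ObjectDisplayName field, as used in the next query):

Type=Alert SourceDisplayName=PowerBIConnector PowerBIUploadRowCount_CF=0 | measure count() by ObjectDisplayName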

You can also check the total number of rows injected per Power BI injection rule by utilizing the PowerBIUploadRowCount_CF custom field in the following search query (shown in Figure 78):

Type=Alert SourceDisplayName=PowerBIConnector | measure sum(PowerBIUploadRowCount_CF) As TotalRows by ObjectDisplayName

FIGURE 78. LIST THE TOTAL NUMBER OF ROWS INJECTED BY EACH POWER BI INJECTION RULE

As mentioned previously, the Power BI dataset that the OMS Power BI injection rules create does not use Power BI's data retention policy. Therefore, the maximum number of rows that can be stored in a table is 5,000,000 (five million) rows. Once this limit is reached, the OMS Power BI rule will stop injecting data into the dataset. Currently, the Power BI REST API does not expose any methods to delete a subset of rows from a table. However, it does offer you a way to clean up the table by deleting all the rows. You can perform this action without having to write any code using the Apiary interactive API Console at http://docs.powerbi.apiary.io/.
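
Before resorting to clearing the table, it can be useful to estimate how close a dataset is to the five million row cap. The sketch below sums the upload counts for the injection rule created earlier in this chapter (again assuming that the ObjectDisplayName field carries the injection rule name, as in the previous query):

Type=Alert SourceDisplayName=PowerBIConnector ObjectDisplayName="Power BI – Logical Disk Perf" | measure sum(PowerBIUploadRowCount_CF) As TotalRows by ObjectDisplayName

Keep in mind that this only counts the rows reported within your OMS data retention period, so treat the result as an approximation.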

Note: You can also write your own code to programmatically delete rows from a table in a Power BI dataset. This is not covered in this book.

You can follow the steps listed below to empty a table in a dataset:

  • Open a browser window and go to the Power BI Apiary interactive API Console http://docs.powerbi.apiary.io/
  • On the left navigation menu, under REFERENCE, browse to Datasets and then Datasets Collection. Click on List all Datasets.
  • Click the Switch to Console button on the right section of the page as shown in Figure 79.

    FIGURE 79. ACCESSING LIST ALL DATASETS API FROM APIARY CONSOLE

  • Click the Authenticate button and enter your Power BI account credentials.
  • If prompted with the pop-up shown in Figure 80, click Accept to grant Apiary access to your Power BI workspace.

FIGURE 80. GRANTING APIARY ACCESS TO POWER BI WORKSPACE

  • Once authenticated, click on List all Datasets again, and then click the Call Resource button located in the pane on the right and shown in Figure 81.

FIGURE 81. CALLING LIST ALL DATASETS API ON APIARY CONSOLE

  • Once the HTTP request has been made and the response is returned, a list of datasets will be listed within the response body. Copy the ID of the dataset that you want to delete rows from as shown in Figure 82.

FIGURE 82. GETTING THE DATASET ID FROM THE LIST ALL DATASETS API CALL RESPONSE

  • Browse to the Table Rows section (located in the navigation bar on the left) and click on the Clear the rows in a table option.
  • Under the URI Parameters tab, enter the following fields:
    • Id: <the dataset ID you copied in the earlier step>
    • TableName: Results
  • When you have entered your parameters, click the Call Resource button shown in Figure 83.

FIGURE 83. DELETING ALL ROWS FROM A TABLE USING REST API

  • Verify the API call result. Make sure the response code is 200.
  • Repeat the same steps to invoke the Clear the rows in a table API call again, this time for the Count table in the same dataset.

After the tables have been cleared using the Apiary console, you can verify that they are empty by going to the Power BI console and dragging a random field from the table onto a new report canvas. Nothing should be listed in the resulting visual (as shown in Figure 84).

FIGURE 84. COMPARING ROW COUNT IN THE RESULT TABLE BEFORE AND AFTER THE DELETION

Based on the author's testing, after removing all the rows from the tables, the problematic OMS Power BI injection rules that are using this Power BI dataset will start working again.

Note: You may not like the approach of deleting all the data from a Power BI dataset. At the time of writing this book, Power BI integration in OMS is still in public preview, and the author cannot say whether the data retention and dataset limitations will change in the future. You can always share your feedback with the OMS product team via the Log Analytics User Voice site: https://feedback.azure.com/forums/267889-azure-operational-insights

Removing OMS Power BI Injection Rules

When a Power BI report is no longer required, you may decide to remove the Power BI reports, dashboards and datasets. However, you must also remove the corresponding OMS Power BI injection rules from the OMS portal.

As shown in Figure 85, Power BI injection rules can be removed in the OMS portal from the Power BI tab under Settings by using the X icon.

FIGURE 85. DELETING OMS POWER BI INJECTION RULES

You will also have to manually delete the saved search created by the Power BI injection rule if it is no longer in use.

Solution Targeting

By default, when you enable a Management Solution in OMS Log Analytics, the solution is targeted at all the OMS agents that report to the Log Analytics workspace. This may not be desirable when you need to control the cost or the network bandwidth consumed by OMS, and you may want certain solutions to apply only to specific workloads such as domain controllers or SQL servers. To meet this requirement, OMS Log Analytics now provides a mechanism that lets you target a solution to a subset of OMS agents using computer groups. The Solution Targeting feature is documented at https://docs.microsoft.com/en-us/azure/operations-management-suite/operations-management-suite-solution-targeting.

Note: The OMS Solution Targeting feature is in Public Preview at the time of writing this book. The feature may be different than what we have discussed in this chapter when it becomes generally available.

Configuring Solution Targeting

In order to scope a solution to a subset of OMS agents, you will first need to create one or more computer groups that contain the target computers. You can either use manually created groups or groups imported from external sources such as Active Directory, SCCM or WSUS. We discussed groups earlier in the OMS Computer Groups section of this chapter.
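
If you have not created the group yet, a computer group saved search for SQL servers might be sketched as follows; the naming convention used in the filter is purely an assumption, and in practice you may prefer an imported group instead:

Type=Heartbeat Computer=SQL* | measure count() by Computer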

Once the group is created, you will need to use the Azure portal https://portal.azure.com/ to configure the scope. Follow the steps listed below to target a solution (SQL Assessment in this example) to a computer group (the SQL Computer Group we created earlier in this chapter):

  1. In the Azure portal, browse to your Log Analytics workspace and locate "Scope Configurations" under Workspace Data Sources (as shown in Figure 86)

FIGURE 86. OMS WORKSPACE SCOPE CONFIGURATIONS

  2. At the Scope Configurations view, click the Add button to create a new scope configuration.
  3. From the Create a Scope Configuration page, assign a name to the scope and select one or more computer groups to be included (as shown in Figure 87). Click Select then hit OK to complete the scope creation.

FIGURE 87. CREATING SCOPE FOR OMS LOG ANALYTICS WORKSPACE

  4. Now use the Solutions option from the Log Analytics menu in the Azure portal to browse to the solution that you wish to scope. From the menu within your solution (SQL Assessment in our case) click the "Solution Targeting" option under Workspace Data Sources then click "Add scope configuration" (as shown in Figure 88):

FIGURE 88. ASSIGNING A SCOPE TO SOLUTION

  5. Select the scope that you have previously created (as shown in Figure 89).

FIGURE 89. SELECTING THE SCOPE FOR THE SOLUTION

  6. The new scope will now appear under the Solution Targeting section with its status shown as "Initializing" (as shown in Figure 90).

FIGURE 90. SCOPE ASSIGNED TO THE SOLUTION

Solution Targeting Limitations

Currently, the Solution Targeting feature has several limitations:

  • Not all solutions can be targeted. Only solutions that collect data from OMS agents are applicable. Solutions that collect data from Azure resources (such as Key Vault Analytics) cannot be targeted.
  • Only solutions provided by Microsoft can be targeted. Any custom solutions created by yourself or 3rd party ISVs (Independent Software Vendors) cannot be targeted.
  • The "Agent Health Assessment" solution also cannot be scoped.
  • You can only scope to Direct-Connected OMS agents. Any agents enrolled via SCOM management groups cannot be targeted. When applying a scope that contains SCOM agents, you will receive an error a few minutes after the scope is applied to the solution (as shown in Figure 91 and Figure 92).

FIGURE 91. SCOPE THAT FAILED TO BE APPLIED

FIGURE 92. DETAILS OF SOLUTION TARGETING FAILURE

Summary

At the start of this chapter, we introduced you to the Saved Search feature and then walked you through creating and importing groups in OMS. After that, we explored how to work with Custom Fields and Custom Logs. Although there will be a learning curve when you first start using these search queries and features, these skills should make it much easier for you to work with OMS Log Analytics.

Then, we showed you how to leverage the OMS Power BI integration feature to surface the data collected by your OMS workspace in Power BI so you can benefit from the rich reporting and dashboarding capabilities that Power BI provides. When designing your Power BI reports, not only can you use the native visuals in Power BI, but you can also choose from the large selection of custom visuals that Microsoft and the Power BI community have released free of charge on the Power BI Visuals Gallery. Building reports has never been so much fun!

Finally, we discussed how you can leverage the OMS Log Analytics Solution Targeting feature to scope certain Microsoft solutions to a subset of OMS agents.