Operations Management Suite (OMS): Custom OMS Solutions Integration

In this chapter, we will focus on how to extend Operations Management Suite (OMS) Log Analytics by authoring your own custom solutions for applications running in your environments. We will cover the following components which are the building blocks of your custom OMS solutions:

  • SCOM collection rules for OMS
  • AzureRM.OperationalInsights PowerShell module
  • OMS related APIs
    • Log Analytics Search API
    • HTTP Data Collector API
    • OMS Alert API
    • Service Map API
  • OMS View Designer
  • Azure Resource Providers and ARM templates

SCOM Collection Rules for OMS

When you have connected your System Center Operations Manager (SCOM) management groups to your OMS workspace, a number of management packs are pushed down from OMS to your management group and then distributed to the SCOM agents that are enrolled into OMS. These management packs expose many modules that we can leverage in our own management packs.

Unlike SCOM, OMS does not utilize classes. Therefore, it is not possible to target collection rules to any specific types of computers (i.e. SQL database server, Hyper-V hosts, etc.) natively in OMS. Luckily, since the OMS management pack modules have been made available within SCOM, once we have connected our SCOM management group to OMS, we are able to author custom SCOM data collection rules that send various types of data to OMS.

In the following sections, we will demonstrate how to collect the following types of data from endpoints managed by OMS-connected SCOM management groups:

  • Event Data (Type=Event)
  • Near Real-Time Performance Data (Type=Perf)

Note: Although it is not possible to target OMS rules to specific classes, once you have connected your SCOM management group to OMS, it is possible to utilize SCOM overrides to fine-tune OMS rules. More details can be found from Cameron Fuller's blog post: http://blogs.catapultsystems.com/cfuller/archive/2015/12/02/changing-how-oms-gathers-performance-information-from-systems-in-operations-manager/

We will also cover the following areas:

  • Setting up your authoring environment.
  • OMS management pack (MP) module types.
  • Authoring for optimal performance by leveraging the SCOM Cookdown feature.
  • Creating and collecting custom performance data through the SCOM console.

Let us begin with setting up the authoring environment.

Setting Up the Authoring Environment

The Microsoft Monitoring Agent (MMA) is the agent that SCOM uses to collect monitoring data from Microsoft workloads, and the same agent is also used by OMS to collect data from endpoints.

When you have connected your SCOM management groups to OMS, depending on the solutions you have enabled in OMS, a list of MPs will be sent to your SCOM management groups via OMS. These MPs expose many OMS specific management pack elements (e.g. - data source and write action modules) which you can use to develop your own custom solutions for the agents managed by your SCOM management groups.

In this section, we will discuss and demonstrate how to write your own custom SCOM MP workflows that interact with OMS. We will demonstrate how to build different types of MP workflows using Visual Studio 2015 and the Visual Studio Authoring Extension (VSAE). You are also able to download the Visual Studio project, the demo MP and all other source code we have used in this chapter from our GitHub repository at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17 directory.

If you wish to work through the exercises in this chapter, you will need to install the following software onto your computer:

  • Visual Studio 2015
  • Visual Studio Authoring Extension (VSAE)
  • Microsoft Monitoring Agent (MMA)
  • .NET Framework 3.5

Note: If you do not have a licensed version of Visual Studio 2015, you can use the free Visual Studio 2015 Community Edition instead.

Before we start, we expect that you are familiar with SCOM MPs and have previous experience in authoring SCOM MPs using VSAE. If you do not have any previous MP authoring experience using VSAE, we strongly recommend that you complete the free online SCOM 2012 R2 MP authoring course from the Microsoft Virtual Academy before continuing with this chapter: https://www.microsoftvirtualacademy.com/en-US/training-courses/system-center-2012-r2-operations-manager-management-pack-8829.

As a prerequisite for VSAE, the Microsoft Monitoring Agent (MMA) must also be installed on the authoring computer. You can either install the MMA version that is shipped with SCOM 2012 R2 or SCOM 2016, or download it from your OMS portal.

It is also a good idea to export all the OMS-related management packs to unsealed XML format so you are able to read the content of these MPs during the authoring process.

Installing Required Software

Let us start with Visual Studio. For this example, we will assume you will be using Visual Studio 2015 Enterprise Edition. You can simply install Visual Studio 2015 with the default options, as shown in Figure 1.

FIGURE 1. VISUAL STUDIO 2015 ENTERPRISE DEFAULT INSTALLATION OPTIONS

Note: At the time of writing this book, the latest version of the VSAE (version 1.2.0.1) only supports Visual Studio 2012, 2013 and 2015. It does not support Visual Studio 2017. Therefore, we are using Visual Studio 2015 in this chapter.

Next, we will need to install the following components before installing VSAE:

  • Microsoft Monitoring Agent (MMA)
  • .NET Framework 3.5

Note: In this case, we will download the Microsoft Monitoring Agent installer from the OMS portal and install it on the authoring computer. You may also install the SCOM 2012 SP1/R2 or SCOM 2016 version with the latest Update Rollup.

MMA is required because the MP Simulator within VSAE requires the assemblies that come with MMA. However, once you have installed the MMA on the authoring computer, it is not required to connect this computer to your SCOM management group or OMS workspace. If you use the SCOM console to push the MMA to your authoring computer, the computer will be automatically connected to your SCOM management group. Alternatively, if you do not want to connect the computer to your SCOM management group, you can manually install the agent from the SCOM installation media ("<Installation media>\agent\AMD64 or i386\MOMAgent.msi").

For the same reason, .NET Framework 3.5 is also required when you are using the MP Simulator within Visual Studio. Visual Studio will crash when you launch the MP Simulator if .NET Framework 3.5 is not installed.

Note: If you are pushing the agent via the SCOM console, the installed agent will be on the same Update Rollup level as the management group. However, if you manually install the agent from the installation media, it is always a good idea to also install the latest update rollup after the agent installation is complete.

After VSAE is installed, you will see the various SCOM and System Center 2012 R2 Service Manager (SCSM) MP templates when creating new Visual Studio projects, as shown in Figure 2.

FIGURE 2. NEW VISUAL STUDIO MANAGEMENT PACK PROJECT TEMPLATES

Installing Optional Software

Although it is not required when authoring MPs, we also recommend the following software for your authoring PCs:

Notepad++

In this chapter, we will use Notepad++ to view the unsealed MP XML files. We recommend you install it on your authoring PC as well.

SCSM Entity Explorer

We will also use SCSM Entity Explorer to navigate SCOM monitoring classes (targets for our OMS collection rules). Although it was written primarily for SCSM, it also works with SCOM. There is no need to install it, simply download it from the TechNet Gallery (https://gallery.technet.microsoft.com/SCSM-Entity-Explorer-68b86bd2) and place it on your authoring PC. We will use it later.

MPViewer

MPViewer is a very handy utility when you need to view the content of an MP. You can also use it to unseal a sealed MP or MP bundle to XML files. You can download the latest version, 2.3.3, as 6560.MPViewer.2.3.3.zip

SCOM Operations Console

Although it is not required, we also recommend you install the SCOM Operations Console (2012 or 2016 depending on your SCOM version) onto your authoring machine. Having the console installed makes things easier when importing and testing your MPs. It is important that you also patch the console to the same Update Rollup level as your management group.

PowerShell Tools for Visual Studio 2015

As we will be writing PowerShell scripts to be used in the demo MP, we also recommend that you install the PowerShell Tools for Visual Studio 2015. This is a Visual Studio Extension. You can find it within Visual Studio by going to Tools -> Extensions and Updates, as shown in Figure 3.

FIGURE 3. ACCESSING VISUAL STUDIO EXTENSIONS AND UPDATES

In the Extensions and Updates window, select the "Online" category from the left, and you can find the tools by searching for "PowerShell" in the search box, as shown in Figure 4.

FIGURE 4. INSTALLING POWERSHELL TOOLS FOR VISUAL STUDIO 2015

Collecting Reference Management Packs

When authoring MPs, you will often need to reference elements from other MPs. When this is required, you will need to locate either sealed management packs (.mp files) or management pack bundles (.mpb files) that contain the elements you need to reference and add them into your MP Visual Studio project as references. VSAE ships with many built-in MPs. After the VSAE is installed, you will find these MPs located in "C:\Program Files (x86)\System Center Visual Studio Authoring Extensions\References" folder, as shown in Figure 5.

FIGURE 5. VSAE REFERENCE MANAGEMENT PACKS LOCATION

Since VSAE only comes with the built-in essential MPs for different versions of SCOM and SCSM, you will not be able to find application-specific MPs in VSAE, such as SQL or SharePoint MPs. Therefore, we recommend that you create another folder on your authoring computer and place all other required reference MPs in this folder. If you are using multiple computers when authoring MPs, using a folder that can be accessed by multiple computers might be a better option (i.e. a network share or OneDrive).

As a rule of thumb when referencing MPs, you should ALWAYS use the minimum required version of the referenced MP. For example, suppose you are referencing a class or a data source module from MP XYZ, this class or data source module was defined in version 1.0.0.0 of MP XYZ, but the most recent version of this MP is version 2.0.0.0. In this case, copy version 1.0.0.0 of MP XYZ to your reference folder and add it to your VSAE project as a reference.

This is to ensure your MP will still be compatible with management groups that are still using version 1.0.0.0 of the MP XYZ, and people can import your MP into their management groups without having to update MP XYZ to 2.0.0.0 first. However, in this example, if the version 2.0.0.0 of the MP XYZ has introduced additional properties in the class or contains bug fixes in the modules that you are referencing that would impact your MP, then you must reference the version 2.0.0.0 in your Visual Studio project.

Exporting OMS Related Management Packs

Since we are going to heavily leverage the MP module types from various OMS-related MPs, it is a good idea to export them to unsealed MPs (in XML files) so we can read their content.

Additionally, since at the time of this writing, Microsoft has not made the sealed version of these MPs available for download, we are unable to reference them in our management pack projects. We will demonstrate a workaround later in this chapter, but for now, even if you are not interested in reading the content of these OMS MPs, export them as we will need to use them later.

Because we are not able to export sealed MPs directly from the SCOM console, we will have to use the SCOM PowerShell module to export them.

To export the OMS related MPs from your SCOM management group, please run the following commands from your SCOM PowerShell console:

Get-SCOMManagementPack -Name "*Advisor*" | Export-SCOMManagementPack Path C:\OMSBook

Get-SCOMManagementPack -Name "*Intelligence*" | Export-SCOMManagementPack -Path C:\OMSBook

These commands will export all OMS related MPs to a C:\OMSBook folder on your computer. Replace "C:\OMSBook" with a folder of your choice. These commands do not show any output on the PowerShell console but once finished, you will be able to see a list of XML files in the folder that you have specified, as shown in Figure 6.

FIGURE 6. EXPORTED OMS RELATED MANAGEMENT PACKS

Creating Key Files for Sealed Management Packs

Even if you are not planning to seal your own MPs, you should still create a key file if you do not already have one, as we will need it to manually seal the required OMS MP that will be used temporarily during our authoring process.

Note: If you already have a key file (.snk file) that you have previously used to seal MPs, you can skip this section.

To create a key file, begin by locating the shortcut for "Developer Command Prompt for VS2015" on your authoring PC where Visual Studio 2015 is installed. You can find this shortcut in the Start Menu under the "Visual Studio 2015" folder, as shown in Figure 7.

FIGURE 7. DEVELOPER COMMAND PROMPT FOR VS2015

Once you have launched the command prompt from the shortcut, you can use the following command to create your key file, as shown in Figure 8:

sn -k <Folder Path>\<Key File Name>.snk

FIGURE 8. CREATING KEY FILES USING SN.EXE
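For example, to create a key file named OMSBook.snk under C:\Documents (the path we will use later in this chapter when sealing MPs):

sn -k C:\Documents\OMSBook.snk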

Note: After the key file (.snk) is created, save it to a secure location. Do not share this key publicly as it ensures the integrity of the MPs sealed by this key.

Resealing OMS Related MPs for Authoring

As mentioned previously, at the time of this writing, Microsoft has not yet made sealed versions of OMS related MPs publicly available. In order to make our authoring experience easier, we will seal the required MPs ourselves so we can add them as references in our Visual Studio management pack projects. Once the MP you are authoring is finalized, we will edit the unsealed MP XML files and seal them manually, outside of Visual Studio. We will demonstrate the manual sealing process later in this chapter.

We are going to use the following management pack in this chapter: Microsoft.IntelligencePacks.Types

You will need to locate the Microsoft.IntelligencePacks.Types.xml file that you exported from your SCOM management group previously. You will also use the key file that you created in the previous section to seal this MP.

To manually seal this MP, on your authoring computer where VSAE is installed, copy FASTSEAL.EXE from "C:\Program Files (x86)\System Center Visual Studio Authoring Extensions\Tools" to another location (i.e. C:\Software).

Once you have copied FASTSEAL.EXE to another location, run the following command from the folder where the copy resides, as shown in Figure 9:

FASTSEAL.exe C:\OMSBook\Microsoft.IntelligencePacks.Types.xml /KeyFile C:\Documents\OMSBook.snk /Company "OMS Book" /OutDir C:\Documents

FIGURE 9. RESEALING OMS MANAGEMENT PACKS FOR AUTHORING PURPOSES

After you have sealed the MP, copy it to the folder you have created previously for storing reference MPs.

Creating Visual Studio Management Pack Project

Now that we have all the prerequisites in place, we can start authoring the MP in Visual Studio.

We are going to create a project that is stored locally on your authoring computer. Although you can also store this project in Source Control systems such as Visual Studio Online or GitHub, source control is not in the scope of this chapter.

To create a project, launch Visual Studio 2015, and choose File -> New -> Project (as shown in Figure 10).

FIGURE 10. CREATING A NEW VISUAL STUDIO PROJECT

In the New Project dialog box, choose "Operations Manager 2012 SP1 Management Pack", and specify the name and location. For the demo MP we are building in this chapter, we are going to name the MP "OMSBook.Demo" and store the project under "C:\Documents\OMSBook" (as shown in Figure 11).

FIGURE 11. CREATING A NEW MANAGEMENT PACK SOLUTION

Note: Since the minimum supported SCOM version for connecting to OMS is 2012 SP1 UR6 and 2012 R2 UR2, it is OK to choose the "Operations Manager 2012 SP1 Management Pack" template from the New Project dialog because SCOM 2012 RTM is not supported for OMS.

Before we start creating any MP elements, we will first configure various properties of this MP.

We will set the Management Pack version to 0.0.0.1 and give it a friendly name: "OMS Book Demo". We can do so by right-clicking the "OMSBook.Demo" project in Solution Explorer and choosing "Properties", as shown in Figures 12 and 13.

FIGURE 12. CONFIGURING MANAGEMENT PACK PROJECT PROPERTIES

FIGURE 13. SPECIFYING MANAGEMENT PACK VERSION AND FRIENDLY NAME

Next, since we are going to build a sealed MP, we will have to fill out the fields under "Build" tab, as shown in Figure 14.

FIGURE 14. CONFIGURING PROPERTIES FOR SEALED MPS

To simplify the process of incrementing MP versions when authoring sealed MPs, you may also find the "Auto-increment version" feature very handy. When you have enabled this feature, the MP version will automatically increase by 0.0.0.1 every time you click "Build Solution" to build the MP. You can enable this feature under the "Deployment" tab of the MP project properties page, as shown in Figure 15.

FIGURE 15. CONFIGURING MP VERSION AUTO-INCREMENT

After we have configured all the properties for the management pack, we will then create an MP fragment and specify the MP display name and description in a <LanguagePack> section. The display name and the description that you specify here are what people will see in the SCOM console once you have imported the MP into your management group. To do so, right click the management pack project, and choose Add -> New Item…, as shown in Figure 16.

FIGURE 16. ADDING NEW ITEMS TO THE MANAGEMENT PACK PROJECT

In the "Add New Item" dialog, choose "Empty Management Pack Fragment" and give it a name. Since we are going to specify the MP display name and description in this fragment, we will call it "ManagementPack.mpx", as shown in Figure 17.

FIGURE 17. CREATING EMPTY MANAGEMENT PACK FRAGMENT

After the ManagementPack.mpx file is created, type the following under the <ManagementPackFragment> XML tag:

<LanguagePacks>
  <LanguagePack ID="ENU" IsDefault="true">
    <DisplayStrings>
      <DisplayString ElementID="OMSBook.Demo">
        <Name>OMS Book Demo Management Pack</Name>
        <Description>This is the demo management pack created in Chapter 17 of the Inside OMS v2 book.</Description>
      </DisplayString>
    </DisplayStrings>
  </LanguagePack>
</LanguagePacks>

Download the Code

You can find the ManagementPack.mpx on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\Demo MP\OMSBook.Demo directory.

The last thing we are going to do before we start writing the MP elements is to add the required reference MPs. We will add the OMS MP that we manually sealed in the previous section, as well as the System.Performance.Library MP. Use the information provided in Table 1 below and add these two MPs as references in the OMSBook.Demo MP. You must name the reference aliases exactly as shown in Table 1.

MP Name | Location | Reference Alias
Microsoft.IntelligencePacks.Types | C:\Documents | IPTypes
System.Performance.Library | C:\Program Files (x86)\System Center Visual Studio Authoring Extensions\References\OM2012SP1 | Perf

TABLE 1: REQUIRED REFERENCE MPS

Note: Previously we have manually sealed the Microsoft.IntelligencePacks.Types MP to the C:\Documents folder. Thus, the location from the table above is listed as C:\Documents. If your location is different, then substitute accordingly.

FIGURE 18. ADDING REFERENCE MPS

Now that the OMSBook.Demo MP is configured, we can start creating modules and workflows in this MP.

Creating OMS Event Collection Rules

Although you can specify which event logs to collect from the Settings page in OMS, the settings are not as granular as those available in SCOM. As shown in Figure 19, when configuring OMS to collect an event log, the only filter you can specify is the event severity (Error, Warning, and Information). Additionally, for every log you have configured here, OMS will create a collection rule targeting all computers that are managed by OMS, regardless of whether the log exists or not.

FIGURE 19. CONFIGURING EVENT COLLECTION IN OMS

Note: When you add an application-specific event log in OMS, you may see an error event with ID 26002 repeatedly logged on computers that do not have the event log you have specified. For example, the event collection rule created by OMS is unable to access the "Microsoft-Windows-Hyper-V-VMMS-Admin" log on a non-Hyper-V server, as shown in Figure 20.

FIGURE 20. EVENT LOG NOT FOUND ERROR EVENT IN OPERATIONS MANAGER EVENT LOG

Unlike OMS, SCOM would normally discover the applications and systems that specific rules and monitors apply to. For example, to collect events from System Center Virtual Machine Manager (VMM), SCOM would first discover the VMM server and target the event collection rules only to the VMM server class.

Also, it is very common to generate event data from script outputs in SCOM. In the following sections, we will demonstrate two sample event collection rules:

  • Collecting Event ID 1002 from the VMM Server Admin event log on VMM servers and saving the events to OMS.
  • Script Based Event Collection Rule for OMS.

Creating Event Collection Rules

In this section, we will demonstrate how to author an event collection rule that collects events with ID 1002 from the VMM servers' "Microsoft-VirtualMachineManager-Server/Admin" log, as shown in Figure 21.

FIGURE 21. THE MICROSOFT-VIRTUALMACHINEMANAGER-SERVER/ADMIN LOG ON A VMM SERVER

Before authoring any workflows, we first need to identify the target monitoring class for our workflow (in this instance, the event collection rule). Since this event log would only exist on SCVMM servers, it is logical to target the rule only to the VMM server. More specifically, we would only want to collect the event from VMM 2012 servers. Now, if we search "VMM" in SCSM Entity Explorer once it is connected to our SCOM management group, we can easily find our target monitoring class "Microsoft.SystemCenter.VirtualMachineManager.2012.VMMManagementServer". We can also see this monitoring class is defined in the "Microsoft.SystemCenter.VirtualMachine.2012.Discovery" MP, as shown in Figure 22.

FIGURE 22. VMM 2012 SERVER MONITORING CLASS AND MANAGEMENT PACK

Now that we have identified our target class and found in which MP the class is defined, we will need to make sure the "Microsoft.SystemCenter.VirtualMachine.2012.Discovery" MP is referenced in our Visual Studio management pack project. We will also need to reference the "Microsoft.SystemCenter.VirtualMachine.Library" MP because the target class "Microsoft.SystemCenter.VirtualMachineManager.2012.VMMManagementServer" is based on the abstract class "Microsoft.SystemCenter.VirtualMachineManager.VMMManagementServer", which is defined in the VMM library MP.

Note: Unlike MPs for other Microsoft products, the VMM MPs are shipped with VMM and cannot be downloaded from the Microsoft MP Catalog. You can find them on your VMM server, under the "<VMM Server Install Directory>\ManagementPacks" folder, as shown in Figure 23.

FIGURE 23. VMM MANAGEMENT PACKS LOCATION

After copying these two MPs to the MP reference folder we created earlier on the authoring PC, we can then add them as references in Visual Studio. We will also name the alias for the VMM 2012 Discovery MP "VMM2012" and the VMM Library MP "VMMLib", as shown in Figure 24.

FIGURE 24. MP REFERENCE FOR VMM 2012 DISCOVERY MP

Next, we will create a folder called "Rules" within the Visual Studio project. Although it is not necessary, we recommend that you create various folders for different types of MP elements (such as class definitions, module types, monitor types, discoveries, rules, monitors, scripts, etc.). It will make your life much easier when working on a complex MP if you place MP fragments into different folders. You can create the folder by right-clicking the MP project in Solution Explorer and choosing "Add" -> "New Folder", as shown in Figure 25.

FIGURE 25. CREATING NEW FOLDER IN VISUAL STUDIO MANAGEMENT PACK PROJECT

Now that the "Rules" folder is created, let us create an empty MP fragment in this folder and name it "VMM.1002.Event.Collection.Rule.mpx"

Note: You can name the MP fragment (.mpx) files however you want, as the file names do not impact the MP. Although VSAE ships with many templates that we could use (such as the Event Collection Rule template), in order for you to better understand the MP XML schema, we will not use any templates in this chapter. Instead, all elements will be written in XML directly.

Once the empty MP fragment is created, you will see nothing but the <ManagementPackFragment> XML tag. Everything you are going to author must be placed within this tag.

The XML code for the event collection rule is listed below. Place it inside the <ManagementPackFragment> tag:

Download the Code

You can find the VMM.1002.Event.Collection.Rule.mpx on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\Demo MP\OMSBook.Demo\Rules directory.
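For orientation, here is a minimal sketch of what the fragment contains, reconstructed from the elements discussed below (the authoritative listing is the GitHub version, and its exact filter expression may differ). It assumes the "Windows" alias for the Microsoft.Windows.Library reference that the VSAE project template creates by default, and that the VMM management server class is hosted by the Windows Computer class:

<Monitoring>
  <Rules>
    <Rule ID="OMSBook.Demo.VMM.Server.Event1002.Collection.Rule" Remotable="false" Enabled="true" Priority="Normal" Target="VMM2012!Microsoft.SystemCenter.VirtualMachineManager.2012.VMMManagementServer" ConfirmDelivery="false" DiscardLevel="100">
      <Category>EventCollection</Category>
      <DataSources>
        <!-- Reads the VMM Admin event log on the VMM server -->
        <DataSource ID="DS" TypeID="Windows!Microsoft.Windows.EventProvider">
          <ComputerName>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$</ComputerName>
          <LogName>Microsoft-VirtualMachineManager-Server/Admin</LogName>
          <!-- Only pick up events with ID 1002 -->
          <Expression>
            <SimpleExpression>
              <ValueExpression>
                <XPathQuery Type="UnsignedInteger">EventDisplayNumber</XPathQuery>
              </ValueExpression>
              <Operator>Equal</Operator>
              <ValueExpression>
                <Value Type="UnsignedInteger">1002</Value>
              </ValueExpression>
            </SimpleExpression>
          </Expression>
        </DataSource>
      </DataSources>
      <WriteActions>
        <!-- Sends the collected event data to the OMS workspace -->
        <WriteAction ID="WriteToCloud" TypeID="IPTypes!Microsoft.SystemCenter.CollectCloudGenericEvent" />
      </WriteActions>
    </Rule>
  </Rules>
</Monitoring>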

Since this is the first rule that we are authoring in this chapter, before continuing with the authoring process, let us take some time and explain the XML code. The Rule is defined within the <Rule> XML tag:

<Rule ID="OMSBook.Demo.VMM.Server.Event1002.Collection.Rule" Remotable="false" Enabled="true" Priority="Normal" Target="VMM2012!Microsoft.SystemCenter.VirtualMachineManager.2012.VMMManagementServer" ConfirmDelivery="false" DiscardLevel="100">

As we can see, we have also defined various attributes in the <Rule> tag. In this instance, we are only interested in the following attributes:

  • ID: The internal name of the rule. This must be unique within your management group.
  • Remotable: When set to true, this rule will also run against agentless managed computers. However, in this instance, since VMM can only be monitored by SCOM via the locally installed agent, we have configured the "Remotable" property to false.
  • Target: The target monitoring class for the rule. In this instance, the value is "VMM2012!Microsoft.SystemCenter.VirtualMachineManager.2012.VMMManagementServer". This string contains two parts, separated by the exclamation mark ("!"). The first part, "VMM2012", is the management pack reference (using the alias). It refers to the management pack in which the target class is defined. The second part is the target class ID.

Within the <Rule> tag, we need to define the category of this rule using the <Category> tag. The possible values are documented on Microsoft Developer Network (MSDN) website at https://msdn.microsoft.com/en-us/library/microsoft.enterprisemanagement.configuration.managementpackcategorytype.aspx

A rule in a SCOM MP is comprised of the following parts:

  1. One or more Data Source modules.
  2. Zero or one Condition Detection module.
  3. One or more Write Action modules.

Note: You can learn more about different SCOM management pack module types at the TechNet wiki page: https://social.technet.microsoft.com/wiki/contents/articles/15217.operations-manager-management-pack-authoring-module-types.aspx

In this rule, we only need one Data Source module and one Write Action module.

  • Data Source module – "Microsoft.Windows.EventProvider". We use this module to collect the events from the VMM servers' event log.
  • Write Action module – "Microsoft.SystemCenter.CollectCloudGenericEvent". This module ships in the OMS MP "Microsoft.IntelligencePacks.Types". We use this module to send the event data to OMS.

Some modules require mandatory or optional inputs. When this is the case, the input parameters must be defined in the module XML tag. Some modules do not require any inputs, such as the write action module of this rule "Microsoft.SystemCenter.CollectCloudGenericEvent".

Lastly, we have also defined the display name and the description of this rule in the <LanguagePack> section.

Once you have saved the MP fragment file, you may build the solution. This will generate the MP. You can build your MP project using "Build"-> "Build Solution" option from the top menu, or using the shortcut "Ctrl+Shift+B", as shown in Figure 26.

FIGURE 26. BUILDING MANAGEMENT PACK SOLUTION

The MP build result is displayed in the "Output" section, as shown in Figure 27.

FIGURE 27. MP BUILD RESULT

By default, the management pack output can be found in the "<Visual Studio project folder>\Bin\Debug" folder. The output folder is also displayed in the build output, as highlighted in Figure 27.

Manually Resealing Management Pack

Before importing the MP into your SCOM management group, in this instance, we must manually edit the XML and reseal it again. This is because we have referenced several OMS MPs that we manually sealed using our own key (as explained previously).

To edit and reseal this MP, we will first copy the unsealed version (OMSBook.Demo.xml) from the debug folder to another place. In this demo, we have created a folder "C:\Documents\Manual Seal".

After the unsealed MP has been copied to another folder, open it with Notepad++. You can find a reference to the "Microsoft.IntelligencePacks.Types" MP within the <References> tag towards the top of the XML file (as shown in Figure 28).

FIGURE 28. MP REFERENCE FOR "MICROSOFT.INTELLIGENCEPACKS.TYPES"

The <PublicKeyToken> tag for this reference contains the public key of the key pair you have used to manually seal the OMS MPs. Replace this key with "31bf3856ad364e35".

Note: All MPs released by Microsoft are sealed using a key pair with public key "31bf3856ad364e35". Therefore, all the sealed OMS MPs pushed to your SCOM management group also have the public key token of "31bf3856ad364e35".
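After the edit, the reference should look similar to the following sketch (the version number shown here is illustrative; yours will match the version of the MP exported from your management group):

<Reference Alias="IPTypes">
  <ID>Microsoft.IntelligencePacks.Types</ID>
  <Version>7.0.9014.0</Version>
  <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
</Reference>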

Once you have updated the <PublicKeyToken> tag, save the changes and close Notepad++.

Next, we are going to manually seal this MP using FASTSEAL.EXE. In a command prompt, navigate to the folder where FASTSEAL.EXE is located, and run the following command:

FastSeal.exe <Path to the unsealed MP XML file> /Keyfile <Path to the key file> /Company "<Company Name>" /Copyright "<Copyright message>"

For Example:

FastSeal.exe "C:\Documents\Manual Seal\OMSBook.Demo.xml" /Keyfile C:\Documents\OMSBook.snk /Company "OMS Book" /Copyright "Copyright (c) OMS Book Authors 2017. All rights reserved."

FASTSEAL.EXE will generate the sealed MP (OMSBook.Demo.mp) in the current working directory, as shown in Figure 29.

FIGURE 29. MANUALLY SEALING MANAGEMENT PACK USING FASTSEAL.EXE

A word on FASTSEAL versus MPSEAL:

There are two executables you can use to create sealed MPs. In addition to FASTSEAL.EXE, you can also use MPSEAL.EXE, which is located on the SCOM 2012/2016 installation media, under the SupportTools folder. However, in this instance, we must use FASTSEAL.EXE. The difference between the two is that FASTSEAL.EXE does not validate the MP before sealing, whereas MPSEAL.EXE validates the unsealed MP first and requires you to specify a path containing all sealed referenced MPs so it can validate your MP XML. In this case, we are not able to use MPSEAL.EXE because it would fail at the validation step, since we do not have the officially sealed OMS MPs.

MPSEAL.EXE is documented on TechNet: https://technet.microsoft.com/en-us/library/hh457550.aspx

Importing Management Packs

Once the sealed MP is created, we can import it into the SCOM management group. After it's imported, you should see a 1201 event in the Operations Manager event log on the VMM server indicating the MP has been received by the agent, as shown in Figure 30.

FIGURE 30. NEW MANAGEMENT PACK RECEIVED EVENT (ID 1201)

Accessing Collected Event Data in OMS

The event collection rule should become active once you have seen the 1210 event, shortly after the 1201 event. You will be able to access the collected events using the search query such as "Type=Event EventID=1002", as shown in Figure 31.

FIGURE 31. ACCESSING COLLECTED VMM EVENTS IN OMS

Script Based OMS Event Collection Rule

Now that we have demonstrated how to create more granular event collection rules in OMS (targeting specific class and only collecting a specific event ID), we will demonstrate how to create a script based event collection rule for OMS. We will demonstrate how to create a "heartbeat" event rule that targets the SCOM management server class and sends a heartbeat event to OMS every 3 minutes.

Note: The heartbeat rule example demonstrated in this section is simply an example that shows how to inject event data into OMS using a script. Do not use this rule in your production environments as OMS already offers native agent heartbeat capability via the Agent Health solution.

The workflow of this rule is configured as shown in Figure 32.

FIGURE 32. HEARTBEAT EVENT RULE WORKFLOW

The workflow starts with a scheduler in the Data Source Module, which is configured to run every 180 seconds (3 minutes). The scheduler then triggers a PowerShell script, which runs locally on each management server, and generates a property bag with the following information:

  • Heartbeat Message
  • Logging Computer
  • Log Time
  • Log Time in UTC

The condition detection module takes the property bag that the data source module has generated and maps the value to the event data format.

Additional Reading: If you are not familiar with the 'property bag' in SCOM scripting, see "Script Monitors and Rules" on the Microsoft website at https://technet.microsoft.com/en-us/library/hh457579.aspx.

Lastly, the write action module uses module type "Microsoft.SystemCenter.CollectCloudGenericEvent" from the OMS MP "Microsoft.IntelligencePacks.Types" and uploads the event data to your OMS workspace.

As you can see, we will write a custom data source module for this rule. We have named this data source module "OMSBook.Demo.Send.OMS.Heartbeat.Event.DataSource". This is the first MP element we need to create for this rule. We will first create two folders in the Visual Studio project and call them "Modules" and "Scripts".

After the folders are created, we will add the PowerShell script for the probe action module embedded in the Data Source module type. As we will place this PowerShell script under the "Scripts" folder, we can right click the "Scripts" folder, choose Add -> New Item and select "PowerShell script file". We will name this PowerShell script "Generate-HeartbeatMessageProbe.ps1", as shown in Figure 33.

FIGURE 33. CREATING A POWERSHELL SCRIPT WITHIN VSAE PROJECTS

Place the following code into the "Generate-HeartbeatMessageProbe.ps1" script:

$Now = Get-Date
$UTCNow = $Now.ToUniversalTime()
$strNow = Get-Date $Now -Format F
$strUTCNow = Get-Date $UTCNow -Format F
$ComputerName = $env:COMPUTERNAME
$Domain = (Get-WmiObject -Query "Select Domain from Win32_ComputerSystem").Domain
$FQDN = "$ComputerName`.$Domain"
$oAPI = New-Object -ComObject "MOM.ScriptAPI"
$HeartbeatMessage = "OpsMgr Heartbeat Event. Originating Computer: '$FQDN'. Local Time: '$strNow'. UTC Time: '$strUTCNow'."

#Submit property bag
$oBag = $oAPI.CreatePropertyBag()
$oBag.AddValue("LogTime", $strNow)
$oBag.AddValue("LogUTCTime", $strUTCNow)
$oBag.AddValue("LoggingComputer", $ComputerName)
$oBag.AddValue("HeartbeatMessage", $HeartbeatMessage)
$oBag

Download the Code

You can find the Generate-HeartbeatMessageProbe.ps1 on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\Demo MP\OMSBook.Demo\Scripts directory.

As shown above, the script simply gets the current date and time in both the local time zone and UTC, along with the computer FQDN, then forms a message before placing all the information into a property bag. The property bag contains the following values:

  • LogTime
  • LogUTCTime
  • LoggingComputer
  • HeartbeatMessage

Once the script is created, we can then create the custom data source module "OMSBook.Demo.Send.OMS.Heartbeat.Event.DataSource". To do so, let us create an empty MP fragment under the "Modules" folder and give it a name. In this example, we have named this fragment file "Send.OMS.Heartbeat.Event.DS.mpx". The content of this fragment is listed below:

Download the Code

You can find the Send.OMS.Heartbeat.Event.DS.mpx on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\Demo MP\OMSBook.Demo\Modules directory.
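As a guide, the core of the module type definition looks along these lines (a simplified sketch based on the member modules and parameters discussed next; the GitHub version is authoritative and may differ in detail). The "System" and "Windows" aliases refer to the default System.Library and Microsoft.Windows.Library references created by the project template:

<TypeDefinitions>
  <ModuleTypes>
    <DataSourceModuleType ID="OMSBook.Demo.Send.OMS.Heartbeat.Event.DataSource" Accessibility="Internal" Batching="false">
      <Configuration>
        <xsd:element minOccurs="1" name="IntervalSeconds" type="xsd:integer" />
        <xsd:element minOccurs="0" name="SyncTime" type="xsd:string" />
        <xsd:element minOccurs="1" name="TimeoutSeconds" type="xsd:integer" />
      </Configuration>
      <ModuleImplementation Isolation="Any">
        <Composite>
          <MemberModules>
            <!-- Triggers the workflow on the configured schedule -->
            <DataSource ID="Scheduler" TypeID="System!System.SimpleScheduler">
              <IntervalSeconds>$Config/IntervalSeconds$</IntervalSeconds>
              <SyncTime>$Config/SyncTime$</SyncTime>
            </DataSource>
            <!-- Runs the PowerShell script and returns its property bag -->
            <ProbeAction ID="PSScript" TypeID="Windows!Microsoft.Windows.PowerShellPropertyBagTriggerOnlyProbe">
              <ScriptName>Generate-HeartbeatMessageProbe.ps1</ScriptName>
              <ScriptBody>$IncludeFileContent/Scripts/Generate-HeartbeatMessageProbe.ps1$</ScriptBody>
              <TimeoutSeconds>$Config/TimeoutSeconds$</TimeoutSeconds>
            </ProbeAction>
          </MemberModules>
          <Composition>
            <Node ID="PSScript">
              <Node ID="Scheduler" />
            </Node>
          </Composition>
        </Composite>
      </ModuleImplementation>
      <OutputType>System!System.PropertyBagData</OutputType>
    </DataSourceModuleType>
  </ModuleTypes>
</TypeDefinitions>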

As you can see, this module contains two member modules:

  • Data Source Member Module: System.SimpleScheduler
  • Probe Action Member Module: Microsoft.Windows.PowerShellPropertyBagTriggerOnlyProbe

The custom data source module requires three input parameters:

  • IntervalSeconds – The execution frequency (in seconds) for the System.SimpleScheduler.
  • SyncTime – Optionally, specify if the System.SimpleScheduler should be executed at a specific time.
  • TimeoutSeconds – Specify the script timeout (in seconds) for the PowerShell script used by the "Microsoft.Windows.PowerShellPropertyBagTriggerOnlyProbe" module.

The MP fragment file also contains a <LanguagePacks> section where we have defined the display name for various elements defined in the "OMSBook.Demo.Send.OMS.Heartbeat.Event.DataSource" custom module.

Additionally, when we defined the "Microsoft.Windows.PowerShellPropertyBagTriggerOnlyProbe", we did not have to place the PowerShell script source code within the <ScriptBody> tag. Instead, we can simply reference the location of the script by using the $IncludeFileContent variable (as shown in Figure 34). This is a feature provided by VSAE when we build the project. Visual Studio will read the content of the file specified by the $IncludeFileContent variable and place the source code into the MP.

FIGURE 34. USING $INCLUDEFILECONTENT VARIABLE IN VSAE

In this case, we have specified "$IncludeFileContent/Scripts/Generate-HeartbeatMessageProbe.ps1$", which indicates we are going to include the file content of "Generate-HeartbeatMessageProbe.ps1" from the "Scripts" folder.

Now that the custom data source module is created, we can start creating the heartbeat event rule itself. As in the previous example, we will create an empty MP fragment file under the "Rules" folder. We will name the fragment file "OMS.Heartbeat.Events.Rule.mpx" in this demo. The XML code for the MP fragment is shown below:

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Monitoring>
    <Rules>
      <Rule ID="OMSBook.Demo.SCOM.Management.Server.OMS.Heartbeat.Event.Rule" ConfirmDelivery="false" DiscardLevel="100" Remotable="false" Target="SC!Microsoft.SystemCenter.ManagementServer" Enabled="true" Priority="Normal">
        <Category>Operations</Category>
        <DataSources>
          <DataSource ID="DS" TypeID="OMSBook.Demo.Send.OMS.Heartbeat.Event.DataSource">
            <IntervalSeconds>180</IntervalSeconds>
            <SyncTime />
            <TimeoutSeconds>60</TimeoutSeconds>
          </DataSource>
        </DataSources>
        <ConditionDetection ID="MapToEvent" TypeID="System!System.Event.GenericDataMapper">
          <EventOriginId>$MPElement$</EventOriginId>
          <PublisherId>$MPElement$</PublisherId>
          <PublisherName>SCOMHeartbeat</PublisherName>
          <Channel>Operations Manager</Channel>
          <LoggingComputer>$Data/Property[@Name='LoggingComputer']$</LoggingComputer>
          <EventNumber>123</EventNumber>
          <EventCategory>1</EventCategory>
          <EventLevel>4</EventLevel>
          <UserName />
          <Description>$Data/Property[@Name='HeartbeatMessage']$</Description>
          <Params>
            <Param>$Data/Property[@Name='LogTime']$</Param>
            <Param>$Data/Property[@Name='LogUTCTime']$</Param>
          </Params>
        </ConditionDetection>
        <WriteActions>
          <WriteAction ID="HTTPWA" TypeID="IPTypes!Microsoft.SystemCenter.CollectCloudGenericEvent" />
        </WriteActions>
      </Rule>
    </Rules>
  </Monitoring>
  <LanguagePacks>
    <LanguagePack ID="ENU" IsDefault="true">
      <DisplayStrings>
        <DisplayString ElementID="OMSBook.Demo.SCOM.Management.Server.OMS.Heartbeat.Event.Rule">
          <Name>OMS Book Demo SCOM Management Server OMS Heartbeat Event Rule</Name>
        </DisplayString>
      </DisplayStrings>
    </LanguagePack>
  </LanguagePacks>
</ManagementPackFragment>

Download the Code

You can find the OMS.Heartbeat.Events.Rule.mpx on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\Demo MP\OMSBook.Demo\Rules directory.

We have configured the rule with the following parameters:

  • Data Source Module:
    • IntervalSeconds: 180
    • SyncTime: None
    • TimeoutSeconds: 60
  • Condition Detection Module:
    • PublisherName: SCOMHeartbeat
    • Channel: Operations Manager
    • LoggingComputer: The "LoggingComputer" value from the property bag output
    • EventNumber: 123
    • EventCategory: 1
    • EventLevel: 4 (Information)
    • Description: the "HeartbeatMessage" value from the property bag output.
    • Additional parameters (<Params>): The "LogTime" and "LogUTCTime" value from the property bag output.

Note: For more information about the parameters required by the System.Event.GenericDataMapper condition detection module, please refer to its documentation on MSDN: https://msdn.microsoft.com/en-us/library/ee692955.aspx

Once you have saved the rule MP fragment file and generated the MP by building the solution, you must manually edit and reseal the MP again before importing it into your SCOM management group.

After importing the MP into your SCOM management group, you should see the new rule under the Authoring pane when targeting your SCOM Management Server class, as shown in Figure 35.

FIGURE 35. VIEWING THE OMS HEARTBEAT EVENT RULE IN SCOM CONSOLE

Shortly after the MP has been imported into SCOM, you should begin to see related events coming into OMS. You can access the event data in OMS via search queries such as "Type=Event Source=SCOMHeartbeat", as shown in Figure 36.

FIGURE 36. ACCESSING HEARTBEAT EVENT DATA

We can also create a dashboard tile to display the number of heartbeat events over the last X minutes. For example, use the following query to check the number of heartbeat events for a particular management server over the last 15 minutes (shown in Figure 37):

Type=Event Source=SCOMHeartbeat Computer=<Management Server Name> TimeGenerated>NOW-15MINUTE | measure count() by TimeGenerated

FIGURE 37. HEARTBEAT EVENT DASHBOARD TILE

OMS Event Collection Rules Summary

To summarize, we have demonstrated how to create rules that collect event data for OMS, and we have shown that OMS event collection rules are authored just like normal SCOM event collection rules, only with a different write action module.

Normally, when authoring SCOM event collection rules, the following two write action modules are used:

  • Write to Operational DB: Microsoft.SystemCenter.CollectEvent
  • Write to DW DB: Microsoft.SystemCenter.DataWarehouse.PublishEventData

When we want to send the event data to OMS, we need to use an alternative write action module – "Microsoft.SystemCenter.CollectCloudGenericEvent". If we want to store the event data in both SCOM and OMS, we can either author a single rule that uses all three above-mentioned write action modules (as shown in the sketch below), or write multiple rules that share the same data source module and input parameters. If we write separate rules that collect the same event data for both SCOM and OMS using the same data source module and input configuration, the SCOM agent will utilize the Cookdown feature, so the extra rules will not add additional overhead to the agent computers.
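For example, the <WriteActions> section of a single rule that stores the same event in all three destinations could look like the following sketch (assuming "SC" and "SCDW" as the reference aliases for the Microsoft.SystemCenter.Library and Microsoft.SystemCenter.DataWarehouse.Library MPs in your project):

<WriteActions>
  <!-- Write to the SCOM operational database -->
  <WriteAction ID="WriteToDB" TypeID="SC!Microsoft.SystemCenter.CollectEvent" />
  <!-- Write to the SCOM data warehouse -->
  <WriteAction ID="WriteToDW" TypeID="SCDW!Microsoft.SystemCenter.DataWarehouse.PublishEventData" />
  <!-- Write to the OMS workspace -->
  <WriteAction ID="WriteToCloud" TypeID="IPTypes!Microsoft.SystemCenter.CollectCloudGenericEvent" />
</WriteActions>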

For more information about the SCOM agent Cookdown feature, please refer to the Chapter 23 of the Microsoft Virtual Academy (MVA) MP Authoring course at https://www.microsoftvirtualacademy.com/en-US/training-courses/system-center-2012-r2-operations-manager-management-pack-8829.

Note: You should NOT use the "Microsoft.SystemCenter.CollectCloudGenericEvent" Write Action module to collect high-volume events (e.g. Security Events). This is because this Write Action module injects data into OMS via the SCOM management servers, and it may cause performance issues on the management servers when used to collect a large volume of event data. When OMS collects high-volume data types such as SecurityEvent or WireData, it uses different Write Action modules that bypass the SCOM management servers, so the data is injected into OMS directly via the Microsoft Monitoring Agent.

Creating OMS Performance Collection Rules

In this section, we will discuss how to author OMS Near Real-Time (NRT) performance data collection rules.

Note: The OMS performance collection mechanism has been changed since version 1 of this book was published. The hourly aggregated performance data (Type=PerfHourly) is no longer available. Furthermore, the NRT performance data is no longer being aggregated.

When writing rules to collect OMS NRT performance data, we need to follow these guidelines:

  1. When mapping performance data, the object name must follow the format "\\<Computer FQDN>\<Object Name>".
  2. The rule execution interval must be between 10 seconds and 1800 seconds (half an hour).

In this section, we will demonstrate how to write collection rules that collect the Processor "% Privileged Time" counter as NRT performance data. The workflow of the performance data collection rule is shown in Figure 38.

FIGURE 38. NRT PERFORMANCE COLLECTION RULE WORKFLOW

We will create an empty MP fragment under the "Rules" folder and name it "Processor.PercentPrivilegedTime.NRT.Perf.Rule.mpx". We are going to name the rule "OMSBook.Demo.Windows.Computer.Processor.Percentage.Privileged.Time.Perf.Rule".

The XML code for this rule is listed below:

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Monitoring>
    <Rules>
      <Rule ID="OMSBook.Demo.Windows.Computer.Processor.Percentage.Privileged.Time.Perf.Rule" Enabled="true" Target="Windows!Microsoft.Windows.Computer" ConfirmDelivery="false" DiscardLevel="100" Priority="Normal" Remotable="true">
        <Category>PerformanceCollection</Category>
        <DataSources>
          <DataSource ID="DS" TypeID="IPTypes!Microsoft.IntelligencePacks.Performance.DataProvider">
            <ComputerName>$Target/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$</ComputerName>
            <CounterName>% Privileged Time</CounterName>
            <ObjectName>Processor</ObjectName>
            <InstanceName>_Total</InstanceName>
            <AllInstances>false</AllInstances>
            <IntervalSeconds>10</IntervalSeconds>
          </DataSource>
        </DataSources>
        <WriteActions>
          <WriteAction ID="WriteToCloud" TypeID="IPTypes!Microsoft.SystemCenter.CollectCloudPerformanceData_PerfIP" />
        </WriteActions>
      </Rule>
    </Rules>
  </Monitoring>
  <LanguagePacks>
    <LanguagePack ID="ENU" IsDefault="true">
      <DisplayStrings>
        <DisplayString ElementID="OMSBook.Demo.Windows.Computer.Processor.Percentage.Privileged.Time.Perf.Rule">
          <Name>OMS Book Demo - Processor % Privileged Time NRT Performance Rule</Name>
        </DisplayString>
      </DisplayStrings>
    </LanguagePack>
  </LanguagePacks>
</ManagementPackFragment>

Download the Code

You can find the Processor.PercentPrivilegedTime.NRT.Perf.Rule.mpx on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\Demo MP\OMSBook.Demo\Rules directory.

Before importing the MP into your management group, we will need to manually change the <PublicKeyToken> for the referenced OMS MPs and manually seal it using FASTSEAL.EXE. We must change the <PublicKeyToken> for the "Microsoft.IntelligencePacks.Types" MP.

Shortly after you have imported the MP into your SCOM management group, you should be able to see the NRT performance data in OMS. For example, you can view the data and performance graph for a specific computer using the search query 'Type=Perf ObjectName=Processor CounterName="% Privileged Time" Computer="HV04.corp.tyang.org"', as shown in Figure 39.

FIGURE 39. VIEW NRT PERFORMANCE GRAPH

As mentioned previously, when mapping the performance data, the object name must follow the "\\<Computer FQDN>\<Object name>" format. In the Visual Studio project, where the rules are defined, we can right click the ID of the data source module "Microsoft.IntelligencePacks.Performance.DataProvider", and select "Go To Definition", as shown in Figure 40.

FIGURE 40. VIEW MP ELEMENT DEFINITION

Once you have selected to view the definition of the data source module, you can see it formats the Object Name of the performance data to the "\\<Computer FQDN>\<Object Name>" format in a condition detection member module, as shown in Figure 41.

FIGURE 41. MICROSOFT.INTELLIGENCEPACKS.PERFORMANCE.DATAPROVIDER MODULE

In this section, we have demonstrated how to create performance collection rules for OMS Near Real-Time performance data using the same data source module as the NRT performance collection rules created by OMS. However, as we explained, so long as the object name of the performance data is formatted as \\<Computer FQDN>\<Object Name>, we can either use other data source modules, or we can create our own custom modules for NRT performance collection rules.
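For instance, a script based rule targeting Windows computers could map its property bag output to NRT performance data with the generic performance mapper from the System.Performance.Library MP, as in the following sketch (the object, counter, and property names are hypothetical; the Perf alias is the one from Table 1):

<ConditionDetection ID="MapToPerf" TypeID="Perf!System.Performance.DataGenericMapper">
  <!-- The object name must be in the \\<Computer FQDN>\<Object Name> format -->
  <ObjectName>\\$Target/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$\DemoObject</ObjectName>
  <CounterName>DemoCounter</CounterName>
  <InstanceName>_Total</InstanceName>
  <Value>$Data/Property[@Name='DemoValue']$</Value>
</ConditionDetection>

The mapped data can then be sent to OMS with the same "IPTypes!Microsoft.SystemCenter.CollectCloudPerformanceData_PerfIP" write action used in the rule above.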

Creating Script Based Near Real-Time Performance Collection Rules from the SCOM Console

One interesting use of the SCOM property bag in a script is that it can be used to create synthetic (custom) performance counters, enabling you to collect and visualize any numeric or Boolean (0 or 1) value as a performance counter in SCOM!

For example, the sample PowerShell script below creates a property bag for the file count in a particular folder:

Param ($FolderName)

$oAPI = New-Object -ComObject "MOM.ScriptAPI"
$oBag = $oAPI.CreatePropertyBag()

# Get file count for target folder
$FileCount = (Get-ChildItem -Path $FolderName -Force | Measure-Object).Count

$oBag.AddValue("ComputerName", $env:COMPUTERNAME)
$oBag.AddValue("FolderName", $FolderName)
$oBag.AddValue("FileCount", $FileCount)
$oBag

You could then use the data it produces in dashboards and reports to visualize current and historical values of the information.

One limitation of SCOM is that natively, there is no way to add a PowerShell script like this through the SCOM console UI. However, Microsoft Premier Field Engineer Wei Hao Lim has published an MP that provides a wizard within the SCOM console for creating PowerShell script based NRT performance collection rules. With this MP, you can paste a PowerShell script that produces a property bag into the wizard, enabling you to chart any numeric or Boolean (0 or 1) value as a performance counter in SCOM.

You can learn more about Wei Hao Lim's MP and download it from the Microsoft website at the following URL: http://blogs.msdn.com/b/wei_out_there_with_system_center/archive/2015/09/29/oms-collecting-nrt-performance-data-from-an-opsmgr-powershell-script-collection-rule-created-from-a-wizard.aspx

A note on agent resource consumption and NRT performance rules...

Since NRT performance collection rules can run as often as every 10 seconds, keep in mind that custom data source modules (such as script based ones) can have a much bigger footprint on your systems than the native performance collection data sources. Therefore, if you are using such modules, choose the rule execution frequency carefully.

AzureRM.OperationalInsights PowerShell Module

Microsoft provides a PowerShell module that enables you to access your OMS Log Analytics workspace programmatically. It is available for download from the PowerShell Gallery: https://www.powershellgallery.com/packages/AzureRM.OperationalInsights/. At the time of writing this book, the latest version of the AzureRM.OperationalInsights module is 2.8.0. It ships with the following 40 cmdlets:

  • Disable-AzureRmOperationalInsightsIISLogCollection
  • Disable-AzureRmOperationalInsightsLinuxCustomLogCollection
  • Disable-AzureRmOperationalInsightsLinuxPerformanceCollection
  • Disable-AzureRmOperationalInsightsLinuxSyslogCollection
  • Enable-AzureRmOperationalInsightsIISLogCollection
  • Enable-AzureRmOperationalInsightsLinuxCustomLogCollection
  • Enable-AzureRmOperationalInsightsLinuxPerformanceCollection
  • Enable-AzureRmOperationalInsightsLinuxSyslogCollection
  • Get-AzureRmOperationalInsightsDataSource
  • Get-AzureRmOperationalInsightsIntelligencePacks
  • Get-AzureRmOperationalInsightsLinkTargets
  • Get-AzureRmOperationalInsightsSavedSearch
  • Get-AzureRmOperationalInsightsSavedSearchResults
  • Get-AzureRmOperationalInsightsSchema
  • Get-AzureRmOperationalInsightsSearchResults
  • Get-AzureRmOperationalInsightsStorageInsight
  • Get-AzureRmOperationalInsightsWorkspace
  • Get-AzureRmOperationalInsightsWorkspaceManagementGroups
  • Get-AzureRmOperationalInsightsWorkspaceSharedKeys
  • Get-AzureRmOperationalInsightsWorkspaceUsage
  • New-AzureRmOperationalInsightsAzureActivityLogDataSource
  • New-AzureRmOperationalInsightsAzureAuditDataSource
  • New-AzureRmOperationalInsightsComputerGroup
  • New-AzureRmOperationalInsightsCustomLogDataSource
  • New-AzureRmOperationalInsightsLinuxPerformanceObjectDataSource
  • New-AzureRmOperationalInsightsLinuxSyslogDataSource
  • New-AzureRmOperationalInsightsSavedSearch
  • New-AzureRmOperationalInsightsStorageInsight
  • New-AzureRmOperationalInsightsWindowsEventDataSource
  • New-AzureRmOperationalInsightsWindowsPerformanceCounterDataSource
  • New-AzureRmOperationalInsightsWorkspace
  • Remove-AzureRmOperationalInsightsDataSource
  • Remove-AzureRmOperationalInsightsSavedSearch
  • Remove-AzureRmOperationalInsightsStorageInsight
  • Remove-AzureRmOperationalInsightsWorkspace
  • Set-AzureRmOperationalInsightsDataSource
  • Set-AzureRmOperationalInsightsIntelligencePack
  • Set-AzureRmOperationalInsightsSavedSearch
  • Set-AzureRmOperationalInsightsStorageInsight
  • Set-AzureRmOperationalInsightsWorkspace

Installing AzureRM.OperationalInsights Module

On a Windows computer that is running PowerShell version 5.0 or above, you can install the latest version of the AzureRM.OperationalInsights module by running the following command in a PowerShell session with administrative privileges:

Install-Module AzureRM.OperationalInsights -Repository PSGallery -Force

Note: The AzureRM.OperationalInsights module depends on the AzureRM.Profile module of the same version. By running the above command, the AzureRM.Profile module will also be installed if it is not already present on the computer.

The Install-Module cmdlet is part of the PowerShellGet module, which ships as part of PowerShell starting from version 5.0. To install the module on Windows computers running earlier versions of PowerShell (e.g. version 4.0), you will first need to download and save the module using a Windows computer that is running PowerShell version 5.0 or later.

Note: Windows 10 ships with PowerShell version 5.0 and Windows Server 2016 ships with PowerShell version 5.1. If you are using an earlier version of Windows, you can download the latest version of the Windows Management Framework (WMF) installer from Microsoft's website.

To download and save the modules, you can run the following commands:

Save-Module AzureRM.OperationalInsights -Repository PSGallery -Path c:\Temp

Save-Module AzureRM.Profile -Repository PSGallery -Path c:\Temp

Once the modules are downloaded, you can manually deploy them to computers that are running PowerShell prior to version 5.0.
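For example, after copying the C:\Temp folder to the target computer, the module folders can be placed into the system-wide module path, which is part of PSModulePath from PowerShell 4.0 onwards (a sketch, assuming the modules were saved to C:\Temp as above and the default module paths):

#Deploy the saved modules to the system-wide module path
Copy-Item -Path 'C:\Temp\AzureRM.Profile' -Destination "$env:ProgramFiles\WindowsPowerShell\Modules" -Recurse
Copy-Item -Path 'C:\Temp\AzureRM.OperationalInsights' -Destination "$env:ProgramFiles\WindowsPowerShell\Modules" -Recurse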

Using the AzureRM.OperationalInsights Module

Before running any cmdlets from the AzureRM.OperationalInsights module, you will first need to sign in to your Azure subscription using the Add-AzureRmAccount cmdlet.

Note: The Add-AzureRmAccount cmdlet supports several authentication methods, such as user credentials or an Azure AD application with a service principal. Instead of Add-AzureRmAccount, you may also use its alias Login-AzureRmAccount.
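For example, signing in with an Azure AD application and service principal could look like the following sketch (the tenant ID is a placeholder; when prompted by Get-Credential, the user name is the application ID and the password is the application key):

#Sign in with an Azure AD application / service principal
$TenantId = '<your tenant GUID>'
$Cred = Get-Credential
Add-AzureRmAccount -ServicePrincipal -TenantId $TenantId -Credential $Cred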

As with any other PowerShell cmdlet, you can access the help topic associated with each cmdlet in the AzureRM.OperationalInsights module using the Get-Help cmdlet. As an example, Figure 42 shows how to access the help topic for the Get-AzureRmOperationalInsightsWorkspace cmdlet using the Get-Help cmdlet with the "-Full" switch.

FIGURE 42. ACCESSING GET-AZURERMOPERATIONALINSIGHTSWORKSPACE HELP TOPIC

Furthermore, the AzureRM.OperationalInsights module is documented on Microsoft's documentation site: https://docs.microsoft.com/en-us/powershell/module/azurerm.operationalinsights

It is not possible to cover every single cmdlet from the AzureRM.OperationalInsights module in this book; however, we will provide several examples in the following sections.

As mentioned previously, you must first sign in to your Azure subscription. This can be achieved using the code example shown below:

#Login to Azure
Add-AzureRmAccount

#Select Azure Subscription
Get-AzureRmSubscription | Out-GridView -PassThru | Select-AzureRmSubscription

You can retrieve OMS Log Analytics workspace details such as portal URL, workspace Id, primary and secondary keys, pricing model, etc. using PowerShell as shown below:

#Get Workspace
$Workspace = Get-AzureRmOperationalInsightsWorkspace -Name "OMSBook" -ResourceGroupName "omsrg"

#Workspace ID
$WorkspaceID = $Workspace.CustomerId
Write-Output "Workspace Id: '$WorkspaceID'"

#Workspace location and SKU
$Location = $Workspace.Location
$SKU = $Workspace.Sku
Write-Output "Workspace location: $Location"
Write-Output "Workspace Pricing Model: $SKU"

#Open portal in default browser
$PortalUrl = $Workspace.PortalUrl
Invoke-Expression "$env:SystemRoot\System32\rundll32.exe url.dll,FileProtocolHandler '$PortalUrl'"

#Get workspace primary and secondary keys
$Keys = Get-AzureRmOperationalInsightsWorkspaceSharedKeys -ResourceGroupName "omsrg" -Name "OMSBook"
$PrimaryKey = $Keys.PrimarySharedKey
$SecondaryKey = $Keys.SecondarySharedKey
Write-Output "Primary Key: '$PrimaryKey'"
Write-Output "Secondary Key: '$SecondaryKey'"

You can get, enable, and disable OMS solutions using PowerShell as shown below:

#Get all solutions that are available for your workspace
Get-AzureRmOperationalInsightsIntelligencePacks -ResourceGroupName "omsrg" -WorkspaceName "OMSBook"

#Get all solutions that are not yet enabled in your workspace
Get-AzureRmOperationalInsightsIntelligencePacks -ResourceGroupName "omsrg" -WorkspaceName "OMSBook" | Where-Object {$_.Enabled -eq $false}

#Enable a solution - WireData solution
Set-AzureRmOperationalInsightsIntelligencePack -ResourceGroupName "omsrg" -WorkspaceName "OMSBook" -IntelligencePackName 'WireData' -Enabled $true

#Disable a solution - SQL Assessment
Set-AzureRmOperationalInsightsIntelligencePack -ResourceGroupName "omsrg" -WorkspaceName "OMSBook" -IntelligencePackName 'SQLAssessment' -Enabled $false

You can invoke OMS log search, as well as create, read, invoke, and remove saved searches, using PowerShell. The script listed below demonstrates how to invoke a log search by specifying a search query, and how to invoke an existing saved search:

#Invoking a search query
$Query = "Type=Event EventLog=System EventID=7036"
$Response = Get-AzureRmOperationalInsightsSearchResults -ResourceGroupName "omsrg" -WorkspaceName "OMSBook" -Query $Query
$arrResults = New-Object System.Collections.ArrayList
Foreach ($item in $Response.Value)
{
    [void]$arrResults.Add($(ConvertFrom-Json $item))
}
$arrResults

#Invoking a saved search
$SavedSearchDisplayName = 'Planning: Event Counts'
$SavedSearchRequest = Get-AzureRmOperationalInsightsSavedSearch -ResourceGroupName 'omsrg' -WorkspaceName 'OMSBook'
$SavedSearch = $SavedSearchRequest.Value | Where-Object {$_.Properties.DisplayName -ieq $SavedSearchDisplayName}
$SavedSearchId = $SavedSearch.Id.Split("/")[9]
$SearchResult = Get-AzureRmOperationalInsightsSavedSearchResults -ResourceGroupName 'omsrg' -WorkspaceName 'OMSBook' -SavedSearchId $SavedSearchId
$arrResults = New-Object System.Collections.ArrayList
Foreach ($item in $SearchResult.Value)
{
    [void]$arrResults.Add($(ConvertFrom-Json $item))
}
$arrResults

Download the Code

All the PowerShell script samples listed above can be found in the OMSPSModuleExamples.ps1 script file on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\ PowerShell Code Examples directory.

Managing OMS Log Analytics Data Programmatically

Microsoft provides several REST APIs (Application Programming Interfaces) to programmatically manage the OMS Log Analytics data. We will discuss the following APIs in this section:

  • Log Search REST API
  • Log Analytics HTTP Data Collector API
  • Alert API
  • Service Map API

Since we are going to work with REST APIs, we will use a tool called Postman in this section. Postman is widely used for working with REST APIs, and you can download a free version from https://www.getpostman.com.

Note: We will use the Windows desktop version of Postman in this chapter. The UI may be slightly different if you choose to use another version (e.g. the Chrome browser extension).

Log Search REST API

The OMS Log Analytics Log Search API is a RESTful API that is accessed via the Azure Resource Manager API. It allows you to programmatically invoke log searches from the applications or automation solutions you are developing. The Log Search API allows you to perform the following actions:

  • Perform log search (by specifying the search query)
  • Invoke an existing saved search
  • Get a list of saved searches
  • Create saved searches
  • Update saved searches
  • Delete saved searches
  • Retrieve, create, update, and delete computer groups

The Log Search API is documented at the Microsoft documentation site here - https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-log-search-api.

You will need to authenticate to Azure Resource Manager and retrieve a JSON Web Token (JWT) from Azure AD before making API calls.

Note: The Log Search API documentation from the Microsoft documentation site only demonstrates how to use the ARM client to make API calls. It does not show you how to generate the JWT tokens. One of the authors of this book, Stanislav Zhelyazkov has developed a PowerShell module called OMSSearch. This module contains a function called Get-AADToken. You may learn how to create the token from the source code of this function: https://github.com/slavizh/OMSSearch/blob/master/OMSSearch.psm1.

Although the official Microsoft documentation for the Log Search API only explains how to invoke the API using ARMClient, we can also invoke it from our favorite tools such as Postman and PowerShell.

Note: ARMClient is an open-source command-line tool for invoking the Azure Resource Manager (ARM) API. You can find more details at its GitHub repository: https://github.com/projectkudu/ARMClient

If you are developing a solution using PowerShell, the OMSSearch PowerShell module mentioned above is a good reference to get you started. For Postman, we need to construct the following:

  • The API URL. Based on the examples from Microsoft's documentation site, we need to prepend "https://management.azure.com/" to the API path that ARMClient invokes. For example, the URL for performing a search would be:

https://management.azure.com/subscriptions/{Subscription_Id}/resourcegroups/{Resource_Group_Name}/providers/Microsoft.OperationalInsights/workspaces/{OMS_Workspace_Name}/search?api-version=2015-03-20

  • Authorization header. The authorization header must contain a valid JWT token retrieved from Azure AD Graph API. One of the authors of this book, Tao Yang has posted an instruction on how to configure Postman to generate this token from Azure AD. You can read more from Tao's blog post: http://blog.tyang.org/2017/04/26/using-postman-invoking-azure-resource-management-apis/.
  • Other headers. Some API calls require additional request headers; for example, "Content-Type" must be set to "application/json".
  • Request body (for POST operations). For POST operations, appropriate request body must be constructed and passed into the API call.

For example, to invoke a search query "Type=Heartbeat" using the search API in Postman, we will add the following information in Postman:

  1. Insert the API URL.
  2. Select "POST" operation.
  3. Generate the Authorization header.
  4. Add Content-Type header with the value "application/json".
  5. Construct the request body similar to our example shown in Figure 43.

FIGURE 43. SAMPLE API POST REQUEST BODY IN JSON

Once we call the API in Postman, the search result and the metadata are returned in the HTTP response body (as shown in Figure 44).

FIGURE 44. ACCESSING LOG SEARCH RESULT IN POSTMAN
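If you prefer scripting over Postman, the same call can be made from PowerShell with Invoke-RestMethod. The following is a minimal sketch, assuming you have already acquired a JWT from Azure AD (for example, using the Get-AADToken function mentioned earlier) and stored it in $Token; the subscription, resource group, and workspace names are placeholders:

#Construct the Log Search API URL (replace placeholders with your values)
$SubscriptionId = '{Subscription_Id}'
$ResourceGroup = 'omsrg'
$WorkspaceName = 'OMSBook'
$Uri = "https://management.azure.com/subscriptions/$SubscriptionId/resourcegroups/$ResourceGroup/providers/Microsoft.OperationalInsights/workspaces/$WorkspaceName/search?api-version=2015-03-20"

#Authorization header containing the JWT retrieved from Azure AD
$Headers = @{Authorization = "Bearer $Token"}

#Request body containing the search query
$Body = @{query = 'Type=Heartbeat'} | ConvertTo-Json

#Invoke the search; the results and metadata come back in the response body
$Response = Invoke-RestMethod -Uri $Uri -Method Post -Headers $Headers -ContentType 'application/json' -Body $Body
$Response.value
#The metadata includes the 'total' field discussed below
$Response.__metadata.total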

The Log Search API has several limitations:

  • When an aggregation command such as |measure count() or distinct is used, the maximum number of records returned from each call is 500,000.
  • Searches that do not include an aggregation command return up to 5,000 records.

Note: The AzureRM.OperationalInsights PowerShell module ships two cmdlets, Get-AzureRmOperationalInsightsSearchResults and Get-AzureRmOperationalInsightsSavedSearchResults, which invoke a search and a saved search respectively. Because these two cmdlets use the Log Search API under the hood, the same limitations also apply to them.

When you invoke a log search via the API, the response returns the search result in JSON format. The metadata in the HTTP response contains a field called "total", which indicates the size of the result set. If this value is less than or equal to the API limit mentioned above, you know the call has returned the entire result.

To work around this limitation, you may perform multiple searches leveraging the Skip command. For example, if you want to retrieve all events using the query "Type=Event", you could perform multiple queries until you have retrieved all the log entries, as shown below:

  • 1st Query:     "Type=Event | Top 5000"
  • 2nd Query:     "Type=Event | Skip 5000 | Top 5000"
  • 3rd Query:     "Type=Event | Skip 10000 | Top 5000"
  • … repeat until the search API call returns no results

Note: One of the authors of this book, Tao Yang has published a blog with a sample script demonstrating how the Skip command can be used in a PowerShell script that uses the AzureRm.OperationalInsights module. You can read the post from Tao's blog: http://blog.tyang.org/2017/04/25/programmatically-performing-oms-log-search-against-a-large-result-set/
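The following is a minimal sketch of such a pagination loop using the AzureRM.OperationalInsights module; the resource group and workspace names are placeholders:

#Page through "Type=Event" results, 5,000 records at a time
$PageSize = 5000
$Skip = 0
$AllResults = New-Object System.Collections.ArrayList
Do
{
    #The first query uses Top only; subsequent queries add Skip
    If ($Skip -eq 0) {$Query = "Type=Event | Top $PageSize"}
    Else {$Query = "Type=Event | Skip $Skip | Top $PageSize"}
    $Response = Get-AzureRmOperationalInsightsSearchResults -ResourceGroupName 'omsrg' -WorkspaceName 'OMSBook' -Query $Query
    Foreach ($item in $Response.Value)
    {
        [void]$AllResults.Add($(ConvertFrom-Json $item))
    }
    $Skip += $PageSize
} While ($Response.Value.Count -gt 0)

#Total number of records retrieved
$AllResults.Count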

HTTP Data Collector API

The OMS HTTP Data Collector API is a RESTful API that allows you to send your own data to your OMS Log Analytics workspace. This API plays an important role when you are developing your custom OMS solutions.

Unlike the Log Search API, the Data Collector API does not require Azure AD authentication. Instead, each request must include an authorization header generated using the Log Analytics workspace ID and either the primary or the secondary key of the workspace (a sketch of this signature construction follows the limitations list below).

The HTTP Data Collector API is well documented on the Microsoft documentation site: https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api

You define a custom record type when you submit data through the Log Analytics HTTP Data Collector API. Currently, you cannot write data to existing record types that were created by other data types and solutions. Log Analytics reads the incoming data and then creates properties that match the data types of the values you enter.

Each request to the Log Analytics API must include a Log-Type header with the name for the record type. The suffix _CL is automatically appended to the name you enter to distinguish it from other log types as a custom log. For example, if you enter the name InsideOMSBookDemo, Log Analytics creates a record with the type InsideOMSBookDemo_CL (as shown in Figure 45). This helps ensure that there are no conflicts between user-created type names and those shipped in current or future Microsoft solutions.

FIGURE 45. EXAMINING LOG TYPE NAME FOR DATA INJECTED BY HTTP DATA COLLECTOR API

In addition to appending the log type name with a _CL suffix, the property names within each log record are also appended with a suffix depending on the data type as shown in Table 2.

Property data type    Suffix
String                _s
Boolean               _b
Double                _d
Date/time             _t
GUID                  _g

TABLE 2: PROPERTY TYPES AND APPENDED SUFFIX

To keep consistency between built-in and custom log types, a few property names from custom logs are not appended with a suffix. If you include the following properties within a log record, they will not be appended, as shown in Figure 46:

  • Computer
  • Message

FIGURE 46. CUSTOM LOG PROPERTY NAME SUFFIX

When injecting data via the Data Collector API, the records need to be included in the HTTP POST request body in JSON format. If the record type does not exist in your OMS workspace, it will be created.

Note: When logs are inserted into your OMS workspace, it can take a few minutes before the data becomes searchable due to the data indexing process. Based on the authors' experience, when a new data type is introduced by the Data Collector API, it may take a few extra minutes before it becomes available in your OMS workspace.

The API supports batch insert, which means you may inject more than one record within a single request. However, all records in the bulk insert request must be of the same record type.

The Data Collector API has the following limitations:

  • The maximum size per POST request is 30 MB. When inserting records in batches, the payload of the HTTP POST body must not exceed 30 MB.
  • The value of each property (field) cannot exceed 32 KB. If a value exceeds 32 KB, the data will be truncated.
  • From a usability and search experience perspective, the recommended maximum number of fields per record type is 50.
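To make the authorization header construction concrete, here is a condensed sketch of a direct call to the Data Collector API, closely following the pattern in Microsoft's documentation; the workspace ID, key, and record type name are placeholders, and the record itself is purely illustrative:

#Workspace details and the custom record type name (placeholders)
$CustomerId = '{Workspace_Id}'
$SharedKey = '{Primary_or_Secondary_Key}'
$LogType = 'InsideOMSBookDemo'  #stored as InsideOMSBookDemo_CL

#A single illustrative record, serialized to JSON and encoded as bytes
$Record = @{Computer = $env:COMPUTERNAME; Message = 'Hello from the Data Collector API'}
$Body = [Text.Encoding]::UTF8.GetBytes((ConvertTo-Json @($Record)))

#Build the HMAC-SHA256 signature over the canonical string
$Rfc1123Date = [DateTime]::UtcNow.ToString('r')
$StringToHash = "POST`n$($Body.Length)`napplication/json`nx-ms-date:$Rfc1123Date`n/api/logs"
$Hmac = New-Object System.Security.Cryptography.HMACSHA256
$Hmac.Key = [Convert]::FromBase64String($SharedKey)
$Signature = [Convert]::ToBase64String($Hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($StringToHash)))

#Invoke the API; the Log-Type header defines the custom record type
$Headers = @{
    'Authorization' = "SharedKey ${CustomerId}:$Signature"
    'Log-Type' = $LogType
    'x-ms-date' = $Rfc1123Date
}
$Uri = "https://$CustomerId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
Invoke-RestMethod -Uri $Uri -Method Post -ContentType 'application/json' -Headers $Headers -Body $Body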

One of the authors of this book, Tao Yang, has written a PowerShell module that leverages the Data Collector API to inject data into an OMS Log Analytics workspace. The module is called OMSDataInjection, and you can find it on both the PowerShell Gallery and GitHub.

The OMSDataInjection module is shipped with the following features:

  • Supports both inserting individual records and bulk inserts.
  • Supports both primary and secondary keys. If the insert request using the primary key fails, it will retry using the secondary key.
  • The records to be injected can be passed in either JSON format or as a psobject (or an array of psobjects for bulk inserts).
  • Can be used as an Azure Automation integration module by using the connection object defined in the module.

The sample script listed below demonstrates how to inject data into OMS Log Analytics using the OMSDataInjection module:

Download the Code

All the PowerShell script samples listed above can be found in the OMSDataInjectionModuleExamples.ps1 script file on GitHub at

https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\ PowerShell Code Examples directory.

Alert API

The Log Analytics Alert REST API allows you to manage the alert rules within your workspace. Like the Log Search API, the Alert API is a RESTful API implemented via the Azure Resource Manager (ARM) API.

The Alert API supports the following operations:

  • Listing, creating, editing, and deleting saved search schedules.
  • Retrieving, creating, editing, and deleting alert actions.

You can find the documentation for the Alert API at Microsoft's documentation site: https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-api-alerts.

Note: Just like the Log Search API, the Alert API is implemented via the ARM API, so the PowerShell and Postman methods we explained in the previous Log Search API section also work here for the Alert API.

Often, you will use the Alert API in conjunction with the Log Search API, because alerts are created for saved searches and the saved searches are managed by the Log Search API. For example, to retrieve the schedule for a particular alert, as shown in Figure 47, you will need to take the following steps:

FIGURE 47. SAMPLE ALERT TO BE RETRIEVED USING LOG SEARCH AND ALERT APIS

  1. Get a list of the user-defined saved searches using the Log Search API.
  2. Locate the particular saved search in the response body. In this example, the display name is "Computer missing security updates" (as shown in Figure 48); the name field is the Search Id that will be used in the Alert API.

FIGURE 48. GETTING THE SAVED SEARCH ID FROM THE NAME IN SAVED SEARCH LIST OPERATION

  3. Retrieve the schedule from the Alert API using the following URL, where the Saved Search Id is the value retrieved in the previous step: https://management.azure.com/subscriptions/{Subscription_Id}/resourceGroups/{Resource_Group_Name}/providers/Microsoft.OperationalInsights/workspaces/{OMS_Workspace_Name}/savedSearches/{Saved_Search_Id}/schedules?api-version=2015-03-20
  4. Add the authorization header as explained in the Log Search API section previously.
  5. Invoke the API call using the GET method. The schedule is returned in the response body in JSON format, as shown in Figure 49.

FIGURE 49. RETRIEVING ALERT SCHEDULES VIA ALERT API
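The same retrieval can also be scripted. The following is a minimal PowerShell sketch, assuming $Token holds a valid JWT (acquired as described in the Log Search API section) and the placeholders are replaced with your own values:

#Retrieve the schedules for a saved search via the Alert API
$SubscriptionId = '{Subscription_Id}'
$ResourceGroup = 'omsrg'
$WorkspaceName = 'OMSBook'
$SavedSearchId = '{Saved_Search_Id}'  #the "name" field from the saved search list

$Uri = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$ResourceGroup/providers/Microsoft.OperationalInsights/workspaces/$WorkspaceName/savedSearches/$SavedSearchId/schedules?api-version=2015-03-20"
$Schedules = Invoke-RestMethod -Uri $Uri -Method Get -Headers @{Authorization = "Bearer $Token"}
#ARM list responses are wrapped in a 'value' array
$Schedules.value | Select-Object id, properties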

Service Map API

The Service Map solution was previously discussed in the Service Map and Wired Data 2.0 chapter. Microsoft provides a RESTful API that allows you to query Service Map dependency data. This API is fully documented on the Microsoft documentation site: https://docs.microsoft.com/en-us/rest/api/servicemap/. Same as the OMS Log Search API and the Alert API, the Service Map API is also implemented via the Azure Resource Manager API. This API is divided into the REST operation groups listed in Table 3:

Resource Groups

Description

Client Groups

Provides operations for retrieving information about client groups.

Machine Groups

Provides operations for managing machine groups.

Machines

Provides operations for retrieving information about machines.

Maps

Provides operations for retrieving maps.

Ports

Provides operations for retrieving information about ports.

Processes

Provides operations for retrieving information about processes.

Summaries

Provides operations for retrieving machine summary information

TABLE 3: REST RESOURCE GROUPS

You can use the Service Map REST API to perform the following tasks for each resource group:

  • Client Groups
    • Get a specified client group
    • Get the approximate members count for a client group
    • List members of a client group
  • Machine Groups
    • Create machine group
    • Delete machine group
    • Get machine group
    • Update machine group
    • List all machine groups in a workspace
  • Machines
    • Get a specified machine
    • Obtain the liveness status of a machine during a specified time interval
    • List machines in a workspace that match specified conditions
    • List connections of a machine
    • List machine group membership
    • List live ports for a machine during a specified time interval
    • List processes of a machine that match specified conditions
  • Maps
    • Generate map
  • Ports
    • Return a specified live port during a specific time interval
    • Obtain the liveness status of a port during a specified time interval
    • Return processes accepting on the specified port
    • Return connections established via the specified port
  • Processes
    • Get the specified process
    • Get the liveness status of the process during a specified time interval
    • Get a collection of ports on which the process is accepting
    • Get a collection of connections terminating or originating at the specified process
  • Summaries
    • Get summary info about machines in the workspace

Note: Many of the above listed API calls require you to specify a time window using the startTime and endTime parameters. You must use UTC time, and both parameters must be formatted using the sortable date/time pattern (yyyy-MM-ddTHH:mm:ss). At the time of writing this book, the maximum time window (the difference between start and end time) is one (1) hour.

As discussed in the Service Map and Wire Data 2.0 chapter, the Service Map solution generates a comprehensive map to visualize the dependencies between different nodes. However, when running an infrastructure refresh, upgrade, or migration project, you will often need to access this data in a structured way in order to programmatically identify and analyze dependencies.

To better explain how to use the Service Map APIs, we will demonstrate how to retrieve the dependencies of a particular server (the focus node), both via Postman and via a PowerShell script. The high-level steps are:

  1. Retrieve the machine ID for the focus node using the Get Machine API call.
  2. Retrieve the dependency map using the Generate Map API call.

To retrieve the dependency map data via Postman, you will need to follow the steps listed below:

  • Invoke a request to

    https://management.azure.com/subscriptions/{Subscription_Id}/resourceGroups/{Resource_Group_Name}/providers/Microsoft.OperationalInsights/workspaces/{OMS_Workspace_Name}/features/serviceMap/machines?api-version=2015-11-01-preview with the following configuration:

    • Authorization header: Generate an Azure AD OAuth token as previously explained in the Log Search API section of this chapter.
    • Add a header called Content-Type with the value application/json
    • Method: GET
  • Find the focus node in the response body and take note of the machine id (as shown in Figure 50). The machine id will be used in the next API call.

FIGURE 50. RETRIEVING THE SERVICE MAP MACHINE ID VIA THE SERVICE MAP REST API CALL

  • Invoke the second request to

https://management.azure.com/subscriptions/{Subscription_Id}/resourceGroups/{Resource_Group_Name}/providers/Microsoft.OperationalInsights/workspaces/{OMS_Workspace_Name}/features/serviceMap/generateMap?api-version=2015-11-01-preview with the following configuration:

  • Authorization header: Generate an Azure AD OAuth token as previously explained in the Log Search API section of this chapter.
  • Add a header called Content-Type with the value application/json
  • Request Body: { "kind":"map:single-machine-dependency", "machineId":"{Machine_Id}" } (Copy and paste the machine Id that was retrieved from the previous API call).
  • Method: POST
  • The dependency map data is returned in the response body in JSON format, as shown in Figure 51.

FIGURE 51. DEPENDENCY MAP DATA RETURNED BY SERVICE MAP API

Now that we have learned how to retrieve the dependency map data for a particular computer, you should have a clear idea of how to work with the Service Map APIs. It is also straightforward to construct the same web requests in PowerShell to retrieve the same data. The script listed below demonstrates the process; to use it in your environment, you will first need to modify the variable values in the User-Defined Variables region located at the very top of the script.
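As a condensed, illustrative sketch (not the full script), the two API calls might look like this, assuming $Token already holds a valid ARM JWT, 'SERVER01' is the display name of the focus node, and the property paths and machineId format follow the machines response shown in Figure 50:

#Base URI for the Service Map feature of the workspace (placeholders)
$BaseUri = "https://management.azure.com/subscriptions/{Subscription_Id}/resourceGroups/omsrg/providers/Microsoft.OperationalInsights/workspaces/OMSBook/features/serviceMap"
$Headers = @{Authorization = "Bearer $Token"}

#Step 1: list machines and locate the focus node by display name
$Machines = Invoke-RestMethod -Uri "$BaseUri/machines?api-version=2015-11-01-preview" -Method Get -Headers $Headers
$FocusNode = $Machines.value | Where-Object {$_.properties.displayName -ieq 'SERVER01'}

#Step 2: generate the single-machine dependency map using the machine id
$MapBody = @{kind = 'map:single-machine-dependency'; machineId = $FocusNode.id} | ConvertTo-Json
$Map = Invoke-RestMethod -Uri "$BaseUri/generateMap?api-version=2015-11-01-preview" -Method Post -Headers $Headers -ContentType 'application/json' -Body $MapBody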

Download the Code

All the PowerShell script samples listed above can be found in the GetServerDependencyViaServiceMap.ps1 script file on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\ PowerShell Code Examples directory.

OMS View Designer

Up until now, we have only discussed how to manage data insertion and retrieval with OMS Log Analytics. Once your data is being collected by OMS, the next step is to make sure your users can consume it in the way you intend. Your OMS Log Analytics workspace provides a native capability that allows you to design custom views to visualize your data. The OMS View Designer is located on the home page of the OMS portal, generally as the last tile on the page (as shown in Figure 52).

FIGURE 52. OMS VIEW DESIGNER

The View Designer is documented on the Microsoft documentation site at: https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-view-designer.

When you click the View Designer tile and enter the designer, you will be presented with an empty design view, as shown in Figure 53.

FIGURE 53. OMS VIEW DESIGNER NEW DESIGN VIEW

In Figure 53, the left pane is the control pane, which has two tabs (Tile and View). First, you will need to select a tile from the Tile tab and configure the appropriate queries and other properties. The tile that you create will appear on the home page of the OMS portal and becomes the entry point for your custom solution, as shown in Figures 54 and 55.

FIGURE 54. DESIGNING SOLUTION TILE IN THE VIEW DESIGNER

FIGURE 55. CUSTOM SOLUTION TILE ON THE PORTAL HOME PAGE

The second tab in the control pane is the View tab. In this tab, you can add one or more views to your custom solution. These views are presented to users once they enter your solution by clicking on the tile you created earlier.

Depending on the view, you will need to specify search queries, thresholds, color settings, etc. for each view (as shown in Figure 56).

FIGURE 56. CONFIGURING VIEWS IN VIEW DESIGNER

Note: As you can see from Figure 56, you can add an icon for a view. When adding an icon, the image resolution must be 32x32 pixels.

When you have finished designing your custom views, you can edit them at any time by entering the solution view and clicking the "Edit" button (shown in Figure 57).

FIGURE 57. EDIT CUSTOM VIEW

When in edit mode, you also have the option to export the view. The exported file is in fact an Azure Resource Manager (ARM) template in JSON format. You can either manually import the exported file into another workspace from the View Designer, or deploy it like a normal ARM template.
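For example, an exported .omsview file can be deployed with the Azure PowerShell cmdlets like any other ARM template. The sketch below assumes the exported file has been saved locally; any parameters defined in the template (such as the target workspace name) will be prompted for if they are not supplied:

#Deploy an exported view (an ARM template) to another workspace
New-AzureRmResourceGroupDeployment -ResourceGroupName 'omsrg' -TemplateFile 'C:\Temp\OMS Statistics.omsview'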

Hint: When designing your custom views, it is a good idea to always include a "List of queries" view and add some useful queries for your solution.

Download the Sample

The export of the sample solution OMS Statistics demonstrated above can be found on GitHub at https://github.com/insidemscloud/OMSBookV2 in the \Chapter 17\ Sample Solution directory. The file name is OMS Statistics.omsview.

Azure Resource Providers and ARM Templates

The OMS management solution you are creating may consist of many Azure components. For example, a solution might use Azure Automation runbooks to collect and inject data into OMS, and a solution tile in the OMS portal with multiple views for users to consume the data. It may also contain alerts with actions such as email notifications and remediation runbooks. You may package the Azure resources that are part of your solution into ARM templates so it can be easily deployed to any Azure subscription. The process of designing and building OMS management solutions is documented on Microsoft's documentation site: https://docs.microsoft.com/en-us/azure/operations-management-suite/operations-management-suite-solutions-creating

Azure Resource Providers

An Azure Resource Provider is a service that supplies the resources you can deploy and manage through Azure Resource Manager. You can learn more about Azure Resource Manager and resource providers on Microsoft's documentation site: https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-overview

When designing and building your custom OMS solutions, the two resource providers you will work with most of the time are:

  • Azure Automation: Microsoft.Automation
  • OMS Log Analytics: Microsoft.OperationalInsights

You will need to make sure the required resource providers are registered in your Azure subscription. There are several ways to check and register resource providers. For example, in the Azure portal (https://portal.azure.com), navigate to the subscription, then go to Resource Providers, as shown in Figure 58.

FIGURE 58. REGISTERING RESOURCE PROVIDER VIA AZURE PORTAL

Other than using the Azure portal, you can register a resource provider with the PowerShell cmdlet Register-AzureRmResourceProvider, or via the ARM API using POST /subscriptions/{subscriptionId}/providers/{resourceProviderNamespace}/register?api-version=2016-09-01 (documented at https://docs.microsoft.com/en-us/rest/api/resources/providers).
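As a minimal sketch, checking and registering the two providers mentioned above might look like this:

#Check the registration state of the two resource providers
'Microsoft.Automation', 'Microsoft.OperationalInsights' | Foreach-Object {
    Get-AzureRmResourceProvider -ProviderNamespace $_ | Select-Object -First 1 ProviderNamespace, RegistrationState
}

#Register a resource provider if it is not yet registered
Register-AzureRmResourceProvider -ProviderNamespace 'Microsoft.OperationalInsights'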

Azure Resource Manager Templates

The following Azure Automation and OMS Log Analytics resources can be provisioned as part of your ARM templates (a minimal example follows the list):

  • Azure Automation
    • Runbooks
    • Webhooks
    • Jobs
    • Certificates (Automation Asset)
    • Credentials (Automation Asset)
    • Schedules
    • Automation Variables
    • Integration Modules
  • OMS Log Analytics
    • Saved Searches
    • Alerts
      • Schedule
      • Alert actions
    • Views

Note: You can find more information about authoring ARM templates for the above mentioned resources on the Microsoft documentation site.

Summary

In this chapter, we first demonstrated how to author OMS collection rules in SCOM for the following types of data:

  • Event Data
  • Hourly Aggregated Performance Data
  • Near Real-Time Performance Data

In addition to the demo rules we created in this chapter, you can find more examples in articles published by Microsoft and the community.

We then discussed the AzureRM.OperationalInsights PowerShell module and the various APIs that can be used when developing your OMS solutions.

Lastly, we demonstrated how to use the OMS View Designer and how to package your solution using ARM templates. Several excellent free OMS solutions have been made available to the community; you may find these solutions to be good references when designing your own.

We hope this chapter has given you the knowledge you need to embark on your OMS authoring journey with confidence. And we hope you find this book to be a valuable companion on your OMS journey.

Good luck!