This post is the next in my series on using IT Analytics to analyze the performance of our Private Cloud at Bay Dynamics. So far in this series we've explored IT Analytics cubes in many ways, dragging and dropping our way to insightful Pivot Tables and Charts that helped us understand how our VMware infrastructure is performing. We also created Key Performance Indicators and SQL Server Reporting Services reports to help communicate key metrics. Now we're going to tie everything together and create a SharePoint Dashboard that lets others take advantage of our IT Analytics cubes directly from within our corporate SharePoint portal.
The prerequisite for this scenario is an installation of SharePoint 2010 Enterprise. This gives us the ability to create a new SharePoint Site using the Business Intelligence Center template, and from that site we can take advantage of the recent inclusion of PerformancePoint Services and the Dashboard Designer. The screen below shows the home of a Business Intelligence Center.
Clicking the link labeled "Start using PerformancePoint Services", we arrive at the following page, from which we can launch the PerformancePoint Dashboard Designer.
After launching the Dashboard Designer, we're ready to start creating dashboard content. Our first action is to create a Data Connection to one of our IT Analytics Cubes so we can use it as the source of information for the dashboard. For a high-level overview of creating a dashboard with IT Analytics, check out this post, which will guide you to a recent Webinar that provides step-by-step guidance for creating a PerformancePoint Dashboard with IT Analytics cubes.
Once our Data Connection is configured, we're ready to choose from the following types of dashboard content.
Our first dashboard item will be an Analytic Chart. Using the same drag and drop design technique we've highlighted throughout this series, I configured the following chart to show alerts by severity, filtered to only show VMware alerts reported from the nworks MP. Notice the range of chart types and other options we have available to us. For this chart, we'll use a Stacked Bar Chart.
Next we'll create an Analytic Grid using a similar approach. In this grid I want to see the number of VMware alerts by name, and I want to sort that list descending so the most common alert type shows at the top.
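Under the covers, that grid view is just an aggregation over the cube's alert facts. As a rough illustration of the idea (not the product's actual implementation, and using invented alert names), the same "count by alert name, sorted descending" view could be computed like this:

```python
from collections import Counter

# Hypothetical alert names as they might come back from the SCOM Alerts cube
alerts = [
    "VM Balloon Memory Usage", "VM Balloon Memory Usage",
    "VM CPU Ready Time High", "Host Datastore Free Space Low",
    "VM Balloon Memory Usage", "VM CPU Ready Time High",
]

# Count alerts by name and sort descending, like the Analytic Grid
counts = Counter(alerts)
for name, count in counts.most_common():
    print(f"{name}: {count}")
```

With the sample data above, the most common alert type lands at the top of the list, just as in the grid.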
Now that we have a couple of items to use in our dashboard, let's go ahead and create one. After clicking the new dashboard button, I chose a 2 column template as shown below.
Now I can arrange my two dashboard items by dragging them into the zones on the dashboard template.
Before I finish, though, I'd like to add another page to this dashboard that hosts the custom SSRS report I created in the previous post in this series. I choose Reporting Services from the content creation ribbon, enter my server name, and choose my custom report from the list.
With that added to my PerformancePoint Content listing, I add a new page to my VMware Alert Dashboard with one zone and drag the custom report into place.
That's everything I want on my dashboard for now. After deployment, I can now navigate to the dashboard on my SharePoint Site and see it live!
But that's not the end of it. Unlike many other dashboards, PerformancePoint dashboards are highly interactive when you use IT Analytics Cubes as a source for the charts and reports. At first glance that VM Balloon Memory Usage alert pops out at me again, and I want to drill into that further. I can right-click on that Alert Count cell and choose Decomposition Tree to do further analysis.
After opening the Decomposition Tree I can explore my IT Analytics SCOM Alerts Cube directly from within my SharePoint site and drill down, level by level, "decomposing" the data by adding criteria one at a time. The Decomposition Tree is a great example of the powerful capabilities built into SharePoint specifically to leverage OLAP cubes.
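Conceptually, each level of the Decomposition Tree is just one more group-by applied to the slice you clicked on. A minimal Python sketch of that idea, with invented alert facts rather than the PerformancePoint internals:

```python
from collections import defaultdict

# Invented alert facts: (severity, host, alert_name, alert_count)
facts = [
    ("Warning",  "esx01", "VM Balloon Memory Usage",       5),
    ("Warning",  "esx02", "VM Balloon Memory Usage",       3),
    ("Warning",  "esx01", "VM CPU Ready Time High",        2),
    ("Critical", "esx03", "Host Datastore Free Space Low", 1),
]

def decompose(rows, level):
    """Group the current slice by one more attribute, summing Alert Count."""
    index = {"severity": 0, "host": 1, "alert": 2}[level]
    totals = defaultdict(int)
    for row in rows:
        totals[row[index]] += row[3]
    return dict(totals)

# Level 1: decompose all alerts by severity
by_severity = decompose(facts, "severity")

# Level 2: drill into the "Warning" node and decompose it by host
warnings = [r for r in facts if r[0] == "Warning"]
by_host = decompose(warnings, "host")
```

Each drill-down narrows the slice and regroups it by the next attribute, which is exactly the "one criterion at a time" experience the Decomposition Tree gives you in the browser.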
Analytic Grids aren't the only place we can further slice the data. The following screen shows how we can right-click on a value in an Analytic Chart and drill down to an attribute such as the Host Name to get further insight without leaving the browser.
Finally, let's see how that custom SSRS report that uses our SCOM Alerts cube looks in our dashboard. Choosing the link at the top of the page to navigate to our report, we can now see the report we designed by dragging and dropping attributes from our SCOM Alerts cube directly within SharePoint!
The combination of IT Analytics, System Center, and SharePoint has proven to be a very flexible and powerful platform for creating a Business Intelligence portal to help me manage my Private Cloud. I can create visually informative charts and graphs that I can drill into for additional insight, and share them with others in my organization using the familiar SharePoint portal they're already using. All of this without having to bother my DBAs once!
This post is the next in my series on using IT Analytics to analyze the performance of our Private Cloud at Bay Dynamics. I'd like to revisit the topic of monitoring the Alerts coming from my VMware infrastructure as reported by the nworks Management Pack. This round I want to create a new SQL Server Reporting Services report that I can view either inside the System Center console or externally through a Web browser.
We'll start by launching the Report Builder from the IT Analytics Configuration Manager shown below:
For anyone who has created their own SSRS reports for SCOM or SCCM, the experience is essentially the same with one major difference - no writing SQL! Once Report Builder opens, I chose New Report and selected "Create a dataset" as shown in the following dialog:
I navigate to the SystemCenter/ServiceManager folder, and from here I have a couple of options. IT Analytics-based reports can start from either the IT Analytics Model, an SSRS Report Model that works with older versions of Report Builder, or the ITAnalytics Data Source directly. For this example we'll start with the ITAnalytics Data Source.
This is where things are VERY different from the status quo for SCOM reports. Instead of writing a SQL query (I'll spare the commentary about the complexity and risks involved), I pick my SCOM Alerts cube and literally drag and drop the measures and attributes I want in my dataset. In this case, I added the Management Pack name as a filter and restricted it to only show the nworks MP, then added Alert Count, Date, Severity, Host name, and the actual name of the alert.
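Behind the drag and drop, the designer is effectively filtering the cube to one Management Pack and projecting the chosen attributes alongside the Alert Count measure. A rough Python sketch of that dataset shape (the field names and rows are illustrative, not the cube's real schema):

```python
# Illustrative raw rows: (management_pack, date, severity, host, alert_name, alert_count)
rows = [
    ("nworks VMware MP",  "2011-05-01", "Warning",  "esx01", "VM Balloon Memory Usage", 4),
    ("nworks VMware MP",  "2011-05-01", "Critical", "esx02", "VM CPU Ready Time High",  1),
    ("Windows Server MP", "2011-05-01", "Warning",  "web01", "Disk Queue Length High",  7),
]

# Filter: only the nworks Management Pack, like the dataset filter in Report Builder
nworks = [r for r in rows if r[0] == "nworks VMware MP"]

# Project the attributes chosen for the dataset, plus the Alert Count measure
dataset = [
    {"Date": r[1], "Severity": r[2], "Host": r[3], "Alert": r[4], "AlertCount": r[5]}
    for r in nworks
]
```

The cube answers this kind of question directly, which is why no hand-written SQL against the data warehouse is needed.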
That's where most people usually struggle: getting a good dataset. The next part is the same regardless of where the dataset comes from. I assembled the output of my dataset into a chart and a table as shown below:
I save the new report alongside my other SCOM Alert related reports on the SSRS instance used by System Center Service Manager in the Configuration Manager folder.
After doing so, my new report shows up automatically in the System Center Service Manager Console.
Finally, opening the report gives me a great looking chart of Alerts by Severity, and a detailed table that breaks that down further by ESX Host, Alert Name, and the date the alerts were raised.
That was easy, and it was FAST! Fast to render since the report used IT Analytics cubes as its source, and fast to author. I created a visually informative SQL Server Reporting Services report with very specific criteria on my own in just a few minutes. I used to have to pull my SQL guys off of what they were doing and have them go search around the OperationsManagerDW until they figured out what I was looking for. With a bit of luck and a lot of experience with the schema, they were able to get me what I needed within a few days. Not anymore. Now I'll make new reports, and update existing ones, any time I need them!
This post is the next in my series on using IT Analytics to analyze the performance of our Private Cloud at Bay Dynamics. In the previous two posts I focused on analyzing CPU, Memory, and Disk performance for my ESX infrastructure. I was able to quickly drag and drop my way to the performance views I was looking for, and visualize them in interactive charts. Now I want to look at my performance from a different angle. I saved all of the views I had previously created, but I'd like to have a single scorecard where I could see at a glance how I'm doing and if things are getting better or worse. Time to create a Key Performance Indicator!
I'd like to create a new Key Performance Indicator (KPI) to keep an eye on the average percent of memory in use for the three hosts in my ESX Cluster. I start by loading a saved pivot view from a previous post, then highlight the Average measure for memoryUsedPct that encompasses all three ESX hosts, as shown below.
With that cell highlighted, I click the New KPI button on the toolbar. This opens the Key Performance Indicator editor and populates the Value Expression with an MDX query (cube-speak for a SQL statement) that represents the value I highlighted on the pivot table.
Taking a step back, what I really want to see in my scorecard is a KPI for the percent of memory available, whereas the cell I selected was the percent of memory used. Simple enough: I add "100 - " in front of the value expression to flip the number. I also want to use 10% as a baseline, so I enter the value 10 as my Goal Expression. For the status, to determine whether we're doing well or poorly, I pick the predefined expression "Percentage Of Goal"; otherwise I could have written more complex conditions for determining status. I want to see if this number is going up or down over time, so I choose the predefined Trend Expression "Compare Current Period To Previous Period". Now I need to decide how long a period I want to compare. I only have about a month of data from my nworks Management Pack, so I'm going to look at week over week for now and enter 7 as the number of days in the period of comparison. I'll choose "Status Arrow - Ascending" as my trend graphic. That gives me a green arrow if the percent of memory available is increasing, and a red arrow if it's decreasing. Finally, let's call this KPI "VMware Cluster Pct Memory Available". After saving my KPI and navigating to the IT Analytics Key Performance Indicators page, I now see my new KPI side by side with several others that come out of the box with IT Analytics.
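The arithmetic behind this KPI is simple enough to sketch. The Python below is only an illustration of the logic (the real work is done by the MDX expressions against the cube, and the sample numbers are invented): it flips memory used into memory available, scores the result against the 10% goal, and compares the last 7 days to the 7 before that.

```python
# Hypothetical daily average memoryUsedPct across the cluster, oldest first
daily_used_pct = [89.0, 90.5, 91.0, 90.0, 92.0, 91.5, 90.5,   # previous 7 days
                  91.0, 91.5, 92.0, 91.0, 91.5, 92.0, 91.5]   # current 7 days

def pct_available(used):
    return 100 - used  # the "100 - " flip applied to the Value Expression

goal = 10.0  # Goal Expression: 10% memory available

current = [pct_available(u) for u in daily_used_pct[-7:]]
previous = [pct_available(u) for u in daily_used_pct[-14:-7]]

value = sum(current) / len(current)               # current KPI value
status = value / goal * 100                       # "Percentage Of Goal" status
trend_up = value > sum(previous) / len(previous)  # week-over-week trend arrow

print(f"{value:.2f}% available, {status:.0f}% of goal, trend up: {trend_up}")
```

With these made-up numbers the cluster sits at 8.5% available, about 85% of the goal, and the trend arrow points down because the current week averages worse than the previous one.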
Overall I'm at 8.52% memory available on that set of ESX hosts. According to my target, I'm doing OK, but not great. I also see that the trend is going down, so I need to keep an eye on that. The good news is I can easily do just that by checking my KPIs all in a single scorecard!