This post is the next in my series on using IT Analytics to analyze the performance of our Private Cloud at Bay Dynamics. In previous posts I've focused on analysis of my VMware infrastructure, but now I'll turn my attention to investigating the performance of our Hyper-V hosts.
We start by opening our SCOM Performance Cube and refining the metrics to data coming from System Center Virtual Machine Manager by choosing the MP named "System Center Virtualization Reports 2008". Adding the Entity Count and Sample Count to the pivot table gives me a high-level picture of how much data I'm collecting from the different objects within that MP. We have lots to choose from, including information on virtualization candidate servers, Hyper-V hosts, VM guests, and VMware ESX servers. See below:
In this case, I only want to look at my Hyper-V hosts, so I'll multi-select those five objects, right-click, and filter my results to that selection.
Next I add the actual host name to the mix, and I can see the six Hyper-V hosts for which I'm collecting performance metrics with this MP.
Now that I have a good set of metrics and hosts, I want to change my measures to show the actual minimum, average, and maximum values of each.
Now I've got some real numbers to chew on. To make the analysis easier, I'd like to see this as a chart. By choosing the Chart option at the top and adjusting the options to split the view into three charts by Counter, I get a much more visually informative perspective.
I see that one of the hosts, svsfhypv002, has the most available memory as well as good free space, but also has the highest variance in CPU % total run time. I'd like to drill into that further and see that data over time to help me determine if this host is a good candidate for a new set of VMs I need to provision.
Hmm, looks like things were stable in April, with high available space and low CPU utilization, but there was a big change in May. My next thought is to see that % free space broken down by day to spot the recent trend. Dragging in the date, I see:
Clearly additional load was added to this server in early May. Fortunately I still have over 75% of the disk free, so this is still a good candidate to host new Hyper-V VMs.
Once again, by leveraging my IT Analytics SCOM Performance cube, I was able to analyze trends in a fluid way, asking and answering my own questions on the fly in a matter of minutes. No guessing and no waiting: I have the information I need to make good decisions about the loading and capacity of my virtual infrastructure!
This post is the next in my series on using IT Analytics to analyze the performance of our Private Cloud at Bay Dynamics. So far in this series we've explored IT Analytics cubes in many ways, and dragged and dropped our way to insightful pivot tables and charts that helped me understand how my VMware infrastructure is performing. We also created Key Performance Indicators and SQL Server Reporting Services reports to help communicate key metrics. Now we're going to tie everything together and create a SharePoint Dashboard to allow others to take advantage of our IT Analytics cubes directly from within our corporate SharePoint portal.
The prerequisite for this scenario is an installation of SharePoint 2010 Enterprise. This gives us the ability to create a new SharePoint Site using the Business Intelligence Center template, and from that site we can take advantage of the recent inclusion of PerformancePoint Services and the Dashboard Designer. The screen below shows the home of a Business Intelligence Center.
Clicking the link labeled "Start using PerformancePoint Services", we arrive at the following page, from which we can launch the PerformancePoint Dashboard Designer.
After launching the Dashboard Designer, we're ready to start creating dashboard content. Our first action is to create a Data Connection to one of our IT Analytics cubes so we can use it as the source of information for the dashboard. For a high-level overview of creating a dashboard with IT Analytics, check out this post, which links to a recent webinar that provides step-by-step guidance for creating a PerformancePoint Dashboard with IT Analytics cubes.
Once our Data Connection is configured, we're ready to choose from the following types of dashboard content.
Our first dashboard item will be an Analytic Chart. Using the same drag-and-drop design technique we've highlighted throughout this series, I configured the following chart to show alerts by severity, filtered to only show VMware alerts reported from the nworks MP. Notice the range of chart types and other options available to us. For this chart, I chose a Stacked Bar Chart.
Next we'll create an Analytic Grid using a similar approach. In this grid I want to see the number of VMware alerts by name, and I want to sort that list descending so the most common alert type shows at the top.
Now that we have a couple of items to use in our dashboard, let's go ahead and create one. After clicking the New Dashboard button, I chose a 2-column template as shown below.
Now I can arrange my two dashboard items by dragging them into the zones on the dashboard template.
Before I finish though, I'd like to add another page to this dashboard that hosts the custom SSRS report I created in the previous post in this series. I choose Reporting Services from the content creation ribbon, enter my server name, and choose my custom report from the list.
With that added to my PerformancePoint Content listing, I add a new page to my VMware Alert Dashboard with one zone and drag the custom report into place.
That's everything I want on my dashboard for now. After deployment, I can now navigate to the dashboard on my SharePoint Site and see it live!
But that's not the end of it. Unlike many other dashboards, PerformancePoint dashboards are highly interactive when you use IT Analytics cubes as the source for the charts and reports. At first glance that VM Balloon Memory Usage alert pops out at me again, and I want to drill into it further. I can right-click on that Alert Count cell and choose Decomposition Tree to do further analysis.
After opening the Decomposition Tree I can explore my IT Analytics SCOM Alerts Cube directly from within my SharePoint site and drill down, level by level, "decomposing" the data by adding criteria one at a time. The Decomposition Tree is a great example of the powerful capabilities built into SharePoint specifically to leverage OLAP cubes.
Analytic Grids aren't the only place we can further slice the data. The following screen shows how we can right click on a value in an Analytic Chart and drill down to an attribute such as the Host Name to get further insight without leaving the browser.
Finally, let's see how that custom SSRS report that uses our SCOM Alerts cube looks in our dashboard. Choosing the link at the top of the page to navigate to our report, we can now see the report we designed by dragging and dropping attributes from our SCOM Alerts cube directly within SharePoint!
The combination of IT Analytics, System Center, and SharePoint has proven to be a very flexible and powerful platform for creating a Business Intelligence portal to help me manage my Private Cloud. I can create visually informative charts and graphs that I can drill into for additional insight, and share them with others in my organization using the familiar SharePoint portal they're already using. All of this without having to bother my DBAs once!
This post is the next in my series on using IT Analytics to analyze the performance of our Private Cloud at Bay Dynamics. In the previous two posts I focused on analyzing CPU, Memory, and Disk performance for my ESX infrastructure. I was able to quickly drag and drop my way to the performance views I was looking for, and visualize them in interactive charts. Now I want to look at my performance from a different angle. I saved all of the views I had previously created, but I'd like to have a single scorecard where I could see at a glance how I'm doing and if things are getting better or worse. Time to create a Key Performance Indicator!
I'd like to create a new Key Performance Indicator (KPI) to keep an eye on the average percent of memory in use for the three hosts in my ESX Cluster. I start by loading a saved pivot view from a previous post, then highlight the Average measure for memoryUsedPct that encompasses all three ESX hosts, as shown below.
With that cell highlighted, I click the New KPI button on the toolbar. This opens the Key Performance Indicator editor and populates the Value Expression with an MDX query (cube-speak for a SQL statement) that represents the value I highlighted in the pivot table.
Taking a step back, what I really want to see in my scorecard is a KPI for the percent of memory available, whereas the cell I selected was the percent of memory used. Simple enough: I add "100 - " in front of the value expression to flip the number. I also want to use 10% as a baseline, so I enter the value 10 as my Goal Expression. For the status, to determine whether we're doing well or badly, I pick the predefined expression "Percentage Of Goal"; alternatively, I could have written more complex conditions for determining status. I want to see if this number is going up or down over time, so I choose the predefined Trend Expression "Compare Current Period To Previous Period". Now I need to decide how long a period I want to compare. I only have about a month of data from my nworks Management Pack, so I'm going to look at week over week for now and enter 7 as the number of days in the period of comparison. I'll choose "Status Arrow - Ascending" as my trend graphic, which gives me a green arrow if the percent of memory available is increasing and a red arrow if it is decreasing. Finally, let's call this KPI "VMware Cluster Pct Memory Available". After saving my KPI and navigating to the IT Analytics Key Performance Indicators page, I now see my new KPI side by side with several others that come out of the box with IT Analytics.
Overall I'm at 8.52% memory available on that set of ESX hosts. According to my target, I'm doing OK, but not great. I also see that the trend is going down, so I need to keep an eye on that. The good news is I can easily do just that by checking all my KPIs in a single scorecard!
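To make the KPI's moving parts concrete, here's a small illustrative sketch in Python (not the actual MDX the Dashboard Designer generates) showing how the "100 -" value flip, the Percentage Of Goal status, and the week-over-week trend comparison fit together. The 10% goal, 7-day period, and 8.52% value come from the walkthrough above; the 9.3% previous-week average is a hypothetical number used only for illustration.

```python
def kpi_value(memory_used_pct: float) -> float:
    """Flip percent-used into percent-available (the "100 - " edit to the value expression)."""
    return 100.0 - memory_used_pct

def percentage_of_goal(value: float, goal: float = 10.0) -> float:
    """Status as a percentage of the 10% availability goal."""
    return value / goal * 100.0

def trend_arrow(current_week_avg: float, previous_week_avg: float) -> str:
    """Week-over-week (7-day period) comparison: green arrow if availability is rising."""
    return "up (green)" if current_week_avg >= previous_week_avg else "down (red)"

value = kpi_value(91.48)                     # 91.48% memory used -> 8.52% available
print(round(value, 2))                       # 8.52
print(round(percentage_of_goal(value), 1))   # 85.2 -> "OK, but not great"
print(trend_arrow(8.52, 9.3))                # hypothetical weekly averages -> down (red)
```

The same arithmetic is what the predefined "Percentage Of Goal" and "Compare Current Period To Previous Period" expressions evaluate inside the cube.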