CloudPhysics Review

March 5th, 2014, at Virtualization Field Day 3

This is one company that I'd been eager to meet for a while now, and I am really glad that I got the chance at Virtualization Field Day 3. Their offering is unique: collective intelligence for IT data sets.

First up in the presentation was John Blumenthal (formerly the director of storage at VMware). During his short introduction to the company, the slide deck had an interesting yet straight-to-the-point phrase: "Answers to primitive questions", meaning how the physics of the datacenter actually operate.

Progress for ROI

They believe this work must happen above the automation level: quality of service (QoS) and service level agreements (SLAs) need to be analyzed and ingested into the product's analytics to determine the next course of action.

They also posed the age-old question of "Can a private cloud match the operations of a large-scale public offering?" Their answer is that all companies must be able to use the same techniques and methodologies. With CloudPhysics, this is done through an aggregation of data from all aspects of the private cloud.

How it's deployed

The product is delivered as a SaaS model: a vApp virtual appliance is deployed into a customer's vCenter through standard techniques. The appliance is lightweight and consumes minimal resources.
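CloudPhysics didn't walk through the appliance internals, but for a feel of what a read-only collection session against vCenter looks like, here is a minimal sketch using the open-source pyVmomi library. The hostname, credentials, and the lab-only choice to skip certificate verification are my placeholders, not anything they showed.

```python
# Minimal sketch of a read-only vCenter inventory pull with pyVmomi.
# Hostname and credentials are lab placeholders, not CloudPhysics specifics.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl.create_default_context()
ctx.check_hostname = False          # lab-only: skip certificate checks
ctx.verify_mode = ssl.CERT_NONE

si = SmartConnect(host="vcenter.lab.local",
                  user="readonly@vsphere.local",
                  pwd="changeme",
                  sslContext=ctx)
try:
    content = si.RetrieveContent()
    # Walk every VM in the inventory and record basic configuration facts.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    for vm in view.view:
        print(vm.name, vm.summary.config.numCpu, vm.summary.config.memorySizeMB)
    view.Destroy()
finally:
    Disconnect(si)
```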

How it works

The single vApp collects and scrubs the data from vCenter. Once that process is complete, the information is pushed to CloudPhysics for analysis. The data is stored in an anonymized format to meet regulatory compliance requirements such as PCI; they mentioned in the presentation that even if the information were examined, nothing would tie the data points to any particular company.
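They didn't detail the scrubbing step itself. One common way to anonymize inventory data before it leaves the site is to replace identifiers with salted one-way hashes, so records stay correlatable without being attributable; the sketch below illustrates that general technique, not CloudPhysics' actual method (the salt, field names, and record shape are all my assumptions).

```python
# Illustrative scrubbing pass: replace identifying fields with salted
# hashes so records can be correlated without being attributed.
# This is a generic technique, not CloudPhysics' published method.
import hashlib
import json

SALT = b"per-customer-random-salt"   # assumed: generated once per site

def anonymize(value: str) -> str:
    """One-way token for an identifier; stable within one salt."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def scrub(record: dict) -> dict:
    """Hash the fields that could tie a sample back to a company."""
    clean = dict(record)
    for field in ("vm_name", "host_name", "datastore"):
        if field in clean:
            clean[field] = anonymize(clean[field])
    return clean

sample = {"vm_name": "payroll-db01", "host_name": "esx-07.corp.example.com",
          "datastore": "SAN-LUN-42", "read_latency_ms": 14.2}
print(json.dumps(scrub(sample)))   # metrics survive, names do not
```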

5 Minutes to Analytics Delivery

One of the interesting points they made was that you can start analyzing your collected data within 5 minutes, which is something I would like to test in the lab, since many products on the market take weeks to deliver tangible results.

CloudPhysics has a datacenter simulator that they run customer data sets through to analyze and recommend changes in the environment. This service is included in the subscription pricing.

Datacenter simulator analysis can be done on a per-VM basis, and the cache performance analysis can determine the right cache size and tuning a customer needs to get the most out of the configuration.
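They didn't expose the simulator's algorithm, but the classic way to size a cache from an I/O trace is to replay the trace against a simulated cache at several candidate sizes and compare hit ratios. Here is a toy LRU replay in that spirit; the trace and candidate sizes are fabricated.

```python
# Toy cache-sizing exercise: replay a block-access trace through an LRU
# cache at several sizes and report the hit ratio for each. This is the
# general technique behind cache analysis, not CloudPhysics' simulator.
from collections import OrderedDict

def lru_hit_ratio(trace, cache_blocks):
    cache = OrderedDict()
    hits = 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)        # mark most recently used
        else:
            cache[block] = True
            if len(cache) > cache_blocks:
                cache.popitem(last=False)   # evict least recently used
    return hits / len(trace)

# Fabricated trace: a hot set of 100 blocks plus a cold sequential scan.
trace = [i % 100 for i in range(5000)] + list(range(1000, 2000))
for size in (50, 100, 200, 400):
    print(f"{size:4d} blocks -> {lru_hit_ratio(trace, size):.1%} hits")
```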

The datastore analysis tool has two primary functions (a simplified sketch of this kind of analysis follows the list):
1) It highlights the contention periods.
2) It determines which VMs were affected and which caused the contention.
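As a rough illustration of what finding contention periods and culprits could look like against raw metrics, here is a simplified pass over per-VM latency and IOPS samples. The thresholds, data layout, and numbers are my own, not CloudPhysics'.

```python
# Simplified contention analysis over per-VM samples on one datastore:
# flag intervals where latency is high, then split VMs into likely
# "victims" (latency spiked) and likely "culprits" (IOPS spiked).
# Thresholds and data layout are illustrative assumptions.
LATENCY_MS = 30.0     # assumed "unhealthy" datastore latency
IOPS_SPIKE = 1.5      # culprit = IOPS at 1.5x its own average

# samples[t] = {vm: (latency_ms, iops)} for each interval t
samples = [
    {"web01": (5, 200),  "db01": (6, 300),  "batch01": (4, 50)},
    {"web01": (45, 210), "db01": (50, 310), "batch01": (8, 900)},
    {"web01": (40, 190), "db01": (55, 290), "batch01": (7, 950)},
    {"web01": (6, 205),  "db01": (7, 305),  "batch01": (5, 60)},
]

avg_iops = {vm: sum(s[vm][1] for s in samples) / len(samples)
            for vm in samples[0]}

for t, s in enumerate(samples):
    if max(lat for lat, _ in s.values()) < LATENCY_MS:
        continue                      # no contention in this interval
    victims = [vm for vm, (lat, _) in s.items() if lat >= LATENCY_MS]
    culprits = [vm for vm, (_, iops) in s.items()
                if iops >= IOPS_SPIKE * avg_iops[vm]]
    print(f"t={t}: victims={victims} culprits={culprits}")
```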

Predicting Potential Outages

The product identifies problem points through hardware analysis on the compute side, along with other data points that adversely affect the virtualization environment.

We were then shown a demo delivered by Raj Raja from product management.

Finding vSphere Operations Hazards

Applications are called "cards" and are delivered in segments such as datastore performance, datastore space, memory, and so on. Custom "decks" can be created, which are simply collections of cards and metrics to review and analyze.
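To make the card/deck idea concrete, here is a hypothetical data model of my own; the class and field names are guesses at what a card would need, not the CloudPhysics schema.

```python
# Hypothetical model of the card/deck concept: a card pairs a metric
# with a scope, and a deck is just an ordered collection of cards.
# Names and fields are my guesses, not the CloudPhysics schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Card:
    title: str          # e.g. "Datastore Performance"
    metric: str         # which collected metric the card analyzes
    scope: str = "all"  # cluster, host, datastore, or VM filter

@dataclass
class Deck:
    name: str
    cards: List[Card] = field(default_factory=list)

    def add(self, card: Card) -> "Deck":
        self.cards.append(card)
        return self     # allow chaining when composing a deck

ops = (Deck("Morning Ops Review")
       .add(Card("Datastore Performance", "datastore.read_latency_ms"))
       .add(Card("Datastore Space", "datastore.free_gb"))
       .add(Card("Memory Pressure", "host.mem_ballooned_mb", scope="cluster-01")))
print([c.title for c in ops.cards])
```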

Another nice function is the ability to simulate what will happen if changes are made to the environment before you implement them. This could reduce the lab time needed to validate configurations for change controls.

Root cause analysis with CloudPhysics

The datastore focus is to correlate information from datastore activities (pulling in data from backups, sDRS, and so on) and then form a relationship-management structure to determine performance metrics.
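They didn't spell out the correlation method, but a simple first pass is to line up an operational event timeline (backup jobs, sDRS migrations) against latency samples and flag events that coincide with spikes. The sketch below does exactly that with fabricated timestamps and thresholds.

```python
# First-pass root-cause correlation: flag events (backups, sDRS
# migrations, etc.) whose time window overlaps a datastore latency spike.
# Timestamps, thresholds, and event names are fabricated for illustration.
SPIKE_MS = 30.0

# (timestamp_minute, latency_ms) samples for one datastore
latency = [(0, 5), (10, 6), (20, 48), (30, 52), (40, 7), (50, 6)]

# (start_minute, end_minute, description) operational events
events = [
    (18, 35, "backup job: db01 full"),
    (44, 46, "sDRS migration: web02 -> SAN-LUN-07"),
]

spikes = [t for t, ms in latency if ms >= SPIKE_MS]

for start, end, what in events:
    overlap = [t for t in spikes if start <= t <= end]
    if overlap:
        print(f"suspect: '{what}' overlaps latency spikes at minutes {overlap}")
    else:
        print(f"clear:   '{what}' does not line up with any spike")
```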

I plan to have a follow-up conversation with them to get more detailed information, and hopefully get this stood up in the VMbulletin lab for further analysis.

Rick
