VMware vExpert 2014 Award

The vExpert program is for individuals who go above and beyond their daily jobs to help the community by contributing in several key areas: as evangelists, as customers, or through the VMware Partner Network.

The area that many fall under is the evangelist path, which includes the many bloggers who spend countless hours posting content to help others with real-world scenarios and product reviews. Public speakers and authors who deliver excellent content across a variety of VMware products fall under this category as well.

You can’t forget about the customer category, which encompasses the many admins, engineers and infrastructure people who are champions internally and drive their respective companies forward with VMware products. I really enjoy hearing success stories from many of these folks.

Then there is the partner path, which includes many VARs throughout the world who are out in the field day after day implementing, supporting and acting as an external resource to many organizations. I see many of them doing speaking engagements and providing support that goes well above their job duties to enhance the community.

I salute all of the vExperts year after year for their significant contributions and dedicated work.

I am extremely honored to be selected for the program for a third year as I try to give back to the community as much as I have received.

Congratulations to all that made the program this year!

Rick

CloudPhysics Review

March 5th, 2014 at Virtualization Field Day 3

This is one company that I’ve been eager to meet for a while now, and I am really glad that I got the chance at Virtualization Field Day 3. They have a unique offering in that they provide collective intelligence for IT data sets.

First up in the presentation was John Blumenthal (who happens to be the former director of storage at VMware). During his short introduction to the company, the slide deck had an interesting yet straight-to-the-point phrase, “Answers to primitive questions” – that is, how the physics of things actually operates.

Progress for ROI

They believe that this must happen above the automation level and needs to be analyzed in terms of quality of service (QoS) and service-level agreements (SLAs), which in turn need to be ingested into the product’s analytics to determine the next course of action.

They also posed the age-old question: “Can a private cloud match the operations of a large-scale public offering?” Their answer is that all companies must be able to use the same techniques and methodologies. With CloudPhysics, this is done through an aggregation of data from all aspects of the private cloud.

How is it deployed?

The product is delivered as a SaaS model through a vApp virtual appliance and is deployed into a customer’s vCenter through standard techniques. The product is lightweight and consumes minimal resources.
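Out of curiosity about what those “standard techniques” look like in practice, here is a minimal sketch of pushing an OVA appliance into vCenter with VMware’s ovftool CLI. Every name below (the OVA file, vCenter address, datastore, network, display name) is a placeholder, and ovftool as the delivery path is my assumption – the actual CloudPhysics installer may differ.

```python
import subprocess

# Placeholder values -- substitute your own vCenter details and the
# appliance OVA supplied by the vendor (names here are hypothetical).
OVA_PATH = "cloudphysics-appliance.ova"
VCENTER_TARGET = "vi://administrator@vcenter.example.com/DC1/host/Cluster1"

# ovftool is VMware's standard CLI for pushing an OVF/OVA into vCenter;
# it will prompt for the vCenter password at run time.
subprocess.run(
    [
        "ovftool",
        "--acceptAllEulas",
        "--name=cloudphysics-observer",  # display name for the vApp
        "--datastore=datastore1",
        "--network=VM Network",
        OVA_PATH,
        VCENTER_TARGET,
    ],
    check=True,
)
```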

How it works

The single vApp collects and scrubs the data from vCenter. Once the process is complete, the information is pushed to CloudPhysics for analysis. The information is stored in an anonymized format to meet regulatory compliance requirements such as PCI. They mentioned in the presentation that even if the information were examined, nothing would tie the data points to any particular company.
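They did not detail the scrubbing mechanism, but a salted one-way hash is the usual approach to this kind of anonymization, so here is a minimal sketch of what that could look like. The field names and the HMAC scheme are my assumptions, not CloudPhysics code.

```python
import hashlib
import hmac

# Hypothetical per-tenant secret, generated at install; never leaves the appliance.
TENANT_SALT = b"random-secret-generated-at-install"

def anonymize(identifier: str) -> str:
    """Keyed one-way hash so uploaded records can't be tied back to a company."""
    return hmac.new(TENANT_SALT, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# A record collected from vCenter, before and after scrubbing.
raw = {"vm": "prod-sql-01", "host": "esxi-07.corp.example.com", "read_latency_ms": 12.4}
scrubbed = {
    "vm": anonymize(raw["vm"]),
    "host": anonymize(raw["host"]),
    "read_latency_ms": raw["read_latency_ms"],  # metrics stay intact for analysis
}
print(scrubbed)
```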

5 Minutes to Analytics Delivery

One of the interesting points they made was that you can start analyzing your collected data within 5 minutes, which is something I would like to test in the lab since many products on the market take weeks to deliver tangible results.

CloudPhysics has a datacenter simulator that they run customer data sets through to analyze and recommend changes in the environment. This service is included in the subscription pricing.

Datacenter simulator analysis can be done on a per-VM basis, and the cache performance analysis can determine the right amount of cache, and the tweaks, that the customer will need in order to maximize the configuration.
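To make the cache-sizing idea concrete, here is a toy version of the general technique: replay a per-VM block trace through an LRU cache model at several sizes and compare hit ratios. The trace and sizes are invented; this illustrates the method, not their implementation.

```python
from collections import OrderedDict

def lru_hit_ratio(trace, cache_blocks):
    """Replay a sequence of block addresses through an LRU cache of a given size."""
    cache, hits = OrderedDict(), 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)       # refresh recency on a hit
        else:
            cache[block] = True
            if len(cache) > cache_blocks:
                cache.popitem(last=False)  # evict the least recently used block
    return hits / len(trace)

# Hypothetical per-VM trace of logical block numbers.
trace = [1, 2, 3, 1, 2, 4, 1, 5, 2, 1, 3, 2] * 100

for size in (2, 3, 4, 5):
    print(f"cache={size} blocks -> hit ratio {lru_hit_ratio(trace, size):.2%}")
```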

The datastore analysis tool has two primary functions (a rough sketch follows the list):
1) It highlights the contention periods.
2) It determines which VMs were affected and which ones caused the contention.
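In principle, those two functions could work like this: flag intervals where datastore latency breaches a threshold, then separate the VMs active in those intervals into likely culprits (driving the load) and victims (suffering the latency). The sample data, threshold and heuristic below are all my own assumptions.

```python
# Hypothetical per-interval samples: (timestamp, vm, iops, latency_ms).
samples = [
    (0, "vm-a", 200, 4.0), (0, "vm-b", 50, 5.0),
    (1, "vm-a", 2500, 35.0), (1, "vm-b", 40, 32.0),
    (2, "vm-a", 180, 4.5), (2, "vm-b", 55, 5.1),
]

LATENCY_SLA_MS = 20.0  # assumed contention threshold

# Group samples by interval.
by_time = {}
for ts, vm, iops, lat in samples:
    by_time.setdefault(ts, []).append((vm, iops, lat))

for ts, rows in sorted(by_time.items()):
    if all(lat <= LATENCY_SLA_MS for _, _, lat in rows):
        continue  # no contention during this interval
    total_iops = sum(iops for _, iops, _ in rows)
    for vm, iops, lat in rows:
        # Heuristic: the VM driving most of the load is the likely culprit;
        # the others suffering high latency are the victims.
        role = "culprit" if iops / total_iops > 0.5 else "victim"
        print(f"t={ts}: {vm} latency={lat}ms share={iops/total_iops:.0%} -> {role}")
```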

Predicting Potential Outages

The product identifies problem points through hardware analysis on the compute side, as well as through other data points that adversely affect the virtualization environment.

We were then shown a demo delivered by Raj Raja from product management.

Finding vSphere Operations Hazards

Applications are called “cards” and are delivered in segments such as datastore performance, datastore space, memory, etc. Custom “decks” can be created, which are simply collections of cards and metrics to review and analyze.
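The card/deck model maps naturally onto a simple composite structure. Here is a minimal sketch with invented card names, purely to illustrate the concept rather than their actual implementation:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Card:
    """A single analysis unit, e.g. datastore performance or memory headroom."""
    name: str
    run: Callable[[dict], str]  # takes collected metrics, returns a finding

@dataclass
class Deck:
    """A custom collection of cards reviewed and analyzed together."""
    name: str
    cards: list = field(default_factory=list)

    def review(self, metrics: dict):
        return {card.name: card.run(metrics) for card in self.cards}

# Hypothetical cards assembled into a storage-focused deck.
ds_space = Card("datastore space", lambda m: f"{m['ds_free_gb']} GB free")
ds_perf = Card("datastore performance", lambda m: f"avg latency {m['lat_ms']} ms")
deck = Deck("storage health", [ds_space, ds_perf])
print(deck.review({"ds_free_gb": 512, "lat_ms": 7.2}))
```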

Another nice function is the ability to simulate what will occur if changes are made to the environment before you implement them. This could reduce the lab time needed to validate configurations for change controls.
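The what-if idea in miniature: apply a proposed change to a model of the environment and check the projected result before touching production. This toy memory-headroom check is entirely my own illustration of the concept:

```python
# Toy cluster model: total and currently used memory in GB.
cluster = {"mem_total_gb": 512, "mem_used_gb": 390}

def simulate_add_vms(cluster, vm_count, mem_per_vm_gb, headroom=0.10):
    """Project memory usage if new VMs are added, keeping a safety headroom."""
    projected = cluster["mem_used_gb"] + vm_count * mem_per_vm_gb
    limit = cluster["mem_total_gb"] * (1 - headroom)
    return projected, projected <= limit

# What happens if we add 8 VMs at 8 GB each?
projected, ok = simulate_add_vms(cluster, vm_count=8, mem_per_vm_gb=8)
print(f"projected {projected} GB used -> {'fits' if ok else 'over capacity'}")
```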

Root cause analysis with CloudPhysics

The datastore focus is to correlate information from datastore activities (pulling in data from backups, Storage DRS, etc.) and then form a relationship management structure to determine performance metrics.

I plan to have a follow-up conversation with them to find out more detailed information and hopefully get this stood up in the VMbulletin lab for further analysis.

Rick

Virtualization Field Day 3 Fast Approaching

Coming up on March 5th through the 7th is the infamous Tech Field Day event – Virtualization Field Day 3!

This event will be held in San Jose, California and will feature a number of great presenters in the virtualization space (as well as storage, which is fast becoming an integral part of the ecosystem, alongside the SDN component).

Atlantis Computing will be there. Their product lines revolve around the VDI space, with solutions for VMware and Citrix as well as a unique product known as the ILIO Center that can fully automate deployments. I will be very interested to hear more and hopefully see this product in action during the presentation.

CloudPhysics is scheduled to be there as well to talk about their analytics for datacenter operations. Being a technologist who loves the SaaS/PaaS world as much as the virtualization space, I am looking forward to hearing more about this offering and getting a look at how they analyze the infrastructure, especially from a vCloud Hybrid Service perspective.

Coho Data, whom I just saw at the last Storage Field Day, will also be presenting. Their scaling model is fundamentally different from anything else in the storage market today. The use of software network scaling to meet storage demand is a unique approach. More on this as it pertains to virtualization after we meet with them.

Spirent will be presenting as well, and I am very interested to hear more about their solutions in the virtualization space. From some quick research, they have a number of solutions that revolve around performance testing with something they call PASS – Performance, Availability, Security & Scale. Look for a write-up about this company shortly after the event!

VMTurbo will most likely show their Hybrid Cloud Control Module, which has connectors to AWS and Azure. I will be especially interested in the automation solutions this product delivers when it comes to private clouds. I also saw that their products are UCS-ready, which is something I will be sure to have them elaborate on.

The website for this event, including the live video stream, is here, and the hashtag on Twitter will be #VFD3 – add your Twitter column now!

Rick