INTERVIEW: Coho Data CTO Andy Warfield on Infrastructure Intelligence
Recently, David Davis and I had a great chat with Andy Warfield, co-founder and Chief Technology Officer of Coho Data, a uniquely positioned vendor in the storage realm of the emerging software defined data center. Obviously, we discussed Coho Data’s approach to addressing storage performance issues, but we also discussed the wider issue of developing a deep understanding of what’s going on in the technology environment and then allowing software-based constructs to take proactive steps intended to ensure application and infrastructure performance and availability.
We already do this with non-IT systems. We create workflows that can take automated action when certain data triggers dictate. We have also worked hard to build comprehensive business intelligence platforms that provide business units with granular data points by which decision makers can enact the changes necessary to keep things moving smoothly. In IT, however, we often remain reactive, responding only as infrastructure challenges arise right under our noses.
Further, even while IT pushes business partners to think longer term and create business entities that can scale as needs dictate, IT has built business support structures that require constant care and attention and that can be difficult to expand. Legacy data center storage environments carry inherent limitations that waste valuable IT resources. They are difficult to administer and, when it comes time to expand them, it’s not always possible to do so in a way that meets performance needs.
These are among the many reasons that Coho Data created its scalable storage solution and that the company employs an army of data scientists. With a deep understanding of how the storage resource is operating in an enterprise data center, IT can take a more proactive approach, correcting issues before they affect applications. In fact, while Coho is a leader in using data to improve data center operations, other companies, such as Cloud Physics, are also collecting data center statistics to help administrators gain more insight into their environments.
Imagine just some of the potential: In real time, an administrator can understand exactly which workloads are having a negative impact on storage resources. Or, an administrator can gain predictive insight as to when a storage system needs to be scaled. The metrics involved can help administrators understand the reason that the storage system needs to be scaled – capacity, performance, or both. Regardless of what intelligence is gleaned from the analytics system, the fact that these activities are being performed helps IT organizations reduce waste in both storage procurement and ongoing administrative tasks.
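To make the predictive-scaling idea concrete, here is a minimal sketch of one way such a forecast could work: fit a linear trend to recent capacity samples and estimate when a pool will fill. This is an illustrative example only – the function name, data shape, and approach are assumptions for this post, not part of any Coho Data product or API.

```python
# Hypothetical capacity-exhaustion forecast: fit an ordinary
# least-squares trend to periodic usage samples and project forward.

def days_until_full(samples, capacity_tb):
    """Estimate days until used capacity reaches capacity_tb.

    samples: list of (day_index, used_tb) observations, oldest first.
    Returns None if usage is flat or shrinking (no exhaustion forecast).
    """
    n = len(samples)
    mean_x = sum(d for d, _ in samples) / n
    mean_y = sum(u for _, u in samples) / n
    # OLS slope: TB consumed per day.
    num = sum((d - mean_x) * (u - mean_y) for d, u in samples)
    den = sum((d - mean_x) ** 2 for d, _ in samples)
    slope = num / den
    if slope <= 0:
        return None  # usage flat or shrinking: nothing to forecast
    _, latest_used = samples[-1]
    return (capacity_tb - latest_used) / slope

# Example: steady 1 TB/day growth against a 100 TB pool.
history = [(0, 80.0), (1, 81.0), (2, 82.0), (3, 83.0)]
print(days_until_full(history, 100.0))  # -> 17.0 days of headroom
```

A real analytics platform would of course use far richer models and per-workload telemetry, but even this simple trend line illustrates the kind of signal that lets an administrator order capacity before, rather than after, a pool fills.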
During our discussion with Andy, we talked about analytics plus many more Coho Data-related topics, including why the company came into existence, Andy’s take on converged servers and storage (hint: Coho almost went down this path!), and why he now thinks that the converged model carries inherent flaws.
There’s Much More to the Story
Storage is just one aspect of the data center and the IT environment that deserves big data-based analytics. For so long, IT has tried to push the business into using data to improve operations. Now, it’s our turn.