Betting The Farm On DCIM
We all know the important role played by solutions such as DCIM. But it must be mind-boggling for potential customers when it comes to choosing what would be best for their environment. Almost every purveyor of DCIM software offers something different, blurring the lines between discrete areas of functionality and requirements. Is configuration management part of a DCIM solution? And whether it is or isn’t, what does it represent? Is it the management and maintenance of software mounted on datacenter assets and/or business assets connected to the network, or the physical attributes and location of assets in the datacenter? Depending on the vendor, it is often one or the other, or nothing at all.
What it really comes down to is the software development roots of each product. Most originated years ago, often as niche solutions, expanding and morphing into what we know today as DCIM. Depending on their legacy, they are often biased towards either facilities management or IT management. There are, however, new kids on the DCIM block launching modern technology solutions that take a converged, holistic approach.
These solutions offer huge benefits if implemented and used correctly. That said, benefits and drawbacks abound. In any selection process, one needs to balance the current and likely future needs of the enterprise, especially considering the high costs of traditional DCIM software procurement, implementation and subsequent operations. I recently spoke at a Premier CIO conference in Seattle where one of the themes was making decisions that “bet the farm”. Selecting a DCIM is one such decision. It’s a binary event, where there’s no going back without fatalities.
And get it right you must, as these systems can be prohibitively expensive to buy and implement. However, a much greater threat to the successful use of these technologies lurks in the shadows, and it is so often overlooked.
How Many Assets Do You Have?
It’s a question I ask wherever I go, and I am stunned that, invariably, datacenter and facility managers, CIOs and CFOs can’t answer it with any level of accuracy. Last year I spoke at an ITIAM conference on this specific subject. It was a good-sized audience, and I started out by asking, “How many of you have asked or been asked the question: how many assets do you have in your datacenter?” Pretty much every hand in the auditorium went up. I then asked, “How many of you received or gave an accurate answer?” Every hand went down bar one. Recently, I visited a large multi-national company with multiple large-scale datacenters. I asked the same question of the manager running the facility I was visiting. His response was, “Here we have between 10,000 and 12,000 assets”.
I have long since gone beyond the point of being shocked at the lack of understanding when it comes to this critical issue. And the ramifications of not knowing are profoundly negative: you simply don’t know what you have or where it is. Often the problem for larger datacenters is that manual audits are too hard to undertake, and a self-perpetuating “chicken and egg” conundrum ensues. Here’s a simple example of the cost of not knowing:
You inevitably have ghost servers in your facility. Using 5%, the lower-end ratio generally used in such situations, that’s between 500 and 600 ghost servers in the above example. And the likelihood is you are incurring maintenance or lease costs on every one of them. At a low average of $1,000 per year, that’s $500,000 to $600,000.
Assuming 15 assets per rack, the same 500 to 600 ghost servers are chewing up roughly 33 to 40 racks which, at an average of approximately $20,000 per rack per year, is another $660,000 to $800,000 per year. So in this one small example, over $1 million worth of costs per year is being incurred purely because of not knowing.
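The back-of-the-envelope arithmetic above can be sketched in a few lines. Every figure here (the 5% ghost ratio, $1,000 maintenance, 15 assets per rack, $20,000 per rack) is the illustrative number from the example, not an industry benchmark:

```python
def ghost_server_cost(total_assets,
                      ghost_ratio=0.05,            # low-end ghost-server ratio
                      maint_per_server=1_000,      # annual maintenance/lease per server
                      assets_per_rack=15,
                      rack_cost_per_year=20_000):
    """Rough annual cost of 'ghost' servers: maintenance fees plus
    the rack space they needlessly occupy."""
    ghosts = total_assets * ghost_ratio
    maintenance = ghosts * maint_per_server
    rack_space = (ghosts / assets_per_rack) * rack_cost_per_year
    return maintenance + rack_space

# The example's range of 10,000 to 12,000 assets (~500 to ~600 ghosts):
low = ghost_server_cost(10_000)
high = ghost_server_cost(12_000)
print(f"${low:,.0f} to ${high:,.0f} per year")
```

Even with these deliberately conservative inputs, the estimate lands comfortably above $1 million per year, which is the point of the example.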
But I Have Implemented DCIM!
I can hear the cry.
The issue so often misunderstood is that DCIM solutions, and any others you may have, depend on being fed raw physical data in order to function correctly and provide value. They are not the vehicle for capturing this data; they are consumers of it. And that’s where the integrity of the datacenter and the major investment in software solutions becomes a house of cards. If you don’t know how many assets you have in your datacenter and where they are, you have no chance of figuring out what they are doing, how much they cost to maintain, how old they are, and all the other things you bought a DCIM system for in the first place.
The ramifications and risks go much further. Imagine you have a major disaster recovery event. How do you know where your critical assets are, and why they are critical in the first place? How do you plan for such events when you don’t know what you need to know?
It’s not your DCIM system’s fault. It doesn’t know either.
There is an answer to this dilemma: solutions designed to work in what I call the “foundation layer”, which integrate with DCIM solutions and provide them with a continuous flow of accurate data; a system of truth. In my next post, I will talk about how this all comes together.
