As more and more small-to-medium-sized businesses (SMBs) investigate and adopt Business Intelligence (BI) solutions, one question eventually gets asked: Can we virtualize this thing?
Businesses are increasingly leveraging virtualization to reduce the number of physical servers they have to maintain, to provide better availability, to enable fault tolerance and disaster recovery, and much more. BI systems are certainly critical to the company’s success, so why not install the BI system into a virtual machine?
Caution is advisable, and you’ll definitely want to consult with your BI system vendor before taking this step. Keep in mind that the whole premise of virtualization is that most physical servers’ computing resources go underutilized most of the time, because applications aren’t used evenly throughout the day. Computers are fast enough, these days, to switch between several different tasks, making virtualization sort of the ultimate expression of multitasking.
Today’s BI systems are sort of a special case, though. While you probably won’t ever use your BI system to full capacity every single moment of the day, when you do use it, you’re going to want it to respond quickly. Because of the way BI systems utilize computing resources, they may not be able to get the power they need when the time comes to work if they’re sharing resources with other virtual machines on a virtualization host.
That’s especially true of the most modern type of BI systems that utilize in-memory analytics. Rather than relying entirely on specially constructed data warehouses to store and crunch through data, in-memory analytics builds analysis models on the fly, in – as the name suggests – the server’s memory. It’s not at all unusual for analytics servers to have copious amounts of memory, far more than even a hard-working database server might need. That’s because the database server relies mainly on disk for its storage, and uses memory only to process current queries and to cache small amounts of data. An analytics server with 32GB of memory is perfectly common – but giving it that 32GB of memory from a virtualization host can be complicated.
Compounding the complexity is the fact that most virtualization hosts are configured to use memory overcommit. That means a host containing 64GB of memory might allocate 16GB to each of two virtual database servers, 32GB to a virtual analytics server, 8GB apiece to a pair of virtual messaging servers, and perhaps 8GB apiece to a pair of collaboration servers. That’s 96GB of memory allocated – far more than the server physically contains. The idea, again, is that no one virtual server actually needs all of its allocated memory at once, so the host shifts free memory around as it’s called for. An analytics server, however, tends to use either very little memory (when it’s not being accessed) or all of it, breaking the “overcommit” model and killing performance.
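To see why overcommit and in-memory analytics clash, it helps to tally the numbers. Here’s a minimal sketch of the math – the VM names and figures are illustrative examples, not output from any real hypervisor:

```python
# Hypothetical memory allocations (in GB) on a 64GB virtualization host.
HOST_MEMORY_GB = 64

vm_allocations_gb = {
    "database-1": 16,
    "database-2": 16,
    "analytics": 32,    # in-memory analytics server
    "messaging-1": 8,
    "messaging-2": 8,
    "collab-1": 8,
    "collab-2": 8,
}

total_allocated = sum(vm_allocations_gb.values())
overcommit_ratio = total_allocated / HOST_MEMORY_GB

print(f"Allocated {total_allocated}GB on a {HOST_MEMORY_GB}GB host "
      f"(overcommit ratio {overcommit_ratio:.1f}x)")
# Overcommit only works while the guests never demand their full
# allocations at the same moment. If the analytics server suddenly
# claims its entire 32GB, the host has to reclaim memory from the
# other guests, and performance suffers across the board.
```

The arithmetic is the whole point: the host is promising 1.5 times the memory it physically has, betting that demand will be spread out over time – a bet an all-or-nothing analytics workload tends to lose.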
Combine this with the fact that many SMB-targeted BI solutions also include a Web server, database engine, and other components, and these solutions become even less viable inside a virtual machine. The moral here is to simply not assume that your BI solution is a good virtualization candidate. Work closely with the vendor to see what they recommend, and make sure that – should you decide to go the virtual route – your virtual BI server has all the resources it needs to work for you.