At companies with high levels of cloud maturity, more and more units are acting on their own in harnessing innovative solutions based on cloud technologies. That is desirable in itself, allowing for rapid adoption of market innovations, but the flip side is that higher cloud consumption brings significant added costs – as we see over and over in our day-to-day consulting practice. What's the problem?
At most companies, cloud spending is lumped in with IT costs, so it is not directly associated with the business itself. That reduces the necessary cost sensitivity regarding cloud services on the business side. At the same time, system requirements for applications are specified quite generously, especially where existing workloads are being migrated (“lift & shift”). The result is rising costs that the IT organization cannot influence directly.
One possible solution is to have business units and application owners bear the costs of their own cloud applications, creating more transparency and cost sensitivity. But even at companies where costs are paid by the business units that incur them, there is often no capacity for optimization – especially since the business cases for cloud projects, calculated at some point in the past, remain positive even so. Aside from that, it is difficult to identify potential for cost optimization without the IT team’s in-depth technical know-how. In its advisory role, the IT organization should bear full responsibility for illuminating cloud costs across the entire company and identifying areas with savings potential. To do this, companies need to build automated, tool-based savings identification into the IT organization's processes. Business units also need to become more cost-aware.
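To make the idea of automated, tool-based savings identification concrete, here is a minimal sketch in Python. It assumes a hypothetical export of billing data enriched with utilization metrics and ownership tags; the field names, thresholds, and figures are illustrative, not a real cloud provider's API or report format.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    owner: str           # business-unit tag, enabling cost attribution
    monthly_cost: float  # cost over the billing period
    avg_cpu_util: float  # 0.0-1.0, averaged over the billing period

# Illustrative data, standing in for an enriched billing export
resources = [
    Resource("erp-db-replica", "finance", 1200.0, 0.04),
    Resource("web-frontend", "sales", 300.0, 0.55),
    Resource("batch-worker", "finance", 450.0, 0.08),
]

IDLE_THRESHOLD = 0.10  # flag resources below 10% average utilization

def savings_report(items):
    """Sum the spend on underutilized resources per owning business unit."""
    report = {}
    for r in items:
        if r.avg_cpu_util < IDLE_THRESHOLD:
            report[r.owner] = report.get(r.owner, 0.0) + r.monthly_cost
    return report

print(savings_report(resources))  # {'finance': 1650.0}
```

Run regularly as part of the IT organization's processes, a report like this gives each business unit a concrete, attributed savings figure rather than an anonymous line item in the overall IT budget.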