Optimise cloud spending with a streamlined approach to data

By Dael Williamson, EMEA CTO at Databricks.


As 2023 progresses, many businesses have found themselves at a crossroads. Generative AI is booming, with organisations in every sector racing to implement large language models (LLMs) into their tech stacks at a rate quicker than anyone could have anticipated. The commercial potential of generative AI cannot be overstated, but its implementation does not come without challenges.

At the same time, economic uncertainty and rising costs are forcing business leaders to reassess their tech investments. Global economic growth is predicted to remain modest in 2023 due to inflationary pressures and other issues, such as the technical skills shortage. Whilst IT spending remains a priority, with roughly half of businesses planning to increase their IT budgets, CIOs should remain vigilant. With senior leaders tightening their belts, now is the time to optimise cloud spending and ensure that it is delivering value, especially when integrating new generative AI-driven tools.

Cut costs by slimming the tech stack

As businesses leverage more and more technology in their operations, the amount of data they have access to has grown exponentially. Data has real potential to facilitate informed decision-making, ensure that products fulfil market needs, and automate mission-critical processes. To realise these benefits, businesses have tapped into a range of systems and tools to manage their data, but this is not always the most efficient approach.

Inevitably, when teams have access to such a wealth of tools, it is difficult to use each one optimally at all times. Some functionality may never be utilised, either because teams are unaware it exists or because another tool already carries out the same process. Additionally, the load of managing a range of bespoke toolchains is not sustainable for data teams. The result is tech bloat, inefficiency, and missed opportunities, as it is difficult to retain visibility over multiple systems simultaneously.

There are also some inescapable hidden operational costs associated with all these cloud data management systems, which businesses often fail to consider. On top of the cost of the system or tool itself come team costs, service-level agreements, data migration, unplanned downtime, and more. These costs can quickly add up when running multiple systems.

To gain a clear picture of these hidden costs, businesses can leverage cost allocation tagging. This can be time-consuming, but the output is valuable. Firstly, it provides visibility into the owner of each resource, which function it supports, and how much the team is using and spending. Secondly, it enables CIOs to view their spend in different ways and understand common spending patterns. Thirdly, investing time in cost allocation tagging makes it possible to build infrastructure that auto-tags future expenses, so bills remain easy to understand on an ongoing basis.

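As a simple illustration, the sketch below aggregates a hypothetical billing export by a "team" tag. The file name and column names are invented for the example, and real exports vary by cloud provider:

```python
import csv
from collections import defaultdict

# Sketch: attribute spend to owners via cost-allocation tags.
# Assumes a billing export CSV with a 'cost' column and tag columns
# such as 'tag:team'; names are illustrative, not a real provider schema.
def spend_by_tag(path, tag_column):
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            owner = row.get(tag_column) or "untagged"  # surface untagged spend
            totals[owner] += float(row["cost"])
    return totals

if __name__ == "__main__":
    report = spend_by_tag("billing_export.csv", "tag:team")
    for team, total in sorted(report.items(), key=lambda kv: -kv[1]):
        print(f"{team:<20} {total:>12.2f}")
```

Grouping untagged spend into its own bucket is often the quickest way to spot resources that still need an owner.
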
Consolidate to unleash data’s potential

Once a business has reliable visibility into its cloud bills, leaders can embark on the cost-cutting journey. The first step is often migrating to one standardised and interoperable data management platform. When doing so, to truly maximise cost optimisation, CIOs should consider leveraging a cloud-agnostic, open-source architecture. Rather than tying the business into one cloud provider, a cloud-agnostic approach allows it to function across most, if not all, cloud environments. In the short term, this can make for a smoother transition, as core business services can be maintained during the move. In the longer term, a cloud-agnostic strategy can be the cheaper option: instead of the business paying each time data needs to move from cloud to cloud, those costs are covered by the data management platform. By consolidating platforms and simplifying data management, businesses can cut costs with minimal disruption.

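To illustrate the idea, the sketch below reads the same open-format (Parquet) data by swapping the storage URI per provider. The bucket names and paths are hypothetical:

```python
import pyarrow.dataset as ds

# Sketch of cloud-agnostic access: because the data sits in an open
# format (Parquet), the same read works across providers by swapping
# the storage URI. Bucket names here are hypothetical.
URIS = {
    "aws": "s3://example-bucket/sales/",
    "gcp": "gs://example-bucket/sales/",
    # Azure access typically goes through an fsspec filesystem (adlfs),
    # passed to ds.dataset() via its `filesystem` argument.
}

sales = ds.dataset(URIS["aws"], format="parquet")
print(sales.head(5))  # first five rows, regardless of which cloud holds them
```

Because the open format, rather than the provider, defines the interface, the surrounding pipeline code stays the same wherever the data lives.
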
Naturally, there are a number of cloud-agnostic options on the market. One that is becoming increasingly popular is the data lakehouse: an open architecture that combines the best elements of data lakes and data warehouses. It eliminates much of the complexity typically associated with legacy architectures, with an open system design that enables the timely flow of accurate data held in low-cost cloud storage. With this in place, data is easily stored for analysis, and organisations can successfully scale AI and ML. These capabilities also help businesses consolidate and unlock proprietary data and business-critical code, including hidden business logic buried in ageing legacy systems; generative AI is accelerating this approach, removing even more cost and organisational risk. This alone can be cost-effective, as the business simplifies its data stack by eliminating the silos that traditionally separate and complicate data engineering, analytics, data science and ML.

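As a minimal sketch of the pattern, assuming Apache Spark with the open-source Delta Lake format (paths and configuration are illustrative), a single copy of the data on low-cost object storage can serve both analytics and ML:

```python
from pyspark.sql import SparkSession

# Lakehouse sketch: land raw events once, in an open format (Delta), on
# cloud object storage, then serve SQL analytics and ML from that single
# copy. Assumes the delta-spark package; paths are illustrative.
spark = (
    SparkSession.builder.appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

raw = spark.read.json("s3://example-raw/events/")  # hypothetical source
raw.write.format("delta").mode("append").save("s3://example-lakehouse/events")

# Analysts and data scientists query the same table: no silo, no extra copy.
events = spark.read.format("delta").load("s3://example-lakehouse/events")
events.createOrReplaceTempView("events")
spark.sql("SELECT count(*) AS n FROM events").show()
```
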
Deliver a range of enterprise-wide benefits

When migrating to a new data management platform, there are many advantages, both financial and otherwise, that extend beyond the initial cost optimisation. Once the initial migration has been carried out, data teams often report an improved experience on the new platform, as they find the streamlined approach more user-friendly. Equally, there are environmental benefits to consider, as consolidating the architecture reduces cloud computing needs and, in turn, carbon output. As such, optimising the cost of data management benefits not only the IT department but the whole organisation.

When resources are conserved in one domain, leaders are able to prioritise other business objectives. For instance, a streamlined data management system allows employees across all departments to access the organisation’s data more easily. Colleagues in HR may then be able to design a more effective sustainability strategy, as they have improved access to data on the organisation’s ESG performance. Data management optimisation is never merely the preserve of the IT department; rather, it has a positive knock-on effect across the whole business. Data becomes the connective tissue that allows business and IT to solve these problems together, bridging past divides.

Those who identify and address the full range of costs associated with their data management strategy will ultimately unlock the most value from their data. Keeping costs down begins with migrating from a legacy architecture to a modern one, such as a lakehouse, but this is only the first step. Regularly reviewing each associated cost will help leaders reduce spending without compromising on data quality. By running a tight ship in this way, leaders can leverage data efficiently to drive business value and stand out from the competition.
