Data Analytics

Analytics Program and Project Justification During Cautious Spending

William McKnight

November 2, 2023


The economy is currently in a state of flux, and there are both positive and negative signals regarding its future. Because of factors such as the low unemployment rate, growing wages, and rising prices, businesses find themselves in a spectrum of states.

Recent pullbacks appear to be driven primarily by macro factors. I have a positive outlook on IT budgets in 2024 because I anticipate a loosening of IT expenditures, which have been constrained by recession fears since 2022. This will allow the pent-up demand that built through 2023 to be released. Because data is the key to success for these new initiatives, demand for data cleansing and governance technologies has increased to address broad data quality issues in preparation for AI-based endeavors.

Taking a broader perspective, despite the instability of the macro environment, the data and analytics sector is experiencing consistent, steady growth. However, business programs that concentrate on optimization rather than transformation stand a greater chance of acceptance. As a means of cutting costs, restructuring and modernizing applications, along with sound foundational engineering, are garnering increasing interest. For instance, businesses are looking at containerizing their applications because containerized applications cost less to operate.

Project approval is still taking place in this environment, but the conditions for approval are stringent. Businesses are becoming increasingly aware of the importance of maximizing the return on their investments. There has been a resurgence of interest in return on investment (ROI), and those who want their projects to advance to the next stage would do well to bring their A-game by integrating ROI into the structure of their projects.

Program and Project Justification

First, it is important to comprehend the position that you are attempting to justify: 

  • A program for analytics that will supply analytics for a number of different projects.
  • A project that will make use of analytics.
  • Analytics pertaining to a project.
  • The integration of newly completed projects into an already established analytics program.

Find your way out of the muddle by figuring out what exactly needs to be justified, and then get to work on that justification. When justifying a business initiative with ROI, it is possible to limit the analysis to the project's projected bottom-line cash flows to the corporation in order to generate the data layer ROI (perhaps a misnomer in this context). For the project to be a catalyst for an effective data program, the initiative must deliver returns.
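The ROI case described above can be reduced to a simple calculation over projected cash flows. The sketch below is illustrative only; the investment amount, yearly returns, and discount rate are hypothetical placeholders, and a real justification would use your project's own projections.

```python
# Illustrative ROI and NPV calculation for a data project justification.
# All figures are hypothetical examples, not benchmarks or recommendations.

def roi(investment: float, total_returns: float) -> float:
    """Simple ROI: net gain divided by the up-front investment."""
    return (total_returns - investment) / investment

def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value of yearly cash flows; index 0 is year 0
    (typically the negative up-front investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

investment = 500_000                           # hypothetical up-front project cost
yearly_returns = [200_000, 250_000, 300_000]   # projected bottom-line cash flows

simple_roi = roi(investment, sum(yearly_returns))
project_npv = npv(0.08, [-investment] + yearly_returns)

print(f"Simple ROI over three years: {simple_roi:.0%}")
print(f"NPV at an 8% discount rate: ${project_npv:,.0f}")
```

Discounting the cash flows (NPV) rather than quoting raw ROI is often more persuasive to finance stakeholders in a cautious-spending climate, since it accounts for the cost of capital.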

The question that needs to be answered to justify starting a new data program, or extending an existing one, is this: Why architect the new business project(s) into the data program/architecture rather than employing an independent data solution? These projects require data and perhaps a data store; if the application doesn't already come with one, synergy should be established with what has previously been constructed.

In this context, there is optimization, a reduction back to the bare essentials, and everything in between. The bare essentials approach can play out in an organization in a variety of ways. All of the following are signs of cutting too far and expanding data debt:

  1. Deciding against leverageable platforms like data warehouses, data lakes, and master data management in favor of “one-off”, apparently (deceptively) less expensive, unshared databases tightly fit to a single project. 
  2. Putting a halt to the recruiting of data scientists. Enterprises that take themselves seriously need to be serious about employing the elusive genuine data scientist. If you fall behind in this race, it will be quite difficult to catch up to the competition. Data scientists are able to work in almost any environment, even if they have to wrangle the data before applying data science. 
  3. Ignoring the fact that the data platforms and architecture matter far more to the success of a data program than the data access layer, and consequently concentrating all effort on the business intelligence layer. You should be able to drop numerous BI solutions on top of a robust data architecture and still get where you need to go. 
  4. Not approaching data architecture from the perspective of data domains. This leads to duplicate and inconsistent data, which creates data debt through extra work during data construction as well as a post-access reconciliation process (with other similar-looking data). Master data management and a data mesh approach that builds domains and assigns ownership of data help prevent this. 

Cutting Costs

If your enterprise's climate is one of cautious spending, target the business deliverable of your data project and use a repeatable, consistent, governance-driven process for project justification. Use cost reduction to justify data programs. At the same time, avoid slashing to the extreme by going overboard with your data cuts, since that risks losing the future. 

Although it should be true at all times, it is in times like these that organizations develop efficiencies and become hyper-attracted to value. You may have to search beyond the headlines to bring this value to your organization. People in data circles know about Actian. I know firsthand that it outperforms, and costs less than, the data warehouses getting most of the press, while remaining fully functional. 

All organizations need to do R&D to cut through the clutter and get a read on the technologies that will empower them through the next decade. I encourage you to try the Actian Data Platform. It has a no-cost 30-day trial where you can set up quickly and experience its unified platform for ingesting, transforming, analyzing, and storing data. 

About William McKnight

William McKnight has advised many of the world’s best-known organizations. His strategies form the information management plan for leading companies in various industries. He is a prolific author and a popular keynote speaker and trainer. He has performed dozens of benchmarks on leading database, data lake, streaming, and data integration products. William is the #1 global influencer in big data and cloud computing, and he leads McKnight Consulting Group, which has twice placed on the Inc. 5000 list. William’s breadth of knowledge across the disciplines of enterprise information management, and his depth of knowledge within them, is evident in a consulting career with hundreds of thought leadership pieces in publication. William is a highly sought-after presenter who has spoken on four continents and consulted in over 15 countries. He is noted for his fascinating, informative, dynamic, and entertaining keynotes. William educates businesses and organizations on emerging technology, the vast uses of information, and strategy insights.