Data Management

Will Data Management in Operational Technology Finally Be Standardized?

Actian Corporation

July 11, 2019


Operational Technology (OT) is proliferating within companies, which has led many to ask whether this growth will lead to standardization of how OT-generated data is managed. The answer is “it’s complicated.”

Standardization of the data that individual operational technology devices generate is unlikely, but new capabilities for interoperability, data aggregation, and unified analysis are emerging.

What is Operational Technology?

Before examining the standardization issue, it is important to understand the definition of “operational technology.” OT is an umbrella term that describes technology components used to support a company’s operations – typically referring to traditional operations activities, such as manufacturing, supply chain, distribution, field service, etc. (Some companies are relying on operational technology to support, for example, marketing, sales and digital delivery of services, but that is the topic of a future article.)

Operational technology includes, for example, embedded sensors within manufacturing equipment; telemetry from operations components deployed in the field (e.g., oil pipelines, traffic signals, windmills); industrial IoT devices; location-enabled tablets used by field service personnel; and much more. The list is long, and this is important because OT is not a single classification of technology; it is a descriptor of how technology components are used.

The Push for Interoperability

Several efforts are underway within the industry to drive interoperability between IT and OT components. Open Platform 3.0 (OP3) from The Open Group is a good example. This standard and others like it seek to enable components from different manufacturers to co-exist and work better together within a company’s technology ecosystem. They aren’t seeking to standardize the data coming from individual OT systems or how that data is managed. That challenge is being left to individual companies and the data science profession to address.

Data science professionals have been working with companies and individual technology providers for many years to develop a scalable and efficient method for aggregating data from diverse data sources. Efforts to standardize data models and interfaces have been largely unsuccessful because some large players in the market prefer to develop and defend closed technology ecosystems.

In light of this, most recent developments have centered on using data warehouses to aggregate diverse data and then applying machine learning and artificial intelligence to reconcile the differences between sources.
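To make that pattern concrete, here is a minimal, purely illustrative Python sketch using pandas. The source systems, field names, and unit conversions are hypothetical; the point is simply that each source is normalized into one shared, warehouse-style schema before any analysis runs.

```python
import pandas as pd

# Hypothetical raw readings from two OT sources that report the same
# physical quantity with different field names and units.
plc_records = [
    {"ts": "2019-07-11T08:00:00", "temp_c": 71.2, "line": "A"},
    {"ts": "2019-07-11T08:01:00", "temp_c": 71.9, "line": "A"},
]
scada_records = [
    {"timestamp": "2019-07-11T08:00:30", "temperature_f": 160.1, "unit": "A"},
    {"timestamp": "2019-07-11T08:01:30", "temperature_f": 161.3, "unit": "A"},
]

def normalize_plc(records):
    # Map PLC field names onto the shared schema.
    df = pd.DataFrame(records)
    return pd.DataFrame({
        "timestamp": pd.to_datetime(df["ts"]),
        "temperature_c": df["temp_c"],
        "source": "plc",
        "asset": df["line"],
    })

def normalize_scada(records):
    # Map SCADA field names onto the shared schema and convert Fahrenheit to Celsius.
    df = pd.DataFrame(records)
    return pd.DataFrame({
        "timestamp": pd.to_datetime(df["timestamp"]),
        "temperature_c": (df["temperature_f"] - 32) * 5 / 9,
        "source": "scada",
        "asset": df["unit"],
    })

# One unified table with a single schema, ready for downstream analysis.
unified = pd.concat(
    [normalize_plc(plc_records), normalize_scada(scada_records)],
    ignore_index=True,
).sort_values("timestamp")
print(unified)
```

In practice, the reconciliation step is rarely this clean; machine learning is typically used to match fields, detect units, and resolve conflicts when the mappings are not known in advance.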

Why Operational Technology Data Management May Never Be Standardized

The biggest challenge to standardizing OT data management is managing change. It would be entirely possible to design and deploy a standardized solution to manage all the data generated from OT systems today. The problem is that the technology in this space is continuously evolving and the data being generated is changing too.

Neither technology suppliers nor the companies consuming OT have any desire to slow the pace of technological innovation or constrain it through standardization. New OT innovations will be the driving force behind the next generation of business modernization and companies are eager to consume new capabilities as soon as they can be made available.

How Companies Are Integrating Operational Technology Data

Even though companies don’t have a desire to standardize the data coming from various OT source systems, they have a critical business need to combine that data and analyze it as an integrated data set. That is where data management tools, such as those from Actian, come into play.

Actian’s suite of products, including DataConnect, Actian Data Platform, and Zen, provides companies with a platform to manage the ingestion of data from all of their OT data sources, reconcile it in real time using cloud-scale analytics and machine learning, and then apply robust statistical analysis (e.g., time series and correlation analysis) to translate data into meaningful insights in an operations context.
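As an illustration only (this is not Actian’s API), the short Python sketch below shows the kind of time series and correlation analysis described above, run against hypothetical per-second sensor readings: downsampling to one-minute means, computing a rolling trend, and checking how strongly vibration tracks temperature on a single asset.

```python
import numpy as np
import pandas as pd

# Hypothetical one-second readings from a single machine (illustrative only).
rng = pd.date_range("2019-07-11 08:00:00", periods=3600, freq="s")
np.random.seed(0)
vibration = 0.5 + 0.001 * np.abs(np.random.normal(0.5, 0.05, len(rng))).cumsum()
temperature = 70 + 10 * vibration + np.random.normal(0, 0.2, len(rng))

readings = pd.DataFrame(
    {"vibration_g": vibration, "temperature_c": temperature}, index=rng
)

# Time series analysis: downsample to one-minute means, then smooth with a rolling window.
per_minute = readings.resample("1min").mean()
per_minute["temp_trend_c"] = per_minute["temperature_c"].rolling(window=5).mean()

# Correlation analysis: does vibration track temperature on this asset?
corr = per_minute["vibration_g"].corr(per_minute["temperature_c"])
print(per_minute.tail())
print(f"vibration/temperature correlation: {corr:.2f}")
```

A data platform adds value on top of analysis like this by handling ingestion, schema reconciliation, and scale, so that the analysis can run continuously against live operational data rather than a one-off extract.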

The operational technology space is poised to be one of the most important sectors of the IT industry during the next few years. New components will enable companies to generate data from almost all facets of their operations, and robust data management solutions, such as Actian’s, will enable them to interpret this data in real time to generate valuable operational insights.

While standardization is unlikely, component interoperability is improving, and emerging technologies such as AI are making data analytics easier. To learn more about how Actian can support your OT efforts, visit www.actian.com/zen.


About Actian Corporation

Actian makes data easy. We deliver cloud, hybrid, and on-premises data solutions that simplify how people connect, manage, and analyze data. We transform business by enabling customers to make confident, data-driven decisions that accelerate their organization’s growth. Our data platform integrates seamlessly, performs reliably, and delivers at industry-leading speeds.