Element Analytics, which creates IT and operational technology (OT) data management tools, has announced significant advances to support digital twin development. The updates simplify data integration, enhance knowledge modeling and support data fusion across multiple sources.
The internet of things (IoT) often involves combining data from IT systems, such as enterprise apps, with data from OT systems, such as industrial controls. These systems have traditionally been created and managed by different teams with different design goals. Element builds tools that make it easier to connect OT data into digital twins managed on IT infrastructure spanning enterprise servers, cloud services and various data management tools.
These tools help feed fresh, clean, organized and governed data to digital twins to provide more accurate, timely and valuable models of assets. This requires bringing together data from various sources, including operational data, which is often inconsistently labeled and lacks metadata context. Element’s Unify data platform automates these DataOps processes and organizes this data into a knowledge graph.
Element customers span the chemical, power, utility, food and agriculture sectors. For example, Nova Scotia Power uses Unify to import and contextualize data from five multi-unit coal-fired thermal generation plants. The utility was able to construct a comprehensive asset framework representing its data hierarchy. The project then evolved to support analytics and reporting via a web-based portal covering its complete set of generating assets as the utility transitions to include greener, wind-based generation.
Evonik, a leading specialty chemical company, uses Unify’s tools to create a 360-degree view of equipment for plant operations teams. The company has reduced the time and effort needed to contextualize the complex data streams produced by its plant equipment. It estimates saving $110,000 from a single analytics model at just one plant and expects considerably greater savings from a broader rollout.
Under the Unify covers
Here is what the new enhancements mean for digital twins:
Connector portal: The Unify Connector Portal provides pre-built connectors for commonly used data sources and destinations that feed data into digital twins. This eliminates the need to hand-code individual connections, speeding and simplifying the process. All the user needs to do is select a connector from the portal, register it by entering details in a dialog box and then deploy it.
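The select-register-deploy workflow above happens through the portal UI, but it can be sketched in code to make the three steps concrete. This is a hypothetical illustration only: the `Connector` class, its field names and the source type shown are assumptions, not Unify's actual API.

```python
# Hypothetical sketch of the three-step connector workflow: select a
# pre-built connector, register it with connection details, then deploy.
# Class and field names are illustrative, not Unify's real interface.
from dataclasses import dataclass, field


@dataclass
class Connector:
    source_type: str                      # e.g., an OT historian or IT database
    settings: dict = field(default_factory=dict)
    deployed: bool = False

    def register(self, **details):
        # Step 2: capture the details a user would enter in the dialog box
        self.settings.update(details)
        return self

    def deploy(self):
        # Step 3: deploy only once the required connection details are present
        if not self.settings.get("host"):
            raise ValueError("connection details incomplete")
        self.deployed = True
        return self


# Step 1: "select" a pre-built connector, then register and deploy it
conn = Connector("historian").register(host="10.0.0.5", port=5450)
conn.deploy()
```

The point of pre-built connectors is that the user supplies only configuration (the `register` step); the connection logic itself ships with the portal.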
Unify Graph: Graphs are a flexible approach to describing entities and their relationships, and are used to represent the different elements within a digital twin. Unify Graph allows a user to represent different data sources and how they relate to process flows, asset hierarchies and organization structures. The user can quickly build on an initial graph by adding more sources and associated relationships incrementally. They can explore the graph visually and query it with a graph query language within Unify, or export the graph to graph database products such as AWS Neptune or Neo4j. Unify can also export its graph models for use by digital twin tools such as AWS IoT TwinMaker.
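At its simplest, the kind of asset graph described above is a set of subject-predicate-object relationships that can be queried by fixing any combination of fields. The following minimal sketch uses invented entity and relation names (plants, boilers, sensors) purely for illustration; it is not Element's schema or query language.

```python
# Minimal sketch of a digital-twin knowledge graph as subject-predicate-object
# triples. Entity and relation names are illustrative assumptions.
triples = [
    ("plant-1",  "contains",  "boiler-a"),
    ("boiler-a", "feeds",     "turbine-1"),
    ("sensor-7", "monitors",  "boiler-a"),
    ("sensor-7", "stored_in", "historian-db"),
]


def query(triples, subject=None, predicate=None, obj=None):
    """Return all triples matching the fields that are fixed (non-None)."""
    return [
        (s, p, o) for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]


# What does sensor-7 relate to?
sensor_facts = query(triples, subject="sensor-7")
# Which entities sit directly under plant-1?
plant_children = query(triples, subject="plant-1", predicate="contains")
```

Adding a new data source is just appending more triples, which mirrors the incremental build-out the product describes; a real graph database generalizes this pattern with indexing and a query language such as Cypher or Gremlin.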
Advanced joins: Digital twins are crafted by joining data from multiple sources to represent different aspects of a thing or process. Joining data involves identifying overlapping fields across data streams. But in real-world scenarios, the same data is often encoded differently across sources, owing to divergent naming conventions or inconsistent adherence to them. Data engineers must then manually wrangle the data to join data sets or data streams despite these differences. New join techniques support “fuzzy” and “contains” joins that do a better job of accurately matching up uncertain data.
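The idea behind these two join types can be sketched with standard-library string matching. The tag names, asset names and threshold below are invented examples, and `difflib` is used as a stand-in similarity measure; Element's actual matching logic is not public.

```python
# Sketch of "contains" and "fuzzy" join predicates for reconciling tag names
# that differ across sources. Tag/asset names are invented examples.
from difflib import SequenceMatcher

ot_tags = ["PUMP_101_FLOW", "PUMP-102-FLOW", "VLV_17_POS"]
it_assets = ["pump_101", "pump_102", "valve_17"]


def normalize(s):
    # Collapse trivial encoding differences (case, separators) first
    return s.lower().replace("-", "_")


def contains_join(tags, assets):
    # "contains" join: match when one normalized key appears inside the other
    return [(t, a) for t in tags for a in assets
            if normalize(a) in normalize(t)]


def fuzzy_join(tags, assets, threshold=0.6):
    # fuzzy join: match when string similarity clears a threshold
    return [(t, a) for t in tags for a in assets
            if SequenceMatcher(None, normalize(t), normalize(a)).ratio() >= threshold]


contained = contains_join(ot_tags, it_assets)   # catches exact substrings
fuzzy = fuzzy_join(ot_tags, it_assets)          # also catches VLV vs. valve
```

Note the trade-off the threshold controls: a fuzzy join recovers matches that a substring test misses (abbreviated "VLV" against "valve_17"), but a loose threshold can also admit near-miss pairs, which is why such joins still benefit from human review.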
Still early for advanced graphs
A few enterprises have seen impressive benefits from digital twins. For example, the Abu Dhabi National Oil Company used digital twins to generate $1 billion in additional revenue. But other industrial clients are less mature in their digital twin efforts.
“It is relatively early for most of our clients within the industrial sector when it comes to using advanced graph database processing and graph-based AI techniques,” Element CEO Andy Bane told VentureBeat.
Bane says the industry needs to do more work on improving basic graph data processing capability for tasks like characterizing the connections or relationships between disparate data sources. Once this basic graph model is in place, it will be easier for enterprises to build more advanced digital twins.
There are industry efforts to promote formats such as Digital Twins Definition Language (DTDL), though Bane is unsure whether these standards will achieve widespread adoption.
“Ultimately, what matters to the business is the ability to establish robust models and reliable data pathways, so that the digital twin has fresh, clean, organized, and governed data that it can operate upon to support decision-making,” he said. “Given the reality of the diverse deployed plant and associated operational technology at major industrials, digital twin success calls for the ability to bring that data to the digital twin efficiently and reliably, regardless of the source format.”