Streamlining data flow

Turning data into trusted information as a basis for true insights and value creation is a challenge, considering the multiple sources of data generated on vessels as well as connectivity limitations. A significant part of today's data is collected through manual input by the crew. There are good reasons for this proven approach: the crew's sense of ownership of and responsibility for the data, and the attention to critical parameters that the collection routine enforces. But quality management of the collected data is a key prerequisite for its later usability. Several challenges arise here: missing entries, wrong data formats, inaccurate observations or entries, and intentional misrepresentations. Consistency and plausibility checks should therefore be performed already at the time the data is entered on board. In 2016, DNV GL conducted a study showing that manually collected data can be of comparable quality to sensor-generated data if its quality is thoroughly assured.
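To make the idea of on-board checks concrete, the sketch below validates a manually entered report at input time, covering the failure modes named above: missing entries, wrong formats, implausible values and cross-field inconsistencies. The field names, ranges and thresholds are illustrative assumptions, not part of any DNV GL specification.

```python
# A minimal sketch of on-board consistency and plausibility checks for
# manually entered report data. Field names and limits are hypothetical
# and stand in for an operator's actual reporting schema.

PLAUSIBLE_RANGES = {
    "speed_kn": (0.0, 30.0),               # vessel speed in knots
    "fuel_consumption_mt": (0.0, 150.0),   # daily fuel consumption in tonnes
}

REQUIRED_FIELDS = ("report_date", "speed_kn", "fuel_consumption_mt")


def validate_entry(entry: dict) -> list[str]:
    """Return the issues found; an empty list means the entry passes."""
    issues = []
    parsed = {}

    # Completeness: every required field must be present.
    for field in REQUIRED_FIELDS:
        if entry.get(field) in (None, ""):
            issues.append(f"missing entry: {field}")

    # Format and plausibility: numeric fields must parse and fall in range.
    for field, (lo, hi) in PLAUSIBLE_RANGES.items():
        raw = entry.get(field)
        if raw in (None, ""):
            continue
        try:
            parsed[field] = float(raw)
        except (TypeError, ValueError):
            issues.append(f"wrong format: {field}={raw!r}")
            continue
        if not lo <= parsed[field] <= hi:
            issues.append(f"implausible value: {field}={parsed[field]}")

    # Cross-field consistency: significant fuel consumption reported at
    # zero speed is flagged for review rather than silently accepted.
    if parsed.get("speed_kn") == 0.0 and parsed.get("fuel_consumption_mt", 0.0) > 5.0:
        issues.append("inconsistent: fuel consumption reported at zero speed")

    return issues


print(validate_entry({"report_date": "2016-05-01", "speed_kn": "0",
                      "fuel_consumption_mt": "38.2"}))
# ['inconsistent: fuel consumption reported at zero speed']
```

Running such checks at the point of entry lets the crew correct a mistake while the observation is still fresh, rather than leaving it to be discovered during later analysis ashore.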

Unstructured data, in other words text documents of all kinds, poses a particular challenge for efficient reuse, quality assurance and analytics. Standardizing formats and entries can provide some remedy. Artificial intelligence that enables natural language comprehension is progressing quickly, but it is currently not economically feasible for all ship owners and managers, and possibly only for a very few.
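As a simple illustration of the standardization remedy, the sketch below maps free-text entries onto a small controlled vocabulary. The event codes and synonym lists are invented for illustration; a real vocabulary would come from the operator's own reporting standard.

```python
# A minimal sketch of standardizing free-text entries against a controlled
# vocabulary, so they become reusable for quality assurance and analytics.
# The codes and synonyms below are hypothetical.

CONTROLLED_VOCABULARY = {
    "ENGINE_MAINTENANCE": {"main engine overhaul", "me maintenance",
                           "engine service"},
    "HULL_CLEANING": {"hull cleaning", "underwater cleaning",
                      "propeller polish"},
}


def standardize(free_text: str) -> str:
    """Map a free-text entry to a standard code, or flag it for review."""
    normalized = " ".join(free_text.lower().split())
    for code, synonyms in CONTROLLED_VOCABULARY.items():
        if any(phrase in normalized for phrase in synonyms):
            return code
    return "UNCLASSIFIED"  # routed to manual review instead of being lost


print(standardize("Main Engine  overhaul completed"))  # ENGINE_MAINTENANCE
print(standardize("Replaced galley equipment"))        # UNCLASSIFIED
```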

Even though ship owners and managers want to be in control of all the sensor data logged aboard their assets, they often receive a derivative product from the OEM, based on the data rather than the data itself. This not only increases the risk of vendor lock-in, it may also hinder reuse of the logged data for other purposes. With equipment becoming increasingly sensor-packed, system and equipment providers are often in control of the data value chain, something that may warrant consideration when ordering equipment.

A key question for assuming control of the data is how to relay it from the equipment to shore. Data is gathered by the ship's instrumentation and relayed to the designated OEM's system, to a separate data recorder, or to an integrated automation system on board. To make further use of the data, such systems must expose the sensor data to the external world through an accessible, and preferably open, interface. An operational historian or data logger can then join information from the various data sources for intermediate storage; "operational historian" refers to a database application that logs, or historizes, time-based process data. A continuous data stream from ship to shore additionally requires a physical network connection from the operational historian to the ship's external network gateway.

[Figure: A schematic of a sensor system]
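As a concrete illustration of the historian role described above, the sketch below historizes time-stamped sensor readings in SQLite and can replay a time window, for example for forwarding through the ship's external network gateway. The schema and tag names are assumptions for illustration, not a description of any vendor's product.

```python
# A toy operational historian: it historizes time-stamped process data in
# SQLite and can replay a time window for shore transfer. The table layout
# and tag names are hypothetical.

import sqlite3
import time

db = sqlite3.connect("historian.db")
db.execute("""CREATE TABLE IF NOT EXISTS samples (
                  ts    REAL,   -- Unix timestamp of the reading
                  tag   TEXT,   -- sensor identifier, e.g. 'me.rpm'
                  value REAL)""")


def historize(tag: str, value: float) -> None:
    """Log one time-stamped reading from a data source."""
    db.execute("INSERT INTO samples VALUES (?, ?, ?)",
               (time.time(), tag, value))
    db.commit()


def replay(tag: str, since_ts: float) -> list[tuple[float, float]]:
    """Return (timestamp, value) pairs for one tag since a given time,
    e.g. to stream them ashore via the external network gateway."""
    rows = db.execute(
        "SELECT ts, value FROM samples WHERE tag = ? AND ts >= ? ORDER BY ts",
        (tag, since_ts))
    return rows.fetchall()


# Joining sources: readings from the OEM system, a data recorder and the
# automation system all land in the same store under distinct tags.
historize("me.rpm", 82.4)
historize("me.fuel_flow_kg_h", 1510.0)
print(replay("me.rpm", since_ts=0))
```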

As data becomes an asset, data quality issues arise. The importance of data quality is visible throughout the organization: the fleet manager determines a vessel's hull cleaning intervals based on fuel performance data, and the ship manager makes critical decisions on whether to open a thruster for maintenance based on health data. Organizations and people increasingly rely on data for decision-making, and they generally expect the information to be factual. That means they trust the generation of the data (by sensors or manual collection), the equipment that stores, processes and cleans the data, and the algorithms that make sense of it. Finally, they must trust the security arrangements that protect the data. But these steps in the data value chain are just some of the elements contributing to the final quality of the data.
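To make the notion of quality dimensions concrete, the following sketch scores a batch of records on two simple dimensions, completeness and validity. The chosen dimensions, field names and thresholds are illustrative assumptions, not the definitions used in the guide.

```python
# An illustrative scoring of two simple data quality dimensions,
# completeness and validity, over a batch of records. Dimensions and
# thresholds here are assumptions for illustration only.

def completeness(records: list[dict], fields: list[str]) -> float:
    """Share of required field slots that are actually filled."""
    slots = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields
                 if r.get(f) not in (None, ""))
    return filled / slots if slots else 1.0


def validity(records: list[dict], field: str, lo: float, hi: float) -> float:
    """Share of present numeric values for one field within [lo, hi]."""
    values = [r[field] for r in records
              if isinstance(r.get(field), (int, float))]
    if not values:
        return 1.0
    return sum(lo <= v <= hi for v in values) / len(values)


noon_reports = [
    {"speed_kn": 14.2, "fuel_consumption_mt": 36.0},
    {"speed_kn": None, "fuel_consumption_mt": 35.1},   # missing entry
    {"speed_kn": 55.0, "fuel_consumption_mt": 34.8},   # implausible speed
]

print(f"completeness: {completeness(noon_reports, ['speed_kn', 'fuel_consumption_mt']):.2f}")
print(f"validity(speed): {validity(noon_reports, 'speed_kn', 0, 30):.2f}")
```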

The downloadable practical guide discusses all the dimensions that together define the quality of data, and includes elements of strategies for improving data quality.

Contact us: How can we support you? Get in touch with us to discuss your challenges in creating value from data in shipping.