Turning data into trusted information as a basis for true insights and value creation is a challenge, considering the multiple sources of data being generated on vessels as well as connectivity limitations. A significant part of today’s data is collected by manual input from the crew. There are good reasons for this proven approach: the crew’s sense of ownership of and responsibility for the data, as well as the attention to critical parameters that this data collection enforces. But quality management of the collected data is a key prerequisite for its later usability. This brings several challenges: missing entries, wrong data formats, inaccurate observations or entries, and intentional misrepresentations. Consistency and plausibility checks should therefore be performed at the time of data entry on board the vessel. In 2016, DNV GL conducted a study showing that manually collected data can be of comparable quality to sensor-generated data if its quality is thoroughly assured.
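The consistency and plausibility checks described above can be sketched as simple validation rules applied when an entry is keyed in. The sketch below is illustrative only: the field names, limits, and tolerance are assumptions, not part of any reporting standard.

```python
# Minimal sketch of on-board checks for a manual noon-report entry.
# Field names and limits are illustrative assumptions, not a standard.

def validate_noon_report(entry: dict, previous: dict) -> list:
    """Return a list of issues; an empty list means the entry passes."""
    issues = []

    # Completeness: every expected field must be present.
    for field in ("speed_kn", "fuel_rob_t", "fuel_consumed_t"):
        if entry.get(field) is None:
            issues.append("missing entry: " + field)

    # Plausibility: values must lie within physically sensible limits.
    if entry.get("speed_kn") is not None and not 0 <= entry["speed_kn"] <= 30:
        issues.append("implausible speed")

    # Consistency: remaining-on-board fuel should roughly equal the
    # previous ROB minus reported consumption (tolerance for rounding).
    if all(entry.get(k) is not None for k in ("fuel_rob_t", "fuel_consumed_t")):
        expected_rob = previous["fuel_rob_t"] - entry["fuel_consumed_t"]
        if abs(entry["fuel_rob_t"] - expected_rob) > 1.0:
            issues.append("ROB inconsistent with reported consumption")

    return issues

prev = {"fuel_rob_t": 520.0}
ok = {"speed_kn": 14.2, "fuel_rob_t": 498.0, "fuel_consumed_t": 22.0}
bad = {"speed_kn": 45.0, "fuel_rob_t": 510.0, "fuel_consumed_t": 22.0}
```

Running such rules at entry time lets the crew correct mistakes immediately, which is far cheaper than repairing the data ashore.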
Unstructured data, in other words text documents of all kinds, poses a particular challenge for efficient reuse of entered data, quality assurance and analytics. Standardizing formats and entries can provide some remedy. Artificial intelligence that enables natural language comprehension is progressing quickly, but is currently economically feasible for only very few ship owners and managers.
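Standardizing entries can be as simple as mapping free-text descriptions onto a controlled vocabulary before the data is stored. A minimal sketch, in which the alias table and canonical codes are invented for illustration:

```python
# Illustrative sketch of standardizing free-text log entries before
# storage or analysis. The alias table and canonical codes are
# assumptions chosen for the example, not an industry vocabulary.

CANONICAL_FUEL = {
    "hfo": "HFO", "heavy fuel oil": "HFO",
    "mgo": "MGO", "marine gas oil": "MGO", "gasoil": "MGO",
    "lng": "LNG",
}

def normalize_fuel_type(raw: str) -> str:
    """Map a free-text fuel description to a canonical code, or 'UNKNOWN'."""
    key = raw.strip().lower().rstrip(".")
    return CANONICAL_FUEL.get(key, "UNKNOWN")
```

Entries that normalize to "UNKNOWN" can be flagged for review, so the vocabulary grows with real usage instead of being guessed up front.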
Even though ship owners and managers want to be in control of all the sensor data logged aboard their assets, they often receive a derivative product from the OEM based on the data, rather than the data itself. This increases the risk of vendor lock-in, and it may also hinder reuse of the logged data for other purposes. With equipment being increasingly sensor-packed, system and equipment providers are often in control of the data value chain, something that may warrant consideration when ordering equipment.
A key question in assuming control of the data is how to relay it from the equipment to shore. Data is gathered by the ship’s instrumentation and relayed to the designated OEM’s system, to a separate data recorder or to an integrated automation system on board. To make further use of the data, such systems must expose the sensor data to the external world through an accessible, and preferably open, interface. An operational historian or data logger can then consolidate data from the various sources for intermediate storage. “Operational historian” refers to a database application that logs, or historizes, time-based process data. A continuous data stream from ship to shore will additionally require a physical network connection from the operational historian to the ship’s external network gateway.
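The historian described above is, at its core, a time-series store that accepts timestamped tag values from several sources and answers range queries over them. A minimal sketch, with tag names invented for illustration:

```python
# Minimal sketch of an operational historian: timestamped tag values
# from several on-board sources are logged into one store and exposed
# through a single query interface. Tag names are illustrative.

import bisect
from collections import defaultdict

class Historian:
    def __init__(self):
        # tag -> list of (timestamp, value) samples, kept sorted by time
        self._series = defaultdict(list)

    def log(self, tag, timestamp, value):
        """Insert a sample, keeping each series ordered by timestamp."""
        bisect.insort(self._series[tag], (timestamp, value))

    def query(self, tag, start, end):
        """Return all samples for a tag with start <= timestamp <= end."""
        samples = self._series[tag]
        lo = bisect.bisect_left(samples, (start, float("-inf")))
        hi = bisect.bisect_right(samples, (end, float("inf")))
        return samples[lo:hi]

h = Historian()
h.log("me1.rpm", 100.0, 85.2)   # main engine tachometer
h.log("me1.rpm", 160.0, 86.0)
h.log("gps.sog", 100.0, 14.1)   # speed over ground from the bridge
```

A production historian adds compression, buffering for connectivity outages, and interpolation, but the log-and-range-query shape shown here is the essential interface the shore side consumes.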