Bigger and bigger data
In testing, the measurement of the stresses applied to test objects is of paramount importance. Test engineers must guarantee beyond doubt that the proper wave shapes are applied, and customers need to be sure that their equipment is not overstressed.
In high-power and high-voltage testing, a key factor is the influence that the electromagnetic environment of the laboratory has on the measurement results. High currents, high voltages, discharges and switches create many sources of electromagnetic pollution that readily affect the measurement and control signals in cables and wires. Here, two conflicting trends have competed over the past 100 years of testing. On the one hand, the electromagnetic influences have increased over time, because testing at ever higher currents and voltages is demanded. These electrical quantities, through the unavoidable laws of Maxwell, leave their traces as induced currents and voltages in the measurement chain. On the other hand, the signal levels, the eyes and ears of the test engineer on which he bases his decisions, have become smaller and smaller over the years, shifting from electrical, to electronic, to digital. Thus, a major effort of laboratories is to make the ever smaller signals immune to the ever bigger disturbances.
Vintage measurement systems used at KEMA recorded the relevant traces with galvanometric recorders: a rotating drum with photographic paper placed on its inside, onto which a tiny mirror, deflected by the measured signal, projected a light beam. Although developing the films was a major hassle, signals could be recorded quite precisely. A major improvement was the introduction of the “loop” oscillograph, which allowed recording on plain paper sensitive to ultraviolet light. No more long hours spent in dark rooms! And the results became available immediately after the test, when the device spat out long paper strips at speeds of up to 10 metres per second. Still, the numerical evaluation of the measurement had to be carried out by hand or with creative tools, such as the “skate” for asymmetrical current evaluation. These devices remained in service until the mid-eighties. Some years before, electronics had knocked on our door, and we introduced cathode-ray oscilloscopes when detailed recordings of certain fast phenomena were required. The measurement range was, and is, traditionally split into three regions: the low frequencies, covering the power-frequency domain; the medium frequencies (up to 50 kHz), necessary for recording the transient recovery voltage; and the high frequencies, dealing with short-line fault phenomena as well as re-ignition and restrike phenomena. Being able to look at tests in amazing detail, engineers saw new phenomena become apparent. One example is the non-sustained disruptive discharge, which led to heated discussions in standardization committees for at least a decade. This is just one example of how testing and measurement advance power equipment technology and the understanding of events hitherto unexplained.
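What the “skate” once evaluated mechanically is, in essence, the decaying DC offset of an asymmetrical short-circuit current. As an illustration only (not KEMA's actual software), the textbook model of a fully offset fault current can be sketched in Python; the 45 ms DC time constant is the standard value used in switchgear testing, and the function names are our own:

```python
import math

def asymmetrical_current(t, i_ac_rms, freq=50.0, tau=0.045):
    """Textbook asymmetrical short-circuit current: a sinusoidal AC
    component plus a decaying DC offset. The fully offset (worst) case
    is assumed: fault initiation at voltage zero of a purely inductive
    circuit, with DC time constant tau = 45 ms."""
    omega = 2.0 * math.pi * freq
    peak = math.sqrt(2.0) * i_ac_rms
    ac = -peak * math.cos(omega * t)   # symmetrical AC component
    dc = peak * math.exp(-t / tau)     # decaying DC offset
    return ac + dc

def dc_percentage(t, tau=0.045):
    """Remaining DC component, in percent, at time t after fault initiation."""
    return 100.0 * math.exp(-t / tau)
```

One half-cycle (10 ms at 50 Hz) after fault initiation, the model yields a crest of roughly 1.8 times the symmetrical peak, in line with the asymmetry factors found in circuit-breaker standards.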
Meanwhile, the digital age dawned. The first real digital data recorder was the “super green machine”, a recorder able to capture only 4000 data points, but with a sampling rate high enough not only to match the requirements of the time but also to open up a realm of new possibilities.
In the early nineties, the advantages of digitalization began to be fully exploited. A team of specialists was hired exclusively for the development of laboratory digital data recording systems. This reflects our policy of keeping technology that is a cornerstone of testing in our own hands.
From that time on, transient recorder technologies succeeded one another rapidly. In the early years, data was recorded digitally but plotted on paper. In the next step, software was developed that enabled digital data analysis and automated reporting. The first laboratory system was based on Maurer data acquisition systems, later followed by systems manufactured by (successively) Nicolet, Bakker and National Instruments. The breakthrough at each step was speed. While the earlier systems communicated through serial links, the amount of data soon exploded to such a degree that faster solutions had to be found.
In the high-power laboratories, to the present day, data acquisition, accessing hundreds of transducers in each laboratory, is centralized. This is different in the high-voltage laboratory, where each test cell has its own dedicated measurement system. Decentralized data acquisition is also the future choice for the high-power laboratories.
An in-house development called Nexis, based on open-source hardware and software from CERN, is in preparation as the next generation of data acquisition. Low-cost local storage units are deployed, synchronized centrally through GPS and connected through optical fibres to a massive database. The tedious “patching”, the manual connection of signals on a physical switchboard in the command rooms, prone to human error and malfunctioning connectors, will then be replaced by software.
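The principle behind such an architecture is that every local unit stamps its samples against a common GPS-disciplined clock, so that records from independent units can be merged later. Purely as an illustration of that idea (not Nexis' actual software; all names here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Sample:
    value: float        # digitized transducer reading
    timestamp_ns: int   # absolute time from a GPS-disciplined clock

def timestamp_block(values, t0_ns, interval_ns):
    """Attach absolute GPS-derived timestamps to a block of samples
    acquired by one local storage unit. Records captured by independent
    units can then be aligned in the central database by timestamp alone,
    with no physical patching of signal paths."""
    return [Sample(v, t0_ns + i * interval_ns) for i, v in enumerate(values)]
```

Routing a channel to an analysis then becomes a database query on timestamps rather than a plugged connection on a switchboard.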
And electromagnetic fields cannot interfere with the laser light signals flashing through the secure optical data highways in the laboratories.