The US accounted for more than 2.5 million miles of pipelines last year. As the country’s shale industry grows and domestic oil production is projected to peak around 2019, the government is trying to spur greater network capacity and more sophisticated approaches to pipeline safety.
Despite attempts to streamline the federal review process for interstate natural gas pipeline permit applications through the Natural Gas Pipeline Permitting Reform Act, new pipelines face increasingly tight rights of way, according to Dr Neil Thompson, vice president of DNV GL’s Pipeline Services Department, North America, in Dublin, Ohio.
“New lines must often run alongside, below, above or across existing pipelines and power lines. But electrical interference from other pipelines and power lines can corrode steel pipes through effects known respectively as stray current and induced alternating current,” he explained.
Other forms of corrosion pose integrity threats to offshore pipelines, such as the large number traversing the Gulf of Mexico.
Most US pipelines are buried: establishing what is already below ground can be tough with more than 3,000 companies operating pipelines. The ageing of the current pipeline infrastructure also presents a challenge for owners and operators seeking maximum efficiency from their assets at a time of increased production. “Many onshore high pressure pipelines are 50 to 60 years old, and some Gulf of Mexico pipelines have also been in service for five decades. This creates a mammoth job in monitoring them and deciding which to replace or rehabilitate when pipeline integrity is threatened,” Thompson said.
These challenges have caused some oil and gas industry players to question the regulatory approach to risks associated with pipelines in the US.
“Regulation has been driven by historic service failures,” said Dr John Beavers, director of failure investigation and chief scientist at DNV GL’s Dublin, Ohio facility.
One such example is an incident in Carlsbad, New Mexico, in August 2000, which DNV GL’s experts were called upon to investigate. An internally corroded natural gas transmission pipeline ruptured, and the resulting ignition of the escaping gas killed 12 people.
Tighter regulation since then has seen the proliferation of in-line inspection tools generating data on pipeline condition to supplement aerial and on-the-ground inspection by trained staff. Yet a concern with US regulation being based largely on past incidents is that it does not account for the constantly changing conditions surrounding a pipeline.
“Urbanisation, soil movement and weather variations all have an impact. Pipeline steels, pipe fabrication methods, coating types, and installation procedures also change. These can all have unintended consequences and may not be accounted for through regulation,” Thompson said.
The US regulatory approach to developing and maintaining pipelines is largely prescriptive: the government defines the actions pipeline operators must take to manage safety and integrity.
However, more than three-quarters (76%) of the country’s oil and gas professionals surveyed last year for a GL Noble Denton study said they would prefer a goal-based approach to regulation. This sets clear targets in terms of safety and environmental protection, but allows significant freedom in the way in which they are achieved.
Goal-based regulation relies on operators using risk management methods – such as independent verification – rooted in probabilities of events occurring separately or in interlinked ways.
“For pipelines, risk depends on location, so an assessment for predicting future risk must be able to analyse and link up causative factors across locations in a quantitative manner and relate these to failure processes,” Thompson said.
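As an illustration only (the source does not describe DNV GL’s actual model), a location-based quantitative assessment might roll per-threat probabilities up into an annual probability of failure for each pipeline segment. The segment names, threat categories and probabilities below are invented assumptions:

```python
# Illustrative sketch, not DNV GL's methodology: combine per-threat annual
# failure probabilities for each pipeline segment into an overall probability
# of failure (PoF), assuming the threats act independently:
#   PoF = 1 - product(1 - p_i)
# All segment names and probability values are invented.

def combined_pof(threat_probs):
    """Probability that at least one independent threat causes failure."""
    survival = 1.0
    for p in threat_probs.values():
        survival *= 1.0 - p
    return 1.0 - survival

# Hypothetical segments: risk drivers differ by location, as Thompson notes
# (e.g. third-party damage dominates in urban areas, corrosion in older runs).
segments = {
    "MP 0-10 (rural, 1960s pipe)": {"corrosion": 0.002, "third_party": 0.001, "ground_movement": 0.0005},
    "MP 10-20 (urban)":            {"corrosion": 0.001, "third_party": 0.004, "ground_movement": 0.0002},
}

for name, threats in segments.items():
    print(f"{name}: annual PoF ~ {combined_pof(threats):.4f}")
```

Treating threats as independent is itself a simplification; the article’s point is precisely that causative factors must be linked across locations, which a production model would handle with correlated or conditional probabilities.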
“Greater risk management has come into the US pipeline industry over the past decade, even if it has been rather qualitative in nature,” Beavers added. “We expect more will be included in US regulation as time goes by. But there is still no requirement for true third-party verification.”
There is a lot of both legacy and new data from in-line monitoring to inform a quantitative risk management approach. “However, there is a need for better data management to enable better and quicker cost-benefit analysis of which pipelines to rehabilitate and which to replace,” Thompson stressed.
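To illustrate the kind of rehabilitate-or-replace cost-benefit comparison Thompson describes, a minimal sketch might weigh the expected annual cost of keeping a segment in service against the annualised cost of replacing it. Every figure below is an invented assumption, not industry data:

```python
# Invented figures for illustration only: compare the expected annual cost of
# keeping an ageing segment in service (risk-weighted failure cost plus
# inspection/repair) against the annualised cost of replacing it. A real
# analysis would discount cash flows and model many more factors.

def annual_cost_keep(pof, failure_cost, upkeep):
    """Expected annual cost of continued service: PoF-weighted failure cost plus upkeep."""
    return pof * failure_cost + upkeep

def annual_cost_replace(capex, lifetime_years, new_upkeep):
    """Replacement capital spread evenly over the new line's life, plus its upkeep."""
    return capex / lifetime_years + new_upkeep

keep = annual_cost_keep(pof=0.004, failure_cost=50_000_000, upkeep=120_000)
replace = annual_cost_replace(capex=8_000_000, lifetime_years=40, new_upkeep=40_000)

print(f"keep: ${keep:,.0f}/yr, replace: ${replace:,.0f}/yr")
print("cheaper option:", "replace" if replace < keep else "keep")
```

Better data management matters here because the probability-of-failure input is exactly what legacy and in-line inspection data feed; a small change in PoF can flip the decision.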
MARV aids decision-making
A new DNV GL tool offers data management for smarter cost-saving decisions about pipelines. Multi-Analytic Risk Visualisation (MARV) can acquire data at specified times and from different locations, and models the probability of pipeline failure based on threats to integrity. These include manufacturing and construction defects, weather, earthquakes, corrosion, mechanical damage, sabotage and flawed operation. It can also model the likely environmental and safety consequences of a failure.
Input comes mainly from incident databases, time-based data and geographically based information, though MARV can interface with a number of data sources.
It displays an easily grasped visualisation with a touch-screen interface allowing users to call up specific locations to show, for example, third-party damage risk for that location.
As well as the mean value for a risk, MARV shows the values below or above which confidence in the figures drops below 90%. This allows operators to initiate the right responses, such as gathering more data or taking preventive action.
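The source does not say how MARV computes those confidence bounds. One common approach, shown here purely as an assumed sketch, is a normal-approximation interval around the mean of repeated risk estimates; the sample values are invented:

```python
# Assumed sketch only: the source does not describe MARV's statistics.
# Attach two-sided 90% confidence bounds to a mean risk estimate using a
# normal approximation over repeated estimates. Sample values are invented.
import math

def mean_and_90pct_bounds(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    sem = math.sqrt(var / n)    # standard error of the mean
    z = 1.645                   # two-sided 90% quantile of the standard normal
    return mean, mean - z * sem, mean + z * sem

pof_estimates = [0.0031, 0.0035, 0.0029, 0.0038, 0.0033]  # hypothetical per-run PoF estimates
mean, low, high = mean_and_90pct_bounds(pof_estimates)
print(f"mean={mean:.5f}, 90% bounds=({low:.5f}, {high:.5f})")
```

Wide bounds around the mean are themselves actionable information: they signal that gathering more inspection data may be worth more than acting on the point estimate.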
MARV has been tested on small pipelines with industry partners in China and the Middle East. “We will next do a trial on a larger pipeline and hope to make MARV available in the US by the end of 2014,” Thompson said.
The International Energy Outlook 2013, US Energy Information Administration (EIA), July 2013
Reinventing Regulation: The impact of US reform on the oil and gas industry, GL Noble Denton, May 2013