Companies in the petroleum industry are increasingly adopting advanced technology to supplement and replace tasks previously performed by humans, both on land and offshore.

Ever greater use is being made of solutions which utilise machine learning to process large quantities of real-time and historical data rapidly and with high resolution and precision – in drilling and well work, for example.

Innovative technology and novel solutions can help to reduce risk. However, that depends on the companies making sound assessments before these advances are adopted.

“We see that new solutions often imply significant changes for employees in terms of user interface, work tasks and processes, and new ways of collaborating in organisations,” says Linn Iren Vestly Bergh. “The fact that the companies are introducing several individual technologies simultaneously and at a rapid pace makes the conditions more challenging.”

Together with colleague Kristian Solheim Teigen, Bergh plays a key role in following up the industry's digitalisation work at the Petroleum Safety Authority Norway (PSA).

Cognitive technologies

Safetec has conducted a study for the PSA in which it describes the concept of "cognitive technologies". In these, a data model can be understood as the "brain".

That model processes information and signals from a number of sensors and different sources, and compares them with historical trends. This input is further processed as the basis for actions by or recommendations to a human decision-maker.

This can be likened to the way humans acquire, collate and process information from vision, smell and touch in order to compare it with their experience and acquired knowledge of the external world before reaching decisions and taking action.

Data models utilising cognitive technology can help to increase safety and efficiency. When drilling a well, for example, such a construct can help to interpret downhole signals and measurements and to determine whether these are early signs of hazards.
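As a purely illustrative sketch, and not a description of any specific system referred to by the PSA or Safetec, the core of such a data model can be pictured as a loop that compares live sensor readings with a historical baseline and passes any marked deviation to a human decision-maker as a recommendation rather than an automatic action. All names, data and thresholds below are hypothetical.

```python
# Illustrative sketch only: a minimal "cognitive technology" loop that compares
# live sensor readings with a historical trend and produces a recommendation
# for a human decision-maker. All names, data and thresholds are hypothetical.
from statistics import mean, stdev

# Hypothetical historical standpipe-pressure readings (bar) for one well section
historical_pressure = [182.0, 183.5, 181.7, 182.9, 183.1, 182.4, 182.8, 183.0]

baseline = mean(historical_pressure)
spread = stdev(historical_pressure)

def assess_reading(live_value: float, sigma_limit: float = 3.0) -> str:
    """Compare a live reading with the historical trend and return a
    recommendation. The human decision-maker retains responsibility for
    whatever action is taken."""
    deviation = abs(live_value - baseline)
    if deviation > sigma_limit * spread:
        return (f"ALERT: reading {live_value:.1f} bar deviates "
                f"{deviation:.1f} bar from baseline {baseline:.1f}; "
                "possible early sign of a hazard, recommend human review.")
    return f"Normal: reading {live_value:.1f} bar is within the expected range."

# Simulated live readings from a downhole or surface sensor
for reading in [182.6, 183.2, 189.8]:
    print(assess_reading(reading))
```

The point of the sketch is the division of roles: the model flags a possible early warning, while the decision on what to do stays with a person.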

Interact

People are still required to interact extensively with the technology being used. That makes it necessary to ensure that the data models and interfaces are comprehensible to their users.

This requirement is being reinforced as ever more advanced technology gets adopted. Information presented to decision-making personnel is generally not only more processed but also more complex than before.

That introduces new sources of error – and increased risk. Biases in a database can be carried forward, for example, and lead to undesirable conclusions about and recommendations on a specific action – without the decision-maker understanding these biases.

Data models could eventually become so advanced that decision-makers are unable to grasp how the recommendations they receive are arrived at – a phenomenon often called the black box problem.

“The human being has ultimate responsibility for the operation,” notes Bergh. “One risk with using a multiplicity of complex data models is that it can be difficult for the end user to interact safely and efficiently with the technology.”

No negative effect

The PSA keeps abreast of technology development in the industry, and is concerned to ensure that the introduction of new solutions by the companies does not have a negative effect on risk.

Through this work, it sees that only limited attention is being paid to human abilities and prerequisites when cognitive technology is developed. Even where that approach has been chosen, such abilities and prerequisites are not always carried through in practice.

The PSA also sees that development of data models is often outsourced to subcontractors, and that the operator companies often fail to specify end-user involvement as a requirement. Furthermore, providers of cognitive technology often have limited experience with and knowledge of offshore operations.

A failure to require end-user involvement in contracts, standards and specifications can result in a lack of attention to human abilities and prerequisites in the design of the technology.

Further development

“Learning and development aren’t things which should be confined to the design phase,” says Teigen. He emphasises that information on the use of cognitive technology – particularly in cases where the human-technology interaction functions badly – should therefore be acquired in such a way that it can be used in further development.

“Incident data can be used to develop the actual technology, and can also say something about how operators interact with it during an incident,” he says. “This should be applied in turn to continued development of the technology so that it provides better support for human behaviour in response to errors, hazards and accidents.”

Learning from incidents and near misses is not being adequately captured, according to a Sintef study conducted for the PSA this year. Another finding was that the loads on users monitoring the system can be high, and should not be increased further. The study therefore highlighted the importance of providing for automated reporting and of standardising how data from incidents are processed.
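As a hedged illustration of the kind of standardisation the study points to, and with field names that are assumptions rather than anything taken from Sintef or the PSA, incident data could be captured automatically in a common, machine-readable structure so that it can feed back into model development without adding to the workload of the people monitoring the system.

```python
# Illustrative sketch: a standardised, machine-readable incident record that a
# cognitive system could log automatically. All field names and values are
# hypothetical and not drawn from any actual reporting standard.
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone
import json

@dataclass
class IncidentRecord:
    facility: str
    system: str                  # the data model or tool involved
    description: str
    model_recommendation: str    # what the technology advised
    operator_action: str         # what the human decision-maker did
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Hypothetical near miss logged without manual reporting effort
record = IncidentRecord(
    facility="Example Field A",
    system="early-kick-detection model",
    description="Pressure anomaly flagged late during connection",
    model_recommendation="Flow check recommended",
    operator_action="Flow check performed, well shut in as a precaution",
)

print(json.dumps(asdict(record), indent=2))
```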

Risk assessed

With a rapid pace of change, increased complexity and the introduction of new technology on a large scale, the safety risk must be assessed both individually and overall. The PSA sees that several technologies are adopted simultaneously, and generally comprise several functional elements assembled in common interfaces.

Although individual technologies and functional elements may be relatively comprehensible on their own and have recognisable recommendations for use, their sum total and the way they interact with each other and with human comprehension introduce vulnerabilities and risks.

The PSA finds that this is a challenge on a number of Norwegian offshore facilities today.

Worker participation

Although technology is taking over many tasks, employees still play an important role for safety in the petroleum sector. It is therefore crucial that industry players make provision for genuine worker participation at an early stage in processes for developing and implementing cognitive technology.

This is also important with regard to which competency requirements can and should be set for end users. In that context, the PSA is concerned to ensure that the human perspective is treated as important throughout the development process when cognitive technology is adopted.