November 2019 Exclusive Story
LOS ANGELES–Since the first commercial availability of digital computers, the oil and gas industry has incorporated digital measurements and computation-based analysis to improve exploration, development, and production processes. Seismic imaging and reservoir simulation, for example, are among the most advanced forms of industrial scientific computing applications, incorporating sophisticated, complex algorithms and massive data volumes to push the boundaries of high-performance computing in every area.
Advances in “scientific computing” will continue in the upstream oil and gas business, driven by new developments in both measurement and computing technology, as well as by the business need for more complete subsurface characterization, more complex well and facilities designs, and more accurate drilling predictions.
Another important dimension of oil and gas computing may be referred to as real-time computing (RTC). RTC utilizes continuous sensor data streams and industrial control systems to monitor, operate, and optimize oil field processes in real time. RTC is essential for introducing significant levels of automation into oil field operations. Developed for downstream crude and petrochemical applications, such process control technology–sometimes referred to as operations technology (OT)–is expanding rapidly into large, complex systems for upstream operations.
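To make the RTC concept concrete, the following is a minimal sketch, in Python, of the kind of monitor-and-control loop such systems run continuously against a sensor stream. The sensor values, pressure setpoint, and proportional gain are hypothetical illustrations, not any particular vendor's control scheme.

```python
# Illustrative RTC loop: read a sensor, compare against a setpoint,
# and adjust an actuator. All names and numbers are assumptions.

def control_step(measured_psi, setpoint_psi, choke_pct, gain=0.5):
    """One proportional-control step: nudge the choke opening toward
    the pressure setpoint and clamp the result to 0-100 percent."""
    error = setpoint_psi - measured_psi
    return max(0.0, min(100.0, choke_pct + gain * error))

# Simulated stream of wellhead pressure readings (psi).
readings = [520.0, 510.0, 505.0, 498.0, 501.0]
choke = 40.0
for psi in readings:
    choke = control_step(psi, setpoint_psi=500.0, choke_pct=choke)
```

In a production system the loop would run on a programmable controller against live telemetry; the sketch only shows the shape of the monitor, compare, and act cycle.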
A key premise of the digital oil field is that integrating real-time data streams, process control networks, and advanced analytics will provide continuously improved asset management, including important processes such as maintenance and supply chain optimization.
Advances in RTC leverage the exponential growth across a broad front of information technologies, including ubiquitous digital sensor systems, high-speed wireless connectivity, vast data management and analytic capabilities (the big data paradigm), and the near-universal distribution of powerful computing and interaction devices such as smart phones and tablets. In combination with more powerful distributed control systems, the OT side of the upstream is undergoing a rapid evolution in RTC.
The Internet Of Things
Specific examples of the convergence of technology trends with RTC in the oil field include the Internet of Things (IOT), embedded intelligence, and big data and analytics. IOT refers to the dramatic, near-universal increase in the measurement, control and connectivity of virtually everything, proceeding at an unrivaled pace.
While a rather abstract concept, IOT can be viewed as a fully networked world, where virtually every digital device is linked through the Internet to all other devices, and of course, to people. This may include anything from a cell phone to a geophone, as well as the multiple individual components and control systems used in a production facility.
Data collection and connectivity already are encompassing virtually all areas of technology, from consumer goods to industrial control processes. Adding sensors and intelligence to the information infrastructure extends into every aspect of every business, and even into our personal lives.
Oil and gas is a particularly information-intensive industry, and managing large volumes of data collected at multiple points across an operation or organization is not a new challenge. What is new, however, is not just integrating all that data, but networking all the sources of that data as well.
Real-time computing is a natural fit with IOT technology. The dramatic increase in data and connectivity is driving a completely different approach to computing. Real-time systems provide faster results and enable better decisions and more timely actions. They change information and computation flows, and leverage operations technology such as the supervisory control and data acquisition systems used to automate and manage oil and gas fields, gathering systems and processing plants. Real-time systems also change the human/technology relationship.
IOT and RTC are powerful enabling technologies on their own merits, but together, they can enable companies to operate their assets more efficiently and productively, with greater operational insights and higher levels of safety.
Data sensors are becoming smaller and cheaper, and they increasingly have embedded intelligence and connect wirelessly. In the pocket of a safety vest, they track the wearer’s movement. In the kitchen, they monitor a refrigerator’s electricity use. In the oil field, they enable more measurement, greater control, and expanded levels of automation.
The devices used for data acquisition are changing as well. New classes of devices, such as drones, are rapidly finding a place in surveillance and lightweight deliveries. The extended physical nature of oil and gas operations lends itself to opportunities such as visually monitoring field operations, detecting anomalous conditions, and recommending an on-site visit by a maintenance or security crew. Just as tablets and smart phones have been adopted widely as standard tools in the field, so will be new computing and connectivity devices not yet fully imagined.
Depending on the type of completion and artificial lift system installed, more and more wells already have at least a limited ability to acquire and transmit data to the surface. In the not-too-distant future, nano-devices and downhole robotic tools could be part of the standard well completion to not only acquire and transmit data, but also to perform a range of on-demand duties within the wellbore environment.
With sensors deployed across the oil field, new types of measurements and asset management protocols will become practical. For example, monitoring air and water emissions would become a far easier and more economic task with sensors already integrated throughout an asset area. Data analysis even can lead to predictive maintenance schedules that prevent emission problems from occurring. This is a huge benefit for operators, especially in shale plays where compliance can be an issue because of the density of producing wells, and storage and treating facilities.
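As a hedged illustration of how continuous sensor data can flag emission problems before they escalate, the sketch below applies a simple rolling z-score to a stream of readings. The window size, alert threshold, and concentration values are assumptions for illustration, not field-calibrated parameters.

```python
import math

def zscore_alerts(readings, window=10, threshold=3.0):
    """Flag readings that deviate sharply from the trailing window,
    using a simple rolling z-score (window and threshold here are
    illustrative assumptions)."""
    alerts = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mean = sum(hist) / window
        std = math.sqrt(sum((x - mean) ** 2 for x in hist) / window)
        if std > 0 and abs(readings[i] - mean) / std > threshold:
            alerts.append(i)
    return alerts

# Hypothetical emissions-concentration stream with one spike at index 12.
stream = [2.0, 2.1, 1.9, 2.0, 2.2, 2.0, 1.9, 2.1, 2.0, 2.0,
          2.1, 1.9, 9.5, 2.0]
alerts = zscore_alerts(stream)  # flags the spike at index 12
```

A deployed system would use far more sophisticated models, but the principle is the same: sensors already integrated across the asset make this kind of continuous screening nearly free.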
With the growth in real-time data sources, RTC likely will find numerous opportunities for oil field applications from the advancements in big data and “informatics” technologies. The big data analytics approach is to utilize all the diversity of current data streams and historical data to find previously unseen patterns and correlations that could be useful in making decisions and solving complex problems.
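One building block of that analytics approach is simply measuring how strongly two data streams move together. The sketch below computes a Pearson correlation between two hypothetical historical series; the variable names and numbers are invented for illustration, and real informatics work would screen many such pairs at scale.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length, non-constant
    historical series (illustrative; assumes no missing values)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily series: pump vibration level vs. downtime hours.
vibration = [1.0, 1.2, 1.1, 1.8, 2.5, 2.4]
downtime  = [0.0, 0.1, 0.0, 0.5, 1.5, 1.4]
r = pearson(vibration, downtime)  # strongly positive in this example
```

A correlation like this one does not prove causation, but it is exactly the kind of previously unseen pattern that flags where engineering attention should go next.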
The oil and gas industry stands to benefit from big data analytics as much as any commercial sector, largely because historically, it has acquired and managed more data than any other sector.
As digital density increases, security exposures increase. The two are wound around each other like DNA. More information, computing and control systems inherently add both physical and cyber security risks. The nature of oil and gas assets and operations means that cyber-physical security exposures extend to suppliers and all the personnel who come onto an oil field.
There is always interactivity in a connected world, which also increases security risk. There are very few instances where a technology stands by itself; ultimately, it has to interact with some other set of platforms.
For example, the simple act of downloading a new application to a cell phone depends on a mobile network and computing infrastructure never seen and rarely considered. Controlling what data are shared and where data travel is a complex and risk-filled proposition. Controlling that risk takes a combination of technology and processes. Not surprisingly, security in the digital oil field continues to be a major area of academic and industry research.
There are thousands of applications for data management and control. Figuring out how to better operate these and the thousands of process controllers in a field is a major challenge. With a little more data and a more sophisticated way of thinking about the data, the infrastructure could operate far more efficiently.
For example, consider the value that comes from identifying the biggest maintenance risks in a producing field. If those risks could be determined, the maintenance schedule could be adjusted to mitigate them. With a sufficient number of measurements and a realistic understanding of how all these systems work and function together, it is possible to predict when maintenance should be performed to avoid breakdown situations and to eliminate the need for repairs.
This kind of dynamic maintenance schedule predicts failures, and potentially avoids them altogether. The time to address a problem is when the risk of failure starts to rise. This is, of course, a complex challenge, especially in oil and gas where there is multigenerational technology in many fields. Ultimately, the goal is operational excellence, where there are zero failures, unplanned shutdowns or safety incidents.
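The logic of such a risk-driven schedule can be sketched simply: score each asset's failure risk, then queue maintenance for those whose risk has risen above a cutoff, highest first. The asset names, risk scores, and the 0.7 cutoff below are all illustrative assumptions; real risk models would be built from the measurements described above.

```python
# Hedged sketch of a dynamic, risk-ranked maintenance queue.
# All assets, scores, and the cutoff are hypothetical.

assets = {
    "pump_07":       {"risk": 0.85, "last_service_days": 120},
    "compressor_02": {"risk": 0.40, "last_service_days": 30},
    "separator_11":  {"risk": 0.72, "last_service_days": 200},
}

def maintenance_queue(assets, risk_cutoff=0.7):
    """Return the names of assets due for maintenance, ordered
    from highest to lowest estimated failure risk."""
    due = [(name, info["risk"]) for name, info in assets.items()
           if info["risk"] >= risk_cutoff]
    return [name for name, _ in sorted(due, key=lambda t: t[1],
                                       reverse=True)]

queue = maintenance_queue(assets)  # pump_07 first, then separator_11
```

The hard part in practice is not the queue but the risk scores themselves, which demand sufficient measurements and a realistic model of how the systems function together.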
The number of human/machine interfaces (HMIs) and the ways in which people interact with data are starting to become overwhelming. When data volume grows by a factor of 1,000, which it will in the next decade or sooner, the traditional approaches to the HMI will need to change.
The oil and gas industry was the first large-scale industrial user of graphics and graphical computing. There is a clear opportunity for the industry to again assume the lead by adopting applications based on gaming technology. People interact with games in fundamentally different ways than they do with traditional office and scientific applications. The underlying technology is likewise fundamentally different, supporting vastly more data visualization and ad hoc decision making in real time.
Considering the power of that underlying technology and the thousands of hours of practice that young users have with gaming interfaces, there is huge leverage to be had in adopting gaming technology in oil and gas business computing functions. Certainly, sectors such as the military have learned that gaming presents a much more efficient way to get people to interact with data and make real-time decisions based on their training.
In many ways, the untouched opportunity is the entire way in which humans interact with ever-richer and exponentially growing data volumes. Consider the challenge in terms of traditional engineering drawing systems. There is a predefined decision tree inherent in engineering drawing applications that determines what the user can do and how data are displayed.
In the gaming paradigm, the decision tree is recomputed with every action, but only in part, with simulators filling in the rest. This visualization technique leverages the data and what we know about them to produce a scene that is realistic and rendered in real time. Similarly, real-time systems act on the data at hand to move the users to the next decision by determining what to recompute and display. While this recomputation happens at every step, it does not run very deep and is fast enough to be nearly transparent to the user.
The tree itself changes in this paradigm, depending on the decisions made. The many permutations could never be designed ahead because there would be trillions of branches on the decision tree. The number of potential branches grows exponentially, but with gaming technology, only the branches followed are actually computed. This is a fundamental paradigm change from traditional computing workflows.
A gaming approach to engineering design would define all elements at the level of the smallest possible subassembly. The drawing itself, like the configurable product it represents, can be built ad hoc, based on the user’s specifications. This is analogous to gaming, where users can define their weapons, tools, structures, armor, etc., at any point in a game and in combinations that are perhaps unique.
One of the great strengths of this approach is that only the pathways used are computed, eliminating the need to predesign or recompute all the potential pathways a user could take.
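The lazy-evaluation idea behind this can be sketched in a few lines: each child configuration is computed only when the user actually follows that branch, so the exponentially large tree of alternatives is never materialized. The subassembly names and the expansion rule below are invented purely for illustration.

```python
# Sketch of lazy branch expansion in the gaming paradigm: only the
# nodes on the path the user follows are ever computed.

computed = 0  # counts how many nodes were actually expanded

def expand(config, choice):
    """Compute one child configuration on demand (illustrative rule:
    the chosen subassembly is appended to the configuration path)."""
    global computed
    computed += 1
    return config + [choice]

# Each step offers many options, but only the chosen one is expanded;
# the sibling branches are never built.
design = []
for choice in ["frame_A", "valve_small", "coating_x"]:
    design = expand(design, choice)

# Three steps with, say, 10 options each imply 10**3 = 1,000 possible
# configurations, yet only the 3 nodes on the followed path exist.
```

This is the computational strategy that makes the trillions of potential branches tractable: the work done is proportional to the path taken, not to the size of the full tree.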
This is a very different computational strategy that could help companies deal with the sheer volumes of data constantly streaming into their information technology systems. Gaming-based workflows have the potential to be both meaningful to the user and supportive of innovative real-time solutions that track and stabilize around successful pathways.
As more information technology is injected into every aspect of operations, more training is required. For oil and gas, that means geologists, geophysicists, engineers, pumpers, service technicians and back-office personnel all need a deeper understanding of how an IT solution works, how it is evolving, and how it can change the nature of the business.
As important technology trends such as IOT, big data and analytics become integrated into the framework of real-time computing, the industry will develop valuable opportunities to better understand and control all aspects of oil and gas operations. Going forward, these technologies will create new ways to gather and interact with data, provide new perspectives on oil and gas assets, and ultimately, give companies the ability to lower costs and optimize performance of their production assets.
DONALD L. PAUL is executive director of the University of Southern California’s Energy Institute, and is the William M. Keck chair of energy resources. He had a 33-year career at Chevron Corporation, retiring in 2008 as vice president and chief technology officer. Paul is a senior adviser to the Center for Strategic and International Studies, and is a member of the National Petroleum Council. He also participates in advisory roles at energy companies, technology firms and several universities, including the Massachusetts Institute of Technology, Harvard, Rice, Stanford, and the University of Texas at Austin. Paul holds a B.S. in applied mathematics, an M.S. in geology and geophysics, and a Ph.D. in geophysics from MIT, and an honorary doctorate from the Colorado School of Mines.