March 2019 Exclusive Story
Pending Regulation Has Broad Market Impacts
PALO ALTO, CA.–The domestic oil and gas industry has seen a multitude of new opportunities over the past decade created by the development of unconventional resources and the technological advancements that make those plays economic. Internationally, technology is leading to significant discoveries in operating areas as diverse as the Mediterranean Sea and the Arctic Circle.
But despite these successes, companies on the global scale are encountering their fair share of business challenges when trying to capitalize on new revenue opportunities. A recent report from the Society of Petroleum Engineers found that while worldwide demand for oil and gas products is increasing, the international industry faces four major challenges: access to new reserves around the world (80 percent of known reserves are locked up through resource nationalism), increasing cost and margin pressures, skill shortages, and a lack of breakthrough innovations.
For instance, global competition and market volatility are driving the need for faster discovery of new hydrocarbon resources, greater production output per well, and longer productive lives for existing assets. At the same time, regulatory and compliance burdens are increasing for oil and gas companies in many parts of the world, while the competitive nature of the business continues to create relentless pressure to reduce costs. To address these challenges and create the next-generation digital oil field, companies see a growing need to effectively leverage all of their data resources.
In fact, upstream oil and gas companies and service companies alike are facing increasing demand for timely and complete information in the face of growing data volumes and complexity. One survey found that oil and gas companies lose an average of 22 percent of their annual revenues as a result of not being able to fully leverage all of the information they collect during the course of their daily operations. With large volumes and disparate data types from a variety of sources–from 3-D seismic, drilling reports and well logs, to lease information, regulatory records and accounting data–operating companies are seeking to leverage their information assets without incurring the high costs of replicating and consolidating them multiple times.
To accomplish that task, the focus is on using data in place and on demand, as they become available, by creating an agile information infrastructure. In fact, many companies are using data virtualization as a key component of their information infrastructures for this very reason. Data virtualization is an emerging technology that combines disparate sources of data into a single “virtual” data layer that provides unified access and integrated data services to consuming applications in real time (or on demand).
However, using sophisticated equipment and measurement devices in exploration, drilling and production processes has introduced multiple data challenges for organizations. Chief among them is the increasing volumes of data collected. The significant volume and complexity of data produced in upstream business processes poses challenges to quickly and effectively leveraging the value contained in the data. The drilling operation on a single oil well generates an average of 1 terabyte of drilling data each day.
Another issue is growing data “silos.” Data from different oil and gas entities are not only spread out geographically, but also located within various disparate systems and are often locked under proprietary formats and applications. This limits a company’s ability to integrate data across systems and multiple disciplinary teams for analysis.
In addition, while critical information and insights can be gathered from unstructured data, traditional analysis tools and applications are not able to consume them. What is needed is an easy way to access, transform and integrate unstructured data with structured data.
Finally, given the sheer size of the daily data volumes, data replication can be a major challenge for oil and gas companies. Redundant copies of data lead to data quality issues in reporting and can lead to information inconsistencies, especially when it comes to regulatory compliance.
Oil and gas organizations need to ensure that data are well integrated with other systems so that users can harness the intelligence for real-time analytics and operational decision making. Today, data virtualization tools have matured to the point where oil and gas companies are adopting them at rapid rates in an effort to lower the costs of traditional integration, which often involves writing custom code and maintaining supporting processes for both extract, transform and load (ETL) and data replication. Data virtualization technology can also simplify data access by connecting and abstracting sources, combining them into canonical business views, and publishing them as data services.
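The connect-abstract-combine-publish pattern described above can be sketched in a few lines of Python. All class, system and field names here are hypothetical stand-ins; a commercial data virtualization platform adds security, caching and query optimization on top of this basic idea:

```python
# Minimal sketch of a data virtualization layer. Names are hypothetical.

class SourceConnector:
    """Abstracts one underlying system behind a uniform fetch() call."""
    def __init__(self, name, rows):
        self.name = name
        self._rows = rows          # stands in for a live connection

    def fetch(self):
        return list(self._rows)

class CanonicalView:
    """Combines several connectors into one canonical business view by
    renaming source-specific fields to a shared schema."""
    def __init__(self, field_maps):
        # {connector: {source_field: canonical_field}}
        self.field_maps = field_maps

    def rows(self):
        for connector, mapping in self.field_maps.items():
            for rec in connector.fetch():
                yield {mapping[k]: v for k, v in rec.items() if k in mapping}

# Two "systems" that name the same concept differently.
scada = SourceConnector("scada", [{"WELL_ID": "W-17", "RATE_BOPD": 412}])
erp = SourceConnector("erp", [{"AssetNo": "W-17", "Operator": "Acme"}])

well_view = CanonicalView({
    scada: {"WELL_ID": "well_id", "RATE_BOPD": "oil_rate_bopd"},
    erp: {"AssetNo": "well_id", "Operator": "operator"},
})
```

Publishing `well_view.rows()` behind a REST or similar endpoint would complete the “data services” step; consumers never see the source-specific field names.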
Simplified Data Management
Similar to server, storage and network virtualization, data virtualization simplifies how data are presented and managed for users, while employing technologies “under the covers” for abstraction, decoupling, performance optimization and the efficient use and reuse of scalable resources. One of the key aspects of data virtualization that enables it to be an efficient common data layer is “push-down optimization.” Simply put, this is the ability to take a user query submitted to the data virtualization layer, delegate parts of it to the systems best able to process them, reassemble the results, and then present them to the user.
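Push-down optimization can be illustrated with a toy example. Here, two in-memory SQLite databases stand in for separate source systems (the table and field names are invented for illustration); the virtualization layer delegates each filter to the source that owns the data, and only reassembles the filtered results:

```python
import sqlite3

# Two in-memory databases stand in for separate source systems.
wells = sqlite3.connect(":memory:")
wells.execute("CREATE TABLE well (id TEXT, basin TEXT)")
wells.executemany("INSERT INTO well VALUES (?, ?)",
                  [("W-1", "Permian"), ("W-2", "Bakken"), ("W-3", "Permian")])

prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE output (well_id TEXT, bopd INTEGER)")
prod.executemany("INSERT INTO output VALUES (?, ?)",
                 [("W-1", 500), ("W-2", 350), ("W-3", 410)])

def permian_production():
    # Push-down: each source runs the part of the query it can handle,
    # so only filtered rows cross the virtualization layer.
    ids = [r[0] for r in wells.execute(
        "SELECT id FROM well WHERE basin = 'Permian'")]
    placeholders = ",".join("?" * len(ids))
    rates = prod.execute(
        "SELECT well_id, bopd FROM output WHERE well_id IN (%s)" % placeholders,
        ids).fetchall()
    # Reassemble and present the combined result to the user.
    return dict(rates)
```

The design point is that neither full table ever leaves its source system; the layer ships the predicates to the data rather than the data to the predicates.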
The adoption of data virtualization in the upstream oil and gas sector has typically followed a path that could best be described as “tactical return on investment now with agile infrastructure long term.” Companies have brought in data virtualization for agile business intelligence, to create single views of well master data for operational or regulatory reporting, to share virtualized data services with partners, to decouple data sources and applications during migrations, to enable rapid integration after mergers, and to create analytical “sand boxes” that quickly bring together new sources of data.
In all of these cases, the value of data virtualization has to be proven on time to solution, on lowered costs, and on how well it supports or integrates with existing technologies. However, many oil and gas companies are beginning to see that, over the longer term, data virtualization provides numerous benefits. These include creating a flexible and responsive information infrastructure that is critical for business agility, and leveraging data assets to quickly capitalize on market opportunities as they emerge.
Therefore, before getting started, the operator should acknowledge both its immediate project needs and its long-term vision, and seek a technology platform that will enable faster and more accurate decision making, yield greater production and operational efficiencies, improve compliance processes and reporting, and quickly incorporate industry data models such as PPDM™, WITSML™ and PRODML™ into its infrastructure and development workflows.
Three Key Steps
This can be accomplished through three key steps. The first is to establish a unified data view, in which the data virtualization platform connects to a diverse set of structured and unstructured data sources, including subsurface data, borehole data, external vendor databases and other proprietary data formats, to provide a unified view of data that can be quickly leveraged for faster and more accurate decision making. There are any number of innovative ways in which big datasets can be presented in a unified view.
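As a rough sketch of this first step, the following Python merges structured well-log records with a value pulled from unstructured report text. A simple regular expression stands in for the text-extraction services a real platform would supply, and all field names and values are hypothetical:

```python
import re

# Structured well-log records (hypothetical fields).
well_logs = [{"well_id": "W-17", "depth_ft": 9850}]

# Unstructured daily drilling report; a regular expression stands in
# for the text-extraction services a real platform would provide.
report = "Daily report well W-17: mud weight 12.4 ppg, no NPT recorded."

def unified_view():
    extracted = {}
    m = re.search(r"well (\S+): mud weight ([\d.]+) ppg", report)
    if m:
        extracted = {"well_id": m.group(1),
                     "mud_weight_ppg": float(m.group(2))}
    # Merge structured and extracted unstructured data on well_id.
    merged = []
    for log in well_logs:
        row = dict(log)
        if extracted.get("well_id") == row["well_id"]:
            row.update(extracted)
        merged.append(row)
    return merged
```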
The second step is real-time data integration. The data virtualization platform should enable the integration of real-time well data with other operational data systems to provide up-to-date and tightly integrated data that can minimize well downtime and increase production. One oil field services company has deployed data virtualization to combine real-time data from drill bits with customer, product and enterprise resource planning (ERP) systems to feed multiple business intelligence and analytics tools. The data virtualization layer provides operators with the flexibility to combine elements in real time across different sources for analysis.
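A minimal sketch of this real-time enrichment pattern, assuming a generator stands in for the live drill-bit feed and a dictionary for the ERP reference data (all identifiers and values are hypothetical):

```python
# ERP reference data keyed by asset; values are hypothetical.
erp_assets = {"BIT-9": {"customer": "Acme Energy", "product": "PDC 8.5in"}}

def drill_bit_feed():
    # Generator standing in for a live sensor stream.
    yield {"bit_id": "BIT-9", "rpm": 140, "torque_ftlb": 8200}
    yield {"bit_id": "BIT-9", "rpm": 152, "torque_ftlb": 8700}

def enriched_feed():
    # Join each real-time reading against ERP data as it arrives,
    # without landing the stream in a warehouse first.
    for reading in drill_bit_feed():
        yield {**reading, **erp_assets.get(reading["bit_id"], {})}
```

Each enriched record is ready for a downstream business intelligence or analytics tool the moment the reading arrives, which is the “combine elements in real time” behavior the article describes.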
The final step is agile data integration while minimizing replication. Here, the focus is on ensuring that the data virtualization platform enables the creation of normalized, virtual views without the need to replicate source data. As noted, these virtual views build agility and minimize data quality issues that arise from redundant data copies.
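The idea of a virtual, non-materialized view can be sketched as a computation over the source rather than a copy of it. The rows below are hypothetical, and a real platform would express the view declaratively rather than in code:

```python
# Source system of record (hypothetical rows).
source = [{"well_id": "W-1", "status": "producing"},
          {"well_id": "W-2", "status": "shut-in"}]

def producing_wells():
    # A virtual view: re-evaluated on every access, so there is no
    # replicated copy to drift out of sync with the source.
    return sorted(r["well_id"] for r in source if r["status"] == "producing")
```

Because the view is computed on read, an update to a row in `source` is reflected on the next call, with no refresh, reload or reconciliation step, which is precisely how virtual views avoid the data quality issues of redundant copies.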
In an uncertain world and an ever-changing market, only those businesses that react quickly and insightfully can thrive. Yet, the ability to react rapidly requires oil and gas organizations to have access to up-to-the-minute information and the ability to quickly change direction with ease. Making insightful decisions requires a broad view of the current situation and its entire historical context, which, needless to say, is a lot of diverse information. Using data virtualization, oil and gas companies can integrate information beyond the bounds of traditional business intelligence needs and mine their data assets for greater overall value.