June 2018 Exclusive Story

Data Virtualization

Technology Delivers Information At Business Speed

By Lakshmi Randall

PALO ALTO, CA.–Companies across the upstream, midstream and downstream sectors of the oil and gas industry supply chain are taking a critical look at data assets, analyzing what percentage of their data is being used to improve critical factors such as pricing, production and trading.

Digital experiences require data at the speed of business to meet customer expectations. Digital business integration is increasingly pushing enterprises to focus on the value proposition of integration strategy, rather than technology. Multiple technologies working together are crucial to delivering the ultimate digital experience to various stakeholders such as customers, partners, suppliers and internal users.

Often, data cannot be leveraged because they are poorly integrated. Despite modern advances such as cloud and big data technologies, data often end up in functional silos where, for example, cloud data are separated from on-premises data, transactional data are separated from analytical data, and modern data are separated from legacy data. Bringing all these data together takes both time and money, since it normally requires moving the data to a new, central location where they must be housed and maintained. The alternative is integrating data on a continuous, ad hoc basis, which is prohibitively costly.

Equally important is that decision makers need access to a singular, holistic and integrated view of data across the entire enterprise to retrieve actionable insight. Without such insight, it becomes challenging for oil and gas professionals to make informed decisions based on a total and complete assessment of a company’s data and energy assets.

To address these critical data challenges, oil and gas players need to leverage technology that enables them to access structured and unstructured data residing in enterprise, big data, and cloud sources in both batch and real time. This allows them to meet the performance requirements of both analytical and operational use cases by delivering information in a much shorter time frame than traditional data integration tools. Data virtualization can achieve that goal.

Rather than moving the data from different sources and combining the data in a new, consolidated location, data virtualization provides a view of the combined data, leaving the source data exactly where it is across the myriad source systems. This means that oil and gas companies do not have to pay the costs of physically moving and housing the data, while still benefiting from bringing the data together in real time across the entire company.
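To make the concept concrete, the fragment below is a minimal sketch of a federated virtual view, not any vendor's actual product. It joins two independent sources at query time, an on-premises SQLite table and a CSV extract standing in for a cloud source, without copying either dataset into a central warehouse. The file paths, table names and columns are hypothetical.

```python
# Minimal sketch of a federated "virtual view" over two independent sources.
# The join happens at query time; neither source is copied into a central store.
import csv
import sqlite3

def wells_from_onprem(db_path="onprem_wells.db"):
    """Read well headers from an on-premises relational source (assumed schema)."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute("SELECT well_id, operator, basin FROM wells")
        return {well_id: {"operator": operator, "basin": basin}
                for well_id, operator, basin in rows}

def production_from_cloud(csv_path="cloud_production_extract.csv"):
    """Read daily production rows from a cloud extract (CSV stands in for an API)."""
    with open(csv_path, newline="") as f:
        return [{"well_id": r["well_id"], "date": r["date"],
                 "oil_bbl": float(r["oil_bbl"])}
                for r in csv.DictReader(f)]

def virtual_well_view():
    """Combine both sources on demand, leaving the data where they live."""
    headers = wells_from_onprem()
    for rec in production_from_cloud():
        yield {**rec, **headers.get(rec["well_id"], {})}

if __name__ == "__main__":
    for row in virtual_well_view():
        print(row)
```

In a commercial data virtualization platform, the same idea is expressed declaratively as a logical view over registered sources rather than as hand-written code.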

Wide Variety Of Uses

Because data virtualization technology accommodates existing infrastructure in its existing state, it is relatively easy to implement compared with other options. And because it provides a view of the data in real time from a variety of systems that are normally very time consuming to integrate, it can support a wide variety of uses.

For example, data virtualization enables a “smarter” extraction and production system that continuously captures and applies data generated across the value chain to identify new and previously inaccessible reserves, improve asset availability, and proactively mitigate risk. Other important benefits include:

  • Real-time data access without replication;
  • Consolidated views across myriad sources;
  • A single point for implementing security and governance protocols (illustrated in the sketch following this list);
  • Detailed traces of data lineage; and
  • The ability to connect with most legacy and modern sources–including transactional, cloud and social media sources–to accommodate the range of highly structured to completely unstructured sources.
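As a hypothetical illustration of the security and governance point above, the sketch below applies role-based row filtering and column masking once, at the virtualization layer, so every downstream consumer of the virtual well view inherits the same policy. The roles, basins and masking rules are invented for the example and build on the virtual view sketched earlier.

```python
# Hypothetical policy enforcement at the virtualization layer: one filter on the
# virtual view governs every downstream consumer (roles and rules are invented).
ROLE_POLICIES = {
    "analyst": {"basins": {"Permian", "DJ"}, "mask": set()},
    "partner": {"basins": {"Permian"}, "mask": {"operator"}},
}

def governed_view(rows, role):
    """Apply row-level filtering and column masking for the caller's role."""
    policy = ROLE_POLICIES[role]
    for row in rows:
        if row.get("basin") not in policy["basins"]:
            continue  # row-level security: skip basins this role may not see
        yield {key: ("***" if key in policy["mask"] else value)
               for key, value in row.items()}

# Example: partners see only Permian rows, with the operator column masked.
# masked_rows = governed_view(virtual_well_view(), "partner")
```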

Data virtualization use cases include creating a unified global well information repository. For example, a major oil field services company is leveraging data virtualization to create a virtualized repository of canonical well data views across dozens of sources to provide real-time, on-demand access to integrated well data without having to physically move any data.

In another case, data virtualization is being used to support product agility and customer satisfaction by a leading energy information services company. It has created a unified logical data layer across all information assets in the company (well, lease, production, reservoir, regulatory, etc.) for multiple geographies. Normalized data services were used as sources for applications.

To improve data quality and operational reporting, a North American pipeline and energy logistics provider is leveraging data virtualization to deliver self-service intelligence on pipeline volumetrics and HS&E, using reusable data views for internal operations, customers and regulators while reducing data replication by 80 percent.

In a final example, a national oil company is using data virtualization to enable agile business intelligence and big data analytics. The company created a logical data warehouse combining relational and unstructured big data, with multiple interfaces for reporting tools, analytics and data services access.

Drillinginfo Case Study

With a mission of helping the oil and gas industry achieve better, faster results, Drillinginfo wanted to expedite the integration of data across its data warehouse and other sources to provide data to users more quickly. However, the product development team’s delivery timelines were routinely at risk because of data availability and consistency issues. As a result, developers were in some cases accessing the data sources directly, and in other cases suffering severe delivery delays.

Drillinginfo decided to implement data virtualization to establish a logical data warehouse that provides seamless access to the company’s physical data warehouse as well as geospatial data and other sources. The data virtualization platform connects to these data sources, combines the data, and publishes the resulting virtual views as data services. These services are then consumed internally by the application development team, analytics and decision support applications, and application data “marts,” as well as externally by customers.

A caching mechanism is used to store data about business entities such as wells, completions, producing entities and permits, which are exposed as data services, analytics services and map services. These services are then used by internal application developers and customers to build applications, allowing Drillinginfo to greatly accelerate its data services delivery and serve more customers in the same time frame. Usable web services for application development that once took one to two weeks to deliver are now ready in less than a day, and Drillinginfo manages the entire data virtualization process with just one full-time developer and a part-time virtualization administrator.
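The caching pattern described above can be sketched roughly as follows; this is a simplified, hypothetical illustration rather than Drillinginfo’s actual implementation. A small time-to-live cache sits in front of the virtual view so repeated requests for the same business entity type are served from memory instead of re-querying the underlying sources.

```python
# Simplified, hypothetical sketch of a TTL cache in front of a virtual data service.
import time

CACHE_TTL_SECONDS = 300   # assumed refresh window for cached entities
_cache = {}               # entity_type -> (timestamp, rows)

def fetch_entity_rows(entity_type):
    """Stand-in for querying the virtualization layer for one entity type."""
    return [{"entity": entity_type, "id": i} for i in range(3)]  # dummy rows

def cached_data_service(entity_type):
    """Serve wells, completions, permits, etc., from cache while still fresh."""
    now = time.time()
    hit = _cache.get(entity_type)
    if hit and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]                        # cache hit: no source queries
    rows = fetch_entity_rows(entity_type)    # cache miss: go back to the sources
    _cache[entity_type] = (now, rows)
    return rows

if __name__ == "__main__":
    print(cached_data_service("wells"))      # first call populates the cache
    print(cached_data_service("wells"))      # second call is served from memory
```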

Oil Company Case Study

The drop in oil prices that began in late 2014 necessitated changes in the strategy of a Fortune 500 U.S.-based oil and gas company with 4,500 employees and $11 billion in revenue. Its capital expenditure in 2017 was about $4 billion. One change was a shift in focus to the high-margin regions of Colorado, West Texas and the deepwater Gulf of Mexico. This shift created a significant need for better data to address “edge” cases (e.g., fracturing) related to these areas of operation.

Accompanying this sharpened focus on high-margin areas were organizational adjustments and the transfer of personnel to the key asset areas. These adjustments created a need for faster data delivery so the company could respond more quickly to changes in business areas and key organizations.

In parallel, the company recognized the importance of enhanced technology, established an advanced analytics team, and embedded individual analytics teams in key business units. This generated a need for even more data. The company initiated a digitization strategy to retrieve data from sensors for production purposes and to deliver more data to the business, such as subscription data detailing what was occurring in the industry.

The firm partnered with one of its key asset areas to set up a data management program for which a major deliverable was a self-service data delivery environment for creating and using data services for analytics, reports and apps. The scope of this effort included:

  • A shared, managed environment for data producers and consumers;
  • Corporate and non-corporate data source mash-ups;
  • Responsive delivery of data products with real-time data access; and
  • Bridged data environments across technology and business domains.

The company implemented its self-service data delivery environment using a data virtualization platform that provides capabilities such as governance and data cataloging. The implementation integrates data across 20 corporate repositories and delivers data to more than 1,000 users. It addresses use cases such as improved production completion design through multivariate analysis using virtual views, targeting significant per-well increases in production quantity and speed; greater, combined access to vendor subscription data for competitor intelligence; and a reduction in the ad valorem taxes included in financial analysis.

A major benefit of the new data virtualization-based architecture is the ability to separate data storage from data access, giving the company the flexibility to tap into new data technologies as they become available and to move data to different storage locations (on-premises, cloud, in-memory, Hadoop, etc.) while maintaining the same view of the data. It allows the company to span these different data environments, such as new NoSQL data stores and existing data repositories, while users continue to work in self-service mode.
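As a rough, hypothetical sketch of what separating storage from access looks like in code, the example below defines a common interface that consumers program against; the backing store, an on-premises SQL table here or a stand-in for a cloud or NoSQL store, can be swapped without changing the consuming logic. The class names, schema and records are invented for illustration.

```python
# Hypothetical sketch of separating data access from data storage: consumers
# program against WellSource, and the backing store can change underneath.
from abc import ABC, abstractmethod
import sqlite3

class WellSource(ABC):
    """Logical interface consumers program against, regardless of storage."""
    @abstractmethod
    def wells(self):
        ...

class OnPremSqlSource(WellSource):
    """Wells stored in an on-premises relational database (assumed schema)."""
    def __init__(self, db_path):
        self.db_path = db_path
    def wells(self):
        with sqlite3.connect(self.db_path) as conn:
            cur = conn.execute("SELECT well_id, basin FROM wells")
            return [{"well_id": w, "basin": b} for w, b in cur]

class CloudObjectSource(WellSource):
    """Stand-in for a cloud or NoSQL store, swapped in without touching consumers."""
    def __init__(self, records):
        self.records = records
    def wells(self):
        return list(self.records)

def well_count_by_basin(source: WellSource):
    """Consumer logic written once against the logical view."""
    counts = {}
    for well in source.wells():
        counts[well["basin"]] = counts.get(well["basin"], 0) + 1
    return counts

if __name__ == "__main__":
    demo = CloudObjectSource([{"well_id": "A1", "basin": "Permian"},
                              {"well_id": "B2", "basin": "Permian"}])
    print(well_count_by_basin(demo))   # same call would work with OnPremSqlSource
```

In a data virtualization platform, this separation is handled by the platform’s source connectors and logical views rather than by application code.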

Oil and gas companies are quickly discovering how to do more with less and control their rates of expansion in the recovery by placing more focus on optimization and squeezing every ounce of efficiency from the assets they possess. With the right technology, companies can integrate critical data from multiple, relatively closed-off systems that would otherwise remain virtually inaccessible across the organization.

As a result, decision makers in the oil and gas industry can gain access to a singular, holistic and integrated view of data across the entire enterprise to retrieve actionable insight. After all, without this view, it is increasingly challenging for oil and gas professionals to make informed decisions based on a total and complete assessment of a company’s data and energy assets.

Lakshmi Randall is director of product marketing at Denodo, which offers leading data virtualization software solutions. Previously, she was a research director at Gartner covering data warehousing, data integration, big data, information management, and analytics practices. Randall holds a B.S. in mathematics and an M.S. in computer science from the University of Madras, and an M.B.A. from the University of California, Irvine.

For other great articles about exploration, drilling, completions and production, subscribe to The American Oil & Gas Reporter and bookmark www.aogr.com.