Conoco cuts spiraling data storage/back-up costs
Faced with exponential data growth, especially in deepwater seismic, the company required a cost-effective method to integrate all of its project data in a way that would be largely transparent to users.

R. Holt Ardrey, Conoco Deepwater Exploration, Lafayette, Louisiana; and Bruce Sketchley, Panther Software Corp., Houston, Texas

This article tells how Conoco, faced with explosive data growth and spiraling hard-disk costs, found a long-term solution to its problems. The company first defined its needs, both present and future, through a formalized process that resulted in a set of criteria. Then it sought out available vendors that could meet those criteria. The chosen solution was centered on a robotic tape system for nearline (<2-min. access) mass-data storage. It was assembled and tested off site prior to installation to ensure compatibility and minimize installation downtime.

Background And Needs

In 1995, Conoco, Inc. established a goal of doubling the company's size by 2003. An essential element of the strategy developed to meet this goal was a successful exploration program. After evaluating exploration opportunities around the world, the company recognized that the deepwater area of the Gulf of Mexico had significant potential and favorable economic and regulatory environments for oil and gas development. As a result, the deepwater GOM became a key element of Conoco's worldwide exploration effort. Since 1995, Conoco has acquired interests in 245 OCS blocks in the deepwater GOM and has recently embarked on an aggressive drilling program using its state-of-the-art drillship, the Deepwater Pathfinder. To properly evaluate various areas for acreage acquisition, in its ongoing efforts to develop a variety of plays and prospects, the company acquired over 13,000 mi² of 3-D seismic data during the past three years.
This represents two terabytes (TB) of data that have been received and loaded into the Landmark interpretation environment at Conoco's Lafayette, Louisiana, office. The company recognized the value of new processing algorithms and added data volumes to the interpretation environment by loading multiple processing runs over the same area. The combination of newly acquired and reprocessed data has resulted in exponential data growth, a trend expected to continue for three to five years.

Evaluation Process And Criteria

Faced with rapidly increasing data-storage costs, Conoco defined a formal process for evaluating technologies for an integrated solution to manage its volume of seismic data. The steps to prepare for, and initiate review of, industry offerings were as follows:
- An initial overview of currently available technology for seismic data management was undertaken. The information gathered was then used to help refine the definition of requirements for GOM Deepwater and North American operations. Understanding what was available commercially helped in defining a realistic set of expectations for any selected offering.
- Using the established capabilities of the solutions that vendors were promoting, another review of company requirements was performed; available benchmarks of functionality were used to develop selection criteria.

As a result of these reviews, Conoco established that it had sufficient capacity in the disk-management area, with 2.2 TB of online storage available on an existing data server. However, it needed cost-control methods and technology to alleviate the spiraling growth of its hard-disk environment. This required some form of nearline tape storage to augment hard-disk capacities. A key issue in implementing this type of system would be the ability to integrate the nearline and online environments efficiently, so that the system would be largely transparent to interpreters. Further, it was recognized that the system would need to address Conoco's needs in several areas, including: disaster recovery and back-up operations, knowledge management through project archiving, seismic project management and disk-space management. In addition, there was a requirement to manage seismic data flow throughout its working lifecycle. This meant eventually integrating unformatted SEG-Y files with the interpretation applications utilized by Conoco within the project environment. It also meant making the files available for pre-stack processing and analysis.
This was viewed as a longer-term requirement, as opposed to the immediate needs of avoiding additional online disk-storage purchases and backing up large data volumes without impacting system availability to users. The next step was to issue a request for proposal to all vendors identified as having possible solutions to Conoco's needs. From the responses, a list of vendors was compiled; some were subsequently invited to demonstrate their full-solution capability in further detail. Each vendor made a presentation to a representative cross-section of the geoscience / data-management staff who would ultimately be affected by system selection. The final step was to select the solution that best met the immediate and longer-term criteria previously established. This included prioritizing system requirements and ranking each vendor's proposed solution against the prioritized criteria matrix. At the end of this process, Conoco selected a solution from a five-company consortium known as PetaStar, Fig. 1. This is a turnkey-systems solution that combines data-management and high-performance data-storage technologies into an integrated offering that fit the company's scalability needs. Nearline mass storage is combined with online disk to offer a "virtual disk" environment for post-stack SEG-Y, SeisWorks project data and pre-stack seismic files.

System Benefits

Conoco found that the advantages offered by the selected system matched the primary improvement areas it wanted to address. These areas are summarized as follows.

Disk-space management. This was a primary goal for Conoco, since it wanted to reduce future per-unit seismic-storage costs. The ideal solution would be an integrated combination of disk and tape storage that would provide optimum data-storage efficiency for both user access and cost.
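The disk-versus-tape trade-off behind this goal can be illustrated with a back-of-envelope calculation. The per-gigabyte prices and capacity figures below are assumed round numbers for illustration only, not Conoco's actual 1999 costs; the point is simply that keeping a small active working set on online disk and the bulk of the seismic volume on nearline tape cuts total media cost sharply.

```python
# Hypothetical illustration of disk-only vs. disk-plus-nearline-tape cost.
# All prices and volumes are assumed values, NOT figures from the article.

DISK_COST_PER_GB = 50.0   # assumed online hard-disk cost, $/GB
TAPE_COST_PER_GB = 5.0    # assumed nearline robotic-tape cost, $/GB

def storage_cost(online_gb, nearline_gb=0.0):
    """Total media cost of a two-tier (online disk + nearline tape) layout."""
    return online_gb * DISK_COST_PER_GB + nearline_gb * TAPE_COST_PER_GB

total_gb = 10_000.0       # projected seismic volume after several years of growth
working_set_gb = 2_200.0  # active projects kept on online disk (2.2 TB)

all_disk = storage_cost(total_gb)                                  # everything on disk
tiered = storage_cost(working_set_gb, total_gb - working_set_gb)   # bulk on tape

print(f"all-disk: ${all_disk:,.0f}")       # $500,000
print(f"tiered:   ${tiered:,.0f}")         # $149,000
print(f"savings:  ${all_disk - tiered:,.0f}")
```

Even with these rough numbers, the tiered layout costs well under a third of an all-disk layout; the actual savings curve in Fig. 2 was derived from the company's real 1999 media costs.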
Analysis of the trend of seismic-data growth and current media costs indicated that significant data-storage savings could be achieved with the selected system, Fig. 2. This analysis was done on a "complete solution" basis using 1999 costs and does not take into account future advances in media technology. In addition to cost and user-accessibility criteria, the company also recognized that media technology was evolving rapidly, and it wanted to ensure that the selected system could accommodate newer, larger-capacity media types as they become available. While the system's modular architecture addressed the media-upgrade issue, Conoco's scalability requirement was satisfied so that the system could accommodate both the anticipated data growth in its GOM Deepwater Business Unit and potential integration into its North American Business Unit.

Seismic project management. A high priority in system selection was management of interpretation projects; this was determined to be more important than managing preprocessed or raw SEG-Y data. The basic functionality requirements for seismic project management included: a map-based Graphical User Interface (GUI), seismic-data-version control, direct linkage to SeisWorks projects, integration with seismic data-loading tools, and the ability to store and present metadata related to seismic projects. The requirement for a map-based GUI was related to Conoco's desire to leverage the functionality of Geographical Information Systems (GIS) in the data management system. The company's data managers and users were familiar with ESRI's ArcView system and had built data viewing / analysis tools that incorporated it as a front end. In addition, Conoco noted that exploration software vendors were beginning to incorporate ArcView as a front end to data management systems, and there could be potential integration benefits to including a GIS application in the Seismic Data Management System (SDMS).
From a user's perspective, this front end would present a graphical view of all available seismic lines and surveys in the context of cultural overlays, showing lease-block outlines as well as geographical features. Conoco's criterion that the SDMS manage seismic-data-version control was founded on the growth of interpreter-based processing capabilities, as provided by application vendors.

Knowledge Management (KM). This was identified as a strategic realm of information management for Conoco. The selected system, with its incorporation of SDMS application software, was seen as providing valuable contributions in this area, especially through the potential integration it offers with other key KM tools Conoco is evaluating or utilizing. The company's KM strategy focused on preserving and cataloging value-added interpretation and analytical work. In the case of seismic interpretation, this translated into preservation of existing application-software projects, as well as capture / cataloging of descriptive project metadata in a master catalog or index. Coupled with a GIS front end, this would allow future users to quickly identify areas that had been previously worked and easily access their predecessors' interpretations, if relevance was determined through review of the metadata captured in the master catalog. The system provided this solution by managing projects from a spatial perspective and through metadata capture utilizing SDMS software. SDMS functionality was further enhanced through incorporation of PArchive software into the system, which provided a tool for capturing all relevant files and databases during the archiving process. Further enhancements to the KM aspect of the system are being added through use of software that allows annotations / documentation in non-Landmark file formats to be added to the archived project.

Disaster recovery / loss protection.
Conoco has been challenged in the past in its attempts to implement disaster back-up / recovery tools that both manage large volumes of data and do not impact system performance and availability. Further, the tools currently in use for disaster back-up were not well suited to project archiving for KM purposes. The selected system offered a set of key capabilities, including simultaneous creation of multiple data copies on different media types. The system also offered Unix file-system-level compatibility, essential for integration with the company's current environment. An added benefit was automated cataloging / managing of tape volumes via media labeling, which was easily integrated into Conoco's physical media management system. Lastly, the ability to integrate with other vendors' disk-storage devices was a unique benefit provided by the system.

Future development needs. As part of the integration requirements identified for Conoco's operational environment, the company was also looking for willingness by the solution provider to accommodate needs relating to future development directions. These focused on increased levels of integration with other essential data-management technologies and infrastructures. Examples of areas where the selected system held out promising benefits included:
- Commitment to working with other primary suppliers; this was a strong factor in choosing the system.

Project Implementation

It was recognized at the outset that both a detailed plan and full integration into Conoco's existing infrastructure would be critical to project success. During a several-week period, a project-management group established such a plan. The group, led by Conoco's geodata manager, comprised representatives of Computer Sciences Corporation (Conoco's systems and infrastructure outsource partner), Panther and Ovation. Considerable time was spent working with Conoco's existing technology providers to fully understand how the various system parts would be integrated. This work included setting up a test environment, identical to Conoco's production environment, at Ovation's Houston facility prior to project initiation. This allowed for system customization prior to delivery and minimized risk to Conoco's production environment during implementation. It was decided during the planning phase that many system aspects needed to be integrated into Conoco's ongoing technology-refresh program. Specific issues included coordinating system implementation with Conoco's plans for Y2K compliance, which involved upgrading its application and database environments, as well as its operating system. Conoco recognized that a key to optimizing performance of a geoscience interpretation environment was to establish a file-system architecture and data-management processes that take full advantage of system-hardware components and network design. In this case, file systems were reorganized prior to SDMS implementation to optimize system performance and enable programming of business rules for file migration. This front-end loading effort has kept implementation in accordance with the project plan from both cost and time perspectives. At press time, the system had been delivered and installed at Conoco's Lafayette facility without any significant problems.
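The "business rules for file migration" mentioned above are, in hierarchical-storage-management terms, policies that decide when a file leaves online disk for nearline tape. The article does not describe Conoco's actual rules; the sketch below uses hypothetical thresholds (migrate by file type once a file has gone unread for a set number of days) purely to show the shape of such a policy.

```python
# Sketch of an HSM-style migration rule: files not accessed recently become
# candidates for migration from online disk to nearline tape. The thresholds
# and file-type rules are hypothetical examples, not Conoco's actual policy.
import os
import time

AGE_THRESHOLD_DAYS = {
    ".sgy": 30,   # post-stack SEG-Y: migrate after 30 idle days (assumed)
    ".3dv": 90,   # project volumes: kept online longer (assumed)
}
DEFAULT_THRESHOLD_DAYS = 60

def should_migrate(path, now=None):
    """True if the file's last access is older than its type's threshold."""
    now = time.time() if now is None else now
    ext = os.path.splitext(path)[1].lower()
    threshold = AGE_THRESHOLD_DAYS.get(ext, DEFAULT_THRESHOLD_DAYS)
    idle_days = (now - os.stat(path).st_atime) / 86_400
    return idle_days > threshold

def migration_candidates(root):
    """Walk a project file system and list files eligible for nearline tape."""
    out = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if should_migrate(path):
                out.append(path)
    return out
```

In a real deployment, the policy engine would run against the reorganized file systems described above, with the "virtual disk" layer recalling migrated files from tape transparently when an interpreter next touches them.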
All system parts have been fully integrated and tested through proof-of-concept. Business rules for file management have been established and system performance evaluated. All online seismic projects have been scanned and incorporated into the SDMS database. Procedures for handling physical media have been established, documented and implemented. Future work includes migration of SDMS from an object database to a relational database and further integration of both the system and the GIS GUI into a master catalog.

The authors

R. Holt Ardrey is currently director of Information Management for Conoco's Deepwater Exploration effort, based in Lafayette, Louisiana. He earned a BA from Dartmouth College in 1976 and an MS in geology from the University of Michigan in 1978. Prior to joining Conoco in 1997, he held a number of E&P-related positions with Shell Oil, Union Texas Petroleum and ARCO, as well as a position with Landmark Graphics from 1995 through 1997.

Bruce Sketchley is vice president of sales for Panther Software, Houston, Texas. He joined Panther shortly after the company's inception in 1994 and moved to Houston from the Calgary, Canada, corporate head office in June 1999. His previous oil industry experience includes positions with Landmark Graphics and Petroleum Information (now part of IHS Energy Group).
Copyright © 1999 World Oil