{"Bibliographic":{"Title":"Frontiers in data storage, retrieval and display : proceedings of a marine geology and geophysics data workshop, November 5-7, 1980, Boulder, Colorado","Authors":"","Publication date":"1981","Publisher":""},"Administrative":{"Date created":"08-16-2023","Language":"English","Rights":"CC 0","Size":"0000391041"},"Pages":["H\nZ\n699.5\nG4\nG4\nF7\n980\n1980\nFRONTIERS IN DATA STORAGE, RETRIEVAL AND DISPLAY\nProceedings of a Marine Geology and Geophysics\nData Workshop\nNovember 5-7, 1980\nBoulder, Colorado\nNational Oceanic and Atmospheric Administration\nEnvironmental Data and Information Service\nNational Geophysical and Solar-Terrestrial Data Center\nOF\nCOMMUNITY\nBoulder, Colorado\nJUNE 1981\naverage\nSTATES\nOF","COVER\nDIGITALLY PROCESSED SONAR IMAGE OF A SUNKEN SHIP\nShort range (100 m) side-looking sonar images of a sunken navel vessel\nfrom the war of 1812 in Lake Ontario. These sonar images were acquired by the\nCanada Center for Inland Waters (CCIW)* in 1978. Ken McMillan of CCIW provided\nthese images to Peter Paluzzi of JPL for computer processing.\nThe image presented here is from a false color rendition of the original\nside- -looking sonar signals. To produce this image, analog sonar signals were\ndigitized and corrected for shading variations. The image was then corrected\nfor geometric distorion (slant range and ship speed).\n*Now called National Water Research Institute, Environment Canada.","INTERLIBRARY LOAN\nfrom\nGEORGIA AND ATMOSPHERIC\nNOAA.\nPRODUCT\nS OF\nNOAA Central Library\nRockville, MD\nDate due: : 6-13-91\n(unless recalled earlier)\nIL: 1739991\nYou can request renewals on\nOCLC (OCLC code: OLA) or by\nphone-- (301)443-8330 (ask for ILL)\n.\nPlease return material to:\nNOAA Library, E/OC4\nInterlibrary Loan\n6009 Executive Blvd.\nRockville, MD 20852","Z\n699.5\nAND NOAA ATMOSPHERIC\nG4F7\nU.S. 
DEPARTMENT OF COMMERCE\nMalcolm Baldrige, Secretary\nNATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION\nENVIRONMENTAL DATA AND INFORMATION SERVICE\nThomas D. Potter, Director\nFRONTIERS IN DATA STORAGE, RETRIEVAL AND DISPLAY\nProceedings of a Marine Geology and Geophysics\nData Workshop\nNovember 5-7, 1980\nBoulder, Colorado\nSponsored by\nThe National Geophysical & Solar-Terrestrial Data Center\nof NOAA with Funding Support from National Science Foundation/IDOE\nChairmen\nAllen M. Hittelman, National Oceanic & Atmospheric Administration\nRoger L. Larson, University of Rhode Island\nEditor\nAllen M. Hittelman\nNational Geophysical and Solar-Terrestrial Data Center\nBoulder, Colorado\nJUNE 1981","FOREWORD\nThis is the third in a series of Marine Geology and\nGeophysics Data Management Workshops hosted by the Na-\ntional Geophysical and Solar-Terrestrial Data Center. At\nthe first workshop, \"Marine Geophysical Data Formats,\"\nwhich was held in 1977, we were encouraged by the marine\ncommunity to continue hosting workshops to discuss themes\nrelating to marine geological and geophysical data. The\nsecond workshop, \"Marine Geological Data Management,\" was\nheld in 1978.\nThese workshops help to strengthen the existing in-\nformal network of federal and university data collectors,\nmanagers, and users. This in turn has a direct benefi-\ncial impact on the ability of the nation to support the\ndata requirements for a broad range of exploration and\nresearch applications by industry, government, and\nacademia. Although this network is informal, voluntary,\nand operates with minimum funds, it has been quite suc-\ncessful in stimulating exchanges of vast quantities of\ndata and experiences.\nOne of the purposes of these workshops is to ex-\nplore creative ways of meeting the challenges facing the\nmarine data managers. 
Data are being collected in\ngreater quantities, more sophisticated techniques are\nbeing used, the numbers of users and applications are\ngrowing, better hardware and software systems for data\nhandling are becoming available, and there is a great\nresponsibility for applying these data to solve the\nenvironmental and resource problems of the nation.\nThe present workshop, \"Frontiers in Data Storage,\nRetrieval and Display,\" is not designed to develop speci-\nfic conclusions and recommendations, although indirectly\nthis may happen. The intent is to bring together\nexisting and potential participants in this informal net-\nwork and provide a forum for describing existing institu-\ntional systems and to present concepts for future\nsystems. The workshop topic was selected because of the\nawareness of many marine institutions that there must be\ncontinual improvement in capabilities and exchanges to\nsatisfy the expanding needs of the marine scientists,\nengineers, and planners who are trying to solve many\nproblems relating to the marine environment.\nHerbert Meyers\nChief, Solid Earth\nGeophysics Division\niii","PREFACE\nThe Marine Geology and Geophysics Branch of NGSDC is pleased to have hosted this workshop and to have welcomed\nparticipants to Boulder. We thank the office of the International Decade of Ocean Exploration of the National\nScience Foundation for their support.\nBy way of introduction, I would like to share with you how this workshop was developed. Initially, we had\nsome difficulty in deciding exactly what topics should be addressed and who should attend. We could identify the\ncontinuing need for a forum and mechanism by which the data managers at major institutions could share their\nexperiences, both good and bad, with each other. 
However, to limit the workshop to this small group would have\nbeen, we felt, much like assembling a group of evangelical ministers to speak to each other or \"preach to the\nconverted.\" Realizing that any narrowly selected group would suffer from the same difficulty, we hit upon the idea\nof bringing together several of the easily identified groups we at NGSDC deal with on a daily basis to discuss some\nas yet unidentified, mutually interesting set of topics. The groups we selected were academia (who were both\nsuppliers and users of data), government (who also were both users and suppliers of data) and a third group from\nindustry (who are users of data and a major source of resources to guarantee improvement of data bases at NGSDC).\nWe strongly suspected that specifics of the industrial need for data were poorly known by many within the other two\ngroups. It was felt by us that a healthy mix from the three groups could produce a lively workshop and had great\npotential for highlighting a concept we believed to be of paramount importance -- all data management is a service\nactivity and the needs of data users should dominate in almost all decisions relative to data management.\nWe felt that this workshop would present a unique opportunity for several institutions who had not yet gotten\ntheir data management act together. From our perspective there seemed to be some danger that they might choose\ndata management schemes that ignored lessons learned painfully elsewhere, including NGSDC. All of us want to avoid\ncostly mistakes and profit from the experience of others. Thus we came to focus on the techniques of data manage-\nment rather than the data themselves, their precision or any other characteristics. 
Although it might appear that\nwe restricted the discussion to marine geophysical data, we did not.\nAt this point it became necessary to pin down who the essential attendees were and to identify amongst others\nwho might want to attend, those who could either benefit or contribute the most to a successful workshop. We felt\nwe had an obligation to use our limited travel dollars to ensure the presence of data managers from the major\noceanographic institutions. They have the most limited access to travel money of any group we deal with, yet they\nhave the greatest need, from our point of view, to share their experiences.\nAs a part of choosing who should come to a workshop one is always confronted with difficult decisions.\nMany\n\"first-line\" scientists and scientist-managers badly need to hear (and heed) the discussions at our workshops but\nfrequently they have very little to contribute to advancing techniques of data management. It is ironical that\nthose who can profit the most from these discussions will often ignore data management and leave it to more junior\npersonnel. (Fortunately there are exceptions!) Data management is frequently relegated to a minor role at insti-\ntutions where senior scientists (or funding agencies) do not openly support it. This suggested to us that somehow\nwe had to twist a few arms to ensure senior scientist and funding agency participation.\nTo avoid the impression that NGSDC was dominating the workshop and to share the workload, we decided to have\nco-chairmen (an NGSDC representative and an \"outsider.\"). Somehow I hit on the idea of Roger Larson. I knew him\nto be loud enough to control the mob if need be, and I had recently learned that he was leaving Lamont to take the\nsenior position in marine geophysics at Rhode Island, where, whether he knew it or not, he would need all that he\ncould learn at our workshop. 
In fact, he would have a unique opportunity to establish data management at Rhode\nIsland using a fresh approach based on the latest ideas presented at this workshop. He is a \"first-line\" scientist\nwho could, by his personal reputation, ensure broader participation and lend greater credence to our workshop. My\npersonal thanks go to Roger for agreeing to help.\nNGSDC representation was not too difficult. Because I wanted to participate from the floor rather than the\nchair, had chaired our last workshop and felt strongly that someone closer to the \"nuts and bolts\" of data manage-\nment was needed, I tagged Al Hittelman for the job. Having advocated this type of workshop for years, he jumped at\nthe chance. We all felt comfortable with Al as our choice because he had been a co-chairman of our first workshop,\nwas one of the principal authors of the \"MGD77 format,\" a term that will be clearly embedded in the proceedings of\nthis workshop, and was one of our more enthusiastic participants in other conferences. He is uniquely qualified,\nbased on the thoroughness of his technical knowledge of the tools of data management employed at NGSDC and\nelsewhere.\nWe felt that the format of the workshop should be flexible enough to allow for several mechanisms of exchange\nof ideas. We scheduled formal presentations followed by ample opportunity for discussion and questions. For those\nwhose ideas, projects or products were better suited for informal presentations, we scheduled poster sessions\naround coffee breaks. We also scheduled individual appointments and group discussions when the need arose. Tours,\ndiscussions and demonstrations of NGSDC capabilities were important parts of the workshop. 
Finally, we concluded\nthe workshop with a series of invited and volunteered reviews or reflections upon the results of the workshop.\nI hope this helps you understand the unique make-up of the group assembled and allows you to share the enthu-\nsiasm felt by participants in this workshop.\nMichael S. Loughridge\nChief, Marine Geology & Geophysics Branch\nNational Geophysical and Solar-Terrestrial\nData Center\niv","CONTENTS\nPage\nForeword -- Herbert Meyers iii\nPreface -- Michael S. Loughridge iv\nIntroduction -- Allen M. Hittelman vii\nEDIS Systems Development -- Larry D. Schultz 1\nStrategies to Develop and Access Large Sets of Scientific Data -- Roy L. Jenne 5\nMarine Data Management at NGSDC -- Allen M. Hittelman 13\nSamples of GEODAS Products -- Dan R. Metzger 17\nInteractive Graphics for Retrieval, Display, and Editing of Marine Gravity Data -- Ken Fournier 28\nData Base and Interactive Graphics Techniques at CEAS -- Gerald S. Barton 38\nInteractive Graphics in a Computer-Saturated Environment -- Peter W. Sloss 41\nData Management at the Hawaii Institute of Geophysics -- Dale Erlandson ***\nOpportunities and Challenges in Data Storage, Retrieval, and Display at L-DGO in the 1980s -- Ellen M. Herron 45\nData Storage and Access at L-DGO -- Rose Anne Weissel and Thomas D. Aitken 47\nComputer Storage and Retrieval of Position-Dependent Data at Woods Hole Oceanographic Institution -- Robert C. Groman 49\nThe S.I.O. Sample Index -- Stuart M. Smith, Virginia W. Psaropulos, James S. Charters 53\nA System for Storage, Retrieval, and Display of Underway Geophysical Data -- Stuart M. Smith, Gregory Papadopoulos 60\nCurrent Systems Development for Management of Marine Geophysical Data at the Atlantic Geoscience Centre, Canada -- Gwynneth M. Martin 70\nAn Automated Index to USGS East Coast Marine Geological Data -- Valerie Paskevich, Joseph Moses Botbol 74\nMarine-Data Management at the Pacific-Arctic Branch of the Office of Marine Geology, U.S. Geological Survey -- Alan K. Cooper 75\nOffshore Data: Acquisition, Analysis and Release in the USGS Conservation Division -- Lucille Tamm ***\nBibliographic Databases: Present and Future -- Victoria R. Schneller 84\nThe ROSE Seismic Data Archive -- Jan Garmany 88\nManagement of Large Petroleum Data Bases -- Philip H. Stark 91\n*** - Only an oral presentation was given\nv","Contents (continued)\nPage\nThe Construction and Use of Large Multisource Integrated Data Sets -- G.W. Wecksung, S.H. Balogn, S.L. Boliver, K. Campbell, E. Fugelso, T.A. Weaver 92\nApplication of Computer Image Processing to Marine Geology -- Peter R. Paluzzi 93\nImaging Conventional Marine Geophysical Data - Caribbean Seismic Example -- David Handschumacher, et al. 100\nSEABEAM Studies of the Gorda Rift/Blanco Fracture Zone Intersections -- Alexander Malahoff 104\nAdvance in Sea Beam Data Treatment and Presentation -- C. Edy 113\nBathymetric Data Reduction Subsystem -- Gordon M. Frank, James P. Moran, John Lanier 121\nA United States Continental Margin Gridded Bathymetric Data Base -- Edward C. Escowitz, Roger W. Bowen, James M. Potbol 135\nWorkshop Reflections\nA Suggestion Regarding Our Marine Geology and Geophysics Data Center -- Thomas W.C. Hilde 138\nReflections -- Ellen M. Herron 139\nWorkshop Reflections -- Roy L. Jenne 140\nReflections -- Stuart M. Smith 141\nReflections on Frontiers in Data Storage, Retrieval, and Display -- Peter C. Belknap, Roger L. Larson 142\nReflections on Perceptions -- Peter K. Mackinnon 143\nAfterthoughts -- Michael S. Loughridge 144\nAppendix 1 -- Workshop Agenda 146\nAppendix 2 -- Workshop Participants 149\nvi","FRONTIERS IN DATA STORAGE, RETRIEVAL AND DISPLAY\nINTRODUCTION\nA retreat for marine data managers - this concept has intrigued me for a number of years, for\nseldom do data managers have an opportunity to meet and share their art. Yet, the need strongly\nexists. Within the marine community, novice data managers were consistently repeating errors which\ntheir more experienced colleagues had long since resolved. Furthermore, since all of our problems\nand applications are so common, it seems wasteful not to share strategies and perhaps even resources.\nIn this manner, the concept for a workshop - \"Frontiers in Data Storage, Retrieval and Display\" -\nevolved. A forum was established in which we, the participants and pioneers of our frontier, were\nable to share our thoughts in both formal and informal settings. Even during the presentation of\npapers, an atmosphere was created in which questions and dialogs abounded.\nHow do we store our data? Do we store it as a total duplication of what we collected, do we\nstore only what is necessary to recreate the collection, or do we store only a subset? Should we\ncompact our data using blank suppression, bit compression, or interpolation techniques? What are the\ntrade offs? These apparently trivial questions frequently have major resource implications and an\nanalysis must include a review of: (a) software and hardware resources, (b) requirements of the data\nbase, and (c) our funding environment.\nPrior to establishing a file structure, one must determine requirements for access. Should\nthe data reside, either permanently or temporarily, on tape, disk or a mass storage device? What are\nthe trade offs and overheads of a data base management system (DBMS) as compared with a dedicated\nmanagement system? 
Turnaround time must also be considered, for costs of on-line systems, which\npermit instant access, are always more expensive than off-line systems. How can my system have the\nadvantages of both?\nIn developing an access system there are pitfalls to avoid. Ideally, one wishes to develop\na system that can be maintained -- a system that is well documented, well structured, modular, and\nflexible enough to account for changing technologies. How do we avoid these pitfalls?\nIn the final judgment, a system is only as good as the information it presents to us.\nHow can we structure our inventories and our data display packages to optimize our requirements for\ninformation? What can current technologies, such as graphic terminals and computer output microfilm\n(COM), do for us?\nWithin the pages of these proceedings, essays explore all of these issues. However, these\ntreatises are but an illusion of the total treasure of knowledge that the participants shared at the\nworkshop.\nYes, we are at the frontier of our art and our future holds many opportunities. As networking\nbecomes more efficient, there may be less of a need to duplicate major data bases at many research\norganizations and, someday perhaps, expensive systems may be shared by many within the framework of\n\"data utilization centers\".\nAllen M. Hittelman\nWorkshop Co-Chairman\nvii","EDIS SYSTEMS DEVELOPMENT\nLarry D. Schultz\nNational Geophysical and Solar-Terrestrial Data Center\nINTRODUCTION\nThe mission of the Environmental Data and Information Service (EDIS) is to manage scientific data and infor-\nmation in the marine, solid earth, atmospheric, solar, and aeronomy environment. The range of data is from the\ninterior of the Earth to the atmosphere of the sun, and beyond. The users of the data are commercial, industrial,\nscientific, and engineering firms. 
We also service the general public, educational institutions, and local,\nState, and other Federal agencies.\nPRESENT SYSTEM\nHistorically, EDIS came together through numerous reorganizations from widely varying disciplines. This evo-\nlution is reflected in the way that EDIS's computing systems currently are structured (see Figure 1). In this\nfigure are shown the three major data centers of EDIS. The National Climatic Center (NCC) is located in\nAsheville, N.C., and is primarily concerned with atmospheric data bases. The National Oceanographic Data Center\n(NODC) is in Washington, D.C., and is associated with oceanographic data bases. The National Geophysical and Solar-\nTerrestrial Data Center (NGSDC) is located in Boulder, Colo., and works in the solid earth and solar physics data areas.\nEach of these major centers uses different computer facilities for their respective missions. The NCC has\nits own facility on its premises, consisting of a Univac 1100/10 mainframe computer and associated peripherals.\nThe NODC uses the NOAA OMCS mainframe system which is a Univac 1100/42 located in the D.C. area. The NGSDC uses\nthe NOAA ERL computer facility located in Boulder. It is a CDC Cyber 750 mainframe.\nThe three centers each work autonomously and each does its own separate mission within these separate com-\nputing environments. As a consequence, there are no cross-EDIS standardizations and data management sharing\nactivities. This means that personnel at one center cannot help customers that may wish additional data from the\nother centers. A customer must go to each center separately if more than one data need is to be satisfied.\nDAMUS-I\nIn the summer of 1978 EDIS management began plans to integrate and consolidate the computing environments of\nEDIS. The concept of a Data Archive Management and User Services (DAMUS) system was developed. This system is\nillustrated in Figure 2. 
The figure shows the three major centers interfacing into a single computer facility and\naccessing directly all of the major data bases. The facility will be located in Asheville, N.C., at the NCC. It\nis planned to be operational in the fall of 1981. The data bases will be consolidated but not integrated in this\ninitial version (hereafter called DAMUS-I).\nA data dictionary will reside over the data bases, and a data base management system, DMS 1100, will handle\nmany of the data management chores.\nThe hardware system of DAMUS-I is shown in figure 3. Plans call for the up-grade of the present NCC computer\nto a Univac 1100/12 (2 CPU's), more memory, disk, and more I/O channels. In addition, a mass storage device\n(discussed later) is also being procured. The system will also have a microfilm unit, tape drives, and other\nperipherals. In the drawing, existing modules are shown as solid lines, and planned additions to the present NCC\nsystem are shown as dashed lines.\nThe NODC and NGSDC centers will communicate via high-speed (9600 bps) lines to the NCC location. These lines\nwill have multiplexers on them and will accept a variety of RJE and other terminals (see figure 4). The DAMUS-I\nsystem calls for a total of four such dedicated lines.\nThe mass storage system (MSS), pictured in figure 5, uses the IBM 3850 device. A minicomputer acts as an\ninterface to the storage unit, and a disk controller is provided by the vendor. From the Univac viewpoint, the\nMSS looks like a large slow disk. The system will have an initial capacity of 35 billion bytes and can be\nexpanded up to 256 billion bytes with add-on units.","DAMUS-II\nPlans are underway to develop DAMUS-II, the next generation system that will service the future needs of\nEDIS. This totally new system will become operational in early 1984 and is planned to be in place for 6 years. A\ncommercial vendor is now doing the requirements study for the system design of DAMUS-II.\nOTHER DEVELOPMENTS\nThere are other systems being developed within EDIS for the improvement of customer services. These include\na sophisticated graphics system which will improve and standardize the quality control of data at all of the\ncenters; a network of communicating word processing units which are used in inter-center communications; and a\npilot program of distributed access to the data bases via small microcomputer terminals, which will allow custo-\nmers direct access to the EDIS data bases.\nIn addition, there are several committees tasked specifically with long-term planning for future EDIS-wide\nsystems. These committees are planning systems that will come on-line in about 3 to 5 years. The scope of the\ncommittee work encompasses hardware design and acquisition, documentation standards for software systems, and\ntraining of EDIS ADP professional personnel.\n[Diagram: Existing system -- EDIS/NCC Univac 1100/10 (Asheville, N.C.), OMCS Univac 1100/42 (Washington, D.C.), and ERL CDC-6600 (Boulder, Colo.) with their respective data bases.]\nFigure 1. Present computer environment of the major EDIS Data Centers.\n2","[Diagram: Proposed DAMUS system, 1981-1984 -- EDIS/NGSDC, EDIS/NODC, and EDIS/NCC sharing a Univac 1100/12 for data management and user services.]\nFigure 2. Consolidated computer environment of the Data Archive Management and User Services (DAMUS) system.\n[Diagram: Proposed interim DAMUS system, Asheville, N.C. -- dual 1100/10 processors, memory, multi-mode access units, I/O channels, mass storage, and a communications processor.]\nFigure 3. DAMUS system hardware configuration (solid lines show existing subsystems, dashed lines show proposed up-grades).\n3","[Diagram: DAMUS system telecommunications -- multiplexed 9600 bps trunks linking demand and RJE terminals in Washington, D.C., and Boulder, Colo., to the CPU in Asheville, N.C.]\nFigure 4. Telecommunications facilities for computer networking between the EDIS Data Centers.\n[Diagram: DAMUS mass storage subsystem -- Univac 1100/12 host, PDP-11 MSS manager, CENCOM-2900 disk controller, IBM-3850 mass store (3.5 x 10^10 bytes, expandable to 25.6 x 10^10 bytes), staging and expansion disks.]\nFigure 5. DAMUS Mass Storage Subsystem (expandable to 256 billion bytes).\n4","15 November 1980\nStrategies to Develop and Access\nLarge Sets of Scientific Data\nby\nRoy L. Jenne\nNational Center for Atmospheric Research*\nMost of the scientific disciplines face similar problems in the preparation of data sets. Data are needed for\nboth relatively small-scale problems and for all the Earth. Some data sets have large enough volumes that they are\ndifficult and expensive to use. Therefore, careful attention must be given to the way the data are structured, to\nformats, and to data subsets to minimize problems where possible. The strategies for managing and accessing the\ndata will be considered. 
The costs for processing and archiving data are included to help clarify which hardware\nand processing strategies will reduce costs.\nThe Data Support Section at NCAR has over 100 different data sets, many with various subsets. They vary in\nvolume from one tape to several hundred. Our data are now on several thousand tapes, and some are on the mass\nstorage system. We have millions of analyses, each usually with 1,000 to 4,000 grid points. There are even more\nmillions of observations. One data set contains a total of about 30 million reports each year from 9,000 major\nweather stations around the world. Other thousands of tapes contain model output, radar data, etc.\nIn the fields of meteorology and physical oceanography, there are hundreds of tapes of data that we would like\nto be able to easily obtain from other organizations when necessary. Fortunately, it is usually rather easy to\nexchange data sets that have been properly prepared. The process of getting most national and global data sets\nprepared for easy access and exchange has been slow, but progress has been made. Data sets with very high volumes\ngive special problems. We are involved in discussions about some of the larger sets of satellite data held else-\nwhere (or not saved) so that appropriate data strategies can be developed.\nData Needs\nData needs usually cover a wide range of space and time scales. For local flash flood forecasts, we need\nradar or rain gage data each few kilometers, but for many problems, a sampling or average each 70 to 300 km is\nsufficient. For frost forecasts, or solar energy availability in a valley, detailed data may be necessary. For\nmost purposes the data doesn't need to be as dense. For most studies of climate-economic effects such as wheat\nyields, the exact location and size of each field isn't needed, but the total land area of wheat fields in a 100-km\nbox is necessary.\nBoth basic and derived data are necessary. 
The basic data are necessary to permit better analyses or dif-\nferent types of synthesis, but the results of credible data synthesis steps should be saved. Sometimes analyses\nare only saved in chart form. Usually a digital version should also be made available so that it can be used in\nmany ways without going through a costly digitizing step. All possible statistics don't have to be planned at the\nstart of a project; primary attention should be given to planning for data sets and subsets from which a large\nvariety of outputs could be easily produced.\nData Volume and Formats\nSome data sets have such high volumes that special attention must be given to compact formats and to data\nsubsets. Sometimes a laboratory with a 1,000- or 10,000-tape data set doesn't feel like it has a problem,\nbecause they have a good computer and sufficient staff to cope with the few data sets in which they specialize.\nHowever, other groups would like to obtain much of the information in the data without dedicating most of their\nresources and time to using it. And they need both basic data and the processed data. The main emphasis for com-\npaction and subsets should be given to those sets with volume more than about 100 tapes. When data are kept\non-line, compaction deserves even more emphasis because of the high cost of on-line storage.\nSome examples of high volume data are data from meteorological radars, gust probe data on aircraft, and many\nsatellites. The volume from geostationary satellites is shown in Table 1. The average data rate from the U.S.\nGOES satellite is 1 million bits/second. Note that saving all GOES data would take 70 tapes (6,250 BPI) per day\nwhich can be cut to 36 tapes without loss of information. This can be cut to 91 tapes per year by saving data only\neach 3 hours (instead of half hourly) and saving data each 9 km instead of 1 km. Most of the projects needing data\ncan be satisfied from this heavily sampled archive that has 0.0036 the original volume. 
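The sampling arithmetic quoted above is easy to verify. The sketch below is a modern illustration, not part of the original paper; the only figures taken from the text are the 70 tapes per day for the full archive and the 91 tapes per year for the sampled one.

```python
# Back-of-envelope check of the GOES archive figures quoted in the text:
# saving everything takes about 70 tapes (6,250 BPI) per day, while the
# sampled archive (each 3 hours, each 9 km) takes about 91 tapes per year.

full_tapes_per_year = 70 * 365      # ~25,550 tapes per year at the full rate
sampled_tapes_per_year = 91

ratio = sampled_tapes_per_year / full_tapes_per_year
print(round(ratio, 4))              # 0.0036 -- the fraction quoted in the text
```

The 0.0036 figure is thus the ratio of the sampled archive to the full daily archive, not to the 36-tape lossless version.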
Other planned archives are\nhistograms and averages for 250-km areas; these have very low volumes.\nSome factors concerning formats that deserve analysis are:\nAre all variables saved at the same time or space frequency even\nwhen it is not necessary?\nIs there more location and housekeeping data than observed data?\n*The National Center for Atmospheric Research is sponsored by the National Science Foundation.\n5","Does each data element use 6 to 60 bits when 4 to 16 might be enough?\nAre variable length reports saved in fixed length blocks?\nAre data written on tape in short unblocked records?\nAre too many data types integrated into one data stream?\nIs the volume of quality control information high compared to the data?\nDoes the grid mapping method have an excessive number of points?\nThe question often arises as to whether the primitive, first level archive can be thrown away. Sometimes this\nis decided by momentary economics or poor decisions which are regretted later. Usually archives are needed on\neither side of difficult or nonreversible data transformations but there are cases where the initial process is\nstraight-forward (though often messy) enough that the initial archive doesn't need to be kept. An example is the\nvoltages and instrument corrections from which radiosonde temperature versus height soundings of the atmosphere are\nderived.\nRelatively simple methods for using binary packing, variable length blocking, etc. (and using such data on\nother machines) are described in Jenne and Joseph (1974).\nWhen people become aware of data volume savings that can be obtained by applying binary packing techniques,\nthey sometimes go overboard and spend a lot of time and effort to squeeze out the last binary bit. 
This can lead\nto comments such as \"He (she) is more interested in saving the last bit (or microsecond) than getting the job\ndone.\" Thus, it is hoped that formats for both character and binary packed data be kept as simple as is consistent\nwith both hardware and programming efficiency.\nManagement of Large Scientific Data Sets\nIt is generally not reasonable or cost effective to try to provide access to scientific data sets with near\ninstant turn-around. A typical portion of a large data set is used very infrequently. Sometimes a researcher may\nobtain a copy of a large data set for use elsewhere, or make calculations from their whole set at NCAR. Then the\ndata may sit unused for many months. Instead of using all of a large set, many projects only require a portion of\nthe data such as:\nMost data for a few days, months, or years\nA selection of analyses for a time period\nData for only a few stations from a large data set having many\nstations in time-series order.\nThe use of model output data follows a similar pattern. A set of data is typically summarized soon after it\nis created and then it may sit idle for months before new questions arise.\nInstant turn-around is not necessary for these jobs. It is important to keep costs down so that useful data\njobs are not \"priced out of the market.\" If the storage of much of the data off-line helps to reduce costs (see\nTable 3), then this should be done. It must still be a relatively simple task to bring the data back on-line.\nWhen a data set is in active use, it may remain in use for several days.\nIn most cases a file management system is needed, not a full DBMS. Jenne and Joseph (1978) outline the con-\nsiderations regarding use of data pointer systems in data access. The access times for disks and mass stores are\nsuch that data block sizes must be large to obtain reasonable average data transfer rates. This means that full\nrandom access to small reports is not permitted by the hardware. 
The files therefore need to be structured so that most users see a file that is as nearly serial as possible.

Usually system functions have a high overhead. For example, at the National Climatic Center a test was run to load 16,000 daily weather observations, about 60 characters each. It took 8 hours on the DMS-1100 system on the Univac. Since there are 133,000,000 observations in this data set, it would take about 8 years to load it (1.8 seconds each, and even much worse if the data are not presorted). Since this is not acceptable, they want to experiment with System 2000. This system inverts down to the element; it is also understood to have heavy overhead, but hopefully it will give better timing. It has been noted that system timing may be acceptable on a small subset of data but become much worse as the volume increases. Experiences with some systems lead to comments such as "If we had used the approach of method Y or system X, we would need a CRAY (fast computer) just to keep up."

Costs for Processing and Archiving Data

The archives to support cloud studies will be approximately as in Table 2. They are included to give some feeling for the data volumes that can be associated with the cost information that will be presented. One 6,250 BPI tape holds 10^9 bits.

Table 3 shows the annual cost to archive data both on-line and off-line. Note that the on-line costs for disks or mass-storage systems become extremely high with large amounts of data. It also becomes expensive just to copy an archive once. Sometimes some of the copy costs can be reduced by using a minicomputer associated with a mass storage system, and thus not ascending the data to the large computer. It should also be noted that many data sets are small enough that they are inexpensive to manage, especially if they are usually off-line.

Table 4 shows costs for individual storage and processing components.
It is from Jenne (1980), which contains additional information on computer component costs, data rates, and mass storage data flow. The CPU cost is based on the CPU time needed to pass the data through the system without a significant number of special calculations. The costs of hardware maintenance, operations, and software support are largely not included in the on-line costs. The off-line costs don't include the costs of mounting tapes or keeping track of the archive.

To store large amounts of satellite data, the University of Wisconsin has adapted a standard Sony tape recorder (cost $25,000 each). Students change a tape every 6 hours. The yearly cost for saving data from one satellite (3.2 x 10^13 bits) is about $90,000. The cost with 6,250 BPI tapes would be $775,000, but then a minicomputer could reduce the volume, giving a cost of about $250,000. The Sony recorder has poorer error rates than a tape, probably one error in 10^6 bits. The recorder looks to the program just like a satellite when it is read. Tapes that are 5 or 6 years old are getting hard to read, especially if they have been used several times.

In figuring the costs of computing, it is good to consider capacity factors, as is done in the economic evaluation of production facilities. For example, a windmill that generates electricity may be rated at 2 kW but only operate at this rate 10 percent of the time. The cost of the delivered energy is the sum of the annual capital costs (including interest), the maintenance costs, and the operating costs, divided by the total kilowatt-hours produced per year. A similar analysis is needed for each major part of a computer system. On-line storage of seldom-used data represents a low capacity factor and thus high costs. A computer, disk channel, or tape drive that is seldom used usually has a high cost per unit of use.
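The capacity-factor arithmetic can be made concrete with a short sketch; the 2 kW rating and 10 percent figure come from the windmill example above, while the annual cost figure is an invented illustration:

```python
# Unit cost of delivered output = annual costs / annual output.
# The 2 kW rating and 10 percent capacity factor are from the text;
# the annual cost figure is an invented illustration.
RATED_KW = 2.0
CAPACITY_FACTOR = 0.10      # produces rated output 10 percent of the time
ANNUAL_COST = 400.0         # assumed capital (with interest) + maintenance
                            # + operating costs, dollars per year

kwh_per_year = RATED_KW * CAPACITY_FACTOR * 365 * 24
cost_per_kwh = ANNUAL_COST / kwh_per_year
print(f"{kwh_per_year:.0f} kWh/yr at ${cost_per_kwh:.2f}/kWh")
# At a 100 percent capacity factor the same equipment would deliver
# energy at one tenth the unit cost; the same division applies to a
# seldom-used disk channel or tape drive.
```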
Thus, we should avoid obtaining excess equipment or using expensive methods that aren't necessary, because the unit costs may become so high that many worthwhile projects are cut off.

The cost of developing software systems is often high. After a system is developed, changes are usually necessary. Then the person making the changes must determine where the changes should be made and what the adverse interactions might be. This is often hard even for the person who designed the system, and very difficult for others. This is one reason that so often a beautiful new system is designed to replace the poor, inadequate system that was the beautiful system of yesterday. And often this approach may be best because of the cost of understanding a previous system. But simpler, more modular approaches are often overlooked.

Table 5 lists some of the costs associated with getting a useful result from a computer.

References:

Jenne, R.L., and D.H. Joseph, 1974: Techniques for the processing, storage, and exchange of data. NCAR-TN/IA-93, National Center for Atmospheric Research, Boulder, Colorado, 46 pp.

Jenne, R.L., and D.H. Joseph, 1978: Management of atmospheric data. From NASA Conference Publ. 2055: Engineering and Scientific Data Management, 261 pp.

Jenne, Roy L., 1980: Data handling and its costs in large computing systems. National Center for Atmospheric Research, Boulder, Colorado (unpublished).

Table 1. Data from Geostationary Satellites. Note that saving all data from one GOES satellite would take 9,149 tapes/year (6,250 BPI) for visible data but only 25 tapes when the data are sampled every 8 km. The data volume includes several percent for overhead.
[Table 1 body: spin-scan rates, visible/IR/water-vapor resolutions, samples and lines per picture, and visible and IR bits per picture, per day, and per year for ESA Meteosat, Japan GMS, and US GOES; (14p) means 14 pictures/day.]

Note: A 1600 BPI tape holds 0.3 x 10^9 bits; a 6250 BPI tape holds 1.0 x 10^9 bits; a U. Wisc. recorder tape holds 22.3 x 10^9 bits.
Table 2. Summary of Basic Cloud Data Set Options.

                                                       Tapes now   Tapes (6250)
Old surface synop             ~1901-1965                    1414             70
Sfc synop from teletype       1965-1980             400 to 2500         90-135
Ship log data                 1850-on                      500+          30-50
Scanners
    NOAA SR grids (20 km)     Jan 73-Mar 78                2600             95
    TIROS 4 km (ea 8 km)      21 Oct 78-on              5800/yr        85/yr**
    NASA SR (N4, 5, 6, 7)     Apr 1970-on (breaks)      8700/yr          62/yr
    DMSP (6 km spot ea 12 km)                             40/yr           none
Geosynchronous
    Each US GOES (8 km)       FGGE on         all orig* (9530/yr)        91/yr
    ESA Meteosat (10 km)      FGGE on (break) all orig  (1728/yr)        44/yr
    GMS Japan (10 km)         FGGE on         all orig  (1826/yr)        55/yr
Sounders
    VTPR                      Nov 72-Jan 79                1130             40
    TIROS (all chan, 250 km)  Nov 78-on                   50/yr           9/yr
    TIROS (all spots, 4 chan) Nov 78-on                  610/yr           9/yr
Clouds
    3-D neph                  1971-1980                   11460            700
    (only do this now) 10 boxes  1978-80                    360             30

*The geosynchronous satellites now have much more volume than this.
**Only 2 channels.

Table 3. Data archive and copy costs. The full maintenance and people costs are not included in the archive costs. Note that a steady 1-megabit rate for a year is 3.16 x 10^13 bits per year.

    Bits     Tapes (6250 BPI)   Off-line cost/yr   On-line disk cost/yr   Copy archive once
    10^10                  10          $      20          $      5,000          $      340*
    10^11                 100                200                50,000               3,400
    10^12               1,000              2,000               500,000              34,000
    10^13              10,000             20,000             5,000,000             340,000
    10^14             100,000            200,000            50,000,000           3,400,000

*These costs would be about $730 using our TBM mass store, and ascending and descending the data. Costs using 1600 BPI tape are about 3 times the 6250 BPI costs. On-line mass storage costs are high.

Table 4. Data storage and processing costs. The cost is given for both on-line and off-line storage.
The costs generally assume a hardware lifetime of 5 to 6 years. The processing costs are optimistically low because inexpensive rates on a fast computer (CDC-7600) were used.

[Table 4 body: storage cost per year per 10^10 bits (on-line hardware with no maintenance, and off-line media and storage) plus I/O minutes, I/O cost, and CPU cost to process 10^10 bits, for 1600 BPI tapes, 6250 BPI tapes, an automated tape library (2,000 6250 BPI tapes), 300-Mbyte disk packs, 2,400-Mbyte disks, the TBM mass store, optical disk, and core-disk transfer; E denotes an estimate. *Tape costs assume relatively full tapes; if tapes average only 25 percent full, multiply by 4.]

Table 5. Costs associated with getting a useful result from a computer.

Applied Programmer Costs:
    Time to learn to use the computing system
    Time to write and check the program
    Time to run the program (portion of hardware and software costs)

Costs of Using the Hardware:
    Cost to buy it
    Cost to maintain hardware
    Cost to operate hardware: people, energy, etc.
    Floor space

Costs of Using the Software:
    Cost to develop the software system at the plant
    Cost to develop local software
    Cost to understand and maintain software

Next Level of Software Costs:
    Cost to document the system
    Cost to teach courses about the system
    Cost to write utility programs
    Cost to provide consulting help about the system

Cost of Managing the Hardware and Software System


Marine Data Management
at the
National Geophysical and Solar-Terrestrial Data Center

by
Allen M. Hittelman

INTRODUCTION

This paper examines principles of data management practiced at the National Geophysical and Solar-Terrestrial Data Center (NGSDC).
As an example of the implementation of these principles, the Marine Geophysical Data System (GEODAS) will be reviewed.

What is GEODAS? Essentially, it is a computer software system which (a) processes underway marine geophysical data (i.e., bathymetry, magnetics, and gravity) into a data base with a common format and (b) provides inventory and data retrieval capabilities.

GEODAS was originally conceived in 1977, following publication of the marine geophysical data exchange format known as MGD77 (Hittelman et al., 1977). This format superseded an earlier exchange format which was developed in 1972 by an ad hoc working group for the Ocean Science Committee of the National Academy of Sciences/National Research Council (Talwani et al., 1972). Advantages of the MGD77 format include greater data precision, expanded data content, and, most importantly, data documentation as an integral part of the data file. Both of these formats have been successfully used and have contributed to substantial improvements in data exchange and management practices.

THE 10 COMMANDMENTS OF A DATA CENTER

Some postulate that "the first myth of management is that it exists." Philosophy of data management is an elusive topic; ask a dozen colleagues for their perspective, and one would get a dozen different points of view. It is, therefore, with some hesitancy that I offer my concept of the "Ten Commandments of a Data Center":

1. THOU SHALT NOT LOSE DATA.
2. THOU SHALT KNOW WHAT DATA THOU HAST.
3. THOU SHALT ASSIMILATE DATA IN A TIMELY MANNER.
4. THOU SHALT ACCESS DATA IN A TIMELY MANNER.
5. THOU SHALT ENSURE SUFFICIENT DOCUMENTATION.
6. THOU SHALT ENCOURAGE AND IMPROVE THE UTILIZATION OF DATA.
7. THOU SHALT ENCOURAGE THE EXCHANGE OF DATA.
8. THOU SHALT PROVIDE A "ONE-STOP" SERVICE FOR EDIS* DATA.
9. THOU SHALT REFER INQUIRIES TO OTHER SOURCES, WHEN APPROPRIATE.
10. THOU SHALT RECOVER COSTS ASSOCIATED WITH USER SERVICES.

Prior to the implementation of GEODAS, the marine group of NGSDC often violated these laws. For example:

(a) Some data were lost. One such incident occurred when time-series data from Lamont-Doherty Geological Observatory, in the form of computer punched cards (about 40 boxes), were erroneously recycled instead of being delivered to the computer room.

(b) Inventories were nonexistent or always out of date. The last catalog for marine geophysical data, published in September 1978 (Hittelman et al., 1978), summarized the data base that existed on 31 December 1978.

(c) Assimilation (and subsequent dissemination) of data took years. Data were frequently poorly documented, submitted in nonstandard formats, and riddled with inconsistencies. NGSDC began to acquire a reputation as a "black hole": data go in but nothing comes out.

*EDIS - The Environmental Data and Information Service (EDIS) is NGSDC's parent organization. EDIS is a mainline component of the National Oceanic and Atmospheric Administration (NOAA) and includes four other data centers.

By nature, the NGSDC marine personnel were not an irreverent lot, and most of these shortcomings had valid reasons, such as moving from D.C. to Boulder, conversion to new computer systems, staffing turnovers and shortages, etc.
However, the final result was that responsive services were not being provided by the national data center to the scientific community. Fortunately, these problems are historic, as the GEODAS system has resolved most of them.

The GEODAS System

GEODAS is designed for the computerized storage and retrieval of underway marine geophysical data and information (see fig. 1). Specific disciplines included in this data base system are bathymetry, magnetics, and gravity.

Data entries into this system have been in the "merge-merge" format (Talwani et al., 1972) and the MGD77 format (Hittelman et al., 1977) for historic and recent data, respectively. Documentation, which is described in the MGD77 format, is entered either as an integral part of the contributor's digital data tape (e.g., the header portion of the data format) or as a file created from coded documentation sheets (supplied to contributors in pad form).

Assimilation (see fig. 2) is initiated by entering the data into a conversion-inventory-quality control (CIQC) program. The first task of this program is to standardize all data and add these data to the master file tapes. Both merge-merge (M-M) and MGD77 data are converted into a "perfect MGD77," for experience has proven that rarely is a format precisely followed (e.g., plus signs may be in the wrong columns, or empty fields may be blank filled instead of 9s filled). Concurrent with the format conversion, an inventory file is created. This file is ultimately known as HEADIN and contains both header (the documentation portion of the data format) and inventory information. The inventory is essentially a condensed navigation file with cruise statistics. Quality control, in which spurious values are either deleted or flagged, is also performed.
For example, a ship cannot travel at 100 knots, nor can it be at 400° longitude.

The inventory file, HEADIN, documents the port-to-port operation and maintains records as to where the data file is located (i.e., tape and file location). In addition to the MGD77 header information, other summaries are maintained, such as the number of specific data records in each port-to-port operation. The most valuable information, however, is the synthesized navigation file. To generate this file, a vector analysis is performed on the original data in which turning points, parameter start and stop locations, and at least one point every 90 nautical miles are identified. The resolution of this file is sufficient to support geographical summaries (e.g., a trackline plot of gravity data).

[Fig. 1. GEODAS: Marine Geophysical Data System. Block diagram: data and documentation are assimilated into a data file and a documentation/inventory file, which support search and access functions and graphical, data, and inventory summary products.]

[Fig. 2. The Assimilation of Underway Marine Geophysical Data. Flow diagram: M-M or MGD77 data (or digitized analog data) and header documentation pass through CIQC, or through EDIT/QQC, into the master institution tapes, an area-cruise-select step, the HEADIN header/inventory file, and the area data files (Atlantic, Pacific, Indian, Multiple).]

To create HEADIN, an interactive process is sometimes required in which the location points are subject to an EDIT or quick-quality control (QQC) procedure. The QQC procedure is a subset of CIQC in which the creation of the navigation inventory is emphasized and point selection criteria are variable.

Inventory information documenting non-computer-compatible data may also be added to HEADIN. This process involves the manual determination of key geographical points and the subsequent digitization of this information. Additional documentation can always be added to the system.
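The point selection and sanity checks just described can be sketched as follows; the fix representation and threshold values are illustrative stand-ins for the adjustable CIQC/QQC defaults, not the actual GEODAS values:

```python
# Sketch of navigation-inventory point selection, after the CIQC/QQC
# description above.  A "fix" is (miles_from_previous, azimuth_deg,
# speed_kt); the thresholds are illustrative, not GEODAS's defaults.
MAX_SPEED_KT = 40.0      # flag fixes implying impossible ship speeds
MAX_GAP_NMI = 90.0       # keep at least one point every 90 n.mi.
TURN_ANGLE_DEG = 20.0    # heading change that marks a turning point

def select_inventory(fixes):
    """Return (kept indices, flagged indices) for a list of fixes."""
    kept, flagged = [0], []          # always keep the first fix
    miles_since_kept = 0.0
    for i in range(1, len(fixes)):
        miles, azimuth, speed = fixes[i]
        if speed > MAX_SPEED_KT:     # spurious value: flag, don't keep
            flagged.append(i)
            continue
        miles_since_kept += miles
        turned = abs(azimuth - fixes[i - 1][1]) > TURN_ANGLE_DEG
        if turned or miles_since_kept >= MAX_GAP_NMI:
            kept.append(i)
            miles_since_kept = 0.0
    if kept[-1] != len(fixes) - 1:
        kept.append(len(fixes) - 1)  # always keep the last fix
    return kept, flagged
```

Raising the speed, angle, and gap thresholds keeps fewer points, which is the kind of rerun the CIQC printout statistics are used to decide on.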
Such additions are frequently required, since documentation is continuously trickling into the Data Center. The system has been designed to be interactive, and contributors can access it remotely.

HEADIN is a word-addressable, random-access disk file. In October 1980, it contained 12.5 million characters describing 1,350 port-to-port operations.

The master data file contains the MGD77 data records. The preliminary form of this file, which is maintained as a backup system, is physically ordered by institution. For example, when the Hawaii Institute of Geophysics (HIG) provides us with their annual data submission, these data will be added to the end of the last file tape designated as HIG data. The data files that are routinely accessed are geographically ordered. The master institution tapes are read by an "area-cruise-select" program that determines on which geographical tape file the entire port-to-port operation will be stored. There are four area files: Atlantic Ocean, Pacific Ocean, Indian Ocean, and Multiple Oceans. Linkage exists with HEADIN to identify data locations.

The master area data files are index sequential tape files. The tapes are 6,250 BPI, 5,120 characters/block, and internal binary CDC display code. In October 1980, the file contained 1 billion characters and 9 million records of underway data covering 5.5 million nautical miles and 1,350 port-to-port operations. These data are stored on three Atlantic Ocean tapes, five Pacific Ocean tapes, one Indian Ocean tape, and one multi-ocean tape.

The data search is performed by accessing HEADIN. Search criteria include:

1. MGG Number
2. Cruise Identification
3. Institution
4. Platform
5. Project
6. Funding
7. Area - Max & Min Lat. & Long.
8. Area - 10° Identification
9. Survey Dates
10. Date added to "HEADIN"
11. Parameters Surveyed:
    a. Bathymetry
    b. Magnetics
    c. Gravity
    d. Seismics: (1) single channel, (2) multi-channel, (3) high resolution, (4) deep penetration
    e. Side Scan Sonar
    f. Seismic Refraction

A "user friendly" interactive program has been designed to permit a remote user access to this inventory. Once a customized search has been completed, graphical and tabular summaries may be generated. At the same time, data selection criteria are stored in a temporary disk file which acts as a driver for a system access of the Master Area Data File. Information maintained includes selected MGG numbers, their addresses in HEADIN (which in turn locate the master file tapes), and quantity of data.

The system access, based on the selection criteria, reads the appropriate files in HEADIN and the Master Area Data File, and reconstructs an MGD77 format data tape, complete with both header and data records. Tabular summaries are simultaneously generated and included in the customer's data shipment.

CONCLUSIONS

GEODAS has substantially improved NGSDC's data management practices. The acquisition of data in a common digital magnetic tape exchange format has decreased the assimilation time to a routine rate of several weeks. Data are less susceptible to loss because they are quickly processed into both institution and area master files. GEODAS is a "living atlas," providing instantaneous and remote access to inventory information. Using computer-efficient index sequential files, data are routinely retrieved in less than one day. Documentation, an integral part of the data format, is automatically sent to the requester, thus encouraging and improving data utilization. The MGD77 format has encouraged the exchange of data and has been accepted as a national and international standard.

A long-standing policy of NGSDC's parent organization, the Environmental Data and Information Service (EDIS), is that its data centers should provide a one-stop service.
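The inventory search described above, which selects port-to-port operations falling in a requested area, can be sketched as a bounding-box query; the inventory records and field layout below are invented for illustration (only the HM7201 figures are taken from this paper's examples):

```python
# Sketch of a geographic inventory search in the spirit of the GEODAS
# SEARCH step.  Record layout and field names are hypothetical; the
# second record is entirely invented.
INVENTORY = [
    # (mgg_num, cruise, lat_min, lat_max, lon_min, lon_max, nav_miles)
    ("56070006", "HM7201", 41.0, 44.0, 138.0, 141.0, 3303.1),
    ("00000000", "XX0000", -10.0, 5.0, -140.0, -120.0, 2100.0),
]

def search(lat_s, lat_n, lon_w, lon_e):
    """Return inventory records whose bounding box overlaps the area."""
    hits = []
    for rec in INVENTORY:
        _, _, lat_min, lat_max, lon_min, lon_max, _ = rec
        if (lat_max >= lat_s and lat_min <= lat_n
                and lon_max >= lon_w and lon_min <= lon_e):
            hits.append(rec)
    return hits

# A Sea of Japan request, roughly the area of the SEARCH example below:
for rec in search(35.5, 47.0, 128.0, 142.0):
    print(rec[0], rec[1], f"{rec[6]:.1f} n.mi.")
```

A real HEADIN search matches many more criteria (institution, dates, parameters surveyed), but the bounding-box overlap test is the core of a region-of-interest query.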
The ease of access to GEODAS facilitates direct entry into our system, thus expediting requests for coverage in a certain region of interest (such as "What data do you have in the Bermuda Triangle?").

The inventory portion of GEODAS was designed to permit future expansion which would support referrals. For example, an international inventory format known as IGGCI (International Geological and Geophysical Cruise Inventory) provides trackline inventory information. Such information could be incorporated into HEADIN to alert customers to the availability of data from other countries in an area of interest. Thus, a valuable referral service could be provided.

Another goal of NGSDC is to recover costs associated with user services. This is difficult because many customers are also contributors and therefore entitled to "an equivalent amount" of free services. The GEODAS pricing algorithm accounts for many user service costs, including (a) the assimilation of the data, which is amortized over 15 customers, (b) the cost of providing free services to approximately one fourth of the users, (c) the cost of providing free inventory searches, and, of course, (d) the cost of accessing GEODAS.

In the future, the GEODAS inventory system will be available to all customers as a remote access system. In the next several years, we hope to provide remote access to the data file as well. NGSDC also plans to expand HEADIN to act as an international referral system.

ACKNOWLEDGMENT

The author acknowledges those who have made this paper possible. Primarily, I thank those who have contributed to the GEODAS system. Collaborating with me on the initial plans for the GEODAS system were Christopher Lawrence, Dan Metzger, and Dave Clark. Coding of the "CIQC" module was performed by Christopher Lawrence and Samuel Pool. The design and coding of the "Master Files," "HEADIN," and "Search" were performed by Dan Metzger.
System Access programs were designed and coded by Dan Metzger and Peter Sloss.

References

Hittelman, A.M., R.C. Groman, R.T. Haworth, T.L. Holcombe, G. McHendrie, and S.M. Smith, 1977: The Marine Geophysical Data Exchange Format - "MGD77", KGRD 10, 18 pp.

Hittelman, A.M., C. Lawrence, N.R. Holmberg, and D. Metzger, 1978: Summary of Digital Marine Geophysical Data Holdings, KGRD 11, 210 pp.

Talwani, M., P. Grim, T. Holcombe, B. Luyendyk, H. Meyers, and S. Smith, 1972: Formats for Marine Geophysical Data Exchange, National Academy of Sciences, 19 pp.


SAMPLES OF GEODAS PRODUCTS

Dan R. Metzger
National Geophysical and Solar-Terrestrial Data Center, NOAA
Boulder, Colorado

ABSTRACT

GEODAS is a digital data base management system for the storage, maintenance, and retrieval of marine underway geophysical data. It can be thought of as having two sets of software. One set is for use by a data base manager to maintain the data base. The other set of software is "user friendly" and is designed for those who are not familiar with the system, allowing them to acquire information about the data and to retrieve it. This second set of users includes non-ADP personnel both at NGSDC and outside the center (via remote access). The purpose of this paper is to give a few examples of assimilation and access of GEODAS.

LIST OF FIGURES

Figure 1. MGG Status Report
Figure 2. Conversion, Inventory, Quality Control (CIQC) Program
Figure 3. SEARCH program
Figure 4. Composite trackline search plot
Figure 5. A zooming-in feature of the trackline plot program
Figure 6. An individual cruise plot
Figure 7. An MGD77 header listing
Figure 8. Computer documentation from a data SELECT run
Figure 9.
SELECT batch run printout

[Figure 1 printout: MGG status report listing, one line per cruise leg, giving the MGG number, date processed, cruise/leg identifier, start and end dates, project and funding codes, assimilation status codes, and comments.]

Figure 1. The MGG Status Report shows the status of every cruise leg of geophysical data at MGG. The MGG NUMBER that NGSDC assigns to each port-to-port operation is its primary tag. The status report keeps track of the assimilation of each port-to-port operation. Codes are used to monitor the conversion to MGD77 format, MGD77 Header information, addition to the Navigation Inventory,
etc.

[Figure 2 printout: CIQC inventory listing for cruise HM7201 (MGG number 56070006), giving for each inventory point the time, latitude, longitude, data-presence codes, area number, leg mileage, speed, and azimuth, followed by cruise totals (1,519 navigation, bathymetry, and gravity points over 31,965 n.mi.) and the statistics: percentage reduction 18.17, inventory records per hour 0.33, inventory records per mile 0.086.]

Figure 2. The CIQC (conversion, inventory, quality control) run printout provides information about the underway cruise data necessary to its assimilation into GEODAS. Default values for speed, turning angles, sample rates, etc. determine when turning points and parameter starts and ends are to be used as inventory points. When speed exceeds the default, points are flagged on the printout (*) and in the MGD77 data records but are not included in the inventory. Comparison statistics show whether this cruise has an acceptable inventory point-to-data reduction. In the above example, the reduction is unacceptable, so the cruise will be run again, increasing the values of speed, angle, etc. above the defaults.


SEARCH FOR NEW GEOPHYSICAL DATA IN SEA OF JAPAN
KEY 99DJ REQUEST
81/02/02.
16.08.05.\nSEARCHING FOR DIGITAL DATA ONLY\n2 AREAS SELECTED\n47.000\n130.000\n142.000\n43.000\n43.000\n128.000\n140.500\n35.500\nSEARCHING ONLY FOR INFO ADDED SINCE 810101\nGEOPHYSICAL DATA SUMMARY IN NAUTICAL MILES\nJAPAN\nMGG NUM\nCRUISE/LEG\nNAVIGATION\nBATHYMETRY\nMAGNETICS\nGRAVITY\nSEISMICS SSCAN/REFRAC DIG REC COUNT\n56060001 HT6501\n1061.8\n1061.8\n0.0\n1061.8\n0.0\n0.0\n346\n0.0\n2639\n56070002\nHM6804\n4931.3\n4931.3\n0.0\n4931.3\n0.0\n56070003 HM7002\n5685.6\n5685.6\n0.0\n5685.6\n0.0\n0.0\n3276\n56070005 HM7103\n735.7\n735.7\n0.0\n735.7\n0.0\n0.0\n339\n56070006 HM7201\n3303.1\n3303.1\n0.0\n3303.1\n0.0\n0.0\n1514\n0.0\n1873\n56070007 HM7202\n3976.0\n3976.0\n0.0\n3976.0\n0.0\n56070009\nHM7401\n1925.5\n1925.5\n0.0\n1925.5\n0.0\n0.0\n1426\n56070010\nHM7503\n276.8\n276.8\n0.0\n276.8\n0.0\n0.0\n320\n0.0\n11733\nTOTALS\n21895.8\n21895.8\n0.0\n21895.8\n0.0\nFigure 3. The \"SEARCH\" is the primary method for acquiring information about the data.\nEach SEARCH is given an identification code (KEY) which is used by subsequent queries to\nGEODAS. The characters in this identification code identify the search as relating to a\nparticular data request (for Center use) or as belonging to a particular outside user.\n20","SEARCH FOR NEW GEOPHYSICAL DATA IN SEA OF JAPAN\n99DJ\n128°E\n133°E\n138°E\n0\nM.S.\nMERCATOR PROJECTION\n128°C\n133°C\n138°C\nA Composite trackline plot, using the preceding SEARCH, was plotted on a\nFigure 4.\nscope.* Maps may be quickly generated, using any of several projections, by\nTektronix\ncopying the image off of the scope.\nMention of a commercial product does not constitute endorsement.\n*\n21","GRAVITY\nSEARCH FOR NEW GEOPHYSICAL DATA IN SEA OF JAPAN\n99DJ\n136°E\n137°E\n-133°E\n134°E\n135°E\nMERCATOR PROJECTION\n136°C\n137°E\n134°E\n135°C\n133°E\nFigure 5. A zooming-in feature may be used to plot gravity tracks in a portion of\noriginal data.\n22","A\nFigure 6. 
Individual cruise plots can be generated on paper copy or microfilm for each cruise found by the SEARCH; these plots are generally run offline. Cruise plots can be generated immediately on a Tektronix scope by referring to their MGG NUMBER. Tracks for particular geophysical parameters can be studied. Zeroing in on portions of the cruise is available.

[Figure 7 reproduces the MGD77 header listings for cruises 56060001 HT6501 and 56070002 HM6804 (Hydrographic Dept., Japan Oceanographic Data Center, Tokyo): ship, observers, departure and arrival ports and dates, navigation (LORAN A), echo sounder (50 kHz, 15-degree beam width), proton magnetometer, TSSG gravimeter with base-station ties (Maizuru, Tokyo, Niigata, Akita), gravity reduction notes, and conversion from the Potsdam system to IGSN 71.]

Figure 7. A listing of the MGD77 Headers for cruises found by SEARCH can be generated by an interactive query program.

/CALL, GDSLCT
THIS PROGRAM CREATES A BATCH JOB WHICH WILL SELECT
THE DATA FOUND BY GDSRCH ONTO MAG TAPE(S)
ENTER 4 DIGIT JOB KEY   EX. 15JA
? 99DJ
ENTER YOUR BANNER (UP TO 10 CHARACTERS)
? ***DEMO***
DETERMINE OUTPUT TAPE CHARACTERISTICS
ENTER NUMBER OF TRACKS   MT 7 TRACK   NT 9 TRACK
? NT
ENTER DENSITY   HD 800 BPI   PE 1600 BPI
? HD
ENTER CONVERSION MODE   AS ASCII   EB EBCDIC
? EB
DO YOU WANT ONLY RECORDS WHICH FALL IN THE AREA(S) SPECIFIED IN THE SEARCH? Y/N
? Y
JOB WILL REQUIRE TAPES   R99DJ1
WILL NOW SAVE REQUEST JOB *** M99DJDK FOR LATER SUBMITTAL
REQUEST FILE M99DJDK ALREADY EXISTS
DO YOU WANT TO REPLACE IT WITH THIS NEW VERSION. Y/N
? Y
DO YOU HAVE REASON TO SAVE MGG NUMBERS FILE OF KEY 99DJ (SAY FOR TRAX)   ENTER Y/N
? Y
END PROGRAM
RFL (20000)

Figure 8. A SELECT program is used to create a batch runstream for selecting the MGD77 data according to the SEARCH criteria and generating a magnetic data tape for distribution. In the case of outside users a file is generated which indicates that a SELECT jobstream is ready for submitting. Center personnel will run the job and send the resulting tape and printout to the customer.

KEY 99DJ   GEOPHYSICAL DATA REQUEST
NATIONAL GEOPHYSICAL AND SOLAR-TERRESTRIAL DATA CENTER (NGSDC)
ENVIRONMENTAL DATA AND INFORMATION SERVICE (EDIS)

THIS PRINTOUT ACCOMPANIES DATA TAPE R99DJ1
THE DATA IS IN MGD77 FORMAT DESCRIBED IN KGRD 10 - THE MARINE GEOPHYSICAL DATA EXCHANGE FORMAT - MGD77
THE TAPE IS AS FOLLOWS: 9 TRACK ODD PARITY EBCDIC 800 BPI
EACH FILE ON THE TAPE CONTAINS ONE CRUISE (PORT TO PORT OPERATION) OF DATA. THE FIRST 1920 CHARACTERS OF EACH FILE ARE THE MGD77 HEADER FOR THAT CRUISE. THIS IS FOLLOWED BY THE DATA RECORDS WHICH ARE 120 CHARACTERS BLOCKED AT 16 RECORDS PER BLOCK (1920 CHARACTERS PER BLOCK)

FOR FURTHER INFORMATION CONCERNING THIS DATA CONTACT
DAN METZGER
NATIONAL GEOPHYSICAL AND SOLAR-TERRESTRIAL DATA CENTER
CODE 0-621, NOAA/EDIS
BOULDER, CO 80303
TELEPHONE: (303) 497-6542   FTS 320-6542

SELECT RECORDS ONLY IN THE FOLLOWING AREA(S)
TOP LAT   BOTTOM LAT   LEFT LONG   RIGHT LONG
 47.000     43.000      130.000     142.000
 43.000     35.500      128.000     140.500

TAPE R99DJ1 FILE 1 MGG NUMBER 56060001 CRUISE ID HT6501
[The MGD77 header for HT6501 is then repeated, as in Figure 7.]

Figure 9. The printout accompanying the MGD77 data tapes gives information about each cruise as well as a summary of each tape. The price algorithm results are also printed out for NGSDC personnel.
[Sample 120-character MGD77 data records for cruise HT6501: 347 RECORDS FOLLOWING 1920 CHARACTER MGD77 HEADER.]

SUMMARY OF TAPE R99DJ1
FILE   MGG NUM   CRUISE   RECORDS
 1.    56060001  HT6501       347
 2.    56070002  HM6804      2652
 3.    56070003  HM7002      3285
 4.    56070005  HM7103       341
 5.    56070006  HM7201      1519
 6.    56070007  HM7202      1876
 7.    56070009  HM7401      1430
 8.    56070010  HM7503       336
TAPE TOTALS                11786

Figure 9. (continued)

Interactive Graphics for Retrieval, Display, and Editing of Marine Gravity Data

Ken Fournier
Union Oil Research, P.O. Box 76, Brea, Calif.

ABSTRACT

Spatially distributed data, such as that contained in the Dept. of Defense Gravity file, can be readily processed by interactive computer graphics. Data in digital form can be accessed and displayed on graphics terminals for such graphic operations as data spotting and annotation, gridding and contouring, point deletions, etc. Much of the DMA file and other public gravity files have been merged by such graphic processing. Subsequent editing and mapping have provided composite gravity data sets ready for interpretation. Such data, once mapped, can then be readily retrieved in profile or map form for 2-D or 3-D interpretation. The following series of annotated figures illustrates the use of this technique for the Canadian Geological Survey open file reports off the east coast of Newfoundland.
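The screening step at the heart of this kind of editing, gridding the posted points and flagging those whose observed value departs most from the interpolated surface, can be sketched as follows. This is a simplified stand-in (inverse-distance weighting with leave-one-out residuals); the function names and threshold are illustrative, not the production algorithm used at Union Oil:

```python
def idw_estimate(points, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (x, y, value) triples."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0.0:
            return v                     # exactly on a data point
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

def flag_anomalous(points, threshold):
    """Flag points whose observed value departs from the surface interpolated
    through the remaining points by more than `threshold`; the interactive
    analogue is placing the crosshair cursor in the residual histogram."""
    flagged = []
    for i, (x, y, v) in enumerate(points):
        others = points[:i] + points[i + 1:]
        residual = v - idw_estimate(others, x, y)
        if abs(residual) > threshold:
            flagged.append((x, y, v, residual))
    return flagged
```

In the interactive system described below, the analogous threshold is set by positioning the crosshair cursor within the histogram of grid-versus-point differences.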
A step-by-step procedure is illustrated in these figures to accomplish deletion of data that is not mappable in its present form, and also to illustrate various options available in the computer system.

FIGURE 1. The hardware employed: the Tektronix 4014-1 (on left) and the IBM 3278 (on right) provide graphics and alphanumeric capability respectively. These two devices are "loosely" coupled by software to enable both graphic and alphanumeric full-screen interaction with the data retrieved.

FIGURE 2. Marine gravity and magnetic data points from data disclosed in the Canadian Geological Survey open file reports off the east coast of Newfoundland. The area enclosed in the dark rectangle will be displayed in detail on the following figures.

FIGURE 3. A map produced on the Versatec plotter run in the "batch," showing just gravity stations contoured from the area outlined on FIGURE 2. Note the anomalous character of the northeast/southwest lines.

[FIGURES 4 through 15 reproduce Tektronix screen displays: the posted map area, a legend of point counts by map number and plot symbol, the interactive menu (SELECTION PAGE, REPLOT, ZOOM IN, ZOOM OUT, CONTOUR INTERVAL, CONTOUR ELEVATION, BATCH MAP, SELECT ANOMALOUS POINTS, LABEL ANOMALOUS POINTS, LABEL ALL POINTS, LABEL SELECTED MAP, PLOT SYMBOL, COINCIDENT POINT, DELETE AREA, HARD COPY, EDIT START, EDIT END), and grid statistics (min and max grid values, average difference, standard deviation).]

FIGURE 4. We have retrieved data described by latitude/longitude bounds from the data base, in which gravity, magnetic, and bathymetric data have been stored within UTM zones. We use the graphic cursor to indicate the upper left and lower right parts of the displayed data to be further studied. In this case, the entire display.

FIGURE 5. The display of the retrieved data from that shown in FIGURE 4 is now gridded, and a histogram in the lower left corner summarizes the difference between the actual data points and the surface interpolated from the computed grid at each data point.

FIGURE 6. Using the range of interpolated grid values from FIGURE 5, a contour interval of two milligals is chosen and the data are contoured, showing the strong bias on the northeast/southwest line. A portion of this data is then selected for enlargement by selecting the zoom-in option with the graphic cursor.

FIGURE 7. The selected part of the FIGURE 6 data is posted and gridded, and a contour interval chosen for more detailed mapping.

FIGURE 8. The map of the data displayed in FIGURE 7 with a contour interval of two milligals. Again, a further detailed portion is chosen by the zoom-in option.

FIGURE 9. Shows the retrieved data within the zoom-in window; a histogram illustrating grid and data point mismatches is presented after gridding.

FIGURE 10. Anomalous points are highlighted (arrows) by choosing the "select anomalous points" option and positioning the vertical crosshair cursor at an appropriate point within the histogram.

FIGURE 11. Specific points are highlighted by using the coincident point option: identifying the location of coincident points and then positioning the crosshair cursor at an available point on the screen for display of some data point attributes. This illustrates two problems: 1) a duplication in some survey lines with 2-milligal differences, and 2) a mismatch between the northeast/southwest line and the east/west line of about 10 milligals.

FIGURE 12. Again, this detailed display may be contoured after clearing the screen.

FIGURE 13. Shows the display of all of the initial points from FIGURE 4 plus the boxes of the zoom-in areas, and illustrates the ability to delete points either individually with the crosshair cursor or within specified areas using the crosshair cursor.

FIGURE 14. Should be compared with FIGURE 6, showing the area contoured with the points in the small box deleted.

FIGURE 15. Shows the map of the data from FIGURE 4 with the entire set of northeast/southwest points deleted and the anomaly now removed.

FIGURE 16. Shows the data from FIGURE 3 as selected from the area on FIGURE 2, now submitted to the batch and mapped outside of the interactive system.
It still shows some of the northeast/southwest lines and their anomalous behaviour; however, the anomalous misadjusted data in the center of the map has been removed and the gravity data looks reasonable.

DATA BASE AND INTERACTIVE GRAPHICS TECHNIQUES AT THE CENTER FOR ENVIRONMENTAL ASSESSMENT SERVICES

Gerald S. Barton
NOAA/CEAS, Washington, D.C.

INTRODUCTION

The Center for Environmental Assessment Services (CEAS) is a center within NOAA's Environmental Data and Information Service (EDIS). One of the main responsibilities of CEAS is the assessment of the effects of climate and weather on the marine environment, agriculture in the U.S. and foreign areas, and energy use in the U.S.

To meet the requirements of various projects, CEAS has developed expertise in data base management systems (DBMS) and computer graphics. This paper will briefly describe some of these applications.

DATA BASE MANAGEMENT SYSTEMS APPLICATIONS

A DBMS is a software system which allows a computer to be used to store, update, and retrieve data. Most DBMS's provide facilities to easily access data items. Once defined and entered into the system, data can be retrieved according to changing requirements of the user. For example, a retrieval made from a data base of personnel records might search for all employees born before 1950; another retrieval might search for all employees who are female and earn more than $15,000 per year.

CEAS has developed several DBMS applications for various projects. Besides allowing flexible retrieval of data to satisfy any search relationship desired, the systems can be used with a nationwide communications network. The DBMS applications are implemented on the ADP Network Services Time Sharing System using the 1022 DBMS. The ADP computers are located in Ann Arbor, Michigan, and Waltham, Massachusetts. ADP provides local phone call service for most areas of the country.
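The kind of flexible retrieval described above (all employees born before 1950, or all female employees earning more than $15,000) can be sketched as predicate filtering over records. The field names and sample records are illustrative only; this is not 1022 DBMS syntax:

```python
def retrieve(records, predicate):
    """Return all records satisfying the user's search relationship."""
    return [r for r in records if predicate(r)]

# A toy personnel file (illustrative records, not real data).
staff = [
    {"name": "Adams", "born": 1948, "sex": "F", "salary": 16000},
    {"name": "Baker", "born": 1955, "sex": "M", "salary": 14000},
    {"name": "Clark", "born": 1942, "sex": "M", "salary": 18000},
]

# All employees born before 1950:
early = retrieve(staff, lambda r: r["born"] < 1950)

# All female employees earning more than $15,000 per year:
women = retrieve(staff, lambda r: r["sex"] == "F" and r["salary"] > 15000)
```

The point of the DBMS is that such search relationships can be changed at query time without restructuring the stored data.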
Users in areas without local phone service can use the toll-free, 800 area code service. There are also provisions for accessing the U.S. system from foreign countries. This telecommunication capability allows the system to be used by people in widely scattered locations. CEAS' and NOAA's uses of these capabilities are discussed below.

RNODC-FOY Global Ocean Data Base

The United Nations Intergovernmental Oceanographic Commission (IOC) requested that member states provide data management services for oceanographic data collected during the First GARP Global Experiment (FGGE). In response to the request, EDIS is serving as a Responsible National Oceanographic Data Center for the FGGE Operational Year (RNODC-FOY).

The RNODC-FOY has compiled a Global Ocean Data Inventory (GODI) containing directory information on all data-gathering activities which occurred in the world oceans during the period September 1, 1978 through March 1, 1980. It contains the attributes principal investigator, project names, ship and cruise identification, parameters observed with methods and frequency of observations, dates of observational activity, processing and archiving centers, documentation and status of the activity, and geographic location of the data.

Retrievals can be made on any attribute or combination of attributes. An example is seismic exploration in the North Atlantic Ocean. The result of the search would be a list of all data sets which satisfy both search parameters.

The GODI has been used by government, academic, and private individuals throughout the U.S., England, and France. GODI is described in Matejceck and Dennis (1979). The Global Ocean Data Inventory (1979) is a printed copy of the inventory which was prepared for the IOC.

Interim Climate Data Inventory

The Interim Climate Data Inventory (ICDI) was developed by CEAS to satisfy requirements of the U.S. Climate Program. It describes climate data available in the world.
The data are classified into six areas: Basic Atmospheric Data; Hydrology, Precipitation, Snow and Ice; Ocean Data; Radiation, Physics, and Chemistry; Proxy and Non-instrumental Data Resources (e.g., the Geologic Record); and Geographical, Land Use, and Assessments Data.

Each data record contains the attributes title, parameters (e.g., temperature, wind direction), period of observation, geographical area, data type, media, volume, holding center, and remarks.

Because of the nationwide telecommunications facilities available on the ADP system, the ICDI can be used by researchers throughout the U.S.

EDIS CONFER System

EDIS CONFER is a computer conferencing system designed and implemented by EDIS in 1977 as a tool to improve communications between personnel located in geographically separated EDIS components.

The ADP 1022 DBMS and FORTRAN were used to program CONFER. Messages can be entered, edited, and sent to all conference members, or to selected conference members. A user can retrieve messages by selecting only new messages not yet seen, or old messages selected by date, organization, or message number. Unlike some other computer mail systems, messages remain available for use even after being received by the user.

Various implementations of CONFER are in use in NOAA for a number of projects. All have the need for communications between individuals in dispersed geographical areas. Systems are used by both U.S. and foreign groups. Typical applications are: computer-managed conferences, message transmission, multi-party development of textual material, more rapid communication between organizations, development and review of special studies requiring several rounds of communication, and issuance of instructions and advice to staff.

CONFER is available to interested parties for use on the ADP Network Services Systems.
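The retrieval behavior described above, new messages only, or old messages selected by date, organization, or message number, with messages persisting after they are read, can be sketched as follows. The record layout and field names are assumptions for illustration; the actual system was built on the 1022 DBMS and FORTRAN:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    number: int
    date: str            # e.g. "1981-06-15"; format is an assumption
    organization: str
    text: str

@dataclass
class Conference:
    messages: list = field(default_factory=list)
    last_seen: dict = field(default_factory=dict)   # user -> highest number read

    def post(self, date, organization, text):
        msg = Message(len(self.messages) + 1, date, organization, text)
        self.messages.append(msg)        # messages persist after being read
        return msg.number

    def new_messages(self, user):
        """Only messages this user has not yet seen."""
        seen = self.last_seen.get(user, 0)
        found = [m for m in self.messages if m.number > seen]
        if found:
            self.last_seen[user] = found[-1].number
        return found

    def old_messages(self, date=None, organization=None, number=None):
        """Old messages selected by date, organization, or message number."""
        return [m for m in self.messages
                if (date is None or m.date == date)
                and (organization is None or m.organization == organization)
                and (number is None or m.number == number)]
```

Because delivery never deletes a message, the same store serves both the mail-style "what is new" query and the archival "what did organization X say in June" query.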
Information can be obtained from:

Chief, Systems Integration and Planning (Dx4)
Environmental Data and Information Service
National Oceanic and Atmospheric Administration
Washington, D.C. 20235

Spill Response Program Inventory Tracking System

The Spill Response Program Inventory Tracking System (SPITS) was developed in CEAS. It will be used to track information collected by Hazardous Spill Response Teams during oil or hazardous substance spills. SPITS allows interactive entry, updating, and display of descriptive and tracking information.

The system is programmed with the 1022 DBMS and FORTRAN. It contains information on the spill description, name, address, trip navigation, sample descriptions, and chain of custody for tracking information. A feature of the system is a security system that allows control of access to the different parts of the system. It also provides for individualized accounting of use of the system.

One of the values of SPITS being implemented on the ADP computer is the U.S. telecommunications network. The system can be used for near-real time remote access from the spill site, headquarters, and other involved parties located throughout the nation.

INTERACTIVE COMPUTER GRAPHICS

CEAS' use of interactive computer graphics began with the acquisition of a PDP-11/50 minicomputer and Ramtek GX-100B color raster graphics hardware in 1975. The graphics system has been used mainly for quality control of time series meteorological data and display of digital satellite data.

The PDP-11/50 minicomputer has 256,000 bytes of memory, two 40-megabyte disks, 5 tape drives, and a number of communications facilities. The operating system is IAS, which is a sophisticated real-time multi-user system. The graphics unit is connected to the computer via a direct memory access device.

The Ramtek GX-100B is an interactive system which displays a picture composed of 640 x 480 picture elements (pixels).
Each pixel has 8 bits of resolution, which means that 256 shades of gray or 256 colors can be displayed on the television screen. Keyboard and cursor control devices allow the user to communicate with the software. Barton and Saxton (1976) described the system.

Time series display and editing applications have been the main use of the graphics system. In a typical application, a number of meteorological parameters can be displayed with each parameter having a different color. The time series can be scrolled from right to left so that the user can see a continuously changing view. The scroll can be stopped as desired so that editing can be done. Data values can be deleted, inserted, or a missing section of data can be interpolated. Quality control flags associated with each parameter may be set to note the action taken. Both edited and unedited data are available on disk for output to tape when the user desires. Anderson and Crayton (1978) described this application.

Another major graphics application is the picture processing system. Any image in digital format may be displayed. We have processed visual and infrared images from meteorological satellites and digital radar images. Digital data for a picture are read from magnetic tape and stored onto disk in the system format. Once on disk all data are in the same format; thus, any of the processing functions will operate on any data. Images may be displayed in color or black and white. There are 256 shades of gray and 256 colors. The 256 colors can be selected from the 4096 available in the color table. The software allows manipulation of colors on the display so that an analyst can isolate specific features. There is a zoom, rotate, and translate feature for closer view of parts of the picture; new pictures created by these operations may be saved on the disk for future access.

The picture processing system has possible application for other than visual analysis.
Image processing\ntechniques could be used with the digital images stored on disks. For example, images could be enhanced, or\ntwo images could be compared. The resulting image could then be written to disk, and displayed with the pic-\nture processing system.\nA feature of the graphics system is the capability to display color pictures in NTSC format. NTSC is the\nU.S. standard for broadcast color video which is used on home color televisions. This allows output from the\ngraphics system to be recorded on video tape, which is a useful medium for information exchange.\nSUMMARY\nCEAS makes extensive use of the 1022 DBMS on the ADP Network Services timeshare computer system. The DBMS\nis easy to use and allows the user to structure searches to satisfy his needs. The major advantage of the ADP\nsystem is the national and international telecommunications system which permits users in remote locations to\nuse the system for DBMS and communications purposes.\nThe CEAS PDP-11/50 minicomputer with its interactive graphics system is a powerful tool for data editing and\nfor image analysis. A major feature is the provision for an NTSC color signal. This allows pictures to be\nrecorded on video tape for exchange with other scientists, or to preserve a graphics processing session.\nREFERENCES\nAnderson, C. E., and R. B. Crayton, 1978: Use of Interactive Graphics in Editing and Validating Scientific\nData. Bulletin of the American Meteorological Society, February, Vol. 59, No. 2, pp. 161-169.\nBarton, G., and D. Saxton, 1975: The Role of Interactive Computer Systems in Data Processing at CEDDA. EDS,\nMay, pp. 10-14. Available from NOAA/EDIS, Washington, D.C.\nMatejceck, J. B., and R. E. Dennis, 1979: System Design of a Global Ocean Data Inventory and a Global Ocean\nClimate Data Base Covering the FGGE Operational Year. Proceedings of the Second Working Conference on Ocean\nData Systems, October, Woods Hole, pp. 343-370.\nRNODC-FOY, 1979: Global Ocean Data Inventory.
Prepared for the Intergovernmental Oceanographic Commission by\nthe Responsible National Oceanographic Data Center for the FGGE Operational Year, 976 pp. Available from\nRNODC-FOY, NOAA/EDIS, Washington, D.C.\n40","INTERACTIVE GRAPHICS IN A COMPUTER-SATURATED ENVIRONMENT\nPeter W. Sloss\nNational Geophysical and Solar-Terrestrial Data Center\nBoulder, Colorado\nABSTRACT\nTwo main problem areas must be addressed in discussions of how to live with a busy computer and a busy work\nschedule. First, the heavily-loaded computer system must not be burdened with superfluous operations to perform;\nsecond, the system user must not be burdened with superfluous operations to perform. If that sounds like an echo,\nit is supposed to. Neither the machine nor the man has time to waste. In view of this constraint, the software\ndesigner must strive to keep both man and machine active but not overloaded. Since machines are less inclined to\nbecome impatient, I will deal with the human elements of interactive graphics (and other data manipulations) more\nintensively.\nInteractive computer programs are supposed to lighten the burden of the overworked data professional by\nperforming complex operations based on a relatively small and simple set of instructions from the user. Consider-\nable effort has been expended in the development of commercial, proprietary Data Base Management Systems (DBMS)\nwhich are proudly advertised to do just that. The human is expected to be able to turn on his terminal, dial a\nphone, and answer a few simple questions to solve very complicated problems in data synthesis and abstraction.
To\nthe frequent dismay of the intrepid investigator, however, the impressive flexibility inherent in DBMS causes them\nto be disk and core hogs in the computer and therefore not very swift in the completion of their appointed rounds.\nThis paradox may leave the user in the unfortunate position of waiting long and unproductive intervals to respond\nto the system's next, albeit simple, request. Long waiting times also increase the probability of communications-\ninduced errors in transmission, such as noise signals being intercepted by the computer as real commands. Please\nhang up and dial again. Such frustrations are bad for the blood pressure and do not help get the job done.\nTo minimize the risks of data loss and operator breakdowns, the programmer must strive for efficiency in\nprogram design and compactness of operating code so that the job spends as little time as possible in the never-\nnever land of the rollout queue. This implies that no executing module of the program should be larger than some\nsize which sad experience has found to be the practical maximum (not necessarily a constant, either!). The ra-\ntionale for such design is to keep the job active in the CPU as much as possible. Most heavily loaded systems are\nI/O-bound and have quite a bit of idle time in the think-box; the peripheral channels to tape drives, disks, and\nremote-user ports are generally the source of apparent saturation. The ideal interactive program is always next\nin line for a scarce resource. In theory, we have the problem solved--write compact programs and the CPU will\nbeat a path to your door. Graphics systems, unfortunately, grow in inverse proportion to their apparent simpli-\ncity. The less input needed, the more has to be internally synthesized by intelligent software.
Graphics systems\nwith high levels of geographic and three-dimensional plotting, contouring, fancy lettering, and flexibility, need\na lot of electronic brain power to substitute for whatever the user lacks.\nIf the computer is too busy to deal effectively with large graphics programs in a survivably short span of\ntime, then how can a user operate the desired system? The answer is simple, if devious and sneaky. Make the user\nthink he is doing graphics when in actuality he is priming a runstream which the system can chew on at its lei-\nsure, off-line. Obviously, there can be no direct graphical responses to the user, but for the purposes of the\npresent discussion, we have to assume that such delightful possibilities as interactive map editing are perqui-\nsites of the well-endowed who have dedicated systems that don't waste effort on the mundane needs of others.\nGiven that the desired final product of an interactive computer session is a plot of certain data, the clever\nsoftware engineer creates a surrogate for the actual plotting program which asks similar questions and gives most\nof the same responses, but does none of the actual work. The true data fetching and plotting is done later by the\nreal program while the satisfied user is pursuing other tasks. The overall waiting time for delivery will usually\nbe greater for this hybrid process, but the user and his communications channel are set free after a brief encoun-\nter with the machine.
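The surrogate idea can be sketched in a few lines. This is an illustrative reconstruction in a modern language, not the NGSDC software; the prompts and the output file name are invented for the example.

```python
# Sketch of a "surrogate" interactive program: it asks the same questions the
# real plotting program would ask, but instead of fetching data and plotting,
# it simply records the answers in a runstream file that the real program
# will consume later, off-line, at the system's leisure.
# (Hypothetical prompts and file name; not the actual NGSDC code.)

PROMPTS = [
    "Name of data file? ",
    "Number of points along X (longitude) axis? ",
    "Number of points along Y (latitude) axis? ",
    "Contour spacing (blank to let program choose)? ",
]

def surrogate_session(answer_fn, out_path="runstream.txt"):
    """Run the brief question-and-answer session and save a runstream.

    answer_fn is any callable returning one answer per prompt
    (the built-in input() for a live terminal session).
    """
    answers = [answer_fn(p) for p in PROMPTS]
    with open(out_path, "w") as f:
        f.write("\n".join(answers) + "\n")   # the batch job reads these later
    return answers
```

The real program, run in the batch queue, would read the same answers back in the same order and do the actual data fetching, gridding, contouring, and plotting.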
The important difference between the real and imitation graphics programs is that the\nquestion-and-answer session is mercifully brief in the latter case; the final product is identical from either.\nThere is an obvious pitfall in the suggested method of double-programming: if the program is not used often\nand in prime time on the system, then there is no need for the extra effort of creating the extra program, i.e.,\ndon't waste time developing sophisticated dummy programs unless some dummy needs to use them enough to justify\ntheir existence. In NGSDC's service-oriented environment, it is our experience that much of our graphical soft-\nware can be converted to off-line operation and input can be handled by clerical personnel once the smarts have\nbeen built into the main program and the right questions are asked by its surrogate.\nThe processes necessary to live through a system with sclerosis of the peripherals may be summarized as\nfollows:\n1. Determine characteristics of the desired output and products.\n2. Determine the range and number of responses needed to completely define the desired output.\n3. Develop the necessary software to produce the output.\n4. Test the program(s). If OK, continue, else, (1) above.\n5. Create the surrogate program for interactive efficiency.\n41","Step (1) is clearly the most important to the success of this design process. Step (2) is the development of the\nset of interactive requests from either the main program or the surrogate which transfer the desired criteria to\nthe program. Step (3) is the fun part. Step (4) will repeatedly lead the programmer back to (1) and possible\ninsanity, but no one will notice the difference. Steps at and beyond (5) extend into the world of operational\ndata services and program use by unsophisticated clerical personnel and outsiders who are unfamiliar with the\nharsh realities of a shared-time computer.\n6. Turn it loose on the world.\nFigure 1.
Long lags in computer responses are unproductive.\nFigure 2. Streamlined programs keep things humming and the user happy. The\nsystem designer can turn the actual program operation over to\ntechnical support staff.\n42","Figure 3. High-level graphics program systems are pleasingly simple to use and\ngive elegant results.\n[Contour map: BOUGUER ANOMALY, MGAL, covering 33N-34N, 114W-113W; contour labels omitted]\nFigure 4. Print from microfilm (COM) plot\ngenerated by actual run of\nprogram set up by session of\nquestions/answers in Fig. 5.\nThe "real" program must fetch\ndata, interpolate data to uni-\nform grid, compute contours,\nand generate microfilm.\n43","INTERACTIVE TERMINAL SESSION\nUser responses are underlined.\nCALL,GRAVSUB\nCHARGE NUMBER FOR SUBMIT FILE (E.G.: R5X.7Ji76231) ? FOR,EXAMPLE\nYOU ENTERED FOR,EXAMPLE\nIF THIS WAS OK, ENTER 1, ELSE ENTER 0? 1\nWHAT IS YOUR NAME (10 LETTER LIMIT)? BUSY USER\nDATA REQUEST NUMBER (CR=NONE, NEG=STOP)? 12345\nNAME OF DATA FILE (CR=SAME AS LAST)? LANGRAV\nNAME OF FILE OWNER (USER NAME) ? HEAVY\nMINIMUM ARRAY SIZE FOR INTERPOLATION/CONTOURING IS 11*11\nMAXIMUM ARRAY SIZE IS 65*121 (X * Y)\nENTER NUMBER OF POINTS ALONG X (LONGITUDE) AXIS? 61\nENTER NUMBER OF POINTS ALONG Y (LATITUDE) AXIS ? 61\nDO YOU WANT GEOGRAPHY PLOTTED (1/0/CR=YES/NO/STOP) ?\n1\nWHICH NCAR MAP PROJECTION # (DEFAULT = 8)? 9\nLATITUDE, LONGITUDE OF MAP POLE (CR = DEFAULT 0,0) ? 00,00\nWHAT SPACING ON LAT/LON GRIDLINES (DEGREES) ? 1\nLATITUDE OF LOWER-LEFT (S.W.) CORNER OF DATA ? 33\nAND THE CORRESPONDING LONGITUDE? -114\nSIZE OF AREA IN LONGITUDE EXTENT (DEGREES) ? 1\nSIZE OF AREA IN LATITUDE EXTENT (DEGREES)? 1.25\nCHOOSE A PARAMETER FOR PLOTTING--\n1 = STATION ELEVATION\n2 = OBSERVED GRAVITY\n3 = FREE-AIR ANOMALY\n4 = BOUGUER ANOMALY\n5 = POINT LOCATIONS ONLY\n0 = NEW MAP AREA\nOR NEGATIVE TO STOP.\n? 4\nDO YOU WANT POINT DATA VALUES ON THE PLOT (Y/N)?
N\nENTER (CR) TO USE WHOLE INTERPOLATED FIELD, 0 TO USE\nONLY CELLS WITH ACTUAL DATA (CUTTING CONTOURS ELSEWHERE),\nOR NUMBER OF SMOOTHING PASSES ON MASKING ARRAY ? (CR)\nDO YOU WANT GRIDDED DATA VALUES ON PLOT (Y/N)? N\nENTER CONTOUR SPACING OR (CR) TO LET PROGRAM CHOOSE IT ? (CR)\nENTRY COMPLETE FOR THIS PLOT. ENTER 1 TO CONTINUE OR 0 TO STOP? 1\nCHOOSE A PARAMETER FOR PLOTTING--\n1 = STATION ELEVATION\n2 = OBSERVED GRAVITY\n3 = FREE-AIR ANOMALY\n4 = BOUGUER ANOMALY\n5 = POINT LOCATIONS ONLY\n0 = NEW MAP AREA\nOR NEGATIVE TO STOP.\n?\n-1\nALL DONE. YOUR SUBMIT FILE IS TAPE1. PLEASE REMEMBER TO\nCHECK (LIST OUT) TAPE1 BEFORE SUBMITTING IT.\nYOU MAY TEXT-EDIT TAPE1 TO MAKE NECESSARY CHANGES.\nWHEN READY, SUBMIT TAPE1 TO THE BATCH QUEUE.\nREWIND,TAPE1.\nRESULTING OFF-LINE RUN CONTROL FILE\n/COPY,TAPE1\n/JOB\n/NOSEQ\nGRAVITY,T1200.\nBUSY USER\nUSER,DTD.\nCHARGE,FOR,EXAMPLE\nROUTE,OUTPUT,DC=PR,UN=EDSTRM,DEF.\nHEADING.$ BUSY USER\nHEADING.$ *\nHEADING.$0 GRAVITY\nPURGE,GR12345/NA.\nMAP,OFF.\nDEFINE,TAPE8=GR12345/CT=S.\nGET,GRAVPLT/UN=PWS.\nCALL,GRAVPLT.\nGET,GRAFING/UN=GRAF.\nBEGIN,NCAR,GRAFING,PLOTTER=MCRFLM,LOADMAP=0.\nDAYFILE,GRDAY.\nREPLACE,GRDAY.\nEXIT.\nDAYFILE,GRDAY.\nREPLACE,GRDAY.\n/EOR\n1\n12345\nLANGRAV\nHEAVY\n61\n61\n1\n4\n0. 0.\n1.\n33.\n-114.\n1.\n1.25\n4\nN\n-1\nN\n0.\n-1\n-1\nFigure 5. Sample question-and-answer session to set up an off-line run of a large program.\nWith additions for accounting purposes, these are the same questions asked by\nthe actual graphics program.\n44","OPPORTUNITIES AND CHALLENGES IN DATA STORAGE, RETRIEVAL,\nAND DISPLAY AT L-DGO IN THE 1980s\nELLEN M. HERRON\nLamont-Doherty Geological Observatory, Palisades, N.Y.\nIn the 1960s a small group of marine scientists and students led by Manik Talwani\ndeveloped a computer-based system for the storage, retrieval, and display of navigation,\nbathymetry, magnetic, and gravity data acquired at sea.\nThis system quickly became the standard for U.S.
oceanographic institutions, and the\nformat now used by NGSDC is based on the time series structure which is the core of the\nL-DGO system. No major changes in this system have taken place at L-DGO since the 1960s,\nbut significant changes have occurred both in the nature and scope of marine geophysical\ndata acquisition and in the scientific problems now being addressed. It is time for a\nserious review of the present system and consideration of alternate structures for the most\nappropriate and efficient handling of marine data.\nThe pressures that affect our system for handling marine data are multifold:\n1. In the 1960s the "omnibus" mode of funding marine science at L-DGO allowed and\nencouraged development of an institution-wide system for editing, storing, and\nusing marine geophysical data, and quality control was easily obtained since there\nwas one principal investigator, Maurice Ewing. The funding agencies were strong\nsupporters of marine data acquisition programs, and there was sufficient work\nand funding to support a full data reduction staff who could also ensure that the\ndata base was maintained for all users. The users of the data base were students\nand researchers who worked directly with the computer to generate whatever out-\nputs they desired for their projects.\nIn the 1970s the "omnibus" mode was discarded by NSF and ONR in favor of project-\nby-project funding with a different principal investigator responsible for each\nproject. The quality of individual data sets returned to the lab varied with each\ninvestigator. Emphasis on routine data acquisition programs dropped as the size\nof the data base grew, and maintenance of this data base became a significant\nproblem as funding for a data reduction staff shrank.
The number of technicians\nworking for scientists grew, and these personnel tended to become the primary\nusers of the computer, instead of the scientists and students as in the 1960s.\n2. The development of new techniques and instruments for marine geophysical research,\nespecially multichannel seismic reflection and refraction and deep-tow studies\nincluding digital side-scan, has required development of dedicated computer\nsystems for the extensive processing required of these data sets. The processing\nsystems and data base structures for these disciplines have developed independently\nof the classical system designed to handle navigation, bathymetry, gravity, and\nmagnetics data, and these new data sets, in contrast to the "classical" sets,\nare of very limited use until they have been subjected to extensive, sophisticated\nprocessing techniques. The end products seem relatively easy to interpret, but\nonly a few scientists with extensive knowledge of the acquisition and processing\ntechniques are really proficient and knowledgeable in fully exploiting these\ndata. And only these trained scientists are also fully aware of the limitations\nof the data. The potential for misuse of data sets and misinterpretation may be\nmuch greater for these new data sets than for the classical navigation, bathy-\nmetry, magnetics, and gravity data sets.\n3. In addition to the change in the mode of funding and emphasis on data acquisition\non the part of NSF and ONR, the appearance of new and/or mission-oriented agencies\nsupporting marine research has created problems in the development of systems\nfor storage, handling, and display of data. These new agencies such as BLM or\nJOI, Inc., have tended to assume that existing formats and procedures for archiving\ndata sets are the most appropriate and hence the only formats to be used.
Many of\nthe programs funded by these agencies are designed to survey in great detail a\nvery small area of the seafloor, and the raw data along the ship's track are of\nmuch less utility than the compilation of these data onto a contour chart or\nother summary diagram.\n45","The data handling system which was developed at L-DGO in the 1960s is still an excellent\nsystem for large-scale regional studies, and it is a system which ensures that the data\nare archived in as close to their initial state as possible. If a principal objective in\nchoosing formats for archived data sets is to ensure that the data are available in a\nformat which permits their efficient use, then the new acquisition techniques and the\ndata obtained during detailed studies of very small areas point out the need for\nadditional systems and formats in which to archive data.\nSince these new data sets include extensively processed or "massaged" data, it is\nvery important that the scientists who use these data are familiar with the assumptions\nand limitations incorporated in these data sets. The scientists who ultimately use the\ndata are also the only persons who can determine whether or not the systems and formats\nfor storage, retrieval, and archiving of various data sets are the ones which are best\nsuited to efficient utilization of the data. Systems for storage, retrieval, and archiving\nof data should be "live" dynamic systems, not "dead" static ones.\nActive participation by the scientists who use the data is essential to keeping a\nsystem "live". Our challenge and opportunity at L-DGO is to generate the interest and\nenthusiasm of the scientists and simultaneously to generate funds to undertake the essen-\ntial task of designing data base systems which will best serve us through the 1980s.\nL-DGO Contribution Number 3106\n46","DATA STORAGE AND ACCESS AT LAMONT-DOHERTY GEOLOGICAL OBSERVATORY\nRose Anne Weissel and Thomas D.
Aitken\nLamont-Doherty Geological Observatory\nPalisades, New York\nLamont-Doherty currently stores marine geophysical data for over 500 cruise legs of VEMA, CONRAD, and ELTANIN\nand approximately 1,100 cruise legs from other institutions, corporations, and government agencies. Approximately\n60 scientists, graduate students, and technicians use these data on a regular basis. To best fill their scientific\nneeds and yet keep computing costs to a reasonable level, a part-time system of on-line disk storage is used.\nDATA STORAGE\nThe basic system of data storage was defined by Talwani ("A Computer System for the Reduction, Storage, and\nDisplay of Underway Data Acquired at Sea," Technical Report No. 1-1969); the data are organized by cruise leg\n(typically 28 days at sea) and divided into data types: navigation, bathymetry, magnetics, and gravity. These\ndata are stored in separate files as sequential time series. If merging of the navigation and one or more data\nfiles is needed, it is done at execution time. The most important of the many changes made since 1969 is that\nreduced data, not raw data, make up these files and are used by all the display programs.\nNavigation is usually the smallest file and contains the ship's geographic position at various times. Time is\nexpressed as elapsed time in minutes from 0000 hours, January 1 of a given year. Latitude, longitude, cumulative\ndistance along track, and time zone are stored for each point, as are speed and set of drift when data are reduced\nat Lamont. The regional magnetic field value is computed and stored for each location; a field strength value is\ninterpolated for each magnetics point at execution time.
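The time convention and the execution-time field interpolation just described can be sketched as follows. This is an illustrative reconstruction in a modern language, not the Lamont FORTRAN; the function names are invented.

```python
# Sketch of two conventions described above: time stored as elapsed minutes
# from 0000 hours, January 1 of the cruise year, and execution-time linear
# interpolation of the regional magnetic field from the values stored at the
# two navigation points adjacent to a magnetics observation.
# (Illustrative only; not the actual Lamont software.)

from datetime import datetime

def elapsed_minutes(year, when):
    """Minutes from 0000 hours, January 1 of `year` to datetime `when`."""
    return int((when - datetime(year, 1, 1)).total_seconds() // 60)

def regional_field_at(t, nav_before, nav_after):
    """Linearly interpolate the regional field (nT) at magnetics time t.

    nav_before and nav_after are (time_minutes, regional_field_nT) pairs
    stored with the two adjacent navigation points.
    """
    t0, f0 = nav_before
    t1, f1 = nav_after
    if t1 == t0:
        return f0                       # degenerate case: coincident points
    return f0 + (f1 - f0) * (t - t0) / (t1 - t0)
```

Storing the regional field only at navigation points keeps the much denser magnetics file small, at the cost of this small interpolation at execution time.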
Station numbers are stored when the ship is on station.\nA navigation point is defined as a change in course, a change in speed (or if neither has occurred for three\nhours, a pitlog reading is used), or any fix (satellite, Loran, star, transponder, etc.).\nThe three primary geophysical parameters and some digitized single-channel seismic reflection data\n(sub-bottom) are stored using the same direct-access disk file format consisting of date-header records and data\nrecords. Each data record contains eight time-data pairs. The time and data values are both stored as one-word\nintegers to minimize storage requirements.\nTopography is stored in uncorrected fathoms. Lamont depth recorders assume a sound velocity of 800 fm/sec;\nMatthews zones are routinely stored with the data for use when output in corrected fathoms or corrected meters is\nrequired.\nMagnetics data are stored as total field values in nanoteslas (gammas). No diurnal corrections are applied.\nWhen a program is run, the appropriate regional field for each data location is interpolated from the values at the\ntwo adjacent navigation points.\nGravity free-air anomalies are stored in milligals. An Eotvos correction is applied during reduction of the\ndata.\nDATA ACCESS\nWhen new data are added to the system, an entry is made in a location file giving the maximum geographic\nextent of that cruise leg and what geophysical parameters are being stored. An area search is handled by\nretrieving from the location table those cruise legs (names) whose boundaries overlap the requested region.\nA preprocessor program is used to generate input parameter files for all of the frequently used plotting\nprograms. This is of great benefit to the users.\n1) It simplifies the running of routines that require numerous and complex input parameters. A user with no\nFORTRAN background can set up and run these programs by answering a series of straightforward questions.\n2) It can obtain Lamont and/or non-Lamont cruise names from the location file. This facilitates plotting\nall the data in a given area, something that had always been a problem at Lamont.\n3) It allows the job to be run automatically during specified less-expensive evening hours when the data\ndisk is on line.\n4) The user can prepare an input file in advance that can be submitted at any time. This helps avoid\nrunning jobs when the computer system is slow and therefore expensive.\nL-DGO Contribution No. 3131.\n47","NEW DATA\nData storage by cruise leg becomes less and less efficient as the number of cruise legs intersecting an area\nbecomes larger and as the geographic area being studied becomes smaller. In response to this problem, several data\nsets at Lamont are being converted to geographical storage systems. The basic file unit is defined by geographic\narea (often 1°x1°) and all data for a given parameter in that area are stored without regard to order. These data\nrecords are often 80 characters long and contain latitude, longitude, information pertaining to one geophysical\nparameter, and some identification of the source of the data. Presently non-marine gravity data and heat flow sta-\ntion data are stored like this, and there are plans to include other types of marine data, including underway marine\ngeophysics.\n48","COMPUTER STORAGE AND RETRIEVAL\nOF POSITION-DEPENDENT DATA AT\nWOODS HOLE OCEANOGRAPHIC INSTITUTION\nRobert C. Groman\nWoods Hole Oceanographic Institution, Woods Hole, Mass.\nAbstract:\nNumerous storage and retrieval schemes have been implemented\nat the Woods Hole Oceanographic Institution to manage and mani-\npulate various location-dependent data sets. Five such schemes\nare described here.
These systems deal with current meter data,\nCTD data, gravity data, seismic refraction data, and underway geo-\nphysical data. A short description of the first four systems is\npresented, while a more detailed discussion of the underway geo-\nphysical data library system is provided.\nKey Words: data management, data retrieval, position-dependent\ndata.\nMy presentation today reviews some of the computer-based storage and retrieval systems\nfor position-dependent data at the Woods Hole Oceanographic Institution (WHOI). I will first\noutline those systems operated by four groups within WHOI: the buoy group, CTD group, gravity\ngroup, and seismic refraction group. Finally, I will discuss, in detail, a new data\nretrieval system designed to manage underway geophysical measurements.\nBuoy Group\nThe buoy group1 maintains a suite of programs to manage current meter data for the moored\narray project. Current meter data versus time, together with temperature and pressure data, are organized\nin a two-level system. The first level is essentially a directory to the data at the second\nlevel. The directories contain a summary of the descriptive information and provide the means\nto access the data in a direct and efficient manner. Specially prepared retrieval and\ngraphics routines are available to access and display these data.\nConductivity, Temperature and Depth (CTD) Group\nThe CTD group, headed by Robert C. Millard, Jr., collects, calibrates and processes\nvertical profiling data and stores calibrated CTD observations as a uniform pressure series.\nThe basic data storage quantum is the file, with each CTD profile stored as a separate disk\nfile.
Extensive use is made of the computer system's directory and subdirectory capability\nin order to group together profile data from the same cruise.\nThe cruise index file contains one record per cruise, describing the geographic area and\ntime bounds, number of stations, data quality, the variables measured and the pressure\naveraging interval. For each entry in the cruise index a separate file, called the station\nindex file, is maintained. This file contains a record for each CTD profile taken during the\ncruise. Each record in a station index file contains start and stop positions and times, the\nminimum and maximum values of the observations and the station observation file name. A\nstation observation file contains the measured CTD data and computed values.\nGravity Group\nThe gravity group, headed by Dr. Carl O. Bowin, maintains a library of gravity measure-\nments organized by one degree and ten degree squares. New data are sorted and then merged\nwith the existing data set. This scheme permits direct access to data within any degree\nsquare by referencing an index table of starting record numbers. Data within the same degree\nsquare are stored sequentially on disk.\nThe gravity group uses a second scheme to manage position-dependent data generated\nduring data analysis. This scheme, known as COSA, effectively implements the concept of\n"third normal form" relations by using a separate disk file for each data type. For example,\nthe latitude values are stored in the file called XYZ.LAT; the longitude values are stored in\nthe file called XYZ.LON; and the attribute "height" values are stored in the file called\nXYZ.HGT. A correlation among the different files is preserved by the record number within the\nfile. This scheme has proven very effective in managing data sets which receive a high degree\nof processing and analysis.\n1. Dr. Nick P. Fofonoff, Dr. William J. Schmitz, Jr., and Dr. James R. Luyten.\n2. Date, C. J. 1977.
An Introduction to Data Base Systems. Reading: Addison-Wesley.\n49","Seismic Refraction Group\nThe seismic refraction group, headed by Dr. G. Michael Purdy, maintains storage and\nretrieval software for seismic refraction data generated by explosion (airgun, explosives,\netc.) or by earthquakes. Each seismic "event" is stored on a separate disk file, with a\nheader record, including the number of digitized recording channels, date, location, and\ndigitizing rates. Data which are not currently being studied are stored on magnetic tape.\nAs with the CTD group, the computer system's directory naming flexibility is used to\norganize data taken during the same experiment. For example, those data collected during\nthe Project ROSE experiment are stored in the disk directory "ROSE". The subdirectory "01"\nidentifies data collected during deployment 1 of the project. Finally, descriptive file\nnaming conventions differentiate the various kinds of parameters collected and computed for\neach event.\nUnderway Geophysical Data Management Group\nThe digital data library group3 is responsible for processing and managing the under-\nway geophysical measurements (navigation, depths, magnetics and gravity) collected by WHOI\nscientists. In order to improve the access to WHOI's collection of these data, I have\nimplemented a new storage and retrieval scheme based on a cruise leg and geographical data\norganization.\nThe key features of this scheme include the following:\n1. A data dictionary describes each element stored in the library. This\napproach allows for easier growth when new data attributes are added.\n2. A data storage scheme, using magnetic disk storage and isolating the\npositional information from the secondary measurements, provides flexible\nand efficient data storage. Infrequently used data can be removed from\non-line storage without jeopardizing the ability to satisfy requests\nabout the data.\n3. The data organization facilitates retrieval within geographical boundaries\nand yet retains a sequential access capability.\n4. Data compaction techniques reduce disk storage needs and reduce retrieval\ncosts.\n5. An inquiry facility provides the user with a comprehensive set of commands\nto query the library about the information stored as well as provide\ninterfaces to existing graphics routines.\nA crucial consideration in designing a data retrieval system is how the data will be\nused. For underway geophysical measurements, there may be requests for profiles and charts,\nand summaries of data availability. One can also expect that new applications will introduce\nnew location-dependent parameters. It is the position, the location on the earth's surface\nat which the parameter was measured, which plays a central role in these data sets. This\nposition-dependency leads to two types of relations: a near neighbors relation and a\nsequential neighbors relation.\nThe near neighbor relation refers to the fact that the study of oceanography very often\nanalyzes the relationship among data values within geographically close locations. Depending\non the features under investigation, the geographical area can span a few hundred square kilo-\nmeters ('fracture zone studies') to as much as an entire ocean (Mid-Atlantic Ridge studies).\nData obtained by different researchers, over the course of many years, are often studied\ntogether in order to piece together an understanding of the basic physical processes taking\nplace on, in, and below the ocean.\nThe sequential neighbors relation occurs because data parameters are collected, usually\naboard a sea-going ship, in sequence, as the ship moves through the water. Often, the change\nin the values from one position to the next is as informative as the absolute value of the\ncollected data.
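The two relations suggest a storage organization along the following lines. This toy sketch is not the WHOI implementation (which stores primary records in quadrant disk files with bounds tables and linked lists); the class and method names are invented for illustration.

```python
# Toy sketch of the near-neighbor and sequential-neighbor relations described
# above: observations are binned into one-degree squares for area (near
# neighbor) retrieval, while per-cruise lists preserve the order of
# collection (sequential neighbor) for profile-style requests.
# (Illustrative only; not the WHOI library code.)

from collections import defaultdict
from math import floor

class UnderwayLibrary:
    def __init__(self):
        self.cells = defaultdict(list)    # (lat_deg, lon_deg) -> observations
        self.cruises = defaultdict(list)  # cruise id -> observations in order

    def add(self, cruise, lat, lon, value):
        obs = {"cruise": cruise, "lat": lat, "lon": lon, "value": value}
        self.cells[(floor(lat), floor(lon))].append(obs)  # near neighbors
        self.cruises[cruise].append(obs)                  # sequential neighbors

    def in_area(self, south, north, west, east):
        """All observations whose one-degree cells overlap the bounds."""
        hits = []
        for lat in range(floor(south), floor(north) + 1):
            for lon in range(floor(west), floor(east) + 1):
                hits.extend(o for o in self.cells[(lat, lon)]
                            if south <= o["lat"] <= north
                            and west <= o["lon"] <= east)
        return hits

    def along_track(self, cruise):
        """Observations in the order collected, for profile-style requests."""
        return self.cruises[cruise]
```

Note the use of `floor` rather than truncation so that southern latitudes and western longitudes bin correctly (floor(-21.5) is -22).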
This leads to a requirement that data be accessible in the same order as they\nwere collected.\nThese two relationships form the basis of nearly all existing underway geophysical\nretrieval requests at the Woods Hole Oceanographic Institution. It is not surprising, then,\nthat I have designed the data storage scheme to facilitate retrieval along these lines.\nHowever, I have not lost sight of the fact that requirements evolve and new data types are\nadded, and I have included features in my design to allow for change.\n3. Robert C. Groman, Jane A. Dunworth and Stephen R. Gegg as of this date.\n50","Data Storage Scheme\nI have classified information to be stored in the library into three categories: primary\ndata, secondary data and pointers. In this section I describe how these data are organized to\nprovide an efficient, cost effective and flexible means to store location-dependent data.\nThe primary data include date, time, latitude and longitude information. These data are\norganized into direct access files according to the particular one degree geographical square\nin which the data reside. Primary data are stored in primary data files. I selected this\napproach as it provides:\n1. growth potential for other than geophysical data,\n2. the ability to reduce the on-line size of the library while affecting a\nminimum number of users,\n3. economic data storage, and\n4. simplicity.\nOne implementation of this scheme would require as many as 64,800 files if data existed\nin each one degree area of the world. Even taking into account my relatively sparse data set,\nthis would be too many files for the operating system. However, in the implementation\nselected, only four files are used to cover the world, one file for each geographical quadrant\nof the earth. I define the four geographical quadrants as the northeast, northwest, southeast\nand southwest portions of the earth's surface.
With this definition, Canada is located in the northwest quadrant and Australia is located in the southeast quadrant. Within each quadrant file, data can still be accessed by degree because of a data structure maintained by the library called the bounds table. This scheme retains the desired retrieval capabilities without undue file manipulation.
All information about a geographical location, not considered as primary data, is called secondary data and is stored in a separate area, the secondary data files. Among the many advantages to this approach are the following:
1. Adding new data attributes does not affect the organization of the primary data.
2. Secondary data can be removed entirely from on-line disk storage without affecting queries based on position or time.
3. Selected portions of the secondary data can migrate on and off the disk as the needs for these data change.
4. The data storage cell size need not be the same in the primary and secondary files, making better use of disk space.
Although the grouping of data within a storage cell (consecutive words of disk storage) in the secondary data files can be arbitrary, data which are known to be retrieved together are placed in the same storage cell to improve retrieval times.
The third category of information is pointer data. Pointers are used to link related information, using a linked list structure. Pointer fields in the primary data files exist for each relationship in which position records may participate. Currently, these relationships are the geographical proximity relation and the cruise leg organization of the data. A linked list structure is also used in the secondary data files to connect data attributes acquired at the same location and time. I chose this approach to link information because:
1. new data attributes can be easily added;
2. newly identified relationships can be implemented easily by adding an additional pointer between common entities;
3. data can be easily deleted from a linked list; and
4. lists can be reordered to improve retrieval.
Sample Query Session
Figure 1 shows a sample query session. Once the user is logged onto the computer, the query program is initiated with a RUN DBA0:[GNG]DBQUERY command. The program uses two consecutive "greater than" symbols as a prompt to the user. The first command shown is a LIST/FULL command. This command causes the system to retrieve all the information it has about the specified cruise leg. Note that a summary of the data available for the cruise is included. This information is readily available from the "ship table" data structure and involves a simple direct lookup. The second command shown is the SET command, which restricts the user's area of interest to the geographical area bounded by 21 degrees south to 25 degrees south and 14 degrees east to 20 degrees east. The system saves this user-supplied information internally for use during retrieval. The third command, DISPLAY, results in a summary of the data parameters available within these bounds. To satisfy this request, the system searches each bounds table affected by the user-selected geographical bounds and computes a sum of all data parameters available. Finally, the user ends the query session by issuing the END command. The system responds with a message and a summary of the computer resources used during the session.
51
Figure 1. Sample Query Session

$ RUN DBA0:[GNG]DBQUERY
DBQUERY - VERSION 1.00 - 80DEC29 12:11:35
HI! With this program you can request information from the
dictionary and library files.
If you need assistance with any
of the commands just type in HELP for more information.

>> LIST/FULL ATLANTIS II 67 5
*OTPTSTBLF message, full contents of ship table entry follows:
Cruise leg id: ATLANTIS II 67 5
Start date: 1972/ 4/12  0.00   End date: 1972/ 5/ 6  0.00
Date added: 1980/ 5/13 1617.54
Flags: 0   Security code: 0   Reference #: 151   Reserved: 0
Project name: IDOE
Chief scientist: MILLIMAN
Port stops: WALVIS BAY-WALVIS BAY
Contributor: Woods Hole Oceanographic Institution
Comments: SE ATLANTIC OCEAN
Number of Data Points    Parameter name
1310                     CORRECTED DEPTH
1310                     POSITION(S)

>> SET/BOUNDS -21,-25,14,20
>> DISPLAY
*PRTDATSUM message, summary of       -21
data available in the area       14       20
indicated to the right.              -25
Number of data points    Parameter name
44                       POSITION(S)
44                       CORRECTED DEPTH
0                        GEOMAGNETIC ANOMALY
0                        FREE AIR ANOMALY

>> END
GOOD BYE FOR NOW.
TIMES IN SECONDS      BUFFERED I/O   DIRECT I/O   PAGE FAULTS
CPU      ELAPSED
1.61     64.46        237            142          77
FORTRAN STOP
52
THE S.I.O. SAMPLE INDEX
Stuart M. Smith
Virginia W. Psaropulos
James S.
Charters
Geological Data Center, Scripps Institution of Oceanography
La Jolla, California
BACKGROUND
The SIO Sample Index originated in 1972 at the request of the Oceanographic Collections Committee to provide basic information about time, date, position, and initial disposition for samples, measurements, and recordings collected on Scripps cruises.
The three main objectives of the Index are to:
1) Assist investigators collecting the data by providing lists of samples with positions added from the computer-stored smoothed navigation file in a format useful for generating required cruise reports.
2) Assist curators responsible for a given sample type in knowing what data were collected and by whom.
3) Provide for retrievals from a general data base searched by geographic area, date, sample type, cruise, or disposition.
A functional flow chart of Index processing, storage, and retrieval is shown in Figure 1. Figure 2 is an example of an encoding form of the Index data record.
The Index data record contains:
Time
Date                          (K)
Timezone
Sample Type/Subtype Code      (K)
Begin/End/Continue Flag
Sample Identifier (2 parts)
Disposition Code              (K)
Latitude, Longitude           (K)
Position Control Code
Cruise (cruise, leg, ship)    (K)
(K) = MIDAS Search Key
Index data have been encoded for all legs of major cruises on R/V Washington and Melville since 1972 and included with the cruise reports generated by the Geological Data Center. The index now resides on punched cards ordered by cruise leg and totals roughly 25,000 entries from all the main disciplines. General cruise information, such as begin/end ports and dates, is included, as are entries for members of the scientific parties. There are two associated files, one for sample type codes (TYPE) and a second for disposition codes (DISP) that are used as search keys and by the processing and report programs to provide code description and validity checks (see Figs.
3 and 4).
PRESENT STATUS
Single cruise processing and report generation are done on the IBM 1800 computer. Programs have recently been written for loading and retrieving index data using the MIDAS data management package on the Scripps PRIME 750 computer, which has the capacity to store large disk files and allows access via remote terminals. Data for about 30 cruises (4,000 entries) have been loaded into a pilot database for testing the retrieval programs (Fig. 5).
FUTURE PLANS
The data will be split into five separate, but functionally equivalent files:
General cruise
Biology
Geology/Geophysics
Physical Oceanography/Chemistry
Atmospheric Data
Separating the files by discipline allows use of common programs yet cuts down on individual search times. Sequential access to more than one file will be permitted in the same retrieval session to allow cross-disciplinary searches.
53
The GDC will have overall responsibility for the initial loading and maintaining of the database, and maintenance of the general retrieval programs as well as the TYPE and DISP files.
Figure 1. Sample Index processing, storage, and retrieval flow chart. (The chart shows single cruise leg processing on the IBM 1800 - encoding/keypunching of at-sea cards, addition of navigation, code checks, and cruise leg reports - feeding a multi-cruise data base on the PRIME 750 with MIDAS Sample Index files by discipline, TYPE/DISP/SHIP code files, and interactive search, sort, and print retrieval, including secondary searches via the INFO (Admin8) package. S.M. Smith, Oct. 80.)
54
A person designated for each of the major disciplines will coordinate the needs and problems of the curators and users in each field and act as principal contact with GDC for such matters as defining new type codes, etc. (Fig. 6). An additional, recently purchased, data management package, INFO (Admin8), will permit individual users and curators to easily communicate with MIDAS, update the sample identifier and other unprotected fields, design smaller independent or related databases, search and sort on non-MIDAS key items and generate customized reports.
October 1980
(Adapted from poster session presented at the Marine Geology and Geophysics Workshop: Frontiers in Data Storage, Retrieval, and Display - Boulder, CO, November 5-7, 1980. L. Hydock assisted with illustrations and report preparation.)
S.I.O. SAMPLE INDEX ENCODING FORM
(Sample entries on the form record GMT time, date, local time, time zone, sample type/subtype code, begin/end flag, sample identifier, disposition code, latitude, longitude, position code, and cruise - for example 3.5 kHz and 12 kHz fathometer records, magnetics records, airgun seismics, gravity cores, dredges, cameras, and free-fall grabs from R/V Washington cruises in November 1978.)
Figure 2. Example of Sample Index encoding form.
55
(Partial listing of the TYPE code file, entries 113-170, covering midwater trawls (TM), traps (TR), cameras (CA), cores (CO), Deep Sea Drilling Project drill holes (DH), depth records (DP), dredges (DR), deep towed instrument packages (DT), and electric field measurements (EF); for example, COBX = BOX CORE, COGV = GRAVITY CORE, COPS = PISTON CORE, DRSD = SEDIMENT DREDGE, DPR3 = 3.5 KHZ FATHOMETER RECORD, DPRT = 12 KHZ FATHOMETER RECORD, and, within each group, an XX code for special, undefined, or unknown samples.)
Figure 3.
TYPE Code File - partial listing.
56
(Partial dump of the DISP file - 144 records, loaded 14JAN80, replacing the file loaded 17SEP79 - listing three-letter disposition/affiliation codes for individual investigators, SIO groups, other institutions, and countries; for example, GCR = GEOLOGICAL CURATING FACILITY -- W. RIEDEL, GDC = GEOLOGICAL DATA CENTER -- S. SMITH, GSU = U.S. GEOLOGICAL SURVEY, DSD = DEEP SEA DRILLING PROJECT -- B. LONG, and national codes such as CAN = CANADA and FNC = FRANCE.)
Figure 4. Disposition/Affiliation (DISP) File - partial listing.
57
OK, SEG RET
SIO SAMPLE INDEX RETRIEVAL PROGRAM
DATABASE IS MIDAS/ALLSAMPLES VER. 20 AUG 1980
ENTER TERMINAL TYPE ('S' FOR SCREEN OR 'H' FOR HARDCOPY): H120
ENTER NAME: SMSMITH
ENTER DEPARTMENT/INSTITUTION (E.G. GRD/SIO): GRD
WANT NEWS (LAST UPDATE)? N
WANT HELP (YES OR NO)? N
WANT PROMPTS FOR EACH INPUT? N
SEARCH BY AREA? Y
ENTER TOP LAT: 10 00.00
ENTER BOTTOM LAT: 0 00.00
ENTER LEFT LONG: 120 00.00
ENTER RIGHT LONG: 140 00.00
SEARCH: HELP
ENTER SEARCH COMMANDS IN THE FORM OF
SEARCH: DISP FOR <KEYSTRING>    (DISPOSITION CODE)
SEARCH: TYPE FOR <KEYSTRING>    (TYPE CODE)
SEARCH: CRUISE FOR <KEYSTRING>
SEARCH: DATE FROM <DDMMMYY> TO <DDMMMYY>
SEARCH: GO    (FOLLOWS LAST SEARCH COMMAND)
SEARCH: TYPE FOR DR
SEARCH: GO
STARTING SEARCH FOR TYPE CODE DR (DREDGE)
STARTING SEARCH BY LAT/LONG
12 RECORDS SATISFY REQUEST (HITS)
SORT HIT FILE?
Y
SORT: HELP
ENTER SORT FIELDS IN THE FORM OF
SORT: TYPE    (FOR TYPE CODE SORT)
SORT: DISP    (FOR DISPOSITION CODE SORT)
SORT: LAT     (FOR LATITUDE SORT)
SORT: LONG    (FOR LONGITUDE SORT)
SORT: DATE    (FOR DATE SORT)
SORT: CRUISE  (FOR CRUISE ID SORT)
SORT: IDENT   (FOR IDENT FIELD SORT)
SORT: GO      (TO START SORT)
SORT: CRUISE
SORT: GO
PRINT HIT FILE? Y
DO YOU WANT OUTPUT SPOOLED TO THE PRINTER? N
PRINT: HELP
ENTER PRINT FIELDS IN THE FORM OF
PRINT: ID       (RECORD ID)
PRINT: CRUISE   (CRUISE ID)
PRINT: START    (BEGINNING TIME/DATE LAT/LONG)
PRINT: END      (ENDING TIME/DATE LAT/LONG)
PRINT: DISP     (DISPOSITION CODE)
PRINT: TYPE     (SAMPLE TYPE)
PRINT: IDENT    (DATA IDENTIFIER)
PRINT: BIDENT   (DATA IDENTIFIER 1)
PRINT: EIDENT   (DATA IDENTIFIER 2)
PRINT: TIME     (TIME/DATE TIME ZONE)
PRINT: POSN     (LAT/LONG)
PRINT: BTIME    (BEGINNING TIME/DATE)
PRINT: ETIME    (ENDING TIME/DATE)
PRINT: BPOSN    (BEGINNING LAT/LONG)
PRINT: EPOSN    (ENDING LAT/LONG)
PRINT: POSITION (POSITION CODE)
PRINT: TZONE    (TIME ZONE)
PRINT: ALL      (ALL DATA)
PRINT: GO       (START PRINTING)
PRINT: CRUISE
PRINT: TYPE
PRINT: BIDENT
PRINT: POSN
PRINT: GO
INDP04WT DRRO DREDGE 02  4026M  10-01.7N 138-22.9E   9-44.5N 137-48.2E
INDP04WT DRRO DREDGE 03  2742M   8-32.8N 135-52.2E   8-32.3N 135-53.1E
INDP04WT DRRO DREDGE 04  3555M   7-27.2N 136-24.3E   7-27.3N 136-24.1E
INDP04WT DRRO DREDGE 05  4161M   7-46.6N 134-46.6E   7-47.5N 134-45.2E
INDP04WT DRRO DREDGE 06  3914M   7-26.0N 133-42.9E   7-29.2N 133-45.8E
INDP04WT DRRO DREDGE 07  4545M   8-56.7N 138-01.2E   8-56.9N 137-59.4E
INDP04WT DRRO DREDGE 08  3895M   9-40.5N 138-20.4E   9-40.7N 138-17.4E
INDP07WT DRRO INDP 10D   2205M   1-00.3N 125-12.2E   0-52.5N 125-07.8E
INDP10WT DRRO DREDGE 11  1425M   2-48.6N 126-25.4E   2-47.4N 126-20.8E
INDP10WT DRRO DREDGE 12  3298M   2-27.7N 127-34.7E   2-32.0N 127-36.8E
INDP10WT DRRO DREDGE 13  2104M   1-09.1N 126-22.8E   1-02.5N 126-22.7E
INDP10WT DRRO DREDGE 14  1790M   0-44.0N 125-59.5E   0-43.5N 126-01.1E
WOULD YOU LIKE TO RE-SORT THE HIT FILES? YES
SORT HIT FILE? Y
SORT: LAT
SORT: LONG
SORT: GO
PRINT HIT FILE? Y
DO YOU WANT OUTPUT SPOOLED TO THE PRINTER? N
PRINT: CRUISE
PRINT: TYPE
PRINT: BIDENT
PRINT: POSN
PRINT: GO
INDP10WT DRRO DREDGE 14  1790M   0-44.0N 125-59.5E   0-43.5N 126-01.1E
INDP07WT DRRO INDP 10D   2205M   1-00.3N 125-12.2E   0-52.5N 125-07.8E
INDP10WT DRRO DREDGE 13  2104M   1-09.1N 126-22.8E   1-02.5N 126-22.7E
INDP10WT DRRO DREDGE 12  3298M   2-27.7N 127-34.7E   2-32.0N 127-36.8E
INDP10WT DRRO DREDGE 11  1425M   2-48.6N 126-25.4E   2-47.4N 126-20.8E
INDP04WT DRRO DREDGE 06  3914M   7-26.0N 133-42.9E   7-29.2N 133-45.8E
INDP04WT DRRO DREDGE 04  3555M   7-27.2N 136-24.3E   7-27.3N 136-24.1E
INDP04WT DRRO DREDGE 05  4161M   7-46.6N 134-46.6E   7-47.5N 134-45.2E
INDP04WT DRRO DREDGE 03  2742M   8-32.8N 135-52.2E   8-32.3N 135-53.1E
INDP04WT DRRO DREDGE 07  4545M   8-56.7N 138-01.2E   8-56.9N 137-59.4E
INDP04WT DRRO DREDGE 08  3895M   9-40.5N 138-20.4E   9-40.7N 138-17.4E
INDP04WT DRRO DREDGE 02  4026M  10-01.7N 138-22.9E   9-44.5N 137-48.2E
WOULD YOU LIKE TO RE-SORT THE HIT FILES? NO
SAVE HIT FILE? YES
INPUT UFD NAME AND PASSWORD: CHARTERS>SAMPLEINDEX
INPUT FILE NAME: SI DR
FILE TO BE WRITTEN IN BINARY? NO
DO YOU WANT TO MAKE A NEW RETRIEVAL? N
Figure 5. Example of Search by Sample Type and geographic area with multiple sorting and printing of selected items in the data record. User input is underlined.
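The search-then-sort flow of the session in Figure 5 can be sketched in miniature (the record layout, field names, and prefix matching on the type code are illustrative assumptions, not the actual MIDAS structures):

```python
# Miniature sketch of the Figure 5 retrieval flow: search by type code
# and geographic window, then sort the hit file. Record layout and
# prefix matching are illustrative, not MIDAS's actual behavior.
records = [
    {"cruise": "INDP04WT", "type": "DRRO", "lat": 10.03, "lon": 138.38},
    {"cruise": "INDP10WT", "type": "DRRO", "lat": 2.81,  "lon": 126.42},
    {"cruise": "INDP07WT", "type": "COGV", "lat": 1.01,  "lon": 125.20},
]

def search(recs, type_prefix, top, bottom, left, right):
    """SEARCH: TYPE FOR <prefix> combined with a SEARCH BY AREA window."""
    return [r for r in recs
            if r["type"].startswith(type_prefix)
            and bottom <= r["lat"] <= top and left <= r["lon"] <= right]

hits = search(records, "DR", top=10.0, bottom=0.0, left=120.0, right=140.0)
hits.sort(key=lambda r: (r["lat"], r["lon"]))  # SORT: LAT then SORT: LONG
print([r["cruise"] for r in hits])             # -> ['INDP10WT']
```

The first record falls just outside the 10-degree window and the third fails the type match, so only one hit survives; sorting on a (lat, lon) key mirrors the session's two consecutive SORT commands.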
Figure 6. Flow chart of proposed administrative organization. (The proposal places the SIO Oceanographic Collections Committee over a GDC Index Administrator, with disciplinary coordinators and curators for biology, geology/geophysics, physical oceanography/chemistry, and atmospheric data; GDC is responsible for documentation, sample identifier keys, initial data base loading, and updates, with at-sea encoding by resident technicians and programming support from the shipboard computer group. S.M. Smith, Oct. 1980.)
59
A SYSTEM FOR STORAGE, RETRIEVAL, AND DISPLAY
OF UNDERWAY GEOPHYSICAL DATA
Stuart M. Smith
Gregory Papadopoulos
Geological Data Center, Scripps Institution of Oceanography
La Jolla, California
Figure 1 shows flow charts for single cruise data processing and the multicruise storage and retrieval system.
SIO HEADERS
One for each cruise leg: contains all items of the MGD77 Header record (Fig. 2). Most alpha fields are abbreviated, with SIO/MGD77 translation performed via the SIO Mnemonic file (Fig. 3). Has additional "parameter status" flags for navigation, depth, magnetics, gravity, high resolution (3.5 kHz) and deep penetration (airgun) seismics to indicate whether each was collected, and the status of processing, proprietary holds, and transmission to NGSDC for both analogue (film) and digital forms of each data type.
PACKED DATA FILES
Merged data files received in the EBCDIC/BCD character mode exchange formats (MGD77, Merge-Merge72, etc.) are converted to a packed format by the WRITEPACKEDFILE program. A global definition associated with each file describes that file's structure and contents (i.e., a file for a cruise that collected only depth information will contain time, position and depth; another file may have all the geophysical parameters). The file definition includes null value flags and any bias or factors applied to each item. The file structure presently used has all items converted to 24-bit positive integers.
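A minimal sketch of such a packed representation follows. The particular field list, biases, and scale factors below are illustrative assumptions (the actual global definition is file-specific); the sketch only shows the general idea of scaling each item into a 24-bit non-negative integer, with time carried as hundredths of minutes from the start of the leg.

```python
MASK24 = (1 << 24) - 1  # all items are 24-bit positive integers

# Illustrative field definitions (not an actual SIO global definition):
# (name, bias, scale) with stored = round((value + bias) * scale).
FIELDS = [
    ("time",  0.0,   100),    # minutes since start of leg, in hundredths
    ("lat",   90.0,  10000),  # degrees, biased positive, 0.0001-deg steps
    ("lon",   180.0, 10000),
    ("depth", 0.0,   10),     # corrected depth in meters, 0.1 m steps
]

def pack_record(values):
    words = []
    for (name, bias, scale), v in zip(FIELDS, values):
        w = round((v + bias) * scale)
        assert 0 <= w <= MASK24, f"{name} out of 24-bit range"
        words.append(w)
    return words

def unpack_record(words):
    return [w / scale - bias
            for (name, bias, scale), w in zip(FIELDS, words)]

rec = pack_record([754.25, -23.5417, 16.25, 4510.0])
print(rec)  # -> [75425, 664583, 1962500, 45100]
print(unpack_record(rec))
```

The bias keeps southern latitudes and western longitudes non-negative, and the null value flags mentioned in the text would simply reserve one 24-bit code per field.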
Time is stored in hundredths of accumulated minutes from the beginning of the cruise leg.
INQUIRE PROGRAM
The principal interactive communications program for the system, INQUIRE loads and modifies any item in the SIO Header file. Searches can be made on the contents, or partial contents, of any Header file item by logical AND/OR sequences to produce a Hit file (Fig. 4). The Hit file can be listed or used as input to a (still to be developed) automatic file retrieval program for plotting or generating magnetic tape exchange files. Any of roughly 70 plot specifications may be set and optionally saved as CANNEDSPECS files for future plots. Options include grid lines, ticks, labels; track annotation of minute, hour, date ticks, and labels; and data print or profile modes. Suppression parameters for minimum spacing of track ticks and labels and for ship speed reduce overprinting. Several track specifications may be applied to one grid to allow plotting of one or several data parameters in both print and profile modes, or multiple cruises on one plot with different track annotations (Figs. 5 and 6).
PLOTPAK PROGRAM
PLOTPAK uses the PLOTSPECS files generated by INQUIRE and packed data files previously loaded from tape to disk to produce disk plot files. After preview on a graphics terminal, hard copy plots are requested by way of the UCSD Computer Center PLOTREQUEST program.
PROGRAM EXECUTION
Programs may be executed interactively or run sequentially in batch job files using the Burroughs Work Flow Language (WFL) for interprogram communication and testing successful task completion.
NAVIGATION ABSTRACT (under development)
The five-degree-square indexing method used in a previous system at Scripps will be replaced by a coarse "geographic window" index and a finer scale abstract of significant points along track for each cruise.
The index, patterned after the NGSDC abstract file, will be used for area searches, plotting small scale track plots, and transmitting track information to the Sample Index.
November 1980
(Adapted from poster session presented at the Marine Geology and Geophysics Workshop: Frontiers in Data Storage, Retrieval, and Display - Boulder, CO, November 5-7, 1980. L. Hydock assisted with illustrations and report preparation.)
60
Figure 1. Single cruise processing and multi-cruise data base flow charts. (Single cruise leg processing on the IBM 1800 covers navigation smoothing and editing, depth/magnetics/gravity merging with IGRF75 magnetic anomaly calculation, profile plots, cruise leg reports, export copies for NGSDC, and microfilming of seismic profiler records; the multi-cruise data base on the Burroughs 7800 comprises SIO header files, packed data files with global definitions, the INQUIRE program for header searches, hit files, and plot specifications, the PLOTPAK program for track and data map plots, MGD77 conversion for tape exchange, and a navigation abstract file. S.M. Smith, Oct. 80.)
61
(CRUISE) CRUISE: MARA10WT
(TITLE) TITLE: MARIANA LEG 10
(SCIENT) CHIEF SCIENTIST(S): SHOR G.G.
(BDATE) BEGIN DATE: 16 FEB 79
(BPORT) BEGIN PORT: JAKARTA
(EDATE) END DATE: 10 MAR 79
(EPORT) END PORT: SUBIC BAY, PHIL
(IDNUM) GDC IDENTIFICATION #0176
PARAMETERS COLLECTED: BATH MAGS GRAV 3.5KC SEIPRF
(INST) CONTRIBUTING INSTITUTION: SCRIPPS
(SHIP) PLATFORM NAME: WASHINGTON
(PLATCO) PLATFORM CODE 1
(FUNDIN) PROJECT, FUNDING ETC.: <NORDA/ONR (749,UC446080-19909>
LAST HEADER UPDATE ON 26 OCT 79
(STATUS) STATUS OF GEOPHYSICAL PARAMETERS
                          NAV  BATH  MAGS  GRAV  3.5KC  SEIPRF
(COLLECT)   COLLECTED      *    *     *     *     *      *
(DIGITPRO)  DIGIT. PROC.   *    *     *     -     -      -
(DIGITNG)   DIGIT. NGDC    -    -     -     -     -      -
(MICRO)     MICROFILMED    *    *     *     *     *      -
(MFILMNG)   MFILM NGDC     -    -     -     -     -      -
(INSTHOLD)  INST. HOLD     *    *     *     *     *      -
(PROPRIET)  PROPRIETARY    -    -     -     -     -      -
'*' = YES, '-' = NO, '?' = UNKNOWN
(NAVINS) NAV.
INSTRUMENTATION: SATAUTODOPPL
(POSMET) POSITION METHOD: LINEARINTERP
(BATHIN) BATHY INSTRUMENTATION: 3.5&12UGRGFT
(BATHFO) BATHY DATA FORMS: RECCARDFILM
(BATHDI) BATH DIGIT. RATE: 05.0 MIN.
(BATHSA) BATH SAMPLING RATE: 1SECONDSWEEP
(BATHVE) ASSUMED SOUND VELOCITY: 1463.0 METERS/SEC.
(BATHIT) BATHY. INTERPOLATION: 5MININT
(BATHDA) BATHY. DATUM CODE: 00
(MAGINS) MAG INSTRUMENTATION: VARIANPROTON
(MAGFOR) MAG DATA FORMS: RECCARDFILM
(MAGDIG) MAG DIGITIZING RATE: 05.0 MIN.
(MAGSAM) MAG SAMPLING RATE: 06 SEC.
(MAGTOW) MAG SENSOR DIST.: 9999 METERS
(MAGSEP) MAG SENSOR SEP.: 999 METERS
(MAGREF) MAG REF. FIELD CODE: 75
(MAGCAL) MAG RES. CALC. METHOD: ONEDEGPLANE
(GRAVIN) GRAV INSTRUMENTATION: ASKANIA-GSS2
(GRAVFO) GRAV DATA FORMS: RECORDS
(GRAVDI) GRAV DIGIT. RATE:       MIN.
(GRAVTH) THEORETICAL GRAV FORM. CODE:
(GRAVRE) GRAV REFERENCE SYST. CODE:
(GRAVCO) GRAV CORRECTIONS APPLIED:
(BGRAVA) DEPARTURE BASE STATION GRAVITY:       MGAL.
(BGRAVD) DEPARTURE STA. DESCRIPT.:
(EGRAVA) ARRIVAL BASE STATION GRAVITY:       MGAL.
(EGRAVD) ARRIVAL STA. DESCRIPT.:
(SEISIN) SEISMIC INSTRUMENTATION: MULTICHANNEL
(SEISFO) SEISMIC DATA FORMS: RECFILMTAPE
>
Figure 2. List of SIO cruise header record.
62
(First page of the SIO Mnemonic file: coded values with their MGD77 translations for navigation instrumentation (NAVINS), position method (POSMET), bathymetry instrumentation (BATHIN), bathymetry data forms (BATHFO), digitizing and sampling rates (BATHDI, BATHSA), assumed sound velocity (BATHVE), and bathymetric datum codes (BATHDA) - for example, SATAUTODOPPL = SATNAV, AUTO LOG GYRO + 2D DOPPLER SPD LOG; LINEARINTERP = LINEAR INTERP. BETWEEN ADJACENT FIXES; 14630 = 1463 METERS/SEC (= 800 FATHOM/SECOND); RECCARDFILM = ANAL. RECORDS, CARDS, 35MMFILM.)
Figure 3. SIO Mnemonic file for SIO/MGD77 Translation.
(Continuation of the mnemonic listing: header field descriptions for the cruise leg identifier (CRUISE), title (TITLE), GDC accession number (IDNUM), source institution (INST), ship name (SHIP), platform type code (PLATCO), chief scientist (SCIENT), project/funding (FUNDIN), header update date (UPDATE), and begin/end dates and ports (BDATE, EDATE, BPORT, EPORT); institution codes such as SCRIPPS, LAMONT, and WOODSHOLE; and platform names, e.g. 014 WASHINGTON = R/V THOMAS WASHINGTON, 008 MELVILLE = R/V MELVILLE, 002 AGASSIZ = R/V ALEX. AGASSIZ, 005 GCHALLENGER = D/V GLOMAR CHALLENGER.)
(Final page: seismic and gravity mnemonics, e.g. MULTICHANNEL = 24CHANNEL, 40-300CU.IN.AG, GEOSPACE DAS209; ASKANIA-GSS2 = ASKANIA GSS2, ANSCHUTZ ELECT. GYRO TABLE; RECCARDFILM = ANALOGUE RECORDS, PUNCH CARDS, 35MM FILM.)
Figure 3. (Continued)
AIRGUN, DIGITAL 1 CHANNEL\nGRAVRF 000 GDC01NGSDC01 = GRAVITY REFERENCE SYSTEM CODE (MGD77)\n30300 SEISFO 003 RECFILMTAPE = ANAL. REC, 35MM FILM, DIGITAL MAG. TAPE\n33400 BGRAVA 000 GDC07NGSDC07 = BEGINNING BASE STATION GRAVITY (MGAL)\nGDC03NGSDC03 = GRAVITY DIGITIZING RATE (0.1 MINUTE)\n30000 SEISFO 000 GDC12NGSDC38 = ADDITIONAL FORMATS OF SEISMIC DATA\n33600 BGRAVD 000 GDC33NGSDC33 = BEGINNING BASE STATION DESCRIPTION\nGDC12NGSDC40 = SEISMIC PROFILER INSTRUMENTATION\n30800 GRAVFO 000 GDC12NGSDC38 = ADDITIONAL FORMS OF GRAVITY DATA\n= OTHER (SPECIFY IN COMMENTS FIELD)\nOTHER (SPECIFY IN COMMENTS FIELD)\n= ANALOGUE RECORDS, 35MM MICROFILM\n33800 EGRAVA 000 GDC07NGSDC07 END BASE STATION GRAVITY (MGAL)\n= ANALOGUE RECORDS, PUNCH CARDS\n34000 EGRAVD 000 GDC31NGSDC31 = END BASE STATION DESCRIPTION\n= ANAL. RECORDS, 35MM MICROFILM\n33100 GRAVCO 000 GDC12NGSDC38 = GRAVITY CORRECTIONS APPLIED\n30500 GRAVIN 000 GDC12NGSDC40 GRAVITY INSTRUMENTATION\n= INTERNATIONAL1930*\n*\n= ANALOGUE RECORDS\n= IAG SYSTEM 1967\n= 35MM MICROFILM\n= HEISKANEN 1924\n= POTSDAM SYSTEM\n= SYSTEM IGSN 71\n33200 GRAVCO 001 DRIFTANDTARE DRIFT AND TARE\n5 MINUTE RATE\n= UNSPECIFIED\n= UNSPECIFIED\nAIRGUN40PDR\nRECFILM\nRECORDS\nRECFILM\nRECCARD\nARCER\nFILM\n050*\n31600 GRAVDI 002 999*\n1*\n2*\n3*\n32200 GRAVTH 004 8*\n32300 GRAVTH 005 9*\n1*\n32700 GRAVRF 002 2*\n32800 GRAVRF 003 3*\n32900 GRAVRF 004 9*\n29100 SEISIN 000\n29200 SEISIN 001\n29300 SEISIN 002\n29600 SEISIN 005\n29800 SEISIN 007\n001\n30200 SEISFO 002\n30900 GRAVFO 001\n31000 GRAVFO 002\n31100 GRAVFO 003\n31400 GRAVDI 000\n31500 GRAVDI 001\n31900 GRAVTH 001\n32000 GRAVTH 002\n32100 GRAVTH 003\n32600 GRAVRF 001\n29400 SEISIN\nSEISFO\n29900\n30100\n30400\n30700\n31300\n31700\n32400\n32500\n33000\n33300\n33500\n33700\n33900\n34100\n34200\n28900 MAGCAL 002 TRACK500MILE = INTERP. 30DEG C/C OR 500MI ALONG TRACK*\n28800 MAGCAL 001 ONEDEGPLANE LIN. INTERP. 
POINTS WITHIN ONE DEGREE SQUARE\n28700 MAGCAL 000 GDC12NGSDC47 METHOD OF REMOVING RESIDUAL MAGNETIC FIELD\n25100 MAGDIG 000 GDC03NGSDC03 MAGNETICS DIGITIZING RATE (TO 0.1 MINUTE)\n27000 MAGREF 000 GDC02NGSDC02 = MAGNETIC REFERENCE FIELD CODE (SEE MGD77)\n*\nSCRIPPSMAG = SCRIPPS MFD PROTON PRECESSION (VACQUIER)\n*\n*\nGDC02NGSDC02 = MAGNETICS INSTRUMENT SAMPLING (SEC)\n26400 MAGDEP 000 GDC05NGSDC05 = MAGNETIC SENSOR TOW DEPTH (1/10 METERS)\n26700 MAGSEP 000 GDC03NGSDC03 MAGNETIC SENSOR TOW SEPARATION (METERS)\n24000 MAGINS 002 VARIANPROTON = VARIAN MFD PROTON PRECESSION MOD 4970\n26100 MAGTOW 000 GDC04NGSDC04 = MAGNETIC SENSOR TOW DISTANCE (METERS)\nGEOMETRIC801 GEOMETRICS MAGNETOMETER MOD G-801/3\n24300 MAGFOR 000 GDC12NGSDC38 ADDITIONAL FORMS OF MAGNETICS DATA\n= OTHER (SPECIFY IN COMMENTS FIELD)\nOTHER (SPECIFY IN COMMENTS FIELD)\nGDC12NGSDC56 = BATHYMETRY INTERPOLATION SCHEME\nEQUATORIAL SPRING LOW WATER\n= ANAL. RECORDS, CARDS, 35MMFILM\n23800 MAGINS 000 GDC12NGSDC40 = MAGNETICS INSTRUMENTATION\nLOWEST ASTRONOMICAL TIDE\n15 SECOND SAMPLING RATE\n= 60 SECOND SAMPLING RATE\nTROPIC LOWER LOW WATER\nANAL. RECORDS, 35MM FILM\n= 6 SECOND SAMPLING RATE\nANAL. 
RECORDS, CARDS\n= 5 MINUTE INTERVAL\n= 6 MINUTE INTERVAL\n= 3 MINUTE INTERVAL\n= 2 MINUTE INTERVAL\n= ANALOGUE RECORDS\n= PEAK AND TROUGH\n= 35MM MICROFILM\nPUNCHED CARDS\n5 MINUTE RATE\n*\n*\n*\n*\n*\n*\n*\n*\n*\n*\nUNSPECIFIED *\nUNSPECIFIED\nUNSPECIFIED\nUNSPECIFIED\n= UNSPECIFIED\nUNSPECIFIED\nUNSPECIFIED\n= IGRF 1965\n= IGRF 1975\n= IGRF 1965\n= IGRF 1975\n= GSFC-1266\n= GSFC-0674\n= POGO 0368\nPOGO 1068\n= POGO 0869\nUNUSED\n= AWC 70\n= AWC 75\n= UK 75\nRECCARDFILM\nPEAKTROUGH\n5MININT\n6MININT\n3MININT\n2MININT\nRECORDS\nRECFILM\nRECCARD\n26500 MAGDEP 001 99999*\n26200 MAGTOW 001 9999*\nFILM\nCARD\n050*\n25300 MAGDIG 002 999*\n999*\n22500 BATHDA 010 09*\n22600 BATHDA 011 10*\n22700 BATHDA 012 11*\n22800 BATHDA 013 88*\n22900 BATHDA 014 99*\n06*\n15*\n25800 MAGSAM 003 60*\n99*\n65*\n75*\n00*\n01*\n02*\n03*\n04*\n05*\n06*\n07*\n08*\n28200 MAGREF 012 09*\n28300 MAGREF 013 10*\n28400 MAGREF 014 88*\n28500 MAGREF 015 99*\n23100 BATHIT 000\n23200 BATHIT 001\n23300 BATHIT 002\n23400 BATHIT 003\n23500 BATHIT 004\n23600 BATHIT 005\n23900 MAGINS 001\n24100 MAGINS 003\n24400 MAGFOR 001\n24500 MAGFOR 002\n24600 MAGFOR 003\n24700 MAGFOR 004\n24800 MAGFOR 005\n24900 MAGFOR 006\n25200 MAGDIG 001\n25500 MAGSAM 000\n25600 MAGSAM 001\n25700 MAGSAM 002\n25900 MAGSAM 004\n26800 MAGSEP 001\n27100 MAGREF 001\n27200 MAGREF 002\n27300 MAGREF 003\n27400 MAGREF 004\n27500 MAGREF 005\n27600 MAGREF 006\n27700 MAGREF 007\n27800 MAGREF 003\n27900 MAGREF 009\n28000 MAGREF 010\n28100 MAGREF 011\n23000\n23700\n24200\n25000\n25400\n26000\n26300\n26600\n26900\n28600","Figure 4. 
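Conceptually, a mnemonic file of this kind is a two-level dictionary: header field, then code, then its MGD77 expansion. A minimal sketch in Python, using a small hypothetical subset of entries for illustration:

```python
# Sketch of the mnemonic-dictionary idea: each SIO header field has a set of
# permitted code values, each mapped to its NGSDC MGD77 expansion.  The
# entries below are an illustrative subset, not the full SIO file.
MNEMONICS = {
    "BATHVE": {                      # bathymetry: assumed sound velocity
        "14630": "1463 METERS/SEC (=800 FATHOM/SECOND)",
        "15000": "1500 METERS/SECOND",
        "99999": "UNSPECIFIED",
    },
    "NAVINS": {                      # navigation instrumentation
        "CELESTIAL": "CELESTIAL, MANUAL ENTRY GYRO SPEED",
        "SATAUTODOPPL": "SATNAV, AUTO LOG GYRO + 2D DOPPLER SPD LOG",
    },
}

def translate(field: str, code: str) -> str:
    """Expand one header field/code pair, or flag it as unknown."""
    try:
        return MNEMONICS[field][code]
    except KeyError:
        return f"UNKNOWN CODE {code!r} FOR FIELD {field!r}"

print(translate("BATHVE", "15000"))   # -> 1500 METERS/SECOND
```

A translation program built on such a table can both expand codes for display and reject values that are not in the permitted set.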
Figure 4. Example of search on SIO cruise headers using Program INQUIRE.

QUESTION 1: What cruises near Callao, Peru were satellite controlled and collected high-resolution (3.5 kHz) seismics?
METHOD: Find cruises with BPORT or EPORT = "CALLAO", NAVINS = "SAT", and STATUS = "SEIPRF COLLECT".
[The terminal transcript shows the successive FIND commands narrowing the hit list to 8 cruises, among them FD774BMV, FD774CMV, FD774DMV, SOTW05WT, SOTW07WT, and SOTW08WT.]

QUESTION 2: What microfilms of SIO seismics are ready to send to NGSDC?
METHOD: Find cruises that collected seismics, older than two years, with filmed seismics not yet sent to NGSDC, not on proprietary hold, and having navigation on film.
[The transcript shows FIND and FIND EXCL commands on STATUS and EDATE narrowing the hit list to 18 cruises, among them ANTP01MV, CIRC02AR, and the NOVA series.]

Figure 5. Listing of CANNED (saved) plot specifications used to generate Figure 6. [The three-page listing, produced by VERIFY ALL on the canned specification TRDPPRN60IN, gives the grid scale (60 inches/degree longitude, Mercator projection) and the plot corners and size; grid parameters (lines, border, labels, crosses, and ticks); track parameters (cruise labels and track annotation); data parameters (print and profile modes, Matthews correction); miscellaneous pen and color parameters; the three plot titles; and the tick and label specifications for minute, hour, and day intervals along the track.]

Figure 6. Example of plot from Program PLOTPAK: a detailed survey plot of track and depth for cruise BNFC02MV at 60 inches/degree, with depths in fathoms (800 fm/sec) and minute ticks, covering approximately 10°07'N to 10°17'N and 108°54'W to 109°02'W.

CURRENT SYSTEMS DEVELOPMENT FOR MANAGEMENT OF MARINE GEOPHYSICAL DATA AT THE ATLANTIC GEOSCIENCE CENTRE, CANADA

Gwynneth M.
Martin
Geological Survey of Canada, Ottawa, Canada K1A 0E8

Abstract - The requirements of the Atlantic Geoscience Centre of the Geological Survey of Canada for the storage, retrieval and update of gravity, magnetic, bathymetric and navigation data have been established. Analysis and design of a software system to meet the requirements is complete.

The design uses two levels of data: an index level containing reference and calibration items and a lower level containing all the logged items. The data is structured hierarchically and SYSTEM 2000 is used to manage the index-level items. The lower-level items are stored in indexed sequential files.

The users' major requirements are reduced to four basic operations - index-level queries, retrieval of logged data, retrieval with user-substituted values, and update of data values. Processing uses FORTRAN programs (including SYSTEM 2000 PLEX instructions) and CYBER CCL procedures.

Key Words: Hierarchical model, SYSTEM 2000, Indexed sequential files.

BACKGROUND

The Atlantic Geoscience Centre is a division of the Geological Survey of Canada and forms part of the Bedford Institute of Oceanography. Part of its function is the collection of geophysical data during marine surveys and the subsequent management of that data. In the late 1960's and early 1970's the data management was accomplished using GEOFILE (Ross et al, 1973), a tape-based system operating on the Institute's CDC 3150 machine. However, as years passed, the volume of data and the number of users both grew, and the system began to fall short of meeting the ever-increasing user demands. The Institute's acquisition of a CYBER 171 in February 1979 made it feasible to begin the development of a replacement system.

In anticipation of this, a user committee had been active within AGC (Haworth et al, 1976). The first steps of definition of requirements and data items built on the work of that committee and were completed with the cooperation and assistance of potential users (geophysicists). A functional analysis followed, and a preliminary design was established and formed the basis of feasibility testing in the Institute during August-December 1979 (Martin, 1980). The design was consolidated and in June 1980 a small contract was let to DMR and Associates to complete the detailed design. As a result of this, the detailed design manual was delivered in October 1980.

This paper summarizes the results of requirements and functional analyses in terms of the data to be stored and the operations to be performed, describes the organization of data within the system, and outlines how the required operations will be carried out.

DATA CONTENT

The data to be stored in the data base are those values relevant to, or collected during, marine geophysical surveys. They relate to navigation, bathymetry, magnetics and gravity. The surveys are performed in "phases", a phase being the time between instrument calibration measurements at land reference stations.

A data dictionary was prepared containing for each item a description, the frequency with which each value occurs, the range of possible values, the related instrumentation and the location associated with the value. In building the dictionary, the computational relationships between data items were defined and each item was classed as primary (measured) or derived (resulting from a computation based on primary values). A total of 36 primary and 24 derived items were identified. All item values are related to a specific time within a phase.

The items were also classed into an 'index-level' or 'minute-level' group depending upon the frequency with which values occur. The index-level items are those having values occurring with a frequency of less than once per minute. The minute-level contains all items whose values occur with a frequency greater than or equal to once per minute.

OPERATIONS

The operations required were analyzed as belonging to one of four types:

1. Index-level query

The index-level query will allow a user to query at an index level, i.e. the query involves only items which occur less frequently than once per minute. It would give, for example, the capability of determining the existence of gravity data within an area or phase; of retrieving the calibration values for a gravimeter for a specific time period within a phase; of determining the quality of magnetic data for a specific time period within a phase, etc. It would not involve the most frequently occurring data items such as actual measurements logged every minute (or more frequently) within the phase. Since no particular qualifier and/or retrieval list could be identified, the requirement is for a very general query capability.

2. Retrieval

The retrieval function will allow a user to retrieve values of items in the minute-level, i.e. those items which occur with a frequency greater than or equal to once per minute. For example, the values of the gravity readings collected once per minute may be retrieved, and/or the derived Bouguer anomaly values. The user may include a qualifying clause which would specify:

a) either a rectangular area or specific time intervals within phases

and

b) the minimum quality level of the values to be retrieved.

3. Modified Retrieval

The modified retrieval function will allow a user to make a retrieval (as in 2 above) based on data base values which have been modified, but without making permanent changes to the data base. The user will qualify his retrieval by defining the specific time intervals within a phase and the modifications (and retrieval) to be made over those intervals. The retrieved values will be those of items in the minute-level. Modifications may be made to any of the primary data items. The values of any derived items to be retrieved will be recalculated using the modified values. Thus the effect of any proposed changes to the data base can be investigated fully.

4. Update

The update function will allow a permanent modification to be made to any primary value(s) in the data base. An update request will be made by defining specific time intervals within a phase and the modifications to be made over those intervals. The values of any derived items stored in the data base will be recalculated (using the new primary values) and modified also.

Note that this operation, unlike the previous three, will not be available to any user; it is the prerogative of the "data manager".

DATA BASE STRUCTURE

The logical data structure is a network, with an area "owning" observations, a phase "owning" observations, and phases and areas "owning" each other. However, one of the practical criteria established was that the use of existing, developed and supported software was desirable in order to lessen the on-going commitment of AGC staff to system maintenance. No commercially available DBMS package for the CYBER series handled a network structure.

Examination of the logical relationships led to the development of a hierarchical data organization with the phase as the basic entity. Factors leading to this choice were that all items are linked by time within a phase; several computations of derived values need sequential measurements within a phase; and that updates are always done on a phase basis. On the other hand, it is equally important to retrieve information by phase or by area. This latter link was incorporated into a hierarchy by establishing for each phase the degree squares through which the phase passed, and for each degree square the times at which the phase entered and left the square. Thus the specification of a rectangular area can be transformed into a specification of time intervals within a number of phases.

SYSTEM 2000 was selected from the limited software available. It is a commercially available DBMS based on a hierarchical data model. Although all the data items could have been defined in a single SYSTEM 2000 data base, only the index-level items are included. The bulk of the information is obviously in the minute-level items, and by keeping those physically separate, the size of the index data base is such that it can be kept on-line at all times. Inclusion of the minute-level values would increase the size by almost two orders of magnitude and also implies possible structural, and therefore expensive, changes in an update operation.

Figure 1 illustrates the structure of the index-level data-base.

The minute-level data for a phase is a flat file of records for each minute of the phase, with relationships between data items simple intrarecord relationships. Given the update and retrieval requirements, the data is organized in an indexed sequential structure. A separate indexed sequential file is used for each phase to allow less-frequently used data to be moved off- and on-line conveniently.

SYSTEM OPERATION

Index-level Queries

The first requirement listed above was for index-level queries. These involve only index-level items and would therefore need to access only the SYSTEM 2000 data-base.

SYSTEM 2000 has a natural language capability, i.e. ad hoc queries can be made using English-like commands. This is ideally suited for index-level queries. The following examples illustrate the type of command.

i) PRINT CHIEF SCIENTIST WHERE PHASE EQ P79013

ii) PRINT GRW1, GRW2 WHERE PHASE EQ P73012

iii) PRINT PHASE WHERE DEGSQ LAT SPANS 52*57 AND LONG SPANS 45*48

There is also the capability to build 'strings' which are parameter driven and allow Natural Language commands to be abbreviated. For example, the third command above could be invoked by:

PHASE IN AREA (52, 57, 45, 48)

The user enters these queries from a time-sharing terminal and the response is timely.

Retrieval and Update Operations

The requirements for retrieval, modified retrieval and update involve data items in both the index-level data base and the indexed sequential minute-level files. All three operations are similar in that they ultimately involve processing minute-level data and are carried out in two parts as illustrated in Figure 2. In the first step - the operation specification - the user establishes valid parameters for the operation required. This will involve accessing the index-level data-base; a new values file may be created (for modified retrieval or update); and the parameters are accumulated in a parameter file. The second step - update/retrieval - actually performs the minute-level operation required.

Figure 1. Structure of Index-level data-base (phase, degree square, calibration, nav. fixes, quality, start and end times).

Figure 2. Overview of Retrieval and Update Operations (user, operation specification, parameter file, new values file, index-level data base, minute data base, update/retrieval, retrieved values).

The second step will involve accessing the complete data base, the parameter file and the new values file, and will produce either a file of retrieved values (for retrieval or modified retrieval) or a modified data-base (for update).

The flow of the operation is designed so that the first step, which will be executed interactively, will initiate the second step, a batch job, to perform the bulk of the processing.

Figures 3a and 3b expand on the detail of the two steps, taking as an example a retrieval by area. In the first step (Figure 3a) the user is prompted for

a) the area required

b) the data items to be retrieved

and

c) the minimum acceptable quality of values required.

Feedback is given to the user so that he is aware of the quantity of data which will be retrieved in response to his request. Interactive programs and procedures access the index-level data-base to reduce the required area to a series of time intervals within phases. These time intervals are screened, again using the index-level data-base, to produce (sub)intervals in which data is of at least the specified quality. The parameter file contains these screened intervals and the list of data items to be retrieved. This file becomes input to programs executed in the batch job described in Figure 3b.

The batch job itself is divided into two parts to ensure only moderate core requirements at any point. Thus an intermittent data file is first created, containing those values from the index-level data base which are required for the computation of derived values which are not explicitly stored but which are to be retrieved. The minute data processor uses that with the minute-level indexed sequential files to produce a file of retrieved values.

Figure 3a. Operation Specification (Step 1) for retrieval by area.

Figure 3b. Retrieval stage (Step 2) for retrieval by area.
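The two-step flow described above can be sketched in miniature. Everything in this sketch (the SquareEntry record, the field names, and the sample data) is hypothetical, standing in for the SYSTEM 2000 index-level data base and the per-phase indexed sequential minute files:

```python
from dataclasses import dataclass

# Step 1 (operation specification): reduce a rectangular area to
# quality-screened time intervals within phases -- the "parameter file".
# Step 2 (retrieval): scan each phase's minute records over those intervals.
@dataclass
class SquareEntry:            # index-level: one phase crossing a degree square
    phase: str
    lat: int                  # degree-square coordinates
    lon: int
    t_enter: int              # minutes since start of phase
    t_leave: int
    quality: int

def specify(index, lat_range, lon_range, min_quality):
    """Step 1: area -> screened (phase, interval) parameters."""
    return [
        (e.phase, (e.t_enter, e.t_leave))
        for e in index
        if lat_range[0] <= e.lat <= lat_range[1]
        and lon_range[0] <= e.lon <= lon_range[1]
        and e.quality >= min_quality
    ]

def retrieve(minute_files, parameters, item):
    """Step 2: pull the requested item from each phase's minute records."""
    out = []
    for phase, (t0, t1) in parameters:
        for t, record in enumerate(minute_files[phase]):
            if t0 <= t <= t1 and item in record:
                out.append((phase, t, record[item]))
    return out

# Hypothetical sample data: one phase inside the area, one outside it.
index = [SquareEntry("P79013", 52, 45, 0, 2, quality=3),
         SquareEntry("P79014", 60, 45, 0, 2, quality=1)]
minute_files = {"P79013": [{"gravity": 981.2}, {"gravity": 981.3}, {}],
                "P79014": [{"gravity": 980.0}] * 3}

params = specify(index, (52, 57), (45, 48), min_quality=2)
print(retrieve(minute_files, params, "gravity"))
```

The design choice mirrored here is the paper's: the small index is consulted interactively to build a compact parameter set, so the bulk scan of minute data can run later as a batch job.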
CONCLUSIONS

An analysis of requirements for the management of geophysical data has been completed, functional specifications defined, a feasibility study performed and a detailed design document completed. The latter has been reviewed and approved by potential users. The proposed system will meet the user requirements and also takes into account the practical problems of the environment in which it will be implemented (such as limited interactive resources and disc space). The remaining steps of coding and testing are estimated to need 1 1/2 - 2 man-years.

This will complete the development phase for the software system. In addition, resources will be required to convert existing data, load it into the new system and interface existing display and analysis programs. Only then can an overall data management system be said to be completely established. Personnel will also be needed on a continuing basis to perform data administration and system evaluation functions.

REFERENCES

Ross, D.I., Shih, K.G., Johnston, B.L., and Porteous, D.M., 1973, Geofile - A Revised Manual on the Storage and Retrieval of Geophysical Data: B.I.O. Computer Note Series, BI-C-73-3.

Haworth, R.T., Falconer, R.K.H., Srivastava, S.P., and Shih, K.G., 1976, The AGC Digital Data Base (A Preliminary Proposal): A.G.C. Internal Report.

Martin, Gwynneth M., 1980, A System Design for Marine Geophysical Data Management (abstract): Geological Association of Canada/Mineralogical Association of Canada, Annual Meeting, Halifax, Program with Abstracts, p. 69.

AN AUTOMATED INDEX TO U.S. GEOLOGICAL SURVEY EAST COAST MARINE GEOLOGICAL DATA

Valerie Paskevich and Joseph Moses Botbol
U.S. Geological Survey, Woods Hole, Mass.

ABSTRACT

During marine geologic cruises enormous volumes of time- and spatially-dependent data are collected. A typical 3-week cruise may completely fill one hundred 2,400-foot reels of magnetic tape. The U.S. Geological Survey (U.S.G.S.)
has collected data on this scale for at least the last 10 years. Two major problems that must be resolved now are storing the data and devising an effective index to the stored data. In the near future, data storage will undoubtedly be resolved by the development of mass data storage devices such as the laser-written disk. However, the mechanism for locating these data by any relevant variable is not a problem to be relegated to a computer industry unfamiliar with marine geologic data, but rather a problem best addressed by those familiar with the nature and use of the data. As a result, the U.S.G.S. has implemented indices for bibliographic data, common-depth-point (CDP) seismic profiles, data stored on magnetic devices, and Geologic Retrieval And Synopsis Program (GRASP) files on United States east coast marine geologic data. Additional indices are planned for single-channel seismic raw data and profiles, cruise logbooks, CDP raw data, bottom photos, and sidescan sonar raw data and processed imagery.

All indices are designed to be simple and easily accessible on an interactive computer by a casual user. GRASP was used as the retrieval mechanism for all indices, and it will permit implementation on any timesharing computer that supports FORTRAN IV. The unit record of each index is a "cohesive data aggregate," i.e., a group of data that share common variables, relate to the same subject, and are physically stored in the same medium and location. Examples of cohesive data aggregates are: one file (on a multi-file magnetic tape) of seismic data collected during part of a leg of a cruise, a data base containing all sedimentary size analyses for an entire project, and the location of a seismic profile archived in a particular drawer or cabinet area.

The index to bibliographic data is presently restricted to those publications authored by personnel of the Branch of Atlantic and Gulf of Mexico Geology.
The unit record is a bibliographic text reference whose variables include the author name, year of publication, geographic area, topic, project number, subject index, and type of publication. Although the index is intended for both administrative and bibliographic reference purposes, it provides a method for rapidly assessing the nature and status of, and degree of attention paid to, various topics in different areas.

Data stored on magnetic tape, disk, or cassette constitute the records of the largest of our indices. Approximately 1,000 archival records have been encoded, and about 2,000 more records are to be added in order to bring the Branch data library up to date. New or incoming data are encoded for immediate inclusion in the index, whereas older archival data are gradually encoded as permitted by both time and available personnel. The major problem with indexing the archival data is that each tape, file, or cassette must be read completely in order to determine the extents of the variables. The fact that these extents are needed for the index but have never been recorded points to the utility of the index. Key variables include the name of the ship and chief scientist, file or tape identification, tape storage site, maximum and minimum geographic extent, data content, program used to create the present file, format characteristics, and beginning and ending date, time, and shotpoint numbers for CDP tapes. This index has already been used to avoid duplication of effort and as a guide to the recovery of data in areas of current interest.

CDP profiles are stored in large map cabinets. Until the advent of the index to CDP profiles, efficient access to the profiles was limited by personal acquaintance with the data. CDP index variables include cruise identification information, geographic extent, numerical profile parameters, salient geologic features, and the cabinet and drawer in which the profile is archived.
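The extent-determination problem described above can be made concrete with a short sketch. The record layout here is invented for illustration; real U.S.G.S. tape formats differ.

```python
# Sketch of the full-pass scan needed to recover the geographic extent
# of an unindexed archival tape.  The record layout is hypothetical.

def geographic_extent(records):
    """One pass over every record, tracking extreme latitude/longitude.

    This is why each tape must be read completely before it can be
    indexed: the maximum and minimum extents are not known until the
    last record has been seen.
    """
    lats = [r["lat"] for r in records]
    lons = [r["lon"] for r in records]
    return {"lat_min": min(lats), "lat_max": max(lats),
            "lon_min": min(lons), "lon_max": max(lons)}

# Three illustrative navigation records from one "tape".
tape = [{"lat": 40.1, "lon": -70.5},
        {"lat": 41.7, "lon": -69.9},
        {"lat": 39.8, "lon": -71.2}]
extent = geographic_extent(tape)
```

Once the extents are recorded in the index, later users can answer "what data exist in this area?" without mounting and reading a single tape.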
This index, coupled with the yet-to-be-compiled raw CDP data index, will serve as a valuable aid to planning U.S.G.S. seismic programs.

Analytical results of major ocean drilling and sampling programs have been encoded for retrieval by GRASP, and each of the resulting files constitutes a record in the index to GRASP data bases. Variables include a description of the subject matter, the number of variables, the number of records in the file, geographic extent, back-up file designation, ancillary files, and exact reference designation in the host computer. This index provides a source of area and subject matter review as well as the exact file nomenclature required for any computer access.

All indices use the GRASP retrieval system for data examination and recovery. In the tape-disk-cassette index, GRASP output is used as input to a computer program that plots and annotates the geographic extent of each record on a CRT (cathode-ray tube) or on hard-copy devices. Coastlines and a latitude and longitude grid are automatically superimposed on the plot.

Most projects within the U.S.G.S. do not provide funds for maintenance of data collected. Because the actual data are not easily available for use beyond the needs of the original project, indices to these data are vitally needed. Indices prevent data from being "forgotten" and provide a means for data to be located easily when they are required. Properly constructed data indices serve in project planning, operations, and administration. They are effective research tools in their own right.

74

MARINE-DATA MANAGEMENT AT THE PACIFIC-ARCTIC BRANCH OF THE OFFICE OF MARINE GEOLOGY, U.S. GEOLOGICAL SURVEY

Alan K. Cooper
U.S.
Geological Survey, Menlo Park, California

INTRODUCTION

Each year the Pacific-Arctic Branch of the Office of Marine Geology conducts numerous multidisciplinary surveys in the offshore areas of the western United States (Fig. 1). More than 50 different types of geologic, geophysical, and hydrologic data are routinely collected in support of regional reconnaissance surveys and local environmental hazard studies. Since 1975, more than 800,000 km of geophysical trackline data (seismic reflection, gravity, magnetics, bathymetry) and 5000 stations with geologic samples have been collected.

A computer-based data management system is being implemented to inventory, process, store-retrieve, and disseminate the large volumes of data that are collected. These four functions of the data management system (Fig. 2) have been discussed, in part, by Larsen et al. (1978a, b) and are described in further detail below. The present system was initially designed in 1974 and is similar in many respects to the data management system used by Scripps Institute of Oceanography (Smith et al., 1971).

INVENTORY

The inventory function (Fig. 2A) is the backbone of the data management system for, at this step, each type of field data is identified and stored in a computer data file, the Cruise Data Index (CDI), for later use. During field operations, a unique data-identification code and a specific data-collection time are assigned to each type of field data as it is collected. The codes and times are entered on shipboard logs, and this information is keypunched and transferred to a land-based computer where the CDI is created. Special reports, displays, and listings (categorical or sequential) are then generated from the information stored in the Cruise Data Index.

The Cruise Data Index (CDI) is a hierarchical data base in which each type of data is described by a four-letter code. Figure 3 illustrates the four-level hierarchy for geophysical and geologic data. Similar hierarchies, not shown, have been developed for hydrographic and vessel operation data (Larsen et al., 1978a). As a convenience for shipboard watchstanders, mnemonic codes, rather than hierarchical codes, are used to enter information into the shipboard logs. A land-based computer later converts the mnemonic codes to hierarchical codes. Examples of mnemonic codes are listed in Figure 4. As an example, a digital magnetic tape that contains gravity data has a mnemonic code of 'MAG GRAV' and a hierarchical code of 'XMGG'.

A standardized set of shipboard logsheets has been designed for entering both data-base and descriptive information. Two examples of logsheets are shown in Figure 5. The left-hand side of all logsheets is nearly identical and contains the information that is keypunched for the CDI data base. The right-hand side contains descriptive information that usually is not keypunched. Figure 5 illustrates how a gravity magnetic tape and gravity core sample (from Fig. 3) are logged at sea.

At the end of each cruise, the logsheets are edited and keypunched and a CDI is generated for the cruise. A section of the CDI for cruise S6-77-BS is illustrated in Figure 6. The complete CDI is routinely used to generate summary cruise reports (Fig. 7), comprehensive categorical listings (Fig. 8), and maps that display data-collection sites (Fig. 9).

PROCESSING

The computer processing function (Fig. 2B) is done both aboard ship and on shore. Currently, shipboard processing is limited to navigation data; however, a multipurpose computer for geophysical data processing and display is planned for the future. Digital navigation and geophysical data are recorded on magnetic tape and are processed on shore by a special-purpose Data General Eclipse system(1) (multichannel seismic data) and a multipurpose Honeywell 68/80 MULTICS system (navigation, gravity, magnetics, bathymetry). Geologic samples are analyzed, in part, with devices (Coulter counter, Rapid sediment analyzers, Leco carbon analyzer, x-ray diffractometer) that are currently being interfaced to a PDP 11/34 laboratory system. Once in full operation, the PDP 11/34 system will also handle the storage, retrieval, and display of the geologic parametric data.

STORAGE - RETRIEVAL

The long-term storage of marine data (Fig. 2C) is generally limited to two different media, microfilm (analog data) and magnetic tape (digital data). After each cruise, most analog field records are microfilmed, and both the records and microfilm

75

are kept in permanent library archives. The digital data are separated into two categories, tapes that contain raw field data and files that contain general cruise information (CDI). The raw field data are segregated and distributed to specific project groups who handle the special computer processing (editing, reduction, display, etc.). After the specialized processing, the digital data are stored on archive magnetic tapes.

Recovery of digital data stored on computer disk-files (CDI, preliminary geologic and geophysical data, etc.) is handled by interactive programs that search, sort, list, and display the data. Computer programs that will interactively access and display archived magnetic tape information are currently under development.

DISSEMINATION

The dissemination of marine data (Fig. 2D) both inside and outside the Pacific-Arctic Branch is usually handled and overseen by the chief scientist of the cruise on which the data were collected. Categorical listings (CDI) of the data collected, as well as copies of the raw and processed field records, are distributed to and kept by the chief scientist. Dissemination of all cruise data (analog records and digital-data files) into the public domain is done through U.S. Geological Survey Open-File Reports and through journal publications that are written by the chief scientist or with others. Most U.S.G.S. Open-File Reports are for sale by the Open-File Services Section in Denver, Colorado; however, those reports that contain predominantly raw or processed field data (analog or digital) are sent to the world data center (National Geophysical and Solar-Terrestrial Data Center (NGSDC), Boulder, Colorado) for sale.

SUMMARY

The Pacific-Arctic Branch of the Office of Marine Geology, U.S. Geological Survey, is continuing its plans, begun in 1974, to implement a computer-based data management system that will assist in the inventory, processing, storage-retrieval, and dissemination of all marine data collected by the branch.

REFERENCES

Larsen, B. R., Cooper, A. K., McHendrie, A. G., and Gardner, J. V., 1978a, A computer oriented system for logging, annotating, tabulating and plotting marine geophysical and geologic data collected by the Pacific-Arctic Branch of the Office of Marine Geology, U.S. Geological Survey: U.S. Geological Survey Open-File Report 78-288, 208 pages.

Larsen, B. R., Cooper, A. K., and Chase, Tom, 1978b, A system for tabulating, plotting, and displaying marine data, in: Proceedings of second working conference on oceanographic data systems, Woods Hole Oceanographic Institution, September 1978, p. 278-291.

Smith, S. M., et al., 1971, SIO Sample Index - General Information and encoding instructions: La Jolla, Scripps Institute of Oceanography, 95 p.

(1) Any use of trade names is for descriptive purposes only and does not imply endorsement by the U.S.
Geological Survey.

Fig. 1. Map of the areas in which the Pacific-Arctic Branch of the Office of Marine Geology conducts marine surveys. Bold letters are area designators (e.g. BS = Bering Sea).

76

Fig. 2. Overview of the data management system: 1. Inventory; 2. Processing; 3. Storage and Retrieval; 4. Dissemination.

77

Fig. 3. Examples of the four-level hierarchies, for geophysical data and geologic data, that are the basis for the computer-based data management system. Examples of mnemonic codes and corresponding hierarchical codes are shown beneath each hierarchy (Digital Gravity Tapes = MAG GRAV = XMGG; Gravity Core = SAM GRAV = GSCG).

78

MEDIA CODE    RECORDING MEDIA
PAP           PAPER ROLLS
MAG*          MAGNETIC TAPE (DIGITAL)
AVM           ANALOG/VIDEO MAGNETIC TAPE
PHO           PHOTOGRAPH
PRN           PRINTER LISTING (OR PRINT TAPE)

SPEC. CODE    SPECIFICATIONS
SAM*          SAMPLING ATTEMPTS
NAV           NAVIGATION (SHIP MOVEMENTS, TRACKLINES, STATIONS)

DATA CODE     GEOPHYSICAL DATA
GRAV*         GRAVITY (SHIPBOARD)
GRVL          GRAVITY (LAND TIE)
GRVB          GRAVITY WITH BATHY
UNIB          UNIBOOM (HIGH RESOLUTION SEISMIC)
MSPK          MINI SPARKER (HIGH RESOLUTION SEISMIC)

DATA CODE     GEOLOGIC DATA (CORING DEVICES)
DART          DART CORE
PSTN          PISTON CORE
VIBR          VIBRATING CORE
BOXS          BOX CORE (DROPPED FROM SHIP)
BOXH          BOX CORE (HAND HELD)
GRAV*         GRAVITY CORE

Fig. 4. Examples of mnemonic codes that are entered into shipboard logsheets. Codes with an asterisk are those used in Figure 5.

79

Fig. 5. Examples of standardized shipboard logsheets for geophysical (top) and geologic (bottom) operations. Normally only the left-hand side of the logs is keypunched for the CDI; the right-hand side is for descriptive information.

Fig. 6. Example of a segment of the Cruise Data Index data file. The left-hand fields are keypunched from the logs; the right-hand fields are generated by computer.

80

Fig. 7. Example of a segment of a cruise report that has been generated from the Cruise Data Index, listing the equipment systems used and the geophysical, geological, and hydrological data collected.

81

Fig. 8. Example of a segment of a categorical listing that has been generated from the Cruise Data Index.

82

Fig. 9. Example of a map display for all lines along which multichannel seismic data were collected on two different cruises (panel title: Pacific-Arctic Branch Marine Geology, Gulf of Alaska - Multi-channel Seismic (CDP)). The map is generated from the CDI. Modified from Larsen et al. (1978b).

83

BIBLIOGRAPHIC DATABASES - PRESENT AND FUTURE

Victoria R. Schneller
Technical Information Specialist
NOAA/ERL Library, Boulder, Colo.

Abstract - Bibliographic databases are multiplying rapidly and represent a cost-effective, thorough, and efficient way to search the scientific literature sources right now. Soon, many encyclopedias, handbooks, directories, etc. will also be available for online searching. Moreover, the time is not far off when the full text of books, journals, and newspapers themselves will be published in electronic form for our use at home or at work. Our "information odyssey" to the year 2001 is an exciting one, guaranteed to alter the ways in which we access, use, and store bibliographic and textual information.

Since this presentation differs from those preceding me and those following me, I would like to ask you to shift your mental gears, for I will be discussing BIBLIOGRAPHIC databases, that is, the online or interactive retrieval of bibliographic data, which is also called "computerized literature searching". I would like to give you a brief overview of prominent bibliographic database systems as we see them at present, the trends we see unfolding, and developments we can expect in the future, say by 2001.

Which of you managers and scientists have an information problem that could be helped by obtaining a listing of references to books, journals, conference papers, etc. for the last few years or even the last ten or fifteen years? Which of you would like to receive regular updates to this listing, perhaps quarterly or monthly, in your areas of interest and concern? I would wager that quite a few of you here would have at least an occasional need for such services. How many of you seek out such
How many of you seek out such\nservices or even know if they are available in your organization?\nTo introduce you to how a bibliographic database works, let us suppose that you have need for a\nlisting of references concerning \"Airborne Magnetics\", particularly aeromagnetic surveys and measure-\nments which have been done. Out of hundreds of bibliographic databases we will choose to query the\nGEOREF database available from Systems Development Corporation on its ORBIT information system. In the\nfollowing example, the system cues the user who types in the relevant keywords, combining them and/or\nthe set numbers which represent them using the boolean operators 'and', 'or', and 'not' as required.\nIn the example, there are 1918 postings or citations which include the relevant terms. One citation\nis printed out here, illustrating the kind of information available. This quick, sample search was\ndone in support of NASA's MAGSAT project which was worked on here in Boulder by EDIS.\nSS 1 /C?\nSS = Search Statement\nUSER:\nC? = What is your command?\nMAGNETIC SURVEYS AND AIRBORNE\nPROG:\nPROG = Program message\nSS 1 PSTG (1911)\nPSTG = Postings\nSS 2 /C?\nSS 1 has 1911 postings, SS 2 has 11\nUSER:\nAEROMAGNETIC SURVEYS\nPROG:\nSS 2 PSTG (11)\nBoolean operator 'or' puts the sets together,\nSS 3 /C?\neliminating any overlap or duplication of items\nUSER:\nin the two sets.\n1 OR 2\nPROG:\nSS 3 PSTG (1918)\nSS 4 /C?\nPRT 1 FULL INDENTED\nPRT = Print the first record or posting in the\nlast set formed, i.e. SS 3, giving all\ninformation present in the record.\n84","PROG:\n-1-\nACCESSION NUMBER\n80-57700\nAeromagnetic anomalies in the Berici-Euganean\nTITLE\nvolcanic region (Anomalie aeromagnetiche del\ndistretto eruttivo Euganeo-Berico)\nNorinelli, A,\nAUTHORS\nMem. Sci. Geol. (ITA), Vol. 32, 8 p., 1979, 12 Ref.,\nSOURCE\nillus., aeromagn. 
maps, SCALE: 1:250,000\nT SUMMARIES IN: EN)\nLANGUAGE\nS (Serial); MA (Map); ANL (Analytic)\nDOCUMENT TYPE\n80-45071 (Bibliography and Index of Geology)\nISSUE\n2-20 (Geophysics, Applied)\nCATEGORY CODES\n*Italy; Geophysical Surveys; magnetic surveys.\nINDEX TERMS\nEurope; Veneto; Euganean Hills; airborne methods;\nSUPPLEMENTARY TERMS\ngeophysical methods\nCOORDINATES\nN450000; N454000; E0115000; E0112000.\n(with permission of SDC & GEOREF)\nThus, this is a sample of a printout which can be done \"online\" or \"offline\"\nat the computer site, in this case, Santa Monica, and mailed to you, arriving\nin a few days. The latter choice is much less expensive.\nNow I would like to backtrack a bit to review some necessary terminology. Firstly, what do we\nmean by \"bibliographic\" databases? They are databases that you can search by author, title, report\nnumber, contract number, corporate source, and index terms (sometimes called subject terms, keywords,\nor descriptors) which refer you to the published literature in your area of interest. The citations\nor records you retrieve in one of these ways can describe and identify a wide range of document types\nsuch as conference papers, technical reports, books, sometimes even book chapters, patents, disserta-\ntions, journal and newspaper articles and so on.\nMany of these databases correspond to single well-known secondary literature sources such as\nEngineering Index, Oceanic Abstracts, Chemical Abstracts, Government Reports Announcements and Index.\nSome databases combine two or more printed publications such as the INSPEC database which includes\nthree sets of abstracts, one of which is Physics Abstracts. 
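The Boolean set logic behind the ORBIT session shown earlier can be sketched in a few lines. The accession numbers here are invented for illustration; ORBIT of course performed this combination internally on its postings files.

```python
# Sketch of how Boolean combination of search statements behaves in a
# retrieval system such as ORBIT: each statement yields a set of
# document postings, and 'or' merges the sets without duplicates.
# The accession numbers are hypothetical.

ss1 = {"80-57700", "80-10001", "79-22222"}   # MAGNETIC SURVEYS AND AIRBORNE
ss2 = {"80-57700", "78-33333"}               # AEROMAGNETIC SURVEYS

ss3 = ss1 | ss2      # '1 OR 2': union eliminates the overlap
overlap = ss1 & ss2  # citations found by both statements, counted once

# Postings counts, as the PSTG lines would report them.
counts = (len(ss1), len(ss2), len(ss3))
```

With the real postings above, combining sets of 1911 and 11 citations into 1918 tells us the two statements shared 1911 + 11 - 1918 = 4 citations.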
The GEOREF database is comprised of the Bibliography & Index of North American Geology, Geophysical Abstracts, Bibliography of Theses in Geology, the Bibliography & Index of Geology, and others.

Still other databases have NO printed counterpart, but represent the files of particular institutions that want to make their information available to a wide audience. They have chosen to do so by putting their databases up on the system of a large commercial vendor in its standardized format. As a result, I can access PHILOSOPHER'S INDEX from Bowling Green State University as easily as WELDASEARCH, produced by the Welding Institute in England. I can do this by dialing the local Denver nodes for Tymnet or Telenet to hook me into the Lockheed Information Systems computer in Palo Alto where both these files are loaded. It is a little bit like one-stop service, like going to an "information supermarket".

I. Let us look now at several large commercial and government bibliographic systems:

LOCKHEED INFORMATION SYSTEMS (LIS) - "Dialog" - Palo Alto - 120+ DATABASES
SYSTEMS DEVELOPMENT CORP. (SDC) - "Orbit" - Santa Monica - 60+ DATABASES
BIBLIOGRAPHIC RETRIEVAL SERVICES (BRS) - Scotia, N.Y. - 30+ DATABASES
DEPARTMENT OF ENERGY (DOE) - "Recon" - Oak Ridge - approx. 30 DATABASES
NASA - "Recon" - Balt./Wash. Airport - approx. 12 DATABASES

I would also like to mention OCLC of Columbus, Ohio, which contains the MARC records of the Library of Congress plus the cataloging records of over 2000 other libraries. RLIN (Research Libraries Information Network) of Palo Alto, Calif. maintains the records of over twenty very large research libraries and also includes several hundred other libraries as members. These "bibliographic utilities" represent a portent of things to come, of what we can expect by 2001. They are useful not just for verifying cataloging information, but for locating which libraries have the materials you want.
OCLC also maintains an Interlibrary Loan Subsystem whereby loan requests are forwarded from one library to another via "electronic mail". This speeds up the process known as "document delivery", which can plague the best-intentioned information person with incredible delays.

Everything I have spoken about above is available NOW, and is common to many federal, corporate, and academic libraries and information centers. Perhaps it would be worth your while to investigate the availability of these services at your institutions.

85

Not all the databases currently available could strictly be termed "bibliographic". A few could be termed "referral", such as SSIE (Smithsonian Science Information Exchange), which contains listings of current research projects. Others could be called "factual", such as the ENCYCLOPEDIA OF ASSOCIATIONS, and still others "numeric", offering statistical information such as marketing data, etc. An example is the group of PREDICASTS files.

Perhaps it would be appropriate to mention here that Lockheed, SDC, and BRS all offer "private file" service and may be a viable alternative for your organization if you're having in-house budget crunches or systems troubles. If you would like to expand your user audience, your database could also be brought up as a public file. NOAA produces the database for AQUATIC SCIENCES & FISHERIES ABSTRACTS, which is currently available to the public on Lockheed. NOAA also supports METEOROLOGICAL AND GEOASTROPHYSICAL ABSTRACTS through the American Meteorological Society, also available on Lockheed's "Dialog" system.

II. Now let us turn to why you would like to use bibliographic databases and what they can do for you.

A. They are speedy - you can manipulate mountains of literature quickly and easily, many times faster than trying to do it manually in the printed indexes. There is no flipping through annual indexes because they are all searchable together.

B.
They are cost-effective - they save time, YOUR time, cutting off hours usually devoted to researching a topic so you can have more time to write your report and move on to other urgent projects.

C. They are convenient - they are usually available in your Library or Information Center. If you are in a hurry, let the searchers know and they will do their best to hustle for you.

D. They are thorough - there are more points of entry, i.e., searchable fields, in the online format than in the printed one. For example, you can search words in TITLES and often words in ABSTRACTS, not just authors and broad subject areas.

E. They are accurate - YOU specify the limits of the search by time, subject, languages, and so on.

III. Now what do the attractive features mentioned above allow you to do? Let us look at the following:

A. You can determine what has been done as well as what has NOT been done. Both are most valuable as you write a paper or a project proposal. Working in a Library I periodically see cases of the right hand not knowing what the left is doing. Duplication of effort wastes tremendous numbers of man-hours and quantities of money. I feel strongly that this can be avoided to a great degree if a thorough literature search is done even as early as the project proposal stage, let alone at the start of an authorized project.

B. You can tailor your own bibliography to your own specifications.

C. You can discover who has cited a particular paper and how many times it has been cited.

D. You can learn who is funding what research at what dollar level.

E. You can identify and fill in missing pieces to partial references.

What can computerized literature searching NOT do? Most bibliographic databases have about ten years or so of coverage at this time. A few go back into the 60's, such as NTIS and GEOREF. A small number go back even further, such as DISSERTATION ABSTRACTS, which includes records back to 1861.
But basically, back year coverage is limited, so the printed indexes and abstracts certainly retain their usefulness.

IV. State-of-the-Art of Bibliographic Databases, 1980

Perhaps you have already guessed or were already aware of the wide range of subject matter available for searching in the bibliographic databases, spanning the depth and breadth of the Physical and Life Sciences, encompassing Physics, Biology, Electronics, Agriculture, Energy, Environment, Geology, Geophysics, Pollution, Rubber & Plastics, Textiles, and so on. Now the capability of online bibliographic retrieval has expanded to the Humanities and Social Sciences as well, to Philosophy, Psychology, Education, Music, Linguistics, History, Political Science, Management, etc.

We have come a long way from the late 60's and early 70's, when only a small handful of databases were commercially available, to the present with several hundred databases represented.

Some areas of difficulty or at least concern at this time include the following:

A. cost - $30-120 per connect hour charges, with prices rising as database producers require higher royalties.

B. lack of standardization - firstly, in vocabulary, which varies from database to database, requiring the searcher to maintain many thesauri to help produce good searches, and secondly, in systems protocols, which vary just enough from system to system to cause frustration, impatience, if not downright violence committed against the terminal keyboard, even among experienced searchers.

All of the above result in most actual searching on bibliographic databases being performed by a search intermediary, often a Librarian or Information Specialist such as myself, with the assistance of the end-user, i.e., the scientist or staff member whose presence is most welcome if not required.
Most searching currently takes place in the Library or Information Center and possibly in the patron's office with a portable terminal.

So what we have now are the printed secondary literature publications and their machine-readable equivalents existing side by side, with a few new databases emerging in machine-readable form only, without ever existing in paper form, as in PHILOSOPHER'S INDEX and WELDASEARCH mentioned earlier. Currently Law and some news services represent the very few areas searchable in "full text", i.e., not just the abstract is present but the full text of the item desired. Full text searching of scientific journals is being tested with the Journal of Medicinal Chemistry on BRS. If you choose, you can browse the whole article, printing only key paragraphs, such as the first and last, which usually contain purposes and conclusions.

V. Let us look at some trends we see developing which give us a glimpse at the transition we are making to the future.

1. the gradual replacement of 300-baud terminals with 1200 baud - faster terminals reduce online time, hopefully reducing online costs. This is happening NOW.

2. the entry of more "factual" databases, i.e., putting more reference books online, such as American Men and Women of Science and Books in Print. These should be available soon.

3. addition of more news or current events information. Currently, quite a few newspapers are available for searching, but more should be coming, plus the full text capability.

4. more widespread availability of "information utilities" such as the SOURCE, which has the capability of 2000 databases and programs ranging from UPI news to airline schedules, from electronic maildrop to horoscopes and computer games. Current cost is about $15.00/hour during peak working time, dropping to $4.25/hour in the evenings. All you need is your home phone, a terminal with printer or screen, and an after-dinner beer.

5.
studies being done on vocabulary switching and systems interface packages to make searching vocabulary and systems protocols more "user friendly".

VI. Now to the future, ah the future - what is coming during our "information odyssey" to the year 2001?

Your terminal, either at home or at work, will be your Library. By 2001 you will research scientific literature in electronic form only. During the period between now and then, a gradual transition from paper to electronic publishing will be made as the publishing industry encounters far-reaching changes.

"Electrobooks" and "electronic journals", i.e., books and journals with NO previous printed counterpart, will appear. Some existing printed books and journals may disappear while others will switch over to electronic form. Electronic form has the tremendous advantage of continuous updating and additional access points being available. Printed books will probably decline as the primary device for the storage and transmittal of knowledge, giving way to electronic means. (Lancaster, 1979)

The full texts of primary journals in the sciences will become accessible through online service centers, as well as the full texts of newspaper articles, technical reports, patents, standards, and specifications. These online service centers probably will include easy access to information systems in Europe and Asia as well.

The end-user, YOU, will become the searcher, with the development of user-oriented software, the emergence of an international standard query language, and the capability for natural language interrogation in sentence form. Thus search intermediaries such as myself will be used primarily as guides to sources and to handle areas unfamiliar to you. (Lancaster, 1979)

Widespread proliferation of "information utilities" and videotext-type systems will complement and be integrated with bibliographic databases, particularly for home and personal use.

VII.
Conclusion

By no means has this been an exhaustive overview of bibliographic databases, but I hope this presentation has been informative for you in underscoring prominent features and future trends in this area. Even today, as you sit here, new databases are being added to existing information systems, taking us a few steps further on our "information odyssey" to 2001. Hopefully, the voyage will be a thrilling one, all the more so if we are prepared for it.

References

Kiechel, Walter, "Everything You Always Wanted to Know May Soon Be On-Line", Fortune, May 5, 1980, pp. 226-228, 233, 236, 240.

Lancaster, F.W., Laura S. Drasgow, Ellen B. Marks, "The Changing Face of the Library: A Look at Libraries and Librarians in the Year 2001", Collection Management, vol. 3, no. 1, 1979, pp. 55-77.

Trubkin, Loene, "Migration from Print to Online Use", Online Review, vol. 4, no. 1, 1980, pp. 5-12.

THE ROSE SEISMIC DATA ARCHIVE

Jan Garmany, Dept. of Oceanography, University of Washington, Seattle, WA 98195

The Rivera Ocean Seismic Experiment (ROSE), sponsored by the Office of Naval Research and the National Science Foundation and conducted in early 1979, was an attempt to investigate the formation and development of the oceanic crust. The scale of the proposed experiment vastly exceeded the resources of any single institution, so a collaborative effort involving thirteen institutions was designed to accomplish the investigation. Significant difficulties were immediately apparent in such a collaboration. The variety of ocean bottom seismometers is almost as great as the number of institutions that use them. The methods of recording vary, the number and the types of recorded components vary, and the ultimate calibration of the instruments in terms of the ground motion or velocity or pressure change per count will vary.
Naturally, since the development and use of these instruments was done more or less in isolation, like the cultures of so many insular tribes, the techniques of storage and use of the seismic data at each institution proliferated in a similar way. As a result of ROSE, the Babel of data storage and retrieval techniques had to be resolved by introducing a standardized method, and from this requirement the ROSE data exchange format and archiving system was developed.

The present medium of storage is magnetic tape, and the structure of the data tapes is relatively straightforward. The natural association of seismic data, both in acquisition and, generally, in use, is by instrument. This association is preserved in the adopted method of archiving. When the instrument records more than one channel, the data are demultiplexed. A single channel is written in 4096-word (16-bit) blocks until its data for the current event is exhausted. The last block is padded with zeros, and the writing is continued with the next channel, if any. Each event is terminated with an EOF. Before each event, one or more 256-word binary descriptive headers are written. These short headers contain information about the event and its recording (e.g., location, size, sample rate, etc.) and about the response characteristics of each recorded channel. At the beginning of a user tape, there is also a 256-word alphanumeric header containing instrument identification, the name of the experiment, the start and end times for the data on this tape, and the total number of events on the tape. Archive tapes differ from user tapes only in the inclusion at the beginning of a single 256-word header terminated by an EOF. This archive tape header gives the ranges on the tape of several key descriptors which are used in handling data requests.
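The per-channel blocking scheme described above can be sketched in a few lines of modern code. This is an editorial illustration only; the routine names are hypothetical and are not part of the actual ROSE archive software.

```python
# Sketch of the ROSE per-channel blocking: each demultiplexed channel is
# written as a run of 4096-word (16-bit) blocks, with the final block
# zero-padded out to a full block.

BLOCK_WORDS = 4096  # 16-bit words per data block

def block_channel(samples):
    """Split one channel's samples into zero-padded 4096-word blocks."""
    blocks = []
    for i in range(0, len(samples), BLOCK_WORDS):
        block = list(samples[i:i + BLOCK_WORDS])
        block.extend([0] * (BLOCK_WORDS - len(block)))  # pad the last block
        blocks.append(block)
    return blocks

def unblock_channel(blocks, n_samples):
    """Recover the original samples, discarding the zero padding."""
    flat = [word for block in blocks for word in block]
    return flat[:n_samples]
```

Note that the reader must know the true sample count (carried in the descriptive headers) to strip the padding, since padding zeros are indistinguishable from zero-valued samples.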
The archive header can be used in quickly checking what is on the tape, but does not play any part in the usual retrieval process.

Individual institutions have the responsibility of transmitting data from their own instruments to the archive. The receiving and archiving of incoming data is depicted in Figure 1. Only the descriptive headers from the transmitted tape are kept on a disk catalog to be used for later data retrieval. Requests for data from the archive are given to the computer in terms of key descriptor (or keyword) bounds. Keywords describe attributes of data from a given event-instrument pair, such as distance between event and receiver, or attributes which are specific to the event or the instrument separately, such as absolute event location. The search for event-instrument pairs which satisfy the data request is conducted on the active (disk catalog) data base to define the job of actual retrieval. This sequence is shown in Figure 2.

The principal difficulty in storage and retrieval is simply mechanical -- digitally recorded seismic data take up vast numbers of tapes. At the present time, determining where the archived seismic data satisfying a certain request live is not nearly so difficult as the management of some underway geophysical data bases with their millions of line miles of tracks. However, it must be borne in mind that the catalogued seismic event-instrument pairs are essentially different from point measurements of, say, gravity, and the reader can easily appreciate how rapidly the catalog can expand as single events are usefully heard by many instruments. The thinking of most seismologists is still rather one dimensional in the way that seismic data is actually used, but even with events and instruments confined to a single line, there is an obvious multiplicity of coverage of the same real estate that must be kept resolved.
As the ROSE system becomes more widely used as the standard exchange format for short-period seismic data, from land-based as well as marine experiments, a central catalog could become every bit as difficult to manage as the presently more extensive data bases are now, quite apart from the storage volume problem associated with digital seismic data.

While the design of the ROSE exchange format was the brainchild of many of the ROSE participants, I must mention two particularly important contributors, J. Gettrust and S. LaTraille of the Hawaii Institute of Geophysics. The ROSE archive is under their capable management, and they are best equipped to respond to requests for more detailed information regarding the archive and the exchange format.

Figure 1. How incoming ROSE tapes are archived (flowchart: an incoming data tape and any documentation are logged and acknowledged, the tape is check-read, its descriptive headers are added to the disk catalog area, its contents are entered on a card file, and the labeled tape is stored in the archive).

Figure 2. ROSE data request search and retrieval (flowchart: a request is logged and acknowledged, translated to keyword bounds and then to a FORTRAN subroutine, compiled with the main search program and executed against the catalog; the archived tapes identified are pulled, the main retriever program is run, and the tape of requested data is sent with available documentation and a summary report).

MANAGEMENT OF LARGE PETROLEUM DATA BASES

Philip H.
Stark, Vice President
Petroleum Information Corporation
Denver, Colorado

ABSTRACT

The petroleum industry has developed and processed large computer files of well completion, geological, and production data for the past 20 years. Today, industry well data systems contain records on more than 1,200,000 wells. Production files report as much as 13 years of monthly volumes on more than 300,000 leases. Management of these large data collections remains a problem. The computer data files originally were developed to solve a paper storage problem. Now we understand that comprehensive information management, including records management and microforms, is required. New system designs begin with records management and microforms to complement computer data bases.

Computer data base management continues to be a problem. Hardware, including processors and storage devices, shows dramatic improvements in price performance. Data management software, however, has not accommodated all user requirements for petroleum data bases. Most commercial data management packages were designed for business data. The hierarchies and relationships in petroleum data files exceed the ability of most systems. Relational data base systems under development may provide significant improvements.

Several trends in petroleum data management are noteworthy.

1. Distributive processing is a dominant trend. Local minicomputers with application processing capability are replacing remote terminal access to large central processors.

2. Large data bases are being integrated to greatly expand analytical capability. Common identifiers and data codes are being input to link well and production data. Digital log data are being interfaced with well data and seismic files. Lease and land data are being related to digital maps. Remote sensing images are being related to digital base maps and subsurface data.

3. New digital files are being created and applied.
A clearinghouse has been established for storage and retrieval of digital log data. A digital base map system for major petroleum provinces is being developed. Lease status data bases are being developed.

4. Use of computer graphics is expanding rapidly. Digital base maps with legal land subdivisions are overlain with lease data, well related data, surface lineaments, geophysical stations, and relevant geographic information. Bore hole logs, geochemistry, lithology, paleontology, and formation tops are displayed together to assist interpretation.

5. Different file management problems are being solved by a variety of methods. The notion that one large computer and one central data base with remote access can solve a complex set of problems is disappearing. Users should evaluate their spectrum of needs and select the variety of hardware and software systems required to best solve the needs.

THE CONSTRUCTION AND USE OF LARGE MULTISOURCE INTEGRATED DATA SETS*

G.W. Wecksung, S.H. Balogn, S.L. Boliver, K. Campbell, E. Fugelso, and T.A. Weaver
University of California
Los Alamos Scientific Laboratory
Los Alamos, New Mexico

ABSTRACT

Technology at the Los Alamos Scientific Laboratory has been established for integration of 35 or more geophysical and geochemical spatially distributed data sets derived from several independent sources. The data sets are geographically indexed to large regions, typically unique quadrangles. Pilot integrations have been made for Talkeetna in Alaska and Montrose in Colorado. The Montrose data sets include a) airborne gamma-ray spectrometry (214Bi, 208Tl, 40K), b) aeromagnetic survey data, c) geologic map, d) LANDSAT imagery (bands 4, 5, 6, 7), e) hydrogeochemical data for 24 elements (about 4,000 sample locations in 19,200 km2). A major problem is getting the data spatially into comparable formats.
Hydrogeochemical data was collected at sample points determined by nominal spacing and convenience; aerial data was collected along flight lines flown at 3-km spacings. These data sets were smoothed and interpolated by kriging to a UTM grid of 1-km spacing, which covers the quadrangle. The geologic map and the LANDSAT data were registered to the same domain but on a finer grid. Digital image processing techniques and software are employed for the manipulation and display of the data. Data interpretation is aided by multivariate statistical analysis techniques and pseudocolor displays. The short-range goal is to detect patterns associated with uranium mineralizations. The long-range goal is to formulate a classifying scheme that will assign a probability for economic favorability of strategic minerals. The techniques involved and preliminary evaluations are discussed.

*Work performed under the auspices of the U.S. Department of Energy, Contract No. W7405-ENG-36.

APPLICATION OF COMPUTER IMAGE PROCESSING TO MARINE GEOLOGY

Extended Abstract

Peter R. Paluzzi
Image Processing Laboratory
Jet Propulsion Laboratory
California Institute of Technology
Pasadena, California 91109

NASA/JPL extraplanetary missions have relied on computer image processing techniques and facilities to aid in the assimilation and interpretation of images returned by spacecraft. Since the early 1960's, the Image Processing Laboratory of JPL has been developing both hardware and software to meet the requirements of these missions. More recently, work has begun to apply NASA/JPL image processing experience to nonspace applications -- one area in particular is imaging for marine geology.

Computer image processing, or the manipulation of digital images by computer, is essentially a 2-dimensional adaptation of one-dimensional digital signal processing (Castleman, 1979).
To implement computer image processing, JPL uses a general purpose computing facility based on an IBM 370/158 and a library of over 200 image processing programs called Video Image Communication and Retrieval (VICAR). This approach allows flexibility for developing new techniques and modifying older ones. This is important since new missions and applications require that the library be continually updated and augmented.

Unlike exploration involving spacecraft images, seafloor exploration has not extensively employed computer image processing. The images created by bottom cameras and side-looking sonars have not been previously acquired digitally. Today, the number of digital data collection systems has grown and existing capabilities to analyze the data have not kept pace with that growth. To this end, the existing image processing techniques developed by NASA/JPL have been applied to three types of images used in marine geology: bottom photographs; side-looking sonar records; and bathymetric data recast as an image.

Photographs of the seafloor are often beset by strong illumination gradients. Because of these gradients, it is troublesome to print the photographs and still preserve detail in all portions of the scene. The computer can suppress illumination gradients using a digital filter that passes only the finer details of a scene and not the broad changes in illumination. This is particularly important when joining adjacent images to make mosaics. Bottom photographs computer processed in this way can sustain stronger contrast enhancements in order to adequately display important features (Paluzzi and others, 1976).

Side-looking sonar records also have illumination gradients; however, these gradients arise from the attenuation of echoes at increasing ranges. The same digital filtering technique used with photographs effectively suppresses this illumination gradient in the sonar images.
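The gradient-suppression idea described above can be illustrated with a minimal sketch: estimate the slowly varying illumination with a local box average and subtract it, keeping only the fine detail. The actual VICAR filters are more elaborate; this shows only the principle, and the function names are illustrative.

```python
# High-pass filtering to suppress illumination gradients: the local box
# mean approximates the slowly varying illumination, which is subtracted
# and re-centered on a mid-grey offset.

def box_mean(img, r, y, x):
    """Mean brightness in a (2r+1)-square window, clipped at the borders."""
    rows = range(max(0, y - r), min(len(img), y + r + 1))
    cols = range(max(0, x - r), min(len(img[0]), x + r + 1))
    vals = [img[j][i] for j in rows for i in cols]
    return sum(vals) / len(vals)

def high_pass(img, r=2, offset=128):
    """Subtract the local mean from each pixel, re-centered on `offset`."""
    return [[img[y][x] - box_mean(img, r, y, x) + offset
             for x in range(len(img[0]))]
            for y in range(len(img))]
```

Because the filter output is centered on mid-grey regardless of the original illumination, a subsequent contrast stretch acts on the features alone, which is exactly the property exploited in the bottom-photograph work cited above.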
In addition, side-looking sonar images possess distortions owing to the slant-range geometry and the ship's speed. Techniques used to geometrically rectify spacecraft images for camera and perspective distortion have immediate application for correcting side-looking sonar images (Paluzzi and others, 1976). Navigation data has been used to further correct side-looking sonar images for trackline distortion. Once trackline corrected and registered to a cartographic reference system, adjacent sonar images can be mosaicked and further rectified into a standard map projection (Paluzzi and others, 1981).

Although not traditionally displayed as an image, bathymetric data can be processed using computer image processing techniques. Individual soundings and profiles can be rendered into an image by interpolating and regridding the data. The resulting surface can be displayed using digital surface shading techniques. Topographic datasets that are available in a gridded format (e.g., the Global Topographic Dataset) can be shaded without interpolation. These digital terrain models are images of a sort and, being so, can also be rectified into a standard map projection (Paluzzi and Malin, 1981).

Using computer graphics techniques, the bathymetric surfaces can be displayed in perspective view as if seen from the window of an airplane or a spacecraft (Gomez and Paluzzi, 1981). This capability allows the interpreter to move above and about the terrain and see a representation of the seafloor without the necessity of diving in a submersible or draining away the oceans!

Most of the techniques applied have assumed that the final product will be used for photo interpretation.
Since the images exist in a digital format, they can also be processed to extract quantitative information from the scene. This, as well as other types of processing, is part of continuing NASA/JPL work in computer image processing.

ACKNOWLEDGEMENTS

This abstract presents the results of one phase of research carried out at the Jet Propulsion Laboratory of the California Institute of Technology under contract NAS 7-100, sponsored by the National Aeronautics and Space Administration.

REFERENCES

Castleman, K.R., 1979, Digital Image Processing, Prentice-Hall Inc., Englewood Cliffs, N.J.

Gomez, J.E., and P.R. Paluzzi, 1981, 20,000 Triangles Under the Sea: Proceedings of the 8th Conference on Computer Graphics (SIGGRAPH), ACM, Dallas (in press).

Paluzzi, P.R., and M. Malin, 1981, Bathymetric Imaging: ASP/ACSM National Meeting Proceedings, Washington, D.C. (in press).

Paluzzi, P.R., W.R. Normark, G. Hess, M. Cruickshank, and H. Hess, 1976, Computer Image Processing in Marine Resource Exploration: Proceedings of the Joint IEEE/MTS Conference on Ocean Engineering, Oceans '76, Washington, D.C.

Paluzzi, P.R., R. Toaz, D. Roberts, and R. Searle, 1981, Computer Rectification and Mosaicking of Side-Looking Sonar Images: Proceedings of the Offshore Technology Conference, Houston (in press).

Figure 1. Bottom photographs showing the removal of illumination gradients. Figure 1a shows a bottom photograph that has been digitized and displayed with no enhancement.
The histogram at the bottom of the image represents the distribution of the picture elements, or pixels, in terms of brightness. The left end of the histogram corresponds to black and the right to white. All of the information and detail contained in the original negative is present in the digital image; however, not all the features can be adequately displayed in the printed picture owing to the illumination gradient. A digital high pass filter was applied to figure 1a to yield figure 1b. Objects in both the light and dark portions of the image are portrayed with equal clarity because the subsequent contrast enhancement operates only on the features and not the illumination gradient (after Paluzzi and others, 1976).

Figure 2. Computer processed side-looking sonar records. Figure 2a is an unprocessed digitized side-looking sonar image. Figure 2b shows the image of 2a with the range attenuation illumination gradient suppressed and missing scan lines interpolated. Figure 2c shows the image of figure 2b geometrically rectified for slant-range distortion and for the distortion introduced by changes of the ship's speed and the range scale. No correction was applied for distortion arising from the heading mismatch (after Paluzzi and others, 1976).

Figure 3. Computer rectified and mosaicked long range sonar images. Long range side-looking sonar (GLORIA) images made available from the British Institute of Oceanographic Sciences were digitized and displayed (Figures 3a and 3b). These images were corrected for illumination gradients and slant-range distortion. In addition, navigation data were used to apply trackline corrections and to montage the individual images into a cartographically based mosaic.

Figure 4. Computer processed sonar images of a sunken ship.
Side-looking sonar images of sunken war of 1812 vessels were collected in Lake Ontario by the Canada Center for Inland Waters. These images were digitized and displayed with no correction or enhancement (Figure 4a). Corrections for illumination gradients, slant-range distortion, and ship's speed were subsequently applied (Figure 4b). Figure 4c is an extracted section of Figure 4b showing the detail and relatively distortion-free aspect of the ship.

Figure 5. Shaded relief images of the earth. Using a directional derivative algorithm, the Rand/SIO global topographic dataset was shaded and the resulting image was rectified into a Mercator projection (Figure 5a). In addition, computer graphics techniques were used to produce a shaded relief image of a globe using this dataset (Figure 5b) (after Paluzzi and Malin, 1981).

IMAGING CONVENTIONAL MARINE GEOPHYSICAL DATA
CARIBBEAN SEISMIC EXAMPLE

David Handschumacher, Louis Hemler, Eugene Morganthaler, and Ronald Clark
U.S. Navy, Naval Ocean Research and Development Activity
Bay St. Louis, Miss.

Imaging is a technique by which spatially distributed digital data is matrixed and displayed in color scales corresponding to intensity values of the data. Originally developed for analysis of multi-spectral Landsat data, use of imaging is now commonplace in the field of remote sensing. It may also have a wide range of potential applications to the display and analysis of conventional marine geophysical data.
For this reason, a program has been initiated at the Naval Ocean Research and Development Activity (NORDA) to image those data sets on NORDA's Interactive Digital Satellite Image Processing System (IDSIPS).

The IDSIPS consists of hardware components built by the International Imaging Systems (I2S) Division of Stanford Technology Corporation, Figure 1. It is basically their off-the-shelf System 101 with some added custom modifications to meet NORDA requirements. At the heart of the I2S is the Hewlett-Packard 3000-II computer with a multi-programming operating system (MOS). Images are input directly to the system using a 9-track, 1600-bpi tape drive and are usually stored temporarily in a disk drive unit until they are displayed. When displayed, the images are copied into an area called the "refresh memory". Monitoring of the image data is maintained by means of a color television screen. On the screen, the image data is user colored by assigning various colors (obtained by intermingling the three electron beam guns of the color cathode ray tube) to different grey levels of the data set. The number of color levels and the range of the data set values to which they are applied are controlled by the user on an interactive input console. A central processor enables the user to perform various digital analyses of the data, such as deriving selected data statistics. Output is by means of photographs of the CCRT screen or magnetic tape.

Software for the I2S system is designed for the manipulation, analysis, interpretation, and processing of a wide variety of image data. CPU operating and support software for IDSIPS were manufacturer supplied. This software includes compilers, file management and editing, debugging, and scientific subroutine libraries. Many of the time-consuming chores of bookkeeping, cataloging, efficient I/O handling, etc., are assumed by the system, leaving the user free to concentrate on the scientific problem at hand.
However, unique image processing capabilities of the IDSIPS system result from additional applications software. At present, the applications software consists of over one hundred callable functions to perform radiometric or geometric corrections to an image, accomplish pattern classification, perform statistical analysis, transform and filter image data, and enhance image displays. Uniform command syntax and user prompting are employed to facilitate system usage. Much of this existing software is directly applicable, or readily adaptable, for display and analysis of marine geophysical data sets once they have been converted to images.

Although the I2S system is a versatile hardware and software system, it cannot create images. To create images of marine geophysical data, a format program called "IMAGE" was developed to convert these data directly to images on 1600-bpi magnetic tape suitable for input to the IDSIPS system. IMAGE is written to be compatible with the Control Data Corporation CDC Cyber 176 computer, on which NORDA's marine geophysical data sets are stored. Four specific format functions are accomplished by the IMAGE program: 1) each data point is properly positioned in the image, a 510 x 510 pixel matrix; 2) selected data intensity values are scaled to image values 0 to 255; 3) data are converted from real numbers to 8-bit bytes; and 4) the digital matrix is formatted on 1600-bpi magnetic tape for input to the IDSIPS system. Parameters required for data assignment by the IMAGE program are as follows: 1) latitude and longitude of the area to be imaged; 2) number of pixels (height by width) of the desired image; 3) maximum and minimum data intensities to be scaled into the 0-to-255 range; and 4) selection of Mercator or equal-area projection for the output pixel matrix.

An example of imaging marine geophysical data is presented in Figures 2 and 3.
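The core of such a conversion (format functions 1 through 3 above) can be sketched as follows. This is an illustrative reconstruction, not the NORDA IMAGE code; a simple equirectangular mapping stands in here for the Mercator/equal-area projection options, and the tape-formatting step is omitted:

```python
def to_image(points, vmin, vmax, bbox, size=510):
    """Rasterize scattered (lat, lon, value) points into a size x size
    matrix of 8-bit intensities (0-255). Pixels with no data stay 0.

    bbox = (lat_min, lat_max, lon_min, lon_max) of the area to be imaged.
    """
    lat_min, lat_max, lon_min, lon_max = bbox
    img = [[0] * size for _ in range(size)]
    for lat, lon, value in points:
        # 1) position the data point in the pixel matrix
        row = int((lat_max - lat) / (lat_max - lat_min) * (size - 1))
        col = int((lon - lon_min) / (lon_max - lon_min) * (size - 1))
        # 2) scale the intensity into the 0-to-255 range ...
        frac = (value - vmin) / (vmax - vmin)
        frac = min(1.0, max(0.0, frac))
        # 3) ... and store it as an 8-bit byte value
        img[row][col] = int(frac * 255)
    return img
```

A pixel-expansion pass of the kind mentioned below would then thicken each occupied pixel to make sparse tracklines visible on the display.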
The example data set consisted of seismic reflection data, i.e., two-way travel-time picks to the surface reflector (bathymetry) and the deepest reflector (assumed basement) from the northern Venezuela Basin. Tracklines for this data set are shown in Figure 2A. The data were imaged as described above and run through a pixel expansion subroutine to increase (4X) data coverage along image tracklines. Once images are generated, it is useful to view the resultant pixel intensity ranges prior to assignment of a color code. In the example, profiles and histograms of pixel intensities in the area containing the Muertos Trough were interactively generated for this purpose. Figure 2B shows a profile of pixel intensities corresponding to two-way travel times to the surface reflector in the area bounded by the dashed line in Figure 2A. Figure 3 shows images of the surface and basement reflectors, which were colored using the color code indicated on the figure.

It is interesting to note that a comparison of these images reveals the basement to be dipping downward to the north beyond the bathymetric axis of the Muertos Trough. Unfortunately, the color Xerox copies of the original Polaroid photographs for these images are highly degraded in both clarity and color differentiation.

The intent of this discussion is to acquaint the reader with imaging as an alternative method for storing, displaying, and analyzing marine geophysical data. In the past, such applications of imaging have been primarily limited by the availability of image hardware systems. However, increasing application of imaging in many fields is now making these systems more accessible to marine geophysical investigators.
Thus, careful examination of imaging is encouraged so that the full potential of this method will be realized.

Figure 1. IDSIPS hardware configuration.

Figure 2A. Tracklines for seismic reflection data used for image example.

Figure 2B. Pixel intensities corresponding to two-way travel time to the surface reflector along track A-A', Figure 2A.

Figure 2C. Histogram of pixel intensities corresponding to two-way travel time to the surface reflector inside the dashed boundary on Figure 2A.

Figure 3. Images of the 1st reflector (bathymetry) and the deepest reflector (acoustic basement); the color code indicates two-way travel time in seconds (0 to 8).

SEABEAM STUDIES OF THE GORDA RIFT/BLANCO FRACTURE ZONE INTERSECTIONS

by
ALEXANDER MALAHOFF
National Ocean Survey
Rockville, Maryland

ABSTRACT

The National Ocean Survey's deep ocean SEABEAM swath mapping system installed aboard the NOAA Ship SURVEYOR is being used to map the bathymetry of the Gorda and Juan de Fuca Ridges, located in the deep ocean basin off the states of California, Oregon, and Washington. Real-time contouring aboard the ship is used to produce maps at a scale of 1:10,000 with a bathymetric contour interval of 10 meters. Loran C navigation was used to control the surveys. The swaths used during the surveys are 2,000 meters wide; an overlap of 25 percent between the swaths was used to construct the maps. The survey has been able to resolve and contour bathymetric features 50-100 meters in horizontal size. The resolution of the contour configuration is 5-10 meters in a water depth of 3,500 meters. All the data for this region of the ocean floor are stored in a digital format, with one data point per 75 x 75 meter segment of the ocean floor. The survey has delineated major ocean floor features such as the four kilometer wide rift valley located along the Gorda Ridge and the presence of a distinct volcanic ridge located along the axis of the rift valley.

INTRODUCTION

A General Instruments Corporation SEABEAM system was installed in 1980 aboard the NOAA Ship SURVEYOR. The installation was accomplished by converting the onboard narrow beam echo sounding system to a multibeam mode.
The conversion was accomplished successfully, and the system emplaced aboard the SURVEYOR was immediately placed into use in a joint U.S. Navy-NOAA experiment. The experiment took place during February-April 1980 in the form of a field-tested cross-comparison between the Navy SASS multibeam system and SEABEAM. During the course of the experiment, SEABEAM was used to generate preliminary bathymetric maps of the Gorda Ridge and the Blanco Fracture Zone. Preliminary results of this survey are presented in this paper.

OPERATIONAL PRINCIPLES OF THE MULTIBEAM SURVEY SYSTEM

A major advance in real-time shipboard bathymetric surveying has been made possible with the introduction of both civilian (SEABEAM) and military (SASS) multibeam surveying systems. Two such systems, both using the same principles, are currently in use in the United States and France. The United States Navy has for many years used SASS (Sonar Array Survey System) to map sections of the ocean floor. The civilian version of SASS is the SEABEAM system. One SEABEAM system is currently in use aboard the CNEXO (Centre Oceanologique de Bretagne) research vessel JEAN CHARCOT; the other is in use aboard the NOAA Ship SURVEYOR. All three systems have been used to survey the crestal regions of sections of mid-ocean spreading centers. Common to the SASS and SEABEAM systems, both of which are manufactured by the General Instruments Corporation, is the concept of a "crossed fan beam configuration." The SEABEAM transmitting system uses 20 projectors to form a single vertical fan 2 2/3° wide in the fore-and-aft direction by 54° in the athwartships direction. The SEABEAM receiving system uses 40 units to form 15 separate adjacent fans, each 2 2/3° wide in the athwartships direction by 20° fore and aft. When the receiving fans are crossed with the transmitted fan, 15 beams 2 2/3° square result. The SASS system has 60 beams.
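As a rough illustration of what this crossed-fan geometry implies on the sea floor, the nadir footprint of a single 2 2/3° beam and the total width covered by an array of adjacent beams can be estimated from the beam angles quoted above. This is a back-of-the-envelope sketch that assumes a flat bottom and ignores ray refraction and beam steering; it is not the SEABEAM processing itself:

```python
import math

def beam_footprint(depth_m, beam_deg=8 / 3):
    """Across-track footprint (m) of one beam of width beam_deg at nadir."""
    return 2 * depth_m * math.tan(math.radians(beam_deg / 2))

def swath_width(depth_m, n_beams=15, beam_deg=8 / 3):
    """Total across-track width (m) covered by n_beams adjacent beams,
    each beam_deg wide, over a flat bottom."""
    half_angle = math.radians(n_beams * beam_deg / 2)
    return 2 * depth_m * math.tan(half_angle)
```

Because both quantities grow with depth, the useful swath and the size of the smallest resolvable feature scale with the water depth beneath the ship.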
The general mode of operation of both the SASS and SEABEAM systems is illustrated in Figure 1. Field tests of the SEABEAM system have been thoroughly documented by Renard and Allenou (1979). Multibeam echo-sounding systems such as SASS and SEABEAM are capable of resolving features on the ocean floor as small as 20 meters in wavelength and 5 meters in height at a water depth of 3500 meters. The SEABEAM and SASS systems are therefore excellent tools not only for mapping ocean floor surface structures, but are also essential prerequisites for any serious submersible-based ocean floor project.

BATHYMETRY OF THE GORDA RIDGE - BLANCO FRACTURE ZONE INTERSECTION

A detailed study of the northern end of the Gorda Rift and its intersection with the Blanco Fracture Zone was conducted aboard the NOAA Ship SURVEYOR during February 1980. Figure 2 illustrates the location of the study area.

Figure 2 was drawn from a bathymetric map of the Gorda and Juan de Fuca plates prepared by the Pacific Geosciences Center (1979). Although the bathymetric map of the area does show the general position of the Gorda Ridge axis and the Blanco Fracture Zone, the details of ocean floor structures are missing from this generalized map.

Figure 3 illustrates the bathymetry of two sections of the Gorda Rift constructed from overlapping 1:10,000 scale machine-contoured bathymetric maps plotted aboard the SURVEYOR. The map was constructed from eleven SEABEAM lines run in a northwest to southeast direction along the axis of the rift valley and along the northwestern flank of the ridge. The detailed SEABEAM map shows two sets of surprisingly linear bathymetric features.
The northwest-southeast striking set marks the location of the Blanco Fracture Zone, and the northeast to southwest striking set delineates the position of the rift valley and the associated fault ridges located along the flanks of the ridge.

The Gorda Ridge is a seismically active (Northrop, 1970) ridge segment located about 150 kilometers west of Cape Blanco, Oregon. Analysis of magnetic anomalies (Elvers et al., 1973) suggests the ridge to be a medium-spreading ridge, with an average total spreading rate of 5.6 centimeters per year (Riddihough, 1980). The bathymetric map of the Gorda Ridge is shown in Figure 2. From the bathymetry it is seen that the rift valley occupies a narrow zone along the crest of the ridge. The Blanco Fracture Zone is delineated by a sharp ridge 500 meters high located at a water depth of 2400 meters (Figure 4). Southeast of the ridge lies a seven kilometer wide zone of small basins and volcanic edifices which ends abruptly against the northern edge of the rift valley located at 43°N, 126°36'W. The rift valley is located at a water depth of 3400 meters and is characterized by the presence of an elongate volcanic ridge, about 1500 meters wide and 250 meters high. The ridge is broken into a series of circular cone-like features at the southeastern end of the rift. The rift segment surveyed is surprisingly discontinuous, losing its distinct rift valley characteristics towards the southeastern end of the survey area.

The narrowing of the rift valley to the southeast suggests that, at least for the episode of spreading that built the rift segment studied, the spreading rate decreases with distance southeast away from the Blanco Fracture Zone. The rift valley is flanked symmetrically by rift valley walls and by blocks tilted away from the axis of the rift valley (Figures 4 and 5).

Using U.S.
Navy SASS surveys over a segment of the mid-Atlantic Ridge at 36°-37°N, Phillips and Fleming (1977) were able to map in detail the tectonic setting of the mid-Atlantic Ridge. A bathymetric cross section across the mid-Atlantic Ridge at 37°N is also shown in Figure 5. There is a surprising morphological similarity between the Gorda Ridge and the mid-Atlantic Ridge segments. The morphological similarity has probably resulted from similarity in tectonic style between "medium" spreading-rate ridges such as the Gorda Ridge (5.6 cm/year) and slow-spreading ridges such as the mid-Atlantic Ridge (2.8 cm/year).

High spreading-rate ridge segments such as the Rivera-Tamayo segment and the East Pacific Rise (Francheteau, 1980), where the spreading rate is greater than 10 centimeters per year, show different cross-sectional morphologies (Figure 5). The most prominent difference is the absence of a rift valley for the high spreading-rate segments.

BATHYMETRY OF THE BLANCO FRACTURE ZONE

The Blanco Fracture Zone is represented by an eleven kilometer wide band of ridges and basins (Figure 4). It is significant to note that the Blanco Fracture Zone is bounded on its northern side, facing the Cascadia Basin, by a narrow ridge 500 meters high with a slope of 18°. This ridge marks the leading edge of the Gorda Plate where the plate abuts the Cascadia Basin, which is located on the Juan de Fuca plate. The two sections of the fracture zone surveyed with SEABEAM are shown in Figure 3 and illustrate the lateral continuity of the Blanco Fracture Zone ridge. A band of discontinuous basins, ridges, and presumably volcanic edifices is located between the ridge and the northernmost edge of the inner rift valley (Figure 4). These intra-fracture-zone volcanic edifices were probably formed through the sporadic extension of volcanic activity northeast of the inner rift.
The presence of large volcanic edifices, such as the one located at a water depth of 2400 meters in Figure 3A at 126°30'W, 43°00'N, suggests a temporary northeastward propagation of volcanic activity from the rift. Volcanic activity apparently ceased within the propagated rift after lateral separation through seafloor spreading from the magma source had taken place. The complex nature of the bathymetry within the intra-fracture-zone band is not unique to the Blanco Fracture Zone, but has been studied along other fracture zones such as the Tamayo Fracture Zone of the East Pacific Rise (Francheteau et al., 1980). The southwestern edge of the fracture zone is cuspate (Figure 3A) and marked by the presence of steep scarps. The scarps mark the boundary between the volcanically active rift valley and the fracture zone band. The cusps appear to be remnants of major phases of volcanic activity within the inner rift of the Gorda Ridge (Figure 4). Current volcanic and spreading activity along the Gorda Ridge is typified by the presence of a prominent cusp located at 126°35'W, 43°00'N; the previous major volcanic and rifting activity along the Gorda Ridge is typified by the cusp located at 126°30'W, 43°00'N.

BATHYMETRY OF THE GORDA RIDGE INNER RIFT

The inner rift of the Gorda Ridge marks the site of current volcanic and spreading (extensional) activity along the crest of the Gorda Ridge. The rift valley is not typical of other mid-ocean rift valleys such as that of the mid-Atlantic Ridge (Phillips and Fleming, 1977). For most other rift segments studied, the width of the rift valley is constant for the extent of the ridge segment located between fracture zones, and the spreading rate is constant throughout the length of the rift.
The Gorda Rift segment surveyed during the present study, however, shows a V-shaped inner rift (Figure 3A), the V shape suggesting a discontinuous spreading rate along the axis of the rift.

A narrow, one to two kilometer wide, 200 meter high volcanic ridge extends along the axis of the rift valley. The presence of axial volcanic ridges along the rift valleys of mid-ocean ridges with slow to medium (2 to 5 centimeters per year) spreading rates has only recently been defined through the use of multibeam and deep-towed surveying techniques (Phillips and Fleming, 1977).

The axial ridge of the northernmost Gorda Rift segment seen in Figure 3A is curved westward at its northern extremity. This westward curve is hypothesized to have resulted from strain due to the differential motion along the Blanco Fracture Zone between the active Gorda and Juan de Fuca plates.

Bathymetric details of the axial ridge are shown in Figure 6. The figure illustrates copies of actual SEABEAM contoured swaths taken during the survey at an initial scale of 1:10,000 and a contour interval of 20 meters. The bathymetric map is divided into two segments for the purpose of illustration in Figure 6: the northern segment is shown in Figure 6A, the southern segment in Figure 6B. The northern segment illustrates the presence of a relatively sharp crest along the ridge that has probably been formed of coalesced volcanic cones. Axial volcanic cones are seen to be located prominently along the ridge crest in Figure 6B. The volcanic cones are generally circular and up to 100 meters high. Some cones show the presence of craters and smaller cones along their summits.

DISCUSSION

It is interesting to speculate on the presence of hydrothermal activity within the rift valley systems of the crests of mid-ocean ridges.
Crane and Ballard (1980) found that extensive hydrothermal activity is taking place along the axial ridge system of the Galapagos Ridge, whose morphology is very similar to that of the Gorda Ridge. Furthermore, extensive polymetallic sulfide generation was found to have taken place along the marginal or inner walls of the rift valley along the East Pacific Rise at 21°N (Hekinian et al., 1980). Such inner walls are prominent in the case of the Gorda Rift (Figures 4 and 5). Although the micromorphology and geological structure deduced from the multibeam surveys of the Gorda Ridge would suggest that hydrothermal activity may be present along the inner rift, no extensive bottom photography or rock dredging of this ridge segment has taken place to date. Detailed ocean floor geological work, such as precision-navigated bottom photography, precisely located sediment cores, or submersible-based observations of geological features, should be based upon the availability of high resolution bathymetric maps of the areas to be studied. Multibeam surveys of geologically significant ocean floor features such as the Gorda Rift thus become essential pathfinders and necessary precursors to meaningful ocean floor geological exploration projects.

CONCLUSIONS

1. Multibeam shipboard survey systems such as SEABEAM are essential for the detailed mapping of the structure and geology of oceanic ridge crests.

2. SEABEAM surveys of the axial region of the Gorda Ridge show the presence of an axial rift system with an inner rift and of volcanic cones within the rift. This rift morphology is characteristic of ridges with slow and medium spreading rates.

3. The detailed bathymetric map of the Gorda Rift shows the presence of crustal blocks tilted away from the axial rift.

4. The Blanco Fracture Zone appears to be bounded along the Cascadia Basin by a 500 meter high ridge sloping at 18°.
Between the ridge and the inner rift lies a 12 kilometer wide band of irregular volcanic edifices and basins that were probably formed through volcanic activity propagating north of the inner rift.

5. The Gorda Ridge displays elements of structure similar to those of the mid-Atlantic Ridge and the Galapagos Ridge. Extensive hydrothermal activity and polymetallic sulfide deposits have been observed from submersibles along the inner walls of the inner rifts of those ridges. Similar activity could be present within the inner rift of the Gorda Ridge.

ACKNOWLEDGEMENTS

I would like to thank Captain Bruce Williams and the crew and officers of the NOAA Ship SURVEYOR for the enthusiasm with which they tackled the problems associated with getting the SEABEAM system operational. Dr. Robert Embley and Commander Robert Ganse critically reviewed the manuscript. Mrs. Carol Collom prepared the manuscript.

REFERENCES

1. Crane, K., and R.D. Ballard: "The Galapagos Rift at 86°W, 4. Structure and morphology of hydrothermal fields and their relationship to the volcanic and tectonic processes of the rift valleys," JGR, 85 (1980), 1443-1554.

2. Francheteau, J., D. Needham, T. Juteau, and C. Rangin: "Birth of an Ocean," CNEXO (1980).

3. Hekinian, R., M. Fevier, J.L. Bischoff, P. Picot, and W.C. Shank: "Sulfide Deposits from the East Pacific Rise Near 21°N," Science, 207 (1980), 1433-1444.

4. Northrop, J.: "Accuracy of Earthquake Epicenters on the Gorda Ridge," Bull. Seismol. Soc. Am., 60 (1970), 265-267.

5. Pacific Geoscience Center: "Juan de Fuca Plate Map" (1979).

6. Phillips, J.D., and H.S. Fleming: "Multi-beam Sonar Study of the Mid-Atlantic Ridge Rift Valley," Bull. Geol. Soc. Am., 88 (1977).

7. Renard, V., and J.P. Allenou: "Sea Beam, Multi-Beam Echo Sounding in 'Jean Charcot'," International Hydrographic Review, Monaco, LV(1) (1979), 36-67.

8. Riddihough, R.: "Gorda Plate Motions from Magnetic Anomaly Analysis," Earth Plan. Sci. Let.
51 (1980), 163-170.

Figure 1. Operational mode of SEABEAM. a - transmission mode in the form of a 2 2/3° wide beam transmitted at 12 kHz, perpendicular to the axis of the ship. b - reception mode in the form of sixteen beams 2 2/3° wide contained within a 20° wide cone. c - resultant coverage of the ocean floor in the form of sixteen cones with subtended angles of 2 2/3° x 2 2/3°.

Figure 2. Location map of the Blanco Fracture Zone-Gorda Rift study area surveyed with SEABEAM and discussed in this paper. Contour interval 200 meters.

Figure 3. Bathymetric maps of the crest (a) and northeastern flank (b) of the Gorda Ridge at the Blanco Fracture Zone intersection. Maps were constructed from overlapping SEABEAM swaths run parallel to the axis of the rift. Contour interval 50 meters.

Figure 4. Principal morphological features of the Gorda Ridge - Blanco Fracture Zone intersection. Cross-hatched band marks the location of the axial ridge within the rift valley. Capital letters indicate location of bathymetric profiles shown in Figure 5.
Contour interval 50 meters.

Figure 5. Bathymetric profiles across the inner rift of the Gorda Ridge and comparative profiles from sections of the mid-Atlantic Ridge and the East Pacific Rise. Vertical exaggeration is 4.5.

Figure 6. Detailed shipboard contoured SEABEAM maps of the inner rift valley of the Gorda Ridge. See Figure 4 for the location of the maps. Contour interval 20 meters.

ADVANCE IN SEA BEAM DATA TREATMENT AND PRESENTATION

C. EDY
Centre National pour l'Exploitation des Oceans (CNEXO)
Centre Oceanologique de Bretagne - Brest - France

Within the framework of its mission, the CNEXO established in 1971, at the Centre Oceanologique de Bretagne (COB), the Bureau National des Donnees Oceaniques (BNDO). Having access to the common COB computers, the BNDO is both a data bank and a documentation centre.
Besides these activities, the BNDO has acquired original expertise in the problems of data acquisition and data processing at sea.

For the geology-geophysics subject, not many new developments have been undertaken since 1977 or are forecast for 1981, except:

- Update of the geophysics data base (underway data of bathymetry, gravimetry, and magnetism) for the Atlantic Ocean, Mediterranean Sea, and East Pacific Ocean.
- Transfer from the DSDP's data base of the main IPOD data file.
- Transfer of several NGSDC files (earthquake, heat flow, PETROS, CLIMAP, pipe-cores).

During this time, the BNDO has made a large effort in the treatment of multibeam Sea Beam data. As a matter of fact, the CNEXO equipped the research vessel JEAN CHARCOT with Sea Beam in May 1977 and entrusted the BNDO with the data processing and archiving. The BNDO has thus built up a certain lead in Sea Beam processing, in particular:

- Play-back of data, at the Centre (large computer, elaborate products) as well as at sea (minicomputer, simple products).
- Integration of navigation into the real-time contouring (satellite, radioelectric, or acoustic navigation).
- Methods of presentation (small scales, block-diagrams, slopes, etc.).

PROCESSING OF THE DATA FROM THE MULTIBEAM SEA BEAM

The development of the Sea Beam processing system was undertaken in 1977. We now have available a complete system for off-line play-back of the Sea Beam data, right up to precise isobath and block-diagram contouring along the ship's track. The problem of combining the data from several profiles has not been tackled but, nevertheless, a notable improvement has been made over the real-time output of the system.

We have also modified the real-time output contouring by integrating the navigation parameters. The contouring algorithms are exactly the same.

1. THE SEA BEAM SYSTEM

The Sea Beam multibeam sonar system has been previously described in the International Hydrographic Review by J. P.
Allenou and V. Renard*; we therefore give here only a summary of the characteristics pertinent to a discussion of its processing subsystem.

Sea Beam allows simultaneous measurement, on either side of the vessel, of up to 15 depths on 16 beams, 2 2/3° in width, aligned perpendicular to the ship's axis.

The system measures round-trip pulse travel times, which can be transformed to oblique distances provided the speed of sound in the water column is known. These oblique distances are then converted - after compensation for roll (pitch is allowed for at the time of transmission) - into horizontal distances and depths.

The data are recorded on magnetic tape, annotated with the time and the ship's heading, and then utilized in practically real-time contouring. The magnetic tape generated, together with navigational data, permits later off-line processing. The limited performance of the real-time contouring can be improved by more sophisticated software, and therefore a detailed description of it will be given for purposes of comparison with our off-line processing development.

* Allenou, J.P., and V. Renard (1979): Sea Beam, multi-beam echosounding in Jean Charcot - Description, evaluation and first results. Int. Hydrog. Review, LVI(1), January, pp 35-67.

1.1. Real-time Contouring by the Sea Beam System

The real-time system has a very heavy workload, and this necessitated a certain number of simplifications. Some matters were not treated in the manufacturer's manual and thus remain uncertain. The general principle, however, would appear well known.

- The ship's heading is taken as constant; this permits use of a narrow-width plotter and facilitates the generation of a rectangular grid of soundings.
- Speed was initially manually input. A recent modification allows us to introduce automatically the speed through the water (log speed) every second. This speed is not, of course, the ground speed, for the wind or current drift is not exactly known until later.
- The grid used for contouring has a fixed interval of 5 mm in both directions.
- The principle of contouring within the grid is very simple, as the lines are fairly angular, so contouring is probably very close to a linear interpolation.
- The major disadvantage of exploiting the data in real time is that we are obliged to work at the maximum scale; otherwise we would lose accuracy.

2. THE PRINCIPLES ADOPTED FOR DEVELOPMENT OF SOFTWARE FOR CONTOURING ALONG A SINGLE TRACK

2.1. Definition of Objectives

2.1.1. The emphasis was first of all put on computational speed and on minimizing memory use, as we wish to implement our programs aboard ship and to avoid overloading the office computer.

2.1.2. Plotting time was considered unimportant, in the first phase at least.

2.1.3. The chosen algorithms should allow contouring of raw data as well as the possibility of smoothing when required.

2.1.4. We considered that it would often be necessary to redraw the contours by hand, mainly to ensure that the swept swaths are correctly placed alongside two adjoining profiles.

2.1.5. In view of the great number of soundings, a certain proportion may be neglected, provided that this proportion remains very small.

2.2. Algorithms

2.2.1. General

We chose to work cycle by cycle* to minimize memory use, the considerations mentioned in 2.1.4 making it possible to plot the contours as and when wanted, without taking account of their continuity.

It can be seen that the soundings form an irregular grid; but it is quicker to contour in this irregular grid than to form a more regular one.

2.2.2. Smoothing

It seems desirable, for some areas at least, to smooth out the irregularities of bottom relief. This is done, theoretically, by averaging positions and soundings for individual beams from several successive cycles.

2.2.3.
Plotting of contours

We work on two successive cycles, and in each we successively explore each "box" (the quadrilateral whose corners are the adjacent soundings of two successive cycles).

* A complete sequence of transmission and reception is termed a cycle; by extension, the term here designates the total number of soundings taken during one whole cycle.

In a basic box, the positions of the intercept extremities are calculated by linear interpolation of the positions in proportion to the sounding values.

3. PRODUCTS AVAILABLE AT THE BNDO

3.1. Plotted Discrete Soundings (Fig. 1)

The "plotted soundings" program, useful for particular needs, is rarely used. The product obviously requires a great deal of time for hand work if a contoured map is the objective.

3.2. Simple Plotting of Isobaths (Fig. 2)

The principle of this program is to focus successively on the individual quadrilaterals of the sounding network and to trace straight-line isobath segments between points of corresponding depth for each quadrilateral. Various operations are used to render the product easier to read, visually more pleasing, and so forth: for example, by adding hachures to indicate the sense of slope directions; by thickening, or changing the colors of, "master" isobaths; and by selective numerical specification of depths. Sub-programs also enable depths to be interpolated and isobaths to be smoothed, and a special technique can be applied to maintain the isobath plot during course changes.

3.3. More Elaborate Isobath Plots (Fig. 3)

The relevant program includes, in addition to the possibilities of the simplified program, the tracing of the isobaths as joined segments of second-order tangents, and includes regular hachure-mark distribution.

3.4. Topographic Profiles (Fig. 4)

Profiles for a given beam (depth versus time or distance) can be traced using a standard BNDO format or the X-Y format.

3.5. Block-Diagram (Fig.
5)\nThe block-diagram combines several profiles; the problem of overlapping is not yet resolved.\n4. CONCLUSION\nThe year 1980 has been a very important year because of the commissioning of the play-back software and the\nsubstitution of the BNDO real-time output, which integrates the navigation parameters, for the GIC real-time output.\nIn 1981, we will work out a method of archiving and speedy selection for sea beam data; this will lead us\nto define an archiving format which we would like to hold in common with our American colleagues.\nIn connection with this, we would welcome a collaboration with the existing American teams and, perhaps, we could\nlook to the establishment of a working group on sea beam problems, especially the problems of exchange format.\nThis was discussed in Paris within the framework of the Franco-American meeting (NOAA-CNEXO), which Dr. Malahoff\nattended.\n116","VEMA 1 - ZONE ATLANTIS - UTM - ECHELLE 1/10000\nFigure 1. Plotted discrete soundings\n117","VEMA 1 - ZONE ATLANTIS - NMOY=4; NTROU=5\nBNDO - PROJECTION MTU - ECH 1/10000\nFigure 2. Simple plotting of isobaths\n118","VEMA - NTROU=5 NMOY=5 - ECHELLE=1/50000\nFigure 3. More elaborate isobath plots","SEANOD N40003 15 FAISCEAUX (-7,+7) - NMOY=1 1/100000 EV=3\nFigure 4. Topographic profiles","Figure 5. Block-diagram\n121","BATHYMETRIC DATA REDUCTION SUBSYSTEM\nGordon M. Frank-Cartographer\nJames P.
Moran-Cartographer\nJohn Lanier-Physical Scientist\nDefense Mapping Agency Hydrographic/Topographic Center\nWashington, D.C.\nABSTRACT\nThe Bathymetric Data Reduction Subsystem (BDRS) represents a new concept in bathymetric data reduction\nprocessing. The BDRS is the primary support subsystem to the Bathymetric Information System (BIS). BIS is one of eight\ninteractive data base systems to be developed under a global control system known as the Source Data Base System (SODS).\nSODS will provide users with both centralized access and information manipulation capability for these various information\nsystems, including BIS. Users will have no direct access to the BDRS, but they will be able to retrieve information from BIS\nthrough SODS.\nThrough the use of various software programs, the BDRS accepts digital data input from analog sources (such as\nsmooth sheets and fathograms/echograms) and magnetic tape. Output from the system will consist of proof plots, track-line\ngeneration, data base access and editing, and "quick look" capabilities into the data base. New developments contained in the\nsystem are: a voice recognition digitization system, which allows the input of sounding data by spoken word; digitization of\nfathograms and the subsequent proof-plot generation using the ship's log and the digitized soundings; and the ability to geo-section\nportions of the data base using circular, triangular, or polygonic searches to define user areas of interest.\n122","TABLE OF CONTENTS\nI. INTRODUCTION\nA. Objective\nB. Development\nII. INTEGRATION WITH OTHER HTC SYSTEMS\nIII. DATA INPUTS\nA. Analog Data\nB. Numerical Data\nIV. DATA OUTPUTS\nA. Plots\nB. Graphics Displays\nC. Summary Reports\nD. Magnetic Tape Files\nV. SYSTEM\nA. Hardware\nB. Operating System and Related Data General Operating Components\nC. Software\n1. Digitization\n2. Batch\n3. Data Base\nVI. HIGHLIGHTS OF THE SYSTEM\nA. Voice Recognition Digitization as Applied to Bathymetry\nB.
Echogram (Fathogram) Digitization and Verification\nC. Geo-Sectioning of the Data Base\nVII. CONCLUSION\nVIII. ILLUSTRATIONS\n123","I. Introduction\nThe Bathymetric Data Reduction Subsystem (BDRS) represents a new concept in bathymetric data reduction processing in\nthe Scientific Data Department of the Defense Mapping Agency Hydrographic/Topographic Center (DMAHTC). By the use of\nstate-of-the-art technology in both hardware and software, the design of the BDRS will provide an initial influence on future\ndata-base-structured systems to be developed at DMAHTC.\nA. Objective\nWith increased emphasis on the acquisition of bathymetric data in digital form, the BDRS will provide a means of\nstoring, retrieving, and manipulating data to provide products for the support of nautical and bathymetric charts. The BDRS\nwill also provide a means of comparing incoming data with current, up-to-date data for accuracy and integrity. This added\nfeature at DMAHTC will provide the capability of identifying where good data exist and where more data should be acquired.\nB. Development\nThe BDRS was developed by the Synectics Corporation (Fairfax, VA), under contract with the Rome Air\nDevelopment Center (RADC) and the Defense Mapping Agency. Synectics Corporation performed both the system\nrequirements study and the analysis/design activities, which resulted in software/hardware specifications for the BDRS. Much of\nthe software used in the development of the BDRS had been previously developed for use on the UNIVAC 1108 mainframe. It\nwas the job of the primary contractor (Synectics) to adapt existing software and develop new software for a mini-computer environment with\non-line (interactive) capability and to provide the appropriate peripheral devices for each on-line function. The function of\nRADC was to manage the development and system implementation by Synectics. DMAHTC provided the technical expertise\non the bathymetric data handling capability and functions at the Center.
With all three components working together, the\nBDRS became a unique on-line data reduction subsystem with bathymetric data base capabilities.\nII. Integration With Other HTC Systems\nThe BDRS is an integral part of the conceptual Bathymetric Information System (BIS), which is vital to the Global Plan\n(Phase I) of the Source Data Base System (SODS). Its purpose is to assist in providing timely information to authorized DMA\nusers (both internal and external) on compilation, source, and index data.\nAs shown in Figure 2-1, the BDRS is the "work-horse" in this scheme. Because basic analyses and evaluations will be constantly\nprocessed to update the BIS for new real-time processing, users will initiate a request through the BIS, where it is\nvalidated, giving the requester access to the indices showing the areas of coverage, reliability of data, accuracy, and\ngeographics.\nThe BDRS must maintain a flexible posture to support the various interfaces for the expansion of existing functions.\nThese functions include software programs, digitization, analysis, and hardware operations.\nSupport to worldwide weapon systems is a paramount MC&G requirement at DMAHTC. Additionally, the Center receives\nthousands of data points, track lines, surveys, and similar data each month from naval sources, government agencies, and\nprivate firms. These data must be quickly synthesized and readied for use by the analysts for chart production. The incoming\ndata, which have been evaluated by the hydrographic analyst, can be reduced and verified if proper documentation\naccompanies the submission. It is estimated that collected source data can be readied for use more quickly than with some current\nmethods using semi-automatic ADP systems.\nThe BDRS must also be compatible with the Lineal Input System (LIS), whose function it is to reduce analog data to a\ndigital format.
The main cost savings here are the speed with which a nautical chart can be produced or corrected and the reduction of\ndata redundancy and manual files.\nFinally, as the BDRS is integrated with other systems at DMAHTC, it is not intended to be a "cure-all" system or a\nsubstitute for all existing methods. Instead, it is a system which has been designed to apply state-of-the-art technology to\nMC&G areas, specifically to the hydrographic product, and to effectively reduce the total costs of producing a usable, safe,\nand accurate product.\nIII. Data Inputs\nDMAHTC receives data from worldwide sources. These data fall into several general categories of hardcopy and softcopy\nmedia, and many hardcopy holdings predate the 20th century.\nThe hardcopy medium consists of survey smooth sheets, random track charts, program listings, sounding logs, and\nechograms. These data can be summarized as analog-type. The softcopy medium includes magnetic digital tapes and,\noccasionally, a card deck. These data can be conveniently summarized as numerical-type input. A detailed explanation is given\nof the general inputs of analog and numerical data below.\n124","A. Analog Data\nThe classification of analog data includes mostly survey smooth sheets, random tracks, and echograms,\nwhich consist of lineal and point data. These data graphically represent the physiography of the ocean floor. They are obtained by\ntransiting predesignated lines over an area or surface, either creating a controlled pattern of sounding data (survey) or\ngathering data incidentally to the original mission of the vessel (random). Both types of data are important, needless to say,\nbut the random track data can be, and often are, the only data available from which to build a nautical product. Echograms\nusually accompany both random and survey data. They are especially useful for verification purposes. Subsequently, all data\n(survey and random) must be analyzed for horizontal and vertical accuracies.
The analysis is measured against a\npredetermined standard for accuracy. An example would be the southern oceans, where the standards are usually lower due to\nthe unavailability of substantive information, a result of fewer ships in transit compared with the northern oceans.\nIn contrast, the North Atlantic region has a higher evaluation standard, particularly in the vicinity of shipping lanes.\nEssentially, the evaluation of bathymetric data is relative to a geographical region and the density of available quality data in\nthat region. The BDRS is designed to sort and track these various parameters for data evaluation.\nB. Numerical Data\nNumerical data are sounding logs, digital magnetic data tapes, or single point positions and soundings. Presently,\nmany of these data must be digitized and manually compared to the associated source, such as a ship's track, for evaluation.\nThe advent of the BDRS can greatly simplify this process and make it less tedious than current methods allow. A comparison\nis made and finally evaluated with visible overlays plotted to the scale of the track or echogram. This procedure immediately\nshows whether the data are valuable for storage purposes. Over the last few years, many of the data submissions to the\nCenter have been in a digital format, which makes them easier to plot and verify, after the header information has been\ndeciphered, if the data package has been properly documented. Over the years, DMA has had good success with the data it has received from the\nU.S. Navy, petroleum companies, the National Ocean Survey (NOS), the Naval Research Laboratory (NRL), and the Naval\nOceanographic Office (NAVOCEANO).
The merchant marine community is also an excellent source of information; however,\nmost of their submissions are through conventional means (hardcopy) and random in nature.\nLastly, Figure 3-1 shows a simplification of the data types received and how they are interfaced within DMAHTC\nto support the worldwide users of hydrographic information and other specific data.\nIV. Data Outputs\nThe BDRS will be used as a support tool primarily for the Scientific Data Department at DMAHTC. However, its\ninformation can be used by anyone requiring access to bathymetry in digital form.\nBasically, the BDRS has outputs of four types: plots, graphics displays, summary reports, and magnetic tape\nfiles.\nA. Plots\nPlots are any output which will eventually be produced from a readable tape on a Xynetics or Calcomp plotter.\nPlots of this type include: proof plots for verification against existing data plots, General Bathymetric Chart of the Oceans\n(GEBCO) track-line plots of ship tracks registered while soundings were being recorded, and plots of a geo-sectioned area\ncontained in the data base (geo-sectioning will be described later in this paper). All plots output by the BDRS provide the\nanalyst with a reference sheet of what data are contained in the BDRS data base. Plots provide a vital support tool for the\npreparation of source information packages, which are the first step in producing a new chart.\nB. Graphics Displays\nThe BDRS can also output data in the form of a "quick look" display of the digital data. This can be performed\non one of three CRT display devices. This "quick look" feature allows an analyst to examine the quality of data available in a\nregion without having to wait for a hardcopy output to be constructed. The BDRS also allows access to any textual or source\nidentification information associated with a unique document identifier. The unique document or data holding is\nreferenced by what is called a document number.
This is an account number which is assigned to a unique group of data when\nit arrives at the Center. The BDRS can output all document numbers in a region and describe the area encompassed by each\ndocument number.\nC. Summary Reports\nWhenever a conversion or any other batch process takes place on the BDRS, a record of the transaction is stored\nand can be retrieved and easily printed using the line printer. Information contained on these reports identifies all old and new\nparameters that were involved in the conversion process. In addition, when a retrieval from the BDRS data base is requested using\na document number or geographics, a summary report is produced to summarize what the user created. This feature can be\nconsidered a supervisory function of the BDRS.\nA dump of all data blocks entered into the data base or contained in a geographics file can also deliver a\nsummary report on the line printer, allowing the user to do a sounding-by-sounding check of the data.\n125","D. Magnetic Tape Files\nA file can be dumped to magnetic tape for input into other system routines. Output is executed using BDRS\noperating system commands. Digitized data can also be output to tape using a software routine which outputs an octal dump\nof the data.\nV. System\nA. Hardware\nThe BDRS consists predominantly of off-the-shelf equipment that can be acquired from vendors by way of their\nindividual product lines. An illustration of the BDRS hardware is given in Figure 5-1.\nThe computer used for the BDRS is a Data General Corporation (DGC) Eclipse C-300 with a 92 MB Data\nGeneral disk unit. The disk is the mass storage device on the BDRS and contains all operating system software, user-generated\nsystems software, and data base information. The Eclipse is a 16-bit machine with 128K words of core memory,\nexpandable to 256K words with an optional map.
Two 9-track Data General magnetic tape drives are also linked to the\nmain Central Processing Unit (CPU) for tape input/output functions.\nThe BDRS supports two Data Automation digitizing tables. The first, known as Station 1 (See Figure 5-2),\nconsists of a digitizing table, a Data Automation standard 3-inch field-of-view cursor with a 16-key keyboard, and a Tektronix\n4010 graphics display terminal. The table has a free-cursor electronic digitizing system which provides a measurement\naccuracy of 0.005 inch over an area of 60 inches in the X direction and 42 inches in the Y direction. Measurement resolution\nis 0.001 inch with repeatability of 0.001 inch. The keyboard is used primarily for depth entry, but can also be used for editing\nand deleting functions.\nThe second table, known as Station 2 (See Figure 5-3), is basically the same unit as Station 1, except for one\nvariation. Station 2 has a Threshold-500 voice entry system with a visual feedback display cursor. The Threshold Technology,\nInc., system is an electronic speech recognition system which automatically recognizes up to 32 isolated spoken words of up\nto 2.0 seconds duration each. This system is a prototype device which will be used as an alternative to the manual depth entry\nkeyboard. This device can be trained by up to 99 users and serves as an aid to rapid depth entry. The visual feedback\ncursor (developed by the Data Automation Company) displays the spoken numerals, in the order of entry, on the cursor itself and\nallows for checking voice entry before the actual entry is recorded in the system.\nTwo automated drafting plotters are associated with the BDRS, but are not directly connected to it. One is a\nflatbed plotter manufactured by Xynetics Corporation (Model 1101).
The other is a drum plotter manufactured by Calcomp\nCorporation (Model 936).\nPeripherals contained on the system (See Figure 5-1) are a Centronics 101A line printer, a Data General 6012\nCRT (which is used as the master terminal), and two dedicated devices, a Data General 6052 Dasher terminal and a Tektronix\n4014 graphics CRT, which are used for data base operations only.\nB. Operating System and Related Data General Operating Components\nThe operating system of the BDRS is Data General's Real-Time Disk Operating System (RDOS). It is a mapped\nsystem, which means it can run two user grounds (foreground and background) simultaneously. The primary interface between\na user at the system console and RDOS is Data General's Command Line Interpreter (CLI). The CLI is designed to accept command\nlines from a system console and translate them into operating system commands.\nThe BDRS is supported by Data General's INFOS. INFOS is a file management system which maintains the\nBDRS data base access and control capability. INFOS is intended to provide a concise and flexible method for the creation\nand processing of sequential files, random files, indexed sequential files, and data base files.\nOther system features of the BDRS include: the Data General Text Editor, the Data General Octal Editor, the Data General\nExtended Arithmetic Processor, Fortran IV and Fortran V compilers, and a power-fail detect and automatic restart feature.\nC. Software\nThe BDRS software can be described as being made up of three subsystems or modes with several special-purpose\nroutines. The three subsystems are: digitization, batch, and data base.\n1. Digitization\nAs can be seen in Figure 5-4, the digitization process can input several types of information in\nseveral different ways. During what is called the initialization mode, chart information is entered and recorded for use later\nin the Batch conversion process.
Once this information is entered, the chart, echogram (fathogram), or smooth sheet is\nregistered to the table. When a chart is registered, it is said to be relatively referenced to the table X/Y coordinates. From 2\nto 10 points can be referenced (the usual being just 4 points). These reference points are flagged by a geographic coordinate\nwhich, later on in the batch subsystem, can be directly merged with the X/Y data to produce a geographic file (GO). All X/Y\npositions on the table are entered via the two cursors at Station 1 and Station 2 (See Section V-A "Hardware"). Two methods\nexist on the BDRS for depth entry. Both stations have the ability to enter depths manually via a keyboard assembly. Station 2\nhas a new function (as applied to bathymetry) of allowing depth entry via the spoken word. Station 2 (See Figure 5-3) has a\n126","Threshold Technology Model 500 voice entry terminal. Once an operator has trained his or her voice on this system (training\nmeans speaking in a consistent manner into the system's microphone and allowing a digital vocal pattern to be defined), an\nentry of depth can be registered both visually and on disk.\nThe digitization process's most important mode of operation is known as Master Mode. This\nmode controls all inputs or operator commands selected by the user. Five subfunctions exist under Master Mode, and the\nmajority are accessed via the digitizing tables' 5-pushbutton cursor. Three of these subfunctions control the input of data\n(Trace Mode, Depth-entry Mode, and Discrete Point Mode). Trace Mode inputs any linear or continuous feature such as a\nshoreline, contour, or echogram trace.
Trace Mode also inputs any feature which consists of a continuous line segment.\nDepth-entry Mode allows the user to input independent depths using either the manual entry system or the voice entry system.\nDiscrete Point Mode allows the entry of individual grid ticks, where they can be used for reference to a grid or projection type\nrelating to the source material. Two other subfunctions exist on the BDRS which allow corrections or modifications to be\nmade. The first is known as Review Mode, which allows the review or examination of everything entered via the previously\ndescribed input functions. A window relative to the area positioned by the cursor is set up, and all entries within this window\nare flashed on the Tektronix 4010 CRT for visual verification of all entries. The other subfunction used for verification or\nmodification is Edit Mode. This mode allows the operator to delete, change, or modify any feature, such as a sounding,\nentered during any of the first four modes.\nOutputs from the digitization process are in the form of three files (See Figure 5-4). The Data\nFile (DF) holds all of the X/Y data recorded by the cursor. The Header File (HF) holds all header information that describes a\nunique feature. The Index File (IF) holds all source information that will be used in the batch conversion process. These three\nfiles can all be deleted using RDOS commands (See Section V-B) from the background master terminal.\nThe BDRS also has the unique ability to digitize an echogram and allow it to be converted in\nbatch to a file that can be entered into the BDRS data base. When digitizing an echogram (a depth-versus-time analog source),\nTrace Mode is used and the depth curves contained on the echogram are recorded.
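Digitized depth curves of this kind are later merged in batch with the previously entered ship's log to produce geographic records (see the batch subsystem below). That merge can be sketched as a simple time interpolation. This is a hypothetical illustration, not the BDRS routine itself; the function names, the decimal-degree fixes, and the assumption of straight constant-speed runs between logged fixes are all illustrative:

```python
def interp_fix(log, t):
    """Linearly interpolate a (lat, lon) position at time t between
    the two bracketing fixes of the ship's log."""
    for (t0, la0, lo0), (t1, la1, lo1) in zip(log, log[1:]):
        if t0 <= t <= t1:
            f = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
            return la0 + f * (la1 - la0), lo0 + f * (lo1 - lo0)
    return None  # sounding falls outside the logged track

def merge_echogram(log, trace):
    """Merge a ship's log [(time, lat, lon), ...] with a digitized
    echogram trace [(time, depth), ...] into (lat, lon, depth) records."""
    out = []
    for t, depth in trace:
        fix = interp_fix(log, t)
        if fix is not None:  # drop soundings with no position control
            out.append((fix[0], fix[1], depth))
    return out

# A made-up half-hour run between two logged fixes (times in hours).
log = [(10.0, 29.300, -42.520), (10.5, 29.340, -42.470)]
trace = [(10.0, 3500.0), (10.25, 3620.0), (10.5, 3580.0)]
records = merge_echogram(log, trace)  # midpoint sounding lands near (29.320, -42.495)
```

A real routine would also apply the loxodrome bearings and velocity changes recorded in the log rather than assuming straight runs.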
When a change of scale occurs, or when a long\nechogram has to be reregistered, a new file (FF-Fathogram) is created which holds all change information.\nAs can be seen in Figure 5-4, a job which has already had data entered can be reentered, and new\ndata can be appended to it. When this happens, the job must be reregistered and checks must be made against residuals to\nassure that the same chart has the same "day one" registration. Residuals are error parameters which tell how far away from\nthe original (day one) registration a chart is the second or third time it is used. Ideally, day one registration should show zero\nresiduals, and day two and after should be no greater than 0.005 inch off (residuals are measured in units of 0.001 inch).\n2. Batch\nThe batch subsystem provides the interface between the digitizing subsystem and the data base\nsubsystem. Each function can be described as standalone, meaning it performs one function and one function only.\nGeneral categories of batch subsystem functions are: plotting functions, conversion functions, and input/output batch\nroutines (See Figure 5-5 for a diagram of batch subsystem data flow).\nWhen plotting functions are defined, reference is made to any routine in batch which produces a\nXynetics or Calcomp plotter-readable magnetic tape. These routines include Proof Plot (used to plot an overlay of previously\ndigitized soundings), Scaling Echogram (used to plot an overlay of a previously digitized echogram/fathogram), and Plot Driver\n(which is used to create a Xynetics or Calcomp readable tape from the previous two routines). A routine called GEBCO Track\nLine is also contained in subsystem batch and is used to plot "track lines," or ship's courses, across the ocean.\nConversion routines used in this subsystem cause a file or files to be changed over to a new format.\nThe most important of these is a routine called Table to Geographics conversion.
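A conversion of this kind, from table X/Y inches to geographic coordinates through the registration reference points, can be sketched as a least-squares affine fit. This is a hedged illustration, not the BDRS algorithm: the fitting method, function names, and sample coordinates are assumptions, and a production routine would account for the chart's projection rather than assuming an affine relation:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_affine(table_pts, geo_pts):
    """Least-squares fit of lat = a*x + b*y + c and lon = d*x + e*y + f
    through the registration reference points (normal equations)."""
    N = [[0.0] * 3 for _ in range(3)]
    rl, rn = [0.0] * 3, [0.0] * 3
    for (x, y), (lat, lon) in zip(table_pts, geo_pts):
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                N[i][j] += row[i] * row[j]
            rl[i] += row[i] * lat
            rn[i] += row[i] * lon
    return solve3(N, rl), solve3(N, rn)

def to_geo(coef, x, y):
    a, b, c = coef
    return a * x + b * y + c

# Four made-up registration points: table inches -> (lat, lon).
table = [(0.0, 0.0), (30.0, 0.0), (30.0, 20.0), (0.0, 20.0)]
geo = [(29.30, -42.52), (29.30, -42.47), (29.34, -42.47), (29.34, -42.52)]
lat_c, lon_c = fit_affine(table, geo)

# Day-one check: the fit must reproduce the reference points; on
# reregistration, the residuals would be compared against tolerance.
for (x, y), (lat, lon) in zip(table, geo):
    assert abs(to_geo(lat_c, x, y) - lat) < 1e-6
    assert abs(to_geo(lon_c, x, y) - lon) < 1e-6
```

With more than three reference points the fit is overdetermined, which is what makes the residual check against the 0.005-inch tolerance meaningful.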
When a job is completed on one of the two\ndigitizing tables, the files created are said to be in table (X/Y) coordinates. This routine takes the table file and converts it to\na geographic file (latitude, longitude, depth). This is an important step because the BDRS data base is stored in geographic\ncoordinates. When an echogram is digitized, another routine is run which requires an echogram table file and the previously\nentered ship's log file. These two are merged, and a geographic file is produced. This file can now be entered into the data\nbase in the same way as any other geographic file. Other routines contained in the batch subsystem include Geographics to\nTable conversion (used to create a new table file from a previously created geographic file), Depth Adjustment (used to filter\ndepths in dense data regions), and Datum Shift (which converts from one datum to another).\nInput/output batch routines are also used to capture data previously created on other systems.\nSystems to be interfaced with the BDRS are the Lineal Input System (LIS) (the data digitizing system used in the Hydrography\nDepartment at DMA) and UNIVAC 1108/81 holdings of previously digitized soundings, held on tape and accessed off-line by\nbatch processes. The BDRS batch subsystem can also transfer files to and from magnetic tape. The output from this routine\nis an octal dump which can be used on the system.\n3. Data Base\nThe BDRS data base subsystem is designed to create, update, and maintain a viable holding for\nbathymetric data. The data base utilizes Data General's INFOS file management system. The design of the data base\nconsists of three unique modes of operation: Master Mode, On-Line, and Batch (See Figure 5-6, BDRS data base data\nflow).\n127","Master Mode gives a system supervisor the capability to assign a log-on password and read and/or\nwrite privileges to the data base. The above parameters can be deleted or modified to accommodate the needs of the user.
The\nsystem supervisor can also reinstate or delete those documents flagged logically for deletion during the on-line process.\nBasically, Master Mode is a tool designed to ensure data base integrity.\nThe On-Line Mode is designed to input document source description records and is capable of\nrandom geographic queries of the data base. To support this function, the Data General (DG) 6052 and Tektronix 4014\ngraphics CRTs are used, respectively (See Sec. V-A "Hardware"). The user can input, review/modify, or delete source\ninformation via the DG 6052 CRT, recalling that in Master Mode he or she was assigned a password with read and/or write\nprivileges to the data base. To input a document source, the user defines the geographic area (bounding rectangle) and a 10-digit\ndocument number. The document number is assigned to any data (analog or tape) which are received by the Scientific Data\nDepartment, for catalog purposes. The source descriptive record(s) consist of unique parameters defining the type and quality\nof data associated with the document. These values are entered and can be reviewed or modified. If modification is desired,\nthe user can update any parameter as required. The deletion process allows the user to logically delete the entire document or\nparticular sheet(s) of that document. Only the system supervisor can physically delete any data from the data base. This is\nanother tool to ensure data base integrity.\nContinuing with On-Line, geographic queries can be performed from both terminals. Utilizing the DG\n6052, the user queries the data base by inputting 2 to 8 points. The retrieval displays the document(s) for review. Another\nfeature is point search. This allows the user to define an exact location (latitude and longitude) to search the data base for\none data value. This is extremely useful when questioning the validity of a sounding value.
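The 2-to-8-point queries above, and the circular, triangular, and polygonic sectioning used elsewhere in the system, reduce to point-in-region tests against each stored record. A minimal sketch follows, assuming flat latitude/longitude coordinates; the function names and the ray-casting polygon test are illustrative choices, not the Synectics algorithm, and real code would handle the prime-meridian and equator crossings the paper highlights:

```python
def in_circle(p, center, rim):
    """Circular section: two points give the center and a rim point."""
    r2 = (rim[0] - center[0]) ** 2 + (rim[1] - center[1]) ** 2
    return (p[0] - center[0]) ** 2 + (p[1] - center[1]) ** 2 <= r2

def in_polygon(p, verts):
    """Ray-casting inclusion test for a 3- to 8-vertex section."""
    x, y = p
    inside = False
    for (x0, y0), (x1, y1) in zip(verts, verts[1:] + verts[:1]):
        if (y0 > y) != (y1 > y) and x < x0 + (y - y0) * (x1 - x0) / (y1 - y0):
            inside = not inside
    return inside

def geo_section(records, pts):
    """Retrieve (lat, lon, depth) records inside the defined area:
    2 points -> circular, 3 -> triangular, 4 to 8 -> polygonic."""
    if len(pts) == 2:
        return [r for r in records if in_circle((r[0], r[1]), pts[0], pts[1])]
    return [r for r in records if in_polygon((r[0], r[1]), pts)]

# Two made-up data base records and a 4-point (rectangular) section.
records = [(29.32, -42.50, 3600.0), (12.25, -44.03, 3500.0)]
box = [(29.0, -43.0), (29.0, -42.0), (30.0, -42.0), (30.0, -43.0)]
hits = geo_section(records, box)  # keeps only the first record
```

An exact-location point search is the degenerate case of a very small circular section centered on the queried latitude and longitude.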
The Tektronix 4014 allows the user\nto query the data base geographically and display the retrieved data graphically. The graphic display consists of the ship's\ntrack as stored in the data base and referenced by the geographics used for that query. The retrieved data can be used as an\noutput file to be plotted for the user. The Batch Mode of the data base subsystem allows the user to interactively load, query,\nand logically delete data from the data base. The load process requires a BDRS standard geographic file, the document, and the\nrespective source descriptive parameters. The standard geographic digital file is transferred with its respective source\ndescriptive parameters into a hierarchical data base. A query of the data base is accomplished by document or geographic\nretrieval. The document retrieval searches the "geographic" data base and displays the document(s) and the associated source\ndescriptive parameters. The geographic retrieval allows the user to query the data base by two methods, geographically and\nlogically. A logical query can be by a ship's name and/or data evaluation (See On-Line Mode). Geographic sectioning of the\ndata base retrieves data by a circular, triangular, or polygonic method (See Section VI-C). The output from geo-sectioning is\nplotted on the Tektronix 4014 graphics CRT in the form of track lines. A list of all documents contained within the defined\ngeographical area will be displayed. This display can be transferred to a BDRS geographic formatted file. The final output of\na geographic query is used to produce a computer plot via the Xynetics or Calcomp plotters. Finally, the data base Batch Mode\nallows for logical deletion of a document (See Master Mode). A report is generated on the line printer to provide the user\nwith a hardcopy summation of each request during the Batch Mode.\nVI. Highlights of the System\nA.
Voice Recognition Digitization as Applied to Bathymetry\nThe BDRS allows the input of soundings by a different method than has previously been employed in a\ndigitization process. Station 2 (See Section V-A "Hardware") is connected to a Threshold Technology, Inc., Model 500 voice\ndata entry terminal. This hardware unit allows a user to input independent numeric data by spoken word. Once the values of a\nsounding are spoken into the machine, the user can enter the sounding into a user-defined file as an X, Y, Z value. Numeric values\nare displayed back to the user both on the data entry terminal and on a special Data Automation LED display cursor.\nHaving the values displayed in this way allows real-time checks of each sounding before it is entered into a file. The voice\nsystem has several advantages and disadvantages. One advantage is that the operator's hands are free to hold the cursor at a\nstatic position, which allows for a more accurate entry of the X, Y position. This system also allows up to 99 different voices\nto "train" the system. A disadvantage of the voice system is that it is an early version of Threshold's voice system, and it does\nnot accept spoken words as fast as some of the later models. The system can be field-upgraded with new hardware, which will\nallow it to handle a much larger vocabulary and to be much faster. For purposes of the BDRS, however, the voice entry system\nprovides an alternative to manual entry, and it does allow reasonably fast data entry.\nB. Echogram (Fathogram) Digitization and Verification\nA very significant development of the BDRS is the ability to digitize an echogram, convert it to geographic\ncoordinates, and examine the recording ship's track by displaying it on a CRT screen. An echogram shall be defined as any\ngraph that contains depth-versus-time axes and a copy of the recording ship's log of geographic fixes, loxodrome bearings, and\nship velocity changes.
Up until this point, all previous abilities to digitize echograms had produced output in X, Y, depth format. The\nBDRS produces geographic-depth positions, and because of the geographic nature of the BDRS data base (See Section V-C\n"Software"), echograms can be entered directly into the data base, and plots of these data can be made using the BDRS geo-\nsectioning routines. This is a significant improvement over the previous hand calculations that took place when utilizing an\nechogram (fathogram).\n128","C. Geo-Sectioning of the Data Base\nGeo-sectioning (geographic) is defined as the ability to access a data base by defining geographic coordinates (in\nthe BDRS one can define 2 to 8 points) or bounds and outputting all information contained within this specified area to a\ndisplay device. The BDRS allows a user to define a circular section (2 points), a triangular section (3 points), or a polygonic\nsection (4 to 8 points) anywhere in the global data base and to output it to a CRT screen or to plotter-readable coordinates.\nThe major emphasis here is on the phrase "anywhere in the global data base" because the BDRS, for the first time, can section\ndata on either side of the prime meridian and north or south of the Equator. This is a breakthrough development, and\nSynectics Corporation should be given full credit for developing the algorithm. The BDRS data base is geographic in nature,\nand because of this the sectioning routine outputs data in the form of a geographic file. This file can be input into many other\nBDRS batch routines and converted to a plot of the sectioned area. The plot can then be utilized as a product in support of the\nproduction effort for nautical charts.\nVII. Conclusion\nThe Bathymetric Data Reduction Subsystem (BDRS) is one of the newest interactive data bases in the Department of\nDefense. The BDRS was conceived for the support of weapons systems that require worldwide hydrographic data.
Being interactive is cost-saving in that real-time turnaround permits rapid editing and modification of the data base. Moreover, advanced editing and updating features allow entered data to be kept at maximum accuracy and integrity with minimum redundancy. With such an accurate data base, mission goals of DMAHTC can be met more rapidly and with improved accuracy.
It is necessary to note that the BDRS is a prototype unit and is still under testing for compatibility with present DMAHTC support applications. At present the system seems to fit well into the Center's current compilation and source-package preparation flow, and it appears to be a workable application of automation at DMAHTC. The current system will have to be expanded, however, with respect to memory as more and more data are input into the system. This should be no problem because of the Data General Eclipse C-300's compatibility with new peripheral equipment (such as disk units) and software application programs.
In conclusion, the BDRS is a fine mixture of components which perform a specific function relative to bathymetry. It has a place at the Defense Mapping Agency Hydrographic/Topographic Center and will influence the creation of future data base systems at the Center.
SELECTED BIBLIOGRAPHY
1. American Practical Navigator - An Epitome of Navigation, written originally by Nathaniel Bowditch, LL.D. Published by the Defense Mapping Agency Hydrographic/Topographic Center, Volumes I & II, 1977 and 1975 editions, respectively.
2. Bathymetric Data Reduction System, Final Technical Report (RADC-TR-80-273), prepared by Synectics Corporation for Rome Air Development Center, Griffiss Air Force Base, Rome, N.Y., August 1980.
3. Guide to Marine Observing and Reporting (PUB. 606), publication prepared by: The U.S.
Naval Oceanographic Office, the United States Coast Guard, the National Oceanic and Atmospheric Administration, and the Defense Mapping Agency Hydrographic/Topographic Center, 1977.
4. Test and Implementation Plan for Bathymetric Data Reduction Subsystem, prepared by Synectics Corporation for Rome Air Development Center, Griffiss Air Force Base, Rome, N.Y., January 1980.
[Figure 2-1. Bathymetric information concept: users submit requests to the BDRS and receive updated reports from it.]
[Figure 3-1. BDRS simplified interfaces: logs, echograms, tape, and terminal input; reports, charts, plots, and tracks output.]
[Figure 5-1. BDRS hardware configuration: Data General Eclipse C-300 (128K) with 92-megabyte disk, 9-track tape, communications chassis, Centronics printer, Tektronix 4010 and 4014 terminals, Data General 6052 and 6012 CRTs, Data Automation logic, and digitizing Stations 1 and 2.]
[Figure 5-2. Station 1: Eclipse C-300, Tektronix 4014, and digitizer table with control lines.]
[Figure 5-3. Station 2: Eclipse C-300, voice digitizer unit, digitizer table (with display on cursor and record button), and Tektronix 4010.]
[Figure 5-4. Digitization input and output: voice-entry and table X/Y data and operator commands feed the digitization process, which writes BDRS standard header (HEXXXXX), chart index (IFXXXXX), data (DFXXXXXX), and fathogram (FFXXXXXX) files, with visual verification against existing header, index, data, and fathogram files and old job information.]
[Figure 5-5. Batch subsystem data flow: master-console functions (BDRS table to geo, geo to table, table to/from mag tape, depth adjustment, datum shifts, fathogram processing, echogram scaling, plotting, processing of LIS and 1108 files, summary reports) operating on disk and mag tape files and producing Xynetics and Calcomp plot files.]
[Figure 5-6. BDRS data base data flow: master-mode batch and on-line options (input and output processes, physical and logical deletion of documents) and master-console review/modify, logical delete, and insert operations on document and source records in the BDRS data base, with line printer and mag tape output.]
A United States continental margin gridded bathymetric data base
Edward C. Escowitz, Roger W. Bowen, and Joseph M. Botbol
U.S. Geological Survey
Michael S. Loughridge, Peter W. Sloss, and Robert B. Crowell
National Oceanic and Atmospheric Administration
Abstract
The U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) are assembling a gridded bathymetric data base for the continental margins of the United States. The NOAA library of coastal bathymetric surveys is the source of the data, which were gridded by computer to form an initial data base. This initial data base will be supplemented by incorporation of additional data from the USGS and other sources. Geographic control for gridding is based upon a latitude and longitude mesh that defines 15-second, rectangular unit cells. For each cell, the data base contains the average depth, number of observations, standard deviation, maximum and minimum depth within the cell, and a numerical code indicating the origin and/or nature of the data. The USGS will apply the data base to mapping programs, equipment siting, and numerical modeling.
Introduction
Recent projects involving geologic mapping, equipment siting, and numerical modeling of geophysical processes have demonstrated a need for a readily accessible computerized bathymetric data base of the U.S. continental margins. Presently available bathymetric data in graphic form, such as a map or chart, are limited by fixed scale, projection, and document publication date. The graphic form of the data makes any transformation of scale and projection laborious.
Most marine geologic or oceanographic projects require presentation of bathymetric data at a variety of scales. Present practice is to change scale photographically or by very tedious manual transcription. In some cases the original bathymetric map is manually digitized for subsequent computer processing, which, depending on the nature of the source document, can add to or compound the inherent map errors.
An enormous quantity of computerized raw bathymetric survey data is presently available to the public from the NOAA data library. For any given geographic area the computer-assisted data user is confronted with the task of reducing a large quantity of raw data to a manageable size for initial application. The existing data base at NOAA is awkward for the planned applications of the USGS. Therefore, the USGS and NOAA have compiled a reduced version of the NOAA raw bathymetric survey data base. The raw data are reduced in volume by gridding into 15-second rectangles of latitude and longitude. Gridded data-base storage, maintenance, updating, and retrieval are supported by computer programs written by the USGS specifically for these purposes.
As new (or additional) data are acquired, grid values will be updated or, where no present data exist, grid values will be assigned. For those areas that require representation by a grid finer than 15 seconds, separate data bases must be constructed. For those areas, both the file structure and all of the computer software developed for the 15-second grid are directly applicable. The file design and all of the computer software used to handle the gridded bathymetric data base are models for the implementation of U.S.
continental-margin gravity and magnetics data bases, for which attribute lists and unit-cell sizes have not yet been established.
The conversion of the NOAA bathymetric survey library to the gridded bathymetric data base is complete for the Atlantic, Gulf of Mexico, and Pacific (California, Oregon, and Washington) coasts. The USGS is now in the process of adding data to increase the gridded data-base coverage.
Data Base Characteristics
A grid size of 15 seconds of latitude and longitude was selected that can be used for macro- and meso-scale applications. The number of data points associated with this grid for all the U.S. continental margins is presently accepted as the practical limit for timely and efficient storage and handling by available electronic data processing equipment. A 15-second grid will allow the resolution of features of 0.5 to 1.0 nautical mile over regions of comprehensive data coverage. As a secondary refinement, some consideration has been given to establishing limited fine-mesh regions where very precise and comprehensive data are available.
Raw data and initial treatment
The NOAA coastal bathymetric survey library, composed exclusively of NOAA surveys and compiled by the National Ocean Survey as part of their automated nautical chart program, was used as the data base for grid construction. The quality of the data in this library is highly variable. Some of the variability can be attributed to differences in the precision and accuracy of the bathymetric surveys in the library. Some of the surveys date back to 1930. Therefore, use of the grid derived from the NOAA library requires a certain measure of caution. However, in preparing the grid for general use, some checks are made to eliminate blatantly erroneous grid values. More subtle biases will remain.
These may be due to natural phenomena such as storm surges, data-collection errors such as poor navigation or inaccuracies of the depth-sounding equipment, or data-reduction errors incurred when the survey sheets were plotted or digitized.
The geographic basis of the gridding scheme is a one-degree rectangle of latitude and longitude containing 57,600 rectangular cells, each measuring 15 seconds on a side. A grid point is located in the center of each 15-second rectangular cell. Figure 1 shows the division of a one-degree square into cells. For data handling purposes, the latitude and longitude are replaced by the location of each grid point in a 240 X 240 matrix.
Figure 1 shows the two-dimensional index system used to reference grid points. For the index (j, k), "j" represents the latitudinal portion and "k" represents the longitudinal portion. For a given one-degree block, the southwest corner grid point is indexed as (1, 1), whereas the northwest corner point is indexed as (240, 1).
At NOAA, the bathymetric survey library tapes are processed by sorting depth values into 15-second cells for each one-degree square and computing average depth and standard deviation while retaining maximum and minimum depth values. All depths within any given cell are arbitrarily considered to be located at the center of that cell. The final output tape prepared at NOAA contains the latitude and longitude of the cell center (grid point), average depth, maximum depth, minimum depth, standard deviation, and the number of observations in each cell.
Formatting and storage
The tapes received from NOAA are read on the USGS MULTICS computer, and bathymetric values at each grid point are placed in a 240 X 240 array which represents a one-degree square. During this process some general checks are made to validate the data at each grid point.
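The mapping between a geographic position and its 15-second cell can be sketched as follows. This is illustrative code in the spirit of the scheme described above, not the USGS implementation; function names are our own, and longitudes are signed (west negative).

```python
def cell_index(lat, lon, lat0, lon0):
    """Map (lat, lon) to the (j, k) index of its 15-second cell within the
    one-degree square whose southwest corner is (lat0, lon0).  j runs
    1..240 south to north and k runs 1..240 west to east, matching the
    convention of Figure 1.  Illustrative sketch only."""
    CELLS = 240                       # 15 seconds = 1/240 of a degree
    j = int((lat - lat0) * CELLS) + 1
    k = int((lon - lon0) * CELLS) + 1
    if not (1 <= j <= CELLS and 1 <= k <= CELLS):
        raise ValueError("position lies outside this one-degree square")
    return j, k

def cell_center(j, k, lat0, lon0):
    """Inverse mapping: the grid point at the center of cell (j, k)."""
    step = 1.0 / 240.0                # one cell, in degrees
    return lat0 + (j - 0.5) * step, lon0 + (k - 0.5) * step

# The southwest-most cell of the 40-41 N, 74-73 W square of Figure 1:
j, k = cell_index(40.0001, -73.9999, 40.0, -74.0)
```

Note that cell_center(1, 1, 40.0, -74.0) returns a latitude of 40 degrees 00 minutes 07.5 seconds, the first grid point shown along the western edge of Figure 1.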
Because the geographic registration of each one-degree square is known, it is not necessary to maintain the latitude and longitude values of the 15-second cells. Using a data packing scheme, all of the bathymetric statistical data for a grid point are stored as three integer variables (i.e., three computer words). Therefore, a one-degree square is completely represented by an integer array of the form (3, 240, 240). The convention on array indexing, (i, j, k), is:
i = 1,3 encoded grid-point bathymetric data (cf. next paragraph)
j = 1,240 south to north grid points
k = 1,240 west to east grid points
Grid points having no bathymetric values are indicated by blank data values.
The first computer word of this three-word scheme combines average depth with a grid-point history code. The second integer word contains the maximum depth for the cell minus the average depth (difference of maximum depth from the mean) and the average depth minus the minimum depth (difference of minimum depth from the mean). The third word contains the standard deviation and the number of depth values used to compute the cell statistics. Appendix A shows the division of the three integer computer words. Each of the three integer words is unpacked either by floating-point division followed by truncation to integer, or by a modular arithmetic procedure.
The grid-point history code is a two-digit code which identifies the origin and character of the grid point.
At the present time, the following codes have been established:
Code  Meaning
00    grid-point statistics have been computed from the original NOAA bathymetric survey library of raw data
01    grid point is on land
03    grid point was computed (not based on raw data) using a computer gridding program
04    updated grid point using new raw data
30    grid point is in an area where a limited fine mesh grid exists (grid point has code 00 attributes)
31    grid point has codes 30 and 01 attributes
34    grid point has codes 30 and 04 attributes
Undoubtedly, this list will be expanded as grid development continues by addition of data from various sources.
Areas of application
Primary areas of planned application for the library of gridded bathymetric data include preparation of computer-generated bathymetric contour charts or cross sections and input to numerical wave-refraction models. In the case of bathymetric contour charts, a geographic area, scale, and projection are defined from which the necessary bathymetric grid values are retrieved and input to a computer-contouring program. The contour chart generated can then be presented on a CRT or plotted on paper. For wave modeling, a subroutine retrieves gridded bathymetric values for a user-defined, rectangular geographic region.
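The three-word packing scheme described above, with the digit budgets given in Appendix A (2+5, 4+4, and 5+3 decimal digits), can be sketched as follows. This is an illustrative reconstruction, not the USGS code: the placement of each field within a word is our assumption, and the unpacking uses the modular-arithmetic option via divmod.

```python
def pack(code, avg, dmax, dmin, std, n):
    """Pack one grid point's statistics into three integers following the
    digit budgets of Appendix A.  The ordering of the two fields within
    each word is assumed here for illustration."""
    w1 = code * 100000 + avg                    # history code | average depth
    w2 = (dmax - avg) * 10000 + (avg - dmin)    # differences from the mean
    w3 = std * 1000 + n                         # std deviation | observations
    return w1, w2, w3

def unpack(w1, w2, w3):
    """Recover the statistics by modular arithmetic (divmod)."""
    code, avg = divmod(w1, 100000)
    dmax_diff, dmin_diff = divmod(w2, 10000)
    std, n = divmod(w3, 1000)
    return code, avg, avg + dmax_diff, avg - dmin_diff, std, n

# A raw-data grid point (code 00): average 1234, max 1300, min 1200,
# standard deviation 27, from 15 observations.
w = pack(code=0, avg=1234, dmax=1300, dmin=1200, std=27, n=15)
```

Storing differences from the mean rather than the maximum and minimum themselves is what lets the second word fit in two 4-digit fields.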
The user-specified grid is placed in a separate working file to be accessed by the wave-modeling program.
Appendix A
word 1: grid-point history code, 2 digits; average depth, 5 digits
word 2: maximum cell depth - average depth, 4 digits; average depth - minimum cell depth, 4 digits
word 3: standard deviation, 5 digits; number of observations, 3 digits
Appendix B
The following procedures are used to combine the statistics of two cells:
n_a, n_b = populations of the two cells
x_a, x_b = mean values of the populations
s_a, s_b = standard deviations
n_c = combined population = n_a + n_b
To combine means:
x_c = combined mean = (n_a x_a + n_b x_b) / (n_a + n_b)
To combine standard deviations:
s_c = sqrt{ [n_a (s_a^2 + x_a^2) + n_b (s_b^2 + x_b^2)] / (n_a + n_b) - x_c^2 }
[Figure 1. Geographic division of a one-degree square (40 to 41 N, 74 to 73 W) into 15-second cells. Only the corners of the square are shown. Grid points are shown centered in each cell. Parentheses enclose examples of the coordinate system for grid-point storage: (1,1) at the southwest corner, (1,240) at the southeast, (240,1) at the northwest, and (240,240) at the northeast.]
A SUGGESTION REGARDING OUR MARINE GEOLOGY AND GEOPHYSICS DATA CENTER
Thomas W.C. Hilde
Geodynamics Research Program
Texas A&M University
College Station, Texas 77843
It is clear from the reports of the data managers and experts in data processing who participated in this workshop that we truly are at exciting "Frontiers in Data Storage, Retrieval and Display" in the field of marine geology and geophysics.
While the workshop dealt in considerable depth with different institutions' procedures and programs for handling data, and with some very exciting methods for data processing and display, such as the latest in interactive and color graphics, two important problems that face our community were not very seriously considered; these are 1) data exchange and general access to data, and 2) data volume problems that exist for the continued accumulation of standard data and that will exist for new types of data, such as multibeam echo-sounding and long-range side-scan sonar data.
From the standpoint of a "user" it is desirable, in fact essential for a proper investigation, to address a problem from the starting point of examining all available pertinent data that exist for a particular region or problem. The Marine Geology and Geophysics Branch of NGSDC has made impressive progress over the last few years in attempting to build a national data collection and the related handling capability that will allow investigators to do this. And, they have done it with a greatly undermanned and poorly financed operation. However, they are still only to the point of being able to provide the most basic of services. Data format options are extremely limited, and for various reasons important data may not be in their collection. NGSDC recognizes all this, and the difficulty in going much further with the current level of support.
A few institutions have sought to overcome these problems by essentially duplicating the data base of NGSDC. Many of our advances in data handling have come from the efforts of these institutions, and a few individuals at these institutions, some of whom participated in this workshop. In addition to having the same data, these institutions may have a significant amount of data not held by NGSDC. A few institutions also have far greater processing capability than the data center.
Their wisdom in building extensive institutional data bases and related handling capabilities has provided scientists at these institutions with the desired starting point referred to previously. Unfortunately, this advantage exists at only a handful of institutions.
As the national marine geology and geophysics data base grows, and as greater sophistication is called for in processing and display, costs rise for the data center and for individual institutions. In the past it was possible for individual academic institutions to support data management to a large degree from national agency funds. This is becoming increasingly difficult, if not impossible, as these agencies are forced by limited budgets to more strictly adhere to their mandate of funding "science" as opposed to data base "management". State support is, in many cases, equally difficult to obtain for this purpose. Data base "maintenance" funds are simply becoming very difficult to obtain.
In the past we have routinely given lip service to the need for a national marine geology and geophysics data center, and we do supply NGSDC with our data. However, the volume of data and funding realities have brought us to a time when we must seriously question whether we can continue "as is" or whether it may now be advisable to build our national data center into an institution that is not only responsible for handling the marine geology and geophysics data collection, but capable of doing so at a highly sophisticated and efficient level that will serve the entire community. I believe that if we do not make the latter choice we will, before long, find our collective marine geology and geophysics data base in serious disarray. A few institutions may be able to continue maintaining usable collections.
However, how many institutions will be able to continue expanding and maintaining a truly comprehensive local collection of the national data base, and be able to continue adding and replacing facilities in response to rapidly changing and improving methods of processing data? Even if a few are successful, the health of our science will nevertheless suffer, because an increasingly large percentage of capable scientists will not have sufficient access to the data or the facilities for proper investigations or data processing.
Participants in the workshop were provided with an impressive example of how one scientific discipline, the atmospheric scientists, has responded to its data archiving and processing problems, in our visit to the National Center for Atmospheric Research (NCAR). Funded by the National Science Foundation, NCAR makes our national facilities for the handling of marine geology and geophysics data appear antiquated and paltry by comparison. This is not the fault of NGSDC; it is ours, because we haven't seriously advocated a comparable facility. We have been told of NOAA EDIS plans for future Data Archive Management and User Services (DAMUS II). This system, if fully developed as planned, would certainly improve the archiving and manipulation of our national data collection. However, even if fully developed it will not, in my opinion, be comparable to an NCAR-type facility, or sufficiently meet our future requirements.
A fully capable national marine geology and geophysics data center is needed if we are going to properly use our data for scientific studies, for planning future investigations, and to evaluate our increasingly important marine geological resources.
I propose that we seriously consider development of an NCAR-type facility for marine geology and geophysics. This facility could, like NCAR, be funded by NSF (and other agencies) and be accessible to all academic, government, and industry institutions concerned with the geology and geophysics of the oceans. It could undertake, with guidance from experts such as those attending this workshop, the responsibility for formatting and archiving existing and future data, for development of a wide range of standard processing programs, and serve as a center of excellence for assisting investigators in their scientific studies and in the development of more sophisticated computer methods for data manipulation and analysis.
An important aspect of approaching our data handling problems in this manner would be the freeing of manpower and facilities at "user" institutions to devote more time to data analysis rather than data archiving and management. Direct access links could be had between the center and "user" institutions so that appropriate subsets of data related to individual projects could be manipulated as desired at the individual institutions. Adoption of acceptable procedures for handling proprietary data, and establishing appropriate hold times for proprietary use of data by the institution that collected it, should not be a problem. Having such a center could effect a significant saving of national research funds by avoiding their expenditure on duplicate data archiving and management efforts.
With the advent of new methods for rapid, large-volume data acquisition and the associated data processing requirements, can we really afford any other course of action?
The following "Reflections" is an abstract of portions of an internal L-DGO memorandum, included because of its insight. -AMH
REFLECTIONS
Ellen M. Herron
Lamont-Doherty Geological Observatory, Palisades, N.Y.
The meeting was exceptionally good and worthwhile because of:
1. the mix of participants, which included representatives from government agencies, industry, and universities at levels ranging from programmer through systems analyst to managerial.
2. the range of topics covered, from large data bases such as those associated with satellite programs to very small and/or very obsolete programs, such as that of the Conservation Department of USGS, which still relies on 3x5" file cards. The topics ranged from classical nav/grav/bathy/mag systems to new developments in handling analog data such as side-scan sonar.
3. the excitement associated with the potential for future development and cooperation between groups which was identified during the meeting.
Significant Issues
1. Identification of NGSDC as a central data base for all MGG -- and elimination or shrinking of data bases at institutions: this issue needs much more careful examination. At the present time NGSDC does not have the staff to cope with all the MGG data, especially new types of data such as MCS or Deep Tow, and the lack of adequate communications links between NGSDC and users mandates that users obtain copies of all data required for in-house mounting on their own computer systems.
Also questioned was the suitability of current data formats, especially for new digital data acquired by MCS and side-scan sonar, as well as the new emphasis on detailed areal surveys, either BLM-type studies or OMD syntheses. (Question here: should the data submitted to a central repository continue to be just the "raw" data, or should somewhat processed, second-generation data be submitted?)
2. Data must be regarded as dynamic and living - not static. (Several go-rounds with Tom Pyle regarding support for keeping data "live" kept Tom defensive.)
3. DBMS systems currently on the market are not suitable for the multiply related data files encountered in MGG work. The concept of modular storage and retrieval systems was generally supported. WHOI and Scripps are way ahead of L-DGO now, and AGC (Canada) is close behind them.
4. Roll-in/roll-out storage was generally favored as the most economical and efficient way to handle large volumes of data in a shared facility.
Of particular interest for L-DGO:
1. We might best make use of the new WHOI or AGC systems for underway MGG data storage and retrieval rather than try to hire programmers to re-do (and duplicate WHOI/AGC) our own programs. WHOI and AGC are developing modular systems that are designed to be much more efficient than our present system - which hasn't changed significantly since the mid-1960's when Talwani and others set it up.
2. JPL (Peter Paluzzi) has developed some very interesting and sophisticated techniques for handling side-scan sonar data (sometimes starting with an analog tape), as well as techniques for treating the global bathymetry/topo data sets like pixels.
3. We need to consider whether or not to argue for alternate formats for data exchange, especially for MCS and sonar data and for very detailed, small-scale area studies.
Present formats do not seem to be the most appropriate or designed to ensure the "vitality" of the data.
4. We ought to be placing a high emphasis now on converting our analog seismic data library to digital data. Greg Mountain has recently completed a program to digitize single-channel seismic records. We ought to upgrade our digitizing facility to circumvent tab cards - and we ought to look also at the use of a densitometer to digitize these records.
5. We ought to be developing (or working with other institutions to develop) the capability to display and manipulate 3-D data sets - to merge bathymetric and seismic data. It is time to get familiar with gridding data sets, kriging, and "rubber-sheet transformations."
6. L-DGO was long the leader in MGG data management because of the institutional approach to the problem. I believe that it is even more important now that this approach be continued. I am concerned at the different directions being pursued now by the new independent groups, because they often are too tightly focused on only a small part of the data set - and other complementary data sets are omitted in their system designs. A systems-level programmer, supported either directly or through overhead, ought to be part of the L-DGO staff and should report to the senior administration rather than to a particular group.
7. NOAA is interested in pursuing discussion with L-DGO as to their interest in serving as a duplication center for certain types of records that require access to original documents: e.g., core descriptions, seismic single-channel records, MCS tapes.
I believe that financial support for technical staff to service such requests might provide the necessary funds to maintain a critical-mass data reduction department, and that we should seriously explore this possibility.
Workshop Reflections by Roy Jenne, NCAR*
It seems to me that the workshop was very effective in helping each data group to benefit from the data strategies of other groups and to hear about the good or bad results from different methods that have been tried.
Most of the data sets that were considered at the workshop were small in volume; they would fit on one to several magnetic tapes. The characteristics of use of the many geophysical data sets probably vary markedly. A few are used quite often and others are idle for long periods between uses. An overall data strategy should encourage different approaches depending on data set size and how the data set is used. It is not reasonable or cost-effective to model all data access systems after an airline reservation system. If the frequency of data access is relatively low and/or the data volume is high, a more batch-oriented file system will be best.
In on-line data systems, it may be easy to forget that a secure copy of the data, usually off-line, is also necessary. This copy should be in a well-structured format that can be easily (and inexpensively) copied for use at another installation, and it can serve as a backup. For most new data sets, the first priority (after meeting any real-time needs) should be to get them onto secure storage in a well-structured file format. Different stages of quality control should follow; the first stage should ensure that mechanical details such as format, dates, and locations are correct.
*National Center for Atmospheric Research
REFLECTIONS
Stuart M.
Smith
Scripps Institution of Oceanography, La Jolla, California
These comments are offered from the perspective of a marine data manager at a multidisciplinary academic oceanographic institution.
FUTURE TRENDS
The next few years will undoubtedly see great changes in marine data management. The development of laser-technology storage devices will permit very large data bases to be stored on-line and accessed interactively. Satellite communications will allow cheaper, higher speed, and more reliable transmissions between data centers or to and from ships at sea. Regional data from satellites, abstracted or gridded, and data from multibeam echo-sounders (just now becoming available to the academic/industrial community) will require us to look beyond our present merge-merge or MGD77 conventions of storing individual data points with time and position, and to explore other methods, such as digital image processing, for storing, displaying, and exchanging these data.
The economic realities of data management, in the academic environment at least, are such that we will not be able to take the lead in developing these technologies or even be among the first users when they do become available. We should, however, keep ourselves informed about developments in these fields so that we may take advantage of them when it is economically feasible to do so.
REMOTE-ACCESSED ON-LINE INDEXES
For the past year I have been using the on-line Navigation Inventory file maintained by NGSDC in Boulder from a terminal at Scripps, as part of an experiment in remote access by off-site users. This experience has convinced me that, aside from the not insignificant positive public-relations value of making NGSDC more visible to the user community, this system offers real savings in time, money, and exasperation.
Whereas requests formerly required a series of letters and phone calls to find out what was in the area of interest, a single access now lists what data of the specified types are available. Additional features, including single and composite track charts produced on microfilm and the ability to search for data added to the file since the date of a previous search, make the file even more valuable to outside users.
It would seem worthwhile over the next several years to explore the feasibility of promoting remote access to comparable files in other government agencies and private institutions, or to determine if this is feasible only for one central facility such as NGSDC. Practical matters which stand in the way are: (1) intersite communication costs, (2) cost allocation and billing procedures at many different computer facilities, and (3) adequate documentation for off-site users of computer system access, programs, and data bases.
OUTPUT AND DISPLAY
Many of the workshop presentations (including my own) described the use of computers for storing large volumes of data and subsequent on-demand searches on one or more parameters. While it is necessary to provide this customized retrieval, we should not neglect more traditional hardcopy indexing methods such as charts and tabular listings. These provide the user with a "quick look" and do not require the expense of on-line, or the delays of off-line, data storage.
Most of the problems of hardcopy indexes are due to the effort and expense of updating and distribution. Many of these problems can be reduced by use of microfiche which, in volume, is relatively inexpensive to produce and distribute. An example of such an application is the plan by NGSDC staff to provide individual track charts of cruises in their files on fiche, with updates of additional cruises at appropriate intervals.
Many institutions also produce data reports and summary index track charts from time to time, but these tend to be unpublished or scattered throughout the "gray" literature. It would be a very useful and appropriate function for NGSDC to arrange to reproduce these reports on fiche to make them more available to marine data users.

A concern I wish to share with workshop participants deals with getting useful hardcopy output into the hands of the users. The technology for hardcopy output seems to be lagging behind that for input, storage, and retrieval for both microform and digital data. Microfilmed seismic reflection records must be blown back frame-by-frame in the darkroom on high quality but expensive silver emulsion papers, or on less satisfactory reader-printers using xerographic methods. To my knowledge, unframed continuous-flow reproduction from 35-mm film can only be done on a few surviving copyflow xerographic machines of a type no longer manufactured.

As for computer-stored data, conventional pen and ink plotting seems to account for an increasingly disproportionate share of costs and aggravation. Electrostatic raster-type plotters solve some of the problems but, in turn, lack the resolution and quality necessary for many applications. Here again, data managers lack the resources to develop new devices, but it behooves us to keep up with new developments, to adapt them to our needs and, to the extent possible, to educate users and help them adapt to new methods.

FUTURE WORKSHOPS

NGSDC is to be congratulated for hosting the workshop.
The meetings offered a unique opportunity for those of us involved in data handling operations to get together and compare notes, both with our colleagues from sister institutions and, more importantly, with those from government agencies and industry with whom we otherwise have no contact.

142

I strongly recommend that meetings of this kind continue to be held every several years, perhaps focusing on more specific subjects. Examples of topics which might be covered in informal mini-workshops are depth correction methods and conventions, multibeam echosounder data, and data bases and methods for handling digitized lines, contours, and gridded data sets. NGSDC would be the logical organization to continue to host these workshops because of its charter as a national data center and its geographical location.

February 1981

Reflections on Frontiers in Data Storage, Retrieval and Display

by Peter C. Belknap and Roger L. Larson
Graduate School of Oceanography
University of Rhode Island

One of the most impressive features of the workshop was the mix of people who attended. Members of the group represented the oil industry, commercial data firms, government agencies, and academia. Programmers, geophysicists, system engineers, archivists, data base managers, and others with a wide spectrum of experience and viewpoint traded ideas, concerns, and aspirations for three days of information-packed talks. With a broader perspective and a heightened appreciation of data handling procedures, we began, in the final minutes, to consider the future of a shared scientific data bank. While no concrete proposals arose from the discussion, the melange of inputs brought home the need to consider data processing both from the viewpoint of the user and from that of the system designer and operator. One of the central points of this discussion was the conceptual problem of designing a data management system.
On the one hand, the system engineers and programmers must determine and anticipate the needs of the users. On the other, the users must be aware of the limitations of the system being designed. Not everybody can afford rotating relief globes or multicolor graphics; not everyone wants them. Of course, if rotating globes are in the budget, it's nice to know they can be done. The central function of a data bank, however, is not to offer high technology and individualized displays, but to provide efficient access to data as responsively as possible to a large number of users.

The NGSDC data bank receives different contributions from different institutions, and provides a variety of services to industry, government, and academia. The workshop familiarized both contributors and users with NGSDC's capabilities and increased our awareness of the center's limits. Standardization of data formats and automation of search procedures are two ways in which NGSDC has smoothed the flow of data from collecting institutions to outside users. Seminars such as the November 1980 workshop are another way to improve scientific communication through the exchange of a variety of ideas.

143

REFLECTIONS ON PERCEPTIONS

This stimulating workshop ended with a challenge for scientific data managers to face up to the serious problem that there are many diverse perceptions among scientists, scientist managers, and data managers concerning the management of data. This was first recognized by those charged with maintaining pooled data. A concerted effort to solve this problem should come from data managers; after all, the systems they wish to impose on the scientific community are theirs.

As a first step on the road to the resolution of the problem, it is necessary that the scientific community recognize the intrinsic value of data as a resource which transcends specific projects and publishing schedules.
This can be carried a step further by realizing that data have an inherent life cycle, and that their life expectancy varies widely depending on the quality of the data, data gathering techniques, availability of collaborative data, etc.

Tied closely to the life cycle of data are the economic factors associated with their acquisition and storage. This is particularly important in field-oriented research, such as marine geology and geophysics, where costly resource and logistics commitments dominate project economics, making data acquisition an expensive enterprise.

An important facet of the perceptual problems that must be resolved is the relation between the images of data structures envisaged by scientists and those utilized by data managers. A concerted effort to explore these relationships should be mounted in order to better define the access paths inherent in the data structure images of scientists. It should be done in this way for two reasons. First, from a data management point of view, many of the decisions on how to manage data are intuitive. They are independent of data processing systems and theoretical models. Adherence to conventional data structures could impose rigid control on the scientific perception of the data. Second, from a scientific point of view, there is a need to revamp data structures, along with management software, to aid in exploring data bases as a means of developing research strategies. Most conventional software assumes rigid access paths operating under pre-conceived conditions.

Success in rectifying this complex of views should make scientific data bases more attractive to the scientific community and thereby win greater support for continued development and growth.

Peter K. MacKinnon
World Data Center-A for Glaciology (Snow and Ice)
Cooperative Institute for Research in Environmental Sciences
University of Colorado
Boulder, Colorado 80309
U.S.A.

144

AFTERTHOUGHTS

Michael S.
Loughridge
NOAA/NGSDC
Boulder, Colorado

It appears to me that the workshop was a success. We somehow managed to assemble the right people, the right topics, and the right atmosphere necessary to stimulate productive discussions. Enthusiastic comments were heard from many individuals. Clearly all of us were assaulted with new ideas and confronted with a wide variety of different opinions and methodologies.

There were several truly outstanding technical papers that showed us how far advanced the state of the art has become in recent years. We had the opportunity for individual private discussions that highlighted a variety of problems and accomplishments. Hopefully some attendees will be able to avoid the costly mistakes of others. Clearly, NGSDC benefited immensely from the constructive feedback received.

Several ideas emerged and several themes kept reappearing throughout the workshop that now seem worth repeating for readers of these proceedings. These will be obvious to attendees but, unfortunately, perhaps not to others who may encounter our written proceedings later:

1. Institutionalized data management still works best, but it is increasingly difficult to support. Data management left to individual research scientists rarely produces data sets that are consistently useful to others. Data collected by individuals require institutional protection to ensure their availability to secondary users at the same institution or elsewhere. Once collected and protected, the data become an institutional resource to be drawn upon by others.

Frequently these data, because of their unique character or collection location, constitute a national resource, held in trust by the institution (or sometimes by NGSDC). It is this position of trust that frequently goes lacking for support. Funding agencies do not seem to fully recognize that data are clearly one of the products of research.
Publication of research results is not the only mechanism by which society and science benefit from research. Unfortunately, data must be considered perishable if not properly preserved and maintained.

If funding agencies abdicate their responsibility for the processing of data to a stage usable by others, and for the long-term preservation of the data, they have in fact created a burden for the scientific community and created the possibility of non-productive and redundant collections of data.

One consistent complaint data managers hear from research scientists is that present mechanisms for funding the data management portions of research grants place the two in conflict. An appropriate amount of data management frequently reduces the money available for research. Scientists (who are also secondary users of data from elsewhere) are unlikely to speak out for better data management for fear of reductions in dollars for "real research". This "Catch-22" situation is untenable for the individual scientist, yet the real loser is the scientific community that the funding agencies support, not the individuals.

2. Several workshop participants voiced the need for more efficient mechanisms for sharing (widely distributing) novel ideas for utilizing analytical data. Much of what presently constitutes good science has a sound technological base, but there are no mechanisms equivalent to scientific journals for sharing that technology. Although an analog to the "Review of Scientific Instruments" comes to mind immediately, perhaps that is too ambitious initially. One suggestion is that, in addition to sponsorship of workshops such as this one, data centers might fill the void with a "newsletter" devoted to novel technologies for looking at analytical data.

3. The concept of "Data Utilization Centers" re-emerged.
Not every institution can afford to build or maintain the comprehensive data collections necessary to remain at the forefront of geophysical research today; it is virtually impossible for the "new player" and increasingly difficult for the more established members of the community.

The concept has two popular versions:

One version involves one facility, conveniently located, to which any scientist can go and productively wallow around in available data. Opinions vary considerably as to whether this facility should be on "neutral ground" or associated with and built around an existing institutional collection (thus giving that institution an advantage in competing for research grants, some complain).

At least one first-line scientist has suggested that the marine geological and geophysical community deserves nothing less than a facility comparable to NCAR, where data and facilities could be shared by any scientist, and he implicitly faulted the community for not "demanding" that such a facility be established.

The second version involves several regional centers with essentially redundant but complete collections. This has the obvious advantage of minimizing travel costs for short visits.

In some respects NGSDC matches the first concept; the second version does not exist, though it has been considered. NGSDC experience (and the advice of qualified scientists) would indicate that a modest staff of well-qualified individuals with both scientific and technical backgrounds, coupled with a willingness to be of service and with no vested interest except survival, is needed to operate any data center. The regional data center concept seems to require excessive dilution of the available talent. Furthermore, advances in computer networking seem to make this obsolete, for all one needs is a telephone.
Although the two versions of the concept are not new, several new ideas emerged from discussion.

145

The marine research community has always been separable into two groups: the "haves" and the "have-nots". This separation also applies to data and data management. Large institutions have large resources. One of these that frequently goes unrecognized is the tremendous advantage accruing to an institutional research program when it has its own scientific data base on which it can base its research program (and proposals), which it can analyze to produce research results of broad scope, and which in itself attracts first-class visiting scientists, thus benefiting resident staff and students immensely. Just as major library collections attract historians, major data collections (particularly those that can be easily used) attract scientists. In fact, a well managed collection of data can be considered one characteristic of a healthy research institution.

Unfortunately, today's economic pressures are such that even the most affluent of the "haves" cannot afford to operate in the same manner as before. During this meeting we saw evidence that several major institutions are experiencing economic pressures that seriously threaten the continued existence of their very healthy and widely respected models for institutionalized data management. Why? The mechanisms by which research is funded bring data and data management into conflict with research proposals, rather than treating useful data as respected results of a research program, similar to publications. In a level-funded or declining-money environment, money spent on data is not available for the "hard research" effort!

Apparently it has become too expensive for all involved institutions to have their own redundant collections of data.
Recognition of this is one source of the concept that it might be better if there were one place (a data utilization center), supported at least conceptually by all institutions, where ease of access to data was guaranteed to all interested individuals or institutions.

Money now spent in building redundant collections and supporting storage and retrieval systems exclusive to those collections might be better spent on analysis of data, and on the development of hardware and software to feed and use remote-site collections of data.

One or more moderately sophisticated computer terminals linked to a single remote collection of data and its retrieval software, such as that at NGSDC, can place a relatively small institution with no data collection at all in a position to do credible research at a fraction of the cost necessary to support a full-blown, independent data system. The limited institutional resources remaining can be devoted to analysis software directly supportive of individual scientists' research. It must be emphasized that the key element in any such scheme is that individual institutions must continue to feed the centers with data and reserve some fraction of their budgets to do so.

4. New types of marine geophysical data are emerging with prodigious data production rates. The data from certain sensors and systems threaten to totally overwhelm existing methods of data exchange or dissemination. Compared to the cost of system development or data collection, we spend minuscule amounts to ensure the continued use of these data by secondary users. Those in government and academia have a modest familiarity with solutions to the problems associated with Common Depth Point (CDP) seismic reflection data, whereas industry devotes millions of dollars to systems for storage, processing, retrieval, and display of these data.
A few academic institutions now have CDP systems, but the mechanism for exchange or dissemination of these data is not in place except for the elementary process of providing duplicates of the processed sections. One short CDP line can produce an amount of data equivalent to a full one-month cruise of multi-parameter geophysical data of the traditional sort.

There is no currently accepted methodology for exchange of analog side-scan data except through the default option of microfilm. The advent of digital side-scan (GLORIA and others), swath mapping, and SEABEAM systems suggests data management and exchange problems comparable in scope to those of CDP data, yet the community has no mechanism, let alone a coordinated one, for management of these data. Again, the volumes of data are potentially enormous.

5. A clearly expressed need was felt by all attendees for a mechanism for periodic (or as needed) collective action to recommend national (not institutional) efforts in data management. It is vitally important to all involved, both individuals and institutions, to identify common needs and effectively promote solutions to common problems.

Most attendees at the workshop were quick to agree that data utilization centers provide a viable solution to current economic constraints on data management, but there is no existing mechanism to ensure the message is heard by funding agencies. Even with the best centralized collection of data imaginable and supportive remote-access hardware and software, institutions would be hard-pressed to pay the telephone charges associated with full-blown remote access. Major funding agency support for networking, WATS, or access to Federal Telephone System (FTS) facilities would benefit all involved. Collective action is needed to convince the funding agencies that it is within their purview to foster science in this way. One or two individual suggestions are not enough.
The problems must be identified as common to many institutions and investigators. The desired end-result is something like an "NSF ARPA NET."

Tom Hilde's suggestion of an NCAR-like facility for Marine Geology and Geophysics is brilliant. It would require an order-of-magnitude increase in NGSDC funding to begin to do it properly, if NGSDC were the proper place to start. Projects of this magnitude do not develop without far-reaching collective action. If the idea does not mature, let us not forget who first proposed it. If this facility reaches fruition, I will be proud to recall that the idea germinated at this workshop.

146

APPENDIX 1
AGENDA

A Marine Geology & Geophysics Data Workshop
"Frontiers in Data Storage, Retrieval & Display"
Registration Fee: $10

1. Preliminary Activities (5 November) - NOAA, Research Building 3, 3100 Marine Street, Boulder

9:00-11:30  Informal data exchange meetings
1:30-2:30   Tour of NGSDC
2:30-4:30   Informal data exchange meetings
4:30-5:30   Tour of National Center for Atmospheric Research, Table Mesa Drive, Boulder
5:30-7:00   Wine n' cheese social (at NCAR) - Damon Room, North

2. Data Workshop (6 November) - Hilton Harvest House, 1345 28th Street, Boulder

8:00-10:00  Introductory Remarks - Herbert Meyers, NGSDC; Michael Loughridge, NGSDC; Allen Hittelman, NGSDC; Roger Larson, URI
            EDIS System Development - Larry Schultz, NGSDC
            Strategy to Develop and Access Large Sets of Scientific Data - Roy Jenne, NCAR
            Marine Data Management at NGSDC, including Poster Session: The NGSDC Marine Geophysical Data System (GEODAS) - Allen Hittelman, NGSDC; Dan Metzger, NGSDC

COFFEE BREAK

10:00-11:30 Interactive Graphics for Retrieval, Display & Editing of Marine Gravity Data -
Kenneth Fournier, Union Oil
            Interactive Graphics and Data Base Techniques at the Center for Environmental Assessment Services - Gerald Barton, EDIS
            Interactive Graphics in a Computer-Saturated Environment, including Poster Session: Interactive Graphics at NGSDC - Peter Sloss, NGSDC

LUNCH

147

1:00-3:00   Data Management at the Hawaii Institute of Geophysics - Dale Erlandson, HIG
            Data Storage, Retrieval and Display of Data at the Lamont-Doherty Geological Observatory - Ellen Herron, L-DGO; Tom Aitken, L-DGO
            Computer Storage & Retrieval of Position Dependent Data at the Woods Hole Oceanographic Institution - Robert Groman, WHOI
            Poster Session:
            1. The SIO Sample Index - a first-level index of samples, records and measurements collected on Scripps' cruises - Stuart Smith, SIO
            2. A System for Storage, Retrieval and Display of Underway Geophysical Data under development at Scripps - Stuart Smith, SIO

COFFEE BREAK

3:00-5:00   Current Marine Geophysical Systems Development Work at the Atlantic Geoscience Centre, Canada - Gwynneth Martin, Geol. Survey of Canada
            An Automated Index to USGS East Coast Marine Geological Data - Joseph Botbol, USGS
            Data Management at the USGS Marine Geology Branch, Menlo Park, CA - Alan Cooper, USGS
            Offshore Data: Acquisition, Analysis and Release in the USGS Conservation Division - Lucille Tamm, USGS
            Bibliographic Data Bases - Present and Future - Vicky Schneller, NOAA

148

3. Data Workshop (7 November) - Hilton Harvest House, 1345 28th Street, Boulder

8:00-10:00  Management of Project ROSE Data - Jan Garmany, Univ.
of Washington
            Management of Large Petroleum Data Bases - Pete Stark, Petroleum Information; AAPG Committee for Computer Applications to Geology
            The Construction and Use of Large Integrated Data Sets - George Wecksung, Los Alamos Scientific Lab
            Applications of Computer Image Processing to Marine Geology, including Movie: Bathymetric Imaging - Peter Paluzzi, JPL
            Poster Session: "Marine Geophysical Image Display: A Caribbean Seismic Example" - Dave Handschumacher, NORDA

COFFEE BREAK

10:00-11:30 NOS SeaBeam Activities - Alex Malahoff, NOS
            Advances in SeaBeam Data Treatment and Presentation - Christian Edy, BNDO (France)
            Bathymetric Data Reduction Subsystem at the Defense Mapping Agency - James Moran, DMAH/TC
            Implementation and Planned Application of a U.S. Continental Margins Gridded Bathymetric Library - Edward Escowitz, USGS

11:30-12:00 Closing Remarks
12:00-1:30  LUNCH
1:30-3:00   Informal Discussions

149

APPENDIX 2
WORKSHOP PARTICIPANTS

Tom Aitken
Lamont-Doherty Geo. Obser.
Columbia University
Palisades, NY 10964

Mr. Red Bailey
U.S. Geological Survey
Woods Hole, MA 02543

Slade Barker - Code 8230
NAVOCEANO
U.S. Naval Oceano. Office
NSTL Station, MS 39522

Mr. Gerald Barton
NOAA/EDIS D213
Page 2/Room 260
3300 Whitehaven, NW
Washington, DC 20235

Peter Belknap
School of Oceanography
Univ. of Rhode Island
Kingston, RI 02881

Mr. Joseph M. Botbol
U.S. Geological Survey
Woods Hole, MA 02543

Jerry Bourg
NAVOCEANO - Code 8112
U.S. Naval Oceano. Office
NSTL Station, MS 39522

Mr. Robert Bowen
U.S. Geological Survey
National Center
12201 Sunrise Valley Dr.
Reston, VA 22092

Dr. Alan Cooper
USGS
Office of Marine Geology
345 Middlefield Rd.
Menlo Park, CA 94025

Mr. Robert Crowell
NOAA/NGSDC (D62)
325 Broadway
Boulder, CO 80303

Mr. Tom Davies
Joint Oceanographic Inst. Inc.
1600 Virginia Ave. NW
Suite 512
Washington, DC 20037

Terry Dettemann
Applied Physics Laboratory
1013 NE 40th Street
Seattle, WA 98195

Nick DiPiazza
NAVOCEANO - Code 7220
U.S. Naval Oceanographic Off.
NSTL Station, MS 39522

C. Edy
Bur. National des Donnees Oc.
Centre Ocean. de Bretagne
B.P. 337-29273 Brest Cedex
FRANCE

Mr. Dale Erlandson
Hawaii Institute of Geophysics
Univ. of Hawaii
2525 Correa Road
Honolulu, HI 96822

Dr. Ed Escowitz
USGS
National Center
12201 Sunrise Valley Drive
Reston, VA 22092

Mr. Ken Fournier
Union Oil
P.O. Box 76
Brea, CA 92621

Dr. Jan Garmany
Dept. of Oceanography, WB-10
Univ. of Washington
Seattle, WA 98195

Ann Ginder
Univ. of Texas at Galveston
Marine Science Institute
Galveston, TX 77550

Mr. Bruce Grant
NOAA/NGSDC (D62)
325 Broadway
Boulder, CO 80303

Mr. Bob Groman
Clark Laboratory
Woods Hole Oceanographic Inst.
Woods Hole, MA 02543

Mr. Richard Haefner
Exxon Prod. Research Co.
P.O. Box 2189

Dr. Dave Handschumacher
Naval Ocean Research & Dev.
Code 300
NSTL Station, MS 39629

Dr. Ellen Herron
Lamont-Doherty Geo. Observ.
Columbia Univ.
Palisades, NY 10964

Dr. Tom Hilde
Texas A & M University
Dept. of Oceanography
College Station, TX 77843

Mr. Allen M. Hittelman
NOAA/NGSDC (D62)
325 Broadway
Boulder, CO 80303

Mr. Roy Jenne
NCAR
1850 Table Mesa Drive
Boulder, CO 80303

John Kunselman
Univ. of Texas at Galveston
Marine Science Institute
Galveston, TX 77550

Dr. Roger Larson
School of Oceanography
University of Rhode Island
Kingston, RI 02881

Dr. Michael S. Loughridge
NOAA/NGSDC (D62)
325 Broadway
Boulder, CO 80303

Peter MacKinnon
WDC-A for Glaciology
Campus Box 449
CIRES, Univ. of Colorado
Boulder, CO 80303

Dr. Alex Malahoff
NOAA, NOS
Office of Chief Scientist
0A/Cx4, WSC-1
Rockville, MD 20852

Ms. Gwynneth Martin
Geol. Survey of Canada
601 Booth Street
Ottawa, Ont., Canada K1A 0E8

Mr. Dan Metzger
NOAA/NGSDC (D62)
325 Broadway
Boulder, CO 80303

Mr. Herbert Meyers
NOAA/NGSDC (D62)
325 Broadway
Boulder, CO 80303

James Moran, Code SDH
Defense Mapping Agency
Hydrographic/Topographic Ctr.
Washington, DC 20315

Paddy Ochsner
Cities Service Co.
Enterprise Bldg. Rm. 300
552 South Boston
Tulsa, OK 74102

Mr. Peter Paluzzi
Jet Propulsion Lab.
MS 183-701
4800 Oak Grove Drive
Pasadena, CA 91103

150

Dr. Tom Pyle
Dept. of Navy, Code 483
Off. of Navy Research
Ocean Sci. & Tech. Det. Liaison
Arlington, VA 22217

Mr. Donald M. Scheibe
Chief, DOD Gravity Svcs.
DMA, Aerospace Center
South Annex, Attn: GDGGC
St. Louis AFS, MO 63118

Ms. Vicky Schneller
NOAA, Mail Code R51
325 Broadway
Boulder, CO 80303

Mr. Larry Schultz
NOAA/NGSDC (D6)
325 Broadway
Boulder, CO 80303

Dr. Sean Shanahan
Grad. School of Oceanography
Oregon State Univ.
Corvallis, OR 97331

Dr. Peter Sloss
NOAA/NGSDC (D62)
325 Broadway
Boulder, CO 80303

Mr. Stuart Smith
Geol. Data Cent., Mail Code A-023
Scripps Inst. of Oceanography
La Jolla, CA 92093

Dr. Peter Stark
Petro. Information Corp.
P.O. Box 2612
1375 Delaware St.
Denver, CO 80201

Mr. Clint Steele
USGS
Mail Stop 640
12201 Sunrise Valley Dr.
Reston, VA 22092

Ms. Lucille Tamm
USGS
Mail Stop 640
12201 Sunrise Valley Dr.
Reston, VA 22092

Mr. George Wecksung
Los Alamos Scientific Lab.
P.O. Box 1663, MS 263
Los Alamos, NM 87545

Roseanne Weissel
Lamont-Doherty Geo. Observ.
Columbia Univ.
Palisades, NY 10964

U. S. GOVERNMENT PRINTING OFFICE 1980 - 780-210/374 Reg. 8
151