
Showing papers in "Eos, Transactions American Geophysical Union in 1991"


Journal ArticleDOI
TL;DR: The Generic Mapping Tools (GMT) is introduced, which is a free, public domain software package that can be used to manipulate columns of tabular data, time series, and gridded data sets and to display these data in a variety of forms ranging from simple x-y plots to maps and color, perspective, and shaded-relief illustrations.
Abstract: When creating camera-ready figures, most scientists are familiar with the sequence of raw data → processing → final illustration and with the spending of large sums of money to finalize papers for submission to scientific journals, prepare proposals, and create overheads and slides for various presentations. This process can be tedious and is often done manually, since available commercial or in-house software usually can do only part of the job. To expedite this process, we introduce the Generic Mapping Tools (GMT), which is a free, public domain software package that can be used to manipulate columns of tabular data, time series, and gridded data sets and to display these data in a variety of forms ranging from simple x-y plots to maps and color, perspective, and shaded-relief illustrations. GMT uses the PostScript page description language, which can create arbitrarily complex images in gray tones or 24-bit true color by superimposing multiple plot files. Line drawings, bitmapped images, and text can be easily combined in one illustration. PostScript plot files are device-independent, meaning the same file can be printed at 300 dots per inch (dpi) on an ordinary laserwriter or at 2470 dpi on a phototypesetter when ultimate quality is needed. GMT software is written as a set of UNIX tools and is totally self contained and fully documented. The system is offered free of charge to federal agencies and nonprofit educational organizations worldwide and is distributed over the computer network Internet.

4,128 citations


Journal ArticleDOI
TL;DR: The most difficult aspect of teaching an introductory geophysics course is deciding how best to combine the necessary mathematics and theory with their applications to the Earth, a choice made more difficult because many students in a geophysics class are not geophysics majors and feel uncomfortable even with first-year calculus.
Abstract: The most difficult aspect of teaching an introductory geophysics course is deciding how to best combine the necessary mathematics and theory with their applications to the Earth. Some texts present a classical approach emphasizing the mathematics (especially potential theory), which often gives little insight into its application to understanding the Earth. Others offer good descriptions of the applications but lack adequate explanations of the basic principles, causing students to acquire an insufficient background for future research. Choosing between the two types of texts is made more difficult because many students in a geophysics class are not geophysics majors and feel uncomfortable even with first year calculus.

217 citations


Journal ArticleDOI
TL;DR: Opportunities in Hydrology is a report prepared by a committee of prestigious water-oriented scientists, and its basic premise, that there is such a thing as a single discipline of hydrologic sciences, is contrary to the thinking of many hydrogeologists.
Abstract: Hydrologists can take heart that our profession has matured to the point of having its respectable reputation recognized by the National Academy of Sciences. Opportunities in Hydrology follows the publication of Opportunities in Biology and Opportunities in Chemistry, and was prepared by a committee composed of prestigious water-oriented scientists. I am writing this review because the book is extremely important, and its basic premise—that there is such a thing as a single “discipline” of hydrologic sciences—is contrary to the thinking of many hydrogeologists. The committee proposes that students can obtain adequate training and be prepared to develop a career in “hydrologic sciences.” Such an approach may be suitable for many aspects of hydrology, but it does not represent the interests, needs, goals, history, or future of “hydrogeology,” a clearly recognized subdiscipline of hydrology. The various aspects of hydrology are so wide ranging that, from my personal viewpoint and the viewpoints of many of my colleagues, it takes a person of extremely narrow focus to see hydrology as a single discipline.

140 citations


Journal ArticleDOI
TL;DR: In this article, the first results of a project carried out by the International Association of Seismology and Physics of the Earth's Interior (IASPEI) to evaluate claims about precursors to earthquakes were presented.
Abstract: AGU's Evaluation of Proposed Earthquake Precursors presents the first results of a project carried out by the International Association of Seismology and Physics of the Earth's Interior (IASPEI) to evaluate claims about precursors to earthquakes. Only three of 28 nominations were accepted for the Preliminary List of Significant Precursors that IASPEI is compiling for the International Decade for Natural Hazards Reduction. The project's purpose is to build a consensus among the scientific community as to which phenomena may be real precursors and useful for earthquake prediction studies. Because the spectrum of opinion on significant precursors is so broad, evaluation is crucial; some researchers believe that hundreds of significant precursor case histories exist, while others reject all that have been presented.

137 citations


Journal ArticleDOI
TL;DR: In this article, it is assumed that CO2 degassing from the Earth's interior restores the deficit from surficial processes and balances the atmospheric CO2 budget on a time scale of 10⁴–10⁶ yr.
Abstract: In an effort to better understand processes that control sources of CO2 in the carbon cycle, the U.S. Global Change Research Program [CEES, 1990] identifies a need to quantify CO2 degassing from the Earth's interior. Carbonate deposition and burial of organic matter would deplete the CO2 content of the atmosphere in 10,000 years and the atmosphere-ocean system in 500,000 years [Holland, 1978; Berner et al., 1983]. The CO2 content of the atmosphere-ocean system has varied in the past, but not at the rate expected if CO2 were removed and not replenished. It is assumed, therefore, that CO2 degassing from the Earth's interior restores the deficit from surficial processes and balances the atmospheric CO2 budget on a time scale of 10⁴–10⁶ yr. Earlier atmospheric balancing calculations imply present-day (pre-industrial) CO2 degassing rates of 6–7×10¹² mol yr⁻¹ [Holland, 1978; Berner et al., 1983]; recent calculations suggest degassing rates may be as high as 11×10¹² mol yr⁻¹ [Berner, 1990].
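
As a rough order-of-magnitude check (using commonly quoted values that are not stated in the abstract: a pre-industrial CO2 mixing ratio of about 280 ppmv and a total atmosphere of roughly 1.8×10²⁰ mol of air), the removal rate implied by the 10,000-year depletion time is consistent with the quoted degassing estimates:

$$ N_{\mathrm{CO_2}} \approx (280\times10^{-6})(1.8\times10^{20}\ \mathrm{mol}) \approx 5\times10^{16}\ \mathrm{mol}, \qquad \frac{5\times10^{16}\ \mathrm{mol}}{10^{4}\ \mathrm{yr}} \approx 5\times10^{12}\ \mathrm{mol\ yr^{-1}}, $$

which is the same order as the pre-industrial degassing rates of 6–7×10¹² mol yr⁻¹ cited above.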

115 citations


Journal ArticleDOI
TL;DR: Recordings of the 1989 Loma Prieta earthquake showed that pockets of poorly consolidated sediments and Bay muds with low seismic velocities greatly amplified the ground shaking, and that amplification was weaker during the mainshock than during aftershocks, indicating significant non-linearity in the site effects.
Abstract: The October 17, 1989, Loma Prieta earthquake of magnitude 7.1 provided a harsh reminder of the hazards associated with life on a major plate boundary. The scientific lessons of the earthquake are still being assessed as seismic, geologic, and geodetic data are analyzed and new data collected. Probably the most striking, and deadly, aspect of the earthquake was the intense damage at sites such as San Francisco's Marina district and the I-880 overpass in Oakland, 100 km from the earthquake's epicenter. Similar damage occurred in Mexico City in 1985, 350 km from the epicenter of the M 8.1 Michoacan earthquake. Seismic recordings during the Loma Prieta earthquake and its aftershocks showed that pockets of poorly consolidated sediments and Bay muds with low seismic velocities greatly amplified the ground shaking. Sites on Bay mud experienced peak ground motions many times greater than sites only a few blocks away on bedrock. Site amplification during the mainshock was less severe (at frequencies of a few Hz) than during subsequent aftershocks, demonstrating significant non-linearity in the site effects. The damage resulted from a combination of local site amplification and pervasive failure of artificial land fills. Both effects were predicted prior to the earthquake; published earthquake hazard maps had previously designated as high seismic risk zones the sites of intense damage.

84 citations


Journal ArticleDOI
TL;DR: Kilauea is nearing the 10th year of its most voluminous rift zone eruption in the last two centuries, and the accessibility and longevity of the eruption have provided a unique opportunity for quantitative studies requiring long-term observations.
Abstract: Kilauea is nearing the 10th year of its most voluminous rift zone eruption in the last 2 centuries. Lava flows have covered 75 km² to depths as great as 25 m and have added almost 1.2 km² of new land to the island. These flows have devastated downslope communities and have provided a painful tutorial for local government in planning for and living with volcanic hazards [Heliker and Wright, 1991]. At the same time, the accessibility and longevity of this eruption have provided a unique opportunity for quantitative studies requiring long-term observations. This article briefly summarizes these studies, which are directed at a better understanding of eruption mechanics, lava-flow field emplacement, and the plumbing system of Kilauea.

72 citations


Journal ArticleDOI
TL;DR: Small phreatic explosions on April 2, 1991, ended more than 400 years of quiescence at Mount Pinatubo, and the warnings that followed led to the evacuation of at least 58,000 people prior to the volcano's climactic eruption on June 15, 1991.
Abstract: Small phreatic explosions on April 2, 1991, ended more than 400 years of quiescence at Mount Pinatubo. A joint Philippine-United States team worked quickly to understand the unrest and the eruptive history of Pinatubo, and to warn those at risk. These warnings led to the evacuation of at least 58,000 people prior to the volcano's climactic eruption on June 15. Although 320 people died in that eruption, mostly due to collapse of ash-covered roofs, the evacuations and other precautions averted a much greater loss of life and property.

68 citations


Journal ArticleDOI
TL;DR: Geophysical geodesy aims to bridge a traditional gap between geodesy and geophysics; its topics include crustal motion, spatial and temporal variation of the gravity field, rotation, and tidal deformations of the Earth.
Abstract: Geodesy is devoted to the determination of the shape, size, and gravity field of the Earth. Its methods include triangulation, leveling, gravity surveys, and tracking by artificial satellites. The results of its observations typically hold for time periods varying from a few hours to decades. In contrast, in geophysics (seismology in particular) one deals with very short periods of an hour or less, while in geological processes, one frequently considers periods of a million years or more. In Lambeck's book, the author treats the middle ground between these two geophysical extremes, which is concerned with the slow deformation of the Earth. He quite appropriately calls this geophysical geodesy, and its topics include crustal motion, spatial and temporal variation of the gravity field, rotation, and tidal deformations of the Earth. Hence geophysical geodesy aims to bridge a traditional gap between geodesy and geophysics.

57 citations


Journal ArticleDOI
TL;DR: A 1990 marine geological investigation reported here demonstrates that the submarine deposits of the 1883 Krakatau eruption are largely of pyroclastic flow origin, a finding critical to understanding the origin of the devastating tsunamis generated during that eruption, which claimed about 36,000 lives.
Abstract: The nature of the submarine deposits produced by the 1883 Krakatau eruption in Indonesia has remained controversial for more than a century. Knowledge of their character, however, is critical to understanding the origin of the devastating tsunamis generated during the event, which claimed about 36,000 lives. Results of a 1990 marine geological investigation reported here demonstrate that the deposits are largely of pyroclastic flow origin. Pyroclastic flows are partially fluidized mixtures of particles and gases that travel up to 150 m/s and have internal temperatures as high as 600°C. They are denser than the atmosphere but are likely to be comparable to seawater in density. What happens when such a flow encounters the sea along the coast of an active volcano has been the subject of much speculation [Cas and Wright, 1987] but little research [Sparks et al., 1980a, 1980b]. One well-known effect is the generation of tsunamis [Kienle et al., 1987] and indeed, 20% of all volcanogenic tsunamis have been attributed to the entrance of pyroclastic flows into the sea [Latter, 1981]. The most devastating of these occurred during the 1883 eruption of Krakatau volcano.

56 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used the Total Ozone Mapping Spectrometer (TOMS) to measure the ratio of backscattered Earth radiance to incoming solar irradiance in the ultraviolet spectrum.
Abstract: The Cerro Hudson volcano in southern Chile (45.92°S, 73.0°W) emitted large ash and sulfur dioxide clouds on August 12–15, following several days of minor activity [Global Volcanism Network Bulletin, 1991]. The SO2 clouds were observed using (preliminary) near real-time data from the Total Ozone Mapping Spectrometer (TOMS) as they encircled the south polar region. The injection of SO2 into the stratosphere has essentially created a gigantic chemical tracer that could provide new insights into the wind patterns and seasonal circulation around the Antarctic region. The TOMS instrument, on board the National Aeronautics and Space Administration's Nimbus-7 satellite, measures the ratio of backscattered Earth radiance to incoming solar irradiance in the ultraviolet spectrum. Although originally designed to measure ozone, it was later discovered that the TOMS instrument could also detect and quantify SO2 [Krueger, 1985]. After this discovery, measurements from TOMS were examined for SO2 emissions for all recorded volcanic eruptions since Nimbus-7 was launched in October 1978, and current data are analyzed as new eruptions occur. The satellite is in a polar, Sun-synchronous orbit so that it crosses the equator at local noon and observes the whole sunlit Earth in approximately 14 orbits each day. Total column amounts of SO2 are determined that represent the amount of gas affecting the reflection of ultraviolet light through a column of the atmosphere from the satellite to the reflecting surface, Earth, given in terms of milli atmosphere centimeters (1000 milli atm cm = a gas layer 1-cm thick at STP). The mass of SO2 is calculated by integrating over the cloud area to obtain a volume, then converting to tons.
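
To make the last step concrete, the sketch below converts gridded column amounts in milli atm cm to a total SO2 mass in metric tons. It is only an illustration of the unit conversion described above: the grid values, pixel size, and function name are hypothetical and are not part of the TOMS processing code.

import numpy as np

# Molar volume of an ideal gas at STP (cm^3/mol) and molar mass of SO2 (g/mol).
MOLAR_VOLUME_STP = 22414.0
M_SO2 = 64.06

def so2_mass_tons(column_milli_atm_cm, pixel_area_km2):
    """Integrate column SO2 amounts over a cloud to obtain total mass in metric tons.

    column_milli_atm_cm : 2-D array of column SO2 in milli atm cm
        (1 milli atm cm is a gas layer 10^-3 cm thick at STP).
    pixel_area_km2      : matching array (or scalar) of pixel areas in km^2.
    """
    col = np.asarray(column_milli_atm_cm, dtype=float)
    area_cm2 = np.asarray(pixel_area_km2, dtype=float) * 1e10    # km^2 -> cm^2
    mol_per_cm2 = (col * 1e-3) / MOLAR_VOLUME_STP                # moles of SO2 per cm^2
    g_per_cm2 = mol_per_cm2 * M_SO2                              # grams of SO2 per cm^2
    total_grams = np.sum(g_per_cm2 * area_cm2)
    return total_grams / 1e6                                     # grams -> metric tons

# Hypothetical example: a uniform 30 milli atm cm cloud over 100 pixels of 50 km x 50 km.
example_mass = so2_mass_tons(np.full((10, 10), 30.0), 2500.0)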


Journal ArticleDOI
TL;DR: The Southern California Earthquake Center (SCEC), one of 14 new NSF Science and Technology Centers and supported by a substantial USGS commitment for FY91, is a consortium of seven core academic institutions (USC as coordinating institution, Caltech, Columbia University's Lamont-Doherty Geological Observatory, UCLA, UC Santa Barbara, UC Santa Cruz, and UC San Diego's Scripps Institution of Oceanography) in partnership with the USGS.
Abstract: On February 11, Congressman George E. Brown, Jr., Chairman of the House Committee on Science, Space and Technology, together with the National Science Foundation, the U.S. Geological Survey, and state and local officials, helped inaugurate the Southern California Earthquake Center (SCEC) on the campus of the University of Southern California. SCEC is one of 14 new NSF Science and Technology Centers and includes a substantial commitment from the USGS for FY91. The center is a consortium of seven core academic institutions (USC, the coordinating institution; Caltech; Columbia University's Lamont-Doherty Geological Observatory; University of California at Los Angeles; University of California at Santa Barbara; University of California at Santa Cruz; and University of California at San Diego's Scripps Institution of Oceanography) in partnership with the USGS's Office of Earthquakes, Volcanoes, and Engineering (OEVE). The center grew out of an April 3-5, 1989, workshop at Lake Arrowhead, Calif., convened by the USGS to discuss the need for an expanded effort in earthquake research in southern California.

Journal ArticleDOI
TL;DR: The chairman of the House Subcommittee on Environment of the Committee on Science, Space, and Technology is very interested in evaluating the adequacy of peer review at DOE as "an essential ingredient to cost-effective research".
Abstract: “What research priorities has the Department of Energy set?” “Does an adequate peer-review process exist?” These questions were asked by James H. Scheuer (D-NY), chairman of the House Subcommittee on Environment of the Committee on Science, Space, and Technology to witnesses at a Fiscal Year 1992 budget hearing for the Department of Energy's Biological and Environmental Research (BER) program on March 21. The hearing was held, Scheuer said, “as part of a larger effort to work with DOE to ensure that the nation receives the best research for every dollar invested.” He is very interested in evaluating the adequacy of peer review at DOE as “an essential ingredient to cost-effective research.”

Journal ArticleDOI
TL;DR: The TERRAscope project of the California Institute of Technology (Caltech) as mentioned in this paper has six very broadband seismic stations (PAS, GSC, PFO, SBC, ISA, and SVD) in southern California.
Abstract: The TERRAscope project of the California Institute of Technology began in 1988 and now has six very broadband seismic stations (PAS, GSC, PFO, SBC, ISA, and SVD) in southern California (Figure 1). The goal of TERRAscope is to provide high-quality broadband data needed for significant advances in both regional and global seismology. TERRAscope will replace the old Caltech seismographic network in southern California, which dates back to the 1920s. In many cases, new stations are deployed in cooperation with local institutions. The goal is to encourage involvement of both students and researchers in the operation of the stations and analysis of new data. The station PAS is a joint project between Caltech, the University of Southern California, the U.S. Geological Survey (USGS), and the Incorporated Research Institutions for Seismology (IRIS). The station SBC was deployed in cooperation with the University of California at Santa Barbara. The station PFO is operated jointly with the University of California at San Diego, and the station SVD was installed and is operated by the USGS. Except for SVD, all of the stations are equipped with a broadband Streckeisen STS-1 seismometer and Quanterra data logger with a 24-bit digitizer and a Kinemetrics FBA-23 strong-motion sensor. The station SVD has a Streckeisen STS-2 seismometer and a Guralp CMG-5 accelerograph. The project is funded mainly by grants from the L. K. Whittier Foundation and the Arco Foundation. In addition, an automatic dial-up data retrieval system called Caltech Gopher (adapted from the IRIS Gopher system) has been implemented. The Caltech Gopher receives mail from the NEIC for teleseisms and from the SCSN for regional events, giving origin time, location, and magnitude. The Gopher retrieves data from all six TERRAscope stations for these events. The TERRAscope data reside in an anonymous FTP account (seismo.gps.caltech.edu; password: "your e-mail address") at the Caltech Seismological Laboratory, and are available to users through the Internet. Usually the data are available within 30 minutes after a regional event and several hours after a teleseism. In the near future a new version of the Gopher software will be installed, which will also make some of the Gopher data available directly from the IRIS-DMC Gopher. When the Data Center of the Southern California Earthquake Center begins full operation in early 1992, it will take over the distribution of earthquake data from southern California, including both TERRAscope Gopher data and continuous data from the tape cartridges. The data will also be available from IRIS-DMC, and future improvements and changes in data access will be posted on the IRIS-DMC bulletin board.
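
As an illustration of the anonymous FTP access described above, the short sketch below uses Python's standard ftplib. Only the host name and login convention come from the abstract; the remote directory and file names are hypothetical placeholders, not actual paths on the server.

from ftplib import FTP

HOST = "seismo.gps.caltech.edu"

def fetch_event_file(remote_path, local_path, email="user@example.edu"):
    """Log in anonymously (e-mail address as password) and download one file."""
    ftp = FTP(HOST)
    ftp.login(user="anonymous", passwd=email)
    with open(local_path, "wb") as fh:
        ftp.retrbinary("RETR " + remote_path, fh.write)
    ftp.quit()

# Hypothetical usage (path invented for illustration only):
# fetch_event_file("events/1991123.pas.sac", "1991123.pas.sac")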

Journal ArticleDOI
TL;DR: In this paper, a novel and unique ocean-surface wind data set was derived by combining the Defense Meteorological Satellite Program Special Sensor Microwave Imager data with additional conventional data.
Abstract: A novel and unique ocean-surface wind data set has been derived by combining the Defense Meteorological Satellite Program Special Sensor Microwave Imager data with additional conventional data. The variational analysis used generates a gridded surface wind analysis that minimizes an objective function measuring the misfit of the analysis to the background, the data, and certain a priori constraints. In the present case, the European Centre for Medium-Range Weather Forecasts surface-wind analysis is used as the background.
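
Schematically, such a variational analysis minimizes an objective function of the form below; the notation is generic and not taken from the paper:

$$ J(\mathbf{u}) = (\mathbf{u}-\mathbf{u}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{u}-\mathbf{u}_b) \;+\; \sum_k \big(H_k(\mathbf{u})-\mathbf{y}_k\big)^{\mathsf T}\mathbf{R}_k^{-1}\big(H_k(\mathbf{u})-\mathbf{y}_k\big) \;+\; \sum_j \lambda_j\, C_j(\mathbf{u}), $$

where u is the gridded wind analysis, u_b the background (here the ECMWF analysis), y_k the satellite and conventional observations mapped to the grid by operators H_k, B and R_k the background and observation error covariances, and C_j the a priori constraints (for example, smoothness) with weights λ_j.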

Journal ArticleDOI
TL;DR: The Office of Naval Research, together with Scripps Institution of Oceanography, University of Washington, Massachusetts Institute of Technology, and Woods Hole Oceanographic Institution, is pleased to announce the formation of two national Ocean Bottom Seismometer (OBS) facilities.
Abstract: The Office of Naval Research, together with Scripps Institution of Oceanography, University of Washington, Massachusetts Institute of Technology, and Woods Hole Oceanographic Institution, is pleased to announce the formation of two national Ocean Bottom Seismometer (OBS) facilities. Recent advances in marine seismic and acoustic research, including whole Earth tomography, seismic refraction tomography, detailed passive seismology, high-resolution seismic refraction, and marine ambient noise studies, require a suite of identical, calibrated seafloor instruments capable of sustained deployment periods for the analysis of array data collected by OBS. Such instruments require a recording capability that is substantially improved in terms of bandwidth, recording capacity, fidelity, and deployment duration over that possible just a few years ago. Recognizing a deficiency in existing instrumentation, in 1987 ONR embarked on an effort to fund the design and construction of a new generation of OBS. Thirty-one instruments are now available for general use, and we encourage investigators to use the national OBS facilities as an effective means of acquiring state-of-the-art ocean floor seismic data. The two OBS facilities will be managed and operated on a joint institutional basis by WHOI and MIT, and SIO and UW, respectively. While the instruments will be managed and operated by the OBS facilities, ownership of the OBS will be retained by

Journal ArticleDOI
TL;DR: The circulation of the Gulf of California has been the subject of active research as discussed by the authors, and during the 1980s, scientists at CICESE and at the Scripps Institution of Oceanography designed a cooperative effort, the Pichicuco project, to investigate some of the notable physical oceanographic features.
Abstract: The circulation of the Gulf of California has long been of scientific interest. The first hydrographic expedition there was in 1889 [Roden and Groves, 1959], followed half a century later by Sverdrup's cruise on the R/V E.W. Scripps [Sverdrup, 1941] in February and March of 1939. Since then, the Gulf's circulation has been the subject of active research [Alvarez-Borrego, 1983]. During the 1980s, scientists at CICESE and at the Scripps Institution of Oceanography designed a cooperative effort, the Pichicuco project, to investigate some of the notable physical oceanographic features of the Gulf. The Gulf of California is a marginal sea close to 1500 km long and about 200 km wide, oriented northwest to southeast, between the peninsula of Baja California and western continental Mexico. It consists of a succession of basins that shoal progressively from about 3500 m at the mouth, where the Gulf connects with the Pacific Ocean, to just over 2000 m in the central Guaymas Basin. In contrast, the far northern Gulf is a continental shelf sea whose depth exceeds 200 m only in a few small basins. The Gulf's circulation is profoundly influenced by processes taking place at the narrows that connect Guaymas Basin to the northern Gulf between 28°N and 29°N (see Figure 1). These are a sequence of channels, each about 15 km wide, between San Lorenzo, San Esteban, and Tiburon islands, which reduce the effective cross section of the Gulf to about 2.25×10⁶ m². The westernmost connection, close to Baja California, is the Ballenas-Salsipuedes (hereafter Ballenas) channel, whose depth exceeds 1600 m in its central part. It is bounded partially to the north by a lateral constriction with a maximum depth of 600 m, near the northern extreme of Angel de la Guarda island, and to the east by a ridge from which rise Angel de la Guarda, San Lorenzo, and other smaller islands. This ridge extends underwater about 20 km to the southeast from San Lorenzo into Guaymas Basin, where it forms the eastern wall of San Lorenzo sill, the southern end of Ballenas channel. A narrow canyon on this sill has a maximum depth of about 430 m. The central San Esteban channel is located between San Lorenzo and San Esteban islands, and is the deepest and widest of the three. It possesses a single, rather broad sill, formed by a westward underwater extension of San Esteban island. The third channel, between San Esteban and Tiburon islands, is narrower than the first two, has a broad sill at about 300 m depth, and connects the extension of the Sonoran shelf with the deeper basin to the north. Little studied before, it now appears to play a significant role in the regional exchange of water. A fourth, narrow channel between Tiburon island and mainland Mexico is too shallow to participate strongly in the circulation.


Journal ArticleDOI
TL;DR: On October 24, 1863, Abraham Lincoln spoke before an audience here in Baltimore and confessed, in effect, that he considered himself ill chosen for the duty that had been thrust upon him.
Abstract: On October 24, 1863, Abraham Lincoln spoke before an audience here in Baltimore and confessed, in effect, that he considered himself ill chosen for the duty that had been thrust upon him. Regarding my infinitely smaller duty pertaining to this lecture, I must make, respectfully, without wishing to offend the selection committee, the same confession. Lincoln spoke here less than a month before he delivered his address at Gettysburg. With that weak justification, I take liberty to begin this lecture as follows

Journal ArticleDOI
TL;DR: In this paper, techniques for enhancing the accuracy of the geophysical data records derived from Geosat altimeter observations with treatments for water-vapor correction and satellite orbit were described.
Abstract: Techniques are described for enhancing the accuracy of the geophysical data records derived from Geosat altimeter observations with treatments for water-vapor correction and satellite orbit. The TOVS/Special Sensor Microwave Imager (SSMI) water vapor data and T2 ephemeris are used to effectively adjust tide-gauge records. The T2 orbit is found to be effective for the study of large-scale phenomena with reductions in the radial orbit error.

Journal ArticleDOI
TL;DR: Friedman as discussed by the authors argued that the study of science is approached differently from one culture to another and that social, political, and economic aspects lead one people or nation or world region to emphasize different areas of science.
Abstract: Technology, it is said, is more parochial than science. In this view, technology is constructed by specific cultures to fit specific needs: while the science of physics is universal, there are a thousand ways to dig dirt from the ground or to build a house. But is this true? In his book, Friedman shows us that at least the study of science is approached differently from one culture to another and that social, political, and economic aspects lead one people or nation or world region to emphasize different areas of science. From Per Kalm (a Swedish naturalist who collected auroral observations before 1750) to Hannes Alfven, the Scandinavians have studied the Northern sea and the Northern sky.

Journal ArticleDOI
TL;DR: In this article, the authors summarize recent progress in geophysical studies of the deep continental crust and highlight some of the more important implications of deep crustal processes, including the primary processes responsible for its composition, structure, and mode of deformation.
Abstract: How has the continental crust evolved? What are the primary processes responsible for its composition, structure, and mode of deformation? What role do fluids play in deep crustal processes? In the last dozen years, geophysicists have obtained images of the deep continental crust that can be used to examine these questions and refine geologic models of crustal evolution. In this report we summarize recent progress in geophysical studies of the deep continental crust and highlight some of the more important implications of deep crustal processes.

Journal ArticleDOI
TL;DR: The author calls attention to an anthropogenic influence on global sea level change that has made a significant difference and hence ought to be taken into account in the sea level budget.
Abstract: I'd like to call attention to an anthropogenic influence on global sea level change. Although far less interesting for geophysicists who study natural processes, the phenomenon in question has made a significant difference, and hence ought to be taken into account in the sea level budget. Observational evidence shows that the global sea level has risen at the rate of 1.6–2 mm/yr for the last several decades [e.g., Trupin and Wahr, 1990; Douglas, 1991]. How much of [the rise] results from natural fluctuation, how much is anthropogenic, and what are the sources and mechanisms of the rise are among the key questions asked in this era of concerns about enhanced greenhouse effect and global warming. Presumably, as the global temperature rises, the sea level will rise for two reasons: thermal expansion (the steric change), and addition of water (the eustatic change). A primary candidate for the source of the latter is melting land ice in the form of polar ice sheets and mountain glaciers. Practically nothing useful is known about the present-day mass balance of the polar ice [Zwally, 1989; Douglas et al., 1990]. The best estimate for the rate of mountain glacier mass wastage, based on scanty data, amounts to a contribution of 0.46 (±0.26) mm/yr of higher sea level between 1900 and 1961 [Meier, 1984].

Journal ArticleDOI
TL;DR: In this paper, it is shown that during marine photosynthesis, microscopic plants in the oceans fix CO2 into their tissues in the form of organic matter and part of this organic matter is oxidized back to CO2 in the mixed layer of the sea, which interacts with the atmosphere.
Abstract: There is scientific consensus that atmospheric carbon dioxide is a major controlling factor of the surface temperature of our planet. CO2 released into the atmosphere through fossil fuel burning and deforestation is believed to lead to global warming through the greenhouse effect. The CO2 content in the atmosphere is influenced not just by its release into the atmosphere, but also by its removal. Although it is generally accepted that oceans take up considerable amounts of CO2, the complex processes affecting this uptake are still poorly understood. Physicochemical exchange processes in the ocean remove large quantities of CO2, as do biological processes. During marine photosynthesis, microscopic plants in the oceans fix CO2 into their tissues in the form of organic matter. Part of this organic matter is oxidized back to CO2 in the mixed layer of the sea, which interacts with the atmosphere. The rest sinks down to the deep sea. The rate of this transfer to the deep sea determines the extent to which the formation of organic matter removes CO2 from the atmosphere.

Journal ArticleDOI
TL;DR: Joselyn and Tsurutani [1990] have proposed new quantitative definitions of the two terms (SIs and SSCs), built on the existing phenomenological definition, and have recommended them for open discussion.
Abstract: In an attempt to resolve some ambiguity in defining geomagnetic sudden impulses (SIs) and storm sudden commencements (SSCs) using the existing phenomenological definition (see, for example, Mayaud and Romana [1977]; Mayaud [1980]), Joselyn and Tsurutani [1990] recently constructed a scheme in which SSCs are a subset of SIs, depending on the magnitude of subsequent geomagnetic activity. For quantitative application, they have proposed that an SI be specified as a sharp change (at least 10 nT in 3 minutes or less) observed nearly simultaneously (within a few minutes) in either component of the horizontal magnetic field at globally spaced observatories near 20° geomagnetic latitude. In addition, SSCs are those SIs followed within 24 hours by an hourly Dst index of at least −50 nT. Because the Dst index is not readily available, the recommended provisional alternative indicators are a 3-hourly Kp index of 5 or more and a half-daily aa index of 60 or more. Joselyn and Tsurutani [1990] have recommended these new quantitative definitions of the two terms (SIs and SSCs) for open discussion.
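
The quoted criteria can be summarized in a short decision sketch (Python). This is a simplification that ignores the multi-observatory, near-simultaneity requirements, and the argument names are hypothetical:

def classify_event(dH_nT, dt_minutes,
                   min_dst_next24h=None, max_kp_next24h=None, max_aa_halfday=None):
    """Classify a horizontal-field change as SI, SSC, or neither, per the
    quantitative criteria quoted above.

    dH_nT           : amplitude of the sharp change in the horizontal field (nT)
    dt_minutes      : duration of the change (minutes)
    min_dst_next24h : minimum hourly Dst within the following 24 h (nT), if available
    max_kp_next24h  : maximum 3-hourly Kp within the following 24 h (provisional fallback)
    max_aa_halfday  : maximum half-daily aa within the following 24 h (provisional fallback)
    """
    # An SI is a change of at least 10 nT occurring in 3 minutes or less.
    if not (dH_nT >= 10.0 and dt_minutes <= 3.0):
        return "not an SI"
    # An SSC is an SI followed within 24 hours by sufficiently disturbed conditions.
    if min_dst_next24h is not None and min_dst_next24h <= -50.0:
        return "SSC"
    if max_kp_next24h is not None and max_kp_next24h >= 5.0:
        return "SSC (provisional, Kp criterion)"
    if max_aa_halfday is not None and max_aa_halfday >= 60.0:
        return "SSC (provisional, aa criterion)"
    return "SI"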

Journal ArticleDOI
TL;DR: This report outlines the OEDIPUS experiment (defined as Observations of Electric-field Distributions in the Ionospheric Plasma—a Unique Strategy) and gives some of its preliminary scientific results.
Abstract: The launch of the tethered payload OEDIPUS-A on January 30, 1989 established a new record for the maximum length (958 m) of a space tether. The flight achieved a number of novel objectives in ionospheric plasma physics and tether technology. This report outlines the OEDIPUS experiment (defined as Observations of Electric-field Distributions in the Ionospheric Plasma—a Unique Strategy) and gives some of its preliminary scientific results. In the 1970s, the word “tether” took on new meaning as space scientists began to plan pairs of spacecraft wired together to co-orbit. Much of the work on tethers in space centers on the idea of flying the National Aeronautics and Space Administration's (NASA) Space Shuttle with a tethered subsatellite. This will materialize in 1992 when NASA and the Italian Space Agency collaborate on the Shuttle experiment Tethered Satellite System [Bonifazi, 1987]. The first version of this experiment uses a conducting tether to draw current from the ionospheric plasma, thereby inducing a variety of electrodynamic phenomena. In addition to providing an experimental facility for plasma electrodynamics, the tethered system is a large flexible structure, and hence an interesting subject for space mechanics research in its own right.
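
As a rough illustration of why a conducting tether draws current, consider the motional electromotive force it develops while sweeping through the geomagnetic field. Using nominal values that are not from this report (a 20-km tether of the kind planned for the Shuttle experiment, an orbital speed of about 7.5 km/s, and a field of about 3×10⁻⁵ T):

$$ \mathcal{E} = \int (\mathbf{v}\times\mathbf{B})\cdot d\mathbf{l} \;\approx\; vBL \;\approx\; (7.5\times10^{3})(3\times10^{-5})(2\times10^{4})\ \mathrm{V} \;\approx\; 5\ \mathrm{kV}, $$

a potential difference large enough to drive measurable currents through the surrounding ionospheric plasma.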

Journal ArticleDOI
TL;DR: Recent advances in geoid computation are reviewed, from long-wavelength global models such as the GEM-T2 (Goddard Earth Model) solution, which is complete to degree and order 36 and incomplete to degree 50, to high-resolution regional geoid models.
Abstract: Recently, geodesy has witnessed a renaissance in geoid computation. The advances over the past decade have taken place at all wavelengths, and have brought forth major improvements in accuracy. An example of a long-wavelength global gravitation model is the GEM-T2 (Goddard Earth Model) solution of Marsh et al. [1989], which is complete to degree and order 36, and incomplete to degree 50. Rapp and Pavlis [1990] have computed a pair of solutions, OSU89A and OSU89B (Ohio State University), which are spherical harmonic models of the Earth's geopotential complete to degree and order 360. Although termed high degree global models, these solutions provide the geoid to what we may now consider a medium length scale, about 50-km resolution. High-resolution geoid height modeling has shown the greatest advances in accuracy. Forsberg [1990] computed a geoid model for the Nordic area on a 5-km grid, and obtained 3–7 cm standard deviations when compared to Global Positioning System (GPS) and leveling in local networks of 50–100 km in extent.
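
The quoted resolution follows from the usual rule of thumb (not spelled out in the article) relating maximum spherical harmonic degree to half-wavelength resolution on a sphere of radius R ≈ 6371 km:

$$ \frac{\lambda_{\min}}{2} \approx \frac{\pi R}{\ell_{\max}} \approx \frac{20{,}000\ \mathrm{km}}{360} \approx 55\ \mathrm{km}, $$

so a degree-360 model resolves features down to roughly 50 km, while a degree-36 model stops near 550 km.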

Journal ArticleDOI
TL;DR: In this article, the U.S. Navy's Geosat radar altimeter made direct observations of sea-surface topography, revealing variations in sea surface topography reveal variations in marine gravity.
Abstract: Between 1985 and 1990, the U.S. Navy's Geosat radar altimeter made direct observations of sea-surface topography. Since the sea surface is nearly an equipotential surface of the Earth's gravity field, variations in sea surface topography reveal variations in marine gravity. At short wavelengths (<200 km), the topography of the sea surface mimics the seafloor or basement topography. Observations from Geosat have been used to map the marine geoid with unprecedented accuracy and resolution (see Figure 1). This mapping of the marine geoid—or gravity field—can revolutionize marine geophysics and geodesy, especially in the southern ocean basins where shipboard coverage is sparse.
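
The link between sea-surface topography and gravity rests on standard physical-geodesy relations that the abstract does not spell out: the geoid undulation N follows from the disturbing potential T through Bruns' formula, and the gravity anomaly from its radial derivative (spherical approximation),

$$ N = \frac{T}{\gamma}, \qquad \Delta g = -\frac{\partial T}{\partial r} - \frac{2}{r}\,T, $$

so where the altimeter tracks the equipotential sea surface, undulations and slopes in that surface map directly into variations of the marine gravity field.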

Journal ArticleDOI
TL;DR: In this article, the authors proposed that periodic additions of iron-bearing volcanic ash to the oceans represent natural iron-fertilization experiments that could be evaluated for their effects on ocean productivity.
Abstract: Martin [1990] suggests that intentional iron fertilization of nutrient-rich but iron-starved areas such as the southern oceans and equatorial Pacific might be used to stimulate organic productivity and thereby remove carbon dioxide from the atmosphere. Martin further proposes that low concentrations of atmospheric carbon dioxide during glacial times were partly the result of increased oceanic productivity stimulated by iron-bearing dust falling into oceans. Martin's work is very controversial, both because the effectiveness of iron fertilization of the iron-starved parts of the oceans has been questioned [Peng and Broecker, 1991] and because proposed pilot-scale experiments to test the effects of iron addition on ocean ecology have met resistance from those who fear unanticipated and possibly harmful side effects. Of particular concern are possible long-term effects that would not necessarily be observed in proposed experiments. In this article, it is proposed that periodic additions of iron-bearing volcanic ash to the oceans represent natural iron-fertilization experiments that could be evaluated for their effects on ocean productivity.