
Showing papers in "IEEE Spectrum in 1990"


Journal Article•DOI•
TL;DR: The European consumer electronics industry is examined, focusing on how the manufacturers have been restructuring, streamlining, and consolidating in order to compete against Japanese competitors. The major issue, high-definition television (HDTV), is discussed; the European proposal, high-definition multiplexed analog component (HD-MAC), is compatible with D2-MAC.
Abstract: The European consumer electronics industry is examined, focusing on how the manufacturers have been restructuring, streamlining, and consolidating in order to compete against Japanese competitors. Emerging from this strategic repositioning are three conglomerates that possess major shares of the European television market and are active in other areas of consumer electronics. In order of size, they are NV Philips of the Netherlands and Thomson Consumer Electronics Co. of France, both headquartered within the European Community (EC), and the Nokia Group of Finland, which belongs to the European Free Trade Association (EFTA), but has interests inside the EC. The major issue, high-definition television (HDTV), is discussed; the European proposal, high-definition multiplexed analog component, known as HD-MAC, is compatible with D2-MAC, the new European standard for direct satellite broadcasts. Like Secam and PAL, D2-MAC prescribes 625 scan lines and 25 pictures/s but provides better picture quality, several high-quality audio channels, and a 16:9 picture ratio (which requires an adaptor for existing sets).

228 citations


Journal Article•DOI•
TL;DR: In this paper, the way in which geomagnetically induced current (GIC) affects power systems is explained and the factors that make power networks especially vulnerable today and the difficulty of predicting episodes are considered.
Abstract: As solar activity moves toward an 11-year peak, utility engineers are girding for the effects of massive magnetic disturbances. The nature of the geomagnetic disturbances is examined. The way in which geomagnetically induced current (GIC) affects power systems is explained. Virtually all power equipment, operation, and protection problems due to GIC are traceable to two direct effects: the half-cycle saturation of power transformers and the half-cycle saturation of the current transformers used with protective relay systems. Of the two, the former, with its numerous secondary effects, has been the more serious. As an example, a blackout due to solar disturbances, which happened to the Hydro-Quebec system in March 1989, is described. The factors that make power networks especially vulnerable today and the difficulty of predicting episodes are considered.
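
The half-cycle saturation mechanism can be made concrete with a toy calculation. The sketch below (an illustrative assumption, not a model from the article) uses a piecewise-linear core characteristic and shows that adding a quasi-DC flux offset, as a GIC would, drives the core past its saturation knee on one half-cycle only, producing large asymmetric magnetizing-current pulses.

```python
import numpy as np

def magnetizing_current(flux, knee=1.0, unsat_slope=0.05, sat_slope=20.0):
    """Piecewise-linear core model: tiny current below the knee, very large above it."""
    i = unsat_slope * flux
    over = np.abs(flux) - knee
    return i + np.where(over > 0, np.sign(flux) * sat_slope * over, 0.0)

t = np.linspace(0, 2 / 60, 2000)            # two 60 Hz cycles
ac_flux = 0.9 * np.sin(2 * np.pi * 60 * t)  # normal operation stays below the knee
gic_offset = 0.3                            # quasi-DC flux offset from a GIC (assumed value)

i_normal = magnetizing_current(ac_flux)
i_gic = magnetizing_current(ac_flux + gic_offset)

print("peak magnetizing current, normal    :", round(i_normal.max(), 3))
print("peak magnetizing current, with GIC  :", round(i_gic.max(), 3))
print("opposite half-cycle peak, with GIC  :", round(abs(i_gic.min()), 3))
```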

217 citations


Journal Article•DOI•
TL;DR: In this article, the techniques used to fabricate micromechanical structures are described and the mechanical properties of silicon, which are important to these applications, are examined, including accelerometers, resonant microsensors, motors and pumps made by these techniques.
Abstract: The techniques used to fabricate micromechanical structures are described. Bulk micromachining is routinely used to fabricate microstructures with critical dimensions that are precisely determined by the crystal structure of the silicon wafer, by etch-stop layer thicknesses, or by the lithographic masking pattern. Silicon fusion bonding has been used to fabricate micro silicon pressure sensor chips. Surface micromachining, based on depositing and etching structural and sacrificial films, allows the designer to exploit the uniformity with which chemical vapor deposition (CVD) films coat irregular surfaces as well as the patterning fidelity of modern plasma etching processes. Silicon accelerometers, resonant microsensors, motors, and pumps made by these techniques are discussed. Measuring the mechanical properties of silicon, which are important to these applications, is examined.

178 citations


Journal Article•DOI•
TL;DR: Fuzzy logic is examined and its application to control systems is discussed; the possibility of interfacing fuzzy logic to existing control systems is noted.
Abstract: Fuzzy logic is examined, and its application to control systems is discussed. The steps taken to design a fuzzy controller are described, and the possibility of interfacing fuzzy logic to existing control systems is noted. Tools for developing and modeling fuzzy control systems are described.
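
As a rough illustration of the design steps described, the sketch below builds a toy single-input fuzzy controller: triangular membership functions fuzzify a temperature error, a three-rule base maps the fuzzy sets to output levels, and a weighted average defuzzifies the result. The rules and numbers are invented for the example and do not come from the article or any particular tool.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_heater(error):
    # Fuzzify: how strongly is the error "negative", "zero", or "positive"?
    mu = {
        "neg": tri(error, -10, -5, 0),
        "zero": tri(error, -5, 0, 5),
        "pos": tri(error, 0, 5, 10),
    }
    # Rule base maps each input set to a crisp output level (singleton outputs):
    # if error is negative -> low power, zero -> medium, positive -> high.
    out_level = {"neg": 0.0, "zero": 50.0, "pos": 100.0}
    num = sum(mu[k] * out_level[k] for k in mu)
    den = sum(mu.values())
    return num / den if den else 50.0  # weighted-average defuzzification

for e in (-7, -2, 0, 3, 8):
    print(f"error={e:+d}  heater power={fuzzy_heater(e):5.1f}%")
```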

147 citations


Journal Article•DOI•
R.R. Johnson1•
TL;DR: In this paper, the advantages of multichip modules (MCMs) are discussed, such as reduced delays between chips, simplified power distribution, and enhanced dissipation capabilities, and various approaches to MCM packages are described.
Abstract: Multichip modules (MCMs), which interconnect multiple bare dice by means of a stack of conductive and dielectric thin films, are discussed. Among their advantages are reduced delays between chips, simplified power distribution, and enhanced dissipation capabilities. Key design demands that should be weighed by IC design engineers planning to use MCMs are examined. They concern transmission delays, power distribution, heat dissipation, and temperature, as well as testing, burn-in, and rework. The various approaches to MCM packages are described. Factory-programmable versus user-programmable options are considered. Techniques for connecting chips to substrates are discussed.

104 citations


Journal Article•DOI•
TL;DR: Applying spread spectrum to code-division multiaccess (CDMA) communication, in which each user is assigned an identification code, is discussed, and the potential of spread spectrum for relieving spectrum congestion is addressed.
Abstract: The use of spread-spectrum techniques to achieve more efficient utilization of available frequency spectra is examined. The two main spread-spectrum techniques, direct sequence and frequency hopping, are explained. In frequency hopping, the transmitter repeatedly changes (hops) the carrier frequency from one frequency to another. Direct-sequence transmission spreads the spectrum not by periodically changing the frequency but by modulating the original (information) baseband signal with a very wide-baseband digital signal. The wideband modulating signal's amplitude changes continually between two states, high and low, arbitrarily called +1 and -1, respectively, with the sequence of highs and lows being pseudorandom. Applying spread spectrum to code-division multiaccess (CDMA) communication, in which each user is assigned an identification code (a distinct sequence of frequencies for frequency hopping or +1s and -1s for direct-sequence modulation), is discussed. The use of CDMA for cellular radio, where it promises a capacity of over 1000 users per cell, by the authors' calculations, is considered. The potential of spread spectrum for relieving spectrum congestion is addressed.
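
A minimal sketch of direct-sequence CDMA, under assumptions invented for illustration (the codes, bit patterns, and chip length are not from the authors' system): each user's +1/-1 data bits are multiplied by that user's pseudorandom chip sequence, the spread signals share the channel by simple addition, and a receiver recovers its user's bits by correlating against the same code.

```python
import numpy as np

rng = np.random.default_rng(0)
chips_per_bit = 64

def spread(bits, code):
    """Repeat each data bit over one code period and multiply by the chip sequence."""
    return np.repeat(bits, chips_per_bit) * np.tile(code, len(bits))

def despread(signal, code, n_bits):
    """Correlate each bit interval with the user's code; other users average out."""
    chunks = signal.reshape(n_bits, chips_per_bit)
    return np.sign(chunks @ code)

codes = {u: rng.choice([-1, 1], chips_per_bit) for u in ("A", "B")}
bits = {u: rng.choice([-1, 1], 8) for u in ("A", "B")}

channel = spread(bits["A"], codes["A"]) + spread(bits["B"], codes["B"])

print("user A sent     :", bits["A"])
print("user A recovered:", despread(channel, codes["A"], 8).astype(int))
```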

95 citations


Journal Article•DOI•
TL;DR: In this paper, a technique for measuring the risetimes of the fastest sampling oscilloscopes, which will allow more accurate measurement of the fastest pulses, is described; it exploits the fact that the sampler's kickback response is a signature of the sampling process.
Abstract: A technique for measuring the risetimes of the fastest sampling oscilloscopes, which will allow more accurate measurement of the fastest pulses, is described. The sampling process used by these oscilloscopes is essentially a brief switch closure of a few picoseconds between the signal to be measured at the oscilloscope input and a holding capacitor in the detection circuit. During this switch closure, sampling current flows from the input and is collected or integrated in a capacitor into a finite charge proportional to the input signal. The collected charge is then measured with an analog-to-digital converter. The sudden flow of sampling current during the switch closure creates an impulselike electric disturbance at the oscilloscope input. The characterization technique exploits the fact that this kickback response is a signature of the sampling process.

63 citations


Journal Article•DOI•
M. Santori1•
TL;DR: The concept and development of LabVIEW (Laboratory Virtual Instrument Engineering Workbench), a virtual instrument (VI) that provides a simple, graphical way to set up and run numerous instruments on a Macintosh computer, are described.
Abstract: The concept and development of LabVIEW (Laboratory Virtual Instrument Engineering Workbench), a virtual instrument (VI) that provides a simple, graphical way to set up and run numerous instruments on a Macintosh computer, are described. The first concept basic to LabVIEW had its roots in a large test system composed of programmable signal sources compatible with IEEE-488 interface specifications, and switching matrices and measurement instruments controlled by a minicomputer. The test system's strength was its flexibility, which it owed to its several levels of user interfaces. This flexible configuration was refined into the notion of instrumentation as a hierarchy of virtual instruments in which all VIs at all levels had the same type of construction. Another characteristic was that each VI had its own user interface. The final conceptual element was the use of loop data-flow structures as the technique used by an engineer to construct and run his or her own VIs. Problems encountered in the development of LabVIEW and the solutions used are discussed.
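
A toy sketch of the "hierarchy of virtual instruments" idea follows; the class and names are invented for illustration and are not LabVIEW's actual programming model or API. Each VI produces its output once its inputs are available, and higher-level VIs are wired from lower-level ones.

```python
class VI:
    """A virtual instrument node: a function wired to upstream VIs or constant inputs."""

    def __init__(self, name, func, inputs):
        self.name, self.func, self.inputs = name, func, inputs

    def run(self, cache=None):
        cache = {} if cache is None else cache
        if self.name not in cache:  # evaluate only after all inputs are resolved
            args = [i.run(cache) if isinstance(i, VI) else i for i in self.inputs]
            cache[self.name] = self.func(*args)
        return cache[self.name]

# Low-level VIs: a simulated voltmeter reading and a scaling block.
read_volts = VI("read_volts", lambda: 4.96, [])
scale = VI("scale", lambda v, gain: v * gain, [read_volts, 2.0])
# A top-level VI wired from the lower-level ones, with its own "front panel" output.
report = VI("report", lambda v: f"scaled reading = {v:.2f} V", [scale])

print(report.run())
```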

56 citations


Journal Article•DOI•
TL;DR: Thevenin's theorem, as discussed in this paper, was originally introduced to facilitate the analysis of linear networks of resistances and voltage sources, and was subsequently defined in terms of impedances and voltage sources.
Abstract: The equivalent generator theorem is discussed. It is commonly called Thevenin's theorem, in honor of Leon Charles Thevenin, a French telegraph engineer and educator who proposed it in 1883, but in fact Hermann von Helmholtz proposed it first, in an 1853 paper. Although originally introduced to facilitate the analysis of linear networks of resistances and voltage sources, the theorem subsequently was defined in terms of impedances and voltage sources. As a tool for circuit analysis, it is allied to the superposition theorem. The history of the theorem and how it came to be named for Thevenin are described.
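
A small worked example of the theorem, with hypothetical component values chosen only for illustration: a source with series resistance feeding a divider reduces, as seen from the output terminals, to an open-circuit voltage Vth in series with Rth, and predicts the same load current as the full circuit.

```python
# Hypothetical circuit: source Vs in series with R1, then R2 to ground; the load
# connects across R2's terminals.
Vs, R1, R2 = 10.0, 1000.0, 2000.0   # volts and ohms (illustrative values)

# Open-circuit voltage at the output terminals (no load connected):
Vth = Vs * R2 / (R1 + R2)
# Equivalent resistance with the ideal source replaced by a short circuit:
Rth = R1 * R2 / (R1 + R2)

# Check against the full circuit for an arbitrary load resistor:
Rload = 500.0
i_full = Vs / (R1 + R2 * Rload / (R2 + Rload)) * (R2 / (R2 + Rload))  # current into Rload
i_thev = Vth / (Rth + Rload)

print(f"Vth = {Vth:.3f} V, Rth = {Rth:.1f} ohm")
print(f"load current, full circuit  : {i_full * 1e3:.4f} mA")
print(f"load current, Thevenin model: {i_thev * 1e3:.4f} mA")
```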

54 citations


Journal Article•DOI•
TL;DR: It is shown how an interdisciplinary team overcame technical and organizational hurdles to develop an expert system that tailors its real-time responses to a pilot's flying style, which was designed to make expert recommendations without stopping to quiz pilots during air-to-air combat.
Abstract: It is shown how an interdisciplinary team overcame technical and organizational hurdles to develop an expert system that tailors its real-time responses to a pilot's flying style. The system, called the Pilot's Associate (PA), required in its knowledge bases, besides the pilot's inputs, expertise from human-factors specialists to decide how the information should be displayed, from psychologists to establish benchmarks for pilot performance in combat, and from engineers to define the relevant operational characteristics that need to be monitored by the expert system. It was designed to make expert recommendations without stopping to quiz pilots during air-to-air combat. The key to the system architecture is a plan-goal graph, which describes the elements used to link a pilot's actions, or plans, with a particular mission goal. The plan-goal graph is described, and a simple example is given to illustrate its use. The tasks of reconciling differing design requirements and of finding effective ways to communicate, both among human designers and among the software modules themselves, are discussed.
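
A toy data-structure sketch of the plan-goal-graph idea (the goals, plans, and matching rule are invented for illustration; the article's graph is far richer): goals decompose into subgoals and candidate plans, and observed pilot actions are matched against plans to infer which goal is being pursued.

```python
# Hypothetical plan-goal graph: each goal lists its subgoals and the plans (actions)
# that can serve it.
plan_goal_graph = {
    "evade threat": {"subgoals": ["break radar lock"], "plans": ["hard turn", "deploy chaff"]},
    "break radar lock": {"subgoals": [], "plans": ["notch maneuver"]},
    "engage target": {"subgoals": [], "plans": ["missile launch", "gun attack"]},
}

def infer_goals(observed_actions):
    """Return goals whose plans, or whose subgoals' plans, match the observed actions."""
    matched = set()
    for goal, node in plan_goal_graph.items():
        plans = set(node["plans"])
        for sub in node["subgoals"]:
            plans |= set(plan_goal_graph[sub]["plans"])
        if plans & set(observed_actions):
            matched.add(goal)
    return matched

print(infer_goals(["hard turn"]))       # -> {'evade threat'}
print(infer_goals(["notch maneuver"]))  # -> {'evade threat', 'break radar lock'}
```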

52 citations


Journal Article•DOI•
TL;DR: The history of over-the-horizon radar and its basic principles are outlined in this article, advances in signal processing that have made HF radar an operational reality are reviewed, and working units, as well as some under construction, are described.
Abstract: The history of HF (over-the-horizon) radar and its basic principles are outlined. Advances in signal processing that have made HF radar an operational reality are reviewed. Working units that the US Navy and the US Air Force have in place, as well as some units under construction, are described. Nondefense applications of HF radar, most notably the interdiction of drug traffickers and the monitoring of wind patterns over vast stretches of the ocean, are noted.

Journal Article•DOI•
TL;DR: Pan-European standardization bodies and activities are discussed in this article, with particular attention given to their impact on international trade; the authors examine the standardization process, its funding, and the role of the European Commission.
Abstract: Pan-European standardization bodies and activities are discussed with particular attention given to their impact on international trade. Since many International Organization for Standardization (ISO) and International Electrotechnical Commission (IEC) standards have been derived from US and Japanese products, primarily in telecommunications, the fear that European standards may keep US companies out of the market is said by European experts to be largely unfounded. Of prime concern to the 12 European Community (EC) nations is whether they can generate standards fast enough to be of any benefit to the EC. Applying international standards is difficult because they are so complex. Because of the time (from two to ten years) and effort needed to generate standards, priority has been given to those that are important for safety and government procurement. The standardization process and funding and the role of the European Commission are examined.

Journal Article•
Roger Wood1•
TL;DR: The technologies that have driven the advances in magnetic storage are discussed in this paper, which consist of highly sensitive magnetoresistive heads, ultra-low-noise magnetic media, and advanced signal detection techniques.
Abstract: Improvements in recording densities that are allowing magnetic storage to pack more data than its optical counterparts are examined. The technologies that have driven the advances are discussed. They consist of highly sensitive magnetoresistive heads, ultra-low-noise magnetic media, and advanced signal detection techniques. Disk and tape performances are compared.

Journal Article•DOI•
TL;DR: The approaches to global competition adopted by Japan, West Germany, and the US are examined in this paper, along with issues related to capital costs and to education in all three countries.
Abstract: The approaches to global competition adopted by Japan, West Germany, and the US are examined. In Japan several government institutions assist in developing a strategic vision in science and technology. Foremost is Japan's Council for Science and Technology, which promotes a comprehensive national policy. The Science and Technology Agency (STA), consuming about a quarter of Government R&D spending, funds research, oversees a worldwide collection of science and engineering publications, and directs a technology transfer corporation. West Germany has developed its R&D policy within a broader European context. Some 12% of the Federal Ministry for Research and Technology (BMFT) budget goes toward international organizations. In the US more than 700 federally funded laboratories spend one-third of the Government's R&D funds and employ more than one-sixth of US scientists and engineers. Issues related to capital costs and to education in all the countries are examined.

Journal Article•DOI•
TL;DR: The history and early development of the stored program concept are briefly described in this paper; the concept refers to the ability of a calculating machine to store its instructions in its internal memory and process them in its arithmetic unit so that in the course of a computation they may be not just executed but also modified at electronic speeds.
Abstract: The history and early development of the stored program concept are briefly described. This refers to the ability of a calculating machine to store its instructions in its internal memory and process them in its arithmetic unit, so that in the course of a computation they may be not just executed but also modified at electronic speeds. John von Neumann, a faculty member of the Institute for Advanced Study in Princeton, NJ, participated in the discussions in which the idea was elaborated, wrote the first report of the concept, placed it in a theoretical context, and built his own computer, which was the early model for a number of others, including the important commercially manufactured IBM 701. J. Presper Eckert and John Mauchly perhaps first conceived of the stored program concept and developed most of the plans for implementing it in the Edvac, and later incorporated it in the Univac and other computers produced by their company. Several British computer scientists, notably Maurice Wilkes, were the first to implement the idea in machines initially designed to embody this feature.
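
A toy illustration of the concept (not drawn from the article or any historical machine): because instructions sit in the same memory as data, a running program can rewrite one of its own yet-to-be-executed instructions.

```python
# A minimal accumulator machine whose program lives in ordinary memory.
memory = [
    ("ADD", 5),                   # 0: acc += 5
    ("MODIFY", 3, ("ADD", 100)),  # 1: overwrite the instruction stored at address 3
    ("ADD", 1),                   # 2: acc += 1
    ("ADD", 2),                   # 3: rewritten to ADD 100 before it executes
    ("HALT",),                    # 4: stop
]

acc, pc = 0, 0
while memory[pc][0] != "HALT":
    op = memory[pc]
    if op[0] == "ADD":
        acc += op[1]
    elif op[0] == "MODIFY":
        memory[op[1]] = op[2]     # self-modification: code treated as data
    pc += 1

print("accumulator =", acc)       # 5 + 1 + 100 = 106
```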

Journal Article•DOI•
TL;DR: The potential of picture-archiving and communication systems (PACS), a technology for the transmission and storage of diagnostic images, is examined, and the technological problems that must be solved before PACS can achieve widespread use are explored.
Abstract: The potential of picture-archiving and communication systems (PACS), a technology for the transmission and storage of diagnostic images, is examined. PACS, which accept pictures or images, with associated text, in digital form and then distribute them over a network, offer a means of dealing with the explosive growth of information generated by radiology and other imaging modalities. The technological problems that must be solved before PACS can achieve widespread use are explored. These occur in the areas of storing, transmitting, and displaying the images. Standards issues and radiologists' resistance to the digital hospital, which may be the greatest obstacle to implementing PACS, are discussed.

Journal Article•DOI•
TL;DR: Logic synthesis gained acceptance, while behavioral-level synthesis moved a step closer to reality, and testability became a design parameter, hardware accelerators got even faster, and work continued on developing frameworks that will allow one tool to retrieve information from another and display it in a uniform manner.
Abstract: Developments in design tools over the past year are examined. Progress was more incremental than in the past, consisting of refinement rather than innovation. Logic synthesis gained acceptance, while behavioral-level synthesis moved a step closer to reality. Testability became a design parameter, hardware accelerators got even faster, and work continued on developing frameworks that will allow one tool to retrieve information from another and display it in a uniform manner.

Journal Article•DOI•
TL;DR: In this paper, the deleterious effects of not bringing projects quickly from the laboratory to the market are described, and corporate efforts to cut development time are examined, and a less widespread approach to shortening time to market is called concurrent engineering, in which an item is designed concurrently with the processes that would be used to assemble it, manufacture its parts, test it, and repair it in the field.
Abstract: The deleterious effects of not bringing projects quickly from the laboratory to the market are described, and corporate efforts to cut development time are examined. Many Japanese companies have already wrestled with this problem and moved on to other issues, like their need for basic research. As a result, many US corporations are looking to the Japanese for guidance. This is leading companies to restructure their organizations to encourage open, frequent, and early communication. Launching research projects with technology transfer in view from the start is another approach modeled after the Japanese. Central to Japanese technology transfer is the fact that as a project moves from laboratory to market, the engineers move with it. This is also being tried by some US firms. A less widespread approach to shortening time to market is called concurrent engineering, in which an item is designed concurrently with the processes that would be used to assemble it, manufacture its parts, test it, and repair it in the field. Standardization is also being used to boost researchers' output.

Journal Article•DOI•
J.J. Vaglica1, P.S. Gilmour1•
TL;DR: A set of guidelines for the designer to use as a checklist for evaluating candidate devices is given, stressing that an absolute match between requirements and features is not necessary, since minor alterations to the requirements or the addition of a peripheral chip could create the most cost-effective solution from a less-than-perfect pairing.
Abstract: A set of guidelines for the designer to use as a checklist for evaluating candidate devices is given. The first decision examined is the level of performance required, that is, the word size of the controller. The second step is to identify the quantity, frequency, and type of all input/output (I/O) signals, as well as any other special requirements, including those imposed by mechanical aspects of the system. The third consideration is the application's memory requirements, which should be further broken down into program memory and data memory. Mapping I/O peripherals and the choice of single-chip or expanded operating mode are discussed. The feature-selection process is addressed, stressing that an absolute match between requirements and features is not necessary, since minor alterations to the requirements or the addition of a peripheral chip could create the most cost-effective solution from a less-than-perfect pairing.
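
The checklist idea can be sketched as a simple requirements-versus-features comparison; the device names and figures below are hypothetical, invented only to illustrate the matching step and the "less-than-perfect pairing" point.

```python
# Hypothetical application requirements and candidate microcontroller specs.
requirements = {"word_bits": 16, "io_pins": 24, "program_kb": 32, "data_kb": 2}

candidates = {
    "MCU-A": {"word_bits": 8, "io_pins": 32, "program_kb": 64, "data_kb": 4},
    "MCU-B": {"word_bits": 16, "io_pins": 24, "program_kb": 32, "data_kb": 2},
    "MCU-C": {"word_bits": 16, "io_pins": 16, "program_kb": 48, "data_kb": 8},
}

def shortfalls(spec):
    """List requirements the device fails to meet; an empty list means a full match."""
    return [k for k, needed in requirements.items() if spec[k] < needed]

for name, spec in candidates.items():
    missing = shortfalls(spec)
    # A near miss (e.g. too few I/O pins) might still win if a peripheral chip or a
    # relaxed requirement closes the gap more cheaply than a bigger device.
    verdict = "meets all requirements" if not missing else "short on " + ", ".join(missing)
    print(f"{name}: {verdict}")
```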

Journal Article•DOI•
TL;DR: The IEEE's standards activity, which is the focus of voluntary efforts to establish EM (electromagnetic) field standards in the United States, is discussed, as well as national electric and magnetic field exposure standards for ELF (extremely low frequency) fields that have already been adopted by some countries.
Abstract: The debate over setting standards for exposure to power-frequency electromagnetic fields when science has not yet determined what levels, if any, are dangerous is examined. Regulatory agencies, standards bodies and utilities are struggling to make policy in the face of two opposing views: one camp believes the evidence of health effects is enough to warrant precautionary action to limit exposure, while the other insists that research must present proof of a harmful effect before practical limits (i.e. limits that can be achieved at a reasonable cost to society) can be determined. Burgeoning litigation over high-voltage transmission lines is making the task of providing power increasingly difficult for the utilities. Almost all utilities provide field measurements to customers upon request, and many have programs for characterizing magnetic fields in various environments. The IEEE's standards activity, which is the focus of voluntary efforts to establish EM (electromagnetic) field standards in the United States, is discussed, as well as national electric and magnetic field exposure standards for ELF (extremely low frequency) fields that have already been adopted by some countries. Some solutions for transmission lines, and the technological and economic factors entailed, are explored. The debate over the risk of electrical appliances is examined.

Journal Article•DOI•
TL;DR: In this article, the two most favored topologies for metropolitan-area networks, double-star and double-ring, are described, and emerging MAN protocols are examined.
Abstract: Metropolitan-area networks (MAN), which fill the gap between local area networks (LANs) and wide-area networks (WANs), are discussed. MANs were originally oriented toward data, but now often carry voice and video traffic as well. Supplying more bandwidth than LANs, they support two-way communication over a shared medium such as an optical-fiber cable, and may offer point-to-point high-speed circuits or packet-switched communication. The two most favored topologies, double-star and double-ring, are described and emerging MAN protocols are examined.

Journal Article•DOI•
TL;DR: The sources of electrical transients, broadly interpreted as occurrences of any disturbance, either on the power line or the computer system's data line, are reviewed and some guidelines are provided for choosing a protective device for both types of systems mentioned above.
Abstract: The sources of electrical transients, broadly interpreted as occurrences of any disturbance, either on the power line or the computer system's data line, are reviewed. Common transients are overvoltages due to lightning strikes, transients caused by switching sequences in the power system, and undervoltages which could be caused by a nearby start-up of heavy loads or by distant faults. The impact of transients on small stand-alone and on distributed computer systems is examined. Growing concern among computer users that power-line surges, in particular, may damage equipment or cause loss of data has created a market for surge suppressors. Some guidelines are provided for choosing a protective device for both types of systems mentioned above. Potential negative side effects are indicated.

Journal Article•DOI•
Z.J. Cendes1•
TL;DR: In this article, the authors discuss the emergence of vastly refined algorithms for finite-element analysis, which are putting electromagnetic simulators into the hands of any engineer who can afford a personal computer.
Abstract: The emergence of vastly refined algorithms for finite-element analysis, which are putting electromagnetic simulators into the hands of any engineer who can afford a personal computer, is discussed. Field simulation allows an engineer to analyze the electromagnetic behavior of a device without building a physical prototype and taking measurements. The way in which simulation technology is changing the design process is illustrated by considering the design of a coax-to-waveguide transition. The field simulation model developed by the design engineer is passed to the manufacturing engineer, who runs parameter studies of the device by varying key dimensions and materials to determine the sensitivity of device performance to these parameters. Further, the field simulation model helps determine the impact of unforeseen changes during manufacturing due to variations in supplies and facilities. The process used by the finite-element simulators to solve Maxwell's equations is described. Two typical CAE applications are examined: a microwave low-pass filter design and a study of electromagnetic coupling in digital circuit interconnections. Pointers for selecting electromagnetic CAE software are given.
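
To give a flavor of what a finite-element solver does, the sketch below solves a one-dimensional electrostatic problem with linear elements; it is an illustrative assumption, vastly simpler than the commercial 2-D and 3-D field simulators described.

```python
import numpy as np

# Solve -d^2V/dx^2 = 0 on [0, 1] with V(0) = 1 V and V(1) = 0 V, i.e. the potential
# between two parallel plates, using two-node linear elements.
n_nodes = 11
x = np.linspace(0.0, 1.0, n_nodes)
K = np.zeros((n_nodes, n_nodes))          # global stiffness matrix
b = np.zeros(n_nodes)                     # no volume charge

for e in range(n_nodes - 1):              # assemble each element's 2x2 contribution
    h = x[e + 1] - x[e]
    ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K[e:e + 2, e:e + 2] += ke

# Impose the Dirichlet boundary conditions by replacing the boundary rows:
for node, value in ((0, 1.0), (n_nodes - 1, 0.0)):
    K[node, :] = 0.0
    K[node, node] = 1.0
    b[node] = value

V = np.linalg.solve(K, b)
print(np.round(V, 3))                     # falls linearly from 1.0 to 0.0, as expected
```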

Journal Article•DOI•
J.R. Hines1•
TL;DR: PC-based analog circuit design systems are discussed, and math tools for solving equations that describe the proposed behavior of the circuit, either symbolically or numerically, are described.
Abstract: PC-based analog circuit design systems are discussed. Math tools for solving equations that describe the proposed behavior of the circuit, either symbolically or numerically, are described. Schematic capture programs for drawing blocks and circuits are surveyed, as are simulation packages. Tables giving information on representative packages are provided.
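
As an example of the "math tools" category, the sketch below derives the transfer function of a simple RC low-pass filter symbolically and solves for its half-power frequency; it is a generic illustration using an open-source symbolic package, not tied to any product surveyed in the article.

```python
import sympy as sp

R, C, s, w = sp.symbols("R C s omega", positive=True)

# Voltage divider between R and the capacitor impedance 1/(s*C):
H = sp.simplify((1 / (s * C)) / (R + 1 / (s * C)))   # -> 1/(C*R*s + 1)
print("H(s) =", H)

# Squared magnitude on the j*omega axis, and the half-power (-3 dB) frequency:
Hjw = H.subs(s, sp.I * w)
mag2 = sp.simplify(Hjw * sp.conjugate(Hjw))          # -> 1/(C**2*R**2*omega**2 + 1)
w_3db = sp.solve(sp.Eq(mag2, sp.Rational(1, 2)), w)
print("half-power frequency omega =", w_3db)         # -> [1/(C*R)]
```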

Journal Article•DOI•
TL;DR: In this paper, the effects of power-frequency electric and magnetic fields on biological systems are examined from the viewpoint of what, if anything, should be done to minimize the risk to individuals.
Abstract: The debate over the effects of power-frequency electric and magnetic fields on biological systems is examined from the viewpoint of what, if anything, should be done to minimize the risk to individuals. It is argued that as the social and economic costs of this uncertain state of affairs are growing, additional research is urgently needed, yet the urgency is not being met by the modest or on-again, off-again levels of federal research support. Steps that can be taken by electric power system designers and operators, appliance manufacturers, the building industry, and individual citizens are outlined. It is stressed that if it is eventually found that fields adversely affect our health, a rush to manage the risks could result in inappropriate and inefficient actions unless utilities, appliance manufacturers, and the building industry start doing careful engineering/economic homework now. Some engineering problems entailed in redesigning electrical systems are indicated.

Journal Article•DOI•
R. Cates1•
TL;DR: The use of very large-scale integrated GaAs circuits for applications where high speed at room temperatures is needed, such as in computers or telecommunications, is examined in this article, where the advantages and disadvantages of a logic family called direct-coupled FET logic (DCFL), which couples the speed of GaAs with a significantly lower power dissipation than any other alternative, are discussed.
Abstract: The use of very-large-scale integrated GaAs circuits for applications where high speed at room temperatures is needed, such as in computers or telecommunications, is examined. The advantages and disadvantages of a logic family called direct-coupled FET logic (DCFL), which couples the speed of GaAs with a significantly lower power dissipation than any other alternative, are discussed. Material, fabrication, and packaging concerns associated with DCFL are considered. Some GaAs devices being produced in volume, at rates of several hundred a month, are described. The potential impact of these devices on the computer and telecommunications markets is addressed.

Journal Article•DOI•
TL;DR: The Electronic Design Interchange Format (EDIF), now in its second revision and paving the way for universal data exchange among CAE tools, is examined and the benefits and problems encountered are considered.
Abstract: The Electronic Design Interchange Format (EDIF), now in its second revision and paving the way for universal data exchange among CAE tools, is examined. Translators can convert data files from a given design tool into EDIF as a standard format and from EDIF into the form needed by another design tool. The benefits of using EDIF for design capture tools, design analysis tools, and logic synthesizers are discussed. Examples of the successful use of EDIF are given, and some problems encountered in its use are considered. A table listing the features of representative software packages is given.

Journal Article•DOI•
TL;DR: The impact of a single market, scheduled to come into being in 1992, on the members of the European Community (EC) is examined in this article, where economic and industrial leaders believe that the single market of more than 320 million consumers, allowing goods, services, people, and capital to move freely within its borders, could restore European leadership in vital industries such as electronics.
Abstract: The impact of a single market, scheduled to come into being in 1992, on the members of the European Community (EC) is examined. Many economists and industrial leaders believe that the single market of more than 320 million consumers, allowing goods, services, people, and capital to move freely within its borders, could restore European leadership in vital industries such as electronics, in which Europe has been losing ground for approximately 10 years. Activities taking place in anticipation of 1992 include industrial reorganization and joint research efforts.

Journal Article•DOI•
TL;DR: Concerns that electromagnetic (EM) fields may cause cancer and endocrine and nervous system disorders are discussed, and some experiments suggest that mediation by the cell membrane and/or large proteins that float in it is the mechanism by which fields couple to the cell.
Abstract: Concerns that electromagnetic (EM) fields may cause cancer and endocrine and nervous system disorders are discussed. The focus is on 60 Hz fields, where the mechanism of interaction probably involves the cell membrane, is nonlinear, and may act by causing some cooperative phenomena among the components of the cell membrane. Two basic epidemiological study designs have been used in work on cancer caused by EM fields. The first, the retrospective case-control study, compares an existing population of cases with a control group without the disease that is selected to be similar in all other characteristics. The second type computes the proportional mortality (or incidence) ratio, which compares the mortality from (or incidence of) disease in the sample population to that in the general public. Selected studies that have looked for a link between exposure to power-frequency fields and cancer are summarized. Results have been mixed, but many studies do suggest a connection. One possibility is that EM fields act as a promoter of cancer. Also discussed are studies on birth defects and circadian rhythm, and cell and organ studies. Some experiments suggest that mediation by the cell membrane and/or large proteins that float in it is the mechanism by which fields couple to the cell.