
Showing papers in "IEEE Potentials in 1995"


Journal Article•DOI•
TL;DR: Creative problem solving is a framework that encourages whole-brain, iterative thinking in the most effective sequence; it is cooperative in nature and is most productive when done as a team effort.
Abstract: Problem solving, as commonly taught in schools, is an analytical or procedural approach. This approach almost exclusively employs left-brain thinking modes, is competitive, and relies on individual effort. However, creative problem solving is a framework that encourages whole-brain, iterative thinking in the most effective sequence; it is cooperative in nature and is most productive when done as a team effort.

442 citations


Journal Article•
G. Pal1, S. Agrawal1•
TL;DR: Congestion is an inherent problem for networks with multiple user access when the load exceeds what can be handled even with optimal routing, so window flow control and rate flow control schemes are discussed.
Abstract: Information technology's growth has led to an obsession with providing better and more communication services than ever before. Computer communication networks are the latest manifestation of this obsession. Computer networks have been playing an increasingly larger role in our day-to-day activities. However, there is technology available that makes it possible for us to envision simultaneous transmission of voice, video, graphics and text over the same network. Such networks are called broadband integrated services digital networks (B-ISDNs). For a B-ISDN to be successful, certain critical issues such as congestion control, routing and error detection have to be tackled. Congestion is an inherent problem for networks with multiple user access when the load exceeds what can be handled even with optimal routing. Window flow control and rate flow control schemes are discussed.

121 citations
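Window flow control, mentioned above, limits how many packets a sender may have unacknowledged at once. The sketch below is a hypothetical illustration (the function name and the idealized, lossless, ack-in-order link are assumptions, not from the article):

```python
from collections import deque

def sliding_window_send(num_packets, window):
    """Trace a sender that may have at most `window` unacknowledged
    packets outstanding; on this idealized lossless link the receiver
    acknowledges in order once the window is full."""
    trace, outstanding, next_seq = [], deque(), 0
    while next_seq < num_packets or outstanding:
        if next_seq < num_packets and len(outstanding) < window:
            outstanding.append(next_seq)          # window has room: send
            trace.append(("send", next_seq))
            next_seq += 1
        else:
            trace.append(("ack", outstanding.popleft()))  # wait for ack
    return trace
```

Rate flow control, by contrast, would throttle how often the send branch may fire per unit time rather than capping the number of packets in flight.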


Journal Article•DOI•
J.T. Butler1•
TL;DR: Two prototype four-valued logic devices have been implemented at the University of Twente (Enschede, Holland), and Hitachi has implemented a 16-valued memory that stores the equivalent of 10^6 bits.
Abstract: The ultimate usefulness of a number system depends on its implementation. Multiple-valued logic has been implemented in charge-coupled devices (CCD). In this technology, logic values are encoded as charge. For example, prototype four-valued logic devices have been implemented at the University of Twente (Enschede, Holland). Hitachi has implemented a 16-valued memory that stores the equivalent of 10^6 bits. CCD is more compact than any other VLSI technology. Although it is slower than CMOS (complementary metal oxide semiconductor), it is much faster than the disk and has the potential of replacing the disk. The use of multiple-valued logic in CCD increases its storage capacity significantly. Multiple-valued logic has also been implemented in current-mode CMOS.

109 citations


Journal Article•DOI•
TL;DR: In this article, the authors proposed a machine vision-based inspection system for printed circuit board (PCB) fabrication, which removes the subjective aspects and provides fast, quantitative dimensional assessments.
Abstract: There are more than 50 process steps required to fabricate a printed circuit board (PCB). To ensure quality, human operators simply inspect the work visually against prescribed standards. The decisions made by this labor-intensive, and therefore costly, procedure often also involve subjective judgements. Automatic inspection systems remove the subjective aspects and provide fast, quantitative dimensional assessments. Machine vision may answer the manufacturing industry's need to improve product quality and increase productivity. The major limitation of existing inspection systems is that all the algorithms need a special hardware platform to achieve the desired real-time speeds. This makes the systems extremely expensive. Any improvement in speeding up the computation process algorithmically could reduce the cost of these systems drastically. Even so, they remain a better option than increasingly error-prone and slow manual human inspection.

51 citations


Journal Article•DOI•
L.D. Xu1•
TL;DR: Case-based reasoning (CBR) is an area of machine learning research that endeavors to support this process and the basic idea is that past experiences can be remembered and adapted to guide problem solving.
Abstract: In rule-based expert systems, knowledge is represented as rules. A rule-based system is one example of deductive reasoning. However, in tasks such as design, diagnosis, planning, management and assessment we often observe a different type of reasoning. An expert facing a new problem is usually reminded of similar situations, recalls their results and perhaps the reasoning. In other words, previous cases are applied to current problems by an analogical reasoning process. Case-based reasoning (CBR) is an area of machine learning research that endeavors to support this process. CBR is based on a memory-centered cognitive model. The basic idea is that past experiences can be remembered and adapted to guide problem solving. Computer systems that solve new problems by analogy with previous ones are called CBR systems. CBR systems base their intelligence and inference on known cases rather than on rules. There are two kinds of CBR systems: problem-solving systems and interpretive systems. A problem-solving system focuses on the construction of solutions suited to the new case by modifying previous case solutions. An interpretive system evaluates and justifies new cases based on the similarities or differences with the previous cases. However, most real-world problems require both types of CBR. AIDS-risky behavior screening, for example, requires both interpreting the behavior and then deriving a conclusion based on the precedents.

45 citations
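The retrieve-and-adapt loop at the heart of CBR can be sketched in a few lines. The case base, feature names, and area-scaling adaptation rule below are invented for illustration only:

```python
def retrieve(case_base, new_features):
    """Find the stored case nearest the new problem
    (similarity = negative squared distance over shared features)."""
    def dist(case):
        return sum((case["features"][k] - new_features[k]) ** 2
                   for k in new_features)
    return min(case_base, key=dist)

def solve_by_analogy(case_base, new_features, adapt):
    """Classic CBR: recall the most similar precedent, then adapt
    its solution to fit the new case."""
    precedent = retrieve(case_base, new_features)
    return adapt(precedent, new_features)
```

An interpretive system would stop after `retrieve` and argue from the similarities; a problem-solving system carries on through `adapt` to construct a new solution.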


Journal Article•DOI•
TL;DR: The author carries out a comparison of CISC (complex instruction set computing) and RISC (reduced instruction set computing) and discusses what RISC is and its shortcomings.
Abstract: The author carries out a comparison of CISC (complex instruction set computing) and RISC (reduced instruction set computing). The author discusses what RISC is and its shortcomings. The evolution of CISC and RISC microprocessors is then discussed and prospects for the future are examined.

23 citations


Journal Article•DOI•
TL;DR: A common thread that pervades literally all human activity is identifying patterns and relationships in the world around us; these are codified in tools used in research to develop, test, verify, adapt and extend theories as more data is taken or becomes available.
Abstract: A common thread that pervades literally all human activity is identifying patterns and relationships in the world around us. Such patterns and relationships are codified in tools that can have the form of theories, mathematical formulas and tables of data. These tools are used in research to develop, test, verify, adapt and extend theories as more data is taken or becomes available. They are used in application to design components and systems that will work predictably while satisfying specified performance characteristics. Examples are cited from information theory, signal processing, music, art, medicine and aviation.

18 citations


Journal Article•DOI•
TL;DR: The authors present step-by-step guidelines, a set of decision rules proven to be useful in building ER diagrams, and a case study problem with a preferred answer as well as a set of incorrect diagrams for the problem.
Abstract: The entity-relationship (ER) model and its accompanying ER diagrams are widely used for database design and systems analysis. Many books and articles just provide a definition of each modeling component and give examples of the pre-built ER diagrams. As a result, beginners in data modeling have a great deal of difficulty learning how to approach a given problem, what questions to ask in order to build a model, what rules to use while constructing an ER diagram, and why one diagram is better than another. The authors present step-by-step guidelines, a set of decision rules proven to be useful in building ER diagrams, and a case study problem with a preferred answer as well as a set of incorrect diagrams for the problem. These guidelines and decision rules have been successfully used in their beginning database management system course.

16 citations


Journal Article•DOI•
TL;DR: Genetic algorithms are adaptive search methods that simulate natural processes such as selection, information inheritance, random mutation, and population dynamics; they find more and more general applications thanks to a better understanding of the necessary properties of the required mapping and to new ways of processing problem constraints.
Abstract: Gradually, problem solvers are becoming dynamic agents interacting with the surrounding world rather than performing isolated operations. Some methods come from nature, where organisms both cooperate and compete for environmental resources. This has led to the design of algorithms which simulate these natural processes. The genetic algorithm (GA) represents one of the most successful approaches. Genetic algorithms are adaptive search methods that simulate natural processes such as selection, information inheritance, random mutation, and population dynamics. At first, GAs were most applicable to numerical parameter optimizations due to an easy mapping from the problem to representation space. Today they find more and more general applications thanks to: (1) a better understanding of the necessary properties of the required mapping, and (2) new ways to process problem constraints.

15 citations
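The selection, inheritance, and mutation cycle described above can be sketched as a minimal GA. The operator choices (tournament selection, one-point crossover) and all parameter values here are illustrative assumptions, not the article's:

```python
import random

def genetic_search(fitness, length, pop_size=20, generations=60,
                   mutation_rate=0.02, seed=0):
    """Minimal GA over bit strings: tournament selection, one-point
    crossover (information inheritance), and per-bit random mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # tournament of two: the fitter individual survives
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p, q = pick(), pick()
            cut = rng.randrange(1, length)        # one-point crossover
            child = p[:cut] + q[cut:]
            nxt.append([bit ^ (rng.random() < mutation_rate)
                        for bit in child])         # random bit flips
        pop = nxt
    return max(pop, key=fitness)
```

With `fitness=sum` this maximizes the number of 1-bits (the textbook "OneMax" problem), showing the easy problem-to-representation mapping the abstract mentions.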


Journal Article•DOI•
TL;DR: The basic requirements of GSM are described in five aspects: services; quality of services and security; radio frequency utilisation; network; and cost.
Abstract: This article is a brief overview of the GSM (global system for mobile communications) system. GSM is a wireless digital signaling network standard designed by standardization committees from the major European telecommunications operators and manufacturers. The GSM standard provides a common set of compatible services and capabilities to all mobile users across Europe. The basic requirements of GSM are described in five aspects: services; quality of services and security; radio frequency utilisation; network; and cost.

10 citations


Journal Article•DOI•
T.P. Van Doren1, Todd H. Hubing1, Fei Sha1, James L. Drewniak1, D.M. Hockanson1•
TL;DR: In this article, the authors discuss electromagnetic compatibility (EMC) and electromagnetic interference (EMI), after a brief look at the causes of EMI, they describe conductive coupling and electromagnetic radiative coupling.
Abstract: The authors discuss electromagnetic compatibility (EMC) and electromagnetic interference (EMI). After a brief look at the causes of EMI, they describe conductive coupling and electromagnetic radiative coupling. Career opportunities in EMC problem solving are looked at.

Journal Article•DOI•
J. Locke1•
TL;DR: Virtual reality immerses a user within a simulated three-dimensional environment, and in a networked environment, people can be immersed in the same simulation, each perceiving it from their own point of view.
Abstract: Virtual reality (VR) immerses a user within a simulated three-dimensional environment. Still in its infancy, much remains to be done, but the field shows great promise. Computer processing speed, graphics, networks, and specialized peripherals all play a part in creating believable applications. In a full VR implementation, the user wears specialized goggles that substitute small video displays for the lenses, one for each eye. The displays become the "lenses" onto a simulated world. The head-mounted display containing the goggles detects head movement and updates the user's view accordingly; the user has the impression of looking around, even moving within, the simulated environment. The sensory experience can be further enriched with sound, or a data glove to simulate contact between the user and simulated objects. In a partial VR implementation, a computer's video display becomes a window onto the simulated environment, and no external gear is required. In a networked environment, people can be immersed in the same simulation, each perceiving it from their own point of view. >

Journal Article•DOI•
TL;DR: The trend to achieve better performance, with more consistent and improved properties, challenges manufacturers to produce higher-purity copper and higher conductivity copper with the right combination of properties for specific applications.
Abstract: Around for centuries, copper remains an important metallic element in the world today. Copper's high electrical and thermal conductivity and excellent corrosion resistance make it ideal for use in the electrical, electronics, telecommunication, and construction industries. Electrical and electronic applications already consume more than 50% of all copper and will probably place further demands in the future. The trend to achieve better performance, with more consistent and improved properties, challenges manufacturers to produce higher-purity copper and higher-conductivity copper with the right combination of properties for specific applications.

Journal Article•DOI•
TL;DR: The inclusion of a scan-register on each IC allows: 1) the observation of each IC during normal operation; 2) the test of interconnects between ICs; and 3) the isolation of the IC from others so it can test itself.
Abstract: Miniaturization trends in integrated circuit (IC) technology have caused many testing problems. As bigger packaged ICs with higher pin counts are more densely packed onto a printed circuit board (PCB), accessing an IC's pins is harder. No longer are the pins mechanically accessible to probes or a bed-of-nails fixture. Therefore, determining which IC or interconnect is faulty is difficult or impossible, because each IC's input pins cannot be controlled and each IC's output pins cannot be observed. The boundary scan method was developed with the goal of improving this controllability and observability problem. A shift-register is included next to each IC pin so that input and output values can be serially shifted in and out. This reduces the need to use probes to control and observe. Also, the output of each IC's scan register can be connected with the input of another IC's scan register. This effectively creates one big scan chain per PCB, further reducing points that must be mechanically probed. The inclusion of a scan-register on each IC allows: 1) the observation of each IC during normal operation; 2) the test of interconnects between ICs; and 3) the isolation of the IC from others so it can test itself. IEEE Standard 1149.1, Test Access Port and Boundary-Scan Architecture, defines the test logic for implementing a boundary scan test architecture. Example circuits were designed in CMOS. A boundary scan cell is described.
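The serial shifting that boundary scan relies on can be simulated in a few lines. This sketch (the names are mine) treats the chain as a list of single-bit cells:

```python
def shift_chain(chain, bits_in):
    """Shift `bits_in` serially into one end of a boundary-scan chain;
    each shift pushes the bit in the last cell out the far end.
    Returns the bits that emerged, in order."""
    bits_out = []
    for b in bits_in:
        chain.insert(0, b)            # new bit enters the first cell
        bits_out.append(chain.pop())  # last cell's bit falls off
    return bits_out
```

An interconnect test follows the same pattern at board level: shift a test pattern into one IC's output cells, let it propagate across the PCB traces, capture it in the neighbouring IC's input cells, and shift it back out for comparison.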

Journal Article•DOI•
TL;DR: A closer look at the particle collision phenomenon reveals the root of many natural processes and, as such, contains value for scientific computation.
Abstract: There is hardly a dynamic computer graphics demonstration that does not involve bouncing objects of some sort. Anticipating on-screen collisions does induce adrenalin flow and may explain the attention passers-by give such displays. Usually this visual realism is achieved without regard to the underlying physics of collisions. These displays possess no value other than being attention getters and entertainment providers. Yet a closer look at the particle collision phenomenon reveals the root of many natural processes and, as such, contains value for scientific computation.
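For a physically honest bounce, the standard one-dimensional elastic-collision result (conservation of momentum and of kinetic energy) gives the post-collision velocities. This small helper is an illustration, not code from the article:

```python
def elastic_collision_1d(m1, v1, m2, v2):
    """Post-collision velocities for a perfectly elastic 1-D collision
    of masses m1, m2 with incoming velocities v1, v2, derived from
    conservation of momentum and kinetic energy."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2
```

A familiar check: two equal masses simply exchange velocities, which is why a head-on shot in billiards stops the cue ball dead.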

Journal Article•DOI•
TL;DR: In this paper, the author examines the prospects of photovoltaic power generation for electric utilities and outlines the economic, as well as environmental, benefits of using this technology.
Abstract: Today, more than ever, photovoltaic energy seems to be on its way to becoming a leading source of electrical power. Prices are dropping and photovoltaic power systems are improving in quality, reliability, and capacity to convert sunlight directly into electricity. One large purchaser of photovoltaics is the electric utility industry. Here, the author examines the prospects of PV power generation and outlines the economic, as well as environmental, benefits of using this technology.

Journal Article•DOI•
TL;DR: In this paper, the author discusses the discovery of titanium dioxide and then describes its properties and applications, in particular its use in high-efficiency dye-sensitised solar cells.
Abstract: There is a growing need for lightweight, durable materials in industry and research. The "classic" materials such as gold, copper, and steel cannot fit this new need. Thus, new sets of materials are being used more frequently, and none more so than titanium and its oxides, more specifically titanium dioxide (TiO2). The author discusses the discovery of titanium dioxide and then describes its properties and applications. In particular, the author discusses its use in high-efficiency dye-sensitised solar cells.

Journal Article•DOI•
TL;DR: In this paper, the author briefly touches upon the essence of the technical presentation (planning, attitude, and execution), illustrating the value of the script in planning, the positive outlook in attitude, and the elements of poise, confidence, and dignity in execution.
Abstract: The author has briefly touched upon the essence of the technical presentation: planning, attitude, and execution. The value of the script in planning, the positive outlook in attitude, and the elements of poise, confidence, and dignity in execution are illustrated.

Journal Article•
TL;DR: The author steps through a simple collision avoidance system designed to automatically warn the driver when the car is in danger of hitting another car (or other object) in front of it.
Abstract: The author steps through a simple collision avoidance system designed to automatically warn the driver when the car is in danger of hitting another car (or other object) in front of it. The urgency of the system's warning will depend on the car's speed, its distance to the next car, and the speed of the next car. The design of a fuzzy system can be described in four steps: determination of fuzzy inputs and outputs, organization of the inputs and outputs into fuzzy sets, determination of the fuzzy rules, and defuzzification of the output(s).
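The four design steps can be sketched end to end. The membership shapes, rule set, and singleton defuzzification below are illustrative assumptions, not the article's actual controller:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def warning_urgency(speed_kmh, gap_m):
    """The four fuzzy-design steps on two inputs -> urgency in [0, 1]."""
    # Steps 1-2: choose the inputs and fuzzify them into overlapping sets
    fast  = tri(speed_kmh, 30, 130, 230)
    slow  = tri(speed_kmh, -70, 30, 130)
    close = tri(gap_m, -50, 0, 50)
    far   = tri(gap_m, 0, 50, 100)
    # Step 3: the rules (min acts as fuzzy AND); each fires a singleton
    rules = [(min(fast, close), 1.0),   # fast AND close -> high urgency
             (min(slow, close), 0.6),
             (min(fast, far),   0.4),
             (min(slow, far),   0.0)]
    # Step 4: defuzzify by weighted average of the fired singletons
    total = sum(w for w, _ in rules)
    return sum(w * u for w, u in rules) / total if total else 0.0
```

A faster car with a smaller gap should always produce a more urgent warning, which is the property worth checking first in any such controller.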

Journal Article•
TL;DR: A parallel processing computer system typically breaks down a big problem into many subproblems and then solves them through the close cooperation of a large number of interconnected processors that provide ultra-high performance at reasonable costs.
Abstract: Parallel processing is many problem solvers operating simultaneously or concurrently to reach a solution. A parallel processing computer system typically breaks down a big problem into many subproblems. It then solves them through the close cooperation of a large number of interconnected processors. Parallel processing is the only way to overcome the limitation of traditional architectures. Following this direction, many new computer architectures have been proposed. They have opened up the possibility of constructing massively parallel computer systems that provide ultra-high performance at reasonable costs.
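The decompose-solve-combine pattern can be sketched with Python's standard thread pool. The problem split and the function names are illustrative; real massively parallel systems distribute subproblems across hardware processors rather than threads:

```python
from concurrent.futures import ThreadPoolExecutor

def subproblem(chunk):
    """One worker's share of the big problem: a partial sum of squares."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    """Break the problem into subproblems, solve them concurrently,
    then combine the partial results into the final answer."""
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(subproblem, chunks))
```

The combine step here is a simple reduction; the ideal speedup equals the number of workers, minus whatever the splitting and combining cost.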

Journal Article•DOI•
Behnam Kamali1•
TL;DR: Error control coding is a signal processing technique that protects digital information against transmission and storage errors; random-error correcting codes combat randomly distributed errors, while burst-error correcting codes cope with errors that are clustered together.
Abstract: Error control coding (ECC) is a signal processing technique that protects digital information against transmission and storage errors. Codes that combat randomly distributed errors are called random-error correcting codes. Codes that cope with errors that are clustered together are known as burst-error correcting codes. When data is presented as a stream of binary numbers, bit-oriented codes may be applied. In contrast to this, data might be arranged as words, where each word includes a number of bits. Codes that operate on words are said to be word-oriented codes. The article is limited to linear bit-oriented block codes.
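As a concrete instance of a linear bit-oriented block code, here is the classic Hamming(7,4) single-error-correcting code. This example is mine; the article does not single out this code:

```python
def hamming74_encode(data):
    """Encode 4 data bits as a 7-bit codeword (positions 1..7;
    positions 1, 2, 4 hold even-parity check bits)."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(codeword):
    """Recompute the checks; the syndrome is the 1-based position of
    any single flipped bit. Returns the recovered 4 data bits."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1   # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]
```

Because any single-bit error produces a distinct nonzero syndrome, the decoder can both detect and locate it, the defining property of a distance-3 linear block code.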

Journal Article•DOI•
TL;DR: In this article, the authors discuss the optical modulator, its pros and cons, and how quantum well structures are used to realize a modulator and how they can be used in quantum modulators.
Abstract: The perfecting of thin-film growth techniques has made possible today's search for new, ultrafast optoelectronic devices. The techniques include molecular beam epitaxy (MBE), organometallic chemical vapor deposition (OMCVD) and atomic layer epitaxy (ALE). By using these methods, scientists and engineers can grow multilayered semiconductor structures of different materials, with extremely high purity, sharp interfaces, and narrow doping profiles. The thickness of each layer has as its lower limit only one atomic layer. By superposing successive materials of different bandgaps, we can create quantum well structures in which electrons or holes are confined to potential wells in the conduction or valence bands. This quantization has allowed for the observation of quantum effects impossible to observe in the bulk materials. One effect is the electron or hole subband-to-subband transition within a conduction or a valence band in a quantum well. This is best known as the inter-subband transition. Before this concept is examined, the author reviews the optical modulator, its pros and cons, and how quantum wells are used to realize a modulator.

Journal Article•
TL;DR: In this paper, the authors describe a computer code that is being developed to alleviate the problem of lack of algorithms and software for quantitative NDE, wherein we attempt to quantify defects, that is, determine their size, location, even shape, rather than just detect their presence.
Abstract: Nondestructive evaluation (NDE) is to materials what CAT (computerized axial tomography) scanning is to the human body: an attempt to look inside without opening it up. As in CAT scanning, modern NDE requires sophisticated mathematical software to perform its function. This is especially true with regard to quantitative NDE, wherein we attempt to quantify defects, that is, determine their size, location, even shape, rather than just detect their presence. Low-frequency electromagnetic methods using eddy currents are a traditional mode of doing NDE. (Eddy currents are used in approximately 35% of all NDE applications.) But the technology still suffers from a lack of algorithms and software to allow its full potential to be realized. Here we will describe a computer code that is being developed to alleviate that problem.

Journal Article•
D.B. Davidson1•
TL;DR: This article is an introductory overview emphasizing the use of parallel computers for computational engineering.
Abstract: Parallel processing is the harnessing of multiple processors to work on the same problem. The aim is to speed up the computational process, ideally by the number of processors used. Parallel processing is increasingly emerging as the key to very-high-speed computation. This article is an introductory overview emphasizing the use of parallel computers for computational engineering.

Journal Article•
T.K. Hazra1•
TL;DR: In this paper, the authors discuss some basic terms and concepts used in parallel computing, and go on to consider parallel software engineering principles and their implementation, concluding with a discussion of transputers and their future.
Abstract: The author begins by discussing some basic terms and concepts used in parallel computing, and goes on to consider parallel software engineering principles and their implementation. The author concludes with a discussion of transputers and their future.

Journal Article•DOI•
TL;DR: Describes how the authors used genetic algorithms (GAs) to make a computer develop its own strategy in playing a simple board game.
Abstract: Describes how the authors used genetic algorithms (GAs) to make a computer develop its own strategy in playing a simple board game. There are various types of games. John von Neumann and Oskar Morgenstern's game theory classifies games into several categories depending on the number of players involved, the presence or absence of the element of chance, whether or not all players receive the same information, and the type of payoff function. The authors consider two-person zero-sum non-chance perfect-information games. Chess and checkers belong to this class.
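For two-person zero-sum perfect-information games of this class, the baseline (non-GA) solution is exhaustive minimax search. This negamax sketch solves a toy Nim variant (take 1 or 2 stones; taking the last stone wins) rather than the authors' board game:

```python
def value(stones):
    """Negamax value for the player to move in take-1-or-2 Nim:
    +1 = forced win, -1 = forced loss (taking the last stone wins)."""
    if stones == 0:
        return -1   # the opponent just took the last stone and won
    # try every legal take; the opponent's value is negated for us
    return max(-value(stones - take) for take in (1, 2) if take <= stones)

def best_move(stones):
    """Choose the take that leaves the opponent worst off."""
    return max((t for t in (1, 2) if t <= stones),
               key=lambda t: -value(stones - t))
```

In this toy game the side to move loses exactly when the pile is a multiple of 3, so the winning policy always restores that multiple; a GA approach would instead evolve such a strategy from self-play rather than searching the full game tree.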

Journal Article•DOI•
TL;DR: In this article, the authors present a method by which environmental considerations and constraints can be integrated into existing process and product design practices, which involves considering the environmental impact of all materials and processes that are involved in the entire life-cycle of the product.
Abstract: A pressing area of concern is the future competitiveness of the electronics industry. Design For Environment (DFE) is a method by which environmental considerations and constraints can be integrated into existing process and product design practices. It involves considering the environmental impact of all materials and processes that are involved in the entire life-cycle of the product. The life-cycle of a product is from materials extraction to production, use, and disposal. DFE can be broken into smaller subgroups involving designing for such things as pollution prevention, product disassembly and recyclability, environmentally sound processing, materials recyclability, maintainability, and other factors. Accounting practices and organizational issues need to be addressed as well in order to fully implement DFE within a corporation. A company that institutes DFE in its practices minimizes waste, energy, materials, and hazardous substances in its product and its manufacturing, and maximizes overall environmental friendliness in its design.

Journal Article•DOI•
D.P. Leach1•
TL;DR: The "paper and pencil" technique presented is easy to learn, easy to use, and provides a rapid estimate of circuit behavior.
Abstract: Modern circuit analysis programs (CAP) such as PSPICE and MICROCAP offer efficient and accurate alternatives to more traditional "paper and pencil techniques". Nevertheless, the ability to quickly analyze circuit behavior without the aid of a computer is an important skill that provides valuable insights into circuit performance. The "paper and pencil" technique presented is easy to learn, easy to use, and provides a rapid estimate of circuit behavior.

Journal Article•DOI•
TL;DR: The spin transistor is a trilayer, three-terminal device similar in some ways to a semiconductor bipolar transistor; although the physical principles of operation differ, the language of semiconductor physics is borrowed to facilitate its description.
Abstract: The spin transistor is a trilayer, three terminal device similar in some ways to a semiconductor bipolar transistor. Although the physical principles of operation are different, the language of semiconductor physics is borrowed to facilitate the description. The emitter and collector layers of the spin transistor are ferromagnetic films characterized by a magnetization that lies in the plane of the films and has an externally manipulated direction. The base layer is a nonmagnetic metal, such as gold, copper or silver. One could fabricate an array of elements as a nonvolatile random access memory (NRAM). The magnetization orientation of all the emitter films is initially set by application (and subsequent removal) of a large magnetic field. The magnetization of the collector films is manipulated by sending currents through an over-layed array of write wires. >