
Showing papers on "Personal computer" published in 1988


Book
01 Jan 1988
TL;DR: This book describes the security pitfalls inherent in many important computing tasks today and points out where existing controls are inadequate and serious consideration must be given to the risk present in the computing situation.
Abstract: From the Book: PREFACE: When the first edition of this book was published in 1989, viruses and other forms of malicious code were fairly uncommon, the Internet was used largely by just computing professionals, a Clipper was a sailing ship, and computer crime was seldom a headline topic in daily newspapers. In that era most people were unconcerned about--even unaware of--how serious the threat to security in the use of computers is. The use of computers has spread at a rate completely unexpected back then. Now you can bank by computer, order and pay for merchandise, and even commit to contracts by computer. And the uses of computers in business have similarly increased both in volume and in richness. Alas, the security threats to computing have also increased significantly.

Why Read This Book? Are your data and programs at risk? If you answer "yes" to any of the following questions, you have a potential security risk. Have you acquired any new programs within the last year? Do you use your computer to communicate electronically with other computers? Do you ever receive programs or data from other people? Is there any significant program or data item of which you do not have a second copy? Relax; you are not alone. Most computer users have a security risk. Being at risk does not mean you should stop using computers. It does mean you should learn more about the risk you face, and how to control that risk.

Users and managers of large mainframe computing systems of the 1960s and 1970s developed computer security techniques that were reasonably effective against the threats of that era. However, two factors have made those security procedures outdated:

Personal computer use. Vast numbers of people have become dedicated users of personal computing systems, both for business and pleasure. We try to make applications "user friendly" so that computers can be used by people who know nothing of hardware or programming, just as people who can drive a car do not need to know how to design an engine. Users may not be especially conscious of the security threats involved in computer use; even users who are aware may not know what to do to reduce their risk.

Networked remote-access systems. Machines are being linked in large numbers. The Internet and its cousin, the World-Wide Web, seem to double every year in number of users. A user of a mainframe computer may not realize that access to the same machine is allowed to people throughout the world from an almost uncountable number of computing systems.

Every computing professional must understand the threats and the countermeasures currently available in computing. This book addresses that need. This book is designed for the student or professional in computing. Beginning at a level appropriate for an experienced computer user, this book describes the security pitfalls inherent in many important computing tasks today. Then, the book explores the controls that can check these weaknesses. The book also points out where existing controls are inadequate and serious consideration must be given to the risk present in the computing situation.

Uses of This Book The chapters of this book progress in an orderly manner. After an introduction, the topic of encryption, the process of disguising something written to conceal its meaning, is presented as the first tool in computer security. The book continues through the different kinds of computing applications, their weaknesses, and their controls.
The application areas include general programs, operating systems, data base management systems, remote access computing, and multicomputer networks. These sections begin with a definition of the topic, continue with a description of the relationship of security to the topic, and conclude with a statement of the current state of the art of computer security research related to the topic. The book concludes with an examination of risk analysis and planning for computer security, and a study of the relationship of law and ethics to computer security.

Background required to appreciate the book is an understanding of programming and computer systems. Someone who is a senior or graduate student in computer science or a professional who has been in the field for a few years would have the appropriate level of understanding. Although some facility with mathematics is useful, all necessary mathematical background is developed in the book. Similarly, the necessary material on design of software systems, operating systems, data bases, or networks is given in the relevant chapters. One need not have a detailed knowledge of these areas before reading this book.

The book is designed to be a textbook for a one- or two-semester course in computer security. The book functions equally well as a reference for a computer professional. The introduction and the chapters on encryption are fundamental to the understanding of the rest of the book. After studying those pieces, however, the reader can study any of the later chapters in any order. Furthermore, many chapters follow the format of introduction, then security aspects of the topic, then current work in the area. Someone who is interested more in background than in current work can stop in the middle of one chapter and go on to the next. This book has been used in classes throughout the world. Roughly half of the book can be covered in a semester. Therefore, an instructor can design a one-semester course that considers some of the topics of greater interest.

What Does This Book Contain? This is the revised edition of Security in Computing. It is based largely on the previous version, with many updates to cover newer topics in computer security. Among the salient additions to the new edition are these items:

Viruses, worms, Trojan horses, and other malicious code. Complete new section (first half of Chapter 5) including sources of these kinds of code, how they are written, how they can be detected and/or prevented, and several actual examples.

Firewalls. Complete new section (end of Chapter 9) describing what they do, how they work, how they are constructed, and what degree of protection they provide.

Private e-mail. Complete new section (middle of Chapter 9) explaining exposures in e-mail, kinds of protection available, PEM and PGP, key management, and certificates.

Clipper, Capstone, Tessera, Mosaic, and key escrow. Several sections: in Chapter 3 as an encryption technology, in Chapter 4 as a key management protocol, and in Chapter 11 as a privacy and ethics issue.

Trusted system evaluation. Extensive addition (in Chapter 7) including criteria from the United States, Europe, Canada, and the soon-to-be-released Common Criteria.

Program development processes, including ISO 9000 and the SEI CMM. A major section in Chapter 5 gives comparisons between these methodologies.

Guidance for administering PC, Unix, and networked environments.
In addition to these major changes, there are numerous small changes, ranging from wording changes to subtle notational changes for pedagogic reasons, to replacement, deletion, rearrangement, and expansion of sections. The focus of the book remains the same, however. This is still a book covering the complete subject of computer security. The target audience is college students (advanced undergraduates or graduate students) and professionals. A reader is expected to bring a background in general computing technology; some knowledge of programming, operating systems, and networking is expected, although advanced knowledge in those areas is not necessary. Mathematics is used as appropriate, although a student can ignore most of the mathematical foundation if he or she chooses.

Acknowledgments Many people have contributed to the content and structure of this book. The following friends and colleagues have supplied thoughts, advice, challenges, criticism, and suggestions that have influenced my writing of this book: Lance Hoffman, Marv Schaefer, Dave Balenson, Terry Benzel, Curt Barker, Debbie Cooper, and Staffan Persson. Two people from outside the computer security community were very encouraging: Gene Davenport and Bruce Barnes. I apologize if I have forgotten to mention someone else; the oversight is accidental. Lance Hoffman deserves special mention. He used a preliminary copy of the book in a course at George Washington University. Not only did he provide me with suggestions of his own, but his students also supplied invaluable comments from the student perspective on sections that did and did not communicate effectively. I want to thank them for their constructive criticisms.

Finally, if someone alleges to have written a book alone, distrust the person immediately. While an author is working 16-hour days on the writing of the book, someone else needs to see to all the other aspects of life, from simple things like food, clothing, and shelter, to complex things like social and family responsibilities. My wife, Shari Lawrence Pfleeger, took the time from her professional schedule so that I could devote my full energies to writing. Furthermore, she soothed me when the schedule inexplicably slipped, when the computer went down, when I had writer's block, or when some other crisis beset this project. On top of that, she reviewed the entire manuscript, giving the most thorough and constructive review this book has had. Her suggestions have improved the content, organization, readability, and overall quality of this book immeasurably. Therefore, it is with great pleasure that I dedicate this book to Shari, the other half of the team that caused this book to be written. Charles P. Pfleeger Washington DC

1,332 citations


Journal ArticleDOI
TL;DR: The distinct element method has advanced to a stage where the complex mechanical interactions of a discontinuous system can be modelled in three dimensions; an efficient data structure is utilized which permits the rapid calculation on a personal computer of systems involving several hundred particles.

1,020 citations


Journal ArticleDOI
TL;DR: In this paper, a three-dimensional formulation of the distinct element method is embodied in computer program 3DEC, which has been adapted to run on a personal computer, based on a dynamic (time domain) solution algorithm.

545 citations
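Both distinct element entries above rest on the same explicit time-domain scheme: contact forces are computed from the current overlaps between bodies, then the equations of motion are integrated one small step at a time. Below is a minimal one-dimensional sketch of that loop; the two-block setup and all parameter values are illustrative only, not taken from 3DEC, which handles full 3D polyhedral blocks.

```python
import numpy as np

# Minimal 1D distinct-element step: two blocks, one linear contact spring.
m = np.array([1.0, 1.0])        # block masses (kg); illustrative
x = np.array([0.0, 0.9])        # positions (m); blocks overlap initially
v = np.array([1.0, 0.0])        # velocities (m/s)
k_contact = 1e4                 # contact normal stiffness (N/m)
c_damp = 0.5                    # mass-proportional damping coefficient
width = 1.0                     # contact exists while x[1] - x[0] < width
dt = 1e-4                       # small explicit time step for stability

for step in range(5000):
    gap = (x[1] - x[0]) - width
    f_contact = -k_contact * gap if gap < 0.0 else 0.0  # repulsive only
    f = np.array([-f_contact, f_contact]) - c_damp * m * v
    v += dt * f / m             # explicit update: velocity first,
    x += dt * v                 # then position (leapfrog-style stepping)

print(x, v)                     # the blocks separate after the collision
```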


Journal ArticleDOI
TL;DR: Acuchem is a program for solving the system of differential equations describing the temporal behavior of spatially homogeneous, isothermal, multicomponent chemical reaction systems and for presenting the results in tabular or graphical form.
Abstract: Acuchem is a program for solving the system of differential equations describing the temporal behavior of spatially homogeneous, isothermal, multicomponent chemical reaction systems. It is designed to provide modelers, data evaluators, and laboratory scientists with an easy to use program for modeling complex chemical reactions, and for presenting the results in tabular or graphical form. The program is described and some examples of its application given. Acuchem is designed to operate on the IBM Personal Computer family and other compatible microcomputers, and is available in a compiled version on a floppy disk.

327 citations
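As an illustration of the computation Acuchem automates, the sketch below integrates the rate equations of a hypothetical A -> B -> C scheme and prints a small table. The scheme and rate constants are invented, and scipy's integrator stands in for Acuchem's own.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical isothermal, spatially homogeneous scheme: A -k1-> B -k2-> C
k1, k2 = 2.0, 0.5   # illustrative rate constants (1/s)

def rates(t, c):
    A, B, C = c
    return [-k1 * A, k1 * A - k2 * B, k2 * B]

sol = solve_ivp(rates, (0.0, 10.0), [1.0, 0.0, 0.0], dense_output=True)
for ti in np.linspace(0.0, 10.0, 6):
    A, B, C = sol.sol(ti)
    print(f"t={ti:5.2f}  [A]={A:.4f}  [B]={B:.4f}  [C]={C:.4f}")  # tabular output
```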


Journal Article
TL;DR: This new method of determining protein concentration and cell number in the aqueous enabled us to make a noninvasive and quantitative evaluation of the inflammation in the anterior segment of the eye.

273 citations


Patent
18 Nov 1988
TL;DR: In this article, a user interactive mass storage data access system includes a personal computer (10) and a simulated book (30), where a mass storage device such as a compact disk (CD) read only memory (ROM) is connected to the personal computer, and the computer and the simulated book are connected by an infrared (IR) data communications link including IR transceivers (26, 48).
Abstract: A user interactive mass storage data access system includes a personal computer (10) and a simulated book (30). A mass storage device, such as a compact disk (CD) read only memory (ROM) (22), is connected to the personal computer, and the computer and the simulated book are connected by an infrared (IR) data communications link including IR transceivers (26, 48). The simulated book includes a display screen (34) and a microprocessor (43) with memory (44, 46). The microprocessor is programmed for storing data received and decoded by its IR transceiver (48) in memory (46) and responsive to user input for displaying a page of data on the display screen. In addition, the microprocessor is programmed to cause its IR transceiver (48) to transmit to the IR transceiver (26) connected to the personal computer (10) a data request command, and the personal computer is in turn programmed to transmit data from the CD ROM (22) to the simulated book (30). Data can be loaded in the simulated book and accessed at a later time when out of the proximity of the personal computer.

251 citations


Journal ArticleDOI
TL;DR: A new periodontal probing system has been developed which incorporates the advantages of constant probing force, precise electronic measurement to 0.1 mm and computer storage and analysis of the data which facilitates detecting changes in pocket depth and attachment level by rapidly comparing data recorded at different visits.
Abstract: A new periodontal probing system has been developed which incorporates the advantages of constant probing force, precise electronic measurement to 0.1 mm and computer storage of the data. The system includes a probe handpiece, displacement transducer with digital readout, foot switch, computer interface and personal computer. A unique movable arm design enables the probe handpiece to maintain smooth operation and makes it easy to clean and sterilize. Electronic recording of the data (actuated by pressing a foot switch) eliminates errors which occur when probe tip markings are read visually and the data are called to an assistant. Computer storage and analysis of the data facilitates detecting changes in pocket depth and attachment level by rapidly comparing data recorded at different visits. The system was evaluated in 3 experiments using a 0.4 mm diameter tip and a 25 g probing force. The standard deviation of repeated pocket depth measurement was less (0.58 mm versus 0.82 mm) than that of a common probe. With paired readings referenced to an occlusal stent, the standard deviation of repeated attachment level measurements was 0.28 mm. A loss of attachment level was detected to a certainty of 99% with less than a 1 mm change. This is a significant improvement over common probes, which require a 2-3 mm change for equivalent positive identification of change in attachment level.

234 citations
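The claimed 99% detection of an under-1-mm attachment change follows from the reported 0.28 mm standard deviation if measurement error is assumed normally distributed; a quick check of that arithmetic:

```python
from scipy.stats import norm

sd = 0.28                         # reported SD of repeated attachment-level readings (mm)
threshold = norm.ppf(0.99) * sd   # one-sided 99% criterion, normality assumed
print(f"{threshold:.2f} mm")      # ~0.65 mm, comfortably under 1 mm
```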



Journal ArticleDOI
TL;DR: Variations of the boxplot are suggested in which the sides of the box are used to convey information about the density of the values in a batch; although the variations are computer-intensive, they were implemented on a desktop personal computer in a way designed to keep their ease of computation by computer.
Abstract: Variations of the boxplot are suggested, in which the sides of the box are used to convey information about the density of the values in a batch. The ease of computation by hand of the original boxplot had to be sacrificed, as the variations are computer-intensive. Still, the plots were implemented on a desktop personal computer (Apple Macintosh), in a way designed to keep their ease of computation by computer. The result is a dynamic display of densities and summaries.

197 citations
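The paper's exact variants are not reproduced here, but the idea of shaping the box sides by a density estimate survives in today's violin plot; a rough modern analogue, shown next to the classical boxplot for comparison:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1988)
batch = rng.normal(size=300)              # illustrative batch of values

fig, (ax1, ax2) = plt.subplots(1, 2, sharey=True)
ax1.boxplot(batch)                        # classical boxplot: no density shown
ax2.violinplot(batch, showmedians=True)   # sides shaped by estimated density
ax1.set_title("boxplot")
ax2.set_title("density-shaped variant")
plt.show()
```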


Patent
18 Mar 1988
TL;DR: In this paper, a data processing network in which a plurality of workstations such as personal computers (PC) are connected to a host processor (11) and contain PC programs at particular levels for controlling tasks on the personal computers.
Abstract: A data processing network in which a plurality of workstations such as personal computers (PC) (12) are connected to a host processor (11) and contain PC programs at particular levels for controlling tasks on the personal computers. The personal computers send a signal to the host when they are about to conduct a task indicating which data files they have and at what level. The host has an object library containing a copy of each disk file for each version of PC program. The host determines if the personal computer has the latest level data file for the version of program at that personal computer and, if it does not, sends a copy of the latest level data file to the personal computer to replace the down-level data file. Preferably, the host checks if the personal computer has all the data files it needs for the particular task and, if it does not and if it should because of the version of PC program stored at the PC, the host can load a copy of the missing file to the personal computer. The host does not have to contain a record of all the levels of all the programs of all the personal computers connected to it. However, by updating the object library in the host, this arrangement ensures that all personal computers connected to it are automatically brought to the latest authorized level of data files as are required by the personal computer.

173 citations
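The host-side decision the patent describes reduces to comparing the levels a workstation reports against an object library and shipping anything missing or down-level. A minimal sketch of that logic follows; all names and data structures are invented for illustration, since the patent gives no identifiers.

```python
# Hypothetical host-side check: names and structures are illustrative only.
object_library = {
    # (program_version, file_name) -> (latest_level, file_contents)
    ("2.1", "TASKDATA.DAT"): (5, b"...level 5 contents..."),
    ("2.1", "CONFIG.DAT"):   (3, b"...level 3 contents..."),
}

def files_to_send(program_version, reported_levels):
    """reported_levels: {file_name: level} sent by the PC before a task.
    Returns the copies the PC needs: down-level files and missing ones."""
    updates = {}
    for (version, name), (latest, contents) in object_library.items():
        if version != program_version:
            continue
        if reported_levels.get(name, -1) < latest:   # missing or down-level
            updates[name] = (latest, contents)
    return updates

print(files_to_send("2.1", {"TASKDATA.DAT": 5, "CONFIG.DAT": 1}).keys())
```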


Journal ArticleDOI
TL;DR: The general program flow consists of calculation of reaction coefficients among phases in the selected chemical systems and computation of the equilibrium position for each reaction curve, whereby each point on the curve is tested for metastability against all other phases.

Journal ArticleDOI
TL;DR: In this article, a dynamic, one-dimensional, unsteady lake water quality simulation model is described, which is intended primarily for lake eutrophication studies and control strategies.

Journal ArticleDOI
02 Oct 1988
TL;DR: The magnitude and symmetric optimization techniques are two related methods for designing optimal linear control systems in the frequency domain used in industry for some time, yet they do not appear to have been well documented in the English control engineering literature.
Abstract: The magnitude and symmetric optimization techniques are two related methods for designing optimal linear control systems in the frequency domain. Both methods have been used in industry for some time, yet they do not appear to have been well documented in the English control engineering literature. The development of the criterion is examined, including general solutions for typical controllers. As a basis for comparison, time-domain response optimization techniques such as minimizing the integral time absolute error, along with state-space techniques, i.e. pole placement, are briefly reviewed. Personal computer simulation results for a normalized position control system are given to illustrate these methods.
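For reference, these are the PI tuning rules usually associated with the two criteria, in their standard textbook forms; this is an assumption for illustration, not a quotation of the paper's own general solutions.

```python
# Standard textbook tuning rules associated with the magnitude optimum and
# symmetric optimum criteria (generic forms; not taken from the paper itself).

def magnitude_optimum_pi(Ks, T1, Tsigma):
    """PI gains for a plant Ks / ((1 + T1 s)(1 + Tsigma s)), with T1 >> Tsigma."""
    Ti = T1                          # integral time cancels the slow pole
    Kp = T1 / (2.0 * Ks * Tsigma)
    return Kp, Ti

def symmetric_optimum_pi(Ks, Tsigma):
    """PI gains for an integrating plant Ks / (s (1 + Tsigma s))."""
    Ti = 4.0 * Tsigma
    Kp = 1.0 / (2.0 * Ks * Tsigma)
    return Kp, Ti

print(magnitude_optimum_pi(Ks=2.0, T1=0.5, Tsigma=0.01))   # illustrative plant
print(symmetric_optimum_pi(Ks=2.0, Tsigma=0.01))
```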

Patent
13 Oct 1988
TL;DR: In this paper, a suspension system for a personal computer or monitor comprising a carriage in which the monitor or computer is mounted, the carriage being supported from above by a frictionally secured swivel and tilt mechanism in turn attached to a pivotable and rotatable support arm.
Abstract: A suspension system for a personal computer or monitor comprising a carriage in which the monitor or computer is mounted, the carriage being supported from above by a frictionally secured swivel and tilt mechanism in turn attached to a pivotable and rotatable support arm. The support arm is balanced by an adjustable pneumatic pressure cylinder, and is mounted on a roller assembly such that it may be carried along a path defined by a track assembly, thereby permitting the monitor or computer to be transported between first and second positions, as well as simultaneously being raised or lowered vertically, tilted, rotated, or swiveled. The track assembly may be attached to the underside of a shelving unit and incorporated into a modular partition system, or attached to a freestanding frame.

Patent
16 Mar 1988
TL;DR: In this article, a system and methods are disclosed for monitoring a home TV viewing system which may include a TV, a VCR and one or more cable converters, and the system includes probe/detector devices which obtain signals related to the frequency to which the TV, VCR, and cable converter are tuned, and a higher level processor which receives and processes those signals to identify channel tuning.
Abstract: A system and methods are disclosed for monitoring a home TV viewing system which may include a TV, a VCR and one or more cable converters. The system obtains information for identifying the source of video displayed or being recorded, i.e., off-air antenna, satellite antenna, cable converter tuner, VCR, personal computer, video game, etc. The system also obtains information identifying the video path of the video being displayed and/or recorded. The system "fingerprints" video recorded by the VCR so that the played back video may be identified as having been previously recorded. The system may record the date and time of recording and the video source of the video being recorded. The system includes probe/detector devices which obtain signals related to the frequency to which the TV, VCR, and cable converter are tuned, and a higher level processor which receives and processes those signals to identify channel tuning of the TV, VCR, and cable converter. The probe/detector monitoring the VCR includes a lower level processor and circuitry which fingerprint a video signal being recorded by the VCR. The system generates timing signals to record and detect fingerprints in the vertical blanking interval of the TV signal. The system has multi-level processing, and may be programmed to include a number of down-loadable and up-loadable parameters. The system also includes an alphanumeric display and data entry unit for each TV being monitored, and provides for interactive information entry by TV viewers including guests and by an installer.

01 Nov 1988
TL;DR: A modular image analysis system is presented consisting of a personal computer equipped with a real time video digitizer, an interactive control unit and a graphic tablet used for image enhancement, measurement of morphological parameters and image brightness as well as determination of vessel diameter and blood flow velocity.
Abstract: A modular image analysis system is presented consisting of a personal computer equipped with a real time video digitizer, an interactive control unit and a graphic tablet. Together with the corresponding software modules this system can be used for a number of image processing procedures in microcirculatory research including image enhancement, measurement of morphological parameters and image brightness as well as determination of vessel diameter and blood flow velocity.

Patent
06 Oct 1988
TL;DR: A hand held unit can operate as a mouse to move a cursor on a display, as a hand held optical scanner for entering into a computer characters or graphic information delineated on a work sheet across which the unit is moved, or as a digitizing puck for tracing and digitizing lines or curves on a paper work sheet laid over a digitizing pad, as discussed by the authors.
Abstract: A hand held unit can operate as a mouse to move a cursor on a display, as a hand held optical scanner for entering into a computer characters or graphic information delineated on a work sheet across which the unit is moved, or as a digitizing puck for tracing and digitizing lines or curves on a work sheet laid over a digitizing pad. The unit is connected via an electrical cable to an interface board plugged into an expansion slot inside a personal computer.

Patent
14 Oct 1988
TL;DR: In this article, the authors present a method and apparatus for high-speed reproduction of a customized group of audio programs on a slave medium enabling the reproduction of musical recordings at a point-of-sale terminal.
Abstract: Method and apparatus for high-speed reproduction of a customized group of audio programs on a slave medium enables the reproduction of musical recordings at a point-of-sale terminal. Master programs are prerecorded by a Dolby ADM (adaptive delta modulated) technique to condense their "information" content and permit higher data throughput during high-speed reproduction. In a "premastering" process to make encoded masters, a special multiplexer board provides interfacing between Dolby ADM digital audio data and a conventional data processing system. The multiplexer board performs computer-like data blocking and also writes a unique sync code directly in the data block. The data processing system also catalogs and edits the blocked digital audio; places encrypted catalog, pricing, and other indicia in the data file representing the music; and sends the data file to a conventional 16-bit PCM file writer for making encoded CD music ROMS which contain the ADM data representing the encoded audio program. A high-speed reproduction device, under control of a desk top personal computer, extracts timing and ADM data from the CD music ROM at high speed and controls high-speed tape drives to compile selected recordings on a single cassette tape in a form compatible with Dolby B noise reduction.

Patent
09 Feb 1988
TL;DR: In this paper, a computerized time clock system includes a personal computer via which employee, job and schedule records may be assembled and maintained, in order to validate and record time-in and time-out transactions executed by employees.
Abstract: The computerized time clock system includes a personal computer via which employee, job, and schedule records may be assembled and maintained. A computerized time clock communicates with the personal computer and receives employee and scheduling data therefrom, in order to validate and record time-in and time-out transactions executed by employees. Current time records are maintained in the memory of the time clock and at the end of each day are transmitted to the personal computer for addition to permanent disk records, including a record of each time-in and time-out transaction for an extended period. Sales records may also be maintained in the personal computer (20), for example the quantity of liquor or food served by a particular employee or in a particular department; this sales information may be correlated with labor costs found in the permanent time records.

Patent
08 Aug 1988
TL;DR: In this article, an aircraft collision avoidance system providing warning and avoidance manuevers for all fixed and moving obstructions that threaten the safe navigation of the host aircraft is presented, which is effective against threatening aircraft, runway maintenance vehicles and prominent geographic obstructions such as radio towers and mountain peaks.
Abstract: An aircraft collision avoidance system providing warning and avoidance manuevers for all fixed and moving obstructions that threaten the safe navigation of the host aircraft. The system is effective against threatening aircraft, runway maintenance vehicles and prominent geographic obstructions such as radio towers and mountain peaks. It is an economical combination of basic telemetry equipment (transmitter/receiver) and current personal computer components configured to broadcast its host location and intended movement while simultaneously receiving the same information from all nearby similarly equipped stations, either air or ground. Maximum effectiveness is attained when data is available from the Global Positioning System but alternative sources of navigational information including dead reckoning are provided for. Althrough intended primarily for aviation use, the same technology and concepts are valid for the safe transit of ships and railway equipment.

Journal ArticleDOI
TL;DR: An implementation of an algorithm which uses the factoring theorem, in conjunction with degree-1 and degree-2 vertex reductions, to determine the reliability of a network is presented.
Abstract: The factoring theorem is a simple tool for determining the K-terminal reliability of a network, i.e. the probability that a given set K of terminals in the network are connected to each other by a path of working edges. An implementation of an algorithm which uses the factoring theorem, in conjunction with degree-1 and degree-2 vertex reductions, to determine the reliability of a network is presented. Networks treated have completely reliable nodes and have edges which fail statistically and independently with known probabilities. The reliability problem is to determine the probability that all nodes in a designated set of nodes can communicate with each other. Such an implementation of the factoring theorem can be incorporated in a small, stand-alone program of about 500 lines of code. A program of this type requires little computer memory and is ideally suited for microcomputer use.
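A bare-bones version of the factoring recursion is easy to state; the sketch below omits the paper's degree-1 and degree-2 reductions, which only reduce the number of recursive calls, not the result.

```python
def k_terminal_reliability(edges, terminals):
    """K-terminal reliability by the factoring theorem.
    edges: list of (u, v, p) with independent edge reliability p;
    nodes are assumed perfectly reliable, as in the paper."""
    if len(terminals) <= 1:
        return 1.0            # all terminals merged: connected with certainty
    if not edges:
        return 0.0            # terminals still separated and no edges left
    (u, v, p), rest = edges[0], edges[1:]
    # Edge works (probability p): contract it, merging node v into node u.
    contracted = [(u if a == v else a, u if b == v else b, q)
                  for a, b, q in rest
                  if (u if a == v else a) != (u if b == v else b)]
    t_contracted = frozenset(u if t == v else t for t in terminals)
    return (p * k_terminal_reliability(contracted, t_contracted)
            + (1 - p) * k_terminal_reliability(rest, terminals))

# Example: 4-node bridge network, each edge working with probability 0.9.
edges = [(1, 2, .9), (1, 3, .9), (2, 3, .9), (2, 4, .9), (3, 4, .9)]
print(k_terminal_reliability(edges, frozenset({1, 4})))  # 2-terminal case: 0.97848
```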

Journal ArticleDOI
TL;DR: The hydrolysis and fermentation of insoluble cellulose (Avicel) by continuous cultures of Ruminococcus albus 7 were studied; the constant product yield indicated that product formation was growth linked.
Abstract: The hydrolysis and fermentation of insoluble cellulose (Avicel) by continuous cultures of Ruminococcus albus 7 were studied. An anaerobic continuous culture apparatus was designed which permitted gas collection, continuous feeding, and wasting at different retention times. The operation of the apparatus was controlled by a personal computer. Cellulose destruction ranged from ca. 30 to 70% for hydraulic retention times of 0.5 to 2.0 days. Carbon recovery in products was 92 to 97%, and the oxidation-reduction ratios ranged from 0.91 to 1.15. The total product yield (biomass not included) per gram of cellulose (expressed as glucose) was 0.83 g g−1, and the ethanol yield was 0.41 g g−1. The product yield was constant, indicating that product formation was growth linked.

Journal ArticleDOI
Roy McWeeny1
TL;DR: In this paper, the classical valence bond theory is recast in a spin-free form, which provides a practicable route to ab initio calculations of molecular electronic structure and requires only efficient algorithms for the generation and processing of permutations and the handling of Rumer diagrams.
Abstract: Classical valence bond theory is recast in a spin-free form which provides a practicable route to ab initio calculations of molecular electronic structure. The approach is simple and direct and requires only efficient algorithms for the generation and processing of permutations and the handling of Rumer diagrams: it makes modest demands on computing power and pilot calculations have indeed been performed entirely within the fast memory of a personal computer, which should be sufficient for dealing with systems possessing up to 10 electrons outside a closed shell. Simple applications confirm the conclusion of Cooper et al. [1] that, by using strongly overlapping orbitals, a small number of classical (nonpolar) structures can give results close to those obtained in a “full CI” calculation.
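For a singlet coupling of 2n orbitals, the linearly independent classical structures correspond to the non-crossing pairings of 2n points on a circle (Rumer diagrams), and generating them is a short recursion; a minimal sketch, with counts following the Catalan numbers:

```python
def rumer_diagrams(orbitals):
    """All non-crossing perfect pairings (Rumer diagrams) of an even-length
    tuple of orbital labels placed in order around a circle."""
    if not orbitals:
        return [[]]
    first = orbitals[0]
    diagrams = []
    # The first orbital may only pair so that both remaining arcs hold an
    # even number of orbitals; otherwise some bond would have to cross.
    for i in range(1, len(orbitals), 2):
        inside, outside = orbitals[1:i], orbitals[i + 1:]
        for d_in in rumer_diagrams(inside):
            for d_out in rumer_diagrams(outside):
                diagrams.append([(first, orbitals[i])] + d_in + d_out)
    return diagrams

print(rumer_diagrams((1, 2, 3, 4)))   # 2 diagrams for 4 orbitals (Catalan C_2)
# [[(1, 2), (3, 4)], [(1, 4), (2, 3)]]
```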

Journal ArticleDOI
TL;DR: This inexpensive method for fully automated amino acid analysis combines the advantages of automated precolumn derivatization with o-phthaldialdehyde and favorable analytical conditions to separate and quantify 30 amino acids found in normal plasma.
Abstract: This inexpensive method for fully automated amino acid analysis combines the advantages of automated precolumn derivatization with o-phthaldialdehyde and favorable analytical conditions to separate and quantify 30 amino acids found in normal plasma. The system can run unattended for almost four days, during which the data are processed automatically by a personal computer and a maximum of 76 samples and 19 standards can be processed (cycle time per analysis: 55 min). Only 1 microL of deproteinized plasma is required per analysis. Coefficients of variation for retention times and areas measured for all relevant amino acids are less than 1% and 3%, respectively. The system described is well suited for quick, sensitive operation in daily practice.

Journal ArticleDOI
TL;DR: In this paper, the (1 × 2) missing-row reconstruction of clean Pt(110) is studied with a new low-energy electron diffraction (LEED) intensity analysis.

Patent
03 Aug 1988
TL;DR: In this paper, a personal computer with an A/D converter and a hard disk drive was used to record the electroencephalographic (EEG) activity of a subject during a commercial and to record event related potentials (ERP) during commercial evaluation sequences subsequent to the commercial.
Abstract: The present invention uses a personal computer (180) with an A/D converter (184) and a hard disk drive (182) to record the electroencephalographic (EEG) activity of a subject (192) during a commercial and to record event-related potentials (ERP) during commercial evaluation sequences subsequent to the commercial. The EEG is analyzed by a signal processing computer (205) for alpha and beta frequency amplitude content to determine attention and cognition of the commercial. Different commercials for the same product are compared using overall attention and cognition ratings. The ERP is analyzed to determine the amplitude and latency of the ERP potentials produced by stimulus events in the evaluation sequences. The ERPs are filtered and the peak amplitudes and latencies measured. The amplitude and/or latency determines the understanding of the commercial, the value of the product, the intent to buy the product, and the memory of the product. By computing overall results for each commercial, different commercials can be compared. The effectiveness of the commercial can be compared to a reference product and a well-known local price, providing additional information concerning the product to the advertiser.
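The alpha/beta analysis step amounts to estimating spectral power in the 8–13 Hz and 13–30 Hz bands of the sampled EEG. A minimal sketch using a Welch estimator follows; the sampling rate, band edges, and synthetic signal are conventional stand-ins, not values from the patent.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
eeg = rng.standard_normal(30 * fs)        # stand-in for one EEG channel

f, psd = welch(eeg, fs=fs, nperseg=2 * fs)

def band_power(f, psd, lo, hi):
    m = (f >= lo) & (f < hi)
    return float(np.sum(psd[m]) * (f[1] - f[0]))  # integrate PSD over the band

alpha = band_power(f, psd, 8.0, 13.0)     # alpha band amplitude content
beta = band_power(f, psd, 13.0, 30.0)     # beta band amplitude content
print(alpha, beta, beta / alpha)
```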

Journal ArticleDOI
TL;DR: In this article, a computer-controlled four-detector photopolarimeter (FDP) was constructed using four windowless planar-diffused Si photodiodes, operational amplifiers, an analog-to-digital converter, and a personal computer with peripherals.
Abstract: A computer‐controlled four‐detector photopolarimeter (FDP) has been constructed using four windowless planar‐diffused Si photodiodes, operational amplifiers, an analog‐to‐digital (A/D) converter, and a personal computer with peripherals. A nonplanar light path is selected with incidence angles at the first three detectors of ∼65° and with rotations of ∼45° between the successive planes of incidence. The last detector, which is coated for minimum reflectance, intercepts the beam at a small angle and the residual light it reflects is dumped. A 1‐mW He–Ne laser beam (λ=632.8 nm) passes through the polarizing optics of an ellipsometer to provide the polarization states needed for calibration and testing. With an optimum set of calibration states, the instrument matrix A is determined. The FDP is subsequently tested and found to correctly measure the normalized Stokes parameters of a large number of states with an average absolute error of ∼0.01, which is attributed to imperfections in the calibration optics. This first prototype instrument has a precision of ∼0.2%.
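Once the 4×4 instrument matrix A has been calibrated, recovering a Stokes vector from the four detector readings is a single linear solve: i = A·s, so s = A⁻¹·i. A sketch with an invented, well-conditioned A; a real FDP determines A from measurements of known polarization states.

```python
import numpy as np

# Hypothetical calibrated instrument matrix; values are illustrative only.
A = np.array([[0.40,  0.10,  0.05,  0.02],
              [0.30, -0.12,  0.08,  0.05],
              [0.20,  0.05, -0.10,  0.07],
              [0.10,  0.02,  0.04, -0.08]])

i = np.array([0.45, 0.25, 0.20, 0.09])   # four photodiode readings
s = np.linalg.solve(A, i)                # Stokes vector (S0, S1, S2, S3)
print(s)   # divide by S0 to obtain the normalized Stokes parameters
```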

Journal Article
TL;DR: The model consists of an additive procedure in which the decision to incorporate a route in the network or to increase the frequency of a route is based on an economic criterion which can be regarded as an estimate of the Lagrange Multiplier of the optimization problem.
Abstract: This paper describes the major features of an optimization model which can be used to design public transport networks. Design problems that can be solved with the model involve the redesign of either a part of a network or a complete network and the assignment of frequencies. The model consists of an additive procedure in which the decision to incorporate a route in the network or to increase the frequency of a route is based on an economic criterion which can also be regarded as an estimate of the Lagrange Multiplier of the optimization problem. A major advantage of the model is that the different design problems are solved with one single optimization process. Furthermore, the optimization process is kept understandable and the model is suited for use on a personal computer. Some results of the model are presented.

Journal ArticleDOI
TL;DR: In patients with mild retinal damage in whom other tests of visual function are normal, this method of testing color vision shows specific increases in contrast thresholds along tritan color-confusion lines.
Abstract: We report a method for computer enhancement of color vision tests. In our graphics system 256 colors are selected from a much larger range and displayed on a screen divided into 768 × 288 pixels. Eight-bit digital-to-analogue converters drive a high quality monitor with separate inputs to the red, green, and blue amplifiers and calibrated gun chromaticities. The graphics are controlled by a PASCAL program written for a personal computer, which calculates the values of the red, green, and blue signals and specifies them in Commission Internationale de l'Eclairage X, Y, and Z fundamentals, so changes in chrominance occur without changes in luminance. The system for measuring color contrast thresholds with gratings is more than adequate in normal observers. In patients with mild retinal damage in whom other tests of visual function are normal, this method of testing color vision shows specific increases in contrast thresholds along tritan color-confusion lines. By the time the Hardy-Rand-Rittler and Farnsworth-Munsell 100-hue tests disclose abnormalities, gross defects in color contrast threshold can be seen with our system.
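Holding luminance constant while changing chrominance is direct in XYZ coordinates: fix Y, recompute X and Z from the target chromaticity (x, y), and map to gun signals through the monitor matrix. The sketch below uses the sRGB matrix as a stand-in for the paper's calibrated gun chromaticities (an assumption; the original system measured its own monitor).

```python
import numpy as np

# sRGB XYZ-to-linear-RGB matrix, used here as a stand-in for the paper's
# calibrated monitor primaries (assumption; not from the original system).
XYZ_TO_RGB = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])

def linear_rgb(x, y, Y):
    """Linear gun drive for chromaticity (x, y) at fixed luminance Y."""
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    return XYZ_TO_RGB @ np.array([X, Y, Z])

# Two stimuli differing in chrominance only: the luminance Y is identical.
print(linear_rgb(0.310, 0.330, 0.40))
print(linear_rgb(0.300, 0.320, 0.40))
```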

Journal ArticleDOI
TL;DR: A microcomputer program to calculate the cholesterol saturation of bile is described, designed to accept most of the conventional concentration units for bile salts, phospholipid, and cholesterol.