
Showing papers in "IEEE Potentials in 2004"


Journal ArticleDOI
TL;DR: In this article, the performance of power lines for high-bit-rate applications is studied. The authors focus on power line communication over a network originally devised to transmit electric power from a small number of sources to a large number of sinks; the first data transmissions over power lines were done primarily to protect sections of the power distribution system in case of faults.
Abstract: This article overviews power line communication (PLC), which uses a network originally devised to transmit electric power from a small number of sources to a large number of sinks. Initially, data transmissions over power lines were done primarily to protect sections of the power distribution system in case of faults. This paper also studies the performance of power lines for high-bit-rate applications. The DMT technology adopted by the HomePlug standard can theoretically provide data rates of 100 Mb/s; however, products based on the standard have only achieved data rates of up to 14 Mb/s. To protect against the severe noise and fading in the power line channel, very high levels of error control coding must be provided. The efforts of the HomePlug Alliance and the growth of home networking technology in the US portend a very bright future for DMT-based PLC home networking.
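
DMT (discrete multitone) splits the harsh power line channel into many narrowband subcarriers that are modulated independently, so heavily faded tones can carry fewer bits while clean tones carry more. A minimal sketch of the modulator idea follows; the subcarrier count, cyclic-prefix length, and 4-QAM mapping are illustrative assumptions, not parameters of the HomePlug standard.

```python
import numpy as np

# Illustrative parameters -- not taken from the HomePlug specification
N_SUBCARRIERS = 64     # number of DMT subcarriers
CYCLIC_PREFIX = 16     # guard samples to absorb channel echoes

def dmt_modulate(qam_symbols):
    """Map one block of QAM symbols onto subcarriers via an IFFT.

    A Hermitian-symmetric spectrum is built so the time-domain
    signal is real-valued, as a baseband power line signal must be.
    """
    assert len(qam_symbols) == N_SUBCARRIERS - 1
    spectrum = np.zeros(2 * N_SUBCARRIERS, dtype=complex)
    spectrum[1:N_SUBCARRIERS] = qam_symbols
    spectrum[N_SUBCARRIERS + 1:] = np.conj(qam_symbols[::-1])
    block = np.fft.ifft(spectrum).real
    return np.concatenate([block[-CYCLIC_PREFIX:], block])  # prepend cyclic prefix

# Example: transmit one block of 4-QAM symbols
bits = np.random.randint(0, 2, size=(N_SUBCARRIERS - 1, 2))
symbols = (2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)
tx = dmt_modulate(symbols)
```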

299 citations


Journal ArticleDOI
TL;DR: The most promising features are joint lossless compression, joint encryption and hiding based on the SCAN language, which analyzes the unique properties of digital images and video and searches for high-security algorithms to reduce the overall computational cost.
Abstract: This paper describes the most representative algorithms and standards for the encryption of data, digital images and MPEG video. The general model of a typical encryption/decryption system and its security principles are discussed. Data encryption is mainly the scrambling of the content of data such as text, images, audio, and video to make the data unreadable, invisible or incomprehensible during ciphertext transmission. The goal is to protect the content of the data against attackers. The reverse of data encryption is data decryption, which recovers the original data. There are two types of encryption/decryption key systems: the public-key system and the private-key system. The most promising features are joint lossless compression, joint encryption and hiding based on the SCAN language, which analyzes the unique properties of digital images and video and searches for high-security algorithms to reduce the overall computational cost.
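
As a concrete round trip through the private-key (symmetric) model described above, here is a minimal sketch using the Fernet recipe from the Python cryptography package; the library choice is ours for illustration and is not discussed in the paper.

```python
from cryptography.fernet import Fernet

# Private-key model: one shared secret both encrypts and decrypts,
# so the key itself must be exchanged over a secure channel.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"content to protect against attackers"
ciphertext = cipher.encrypt(plaintext)    # unreadable during transmission
recovered = cipher.decrypt(ciphertext)    # data decryption: the reverse step
assert recovered == plaintext
```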

111 citations


Journal ArticleDOI
TL;DR: This article focuses on how to achieve the best possible data results from sensor network applications and setups for traffic/highway goals, and discusses a method that optimizes the tradeoff between energy and accuracy using a variety of activation policies.
Abstract: Wireless sensor networks have been used for a variety of applications. However, many highway and traffic applications remain untapped, primarily sensor networks running highway and traffic algorithms that alleviate generic problems such as highway congestion. This is because sensor network technology is a very recent development; since sensor networks are relatively new, not many applications have been explored in depth. Utilizing the new generation of TinyOS mica-board motes, miniaturized sensors that run TinyOS, an event-based operating environment written in code similar to stylized C, the article focuses on how to achieve the best possible data results from sensor network applications and setups for traffic/highway goals. How to use sensor-network graphs for optimal placement of sensors in a network, so as to minimize work and to achieve the best possible, most accurate signal-strength localization measurements, is also a primary focus. Also discussed is a method that optimizes the tradeoff between energy and accuracy using a variety of activation policies. Finally, simulations and distancing experiments on indoor and outdoor data are provided to encourage similar sensor work.
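
To make the energy/accuracy tradeoff concrete, consider a duty-cycling activation policy: waking a larger fraction of motes improves the fused estimate but drains batteries faster. The policy, sensor model and numbers below are a hypothetical sketch of that idea, not the article's experiments.

```python
import random

def duty_cycle_policy(sensors, fraction):
    """Activation policy: wake a random fraction of motes this round."""
    k = max(1, int(fraction * len(sensors)))
    return random.sample(sensors, k)

def fuse_readings(active):
    """Average the active readings; more motes -> lower variance."""
    return sum(read() for read in active) / len(active)

# Hypothetical noisy motes observing a true roadway count of 40 vehicles
sensors = [lambda: 40 + random.gauss(0, 5) for _ in range(20)]
for fraction in (0.1, 0.5, 1.0):   # energy spent grows with the fraction awake
    estimate = fuse_readings(duty_cycle_policy(sensors, fraction))
    print(f"fraction awake = {fraction:.1f}, estimate = {estimate:.1f}")
```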

89 citations


Journal ArticleDOI
TL;DR: The Rijndael algorithm is the new advanced encryption standard (AES) approved by the US National Institute of Standards and Technology (NIST) and it is believed that almost all US government agencies will shift to the AES algorithm for their data security needs in the next few years.
Abstract: The Rijndael algorithm is the new advanced encryption standard (AES) approved by the US National Institute of Standards and Technology (NIST). With this algorithm supporting significantly larger key sizes than DES (data encryption standard) supports, NIST believes that the AES has the potential to remain secure for the next few decades. In overall performance, based on the speed of encryption and decryption and on key set-up time, the Rijndael algorithm attained top scores in tests conducted by NIST. The belief is that almost all US government agencies will shift to the AES algorithm for their data security needs in the next few years, and that the algorithm will find its way into smart cards and other security-oriented applications used for safely storing private information about individuals.
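
A minimal AES round trip with a 256-bit key, one of the larger key sizes Rijndael supports, can be sketched with the Python cryptography package; the AES-GCM mode and all values here are our illustrative choices, not part of the NIST evaluation.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit Rijndael/AES key
aes = AESGCM(key)

nonce = os.urandom(12)                      # must be unique per message
ciphertext = aes.encrypt(nonce, b"private record", None)
plaintext = aes.decrypt(nonce, ciphertext, None)
assert plaintext == b"private record"
```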

81 citations


Journal ArticleDOI
TL;DR: Two important topics in disaster recovery are covered, risk management and disaster recovery planning, which can play a major role in a company's survival/success.
Abstract: In today's world, where fears of "what if" grow daily, information technology (IT) professionals are planning for those possible disasters. According to a 2003 article found on ComputerWorld's web site, nine out of ten IT leaders surveyed had already cemented a disaster recovery plan or would have one in place within the year. Planning for a disaster may seem odd at first, but it is a smart choice for anyone who wants to protect a valuable asset. For instance, just as you would not carelessly store paper money next to the fireplace, the same intelligence applies to data and computers. A catastrophe is anything that threatens the function or existence of a business, ranging from a computer virus to a huge earthquake. A well-thought-out disaster recovery plan can play a major role in a company's survival/success. Disaster recovery covers a broad range of topics and includes practically everyone in an organization. Every employee, manager and janitor alike, must be on the same page when a disaster occurs. The support of all the management teams is also necessary. This article covers two important topics in disaster recovery: risk management and disaster recovery planning.

66 citations


Journal ArticleDOI
TL;DR: This paper overviews software-defined radio (SDR), also called software radio (SR), which refers to wireless communication in which the transmitter modulation is generated or defined by a computer.
Abstract: This paper overviews software-defined radio (SDR), also called software radio (SR), which refers to wireless communication in which the transmitter modulation is generated or defined by a computer. The receiver then also uses a computer to recover the signal intelligence. SDR is an enabling technology that is useful in a wide range of areas within wireless systems. The primary goal of SDR is to replace as many analog components and hardwired digital VLSI devices of the transceiver as possible with programmable devices. This technology is receiving enormous recognition and generating widespread interest in the telecommunication industry. The SDR Forum is an international, nonprofit organization that includes members from academia, the military, vendors, wireless service providers, and regulatory bodies. SDR has been described as the cornerstone in the evolution of GSM.
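
The defining idea, modulation generated or defined in software rather than in fixed hardware, fits in a few lines. The BPSK modulator below is our illustration of that premise, not an example from the paper; swapping the function retargets the transmitter with no hardware change.

```python
import numpy as np

def bpsk_modulate(bits, fc=1_000.0, fs=48_000.0, baud=100.0):
    """Generate a BPSK waveform entirely in software.

    Replacing this one function (say, with QPSK or FSK) redefines
    the transmitter modulation -- the core SDR premise.
    """
    samples_per_bit = int(fs / baud)
    symbols = np.repeat(2 * np.asarray(bits) - 1, samples_per_bit)
    t = np.arange(symbols.size) / fs
    return symbols * np.cos(2 * np.pi * fc * t)

waveform = bpsk_modulate([1, 0, 1, 1, 0])
```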

62 citations


Journal ArticleDOI
TL;DR: The main issues and parameters that a PCB designer has to consider and analyze before a board layout is created are presented, along with first-order approximation equations for various parameters based on the geometry of the PCB traces.
Abstract: Current high-speed PCB (printed circuit board) designs need extra care due to the frequency of operation and reduced rise time signals. We present the main issues and parameters that a PCB designer has to consider and analyze before a board layout is created. First order approximation equations for various parameters are presented, based on the geometry of the PCB traces. Some useful design practices are also mentioned. As the speed of operation increases, the variables that are neglected in the lower frequency/higher rise time situation become more significant. Such parameters increase the complexity of the design. Three-dimensional analysis becomes a must to calculate and model interconnects accurately. This is where field solvers and the role of the signal integrity engineer come into play.
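
As one example of such first-order approximations, the classic IPC-style formula estimates a surface microstrip's characteristic impedance directly from trace geometry. The stackup numbers below are hypothetical; this is a rough design aid, not a substitute for the field solvers mentioned above.

```python
import math

def microstrip_z0(h, w, t, er):
    """First-order surface microstrip impedance (ohms).

    Classic IPC-style approximation, reasonable for roughly
    0.1 < w/h < 2 and 1 < er < 15.
    h: dielectric height, w: trace width, t: trace thickness
    (any one consistent unit); er: relative permittivity.
    """
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

# Hypothetical FR-4 stackup: 10 mil dielectric, 16 mil trace, 1.4 mil copper
print(f"Z0 = {microstrip_z0(10, 16, 1.4, 4.5):.1f} ohms")   # ~51 ohms
```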

39 citations


Journal ArticleDOI
TL;DR: Several algorithms that have been experimentally used to detect commercials, as well as devices that are currently available for this purpose, are discussed.
Abstract: TV advertising commercials are a critical marketing tool for many companies. Their interspersion within regular broadcast television programming can be entertaining, informing, annoying or a sales gold mine depending on one's viewpoint. As a result, there are two major reasons for being able to detect commercial segments within television broadcasts. These two applications' goals - at least indirectly - are at odds with each other. One application seeks to identify and track when specific commercials are broadcast. Advertisers, in particular, like to verify that their contracts with broadcasters are fulfilled as promised. The other group wants to detect commercials for the purpose of eliminating them from their recordings. This group consists of viewers who want to watch their recorded television shows without the annoyance of commercials. Video database maintainers would also appreciate the ability to automatically edit out commercials in stored shows and thereby decrease storage requirements. Of course, advertisers are strongly opposed to such devices because that defeats the purpose of the commercials. The article discusses several algorithms that have been experimentally used to detect commercials, as well as devices that are currently available for this purpose.
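
Several of the experimental algorithms key on broadcast cues such as the black frames and brief silences that bracket US commercial breaks. The sketch below illustrates that one cue; the thresholds are our assumptions and would be tuned per broadcast.

```python
import numpy as np

BLACK_LUMA = 20      # mean luma below this counts as a black frame (assumed)
SILENCE_RMS = 0.01   # audio RMS below this counts as silence (assumed)

def is_break_boundary(frame_luma, audio_chunk):
    """Flag a candidate commercial boundary: a dark frame plus silence."""
    dark = frame_luma.mean() < BLACK_LUMA
    quiet = np.sqrt(np.mean(audio_chunk ** 2)) < SILENCE_RMS
    return dark and quiet

def candidate_breaks(frames, audio_chunks, fps=30):
    """Timestamps (seconds) of likely commercial-break boundaries."""
    return [i / fps for i, (f, a) in enumerate(zip(frames, audio_chunks))
            if is_break_boundary(f, a)]
```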

37 citations


Journal ArticleDOI

31 citations


Journal Article
TL;DR: The paper discusses how prior information guides the choice of model parameters for the constant threat of well-planned, sophisticated and coordinated terrorist operations against geographically diverse targets, which cause significant loss of human life and property.
Abstract: This paper discusses how prior information guides the choice of model parameters for the constant threat of well-planned, sophisticated and coordinated terrorist operations against geographically diverse targets, which cause significant loss of human life and property. A semiautomated model-based tool to detect and track terrorist activity is developed based on the analysis of prior terrorist attacks, using clues about enabling events and information from open sources. A pattern of transactions is a potential realization and prediction of a possible terrorist event. Hidden Markov models (HMMs) are stochastic models used to detect monitored terrorist activity and to measure local threat levels. The Bayesian network probabilistic model is well suited for modelling global threats and for computing/assessing the overall threat. Optimization techniques can be used to allocate counterterrorism resources, and software can track multiple terrorist activities using multitarget tracking algorithms for intelligence analysis.
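
To illustrate the HMM's role, the forward algorithm scores how well a stream of observed transactions fits an activity model; higher likelihood under a "planning" model would raise the local threat level. The two states, observation alphabet and probabilities below are invented purely for illustration.

```python
import numpy as np

# Hypothetical 2-state HMM: state 0 = dormant, state 1 = active planning
start = np.array([0.9, 0.1])
trans = np.array([[0.95, 0.05],      # dormant -> dormant/active
                  [0.10, 0.90]])     # active  -> dormant/active
emit = np.array([[0.80, 0.15, 0.05],    # P(observation | dormant)
                 [0.30, 0.40, 0.30]])   # P(observation | active)

def forward_loglik(observations):
    """Scaled forward algorithm: log P(observations | model)."""
    alpha = start * emit[:, observations[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for obs in observations[1:]:
        alpha = (alpha @ trans) * emit[:, obs]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Observation codes: 0 = routine, 1 = suspicious purchase, 2 = travel event
print(forward_loglik([0, 1, 2, 2, 1]))
```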

26 citations


Journal ArticleDOI
TL;DR: This paper focuses on source coding; subjective mean opinion score methods, based on scores given by listeners, are used to measure speech perception and thereby achieve low-bit-rate coding.
Abstract: This paper describes the objectives of speech coding: to achieve toll-quality performance at a minimum bit rate and to improve the efficiency of transmission and storage, reduce cost, and increase security and robustness in transmission. Speech coding can be achieved based on two facts: redundancy in speech signals and the perception properties of human ears. Performance evaluation of speech coding considers speech quality, coding rate, algorithm complexity and delay. There are two speech coding methods: waveform coding and parametric coding, the latter of which uses a synthetic model for speech analysis and reconstruction. This paper focuses on source coding; subjective mean opinion score methods, based on scores given by listeners, are used to measure speech perception and thereby achieve low-bit-rate coding.
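
As a small member of the waveform-coding family, μ-law companding (the basis of toll-quality G.711 telephony) exploits the ear's perceptual properties by quantizing quiet samples more finely than loud ones. The sketch below is our illustration, not an algorithm from the paper.

```python
import numpy as np

MU = 255.0   # mu-law parameter used in North American G.711 telephony

def mu_law_encode(x):
    """Compress samples in [-1, 1]: fine resolution near zero amplitude."""
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def mu_law_decode(y):
    """Expand (invert the companding) to recover the waveform."""
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU

x = np.linspace(-1, 1, 5)
assert np.allclose(mu_law_decode(mu_law_encode(x)), x)
```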

Journal ArticleDOI
TL;DR: A cheap microcontroller working in conjunction with existing wireless transmitters and tire-pressure monitoring devices for motor vehicle applications is designed and implemented for use on truck trailers.
Abstract: This paper describes the monitoring and managing of tire pressure. A cheap microcontroller working in conjunction with existing wireless transmitters and tire-pressure monitoring devices for motor vehicle applications is designed. The main goal is to develop a working unit capable of detecting the pressure in multiple tires and adjusting it to appropriate predefined values. The controller that interfaces with the tire receiver to correctly capture and manipulate the data, as well as the pressure management capability, is also discussed. The "transportation recall enhancement, accountability and documentation" (TREAD) act requires automakers to gradually provide tire pressure monitoring solutions for all motor vehicles. This system is mainly designed and implemented for use on truck trailers.
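
A hypothetical polling loop for such a unit might look as follows; the callback names, setpoint and deadband are our assumptions, standing in for the real wireless receiver and valve hardware.

```python
# Hypothetical control loop for a multi-tire pressure manager.
TARGET_PSI = 105.0   # assumed predefined setpoint for a loaded trailer tire
DEADBAND = 2.0       # hysteresis to avoid chattering the inflate/vent valves

def manage_tires(read_pressure, inflate, vent, tire_ids):
    """One polling pass over every tire on the trailer.

    read_pressure/inflate/vent are callbacks wrapping the wireless
    receiver and valve hardware; they stand in for a real HAL.
    """
    for tire in tire_ids:
        psi = read_pressure(tire)          # from the tire's transmitter
        if psi < TARGET_PSI - DEADBAND:
            inflate(tire)
        elif psi > TARGET_PSI + DEADBAND:
            vent(tire)
```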

Journal ArticleDOI
TL;DR: The distributed algorithm proposed represents a novel step in solving the 3D imaging problem using power processing and enables cameras, fitted with special boards, to generate 3D images in less time than with existing methods.
Abstract: There is a general push for biometric-based solutions to replace keys, ID cards, passwords and PINs. Facial recognition, as one of the computational biometrics technologies, has received renewed attention and publicity lately, though partly for its inaccurate results. One major reason for the inaccuracy is the fact that, generally, facial recognition tools are rooted in 2D imaging methods, which are limited to front-profile 2D photographs with a maximum divergence of 20 degrees. 3D facial imaging technology eliminates many of these nagging problems, but the benefits come with the added cost of processing time, especially in the case of stereoscopic imaging. The requirement of timely processing is particularly important in access control applications. The distributed algorithm we propose represents a novel step in solving the 3D imaging problem using power processing. The algorithm enables cameras, fitted with special boards, to generate 3D images in less time than with existing methods. The algorithm exploits well-known properties/constraints from the stereovision field; thus, it is very reliable. The obvious impact areas for this work are the capture, display and transmission of stereoscopic images. However, other areas, such as stereoscopic HDTV, can benefit from a faster technique.
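
The central stereovision constraint such an algorithm can exploit is compact: for a calibrated camera pair, depth Z = f·B/d, where f is the focal length, B the baseline and d the per-pixel disparity. A minimal sketch with illustrative rig parameters (not the paper's setup):

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Classic stereo constraint: Z = f * B / d.

    disparity: per-pixel horizontal shift between the two views (pixels)
    focal_px:  focal length in pixels; baseline_m: camera separation (m)
    """
    d = np.where(disparity > 0, disparity, np.nan)  # no match -> unknown depth
    return focal_px * baseline_m / d

# Illustrative rig: 800 px focal length, 12 cm baseline
disparities = np.array([[40.0, 20.0], [10.0, 0.0]])
print(depth_from_disparity(disparities, 800.0, 0.12))  # e.g. 40 px -> 2.4 m
```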

Journal ArticleDOI
TL;DR: In this article, the viability of the hybrid-electric concept and electric propulsion as major ways to improve energy efficiency and reduce the automobile's environmental effects and energy consumption has been demonstrated.
Abstract: This paper describes hybrid-electric vehicles (HEVs) and automotive propulsion technologies. The viability of the hybrid-electric concept and electric propulsion as major ways to improve energy efficiency and reduce the automobile's environmental effects and energy consumption has been demonstrated. The principal features are multiple power sources: an internal combustion engine, an electric motor, an energy storage system and an intelligent control system. This paper also discusses the design of propulsion systems for automobiles and other transportation products, which is undergoing major changes. The development of EVs and HEVs provides a technology bridge to the electric propulsion technologies needed for the fuel cell cars (fuel cell hybrid vehicles) of the future.

Journal ArticleDOI
TL;DR: In this paper, the causes of electrostatic discharge, its damaging effects, and how those effects can be prevented or minimized are discussed, along with the effects of ESD on nonelectronic components such as disk drives, magnetic recording heads, and sensors.
Abstract: Electrostatic discharge (ESD) refers to the sudden transfer (discharge) of static charge between objects at different electrostatic potentials. ESD belongs to a family of electrical problems known as electrical overstress (EOS). ESD poses a serious threat to electronic devices, such as microcircuits, transistors, and diodes, and affects the operation of the systems that contain those devices. Most electronic companies regard all semiconductor devices as ESD sensitive because of the damage ESD can cause. For this reason, ESD is a major concern in the microelectronic and electronic industry in manufacture and testing. ESD concerns also exist in nonelectronic components such as disk drives, magnetic recording heads, and sensors. The article describes the causes of ESD, its damaging effects and how the effects can be prevented or minimized.

Journal ArticleDOI
TL;DR: This paper focuses on data preprocessing for WUM, which tries to determine the exact list of users who accessed the Web site and to reconstitute user sessions, that is, the sequence of actions each user performed at the Web site.
Abstract: This paper focuses on data preprocessing for Web usage mining (WUM). WUM applies data mining procedures to analyze user access of Web sites. As with any knowledge discovery and data mining (KDD) process, WUM contains three main steps: preprocessing, knowledge extraction and results analysis. The preprocessing tries to determine the exact list of users who accessed the Web site and to reconstitute user sessions, that is, the sequence of actions each user performed at the Web site. The preprocessing uses log files from Web servers as well as the Web site map; for privacy reasons, the log files are anonymized before being joined. The data preprocessing involves data fusion, data cleaning, data structuring and data summarization. It not only reduces the log file size but also increases the quality of the available data through the new data structures.
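
The session-reconstitution step can be sketched compactly: group one user's requests and cut a new session after a long gap. The 30-minute inactivity timeout is a common heuristic assumed here, not a figure from the paper.

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # common heuristic, assumed here

def sessionize(requests):
    """Split one user's (timestamp, url) requests into sessions.

    A new session starts whenever the gap since the previous
    request exceeds the inactivity timeout.
    """
    sessions, current = [], []
    last_time = None
    for ts, url in sorted(requests):
        if last_time and ts - last_time > SESSION_TIMEOUT:
            sessions.append(current)
            current = []
        current.append(url)
        last_time = ts
    if current:
        sessions.append(current)
    return sessions

log = [(datetime(2004, 5, 1, 9, 0), "/index"),
       (datetime(2004, 5, 1, 9, 5), "/papers"),
       (datetime(2004, 5, 1, 11, 0), "/index")]
print(sessionize(log))   # -> [['/index', '/papers'], ['/index']]
```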

Journal ArticleDOI
TL;DR: The intelligent balance that has been established between real and virtual experiments is discussed; it also imparts skills that have not been covered in conventional electronics and communication engineering.
Abstract: This paper discusses the intelligent balance that has been established between real and virtual experiments, which also imparts skills that have not been covered in conventional electronics and communication engineering. Engineering is the art of applying scientific and mathematical principles, experience, judgment and common sense to make things that benefit people. The explosive growth of computer capabilities and easy access to nanominiature devices have revolutionized communication and the analysis of complex systems. The EC engineer, either individually or as a member of a team, can play an important role in this technically diverse mosaic as the community prepares to adapt to frequent shifts in technical priorities. EC engineers should be given hands-on practice and training with computers, and should also understand the physics of the problem and the fundamental theory.

Journal ArticleDOI
TL;DR: Model-integrated computing through the use of domain specific modeling environments (DSMEs) is an emerging approach to computer programming that provides a customized level of abstraction in a relatively short period of time.
Abstract: The integration of design tools and an executable system is an important step in software engineering's evolution. Model-integrated computing (MIC) through the use of domain specific modeling environments (DSMEs) is an emerging approach to computer programming. By providing a customized level of abstraction in a relatively short period of time, and leveraging existing domain knowledge by creating the language specifically for a domain expert, DSMEs are a logical progression of system design technology. MIC is the technology that turns a design tool into an executable system. DSMEs should be used only when they fit the profile required by the domain. A domain with a manageable set of components with well-understood behaviors is an excellent candidate for a DSME, as the final computer system can be generated from the model of the system. Once the domain is identified, then it is possible to use metamodeling to develop a language that suits that domain. To quote Mark Twain, "The difference between the right word and the almost right word is the difference between lightning and the lightning bug". MIC is the practice of finding the perfect words to express the problems of a domain, and using the implied meaning of the language to implement the system rapidly and efficiently.

Journal ArticleDOI
TL;DR: All frequency bands can be used to characterize, under distinct events, the activation associated with a well-established auditory/comprehension test, and the study confirms that the results produced could be effective and useful, without any risk to the patient.
Abstract: A critical issue in the presurgical evaluation of brain operations is mapping regions that control speech and language functions. The goal is to preserve these regions during surgery. Mapping done by electrically stimulating the cortex subdurally with implanted electrodes is highly effective, but introduces a high risk for cortical tissue injury. Mapping the brain with non-invasive techniques, such as EEG recordings, is possible. The object of our study is to show that the results produced could be effective and useful, without any risk to the patient. EEG data was collected using electrical source-imaging, with up to 256 electrodes (ESI-256 system). The EEG of each subject was recorded during an auditory/comprehension test using 41 key locations of the possible 256 according to the modified combinatorial nomenclature (MCN). The representations obtained allow us to highlight how different subjects react under similar auditory/comprehension tests, in order to assess the similarity/dissimilarity of brain functional patterns, and, potentially, to detect the presence of any associated neurological disorders. The study confirms that all frequency bands can be used to characterize, under distinct events, the activation associated with a well-established auditory/comprehension test.
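
The band-wise characterization reported can be pictured as relative power in the conventional EEG frequency bands for one channel, estimated with Welch's method; beyond the standard band edges, everything below is our illustration rather than the study's pipeline.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}   # conventional band edges (Hz)

def band_powers(eeg, fs=250.0):
    """Relative power per EEG band for one channel (Welch PSD)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    total = np.trapz(psd, freqs)
    return {name: np.trapz(psd[(freqs >= lo) & (freqs < hi)],
                           freqs[(freqs >= lo) & (freqs < hi)]) / total
            for name, (lo, hi) in BANDS.items()}

# Synthetic 10 s channel: a 10 Hz alpha rhythm plus noise
t = np.arange(0, 10, 1 / 250.0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(band_powers(eeg))   # alpha should dominate
```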

Journal ArticleDOI
TL;DR: A snapshot of the first-generation hardware and software for sensor networks, the requirements placed on sensor nodes and their capabilities, and an outlook on future implementations are the focus.
Abstract: This paper describes sensor networks: wireless systems in which large numbers of sensors can be deployed. It focuses on a snapshot of the first-generation hardware and software for sensor networks, the requirements placed on sensor nodes and their capabilities, and an outlook on future implementations. These networks have been enabled in part by advancements in silicon sensor technology, low-power microcontrollers, chip-based radios, ad hoc networking protocols, and operating systems and languages for embedded systems.

Journal ArticleDOI
TL;DR: In this article, the authors use the transmission-line modeling (TLM) method to analyze the behavior of lightning discharges in transmission lines; because it is a time-domain iterative method, it yields the voltage and current in any branch of the system at any instant of the transient.
Abstract: This paper describes the modeling of transient discharge suppressors. Protection of power lines against transient propagation is considered. The transmission-line modeling (TLM) method is very appropriate for analyzing the behavior of lightning discharges in transmission lines because it is a time-domain iterative method, which makes it possible to determine the voltage and the current in any branch of the system at any instant of the transient. In this method, all devices of the studied system (capacitances, inductances and resistances) are modeled as transmission lines along which voltage and current waves can propagate. One of the most efficient devices for protecting transmission lines against fast, high-energy transients is the Zener diode. A key consideration in this modeling is the component's capacitive effect, which is fundamental in analyzing high-frequency operation. The nonlinear behaviour of the Zener diode can be used to predict and analyse transient suppression and high stability.
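
A highly simplified 1D sketch of the TLM idea follows: voltage pulses advance one line segment per time step (the "connect" phase), and at the protected termination an ideal Zener clamp scatters the incident pulse back down the line. The surge amplitude, segment count and ideal clamp model are our assumptions, not the paper's.

```python
import numpy as np

# Simplified 1D TLM sketch: a surge travels down a line into a Zener clamp.
N, VZ = 100, 200.0               # line segments; Zener clamp voltage (V)
right = np.zeros(N)              # right-travelling pulses, one per segment
left = np.zeros(N)               # left-travelling pulses

load_voltage = []
for step in range(300):
    # "Connect" phase: each pulse advances one segment per time step
    right = np.roll(right, 1)
    right[0] = 1000.0 if step < 5 else 0.0   # injected lightning surge (V)
    left = np.roll(left, -1)
    left[-1] = 0.0

    # "Scatter" at the protected termination: an ideal Zener clamp
    vi = right[-1]                # pulse incident on the termination
    v_node = min(2.0 * vi, VZ)    # open-circuit 2*vi, clamped above VZ
    left[-1] = v_node - vi        # reflected pulse returns down the line
    load_voltage.append(v_node)

print(max(load_voltage))          # ~200 V at the load instead of 2000 V
```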

Journal ArticleDOI
TL;DR: The future of organic light-emitting diodes (OLEDs) seems very bright indeed, as discussed by the authors: they will probably conquer a large portion of the microdisplay market, and their higher efficiency and lower weight will make them quite competitive with LCD displays, the currently favored technology.
Abstract: Electro-luminescence is light emission from a solid through which an electric current is passed. Electro-luminescence from organic compounds was discovered in the early '60s, but the subject did not receive much attention until the discovery and development of conductive polymers. Compared to other electro-luminescent technologies (such as inorganic compound semiconductors, porous silicon and liquid crystals), polymer/organic light-emitting diodes (PLEDs/OLEDs) are very attractive. The reasons are their very low operating voltage, high brightness, and their tunability to produce the three fundamental colors (red, blue and green). Furthermore, they are lightweight and can be grown on flexible substrates. They are fairly easy and inexpensive to fabricate. Today, PLEDs/OLEDs are suitable for applications such as automotive displays. In the future, they will probably conquer a large portion of the microdisplay market. Their higher efficiency and lower weight will make them quite competitive with LCD displays, the currently favored technology. The article concludes that the future of organic light-emitting diodes seems very bright indeed.

Journal ArticleDOI
TL;DR: The prediction of customer behavior is described, which enables service providers in the telecommunications industries to proactively build customer loyalty and to maximize profitability.
Abstract: This paper describes the prediction of customer behavior, which enables service providers in the telecommunications industries to proactively build customer loyalty and to maximize profitability. The prediction process involves extracting customer data, preprocessing the raw data, building a predictive model, and evaluating the model. The modeler needs to understand the service provider's business rules and the application scenario. Customer data used for prediction can be massive and yet sparse, and is dynamic, noisy, and sometimes faulty. For the success of a project, data preparation and timing are often critical components.
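
The four steps named above (extract customer data, preprocess it, build a predictive model, evaluate the model) map directly onto a standard supervised-learning pipeline. A minimal hypothetical churn example with scikit-learn follows; the features and data are invented, not the paper's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical extracted customer features:
# [monthly_minutes, dropped_calls, months_on_contract]
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 1] - X[:, 2] + rng.normal(size=500) > 0).astype(int)  # churn flag

# Preprocess/split, build the predictive model, then evaluate it
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```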

Journal ArticleDOI
TL;DR: A fiber-optic confocal microscope (FOCM) has been developed at the Optical Spectroscopy Lab, University of Texas at Austin, to help detect and diagnose cervical epithelial lesions in vivo.
Abstract: Optical technologies, such as reflectance and fluorescence microscopy, may help detect and diagnose cancers that originate in the epithelium. The epithelium is the layer of tissue that is exposed to the environment and lines the body's cavities. Cancers that originate in the epithelium include cervical, oral, colon, lung, stomach, bladder and skin cancers. The curable precursors to cervical cancer are cervical epithelial lesions that have larger and more densely spaced nuclei. A fiber-optic confocal microscope (FOCM) has been developed at the Optical Spectroscopy Lab, University of Texas at Austin, to help detect and diagnose these lesions in vivo. With the aid of acetic acid as a contrast agent, the FOCM shows nuclear size and density information throughout the epithelium, presenting the same information as histology but without removing, staining and slicing cervical epithelial tissue. (There are also spatial resolution requirements for showing cell nuclei.) The Optical Spectroscopy Lab continues to develop confocal microscopic instrumentation, new contrast agents, and image processing techniques to improve early detection of precancerous cervical lesions.

Journal ArticleDOI
P. Paulson, P. Juell
TL;DR: Reinforcement-trained case-based reasoning (RETCBR) uses feedback from the user or some external process to learn how to match cases, and expands the domains in which CBR techniques can be applied, because it requires knowledge only for case recognition, not for determining indexing strategies.
Abstract: Case-based reasoning (CBR) systems compare a new problem to a library of cases and adapt a similar library case to the problem, producing a preliminary solution. Since CBR systems require only a record of library cases with successful solutions, they are often used in areas lacking a strong theoretical domain model, such as medicine, economics and law. The problem is that many CBR systems use expert knowledge to determine how to build indexes for the case library so that cases that match the current situation can be identified. Human experts - whose time is valuable and scarce - often find it difficult to explain precisely their reasoning. Thus, a knowledge elicitation bottleneck occurs for many knowledge-based applications. Reinforcement-trained case-based reasoning (RETCBR) uses feedback from the user or some external process to learn how to match cases. RETCBR expands the domains in which CBR techniques can be applied, because it requires knowledge only for case recognition, not for determining indexing strategies. We are currently applying these techniques to human-computer interaction problems, such as user modeling and collaborative filtering.
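
One simple way to realize reinforcement-trained matching is to keep per-feature weights inside the similarity measure and nudge them with user feedback. This sketch is our illustration of that idea, not the authors' algorithm.

```python
import numpy as np

class RetCbrMatcher:
    """Weighted nearest-case matcher whose weights learn from feedback."""

    def __init__(self, n_features, lr=0.1):
        self.w = np.ones(n_features)   # start with uniform feature weights
        self.lr = lr

    def best_case(self, library, query):
        """Return index of the library case closest under current weights."""
        dists = [np.sum(self.w * (case - query) ** 2) for case, _ in library]
        return int(np.argmin(dists))

    def feedback(self, case, query, reward):
        """Reward > 0: emphasize features where case and query agree;
        reward < 0: de-emphasize them (the reinforcement signal)."""
        agreement = 1.0 / (1.0 + (case - query) ** 2)
        self.w = np.maximum(self.w + self.lr * reward * agreement, 1e-3)

# Tiny library of (feature-vector, solution) cases
library = [(np.array([1.0, 0.0]), "A"), (np.array([0.0, 1.0]), "B")]
m = RetCbrMatcher(n_features=2)
query = np.array([0.9, 0.2])
i = m.best_case(library, query)
m.feedback(library[i][0], query, reward=+1.0)   # user approved the match
```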

Journal ArticleDOI
TL;DR: Patterns are a way to capture experience that is often lost in traditional design literature. As mentioned in this paper, patterns work with other patterns under human guidance to support a process for building systems that employ local adaptation and piecemeal growth.
Abstract: Both individuals and disciplines, as a whole, learn from experience. Patterns are a way to capture that experience, which is often lost in traditional design literature. Patterns work with other patterns under human guidance to support a process for building systems that employ local adaptation and piecemeal growth. The pattern language design technique has roots in architecture and has close ties to broad engineering practices; many engineering artifacts (such as the article's suspension bridge and push-pull amplifier examples) are typical patterns. Patterns are being widely used in object-oriented software development and have pushed into other areas of software engineering. However, they are a natural fit for a broad spectrum of current engineering design practices. Design can be defined as the process that takes one from a problem to a workable solution. Whether your design techniques help you create the reinforcements in a large dam, improve the speed or density of a VLSI circuit, or create the layout of a new neighborhood, they all have to do with creating structures that solve problems. Many of the solutions to the design problems set to students can be taught as engineering rules. Such rules are often the foundation for teaching an engineering discipline.

Journal ArticleDOI
TL;DR: Equivalence checking guarantees 100% verification coverage without the need for test vectors, which is the big advantage over the traditional practice of functional verification by simulation that is directed by a set of test vectors.
Abstract: Integrated circuit technology has made it possible to produce chips with several millions of transistors. However, the increasingly more complex digital circuit designs and limited time constraints only add to the pressure during the implementation process. Traditional functional verification based on simulation has, during the design creation phase, reached its limits. Thus alternatives to simulation are being used. The most important alternative is equivalence checking, known also as formal verification. With equivalence checking, the highly automated analysis of the different levels of digital circuit design is performed. A comprehensive formal verification solution at every stage in the design-flow is the main approach for today's digital circuit design. Equivalence checking uses mathematical proof algorithms, then verifies every node in the design. Thus, equivalence checking guarantees 100% verification coverage without the need for test vectors. This is the big advantage over the traditional practice of functional verification by simulation that is directed by a set of test vectors.
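
In miniature, equivalence checking asks whether two implementations agree on every possible input, with no hand-chosen test vectors. The sketch below checks a one-bit full adder exhaustively; production tools reach the same guarantee on million-transistor designs with proof algorithms such as BDDs and SAT rather than enumeration.

```python
from itertools import product

def reference_adder(a, b, cin):
    """Specification: one-bit full adder."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def optimized_adder(a, b, cin):
    """Candidate implementation after synthesis/optimization."""
    p = a ^ b                       # shared propagate term
    return p ^ cin, (a & b) | (p & cin)

# Check every output for every input: 100% coverage, no test vectors
equivalent = all(reference_adder(a, b, c) == optimized_adder(a, b, c)
                 for a, b, c in product((0, 1), repeat=3))
print("designs equivalent:", equivalent)
```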

Journal ArticleDOI
TL;DR: In this paper, the authors propose a lunar solar power system in which solar power is collected on the Moon and transferred to Earth via microwave power beams.
Abstract: The Earth continually intercepts ~175,000 terawatts of solar power. However, just a fraction of this sunlight is captured every year by the biosphere in the form of newly separated atmospheric carbon and oxygen, mostly from water. Each year the trees, grasses and grains of the continents' biomass capture thermal energy, in the form of new plant mass, but only ~0.03% of the primary solar energy. The article discusses various ways that solar energy can be exploited on Earth. It then proposes a lunar solar power system in which solar power is collected on the Moon and transferred to Earth via microwave power beams. It is impractical and costly simply to gather solar power on Earth because Earth's atmosphere and clouds reflect solar power back to space, absorb it, and block it from reaching the surface of Earth. The lunar solar power system overcomes these problems.

Journal Article
TL;DR: The article looks at the history and applications of primes, particularly their use in classical cryptographic systems, such as the Diffie-Hellman asymmetric-key cryptography algorithm and the RSA public key encryption system.
Abstract: Prime numbers reserve a special place in number theory and computer science. Their extensive use in data structures, cryptography, nucleotide encoding, in developing musical tones and such merit their importance across all disciplines, especially in computer science. The article looks at the history and applications of primes, particularly their use in classical cryptographic systems, such as the Diffie-Hellman asymmetric-key cryptography algorithm and the RSA public key encryption system. Quantum cryptographic techniques are also discussed.
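
A toy Diffie-Hellman exchange shows where the primes come in: both parties derive the same secret, while an eavesdropper faces a discrete logarithm modulo a prime. The tiny prime below is purely illustrative; real deployments use primes of 2048 bits or more.

```python
import secrets

# Toy public parameters: a small prime p and generator g
p, g = 2087, 5

alice_secret = secrets.randbelow(p - 2) + 1   # private exponents
bob_secret = secrets.randbelow(p - 2) + 1

alice_public = pow(g, alice_secret, p)        # exchanged in the clear
bob_public = pow(g, bob_secret, p)

# Both sides derive the same shared key; recovering it from the
# public values alone requires solving a discrete logarithm mod p.
shared_alice = pow(bob_public, alice_secret, p)
shared_bob = pow(alice_public, bob_secret, p)
assert shared_alice == shared_bob
```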

Journal ArticleDOI
TL;DR: Microwave cardiac ablation is a procedure in which oscillating electromagnetic energy is delivered to the myocardium (the heart muscle), via a probe, to create thermal lesions that disrupt or eliminate the conduction pathways supporting arrhythmia.
Abstract: Atrial fibrillation (AF) is a chaotic (and lethal) cardiac rhythm disorder that profoundly affects cardiovascular performance. Surgical therapeutic intervention has been the principal method of treatment in these cases. The standard method has been the scalpel-based "maze" cut-and-sew procedure. The method aims at interrupting the circular electrical patterns that are responsible for the arrhythmia. Strategic placement of incisions in both atria terminates the formation and the conduction of errant electrical impulses, and channels the normal electrical impulse in one direction. Scar tissue generated by the incisions permanently blocks the paths of the impulses that cause AF, thus eradicating the arrhythmia. Microwave cardiac ablation is a relatively new concept for the clinical treatment of cardiac arrhythmias. It is a procedure in which oscillating electromagnetic energy is delivered to the myocardium (the heart muscle), via a probe, to create thermal lesions. The frequencies typically used for microwave medical ablation are 915 MHz or 2.45 GHz. The thermal lesions are relied upon to disrupt or eliminate the conduction pathways supporting arrhythmia. This relatively new technology has accrued an impressive clinical history in terms of biophysical and clinical efficacy, safety and surgeons' satisfaction.