Author

Masoud Naghedolfeizi

Bio: Masoud Naghedolfeizi is an academic researcher from Fort Valley State University. The author has contributed to research in topics: Sparse approximation & Hyperspectral imaging. The author has an h-index of 5 and has co-authored 29 publications receiving 93 citations.

Papers
16 Jun 2002
TL;DR: A survey of Web/Internet-enabled technologies for building experimental setups that can be fully operated, controlled, and monitored remotely; the advantages and disadvantages of each technology are discussed and evaluated.
Abstract: The impact of the World Wide Web on education has gone beyond text- and multimedia-based instruction in courses offered through the Web. Today, a number of universities, national laboratories, and companies are using Web/Internet-enabled applications that can be fully controlled and monitored from remote locations. Continuous advances in computers and electronics, coupled with falling prices in these industries, have made Web/Internet-based technologies less costly than before, particularly for educational organizations. Thus, it is more affordable to invest in these technologies, which are essential both for expanding education over the Web and for further improving and advancing such technologies. LabVIEW software from National Instruments has significantly helped researchers and educators integrate the Internet/Web with experimental setups in various ways. Some of these methods have also been improved and further advanced by other companies, greatly facilitating the implementation of Web/Internet-enabled technologies. This paper presents a survey of Web/Internet-enabled technologies for building experimental setups that can be fully operated, controlled, and monitored remotely. The advantages and disadvantages of each of these technologies are discussed and evaluated.

37 citations

Journal ArticleDOI
TL;DR: A prior-knowledge-based Bayes random walk framework is proposed to segment volumetric medical images slice by slice, using each segmented slice to provide shape and intensity knowledge of the target organ for the adjacent slice.
Abstract: The random walk (RW) method has been widely used to segment organs in volumetric medical images. However, it leads to a very large-scale graph, because the number of nodes equals the number of voxels, and to inaccurate segmentation when appropriate initial seed points are unavailable. In addition, the classical RW algorithm was designed for a user to mark a few pixels with an arbitrary number of labels, regardless of the intensity and shape information of the organ. Hence, we propose a prior-knowledge-based Bayes random walk framework to segment volumetric medical images in a slice-by-slice manner. Our strategy is to employ the previously segmented slice to obtain shape and intensity knowledge of the target organ for the adjacent slice. Based on this prior knowledge, the object/background seed points are dynamically updated for the adjacent slice by combining the narrow band threshold (NBT) method with an organ model based on a Gaussian process. Finally, a high-quality segmentation is obtained automatically using the Bayes RW algorithm. Comparing our method with conventional RW and state-of-the-art interactive segmentation methods, our results show an improvement in accuracy for liver segmentation (p < 0.001).
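The seed-update step described above can be illustrated with a short sketch. This is a minimal approximation of the idea, not the authors' implementation: a Gaussian intensity model fitted on the previously segmented slice, with morphological dilation and erosion standing in for the narrow band threshold (NBT); all function names and thresholds are assumptions.

```python
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

def propagate_seeds(prev_slice, prev_mask, curr_slice, band_width=3, z_thresh=1.5):
    """Propagate object/background seeds from a segmented slice to its neighbor.

    prev_slice, prev_mask : previous slice and its boolean segmentation
    curr_slice            : adjacent slice to be segmented next
    band_width, z_thresh  : illustrative narrow-band and intensity thresholds
    """
    # Intensity prior: Gaussian model of the organ fitted on the previous slice
    mu = prev_slice[prev_mask].mean()
    sigma = prev_slice[prev_mask].std() + 1e-6

    # Narrow band straddling the prior boundary; only these pixels stay undecided
    outer = binary_dilation(prev_mask, iterations=band_width)
    inner = binary_erosion(prev_mask, iterations=band_width)

    # Object seeds: prior interior still matching the intensity model;
    # background seeds: everything safely outside the dilated prior shape
    z = np.abs(curr_slice - mu) / sigma
    object_seeds = inner & (z < z_thresh)
    background_seeds = ~outer
    uncertain_band = outer & ~inner
    return object_seeds, background_seeds, uncertain_band
```

The uncertain band would then be labeled by running the Bayes RW solver with these seeds; that step is omitted here.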

13 citations

Journal ArticleDOI
TL;DR: A novel discriminant feature learning (DFL) method that combines spectral and spatial information into a hypergraph Laplacian, increasing classification accuracy and outperforming state-of-the-art HSI classification methods.
Abstract: Sparse representation classification (SRC) is widely applied to target detection in hyperspectral images (HSI). However, because high-dimensional HSI data contain redundant information, SRC methods may fail to achieve high classification performance even with a large number of spectral bands. Selecting a subset of predictive features in a high-dimensional space is an important and challenging problem for hyperspectral image classification. In this paper, we propose a novel discriminant feature learning (DFL) method, which combines spectral and spatial information into a hypergraph Laplacian. First, a subset of discriminative features is selected that preserves the spectral structure of the data and the inter- and intra-class constraints on labeled training samples; the feature evaluator is obtained by semi-supervised learning with the hypergraph Laplacian. Second, the selected features are mapped into a lower-dimensional eigenspace through a generalized eigendecomposition of the Laplacian matrix. The extracted discriminative features are then used in a joint sparsity-model algorithm. Experiments conducted on benchmark data sets under different experimental settings show that the proposed method increases classification accuracy and outperforms state-of-the-art HSI classification methods.
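The eigenspace-mapping step (a generalized eigendecomposition of a graph Laplacian) can be sketched as follows. This is a simplified illustration, assuming an ordinary Gaussian-affinity graph rather than the paper's hypergraph Laplacian with inter-/intra-class constraints; the function name, `sigma`, and the dimensions are illustrative.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def laplacian_embedding(X, n_components=10, sigma=1.0):
    """Map samples into a low-dimensional eigenspace of a graph Laplacian.

    X : (n_samples, n_bands) matrix of already-selected spectral features.
    """
    # Affinity matrix from pairwise spectral distances
    W = np.exp(-cdist(X, X, "sqeuclidean") / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Degree matrix (slightly regularized) and unnormalized graph Laplacian
    D = np.diag(W.sum(axis=1)) + 1e-9 * np.eye(len(X))
    L = D - W

    # Generalized eigenproblem L v = lambda D v, smallest eigenvalues first
    eigvals, eigvecs = eigh(L, D)

    # Drop the trivial (near-constant) eigenvector, keep the next n_components
    return eigvecs[:, 1:n_components + 1]

# Example: 200 random "pixels" with 100 spectral bands reduced to 10 features
features = laplacian_embedding(np.random.rand(200, 100), n_components=10)
```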

9 citations

Proceedings ArticleDOI
10 Nov 2003
TL;DR: A prediction model based on artificial neural network technology was developed for trend forecasting of a given degradation process in a system component; the results showed that the neural network models were capable of recognizing the correct future degradation trends even with a limited amount of input data.
Abstract: A prediction model based on artificial neural network technology was developed for trend forecasting of a given degradation process in a system component. The model combines an engineering analysis of the degradation process under study with an analysis of process field data and information to predict the future trend of the degradation. The neural network prediction models were applied to simulated degradation data of a typical system component. The prediction results showed that the neural network models were capable of recognizing the correct future degradation trends even with a limited amount of input data. In addition, the models were able to capture the dynamics and nonlinearities associated with the degradation process data.
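A minimal sketch of neural-network trend forecasting on simulated degradation data, in the spirit of the abstract but not the authors' model: a small multilayer perceptron is trained on sliding windows of the record and used for one-step-ahead prediction. The signal shape, window length, and network size are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_windows(series, window=5):
    """Turn a degradation record into (past window -> next value) training pairs."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X, y

# Simulated degradation signal: slow nonlinear drift plus measurement noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
degradation = 0.05 * t + 0.02 * t ** 2 + 0.05 * rng.standard_normal(t.size)

X, y = make_windows(degradation, window=5)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X[:150], y[:150])              # train on the early part of the record

# One-step-ahead forecasts on the held-out tail reveal the predicted trend
forecast = model.predict(X[150:])
```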

9 citations

24 Jun 2001
TL;DR: Discusses how computers are used to make physical measurements with sensors that send signals to data-acquisition boards while instrument-based software (a virtual instrument) reads the experimental data, a technology that has helped create measurement systems dramatically more robust and efficient than traditional ones.
Abstract: The National Aeronautics and Space Administration (NASA) has awarded Fort Valley State University (FVSU) a three-year project to develop an undergraduate minor program in computer-based measurement and instrumentation. The primary objective of this program is to enhance the existing mathematics, engineering technology, and computer science programs at FVSU. The program will help students gain a solid foundation in computer science, engineering, physics, and modern experimental sciences through hands-on, laboratory-based approaches with state-of-the-art technologies. A modern computerized instrumentation lab is currently being developed in the Department of Mathematics and Computer Science at FVSU to support the curriculum of the minor program. We are planning to equip the lab with various experimental setups that can be used to perform scientific experiments for lab science courses offered at FVSU. These setups will be fully controlled, monitored, and operated by computer systems using virtual instrumentation technology. They will also feature on-line capabilities that allow users to operate them remotely through the Internet. The setups are: (1) a motor-generator with a variable-speed motor and a variable resistive load, and (2) a variable-speed water pump, flow, and level system. This paper discusses the way we use these setups in classes for teaching programming and data acquisition, and presents typical assignments along with a survey of student satisfaction and student complaints.
Computer-Based Measurement and Instrumentation: We believe that students majoring in computer science and engineering technology need computer experience that goes beyond standard "computer literacy" and programming. Computers are now routinely used for data acquisition and equipment control, and with rapid growth in this area, more trained and knowledgeable college graduates are needed. In our laboratory, computers are used to make physical measurements with sensors that send signals to data-acquisition boards, while instrument-based software (a virtual instrument) reads the experimental data. This technology has helped create measurement systems that are dramatically more robust and efficient than traditional ones such as voltmeters, ammeters, thermometers, torque indicators, tachometers, level sight-gauges, and rotameters. In addition to taking the readings, the software can collect and record the data, present the data graphically, and publish results to the World Wide Web. The computer-based measurements in our systems are made using LabVIEW software and data-acquisition boards from National Instruments; all computers are IBM-compatible Pentium PCs.
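The virtual instruments described above are graphical LabVIEW programs and cannot be reproduced in text. As a rough analogue only, the Python sketch below illustrates the read/record loop the paragraph describes, with a simulated sensor standing in for the National Instruments data-acquisition board; all names, sample rates, and values are illustrative.

```python
import csv
import math
import random
import time

def read_sensor():
    """Stand-in for a data-acquisition board read (simulated generator voltage)."""
    return 12.0 + 0.5 * math.sin(time.time()) + random.gauss(0.0, 0.05)

def acquire(n_samples=100, period_s=0.1, logfile="run.csv"):
    """Read, timestamp, and record samples: the core loop of a virtual instrument."""
    with open(logfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_seconds", "voltage_V"])
        t0 = time.time()
        for _ in range(n_samples):
            writer.writerow([round(time.time() - t0, 3), round(read_sensor(), 4)])
            time.sleep(period_s)

if __name__ == "__main__":
    acquire()
```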

6 citations


Cited by
Book
01 Jan 2003
TL;DR: Comprehensive in scope, and gentle in approach, this book will help you achieve a thorough grasp of the basics and move gradually to more sophisticated DSP concepts and applications.
Abstract: From the Publisher: This is undoubtedly the most accessible book on digital signal processing (DSP) available to the beginner. Using intuitive explanations and well-chosen examples, this book gives you the tools to develop a fundamental understanding of DSP theory. The author covers the essential mathematics by explaining the meaning and significance of the key DSP equations. Comprehensive in scope, and gentle in approach, the book will help you achieve a thorough grasp of the basics and move gradually to more sophisticated DSP concepts and applications.

162 citations

Patent
19 Oct 2005
TL;DR: A method and apparatus for intelligent monitoring and maintenance of a system, comprising a smart interface engine (110) for accessing data relating to functional components of the system (10), a database containing previous knowledge about the system, a logic engine (120) for extracting parameter information for the functional components, and a confirmatory engine (130) for updating the database with new information present in the extracted parameter information.
Abstract: A method and an apparatus (100) provide intelligent monitoring and maintenance of a system (10). According to one embodiment, the apparatus (100) comprises a smart interface engine (110) for accessing data relating to functional components of the system (10); a database (105) containing previous knowledge about the system (10); a logic engine (120) for extracting parameter information for functional components of the system (10) by performing inferential processing and trend recognition on the data using the previous knowledge from the database (105), and by utilizing a performance simulator to simulate performance of the system (10) using models of the system (10) and the previous knowledge from the database (105); and a confirmatory engine (130) for updating the database (105) with new information present in the extracted parameter information.
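To make the division of responsibilities concrete, here is a toy sketch of the architecture the abstract describes, not the patented implementation: a knowledge base, a logic engine doing crude trend recognition, and a confirmatory engine folding confirmed readings back in. Class names, thresholds, and the hard-coded readings are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class KnowledgeBase:
    """Database (105): previous knowledge about the monitored system."""
    history: Dict[str, List[float]] = field(default_factory=dict)

    def update(self, component: str, value: float) -> None:
        self.history.setdefault(component, []).append(value)

class LogicEngine:
    """Logic engine (120): crude trend recognition against previous knowledge."""
    def __init__(self, kb: KnowledgeBase):
        self.kb = kb

    def extract_trend(self, component: str, reading: float) -> str:
        past = self.kb.history.get(component, [])
        if len(past) < 2:
            return "insufficient history"
        recent_mean = sum(past[-5:]) / len(past[-5:])
        return "degrading" if reading > 1.1 * recent_mean else "stable"

class ConfirmatoryEngine:
    """Confirmatory engine (130): fold confirmed readings back into the database."""
    def __init__(self, kb: KnowledgeBase):
        self.kb = kb

    def confirm(self, component: str, reading: float) -> None:
        self.kb.update(component, reading)

# The smart interface engine (110) would supply `reading` from the live system;
# a hard-coded series of readings stands in for it here.
kb = KnowledgeBase()
logic, confirm = LogicEngine(kb), ConfirmatoryEngine(kb)
for reading in [1.00, 1.02, 1.05, 1.30]:
    print(logic.extract_trend("bearing_vibration", reading))
    confirm.confirm("bearing_vibration", reading)
```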

93 citations

Journal ArticleDOI
TL;DR: A hierarchical fault detection method for transmission lines is proposed, using a microphone array to locate a fault and thermal imaging and charge-coupled device cameras to verify the fault and store its image, respectively.
Abstract: This paper proposes a hierarchical fault detection method for transmission lines that uses a microphone array to detect the location of a fault and thermal imaging and charge-coupled device (CCD) cameras to verify the fault and store the image, respectively. Partial arc discharges on faulty insulators generate specific patterns of sound; by detecting these patterns with the microphone array, the location of the faulty insulator can be estimated. A sixth-order bandpass filter and an autocorrelation scheme were applied to remove the noise signals caused by wind, bird chirping, and other external influences. When a mobile robot carries the thermal and CCD cameras to the possible location of the fault, the faulty insulators or power transmission wires can be detected from the thermal images, and the CCD camera then captures an image of the faulty insulator for the record. This detection scheme has proved effective in experiments. As a result of this research, it will be possible to use a mobile robot with integrated sensors, instead of a human operator, to detect faulty insulators.
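A minimal sketch of the band-pass-plus-autocorrelation idea on a single microphone channel, under assumed parameters: the paper does not specify the passband, repetition range, or thresholds, so the 2-8 kHz band, the 8-20 ms lag window, and the 0.3 peak threshold below are illustrative only.

```python
import numpy as np
from scipy.signal import butter, correlate, filtfilt

def detect_discharge(x, fs, band=(2000.0, 8000.0)):
    """Band-pass a microphone channel and test for a periodic discharge signature."""
    # 6th-order Butterworth band-pass (order 3 per edge gives order 6 overall)
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, x)

    # Normalized autocorrelation: periodic arc discharges keep correlating at the
    # repetition lag, while wind or bird noise decorrelates quickly
    ac = correlate(filtered, filtered, mode="full", method="auto")[len(filtered) - 1:]
    ac /= ac[0] + 1e-12

    # Look for a peak at lags of roughly 8-20 ms (assumed repetition range)
    lo, hi = int(0.008 * fs), int(0.020 * fs)
    return ac[lo:hi].max() > 0.3

# Synthetic check: 4 kHz bursts repeating every 10 ms, buried in broadband noise
fs = 44100
t = np.arange(0, 0.5, 1.0 / fs)
bursts = np.sin(2 * np.pi * 4000 * t) * (np.sin(2 * np.pi * 100 * t) > 0.99)
print(detect_discharge(bursts + 0.2 * np.random.randn(t.size), fs))
```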

73 citations

Journal ArticleDOI
TL;DR: An HSI classification method based on a 2D-3D CNN and multibranch feature fusion, using the state-of-the-art Mish activation function to further improve classification performance.
Abstract: The emergence of convolutional neural networks (CNNs) has greatly promoted the development of hyperspectral image (HSI) classification technology. However, the acquisition of HSI is difficult, and the lack of training samples is the primary cause of low classification performance. Traditional CNN-based methods mainly use 2-D CNNs for feature extraction, which leaves the interband correlations of HSIs underutilized. A 3-D CNN extracts a joint spectral-spatial information representation, but it depends on a more complex model, and a network that is too deep or too shallow cannot extract image features well. To tackle these issues, we propose an HSI classification method based on a 2D-3D CNN and multibranch feature fusion. We first combine 2-D and 3-D CNNs to extract image features. Then, by means of a multibranch neural network, three kinds of features, from shallow to deep, are extracted and fused along the spectral dimension. Finally, the fused features are passed through several fully connected layers and a softmax layer to obtain the classification results. In addition, our network model uses the state-of-the-art Mish activation function to further improve classification performance. Experimental results on four widely used HSI datasets indicate that the proposed method achieves better performance than existing alternatives.
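As a rough illustration of the hybrid 3-D/2-D idea (not the authors' architecture, and with the multibranch fusion of shallow, middle, and deep features omitted), the PyTorch sketch below applies 3-D convolutions to learn spectral-spatial features, folds the band axis into channels, and finishes with a 2-D convolution, Mish activations, and a classification head. Layer sizes and the patch dimensions are assumptions.

```python
import torch
import torch.nn as nn

class Hybrid2D3DNet(nn.Module):
    """Tiny 3-D -> 2-D CNN for HSI patch classification.
    Input shape: (batch, 1, bands, height, width)."""

    def __init__(self, n_bands=30, n_classes=16):
        super().__init__()
        self.mish = nn.Mish()
        # 3-D convolutions learn joint spectral-spatial features
        self.conv3d_1 = nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1))
        self.conv3d_2 = nn.Conv3d(8, 16, kernel_size=(5, 3, 3), padding=(2, 1, 1))
        # A 2-D convolution refines spatial features after folding the band axis
        self.conv2d = nn.Conv2d(16 * n_bands, 64, kernel_size=3, padding=1)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, n_classes)
        )

    def forward(self, x):
        x = self.mish(self.conv3d_1(x))
        x = self.mish(self.conv3d_2(x))
        b, c, d, h, w = x.shape
        x = x.reshape(b, c * d, h, w)      # fold spectral depth into channels
        x = self.mish(self.conv2d(x))
        return self.head(x)

# A batch of two 9x9 patches with 30 spectral bands
logits = Hybrid2D3DNet()(torch.randn(2, 1, 30, 9, 9))
```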

67 citations

Journal ArticleDOI
TL;DR: Results of the evaluation show that the virtual lab is clear, helped students with their understanding of torsion concepts, and offered a number of benefits; however, students also rated hands-on labs as more fun and more interesting.
Abstract: An emerging change across the science, technology, engineering, and mathematics curriculum is the implementation of online, or virtual, laboratories as supplements to or replacements for both homework assignments and laboratory exercises. To test the effectiveness of such labs, a web-based virtual laboratory on the torsion of engineered and biological materials was developed. The lab contains extensive data sets, videos of experiments, narrated presentations on lab practice and theory, and assignments. Flexibility of use is built into the lab by allowing the web pages to be tailored to the needs of a particular institution. The lab was implemented and evaluated in a standard sophomore-level statics and strength of materials course. Results of the evaluation show that the virtual lab is clear, helped students with their understanding of torsion concepts, and offered a number of benefits; however, students also rated hands-on labs as more fun and more interesting. © 2006 Wiley Periodicals, Inc. Comput Appl Eng Educ 14: 1-8, 2006; published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20061

55 citations