Author

L.M. Luo

Bio: L.M. Luo is an academic researcher from Southeast University. The author has contributed to research in the topics of contextual image classification and digital image processing. The author has an h-index of 7 and has co-authored 14 publications receiving 205 citations.

Papers
Journal ArticleDOI
TL;DR: In this paper, the authors propose two general forms which simplify and group the discrete orthogonal Tchebichef and Krawtchouk polynomials and their corresponding moments, and discuss their importance in theory and in applications.
Abstract: Discrete orthogonal moments such as Tchebichef moments and Krawtchouk moments are more powerful for image representation than traditional continuous orthogonal moments. However, little work has been done on summarising these discrete orthogonal moments. This study proposes two general forms which simplify and group the discrete orthogonal Tchebichef and Krawtchouk polynomials and their corresponding moments, and discusses their importance in theory and in applications. In addition, the proposed general form can be used to obtain three other discrete orthogonal moments: Hahn moments, Charlier moments and Meixner moments. Computation of these discrete orthogonal polynomials is also discussed, including the recurrence relations with respect to the variable x and the order n. Some properties of these discrete orthogonal moments that are of particular value in image processing applications, such as energy compaction and signal decorrelation, are also presented. Finally, the study evaluates these discrete orthogonal moments in terms of image reconstruction and image compression capability, and discusses the importance of the proposed general form in theory and in engineering.
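As a concrete illustration of the recurrence-based computation mentioned above, the sketch below builds a discrete Tchebichef basis with the standard three-term recurrence in the order n, normalises it numerically, and uses it to compute moments of a small block and reconstruct the block. The unnormalised recurrence, the numerical normalisation and the 8x8 example are assumptions of this sketch, not the paper's exact formulation.

```python
# Minimal sketch (assumed formulation): discrete Tchebichef polynomials via the
# standard three-term recurrence in the order n, then 2-D moments of an image block.
import numpy as np

def tchebichef_basis(N):
    """Rows are numerically orthonormalised discrete Tchebichef polynomials t_n(x), x = 0..N-1."""
    x = np.arange(N, dtype=float)
    P = np.zeros((N, N))
    P[0] = 1.0
    if N > 1:
        P[1] = 2.0 * x - N + 1.0
    for n in range(2, N):
        P[n] = ((2 * n - 1) * (2.0 * x - N + 1.0) * P[n - 1]
                - (n - 1) * (N * N - (n - 1) ** 2) * P[n - 2]) / n
    # The polynomials are orthogonal on {0, ..., N-1} with unit weight,
    # so normalising each row gives an orthonormal basis.
    P /= np.linalg.norm(P, axis=1, keepdims=True)
    return P

# Usage: forward moments and reconstruction of a toy 8x8 block.
f = np.random.rand(8, 8)
P = tchebichef_basis(8)
T = P @ f @ P.T          # Tchebichef moments T[n, m]
f_rec = P.T @ T @ P      # exact reconstruction when all orders are kept
assert np.allclose(f, f_rec)
```

Because the basis is orthonormal, keeping only the low-order moments in T gives a compressed approximation of the block, which is the mechanism behind the reconstruction and compression evaluations mentioned above.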

125 citations

Journal ArticleDOI
TL;DR: A segmentation tool is presented to differentiate the anatomical structures within the vectorial volume of a CT uroscan; it achieves a better classification result and is less affected by noise.

26 citations

Journal ArticleDOI
Huazhong Shu, Yulong Yan, Xue-Liang Bao, Yao Fu, L.M. Luo
TL;DR: This method is composed of two steps: firstly, a quasi-Newton method is used to deal with the continuous variables such as position and weight of shots; the result obtained serves as the initial configuration for the next step, in which a simulated annealing method is applied to optimize all the aforementioned parameters.
Abstract: The gamma unit is used to irradiate a target within the brain. During such a treatment, many parameters, including the number of shots, the coordinates, the collimator size and the weight associated with each shot, affect the dose delivered to the target volume and to the surrounding normal tissue. Hence it is not easy to determine an appropriate set of these parameters by trial and error. For this reason, we present here an optimization method to determine those parameters mathematically. The method is composed of two steps: first, a quasi-Newton method is used to deal with the continuous variables, such as the positions and weights of the shots; the result obtained at the end of this step then serves as the initial configuration for the next step, in which a simulated annealing method is applied to optimize all the aforementioned parameters. Application of the proposed method to two examples shows that our optimization algorithm performs satisfactorily.
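The two-step strategy described above can be sketched on a toy 1-D problem as follows; the dose model, target profile, shot parametrisation and use of SciPy's dual_annealing are illustrative assumptions, not the authors' actual treatment-planning objective.

```python
# Toy illustration of the two-step optimization: a quasi-Newton (L-BFGS-B) pass over
# continuous shot positions and weights, whose result seeds a simulated-annealing pass.
import numpy as np
from scipy.optimize import minimize, dual_annealing

grid = np.linspace(0.0, 10.0, 200)
target = ((grid > 3.0) & (grid < 7.0)).astype(float)   # toy prescribed dose profile
n_shots = 3
bounds = [(0.0, 10.0)] * n_shots + [(0.0, 2.0)] * n_shots

def dose(params):
    # params = [pos_1..pos_k, w_1..w_k]; each shot is a fixed-width Gaussian "dose kernel".
    pos, w = params[:n_shots], params[n_shots:]
    return sum(wi * np.exp(-((grid - pi) ** 2) / (2 * 0.8 ** 2)) for pi, wi in zip(pos, w))

def cost(params):
    return np.sum((dose(params) - target) ** 2)        # mismatch to the prescription

# Step 1: quasi-Newton method on the continuous variables (positions, weights).
x0 = np.concatenate([np.linspace(3.5, 6.5, n_shots), np.full(n_shots, 0.5)])
step1 = minimize(cost, x0, method="L-BFGS-B", bounds=bounds)

# Step 2: simulated annealing started from the quasi-Newton result.
step2 = dual_annealing(cost, bounds, x0=step1.x, maxiter=200)
print("final cost:", step2.fun)
```

In the paper, the annealing stage additionally covers the parameters a gradient-based method cannot treat, such as the number of shots and the collimator sizes; the toy above keeps the parameter set fixed for brevity.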

22 citations

Proceedings ArticleDOI
01 Jul 2010
TL;DR: A medical image integrity verification system is presented that not only detects and localizes an alteration but also approximates it by determining the parameters of the nearest generalized 2D Gaussian.
Abstract: In this paper we propose a medical image integrity verification system that not only detects and localizes an alteration but also provides an approximation of it. For that purpose, we suggest embedding an image signature, or digest, derived from the geometric moments of image pixel blocks. Image integrity verification is then conducted by comparing this embedded signature to the recomputed one. The signature also helps to approximate modifications by determining the parameters of the nearest generalized 2D Gaussian. Experimental results with local image modifications illustrate the overall performance of our method.
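A minimal sketch of the block-wise geometric-moment signature and its verification by recomputation follows; the block size, the moment orders and the comparison tolerance are assumptions made for illustration, and the Gaussian approximation step is omitted.

```python
# Block-wise geometric moments m_pq = sum_x sum_y x^p y^q f(x, y) as an image digest,
# and integrity checking by comparing the embedded digest with a recomputed one.
import numpy as np

def block_signature(img, block=16, orders=((0, 0), (1, 0), (0, 1), (1, 1))):
    sig = {}
    H, W = img.shape
    for by in range(0, H, block):
        for bx in range(0, W, block):
            f = img[by:by + block, bx:bx + block].astype(float)
            y, x = np.mgrid[0:f.shape[0], 0:f.shape[1]]
            sig[(by, bx)] = [float((x ** p * y ** q * f).sum()) for p, q in orders]
    return sig

def altered_blocks(img, embedded_sig, rtol=1e-6):
    recomputed = block_signature(img)
    return [k for k in embedded_sig
            if not np.allclose(recomputed[k], embedded_sig[k], rtol=rtol)]

# Usage: a local alteration is detected and localised to its block.
img = np.random.randint(0, 256, (64, 64))
sig = block_signature(img)       # signature that would be embedded in the image
img[20, 20] += 5                 # simulate a local modification
print(altered_blocks(img, sig))  # -> [(16, 16)], the tampered block
```

The embedding of this signature into the image itself (watermarking) and the fitting of a generalized 2D Gaussian to approximate the modification are the parts of the method not shown here.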

13 citations

Proceedings ArticleDOI
01 Dec 2011
TL;DR: A medical image integrity verification system that not only detects and approximates malevolent local image alterations but is also capable of identifying the nature of global image processing applied to the image.
Abstract: In this paper we present a medical image integrity verification system that not only detects and approximates malevolent local image alterations (e.g. removal or addition of findings) but is also capable of identifying the nature of global image processing applied to the image (e.g. lossy compression, filtering …). For that purpose, we propose an image signature derived from the geometric moments of pixel blocks. Such a signature is computed over regions of interest of the image and then watermarked into regions of non-interest. Image integrity analysis is conducted by comparing the embedded and recomputed signatures. Local modifications, if any, are approximated by determining the parameters of the nearest generalized 2D Gaussian. Image moments are taken as image features and serve as inputs to a classifier trained to discriminate the type of global image processing. Experimental results with both local and global modifications illustrate the overall performance of our approach.
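The region-of-interest / region-of-non-interest part of the scheme can be sketched as below; LSB embedding is used here purely as a stand-in for the paper's watermarking method, and the region layout and the use of a SHA-256 digest are assumptions of the example.

```python
# Sketch: signature of a region of interest (ROI) carried in the least-significant
# bits of a region of non-interest (RONI), then recomputed and compared on receipt.
import hashlib
import numpy as np

def roi_digest(img, roi):
    y0, y1, x0, x1 = roi
    return hashlib.sha256(img[y0:y1, x0:x1].tobytes()).digest()     # 256-bit ROI signature

def embed_in_roni(img, payload, roni):
    out = img.copy()
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    y0, y1, x0, x1 = roni
    flat = out[y0:y1, x0:x1].reshape(-1)
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits             # overwrite LSBs
    out[y0:y1, x0:x1] = flat.reshape(y1 - y0, x1 - x0)
    return out

def extract_from_roni(img, n_bytes, roni):
    y0, y1, x0, x1 = roni
    flat = img[y0:y1, x0:x1].reshape(-1)
    return np.packbits(flat[:n_bytes * 8] & 1).tobytes()

# Usage: any ROI tampering makes the recomputed digest disagree with the embedded one.
img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
roi, roni = (0, 48, 0, 64), (48, 64, 0, 64)
marked = embed_in_roni(img, roi_digest(img, roi), roni)
marked[10, 10] ^= 1                                                    # tamper inside the ROI
print(extract_from_roni(marked, 32, roni) == roi_digest(marked, roi))  # -> False
```

Unlike this sketch, the paper's signature is moment-based rather than a cryptographic hash, which is what makes the approximation of local modifications and the classification of global processing possible.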

13 citations


Cited by
01 Apr 1997
TL;DR: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind, emphasising the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity.
Abstract: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind. The emphasis is on the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity. Topics covered include an introduction to the concepts of cryptography, attacks against cryptographic systems, key use and handling, random bit generation, encryption modes, and message authentication codes. Recommendations on algorithms and further reading are given at the end of the paper. This paper should enable the reader to build, understand and evaluate system descriptions and designs based on the cryptographic components it describes.
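One of the topics listed above, message authentication codes, can be illustrated in a few lines using Python's standard library; the key and message below are placeholders.

```python
# Minimal message-authentication-code example: compute a tag and verify it.
import hmac, hashlib

key, msg = b"shared-secret", b"patient-record-0042"
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
# The verifier recomputes the tag with the shared key and compares in constant time.
print(hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).hexdigest()))  # True
```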

2,188 citations

Journal ArticleDOI
TL;DR: This paper surveys several applications of operations research in the healthcare domain and highlights current research activities, focusing on a variety of optimisation problems as well as the solution techniques used to solve them.

339 citations

Journal ArticleDOI
TL;DR: Applied Cryptography: Protocols, Algorithms, and Source Code in C, by Bruce Schneier (1995).

207 citations

Book Chapter
01 Dec 2001
TL;DR: In this chapter, a summary is presented of the issues discussed during the one-day workshop on SVM Theory and Applications organized as part of the Advanced Course on Artificial Intelligence (ACAI '99) in Chania, Greece.
Abstract: This chapter presents a summary of the issues discussed during the one day workshop on “Support Vector Machines (SVM) Theory and Applications” organized as part of the Advanced Course on Artificial Intelligence (ACAI ’99) in Chania, Greece [19]. The goal of the chapter is twofold: to present an overview of the background theory and current understanding of SVM, and to discuss the papers presented as well as the issues that arose during the workshop.

170 citations

Journal ArticleDOI
TL;DR: It is concluded that, in order to avoid artifacts and exclude the several sources of bias that may influence the analysis, an optimal method should comprise a careful preprocessing of the images, be based on multimodal, complementary data, take into account spatial information about the lesions and correct for false positives.
Abstract: White matter hyperintensities (WMH) are commonly seen in the brain of healthy elderly subjects and patients with several neurological and vascular disorders. A truly reliable and fully automated method for quantitative assessment of WMH on magnetic resonance imaging (MRI) has not yet been identified. In this paper, we review and compare the large number of automated approaches proposed for segmentation of WMH in the elderly and in patients with vascular risk factors. We conclude that, in order to avoid artifacts and exclude the several sources of bias that may influence the analysis, an optimal method should comprise a careful preprocessing of the images, be based on multimodal, complementary data, take into account spatial information about the lesions and correct for false positives. All these features should not exclude computational leanness and adaptability to available data.

140 citations