Institution
University of Macau
Education • Macao, Macau, China
About: The University of Macau is an educational institution based in Macao, Macau, China. It is known for research contributions in the topics of Population and Control theory. The organization has 6,636 authors who have published 18,324 publications receiving 327,384 citations. It is also known as UM and UMAC.
Papers
TL;DR: This paper introduces an efficient approximation algorithm for the graph p-Laplacian, a natural generalization of the standard graph Laplacian, which significantly reduces computing cost, and applies the resulting p-Laplacian regularization (pLapR) to support vector machines and kernel least squares for scene recognition.
Abstract: The explosive growth of multimedia data on the Internet makes it essential to develop innovative machine learning algorithms for practical applications, especially where only a small number of labeled samples are available. Manifold regularized semi-supervised learning (MRSSL) has thus received intensive attention recently because it successfully exploits the local structure of the data distribution, including both labeled and unlabeled samples, to improve the generalization ability of a learning model. Although there are many representative works in MRSSL, including Laplacian regularization (LapR) and Hessian regularization, how to explore and exploit the local geometry of the data manifold is still a challenging problem. In this paper, we introduce an efficient approximation algorithm for the graph p-Laplacian, which significantly reduces the computing cost. We then propose p-LapR (pLapR) to preserve the local geometry. Specifically, the p-Laplacian is a natural generalization of the standard graph Laplacian and provides convincing theoretical evidence that it better preserves the local structure. We apply pLapR to support vector machines and kernel least squares and conduct implementations for scene recognition. Extensive experiments on the Scene 67, Scene 15, and UC-Merced datasets validate the effectiveness of pLapR in comparison to conventional manifold regularization methods.
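The p-Laplacian penalty described in the abstract generalizes the familiar quadratic Laplacian regularizer by raising edge differences to the p-th power. The sketch below illustrates that relationship on a toy graph; it is not the paper's approximation algorithm, and the function and variable names are illustrative.

```python
import numpy as np

def p_laplacian_regularizer(W, f, p=2.0):
    """Graph p-Laplacian regularizer: 0.5 * sum_ij W_ij * |f_i - f_j|^p.
    For p = 2 this reduces to the standard Laplacian form f^T L f."""
    diff = np.abs(f[:, None] - f[None, :])  # pairwise |f_i - f_j|
    return 0.5 * np.sum(W * diff ** p)

# Toy symmetric adjacency for a 3-node path graph (illustrative only).
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
f = np.array([0.0, 1.0, 3.0])               # candidate label function
L = np.diag(W.sum(axis=1)) - W              # standard graph Laplacian L = D - W
r2 = p_laplacian_regularizer(W, f, p=2.0)
assert np.isclose(r2, f @ L @ f)            # p = 2 recovers the quadratic penalty
r1 = p_laplacian_regularizer(W, f, p=1.0)   # p < 2 penalizes large jumps less harshly
```

Smaller values of p make the penalty less sensitive to large differences across a single edge, which is one intuition for why the p-Laplacian can preserve local structure better than the quadratic case.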
95 citations
01 May 2014
TL;DR: This paper describes the acquisition of large-scale, high-quality parallel corpora for English and Chinese for Statistical Machine Translation (SMT), designed to cover eight different domains.
Abstract: A parallel corpus is a valuable resource for cross-language information retrieval and data-driven natural language processing systems, especially for Statistical Machine Translation (SMT). However, most existing parallel corpora for Chinese are restricted to in-house use, while others are domain-specific and limited in size. To a certain degree, this limits SMT research. This paper describes the acquisition of large-scale, high-quality parallel corpora for English and Chinese. The corpora constructed in this paper contain about 15 million English-Chinese (E-C) parallel sentences, of which more than 2 million training sentence pairs and 5,000 testing sentences are made publicly available. Different from previous work, the corpus is designed to cover eight different domains, some of which are further categorized into different topics. The corpus will be released to the research community and is available at the NLP2CT website.
95 citations
TL;DR: Results show that teachers have a generally positive level of technology acceptance and that the TAMPST is a valid tool for in-service teachers, although it was originally developed to test pre-service teachers.
Abstract: This study examines the factors that explain teachers' technology acceptance. A sample of 673 primary and secondary school teachers gave their responses to a 16-item technology acceptance measure for pre-service teachers (TAMPST). Results of this study showed teachers have a generally positive level of technology acceptance and that the TAMPST is a valid tool to be applied to teachers although it was originally developed to test pre-service teachers. Tests for measurement invariance and latent mean differences on the five factors in the TAMPST provided support for full and partial configural, metric, and partial scalar invariance by gender, length of service in teaching, and teaching level. The tests of latent mean differences found significant differences by gender for perceived ease of use, with male teachers rating higher than their female counterparts. Between teachers with shorter and longer years of teaching service, statistical significance was found in the mean differences for perceived ease of use and attitude towards technology use. No significant mean differences in each of the five factors were found between the primary and secondary teachers.
95 citations
TL;DR: In this paper, the authors developed new real-time prediction models for output power and energy efficiency of solar photovoltaic (PV) systems using measured data of a grid-connected solar PV system in Macau.
95 citations
TL;DR: In this paper, the authors describe a new model of in-service teacher education in China, situated within the broader program of Xingdong Jiaoyu (Action Education), which has been in place since 2003.
Abstract: This paper describes Keli (Exemplary Lesson Development), a new model of in-service teacher education in China, situated within the broader program of Xingdong Jiaoyu (Action Education), which has been implemented since 2003. The paper sets out how the innovative Keli model is put into practice. Finally, the implications for the practical community, including teachers and researchers, are examined.
95 citations
Authors
| Name | H-index | Papers | Citations |
|---|---|---|---|
| Henry T. Lynch | 133 | 925 | 86270 |
| Chu-Xia Deng | 125 | 444 | 57000 |
| H. Vincent Poor | 109 | 2116 | 67723 |
| Peng Chen | 103 | 918 | 43415 |
| George F. Gao | 102 | 793 | 82219 |
| MengChu Zhou | 96 | 1124 | 36969 |
| Gang Li | 93 | 486 | 68181 |
| Rob Law | 81 | 714 | 31002 |
| Zongjin Li | 80 | 630 | 22103 |
| Han-Ming Shen | 80 | 237 | 27410 |
| Heng Li | 79 | 745 | 23385 |
| Lionel M. Ni | 75 | 466 | 28770 |
| C. L. Philip Chen | 74 | 482 | 20223 |
| Chun-Su Yuan | 72 | 397 | 21089 |
| Joao P. Hespanha | 72 | 418 | 39004 |