scispace - formally typeset
Author

Shattri Mansor

Other affiliations: University of Dundee
Bio: Shattri Mansor is an academic researcher from Universiti Putra Malaysia. The author has contributed to research in topics: Landslide & Land cover. The author has an h-index of 30 and has co-authored 160 publications receiving 2,916 citations. Previous affiliations of Shattri Mansor include University of Dundee.


Papers
Journal ArticleDOI
01 Feb 2015-Catena
TL;DR: In this paper, a support vector machine (SVM) is used to predict flood susceptibility in the Kuala Terengganu basin, Malaysia; four SVM kernel types, namely linear (LN), polynomial (PL), radial basis function (RBF), and sigmoid (SIG), were used to check the robustness of the SVM model.
Abstract: Statistical learning theory is the basis of the support vector machine (SVM) technique. This technique is becoming extremely popular in natural hazard assessment. It involves a training stage that relates input values to desired output values. The main goal of this paper is to assess and evaluate the prediction capability of the SVM technique with different kernel functions for spatial prediction of flood occurrence. The Kuala Terengganu basin, Malaysia, was selected as the study area. To begin, a flood inventory map was produced by mapping the flood locations in Terengganu using documentary sources and a field survey. The flood inventory was partitioned into training and testing datasets through random selection. The spatial database was constructed using various flood conditioning factors: altitude, slope, curvature, stream power index (SPI), topographic wetness index (TWI), distance from the river, geology, land use/cover (LULC), soil, and surface runoff. Four SVM kernel types, namely linear (LN), polynomial (PL), radial basis function (RBF), and sigmoid (SIG), were utilized to check the robustness of the SVM model. Consequently, four flood susceptibility maps were created. To examine the efficiency of the SVM model, a probabilistic frequency ratio (FR) model was applied and compared with the SVM outcomes. The area under the curve (AUC) method was used to validate the resultant flood susceptibility maps. The validation results demonstrated that the prediction rates for the flood susceptibility maps generated by SVM-LN, SVM-PL, SVM-RBF, and SVM-SIG were 84.63%, 83.92%, 84.97%, and 81.88%, respectively. In contrast, the prediction rate achieved by FR showed the lowest accuracy, 61.43%. To evaluate the impacts of the conditioning factors on the flood susceptibility mapping, Cohen's kappa index was measured.
The results demonstrated that all conditioning factors had a reasonably positive influence on the flood analysis in the current case study, except surface runoff, which decreased the accuracy of the final results. The most influential factors were altitude and slope for all kernel types. It can be concluded that the SVM technique is an efficient and reliable tool for flood susceptibility assessment. The resultant flood susceptibility maps can be beneficial in flood mitigation strategies.
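The workflow described above — training SVMs with each of the four kernels and comparing prediction rates via the area under the ROC curve — can be sketched with scikit-learn. The data below are synthetic stand-ins, not the paper's Kuala Terengganu dataset, and the ten "factors" are only placeholders for the conditioning factors listed in the abstract:

```python
# Four-kernel SVM comparison sketch on synthetic data (NOT the paper's data).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Ten illustrative conditioning factors (altitude, slope, curvature, ...)
X = rng.normal(size=(500, 10))
# Synthetic flood / non-flood labels loosely tied to the first two factors
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, probability=True, random_state=0)
    clf.fit(scaler.transform(X_train), y_train)
    prob = clf.predict_proba(scaler.transform(X_test))[:, 1]
    auc = roc_auc_score(y_test, prob)
    print(f"SVM-{kernel}: AUC = {auc:.3f}")
```

On real susceptibility data the probability output of each fitted model would then be mapped back onto the grid cells to produce one susceptibility map per kernel, as in the paper.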

497 citations

Journal ArticleDOI
TL;DR: The results show that random selection of the landslide training data affected the parameter estimates of the SVM, LR and ANN algorithms and, in turn, the accuracy of the susceptibility models, because landslide conditioning factors vary with geographic location across the study area.
Abstract: Landslides are natural hazards that cause economic damage and human losses every year. Numerous researchers have studied landslide susceptibility mapping (LSM), each attempting to improve the accuracy of the final outputs. However, few studies have been published on the effect of training data selection on LSM. Thus, this study assesses the effect of random selection of training landslides on the accuracy of support vector machine (SVM), logistic regression (LR) and artificial neural network (ANN) models for LSM in a catchment of the Dodangeh watershed, Mazandaran province, Iran. An inventory of 160 landslide locations was compiled by the Geological Survey of Iran for this investigation. Different methods were used to define the landslide locations, such as inventory reports, satellite images and field surveys. Moreover, 14 landslide conditioning factors were considered in the analysis of landslide susceptibility. These factors include curvature, plan curvature, profile curvature, altitude, slope ...
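The sensitivity to random training-data selection that the study examines can be illustrated by refitting the same model on many different random splits and looking at the spread of test accuracies. Everything below is synthetic and illustrative (the data are not the Dodangeh inventory, and only one of the three model families is shown):

```python
# Sketch: how random training-sample selection changes model accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
X = rng.normal(size=(160, 14))          # 14 illustrative conditioning factors
y = (X[:, 0] + rng.normal(scale=1.0, size=160) > 0).astype(int)

accuracies = []
for seed in range(20):                   # 20 different random selections
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=seed)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    accuracies.append(accuracy_score(y_te, model.predict(X_te)))

print(f"accuracy mean = {np.mean(accuracies):.3f}, "
      f"spread (std) = {np.std(accuracies):.3f}")
```

A non-zero spread across seeds is exactly the selection effect the paper quantifies; the same loop could be repeated with SVM and ANN models to compare their sensitivity.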

334 citations

Journal ArticleDOI
TL;DR: It was found that ML performed best, followed by ANN, DT and SAM, with accuracies of 86%, 84%, 51% and 49%, respectively.
Abstract: Several classification algorithms for pattern recognition were tested in the mapping of tropical forest cover using airborne hyperspectral data. Results from the Maximum Likelihood (ML), Spectral Angle Mapper (SAM), Artificial Neural Network (ANN) and Decision Tree (DT) classifiers were compared and evaluated. It was found that ML performed best, followed by ANN, DT and SAM, with accuracies of 86%, 84%, 51% and 49%, respectively.
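Of the four classifiers compared above, the Spectral Angle Mapper is the simplest to show directly: each pixel spectrum is assigned to the reference class whose spectrum makes the smallest angle with it. The reference spectra and pixel below are invented for illustration, not taken from the paper's hyperspectral data:

```python
# Minimal Spectral Angle Mapper (SAM) sketch with made-up spectra.
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra, treated as vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

references = {
    "forest": np.array([0.05, 0.08, 0.04, 0.45, 0.30]),
    "soil":   np.array([0.12, 0.15, 0.18, 0.25, 0.28]),
}

pixel = np.array([0.06, 0.09, 0.05, 0.40, 0.27])   # unknown spectrum
label = min(references, key=lambda k: spectral_angle(pixel, references[k]))
print(label)
```

Because the angle ignores vector magnitude, SAM is insensitive to overall brightness — a strength under varying illumination, but one reason it can underperform statistical classifiers such as ML and ANN, as the accuracies above suggest.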

170 citations

Journal ArticleDOI
TL;DR: In this article, two models, cellular automata (CA) and SLEUTH, are applied in a geographical information system (GIS) to simulate and predict urban growth and land use change for the City of Sana'a (Yemen) for the period 2004–2020.
Abstract: Effective and efficient planning of urban growth and land use change, and of their impact on the environment, requires information about growth trends and patterns, among other important information. Over the years, many urban growth models have been developed and used in developed countries for forecasting growth patterns. In developing countries, however, very few studies show the application of these models and their performance. In this study, two models, cellular automata (CA) and SLEUTH, are applied in a geographical information system (GIS) to simulate and predict urban growth and land use change for the City of Sana’a (Yemen) for the period 2004–2020. GIS-based maps were generated for the urban growth pattern of the city, which was further analyzed using geo-statistical techniques. During model calibration, a 35-year time series of data, including historical topographic maps, aerial photographs and satellite imagery, was used to identify the parameters that influenced urban growth. The validation result showed an overall accuracy of 99.6%, with a producer’s accuracy of 83.3% and a user’s accuracy of 83.6%. The SLEUTH model used the best-fit growth rule parameters from calibration to forecast the future urban growth pattern and generated various probability maps in which individual grid cells are urbanized assuming unique “urban growth signatures”. The models generated future urban growth patterns and land use changes for the period 2004–2020. Both models proved effective in forecasting growth patterns that will be useful in planning and decision making. In comparison, the CA model's growth pattern showed high-density development, in which growth edges were filled and clusters merged to form a compact built-up area including less agricultural land.
By contrast, the SLEUTH model's growth pattern showed more urban sprawl and low-density development that included substantial areas of agricultural land.
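The edge-filling, cluster-merging behaviour attributed to the CA model above comes from its core mechanism: a non-urban cell urbanizes when enough of its neighbours are already urban. A toy version (grid size, neighbour threshold and growth probability are arbitrary, not the calibrated Sana'a parameters):

```python
# Toy cellular-automata urban-growth sketch; parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
grid = np.zeros((20, 20), dtype=int)
grid[9:11, 9:11] = 1                     # small urban seed in the centre

def step(grid, threshold=2, p=0.5):
    """One growth step: non-urban cells with >= threshold urban
    neighbours urbanize with probability p."""
    # Count the 8 neighbours of every cell by summing shifted copies
    padded = np.pad(grid, 1)
    h, w = grid.shape
    neighbours = sum(
        padded[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    candidates = (grid == 0) & (neighbours >= threshold)
    grow = candidates & (rng.random(grid.shape) < p)
    return grid | grow.astype(int)

for _ in range(10):
    grid = step(grid)
print("urban cells after 10 steps:", grid.sum())
```

Because growth only occurs next to existing urban cells, the pattern stays compact — the high-density, edge-filling behaviour the comparison above describes, as opposed to SLEUTH's additional spontaneous and road-influenced growth rules that produce sprawl.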

152 citations

Journal ArticleDOI
TL;DR: In this article, the authors presented the application of remote sensing techniques, digital image analysis and Geographic Information System tools to delineate the degree of landslide hazard and risk areas in the Balik Pulau area in Penang Island, Malaysia.
Abstract: This paper presents the application of remote sensing techniques, digital image analysis and Geographic Information System tools to delineate the degree of landslide hazard and risk in the Balik Pulau area of Penang Island, Malaysia. Its causes were analysed through various thematic attribute data layers for the study area. Firstly, landslide locations were identified in the study area from the interpretation of aerial photographs, satellite imagery, field surveys, reports and previous landslide inventories. Topographic, geologic, soil and satellite images were collected and processed using Geographic Information System and image processing tools. Twelve landslide-inducing parameters were considered for the landslide hazard analyses. These parameters are: topographic slope, topographic aspect, plan curvature, distance to drainage and distance to roads, all derived from the topographic database; geology and distance to faults, derived from the geological database; land use/land cover, derived from Landsat satellite images; soil, derived from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value, derived from SPOT satellite images. In addition, hazard analyses were performed using landslide-occurrence factors with the aid of a statistically based frequency ratio model. Further, landslide risk analysis was carried out using the hazard map and socio-economic factors in a geospatial model. This landslide risk map could be used to estimate the risk to population, property and existing infrastructure such as transportation networks. Finally, to check the accuracy of the success-rate prediction, the hazard map was validated using the area-under-curve method. The prediction accuracy of the hazard map was 89%. Based on these results, the authors conclude that frequency ratio models can be used to mitigate hazards related to landslides and can aid in land-use planning.
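The frequency ratio model at the heart of the hazard analysis above is straightforward: for each class of a conditioning factor, FR is the share of landslide pixels falling in that class divided by the share of all pixels in that class, so FR > 1 flags classes with above-average landslide density. The slope classes and pixel counts below are invented for illustration:

```python
# Frequency-ratio (FR) sketch with made-up slope classes and pixel counts.
# pixel counts per slope class: (all pixels, landslide pixels)
classes = {
    "0-15 deg":  (60_000, 20),
    "15-30 deg": (30_000, 60),
    ">30 deg":   (10_000, 40),
}
total_pixels = sum(n for n, _ in classes.values())
total_slides = sum(s for _, s in classes.values())

for name, (n, s) in classes.items():
    fr = (s / total_slides) / (n / total_pixels)
    print(f"{name}: FR = {fr:.2f}")
```

Summing the FR values of a cell's classes across all twelve factors gives the hazard index that, reclassified, yields the hazard map the paper validates with the area-under-curve method.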

130 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

01 Jan 2016
Remote Sensing and Image Interpretation

1,802 citations

Book
01 Dec 1988
TL;DR: In this paper, the spectral energy distribution of the reflected light from an object made of a specific real material is obtained and a procedure for accurately reproducing the color associated with the spectrum is discussed.
Abstract: This paper presents a new reflectance model for rendering computer synthesized images. The model accounts for the relative brightness of different materials and light sources in the same scene. It describes the directional distribution of the reflected light and a color shift that occurs as the reflectance changes with incidence angle. The paper presents a method for obtaining the spectral energy distribution of the light reflected from an object made of a specific real material and discusses a procedure for accurately reproducing the color associated with the spectral energy distribution. The model is applied to the simulation of a metal and a plastic.
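The incidence-angle colour shift described above can be illustrated numerically. Note this is not the paper's own formulation: the snippet uses Schlick's later, simpler approximation to Fresnel reflectance, F(θ) = F0 + (1 − F0)(1 − cos θ)^5, and the copper-like F0 values are rough illustrative numbers:

```python
# Colour shift of reflectance with incidence angle via Schlick's
# Fresnel approximation (an illustrative stand-in, not the paper's model).
import numpy as np

def fresnel_schlick(cos_theta, f0):
    """Per-channel reflectance; f0 is reflectance at normal incidence."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

copper_f0 = np.array([0.95, 0.64, 0.54])   # rough RGB F0 for a copper-like metal
for angle_deg in (0, 45, 80):
    f = fresnel_schlick(np.cos(np.radians(angle_deg)), copper_f0)
    print(f"{angle_deg:2d} deg -> RGB reflectance {np.round(f, 3)}")
```

As the angle approaches grazing, all three channels tend toward 1, so the reflected colour desaturates toward the light-source colour — the kind of material-dependent colour shift the reflectance model accounts for.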

1,401 citations

Journal ArticleDOI
TL;DR: Based on the thermal radiance transfer equation, this paper develops a mono-window algorithm for retrieving land surface temperature (LST) from Landsat TM6 data.
Abstract: Remote sensing of land surface temperature (LST) from the thermal band data of the Landsat Thematic Mapper (TM) remains little used in comparison with the extensive studies of its visible and near-infrared (NIR) bands for various applications. The brightness temperature can be computed from the digital number (DN) of TM6 data using the equation provided by the National Aeronautics and Space Administration (NASA). However, a proper algorithm for retrieving LST from the sensor's single thermal band has remained unavailable due to the many difficulties of atmospheric correction. Based on the thermal radiance transfer equation, this paper attempts to develop a mono-window algorithm for retrieving LST from Landsat TM6 data. Three parameters are required for the algorithm: emissivity, transmittance and effective mean atmospheric temperature. A method for determining atmospheric transmittance is given through simulation of atmospheric conditions with the LOWTRAN 7 program. A ...
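The first step mentioned above — converting a TM6 digital number to at-sensor brightness temperature with the NASA equation — can be sketched as follows. The calibration constants are the commonly published Landsat-5 TM band-6 values; for real work they should be checked against the scene's own metadata:

```python
# DN -> at-sensor brightness temperature for Landsat-5 TM band 6.
# Constants are commonly published values; verify against scene metadata.
import math

K1 = 607.76      # W m^-2 sr^-1 um^-1
K2 = 1260.56     # kelvin
L_MIN, L_MAX = 1.2378, 15.3032   # spectral radiance range of TM band 6

def brightness_temperature(dn):
    """At-sensor brightness temperature (K) for a TM6 DN in 0..255."""
    radiance = L_MIN + (L_MAX - L_MIN) * dn / 255.0
    return K2 / math.log(K1 / radiance + 1.0)

print(f"DN 130 -> {brightness_temperature(130):.1f} K")
```

The mono-window algorithm then corrects this brightness temperature to LST using the three parameters the abstract names: emissivity, atmospheric transmittance and effective mean atmospheric temperature.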

1,134 citations