
Showing papers by "Ming Chuan University published in 2001"


Journal ArticleDOI
TL;DR: In this paper, the authors examined the relationship between the personality of the service providers and the service quality performance they provide and found that openness correlated with assurance, conscientiousness was a valid predictor of reliability, extraversion was positively related to responsiveness, and agreeableness significantly correlated with both empathy and assurance.
Abstract: The main objective of this paper is to examine the relationship between the personality of the service providers and the service quality performance they provide. The Five-Factor model of personality and the SERVQUAL model of service quality were used to substantiate the hypothesized relationship. Empirical data from 143 pairs of employees and customers indicate that employees with different personality traits perform differently on customers' perception of service quality. The results indicated that openness correlated with assurance, conscientiousness was a valid predictor of reliability, extraversion was positively related to responsiveness, and agreeableness significantly correlated with both empathy and assurance. Moreover, the relationship between personality and service quality was moderated by customers' gender. Overall, the outcome of this study can be applied to personnel allocation in accordance with the service quality strategy of a company.
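The trait–quality links reported above are correlational; a minimal sketch of the underlying computation is a Pearson correlation between an employee's trait score and the matched customer's quality rating. The six data pairs below are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical example: Pearson correlation between an employee personality
# trait score and a customer-rated service-quality dimension, as in the
# trait/SERVQUAL analysis described above. All values are made up.
trait = np.array([3.2, 4.1, 2.8, 4.5, 3.9, 2.5])    # e.g. conscientiousness
quality = np.array([3.0, 4.3, 2.9, 4.4, 4.0, 2.7])  # e.g. perceived reliability

r = np.corrcoef(trait, quality)[0, 1]
print(round(r, 3))
```

A strong positive r here would correspond to the paper's finding that conscientiousness predicts reliability; the moderation by customer gender would require splitting the sample before correlating.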

127 citations


Journal ArticleDOI
TL;DR: In this paper, a method for identifying the bottleneck of a production line with Markovian machines having different cycle times is proposed and applied, in a case study, to a camshaft production line at an automotive engine plant.
Abstract: The bottleneck of a production line is a machine that impedes the system performance in the strongest manner. In production lines with the so-called Markovian model of machine reliability, bottlenecks with respect to the downtime, uptime, and the cycle time of the machines can be introduced. The two former have been addressed in recent publications [1] and [2]. The latter is investigated in this paper. Specifically, using a novel aggregation procedure for performance analysis of production lines with Markovian machines having different cycle time, we develop a method for c-bottleneck identification and apply it in a case study to a camshaft production line at an automotive engine plant.
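A drastically simplified sketch of cycle-time ("c-") bottleneck identification: with reliable machines and ample buffering, the throughput of a serial line is 1/max(cycle time), and the c-bottleneck is the machine whose cycle-time reduction raises throughput the most. The paper's aggregation procedure for unreliable Markovian machines is far more involved; the cycle times below are invented.

```python
def throughput(cycle_times):
    # Steady-state rate of a reliable serial line with ample buffering.
    return 1.0 / max(cycle_times)

def c_bottleneck(cycle_times, delta=0.01):
    # The c-bottleneck is the machine whose small speed-up yields the
    # largest throughput gain.
    base = throughput(cycle_times)
    gains = []
    for i, c in enumerate(cycle_times):
        trial = list(cycle_times)
        trial[i] = c - delta          # speed machine i up slightly
        gains.append(throughput(trial) - base)
    return max(range(len(gains)), key=gains.__getitem__)

cycles = [1.0, 1.4, 1.1]              # machine 1 is the slowest
print(c_bottleneck(cycles))           # index of the c-bottleneck
```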

109 citations


Journal ArticleDOI
TL;DR: A fast and novel technique for color quantization using reduction of color space dimensionality and a fast pixel mapping algorithm based on the proposed data clustering algorithm are presented.

77 citations


Journal ArticleDOI
TL;DR: In this paper, personal accessibility is described as a measure of the potential ability of individuals within a household not only to reach activity opportunities, but also to do so with sufficient time available for participation in those activities, subject to the spatio-temporal constraints imposed by their daily obligations and transportation supply environment.
Abstract: Using the conceptual framework of time-space geography, this paper incorporates both spatio-temporal constraints and household interaction effects into a meaningful measure of the potential of a household to interact with the built environment. Within this context, personal accessibility is described as a measure of the potential ability of individuals within a household not only to reach activity opportunities, but to do so with sufficient time available for participation in those activities, subject to the spatio-temporal constraints imposed by their daily obligations and transportation supply environment. The incorporation of activity-based concepts in the measurement of accessibility as a product of travel time savings not only explicitly acknowledges a temporal dimension in assessing the potential for spatial interaction but also expands the applicability of accessibility consideration to such real-world policy options as the promotion of ride-sharing and trip chaining behaviors. An empirical application of the model system provides an indication of the potential of activity-based modeling approaches to assess the bounds on achievable improvements in accessibility and travel time based on daily household activity patterns. It also provides an assessment of roles for trip chaining and ride-sharing as potentially effective methods to facilitate transportation policy objectives.
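A minimal sketch of the space-time accessibility idea described above: an activity opportunity counts as accessible only if, within a person's free time window, there is enough time to travel there, participate for a minimum duration, and travel back. The function and all numbers are hypothetical, not the paper's model system.

```python
def accessible_opportunities(window_start, window_end, opportunities):
    """opportunities: list of (travel_time_each_way, min_duration)."""
    free = window_end - window_start
    # An opportunity is reachable if the round trip plus the minimum
    # participation time fits inside the free window.
    return sum(1 for travel, dur in opportunities
               if 2 * travel + dur <= free)

# Two free hours (in minutes); three candidate activities.
ops = [(20, 60),   # reachable: 20 + 60 + 20 = 100 <= 120
       (40, 50),   # not reachable: 40 + 50 + 40 = 130 > 120
       (10, 30)]   # reachable: 50 <= 120
print(accessible_opportunities(0, 120, ops))
```

Trip chaining and ride-sharing enter such a model by shortening the effective travel terms, which widens the set of reachable opportunities.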

68 citations


Journal ArticleDOI
TL;DR: In this paper, a modified measure to examine bank efficiency is proposed and it is found that banks' X-inefficiency has substantially dropped off in Taiwan over the last 10 years, falling from an average Xinefficiency magnitude of 3.9% in 1988 to 2.0% in 1997.
Abstract: This paper employs data envelopment analysis to investigate the effects of X-inefficiency on Taiwan's banking industry. A modified measure to examine bank efficiency is proposed and it is found that banks' X-inefficiency has substantially dropped off in Taiwan over the last 10 years, falling from an average X-inefficiency magnitude of 3.9% in 1988 to 2.0% in 1997. Banks improved their relative abilities both to maximize outputs and to minimize inputs between the periods before and after the 1990s deregulation. The results obtained in this research may affirm the validity of the banking deregulation policy in Taiwan.
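A sketch of the standard input-oriented CCR DEA model behind such efficiency scores: for each bank (DMU), minimize theta subject to a convex cone of all banks producing at least its outputs with at most theta times its inputs. The single input/output and the data values are invented, and this is the textbook CCR model, not the paper's modified measure.

```python
import numpy as np
from scipy.optimize import linprog

x = np.array([2.0, 4.0, 8.0])   # inputs of three hypothetical banks
y = np.array([2.0, 2.0, 4.0])   # outputs

def ccr_efficiency(k):
    n = len(x)
    # Decision variables: [theta, lambda_1..lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lambda_j x_j - theta * x_k <= 0   (inputs scaled down)
    A1 = np.r_[-x[k], x]
    # -sum_j lambda_j y_j <= -y_k             (outputs at least y_k)
    A2 = np.r_[0.0, -y]
    res = linprog(c, A_ub=[A1, A2], b_ub=[0.0, -y[k]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print([round(ccr_efficiency(k), 3) for k in range(3)])
```

A score of 1 means the bank lies on the efficient frontier; a score of 0.5 means it could in principle produce the same output with half the input.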

51 citations


Journal ArticleDOI
TL;DR: This work presents a model to make equipment choice decisions in a multi-product, multi-machine, and single-stage production environment with congestion effects, which is a nonlinear integer program and shows that the solution procedure is quite effective in solving industry size problems.

50 citations


Journal ArticleDOI
TL;DR: This paper initiates research into scheduling with batching in the no-wait flowshop environment and gives NP-hardness proofs for three special cases, suggesting that it is very unlikely that an efficient method can be devised to find an optimal schedule for the problem under study.

37 citations


Journal ArticleDOI
TL;DR: A skew detection method that first smooths the black runs and locates the black–white transitions to emphasize the text lines; the skew angle is then determined by an improved Hough transform for the block classification step.

34 citations


Journal ArticleDOI
TL;DR: In this paper, a computational multi-objective programming approach and a Leontief inter-industry model are used to investigate the impact of mitigating CO2 emissions on Taiwan's economy.

34 citations


Journal ArticleDOI
01 Dec 2001-System
TL;DR: In this article, the stream of thought was used as an impetus for English as a foreign language learners' self-reflections on the purpose of journal writing, and certain general patterns emerged in relation to vocabulary acquisition, organizational strategies, invention, personal expression, and thought.

29 citations


Proceedings ArticleDOI
15 Jul 2001
TL;DR: An adaptive learning algorithm is proposed to decrease the influences of variant run-time context domain and shows which objects are to be adjusted and how to adjust their probabilities by a neural network model.
Abstract: Contextual language processing plays an important role in the post-processing of speech recognition. The purpose of contextual language processing is to find the most plausible candidate for each syllable with the maximum likelihood probability. Generally, the performance of the probabilistic model is affected by two major errors, i.e., modeling error and estimation error in the training corpus. In this paper, we focus on the problem of estimation error in the training corpus. An adaptive learning algorithm is proposed to decrease the influence of a variant run-time context domain. A neural network model determines which objects are to be adjusted and how to adjust their probabilities. The resulting techniques are greatly simplified and robust. The experimental results demonstrate the effects of the learning algorithm from a generic domain to a specific domain. In general, these techniques can be easily extended to various language models and corpus-based applications.

Journal ArticleDOI
TL;DR: In this article, a 5-year moving average was adopted to calculate the transition probability of deck officers in Taiwan from 1993 to 1998 at different levels in the hierarchy, and an absorbing Markov transition matrix was also constructed to forecast the terms of seniority and the annual supply of shipboard officers.
Abstract: This study adopted a 5-year moving average to calculate the transition probability of deck officers in Taiwan from 1993–1998 at different levels in the hierarchy. An absorbing Markov transition matrix was also constructed to forecast the terms of seniority and the annual supply of deck officers. In addition, this work applied the GM (1,1) model of Grey theory to forecast the annual demand of deck officers, and used cross analysis to investigate the manpower supply and demand of ocean deck officers in Taiwan. Results in this study can provide a valuable reference for pertinent authorities when determining the manpower policy of shipping companies in Taiwan.
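A sketch of the absorbing-Markov-chain computation behind the seniority forecast: with Q the transition matrix among transient ranks, the fundamental matrix N = (I − Q)⁻¹ gives the expected number of visits to each rank, and N·1 gives the expected time before absorption (leaving the deck-officer hierarchy). The three ranks and the transition probabilities below are invented.

```python
import numpy as np

Q = np.array([[0.6, 0.3, 0.0],    # third officer: stay / promote
              [0.0, 0.7, 0.2],    # second officer
              [0.0, 0.0, 0.8]])   # chief officer

N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix
expected_years = N @ np.ones(3)   # expected time to absorption per rank
print(expected_years)
```

Row k of N can also be read as the expected number of years spent at each rank by an officer currently at rank k, which is how terms of seniority and annual supply at each level can be projected.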

Journal ArticleDOI
TL;DR: The problem of minimizing either the maximum tardiness or the number of tardy jobs in a two-machine flow shop with due date considerations is known to be strongly NP-hard.

Journal ArticleDOI
TL;DR: In this article, semi-Markov and Markov models for counting processes whose intensities are defined in terms of two stopping times T1 < T2 are considered, and problems of goodness-of-fit for these models are studied.
Abstract: Survival data with one intermediate state are described by semi-Markov and Markov models for counting processes whose intensities are defined in terms of two stopping times T1 < T2. Problems of goodness-of-fit for these models are studied. The test statistics are proposed by comparing Nelson-Aalen estimators for data stratified according to T1. Asymptotic distributions of these statistics are established in terms of the weak convergence of some random fields. Asymptotic consistency of these test statistics is also established. Simulation studies are included to indicate their numerical performance.
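A minimal sketch of the Nelson-Aalen estimator that the proposed statistics compare across strata: the cumulative hazard is estimated as the sum of d/n over event times, where d events occur among n subjects still at risk. The toy data are invented.

```python
def nelson_aalen(event_times, censored=()):
    times = sorted(set(event_times))
    all_obs = list(event_times) + list(censored)
    estimate, cum = [], 0.0
    for t in times:
        # Subjects at risk: anyone whose observed time is >= t.
        n_at_risk = sum(1 for s in all_obs if s >= t)
        d = event_times.count(t)      # events exactly at t
        cum += d / n_at_risk
        estimate.append((t, cum))
    return estimate

# Five subjects: events at 1, 2, 3; two censored at 2.5 and 4.
est = nelson_aalen([1, 2, 3], censored=[2.5, 4])
print(est)
```

Stratifying the data by T1 and comparing two such curves is the intuition behind the paper's goodness-of-fit tests for the Markov versus semi-Markov assumption.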

Proceedings ArticleDOI
29 Nov 2001
TL;DR: A data mining language is presented and an efficient data mining technique is proposed to extract sequential patterns according to user requests.
Abstract: Mining sequential patterns is used to discover sequential purchasing behaviors of most customers from a large amount of customer transactions. A data mining language is presented. From the data mining language, users can specify the interested items and the criteria of the sequential patterns to be discovered. Also, an efficient data mining technique is proposed to extract sequential patterns according to user requests.
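A minimal sketch of the support counting at the core of sequential pattern mining: a candidate pattern is supported by a customer if it occurs as a (not necessarily contiguous) subsequence of that customer's purchase sequence. Item names are invented, and this ignores the paper's query language and pruning machinery.

```python
def is_subsequence(pattern, sequence):
    # Consume the sequence left to right, matching pattern items in order.
    it = iter(sequence)
    return all(item in it for item in pattern)

def support(pattern, sequences):
    return sum(is_subsequence(pattern, s) for s in sequences)

customers = [["bread", "milk", "beer"],
             ["milk", "bread", "beer"],
             ["bread", "beer"]]

print(support(["bread", "beer"], customers))  # supported by all 3
print(support(["milk", "beer"], customers))   # supported by 2
```

A user request in the mining language described above would translate into constraints on which candidate patterns are generated before this support count is applied.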

Journal ArticleDOI
TL;DR: In this article, a method for nonparametric regression data analysis by analyzing the sensitivity to abnormally large perturbations with the Principal Hessian Directions (PHD) method is introduced, combining the merits of effective dimension reduction and visualization.
Abstract: A new method for nonparametric regression data analysis by analyzing the sensitivity to abnormally large perturbations with the Principal Hessian Directions (PHD) method (Li 1992) is introduced, combining the merits of effective dimension reduction and visualization. We develop techniques for detecting perturbed points without knowledge of the functional form of the regression model when a small percentage of observations is subject to abnormally large values. The main feature of our proposed method is to estimate the deviation angle of the PHD direction. The basic idea is to recursively trim out perturbed points which cause larger directional deviations. Our multiple trimming method always reduces the pattern-ambiguity of geometric shape information about the regression surface. Several simulations with empirical results are reported.

Journal ArticleDOI
TL;DR: In this article, a generalized foldover scheme was proposed to find all the sets of defining contrasts in a fractional factorial experiment with a design requirement such as resolution or estimation of some interactions.
Abstract: The order of experimental runs in a fractional factorial experiment is essential when the cost of level changes in factors is considered. The generalized foldover scheme given by [1] gives an optimal order to experimental runs in an experiment with specified defining contrasts. An experiment can be specified by a design requirement such as resolution or estimation of some interactions. To meet such a requirement, we can find several sets of defining contrasts. Applying the generalized foldover scheme to these sets of defining contrasts, we obtain designs with different numbers of level changes, and then the design with the minimum number of level changes. The difficulty is to find all the sets of defining contrasts. An alternative approach is investigated by [2] for two-level fractional factorial experiments. In this paper, we investigate experiments with all factors in s levels.
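A sketch of the quantity being minimized when ordering runs: the total number of factor-level changes between consecutive runs. The two 2³ run orders below are illustrative; standard (Yates) order is change-heavy, while a Gray-code-like order changes one factor per step.

```python
def level_changes(runs):
    # Count, over consecutive run pairs, how many factors change level.
    return sum(a != b
               for prev, cur in zip(runs, runs[1:])
               for a, b in zip(prev, cur))

standard = [(0,0,0), (1,0,0), (0,1,0), (1,1,0),
            (0,0,1), (1,0,1), (0,1,1), (1,1,1)]
gray_like = [(0,0,0), (1,0,0), (1,1,0), (0,1,0),
             (0,1,1), (1,1,1), (1,0,1), (0,0,1)]

print(level_changes(standard), level_changes(gray_like))
```

The generalized foldover scheme addresses the harder problem of achieving such low-change orders while respecting the defining contrasts of a fractional design.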

Journal ArticleDOI
TL;DR: This paper presents a data extraction and identification method for paper-based Chinese road maps that identifies characters and legends by distinguishing the characters from the legends using Bayes's theorem, and the performance of the proposed system is evaluated on 20 test maps.
Abstract: This paper presents a data extraction and identification method for paper-based Chinese road maps. First, the extraction of the map title box and the legend index table is accomplished by a rule with trained parameter values. Then they are identified by distinguishing the characters from the legends using Bayes's theorem. Second, the gray-level histogram of the large components is constructed and smoothed, and then the road images are filtered out using the multilevel thresholding technique. The extracted roads are further vectorized into line segments to save storage. The gaps between line segments are filled by a postprocessing procedure. Third, the characters and the legends are segmented by combining the small-component image and the difference image between the large-component image and the road images. The extracted legends are recognized by the proposed probabilistic template matching method. The performance of the proposed system is evaluated on 20 test maps, and the experimental results show that the proposed system is effective. © 2001 Society of Photo-Optical Instrumentation Engineers.
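A toy sketch of the multilevel thresholding step: smooth the gray-level histogram with a moving average, then take the valleys (local minima) between peaks as thresholds separating pixel classes such as roads and background. The tiny 8-level histogram is invented, and real maps would use 256 levels and the paper's trained parameters.

```python
def smooth(hist, w=3):
    # Simple moving average; window is clipped at the histogram edges.
    half = w // 2
    return [sum(hist[max(0, i - half):i + half + 1]) /
            len(hist[max(0, i - half):i + half + 1])
            for i in range(len(hist))]

def valley_thresholds(hist):
    s = smooth(hist)
    # A valley: strictly below the left neighbor, not above the right one.
    return [i for i in range(1, len(s) - 1)
            if s[i] < s[i - 1] and s[i] <= s[i + 1]]

hist = [9, 7, 2, 1, 3, 8, 9, 6]   # two peaks with a valley near level 3
print(valley_thresholds(hist))
```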

Book ChapterDOI
16 Apr 2001
TL;DR: A rough data mining approach to the student modeling problems is proposed as a knowledge discovery process in which a student's domain knowledge was discovered and rebuilt using rough set data mining techniques.
Abstract: Student modeling has been an active research area in the field of intelligent tutoring systems. In this paper, we propose a rough data mining approach to the student modeling problems. The problem is modeled as a knowledge discovery process in which a student's domain knowledge (classification rules) was discovered and rebuilt using rough set data mining techniques. We design two knowledge extraction modules based on the lower approximation set and upper approximation set of the rough set theory, respectively. To verify the effectiveness of the knowledge extraction modules, two similarity metrics are presented. A set of experiments is conducted to evaluate the capability of the knowledge extraction modules. At last, based on the experimental results some suggestions about a future knowledge extraction module are outlined.
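A minimal sketch of the rough-set machinery behind the two knowledge extraction modules: given the equivalence classes induced by the condition attributes (students indiscernible by their answers), the lower approximation of a concept X collects classes entirely inside X, while the upper approximation collects classes that touch X. The students and classes below are invented.

```python
def approximations(classes, X):
    # Lower approximation: union of classes fully contained in X.
    lower = {x for c in classes if c <= X for x in c}
    # Upper approximation: union of classes that intersect X.
    upper = {x for c in classes if c & X for x in c}
    return lower, upper

# Equivalence classes of students indiscernible by their test answers.
classes = [{1, 2}, {3}, {4, 5}]
X = {1, 2, 3, 4}            # students who actually master the concept

lower, upper = approximations(classes, X)
print(sorted(lower), sorted(upper))
```

Rules extracted from the lower approximation are certain, while those from the upper approximation are only possible; comparing a student's rules against the expert's via the two similarity metrics follows from this distinction.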

Journal ArticleDOI
01 Jun 2001
TL;DR: A new leasing credit granting model is proposed to improve the quality of leasing credit granting decision-making under a fuzzy environment, and a fuzzy breakeven analysis based on cost-volume-profit analysis using the concepts of triangular fuzzy numbers and linguistic variables is discussed.
Abstract: The main purpose of this paper is to apply fuzzy set theory to the leasing credit granting management. A new leasing credit granting model is proposed to improve the quality of leasing credit granting decision-making under fuzzy environment. A fuzzy breakeven analysis based on the cost-volume-profit (CVP) analysis by using the concepts of triangular fuzzy numbers and linguistic variables is discussed. A leasing client's granting quality determination model is constructed by combining fuzzy breakeven analysis with Markov chain. An algorithm of the proposed model is developed to see if there is any improvement in credit quality of client after leasing by a decision rule defined in this paper. The strategic decisions of leasing industry are also discussed in this study by using fuzzy breakeven analysis.
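A sketch of the fuzzy breakeven computation: represent fixed cost, unit price, and unit variable cost as triangular fuzzy numbers (a, b, c) and propagate them through the crisp CVP formula Q = F / (p − v) using the standard approximate TFN arithmetic for positive operands. All figures are invented, and the paper's full model additionally combines this with a Markov chain.

```python
def tfn_sub(x, y):          # (a, b, c) - (a', b', c')
    return (x[0] - y[2], x[1] - y[1], x[2] - y[0])

def tfn_div(x, y):          # approximate division for positive TFNs
    return (x[0] / y[2], x[1] / y[1], x[2] / y[0])

F = (900, 1000, 1100)       # fixed cost, "about 1000"
p = (9, 10, 11)             # unit price, "about 10"
v = (4, 5, 6)               # unit variable cost, "about 5"

margin = tfn_sub(p, v)      # contribution margin per unit
Q = tfn_div(F, margin)      # fuzzy breakeven quantity
print(margin, Q)
```

The wide spread of Q (its left and right endpoints) is exactly the uncertainty a crisp breakeven figure hides, which is what makes the fuzzy version useful for credit granting decisions.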

Proceedings ArticleDOI
11 Dec 2001
TL;DR: In this paper, a systematic instructional model for product eco-design education, including the review of related factors such as time schedule, capability of students, curriculum aims, teaching materials, and grading of assignments, has been established.
Abstract: Environmental consciousness has spread throughout the world in recent years. Related issues, including environmental regulations, energy development, the technology of eco-materials and manufacturing, and eco-design methodologies, have also grown in importance in both professional and educational fields. When planning eco-design instructional models and curricula, educators must confront a specialized and complicated eco-design body of knowledge, so there is a need to establish a curriculum structure and curriculum aims that give eco-design educators a guideline. The author has studied eco-design education models based on experience participating in several eco-design workshops and collecting teaching materials. This research compares two educational models: workshops and formal instruction at the university level. It is suggested that the curriculum structure can be categorized into three levels: fundamentals (such as introductions to ecology and environmental psychology), design practice (such as design guidelines and methodologies), and advanced research (such as advanced eco-design strategy and management, and advanced eco-technology). A systematic instructional model for product eco-design education has been established, including a review of related factors such as time schedule, capability of students, curriculum aims, teaching materials, and grading of assignments. Ultimately, this research provides a curriculum design guideline for eco-design educators.

Journal ArticleDOI
TL;DR: In this article, the effect of changing the tariff rate on the environmental qualities of imported goods was analyzed and the impact of environmental quality improvement on the optimal tariff rate was discussed; it was shown that if improving environmental qualities reduces the rate of decrease of consumers' marginal utilities, then a tariff reduction results in higher environmental qualities.
Abstract: A tariff is a trade measure with both trade and environmental effects. This article analyzes the interaction between the tariff rate and the short- and long-run environmental qualities of imperfectly substitutive imported goods. In the first part, we analyze the effect of changing the tariff rate on the environmental qualities of imported goods; that is, the importing country's government sets the tariff rate before the foreign exporters choose their environmental qualities. If improving environmental qualities reduces the rate of decrease of consumers' marginal utilities, then a reduction in the tariff rate results in higher environmental qualities; moreover, the long-run environmental qualities of imported goods will be higher. In the second part, we discuss the effect of environmental quality improvement on the optimal tariff; that is, the foreign exporters choose their environmental qualities before the home country's government sets the tariff rate. When the consumers are not environmentally conscious, whether or not the importance of environmental qualities is emphasized, it is optimal for the government to impose a lower (higher) tariff rate on the high (low) environmental quality imported goods.

Proceedings ArticleDOI
22 Aug 2001
TL;DR: The goal of the design is to achieve the maximum image quality under a target computational complexity and to provide a flexible fast block matching algorithm that allows the user to terminate the algorithm at any target computational complexity.
Abstract: In this paper, we propose a novel fast block matching algorithm design strategy based on the complexity-distortion optimization. The goal of our design is to achieve the maximum image quality under a target computational complexity and to provide a flexible fast block matching algorithm that allows user to terminate algorithm at any target computational complexity. Based on the proposed predictive complexity-distortion benefit list technique, which is employed to predict the motion compensation benefit, we modify the full-search block matching, three-step search, new three-step search, and four-step search to the flexible algorithm. Experimental results show that the flexible algorithms could achieve better performance than traditional fixed complexity algorithms from the viewpoint of complexity-distortion optimization.
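A sketch of the full-search block matching that the paper's flexible algorithms speed up: find the displacement minimizing the sum of absolute differences (SAD) between a block of the current frame and candidate blocks of the reference frame. The frame contents are synthetic, and the complexity-distortion benefit list is not modeled here.

```python
import numpy as np

def full_search(ref, block, top, left, radius):
    h, w = block.shape
    best, best_mv = float("inf"), (0, 0)
    # Exhaustively scan all displacements within the search radius.
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
                continue
            sad = np.abs(ref[y:y + h, x:x + w].astype(int) - block).sum()
            if sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv

ref = np.zeros((16, 16), dtype=int)
ref[4:8, 6:10] = 255                  # a bright patch in the reference frame
cur_block = ref[4:8, 6:10]            # current block contains that patch
# The block sits at (6, 5) in the current frame; true motion is (-2, +1).
print(full_search(ref, cur_block, top=6, left=5, radius=3))
```

Three-step and four-step search visit only a subset of these candidates; the paper's contribution is ordering candidates by predicted benefit so the scan can stop at any complexity budget.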

Journal ArticleDOI
TL;DR: In this article, the authors model the assignment of residual income claimancy to an R&D manager and apply the model to biotechnology firms, predicting differences in the residual income claimancy of managers and in other aspects of organization for innovative R&D firms such as biotechs.
Abstract: This paper models the assignment of residual income claimancy to an R&D manager and applies the model to biotechnology firms. Residual income claimancy provides incentives for the manager to monitor the R&D process. Because the nature of R&D and of monitoring scientific effort is different, our model predicts stark differences in the residual income claimancy of managers and in other aspects of organization for innovative R&D firms like biotechs. In particular, R&D firms are expected to be more owner-managed, more expert-managed, and smaller in size. Cross-sectional data on biotechnology firms is consistent with these implications. Additionally, longitudinal data indicate that as firms alter their focus on biotech research, their organizational structure changes as expected.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed the quasi-score statistic to establish test procedure for testing the hypothesis that whether the link function is correct or not in a quasi-likelihood model.
Abstract: In a quasi-likelihood model, we propose the quasi-score statistic to establish a test procedure for testing whether the link function is correct. In addition, four practical examples are given to show the advantage of the proposed test.

Journal ArticleDOI
TL;DR: In this article, an analytical model to calculate the small-signal parameters for GaAs/AlGaAs inverted high electron mobility transistors is presented, based on a self-consistent solution of the Schrodinger and Poisson's equations and non-linear velocity-electric field (vd−E) characteristic.

Journal ArticleDOI
TL;DR: The variable-depth-search heuristic (VDSH) is one of the solution methods developed to produce quality near-optimal solutions to the linear cost assignment problem; the numerical results statistically evince that different combinations of its two basic operations will result in solutions of different quality levels.
Abstract: The classical assignment problem seeks to determine a mapping between a set of assignees and a set of assignments. The linear cost assignment problem (LCGAP), as a generalized model, incorporates the relative workloads between assignees and assignments. Although LCGAP is computationally intractable, it has been extensively studied because of its practical implications in many real-world situations. The variable-depth-search heuristic (VDSH) is one of the solution methods that have been developed to produce quality near-optimal solutions to LCGAP. The main structure of VDSH consists of two basic operations: reassign and swap. In this paper, we make further observations on this effective heuristic method through a series of computational experiments. The numerical results statistically evince that different combinations of these two operations will result in solutions of different quality levels. These findings are expected to have similar implications for search heuristics for other optimization problems.
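A toy sketch of the two neighborhood moves named above, on an invented assignment instance: "reassign" moves one job to a different agent, "swap" exchanges the agents of two jobs, and local search applies the best improving move until none remains. Real VDSH chains such moves to variable depth and handles workload constraints, which this sketch omits.

```python
import itertools

cost = {  # cost[job] = per-agent costs (hypothetical)
    0: [4, 1, 3],
    1: [2, 0, 5],
    2: [3, 2, 2],
    3: [4, 2, 1],
}

def total(assign):
    return sum(cost[j][a] for j, a in assign.items())

def local_search(assign):
    improved = True
    while improved:
        improved = False
        best, move = total(assign), None
        for j, a in list(assign.items()):        # reassign moves
            for b in range(3):
                if b != a:
                    trial = dict(assign); trial[j] = b
                    if total(trial) < best:
                        best, move = total(trial), trial
        for j, k in itertools.combinations(assign, 2):   # swap moves
            trial = dict(assign)
            trial[j], trial[k] = assign[k], assign[j]
            if total(trial) < best:
                best, move = total(trial), trial
        if move is not None:
            assign, improved = move, True
    return assign

start = {j: 0 for j in cost}                     # every job on agent 0
final = local_search(start)
print(total(start), total(final))
```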

Journal ArticleDOI
TL;DR: In this article, the authors applied content analysis to characterize the state of research in funded projects and grey prediction to forecast its development trend, finding that the popular research topics are product and environment design, soft computing, intelligent systems, manufacturing planning, and service performance.
Abstract: Industrial engineering makes a great contribution to economic growth. Every year, individual research projects are subsidized by the industrial engineering program of the National Science Council, which develops research work in the field of industrial engineering. This study applied content analysis to characterize the state of research in these projects and used grey prediction to forecast their development trend. There are three conclusions: (1) the popular research topics are product and environment design, soft computing, intelligent systems, manufacturing planning, and service performance; (2) the popular research tools are experimental research, algorithms, and fuzzy theory; (3) from 1999 to 2002, the annual growth rates of research projects are 12.66%, 16.25%, 16.02%, and 16.17%, showing that government-subsidized industrial engineering research will continue to grow.
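A sketch of the GM(1,1) grey prediction model used for such trend forecasts: accumulate the series, fit the grey differential equation x0(k) + a·z(k) = b by least squares on the mean-background sequence z, then forecast from the exponential solution and de-accumulate. The input series is invented.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                      # accumulated series
    z = 0.5 * (x1[1:] + x1[:-1])            # mean background values
    # Least-squares fit of x0(k) = -a*z(k) + b.
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    def x1_hat(k):                          # model for accumulated series
        return (x0[0] - b / a) * np.exp(-a * k) + b / a
    # De-accumulate to forecast the original series.
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

series = [10.0, 12.7, 16.2, 20.6]           # roughly geometric growth
print(gm11_forecast(series, steps=1))
```

GM(1,1) needs only a handful of recent observations, which is why it suits short annual series like project counts.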

Proceedings Article
01 Feb 2001
TL;DR: This paper is concerned with the issue of how to acquire and represent conceptual knowledge explicitly from lexical definitions and their semantic network within a machine-readable dictionary.
Abstract: Knowledge acquisition is an essential and intractable task for almost every natural language processing study. To date, the corpus approach has been a primary means of acquiring lexical-level semantic knowledge, but it suffers from knowledge insufficiency when training on various kinds of corpora. This paper is concerned with how to acquire and represent conceptual knowledge explicitly from lexical definitions and their semantic network within a machine-readable dictionary. Some information retrieval techniques are applied to link lexical senses in WordNet with conceptual ones in Roget's categories. Our experimental results report an overall accuracy of 85.25% (87, 78, 89, and 87% for nouns, verbs, adjectives, and adverbs, respectively) when evaluating on polysemous words discussed in previous literature.