
Showing papers by "University of Piraeus" published in 2002


Journal ArticleDOI
TL;DR: This paper develops an alternative approach for dealing with imprecise data in DEA by transforming a non-linear DEA model into a linear programming equivalent on the basis of the original data set, applying transformations only to the variables.
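
As a hedged sketch of the kind of linearization involved (a generic interval-data formulation, not necessarily the authors' exact model): in the multiplier form of DEA, products of weights and imprecise data are non-linear when both factors are unknown, and substituting new variables for those products restores linearity.

    \max_{u,v} \sum_r u_r y_{r j_0}
    \quad \text{s.t.} \quad \sum_i v_i x_{i j_0} = 1, \qquad
    \sum_r u_r y_{r j} - \sum_i v_i x_{i j} \le 0 \;\; \forall j .

    % With interval data x_{ij} \in [x_{ij}^L, x_{ij}^U] the products v_i x_{ij}
    % are non-linear; setting q_{ij} = v_i x_{ij} and imposing
    v_i x_{ij}^L \le q_{ij} \le v_i x_{ij}^U
    % (and similarly for the outputs) yields a linear program in (u, v, q).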

357 citations


Journal ArticleDOI
TL;DR: The global efficiency approach is introduced as a means to improve the discriminating power of Data Envelopment Analysis (DEA) and the units that can maintain their efficiency score under common weighting structures are dealt with.
Abstract: We introduce in this paper the global efficiency approach as a means to improve the discriminating power of Data Envelopment Analysis (DEA). To discriminate further among the DEA efficient units, we deal only with the units that can maintain their efficiency score under common weighting structures. Then we proceed further to ranking the whole set of DEA efficient units. We compare the global efficiency approach with the multi-criteria DEA and the cross-efficiency approaches on the basis of characteristic numerical examples drawn from the literature.
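
To make the kind of LP underlying these efficiency scores concrete, here is a hedged sketch of a standard input-oriented CCR multiplier model solved with scipy; the data are made up, and the paper's common-weights and ranking machinery is not reproduced.

    # Hedged sketch: standard CCR multiplier-form DEA efficiency via linear
    # programming (illustrative data; not the paper's global-efficiency model).
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, j0):
        """Input-oriented CCR efficiency of unit j0.
        X: (m, n) inputs, Y: (s, n) outputs; columns are units."""
        m, n = X.shape
        s = Y.shape[0]
        # Decision vector z = [u (s output weights), v (m input weights)], all >= 0.
        c = np.concatenate([-Y[:, j0], np.zeros(m)])              # maximize u'y_j0
        A_eq = np.concatenate([np.zeros(s), X[:, j0]])[None, :]   # v'x_j0 = 1
        b_eq = [1.0]
        A_ub = np.hstack([Y.T, -X.T])                             # u'y_j - v'x_j <= 0
        b_ub = np.zeros(n)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * (s + m), method="highs")
        return -res.fun

    X = np.array([[2.0, 3.0, 6.0], [4.0, 2.0, 3.0]])   # 2 inputs, 3 units
    Y = np.array([[1.0, 1.0, 1.0]])                    # 1 output
    print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])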

159 citations


Journal ArticleDOI
TL;DR: Experiments performed on real field data show that utilization of the bilinear model parameters as features improves correct classification scores at the cost of increased complexity and computations.
Abstract: Objectives: This paper focuses on the person identification problem based on features extracted from the ElectroEncephaloGram (EEG). A bilinear rather than a purely linear model is fitted on the EEG signal, prompted by the existence of non-linear components in the EEG signal – a conjecture already investigated in previous research works. The novelty of the present work lies in the comparison between the linear and the bilinear results, obtained from real field EEG data, aiming towards the identification of healthy subjects rather than the classification of pathological cases for diagnosis. Methods: The EEG signal of an, in principle, healthy individual is processed via linear (AR) and non-linear (bilinear) methods and classified by an artificial neural network classifier. Results: Experiments performed on real field data show that utilization of the bilinear model parameters as features improves correct classification scores at the cost of increased complexity and computations. Results are statistically significant at the 99.5% level, via the χ² test for contingency. Conclusions: The results obtained in the present study further corroborate existing research, which shows evidence that the EEG carries individual-specific information and that it can be successfully exploited for purposes of person identification and authentication.
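
A minimal sketch of the linear (AR) branch of such a pipeline, on synthetic signals: the paper's bilinear model and neural-network classifier are replaced here, for brevity, by least-squares AR features and a nearest-centroid rule.

    # Hedged sketch: AR-coefficient features for EEG-like signals plus a
    # nearest-centroid classifier; stands in for the paper's AR/bilinear
    # features and neural-network classifier. All data are synthetic.
    import numpy as np

    def ar_features(x, p=6):
        """Least-squares AR(p) coefficients of a 1-D signal."""
        X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
        y = x[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef

    rng = np.random.default_rng(0)

    def fake_eeg(subject, n=1024):
        """Subject-specific AR(2) process standing in for a real recording."""
        a = 0.5 + 0.1 * subject
        x = np.zeros(n)
        for t in range(2, n):
            x[t] = a * x[t - 1] - 0.4 * x[t - 2] + rng.standard_normal()
        return x

    # Per-subject centroid of AR features over several "recordings".
    train = {s: np.mean([ar_features(fake_eeg(s)) for _ in range(10)], axis=0)
             for s in range(3)}
    probe = ar_features(fake_eeg(1))
    pred = min(train, key=lambda s: np.linalg.norm(train[s] - probe))
    print("identified subject:", pred)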

145 citations


Journal ArticleDOI
TL;DR: The hypothesis that a pedagogical agent incorporated in an ITS can enhance students' learning experience is confirmed and the hypothesis that the presence of the agent improves short-term learning effects was rejected.
Abstract: This paper describes the evaluation of the persona effect of a speech-driven anthropomorphic agent that has been embodied in the interface of an intelligent tutoring system (ITS). This agent is responsible for guiding the student in the environment and communicating the system's feedback messages. The agent was evaluated in terms of the effect that it could have on students' learning, behaviour and experience. The participants in the experiment were divided into two groups: half of them worked with a version of the ITS which embodied the agent and the rest worked with an agent-less version. The results from this study confirm the hypothesis that a pedagogical agent incorporated in an ITS can enhance students' learning experience. On the other hand, the hypothesis that the presence of the agent improves short-term learning effects was rejected.

139 citations


Journal ArticleDOI
TL;DR: This paper provides empirical evidence on the causal relationship between stock prices and exchange rate volatility in four East Asian countries, using a GARCH model for which a BEKK representation is adopted and testing for the relevant zero restrictions on the conditional variance parameters.
Abstract: In this paper we provide some empirical evidence on the causal relationship between stock prices and exchange rate volatility in four East Asian countries. In order to test for causality-in-variance, we use a GARCH model for which a BEKK representation is adopted, and then test for the relevant zero restrictions on the conditional variance parameters. We find that in the pre-crisis sample stock prices lead exchange rates negatively in Japan and South Korea (consistent with the portfolio approach) and positively in Indonesia and Thailand. In the latter two countries, after the onset of the 1997 East Asian crisis, the spillover effects are found to be bidirectional. Copyright © 2002 John Wiley & Sons, Ltd.
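
For orientation, a bivariate BEKK(1,1) representation has the standard form below (the paper's exact specification may differ in detail), and causality-in-variance tests then reduce to zero restrictions on off-diagonal parameters:

    H_t = C^{\top} C + A^{\top} \varepsilon_{t-1} \varepsilon_{t-1}^{\top} A + B^{\top} H_{t-1} B ,

with no volatility spillover from series 1 to series 2 iff a_{12} = b_{12} = 0, and from series 2 to series 1 iff a_{21} = b_{21} = 0.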

132 citations


Journal ArticleDOI
TL;DR: The authors showed that higher initial margin requirements are associated with lower subsequent stock market volatility during normal and bull periods, but show no relationship during bear periods, and that higher margins are also negatively related to the conditional mean of stock returns, apparently because they reduce systemic risk.
Abstract: Higher initial margin requirements are associated with lower subsequent stock market volatility during normal and bull periods, but show no relationship during bear periods. Higher margins are also negatively related to the conditional mean of stock returns, apparently because they reduce systemic risk. We conclude that a prudential rule for setting margins (or other regulatory restrictions) is to lower them in sharply declining markets in order to enhance liquidity and avoid a depyramiding effect in stock prices, but subsequently raise them and keep them at the higher level in order to prevent a future pyramiding effect. Copyright 2002, Oxford University Press.

86 citations


Journal ArticleDOI
TL;DR: A heuristic algorithm is presented for finding a sub-optimal solution of the single machine scheduling problem in which the critical scheduling parameter, namely job value, deteriorates exponentially over time.
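
The TL;DR gives no algorithmic detail, so the following is only an illustrative greedy heuristic for this class of problem, assuming job j yields value v_j·exp(-r_j·C_j) at its completion time C_j; the priority index is a plausible choice, not the authors' rule.

    # Hedged sketch of a greedy heuristic for single-machine scheduling with
    # exponentially deteriorating job values (illustrative; not the paper's
    # algorithm). Job j completing at time C yields value v * exp(-r * C).
    import math
    from itertools import permutations

    jobs = [  # (value, decay rate, processing time) -- made-up instance
        (10.0, 0.10, 3.0), (8.0, 0.30, 1.0), (15.0, 0.05, 5.0), (6.0, 0.50, 2.0),
    ]

    def total_value(order):
        t, total = 0.0, 0.0
        for v, r, p in order:
            t += p
            total += v * math.exp(-r * t)
        return total

    # Greedy rule: schedule first the job whose value erodes fastest per
    # unit of processing time (one plausible priority index).
    greedy = sorted(jobs, key=lambda j: -(j[0] * j[1]) / j[2])
    best = max(permutations(jobs), key=total_value)   # exact, for comparison
    print(round(total_value(greedy), 3), round(total_value(best), 3))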

57 citations


Journal ArticleDOI
TL;DR: An environmental analysis of the used starting, lighting and ignition (SLI) batteries sector, based on the logistics involved in the recovery process and the tools available for measuring the environmental impact of such a process, is performed in this article.
Abstract: An environmental analysis of the used starting, lighting and ignition (SLI) batteries sector, based on the logistics involved in the recovery process and the tools available for measuring the environmental impact of such a process, is performed in this paper. The different phases of the reverse supply chain of used batteries are analysed, and the operations and practices involved are related to their environmental impact using a life cycle analysis. The analysis is exemplified by using the results of a case study. In addition, the situation in different countries from a legislative point of view is presented. Finally, the results of the analysis are summarised and several important issues arising from the analysis are discussed.

57 citations


Journal ArticleDOI
TL;DR: This paper is concerned with a ship's container load planning which satisfies two often conflicting considerations: ship stability and the minimum number of rehandles required.
Abstract: The efficiency of a container terminal depends primarily on the smooth and orderly process of handling containers, especially during the ship's loading procedure. The loading plan is mainly determined by two considerations: ship stability and minimum number of rehandles required. These two basic considerations are often in conflict. Most containerships have a cellular structure, imposing a strong restriction on the order of the container loading sequence. To preserve a ship's stability, some containers may be stowed in a ship hold in middle vertical locations. A similar loading problem exists in the stacking of yard containers. If these containers are stacked in the yard under others which are to be picked up later, then the loading process requires a number of container rehandles. This paper is concerned with a ship's container load planning which satisfies these two considerations and minimises the number of rehandles.
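
As a toy illustration of the rehandle notion only (not the paper's planning model): each container that must be moved aside because it sits above one needed earlier in the loading order costs one rehandle.

    # Hedged toy sketch of counting rehandles in a single yard stack
    # (illustrative only; the paper's load-planning model is richer).
    def count_rehandles(stack, loading_order):
        """stack: bottom-to-top list of container ids.
        loading_order: ids in the order the ship needs them."""
        stack = list(stack)
        buffer, rehandles = [], 0
        for wanted in loading_order:
            if wanted in buffer:            # already moved aside earlier
                buffer.remove(wanted)
                continue
            while stack and stack[-1] != wanted:
                buffer.append(stack.pop())  # move a blocking container aside
                rehandles += 1
            stack.pop()                     # pick up the wanted container
        return rehandles

    print(count_rehandles(["A", "B", "C"], ["A", "B", "C"]))  # -> 2 rehandles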

54 citations


Proceedings ArticleDOI
04 Mar 2002
TL;DR: This work introduces an efficient methodology for processor-core self-testing which requires knowledge of the instruction set and the Register Transfer (RT) level description, and is more efficient in terms of fault coverage, test code size and test application time.
Abstract: Software self-testing for embedded processor cores based on their instruction set is a topic of increasing interest, since it provides an excellent test resource partitioning technique for sharing the testing task of complex systems-on-chip (SoC) between slow, inexpensive testers and embedded code stored in memory cores of the SoC. We introduce an efficient methodology for processor-core self-testing which requires knowledge of the instruction set and the Register Transfer (RT) level description. Compared with functional testing methodologies proposed in the past, our methodology is more efficient in terms of fault coverage, test code size and test application time. Compared with recent software-based structural testing methodologies for processor cores, our methodology is superior in terms of test development effort and has significantly smaller code size and memory requirements, while virtually the same fault coverage is achieved with an order of magnitude smaller test application time.

50 citations


Journal ArticleDOI
TL;DR: dCaP, by individualizing the dCa concentrations used and timing the switching between them, may reduce intradialytic BP instability and simultaneously minimize the risk of HD patients developing hypercalcemia.

Journal ArticleDOI
TL;DR: In this article, the authors provide a unifying framework in which identified offset and sterilization equations can be derived and estimated for the German experience of the 1980s and the results point to active sterilization by the Bundesbank during this period.
Abstract: This paper provides a unifying framework in which identified offset and sterilization equations can be derived and estimated. The theoretical model suggests that, in the case where the central bank cares about both external and internal goals and capital is less than perfectly mobile, there will be some offsetting capital flows and the central bank will sterilize. Several results from the literature are encompassed as special cases. The equations are estimated for the German experience of the 1980s and the results point to active sterilization by the Bundesbank during this period. Copyright © 2002 John Wiley & Sons, Ltd.
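
In the generic notation of this literature (stated for orientation; the paper's identified system involves further regressors and cross-equation restrictions), the two equations relate changes in the central bank's net foreign assets (NFA) and net domestic assets (NDA):

    \Delta \mathrm{NFA}_t = \alpha + \beta\, \Delta \mathrm{NDA}_t + \gamma^{\top} z_t + u_t
    % offset equation: \beta \to -1 under perfect capital mobility
    \Delta \mathrm{NDA}_t = \mu + \delta\, \Delta \mathrm{NFA}_t + \theta^{\top} w_t + v_t
    % sterilization equation: \delta \to -1 under full sterilization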

Book ChapterDOI
TL;DR: Design principles for closed loop supply chains are studied from both a theoretical perspective and a business case, with the aim of closing goods flows, thereby limiting emissions and residual waste, while providing customer service at low cost.
Abstract: Closed loop supply chains aim at closing goods flows, thereby limiting emissions and residual waste, while also providing customer service at low cost. In this paper we study design principles for closed loop supply chains, from both a theoretical perspective and a business case. Obvious improvements can be made by applying traditional ‘forward logistics’ principles, but new, life-cycle-driven principles also need to be applied. This can be supported by advanced management tools such as LCA and LCC.

Proceedings ArticleDOI
06 Oct 2002
TL;DR: This paper describes the method for initializing the student model in a Web-based Algebra Tutor, which is called Web-EasyMath, which uses an innovative combination of stereotypes and the distance weighted k-nearest neighbor algorithm to initialize the model of a new student.
Abstract: In this paper we describe the method for initializing the student model in a Web-based Algebra Tutor called Web-EasyMath. The system uses an innovative combination of stereotypes and the distance weighted k-nearest neighbor algorithm to initialize the model of a new student. In particular, the student is first assigned to a stereotype category concerning her/his knowledge level, based on her/his performance on a preliminary test. The system then initializes all aspects of the student model using the distance weighted k-nearest neighbor algorithm among the students that belong to the same stereotype category as the new student. The basic idea of the application of the algorithm is to weigh the contribution of each of the neighbor students according to their distance from the new student; the distance between students is calculated based on a similarity measure. In the case of Web-EasyMath the similarity measure is estimated taking into account the school class students belong to, their degree of carefulness while solving exercises, as well as their proficiency in using simple arithmetic operations.
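
A minimal sketch of the distance-weighted k-nearest-neighbor step, with made-up features (school class, carefulness, arithmetic proficiency) and inverse-square-distance weights; the scaling and similarity measure are assumptions, not the paper's exact definitions.

    # Hedged sketch of distance-weighted k-NN initialization of one
    # student-model attribute (feature encoding is illustrative).
    import numpy as np

    def dwknn_estimate(new_student, peers, values, k=3, eps=1e-9):
        """peers: (n, d) features of same-stereotype students;
        values: (n,) their known values for one student-model aspect."""
        d = np.linalg.norm(peers - new_student, axis=1)
        idx = np.argsort(d)[:k]
        w = 1.0 / (d[idx] ** 2 + eps)       # closer students weigh more
        return float(np.dot(w, values[idx]) / w.sum())

    # features: [school class, carefulness, arithmetic proficiency] in [0, 1]
    peers = np.array([[0.5, 0.8, 0.7], [0.5, 0.6, 0.9], [0.8, 0.4, 0.5]])
    knowledge = np.array([0.75, 0.85, 0.40])
    print(dwknn_estimate(np.array([0.5, 0.7, 0.8]), peers, knowledge))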

Book ChapterDOI
TL;DR: An improved KeyWord Interface (KWI) is proposed to enhance the efficiency of information retrieval when using an advanced search engine, as an intelligent agent, forming a loose/adaptive semantic network.
Abstract: This work proposes an improved KeyWord Interface (KWI) to enhance the efficiency of information retrieval when using an advanced search engine as an intelligent agent. This can be achieved by restructuring the KWI into a new hierarchical structure based on an {n domains} by {3 levels} arrangement of keywords (n×3 KWI), forming a loose/adaptive semantic network. The hierarchical levels used in the suggested implementation are set of species, logical category, and holistic entity. As an illustration, the method is applied to an example of a literature survey concerning a well-documented process engineering field. The results of the proposed technology are compared with the outcome of general-purpose search engines built into common academic publication databases. The comparison reveals the advantage of intelligent searching in creating a local base according to the orders/interests of the researcher.
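
A toy rendering of the n-domains-by-3-levels arrangement as a data structure (all keyword names are illustrative, not from the paper):

    # Hedged toy sketch of an n-domains-by-3-levels keyword structure.
    kwi = {
        "separation processes": {          # one of the n domains
            "holistic entity":  ["process engineering"],
            "logical category": ["distillation", "extraction"],
            "set of species":   ["packed column", "reflux ratio"],
        },
    }
    # Flatten the hierarchy into a keyword list for a query.
    query = [kw for levels in kwi.values() for kws in levels.values() for kw in kws]
    print(query)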

Journal ArticleDOI
TL;DR: In this article, the applicability and effectiveness of Stereolithography rapid prototyping to the field of scale modeling for architectural design evaluation and demonstration purposes is investigated, and two scale models concerning a modern renovated track and field sports facility and a reconstructed ancient stadium are examined.
Abstract: The purpose of this paper is to investigate the applicability and effectiveness of Stereolithography rapid prototyping in the field of scale modelling for architectural design evaluation and demonstration purposes. Two scale models concerning a modern renovated track and field sports facility and a reconstructed ancient stadium are examined. Both models were constructed by assembling resin parts fabricated with Stereolithography instead of milling. Critical issues encountered during the building phase of the two models are addressed and presented in detail. Comments are made on the CAD requirements of the parts' geometry, on the part building and post-processing phases, as well as on the accuracy finally achieved. Problems associated with the computational time related to the 3-D solid modelling, and with the physical properties of the parts, are also discussed. The present Stereolithography methodology is compared to conventional model building techniques by investigating efficiency and productivity factors, quality matters and time requirements.

Journal ArticleDOI
TL;DR: The authors highlight the conversations-for-action needed for effective coordination in both strategic and operational planning, identifying the lack of systemic framing of problems and the poor coordination resulting from inefficient communication patterns as significant causes of disappointing results.
Abstract: As we witness a changing landscape of business and world affairs, it becomes increasingly important to maintain a systemic understanding of developments as a guide for improved strategic management. Organizations that embarked on implementing new approaches (TQM, reengineering, learning organizations, etc.) for significant performance improvements have experienced limited or no success. In periods of rapid change, the paramount challenges in strategic management are related to the effective execution of current plans and the adaptation needed to cope with change. Two significant causes of disappointing results have been the lack of framing problems in systemic terms and the poor coordination resulting from inefficient communication patterns. In this sense, any solution to a problem contains the seeds of the next set of problems. An effective approach for handling these deficiencies highlights the conversations-for-action needed for effective coordination in both strategic and operational planning. …

Journal ArticleDOI
TL;DR: A state model is developed which represents the behavior of elementary structures degrading due to aging processes and stochastic environmental loads, and it is shown that this can be used to model random loads induced by external events.

Journal ArticleDOI
TL;DR: Simple approximation techniques are developed exploiting relationships between generalized convex orders and appropriate probability metrics; results connecting positive/negative dependence concepts and convex ordering lead to approximations and bounds for the distributions of sums of positive/negative dependent random variables.
Abstract: Simple approximation techniques are developed exploiting relationships between generalized convex orders and appropriate probability metrics. In particular, the distance between s-convex ordered random variables is investigated. Results connecting positive/negative dependence concepts and convex ordering are also presented. These results lead to approximations and bounds for the distributions of sums of positive/negative dependent random variables. Applications and extensions of the main results pertaining to compound Poisson, normal and exponential approximation are provided as well.
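
As a reminder of the underlying notion (standard definition, stated here for orientation rather than quoted from the paper):

    X \preceq_{s\text{-cx}} Y \iff \mathbb{E}[\phi(X)] \le \mathbb{E}[\phi(Y)]
    \quad \text{for all } s\text{-convex } \phi \text{ for which the expectations exist,}

where \phi is s-convex when all its divided differences of order s are non-negative; s = 1 recovers the usual stochastic order and s = 2 the convex order.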

Journal ArticleDOI
TL;DR: In this article, the authors used the hole-drilling strain gage method of stress relaxation to determine the polymerisation-induced residual stresses generated in stereolithography (SL) built test specimens.
Abstract: This paper presents an experimental study undertaken to determine the polymerisation-induced residual stresses generated in stereolithography (SL) built test specimens, using the hole-drilling strain gage method of stress relaxation. Experimentally measured strains, obtained with special three-element strain gage rosettes, were input into the blind-hole analysis to calculate the induced residual stresses. The mechanical properties of resin specimens fabricated by the solidification process using an epoxy-based photopolymer and post-cured under ultraviolet (UV) and thermal exposure were determined and incorporated into the subsequent drill-hole analysis. The effect of the pre-selected fabrication parameters (hatching space and curing depth) and subsequently of the post-curing procedure (UV, thermal curing) on the magnitude of the recorded strains is also presented.
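
For context, blind-hole rosette analyses of this type conventionally recover the principal residual stresses from the three relieved strains through relations of the following ASTM E837-style form (quoted from the general method with calibration constants A and B, not from the paper, so conventions should be checked against the standard):

    \sigma_{\max},\, \sigma_{\min} = \frac{\varepsilon_1 + \varepsilon_3}{4A}
    \mp \frac{1}{4B}\sqrt{(\varepsilon_3 - \varepsilon_1)^2
    + (\varepsilon_1 + \varepsilon_3 - 2\varepsilon_2)^2}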

Proceedings ArticleDOI
08 Apr 2002
TL;DR: This paper includes a brief discussion of the major reverse logistics issues, presents the new e-business models in this field, and identifies key factors for their competitive advantages.
Abstract: Reverse logistics, i.e. all operations related to the extension of a useful lifecycle for used products, commercial returns, excess inventory and packaging materials, is gaining increasing attention globally for its promising financial potential, its sustainable growth alternatives and the environmentally positive impact that it could have. In this paper, we include a brief discussion of the major reverse logistics issues and we present the new e-business models in this field. We identify key factors for their competitive advantages, and we discuss conceptual and current opportunities for these e-business models to thrive and advance. Finally, we outline some possible future developments of e-commerce tools for reverse logistics.


Journal ArticleDOI
TL;DR: A graphical user interface (GUI) that provides intelligent help to users is described; its reasoning is based on an adaptation of a cognitive theory called human plausible reasoning and on goal recognition from the effects of users' commands.
Abstract: This article is about a graphical user interface (GUI) that provides intelligent help to users. The GUI is called IFM (Intelligent File Manipulator). IFM monitors users while they work; if a user has made a mistake with respect to his or her hypothesized intentions, then IFM intervenes automatically and offers advice. IFM has two underlying reasoning mechanisms: one is based on an adaptation of a cognitive theory called human plausible reasoning, and the other performs goal recognition based on the effects of users' commands. The requirement analysis of the system was based on an empirical study conducted with real users of a standard file manipulation program like the Windows Explorer; this analysis revealed a need for intelligent help. Finally, IFM has been evaluated in comparison with a standard file manipulation GUI and in comparison with human experts acting as consultants. The results of the evaluation showed that IFM can successfully produce advice that is helpful to users.

Proceedings ArticleDOI
04 Nov 2002
TL;DR: A new density-biased sampling algorithm exploits spatial indexes and the local density information they preserve to provide improved sampling quality and fast access to elements of the dataset.
Abstract: In this paper we describe a new density-biased sampling algorithm. It exploits spatial indexes and the local density information they preserve, to provide improved quality of sampling result and fast access to elements of the dataset. It attains improved sampling quality, with respect to factors like skew, noise or dimensionality. Moreover, it has the advantage of efficiently handling dynamic updates, and it requires low execution times. The performance of the proposed method is examined experimentally. The comparative results illustrate its superiority over existing methods.
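
As a hedged sketch of the general idea only (a flat-grid version in the spirit of earlier density-biased sampling work; the paper's contribution is to drive this from a spatial index instead): points in sparse cells are kept with higher probability than points in dense cells.

    # Hedged sketch of grid-based density-biased sampling (generic idea;
    # the paper's algorithm uses spatial indexes rather than a flat grid).
    import numpy as np

    def density_biased_sample(points, target, e=0.5, bins=10, seed=0):
        """Keep each point with probability proportional to its cell
        density**(-e): e=0 is uniform sampling, larger e favours sparse cells."""
        rng = np.random.default_rng(seed)
        mins, maxs = points.min(0), points.max(0)
        cells = np.floor((points - mins) / (maxs - mins + 1e-12) * bins).astype(int)
        keys = [tuple(c) for c in cells]
        counts = {}
        for k in keys:                      # occupancy of each grid cell
            counts[k] = counts.get(k, 0) + 1
        dens = np.array([counts[k] for k in keys], dtype=float)
        w = dens ** (-e)
        p = target * w / w.sum()            # expected sample size ~ target
        return points[rng.random(len(points)) < np.minimum(p, 1.0)]

    pts = np.vstack([np.random.default_rng(1).normal(0, 1, (1000, 2)),
                     np.random.default_rng(2).uniform(-4, 4, (50, 2))])
    print(density_biased_sample(pts, target=100).shape)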

Journal ArticleDOI
TL;DR: In this paper, a simple risk model and the discrete multiple scan statistic are used to study the occurrences of clusters of threshold exceedances by the individual claims; a compound Poisson approximation is established and certain asymptotic results are obtained for both the risk model and a financial problem of a similar nature.
Abstract: In this paper, we consider a simple risk model and study the occurrences of clusters of threshold exceedances by the individual claims. The statistic used to study the model is the discrete multiple scan statistic. A compound Poisson approximation is established and certain asymptotic results are obtained for both the risk model and a financial problem of a similar nature. Finally, we review two typical examples from areas of applied science where the outcomes of this paper may have beneficial impact.
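
To fix ideas, the (single) discrete scan statistic over a window of length k is the maximum number of threshold exceedances among any k consecutive claims; a toy computation with made-up parameters:

    # Hedged toy computation of a discrete scan statistic: the maximum
    # number of threshold exceedances in any window of k consecutive claims.
    import numpy as np

    rng = np.random.default_rng(42)
    claims = rng.exponential(scale=1.0, size=200)   # made-up claim sizes
    exceed = (claims > 2.5).astype(int)             # threshold exceedances

    k = 10
    scan = max(exceed[i:i + k].sum() for i in range(len(exceed) - k + 1))
    print("scan statistic S_k =", scan)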

Journal ArticleDOI
TL;DR: In this paper, the authors introduce and perform a detailed study of a class of multistate reliability structures in which no ordering in the levels of components' performances is necessary, and develop exact reliability formulae, reliability bounds and asymptotic results.
Abstract: The primary objective of this work is to introduce and perform a detailed study of a class of multistate reliability structures in which no ordering in the levels of components' performances is necessary. In particular, the present paper develops the basic theory (exact reliability formulae, reliability bounds, asymptotic results) that will make it feasible to investigate systems whose components are allowed to experience m ≥ 2 kinds of failure (failure modes), and their breakdown is described by different families of cut sets in each mode. For illustration purposes, two classical (binary) systems are extended to analogous multiple failure mode structures, and their reliability performance (bounds and asymptotic behavior) is investigated by numerical experimentation. © 2002 Wiley Periodicals, Inc. Naval Research Logistics 49: 167–185, 2002; DOI 10.1002/nav.10007

Journal ArticleDOI
TL;DR: In this paper, the authors tried to quantify the risk of an accident leading to pollution as a result of the exposure of the ship to certain risk factors and tested the statistical significance of the relevant risk's magnitude by using real data that refer to pollution incidents occurring during 1993-1997.
Abstract: Ship accidents may create, apart from other damage, environmental pollution. For at least three decades, continuous efforts have been made by various maritime organizations, and especially the IMO, to find effective ways to reduce these accidents. This paper tries to quantify the risk of an accident leading to pollution as a result of the exposure of the ship to certain risk factors. It tests the statistical significance of the relevant risk’s magnitude by using real data that refer to pollution incidents occurring during 1993‐1997. The results have shown that 60 per cent of the accidents occurred in ports and regulated zones. The large tankers involved in pollution accidents that happened in ports and regulated zones present an almost seven times higher risk than smaller tankers. Most frequently these accidents are attributed to collisions, hull and machinery damage and groundings. Small bulk carriers have experienced a risk rising to ten. The paper also provides certain recommendations for policy makers and indications for further research.

Journal ArticleDOI
TL;DR: The theory of fuzzy logic and fuzzy reasoning is combined with the theory of Markov systems and the concept of a fuzzy non-homogeneous Markov system is introduced for the first time to deal with the uncertainty introduced in the estimation of the transition probabilities and the input probabilities in Markov Systems.
Abstract: In this paper the theory of fuzzy logic and fuzzy reasoning is combined with the theory of Markov systems, and the concept of a fuzzy non-homogeneous Markov system is introduced for the first time. This is an effort to deal with the uncertainty introduced in the estimation of the transition probabilities and the input probabilities in Markov systems. The asymptotic behaviour of the fuzzy Markov system and its asymptotic variability is considered and given in closed analytic form. Moreover, the asymptotically attainable structures of the system are estimated, also in closed analytic form, under some realistic assumptions. The importance of this result lies in the fact that in most cases the traditional methods for estimating the probabilities cannot be used due to lack of data and measurement errors. The introduction of fuzzy logic into Markov systems represents a powerful tool for taking advantage of the symbolic knowledge that the experts of the systems possess.
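
The crisp backbone of such a system is the standard non-homogeneous Markov system recursion for the expected population structure (standard NHMS form, stated for orientation; the paper's contribution is to let the probabilities be fuzzy):

    N(t+1)^{\top} = N(t)^{\top} P(t) + \Delta T(t+1)\, p_0(t)^{\top} ,

where P(t) holds the transition probabilities, \Delta T(t+1) is the change in total system size and p_0(t) the input probabilities; the fuzzy version replaces point estimates of P(t) and p_0(t) with values derived from fuzzy-logic rules.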

Book ChapterDOI
16 Sep 2002
TL;DR: This paper has developed an interactive search results manipulation application, which is executed at the client's workspace through a Web browser without any further interaction with the server that provided the initial search results list.
Abstract: In this paper, we address the issue of interactive search results manipulation, as provided by typical Web-based information retrieval modules like search engines and directories. Many digital library systems could benefit considerably from the proposed approach, since it is heavily based on metadata, which constitute the building block of such systems. We also propose a way of ranking search results according to their overall importance, which is defined as a weighted combination of the relevancy and popularity of a resource that is being referenced in a search results list. In order to evaluate this model, we have developed an interactive search results manipulation application, which is executed at the client's workspace through a Web browser without any further interaction with the server that provided the initial search results list. The prototype implementation is based on the XML standard and has been evaluated through an adequate evaluation process from which useful conclusions have been obtained.
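
A minimal sketch of the proposed overall-importance ranking (the weight and the normalization of popularity are assumptions, not the paper's exact scheme):

    # Hedged sketch of ranking results by a weighted combination of
    # relevancy and popularity (weight w is an assumed parameter).
    def overall_importance(relevancy, popularity, w=0.7):
        return w * relevancy + (1.0 - w) * popularity

    results = [  # (url, relevancy in [0,1], normalized popularity in [0,1])
        ("http://a.example", 0.92, 0.10),
        ("http://b.example", 0.70, 0.90),
    ]
    ranked = sorted(results, key=lambda r: -overall_importance(r[1], r[2]))
    print([r[0] for r in ranked])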

Journal ArticleDOI
TL;DR: In this paper, the authors examine several practical issues regarding the implementation of the Phillips and Hansen fully modified least squares (FMLS) method for the estimation of a cointegrating vector.
Abstract: This paper examines several practical issues regarding the implementation of the Phillips and Hansen fully modified least squares (FMLS) method for the estimation of a cointegrating vector. Various versions of this method arise by selecting between standard and prewhitened kernel estimation and between parametric and nonparametric automatic bandwidth estimators and also among alternative kernels. A Monte Carlo study is conducted to investigate the finite-sample properties of the alternative versions of the FMLS procedure. The results suggest that the prewhitened kernel estimator of Andrews and Monahan (1992, Econometrica 60, 953-966) in which the bandwidth parameter is selected via the nonparametric procedure of Newey and West (1994, Review of Economic Studies 61, 631-653) minimizes the second-order asymptotic bias effects.
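
The versions compared differ mainly in how the long-run covariance matrix required by FMLS is estimated; generically (standard kernel form, stated for orientation):

    \hat{\Omega} = \sum_{j=-T+1}^{T-1} k\!\left(\frac{j}{b_T}\right)\hat{\Gamma}(j),
    \qquad \hat{\Gamma}(j) = \frac{1}{T}\sum_t \hat{u}_t \hat{u}_{t-j}^{\top} ,

where k is the kernel and b_T the automatically selected bandwidth; prewhitening first filters the residuals through a low-order VAR and recolours the resulting estimate afterwards.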