
Showing papers published by the University of Cyprus in 2001


Journal ArticleDOI
TL;DR: In this article, the relationship between human capital accumulation and economic growth was studied using various measures of human capital frequently employed by researchers, and semiparametric estimation techniques were used to uncover any nonlinearities that may exist.
Abstract: In this paper we study the relationship between human capital accumulation and economic growth using various measures of human capital frequently employed by researchers. We use semiparametric estimation techniques to uncover any nonlinearities that may exist. Using mean years of schooling measures of human capital we find a nonlinear effect on economic growth. There seem to be important differences in the growth effect of educational attainment by gender and level of education. Enrollment rates do not yield a nonlinear effect.

248 citations


Proceedings ArticleDOI
06 Jul 2001
TL;DR: The coordination ratio is used as a measure of the cost of the lack of coordination among the network users of such a system, where each network user selfishly routes its traffic only on those links available to it that minimize its expected latency cost, given the network congestion caused by the other users.
Abstract: We study the problem of routing traffic through a congested network. We focus on the simplest case of a network consisting of m parallel links. We assume a collection of n network users, each employing a mixed strategy, which is a probability distribution over links, to control the shipping of its own assigned traffic. Given a capacity for each link specifying the rate at which the link processes traffic, the objective is to route traffic so that the maximum expected latency over all links is minimized. We consider both uniform and non-uniform link capacities. How much decrease in global performance is necessary due to the absence of some central authority to regulate network traffic and implement an optimal assignment of traffic to links? We investigate this fundamental question in the context of Nash equilibria for such a system, where each network user selfishly routes its traffic only on those links available to it that minimize its expected latency cost, given the network congestion caused by the other users. We use the coordination ratio, defined by Koutsoupias and Papadimitriou [25] as the ratio of the maximum (over all links) expected latency in the worst possible Nash equilibrium over the least possible maximum latency had global regulation been available, as a measure of the cost of the lack of coordination among the network users.
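The parallel-links model above can be sketched numerically. The toy example below uses identical links, best-response dynamics to reach a pure Nash equilibrium, and a brute-force optimum; these are illustrative simplifications, not the paper's mixed-strategy, non-uniform-capacity analysis, and the instance weights are invented for the sketch.

```python
import numpy as np
from itertools import product

def best_response_nash(weights, m, seed=0):
    """Drive n users on m identical parallel links to a pure Nash
    equilibrium by repeated selfish best-response moves."""
    rng = np.random.default_rng(seed)
    assign = rng.integers(0, m, size=len(weights))       # arbitrary start
    loads = np.bincount(assign, weights=weights, minlength=m)
    moved = True
    while moved:
        moved = False
        for i, w in enumerate(weights):
            cur = assign[i]
            after = loads + w          # latency user i would see on each link
            after[cur] = loads[cur]    # staying put changes nothing
            best = int(np.argmin(after))
            if after[best] < after[cur] - 1e-12:         # strict improvement
                loads[cur] -= w
                loads[best] += w
                assign[i] = best
                moved = True
    return float(loads.max())

def optimal_makespan(weights, m):
    """Globally optimal assignment by brute force (tiny instances only)."""
    best = float("inf")
    for assign in product(range(m), repeat=len(weights)):
        loads = [0.0] * m
        for i, link in enumerate(assign):
            loads[link] += weights[i]
        best = min(best, max(loads))
    return best

weights, m = [3.0, 3.0, 2.0, 2.0, 2.0, 1.0], 3
nash_makespan = best_response_nash(weights, m)
opt = optimal_makespan(weights, m)
ratio = nash_makespan / opt   # coordination cost for this one instance
```

The ratio compares one reachable Nash equilibrium to the optimum; the coordination ratio of the paper is a worst-case quantity over all (mixed) Nash equilibria.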

213 citations


Journal ArticleDOI
P. Achard, O. Adriani, M. Aguilar-Benitez, J. Alcaraz, +369 more (43 institutions)
TL;DR: A search for exotic unstable neutral and charged heavy leptons was performed with the L3 detector at LEP; no evidence for their existence was found, and lower limits on their masses were set.

166 citations


Journal ArticleDOI
TL;DR: In this paper, the performance of the Pt/La0.5Ce0.5MnO3 catalyst was investigated under lean-burn conditions in the 100–400 °C range and compared with that of a 0.1 wt% Pt/γ-Al2O3 catalyst tested under the same reaction conditions.

162 citations


Journal ArticleDOI
TL;DR: In this article, the authors identify the factors that play an important role in determining the degree of international pricing strategy standardization and suggest that the extent to which multinationals standardize their international pricing strategies depends on the level of similarity between home and host countries in terms of customer characteristics, legal environment, economic conditions and stage of the product life cycle.
Abstract: In response to certain important gaps identified in the global marketing literature, the focus of this inquiry is an investigation of the pricing strategies followed by manufacturing subsidiaries of multinational corporations. Specifically, the authors attempt to identify the factors that play an important role in determining the degree of international pricing strategy standardization. The findings suggest that the extent to which multinationals standardize their international pricing strategies depends on the level of similarity between home and host countries in terms of customer characteristics, legal environment, economic conditions, and stage of the product life cycle. The authors highlight implications of the findings for business practitioners and discuss future research directions along with the limitations of the study.

144 citations


Book ChapterDOI
TL;DR: The authors argue that quantile regression methods can play a constructive role in the analysis of duration (survival) data offering a more flexible, more complete analysis than is typically available with more conventional methods.
Abstract: We argue that quantile regression methods can play a constructive role in the analysis of duration (survival) data, offering a more flexible, more complete analysis than is typically available with more conventional methods. We illustrate the approach with a reanalysis of the data from the Pennsylvania Reemployment Bonus Experiments. These experiments, conducted in 1988–89, were designed to test the efficacy of cash bonuses paid for early reemployment in shortening the length of insured unemployment spells.
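The building block of quantile regression is the Koenker-Bassett check (pinball) loss: minimising it over a constant fit yields the sample quantile, and adding covariates gives the regression version used for duration data. A minimal sketch on synthetic spell lengths (the exponential draws are invented for illustration, not the Pennsylvania data):

```python
import numpy as np

def pinball_loss(u, tau):
    """Koenker-Bassett check function rho_tau(u)."""
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

rng = np.random.default_rng(1)
durations = rng.exponential(scale=10.0, size=2001)   # synthetic spell lengths

# Minimising the average check loss over a constant "model" yields the
# tau-th sample quantile; quantile regression replaces the constant with
# a linear function of covariates (bonus treatment, demographics, ...).
tau = 0.75
grid = np.sort(durations)
losses = np.array([pinball_loss(durations - q, tau).mean() for q in grid])
q_hat = grid[int(np.argmin(losses))]
```

Here `q_hat` matches the 75th sample percentile, which is what makes the loss suitable for modelling entire conditional quantile functions of unemployment durations rather than only the mean.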

141 citations


Journal ArticleDOI
TL;DR: In this article, the authors trace the historical development of score-based tests, emphasizing their optimality features and their usefulness in practical problems in statistics and econometrics; they also give some new results, present easier computations of score-based tests, and offer alternative derivations of some known results.

134 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compared the effectiveness of two instructional approaches in the acquisition of basic astronomy concepts: standard, textbook-based instruction and explanations that would maximize the plausibility of scientific conceptions.

116 citations


Proceedings Article
11 Sep 2001
TL;DR: C2P is presented, a new clustering algorithm for large spatial databases, which exploits spatial access methods for the determination of closest pairs and attains the advantages of hierarchical clustering and graph-theoretic algorithms, providing both efficiency and quality of the clustering result.
Abstract: In this paper we present C2P, a new clustering algorithm for large spatial databases, which exploits spatial access methods for the determination of closest pairs. Several extensions are presented for scalable clustering in large databases that contain clusters of various shapes and outliers. Due to its characteristics, the proposed algorithm attains the advantages of hierarchical clustering and graph-theoretic algorithms, providing both efficiency and quality of the clustering result. The superiority of C2P is verified with both analytical and experimental results.
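The core idea of merging via closest pairs can be illustrated without the spatial-index machinery. The sketch below is a plain O(n^3) single-link merge loop on invented points, not the C2P algorithm itself, which obtains closest pairs efficiently through spatial access methods:

```python
import numpy as np

def closest_pair_clustering(points, k):
    """Repeatedly join the two clusters realising the currently closest
    inter-cluster pair of points, until k clusters remain."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    labels = list(range(n))                      # every point starts alone
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    while len(set(labels)) > k:
        best, bi, bj = np.inf, -1, -1
        for i in range(n):                       # closest inter-cluster pair
            for j in range(i + 1, n):
                if labels[i] != labels[j] and d[i, j] < best:
                    best, bi, bj = d[i, j], i, j
        a, b = labels[bi], labels[bj]
        labels = [a if lab == b else lab for lab in labels]   # merge b into a
    return labels

labels = closest_pair_clustering(
    [[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]], k=2)
```

On the two well-separated blobs above, the loop recovers them exactly; C2P's contribution is making this kind of closest-pair-driven merging scale to large spatial databases.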

109 citations


Journal ArticleDOI
TL;DR: The combined use of molecular dynamics free energy simulations to study one binding process thoroughly, followed by molecular dynamics and Poisson-Boltzmann free energy calculations to study a series of related ligands or mutations, is proposed as a paradigm for protein or ligand design.

109 citations


Journal ArticleDOI
TL;DR: This work investigates the application of the method of fundamental solutions for the calculation of the eigenvalues of the Helmholtz equation in the plane subject to homogeneous Dirichlet boundary conditions and presents results for circular and rectangular geometries.

Journal ArticleDOI
TL;DR: It is argued that perceptual learning does not threaten the cognitive impenetrability of perception, and that the neuropsychological research does not provide evidence in favor of the top-down character of perception.

Journal ArticleDOI
TL;DR: The approach generalizes the classical normal-based one-way analysis of variance in the sense that it obviates the need for a completely specified parametric model and is applied to rain-rate data from meteorological instruments.
Abstract: We consider m distributions in which the first m − 1 are obtained by multiplicative exponential distortions of the mth distribution, which is a reference. The combined data from m samples, one from each distribution, are used in the semiparametric large-sample problem of estimating each distortion and the reference distribution, and testing the hypothesis that the distributions are identical. The approach generalizes the classical normal-based one-way analysis of variance in the sense that it obviates the need for a completely specified parametric model. An advantage is that the probability density of the reference distribution is estimated from the combined data and not only from the mth sample. A power comparison with the t and F tests and with two nonparametric tests, obtained by means of a simulation, points to the merit of the present approach. The method is applied to rain-rate data from meteorological instruments.
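A multiplicative exponential distortion g1(x) = exp(α + βx) g2(x) is closely related to logistic regression of sample membership on x, which gives a simple way to fit the tilt parameters. The sketch below (a standard device, not necessarily the paper's own estimator; the normal samples and Newton iteration are illustrative choices) recovers β = 1, α = −1/2, the exact tilt between N(1,1) and N(0,1):

```python
import numpy as np

rng = np.random.default_rng(0)
x0 = rng.normal(0.0, 1.0, 2000)   # reference sample, N(0,1)
x1 = rng.normal(1.0, 1.0, 2000)   # distorted sample, N(1,1)
# For these normals the density ratio is exp(alpha + beta*x) with
# beta = 1 and alpha = -1/2, so the distortion lies within the model.
x = np.concatenate([x0, x1])
y = np.concatenate([np.zeros(2000), np.ones(2000)])
X = np.column_stack([np.ones_like(x), x])

theta = np.zeros(2)
for _ in range(25):               # Newton-Raphson for the logistic likelihood
    p = 1.0 / (1.0 + np.exp(-X @ theta))
    grad = X.T @ (y - p)
    H = (X * (p * (1 - p))[:, None]).T @ X
    theta += np.linalg.solve(H, grad)

alpha_hat, beta_hat = theta       # with equal sample sizes, intercept = alpha
```

With equal sample sizes the logistic intercept estimates α directly; unequal sizes would shift it by log(n1/n0).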

Proceedings Article
04 Aug 2001
TL;DR: This paper presents a new system, called the A-System, performing abductive reasoning within the framework of Abductive Logic Programming, based on a hybrid computational model that implements the abductive search in terms of two tightly coupled processes.
Abstract: This paper presents a new system, called the A-System, performing abductive reasoning within the framework of Abductive Logic Programming. It is based on a hybrid computational model that implements the abductive search in terms of two tightly coupled processes: a reduction process of the high-level logical representation to a lower-level constraint store and a lower-level constraint solving process. A set of initial "proof of principle" experiments demonstrates the versatility of the approach, stemming from its declarative representation of problems and the good underlying computational behaviour of the system. The approach offers a general methodology of declarative problem solving in AI where an incremental and modular refinement of the high-level representation with extra domain knowledge can improve and scale the computational performance of the framework.
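The abductive task itself — find sets of assumable atoms that, together with the program rules, entail an observation without violating integrity constraints — can be shown in miniature. This brute-force propositional sketch (rules, constraints, and atom names all invented for illustration) is what the A-System solves efficiently via its constraint-solving machinery:

```python
from itertools import combinations

def abduce(rules, ics, abducibles, goal):
    """Find the minimal sets of abducible atoms whose closure under the
    definite rules entails the goal and violates no integrity constraint."""
    def closure(facts):
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for head, body in rules:
                if head not in facts and all(b in facts for b in body):
                    facts.add(head)
                    changed = True
        return facts

    solutions = []
    for r in range(len(abducibles) + 1):         # smallest sets first
        for delta in combinations(abducibles, r):
            facts = closure(delta)
            if goal in facts and not any(set(ic) <= facts for ic in ics):
                # keep only minimal explanations
                if not any(set(s) <= set(delta) for s in solutions):
                    solutions.append(delta)
    return solutions

rules = [("wet_grass", ["rained"]), ("wet_grass", ["sprinkler"])]
ics = [["rained", "sprinkler"]]          # denial: not both at once
abducibles = ["rained", "sprinkler"]
explanations = abduce(rules, ics, abducibles, "wet_grass")
```

Each returned tuple is one minimal explanation of the observation `wet_grass` consistent with the constraint.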

Journal ArticleDOI
TL;DR: This paper examined the relative information content of earnings and cash flows for security returns using a methodology incorporating contextual factors which may affect earnings and Cash Flow response coefficients, finding evidence that the earnings coefficient is related to earnings permanence, growth and firm size and that the cash flow coefficient may be related to growth.
Abstract: This paper examines the relative information content of earnings and cash flows for security returns using a methodology incorporating contextual factors which may affect earnings and cash flow response coefficients. For our UK dataset, we provide evidence that the earnings coefficient is related to earnings permanence, growth and firm size and that the cash flow coefficient may be related to growth. Although our results emphasise the value relevance of earnings, they also suggest that both contemporaneous and prior period cash flow are positively related to security returns and that market-to-book and market value of equity have predictive power for returns.

Proceedings ArticleDOI
01 Jan 2001
TL;DR: An overview of medical imaging fusion techniques with an emphasis on the use of neural network algorithms is presented, and it is anticipated that these tools will help the physician towards a more realistic and quantitative assessment of disease.
Abstract: Computer aided fusion of multi-modality medical images provides a very promising diagnostic tool with numerous clinical applications. The objective of this paper is to present an overview of medical imaging fusion techniques with an emphasis on the use of neural network algorithms. Case studies derived from oncology (data level fusion), microscopy and ultrasound imaging (feature level and decision level fusion), and lesion placement in pallidotomy (data level fusion) are presented. It is anticipated that these tools will help the physician towards a more realistic and quantitative assessment of disease.

Journal ArticleDOI
TL;DR: Using empirical work in the UK, the authors present and assess gendered patterns of work and employment in the computing sector whilst comparing it to other traditionally male-dominated sectors, noting that computing is one of the fastest-growing industries, offering many opportunities for employment and advancement.
Abstract: Computing is one of the fastest-growing industries, offering many opportunities for employment and advancement. Furthermore, it is one of the newest industries, comprising mainly young firms and relatively freshly constituted forms of working practices. Using empirical work in the UK, this paper presents and assesses gendered patterns of work and employment in the computing sector whilst comparing it to other traditionally male-dominated sectors.

Journal ArticleDOI
TL;DR: Compared with cluster analysis, the ANN models were more capable of detecting even minor characteristics in the rainfall waveshapes investigated, and they also performed a more realistic categorization of the available data.
Abstract: In this paper, the usefulness of artificial neural networks (ANNs) as a suitable tool for the study of the medium and long-term climatic variability is examined. A method for classifying the inherent variability of climatic data, as represented by the rainfall regime, is investigated. The rainfall recorded at a climatological station in Cyprus over a long time period has been used in this paper as the input for various ANN and cluster analysis models. The analysed rainfall data cover the time span 1917–1995. Using these values, two different procedures were followed for structuring the input vectors for training the ANN models: (a) each 1-year subset consisting of the 12 monthly elements, and (b) each 2-year subset consisting of the 24 monthly elements. Several ANN models with a varying number of output nodes have been trained, using an unsupervised learning paradigm, namely, the Kohonen’s self-organizing feature maps algorithm. For both the 1- and 2-year subsets, 16 classes were empirically considered as the optimum for computing the prototype classes of weather variability for this meteorological parameter. The classification established by using the ANN methodology is subsequently compared with the classification generated by using cluster analysis, based on the agglomerative hierarchical clustering algorithm. To validate the classification results, the rainfall distributions for the more recent years 1996, 1997 and 1998 were utilized. The respective 1- and 2-year distributions for these years were assigned to particular classes for both the ANN and cluster analysis procedures. Compared with cluster analysis, the ANN models were more capable of detecting even minor characteristics in the rainfall waveshapes investigated, and they also performed a more realistic categorization of the available data. It is suggested that the proposed ANN methodology can be applied to more climatological parameters, and with longer cycles. 
Copyright © 2001 Royal Meteorological Society.
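The competitive-learning core of a Kohonen map can be shown in a few lines. The sketch below is a deliberately minimal 1-D, two-node version on invented "dry/wet year" data; the paper's models use 16 classes on 12- or 24-dimensional monthly vectors and a full neighbourhood function:

```python
import numpy as np

rng = np.random.default_rng(42)
# Toy 1-D stand-in for the rainfall vectors: dry years near 3, wet near 9.
data = np.concatenate([rng.normal(3.0, 0.5, 100), rng.normal(9.0, 0.5, 100)])
rng.shuffle(data)

w = np.array([data.min(), data.max()])           # two prototype weights
epochs = 30
for epoch in range(epochs):
    lr = 0.5 * (1.0 - epoch / epochs)            # decaying learning rate
    for x in data:
        bmu = int(np.argmin(np.abs(w - x)))      # best-matching unit
        w[bmu] += lr * (x - w[bmu])              # move the winner towards x
prototypes = np.sort(w)
```

After training, the two prototypes sit near the centres of the dry and wet regimes, which is the sense in which the map "classifies" the rainfall variability.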

Journal ArticleDOI
P. Achard, O. Adriani, M. Aguilar-Benitez, J. Alcaraz, +369 more (43 institutions)
TL;DR: In this paper, a search was reported for the first-generation heavy neutrino that is an isosinglet under the standard $SU(2)_L$ gauge group, using the data collected with the L3 detector at center-of-mass energies between 130.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the application of the fundamental solutions to two-dimensional elasticity problems in isotropic and anisotropic single materials and bimaterials, where the interface continuity conditions were approximated in the same manner as the boundary conditions.
Abstract: In this paper, we investigate the application of the method of fundamental solutions to two-dimensional elasticity problems in isotropic and anisotropic single materials and bimaterials. A domain decomposition technique is employed in the bimaterial case where the interface continuity conditions are approximated in the same manner as the boundary conditions. The method is tested on several test problems and its relative merits and disadvantages are discussed.
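The method of fundamental solutions is easiest to see in a scalar setting. The sketch below applies it to the 2-D Laplace equation on the unit disk (a simpler analogue of the elasticity problems above; geometry, source radius, and test function are all invented for illustration): the solution is a sum of free-space fundamental solutions centred at sources placed outside the domain, with strengths fitted to the boundary data by least squares.

```python
import numpy as np

n_col, n_src, R_src = 64, 32, 2.0
tb = 2 * np.pi * np.arange(n_col) / n_col
ts = 2 * np.pi * np.arange(n_src) / n_src
col = np.column_stack([np.cos(tb), np.sin(tb)])          # boundary collocation
src = R_src * np.column_stack([np.cos(ts), np.sin(ts)])  # exterior sources

def G(p, q):
    """Fundamental solution of the 2-D Laplacian, -log(r)/(2*pi)."""
    r = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=2)
    return -np.log(r) / (2 * np.pi)

exact = lambda p: p[:, 0] ** 2 - p[:, 1] ** 2   # harmonic test solution Re(z^2)
A = G(col, src)
coef, *_ = np.linalg.lstsq(A, exact(col), rcond=None)    # fit source strengths

test_pt = np.array([[0.3, 0.4]])
u_mfs = float(G(test_pt, src) @ coef)           # evaluate inside the domain
```

Because the sources lie outside the domain, each basis function is exactly harmonic inside it, so only the boundary conditions need fitting; the bimaterial case in the paper applies the same idea per subdomain with interface conditions treated like boundary conditions.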

Journal ArticleDOI
TL;DR: In this article, the spectrum of the confining strings of the SU(6) gauge theory was studied, and the three independent string tensions related to sources with Z_N charge k = 1, 2, 3 were computed using Monte Carlo simulations.
Abstract: In the context of four-dimensional SU(N) gauge theories, we study the spectrum of the confining strings. We compute, for the SU(6) gauge theory formulated on a lattice, the three independent string tensions σ_k related to sources with Z_N charge k = 1, 2, 3, using Monte Carlo simulations. Our results, whose uncertainty is approximately 2% for k = 2 and 4% for k = 3, are consistent with the sine formula σ_k/σ = sin(kπ/N)/sin(π/N) for the ratio between σ_k and the standard string tension σ, and show deviations from Casimir scaling. The sine formula is known to emerge in supersymmetric SU(N) gauge theories and in M theory. We comment on an analogous behavior exhibited by two-dimensional SU(N)×SU(N) chiral models.
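The two predictions being compared are simple closed forms, so the tension ratios for SU(6) can be tabulated directly (Casimir scaling here means the ratio k(N − k)/(N − 1)):

```python
import numpy as np

def sine_ratio(k, N):
    """sigma_k / sigma from the sine formula."""
    return np.sin(k * np.pi / N) / np.sin(np.pi / N)

def casimir_ratio(k, N):
    """sigma_k / sigma under Casimir scaling, k(N - k)/(N - 1)."""
    return k * (N - k) / (N - 1)

N = 6
table = {k: (sine_ratio(k, N), casimir_ratio(k, N)) for k in (1, 2, 3)}
```

For N = 6 the sine formula gives 1, √3 ≈ 1.732, and 2 for k = 1, 2, 3, against Casimir values 1, 1.6, and 1.8, so the 2% and 4% lattice uncertainties quoted above are small enough to discriminate between the two.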

Journal ArticleDOI
TL;DR: In this paper, a large test battery was constructed which contained items from the WISC-Rc, from Case's and from Demetriou's previous research, and the battery comprised five subscales that were intended to assess spatial, quantitative, verbal/propositional, qualitative/analytic, and causal thought.

Journal ArticleDOI
01 Jun 2001
TL;DR: The problem of finding frequently occurring traversal patterns in collections of user access sequences is examined, and three algorithms are presented: one that is level-wise with respect to the lengths of the patterns and two that are not.
Abstract: In data models that have graph representations, users navigate following the links of the graph structure. Conducting data mining on collected information about user accesses in such models involves the determination of frequently occurring access sequences. In this paper, the problem of finding traversal patterns from such collections is examined. The determination of patterns is based on the graph structure of the model. For this purpose, three algorithms are presented: one that is level-wise with respect to the lengths of the patterns and two that are not. Additionally, we consider the fact that accesses within patterns may be interleaved with random accesses due to navigational purposes. The definition of the pattern type generalizes existing ones in order to take this fact into account. The performance of all algorithms and their sensitivity to several parameters are examined experimentally.
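The level-wise idea can be sketched on plain access logs. The version below mines contiguous subsequences Apriori-style (candidates of length k are pruned unless both length-(k − 1) sub-windows are frequent); it is an illustration of the level-wise principle only, since the paper's algorithms also exploit the graph structure and allow patterns interleaved with random accesses:

```python
from collections import Counter

def levelwise_patterns(sequences, min_support):
    """Level-wise mining of frequently occurring contiguous access
    subsequences, counted by document frequency."""
    counts = Counter(item for s in sequences for item in set(s))
    frequent = {(i,) for i, c in counts.items() if c >= min_support}
    result, k = set(frequent), 1
    while frequent:
        k += 1
        candidates = Counter()
        for s in sequences:
            seen = set()                         # count each pattern once per log
            for start in range(len(s) - k + 1):
                sub = tuple(s[start:start + k])
                # Apriori pruning: both length-(k-1) sub-windows must be frequent
                if sub[:-1] in frequent and sub[1:] in frequent:
                    seen.add(sub)
            candidates.update(seen)
        frequent = {p for p, c in candidates.items() if c >= min_support}
        result |= frequent
    return result

logs = [list("ABCD"), list("ABCE"), list("ABD"), list("BCD")]
patterns = levelwise_patterns(logs, min_support=3)
```

On these four toy logs the frequent patterns are the singletons A, B, C, D plus the length-2 sequences A→B and B→C.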

Journal ArticleDOI
TL;DR: This work uses an AM-FM representation for each fingerprint to obtain significant gains in classification performance as compared to the commonly used National Institute of Standards system, for the same classifier.
Abstract: Research on fingerprint classification has primarily focused on finding improved classifiers, image and feature enhancement, and less on the development of novel fingerprint representations. Using an AM-FM representation for each fingerprint, we obtain significant gains in classification performance as compared to the commonly used National Institute of Standards system, for the same classifier.
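An AM-FM representation decomposes a signal into an instantaneous amplitude (AM) and an instantaneous frequency (FM). The standard 1-D construction via the analytic signal, sketched below on an invented test tone, conveys the idea; the paper's fingerprint models are richer 2-D, multi-component versions:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT: zero the negative frequencies and
    double the positive ones.  Assumes len(x) is even."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0
    h[1:n // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0
t = np.arange(1000) / fs            # one second, exactly 50 full cycles
x = np.cos(2 * np.pi * 50.0 * t)
z = analytic_signal(x)
amplitude = np.abs(z)                                           # AM part
inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)  # FM part
```

For the pure 50 Hz tone the recovered amplitude is constant at 1 and the instantaneous frequency is constant at 50 Hz; for a fingerprint image, the analogous 2-D quantities capture ridge contrast and local ridge orientation/spacing.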

Journal ArticleDOI
TL;DR: In this article, a wavelet shrinkage methodology for univariate natural exponential families with quadratic variance functions is proposed, covering the Gaussian, Poisson, gamma, binomial, negative binomial and generalised hyperbolic secant distributions.
Abstract: We propose a wavelet shrinkage methodology for univariate natural exponential families with quadratic variance functions, covering the Gaussian, Poisson, gamma, binomial, negative binomial and generalised hyperbolic secant distributions. Simulation studies for Poisson and binomial data are used to illustrate the usefulness of the proposed methodology, and comparisons are made with other methods available in the literature. We also present applications to datasets arising from high-energy astrophysics and from epidemiology.
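For Poisson data, a common simple baseline against which exponential-family shrinkage methods are compared is variance stabilisation followed by Gaussian-style thresholding. The sketch below (Anscombe transform, one level of the Haar transform, soft thresholding at the universal threshold; all choices are this baseline's, not the paper's methodology, and the piecewise-constant intensity is invented) shows the mechanics:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 256
truth = np.where(np.arange(n) < n // 2, 12.0, 3.0)   # piecewise-constant rate
counts = rng.poisson(truth)

# Anscombe transform: makes Poisson data approximately N(mu, 1).
y = 2.0 * np.sqrt(counts + 3.0 / 8.0)

# One level of the Haar transform; soft-threshold the detail coefficients
# at the universal threshold sqrt(2 log n) (noise s.d. ~ 1 after Anscombe).
a = (y[0::2] + y[1::2]) / np.sqrt(2.0)
d = (y[0::2] - y[1::2]) / np.sqrt(2.0)
thr = np.sqrt(2.0 * np.log(n))
d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)
y_hat = np.empty(n)
y_hat[0::2] = (a + d) / np.sqrt(2.0)
y_hat[1::2] = (a - d) / np.sqrt(2.0)

# Simple inverse of the Anscombe transform (bias-corrected inverses exist).
lam_hat = (y_hat / 2.0) ** 2 - 3.0 / 8.0
mse_raw = np.mean((counts - truth) ** 2)
mse_den = np.mean((lam_hat - truth) ** 2)
```

Even this one-level version reduces the mean squared error relative to the raw counts; the appeal of the paper's approach is shrinking in the natural exponential family directly, without relying on the Gaussian approximation.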

Journal ArticleDOI
TL;DR: In this article, the authors investigated the link between real stock price changes and economic growth and developed a simple growth model, which presents the relationship between the real stock prices and output.
Abstract: This paper investigates the link between real stock price changes and economic growth. We develop a simple growth model, which presents the relationship between real stock prices and output. Evidence from the G-7 economies by use of the VAR methodology shows that real stock price changes and output growth are strongly related, as predicted by the theoretical model. The bivariate framework also provides useful information for understanding the response of economic growth and real stock prices to external shocks.

Journal ArticleDOI
TL;DR: Despite increased stockholding opportunities, standard expected-utility models overpredict household participation and stock holdings; it has been suggested that departures from expected utility could resolve both puzzles.
Abstract: Despite increased stockholding opportunities, standard expected-utility models overpredict household participation and stock holdings. It has been suggested that departures from expected utility could resolve both puzzles. We investigate three measurable departures: (i) Kreps-Porteus preferences, (ii) Yaari's Dual Theory, and (iii) Quiggin's Rank-dependent Utility. Improvements tend to occur in predicted portfolio composition rather than participation. They are limited under (i), questionable under (ii), and more sizeable under (iii). Contrary to priors in the literature, improvements under (iii) do not result from solutions at kinks of indifference curves. We conclude that stockholding puzzles are unlikely to be resolved through preferences alone. Financial developments over the past decade have significantly enhanced the opportunities of households to invest in risky financial assets. Financial liberalisation led to considerable innovation, both in financial products and in the bundling of financial services. Privatisation of public utilities generated new stockholding opportunities, while the associated advertising campaign provided households with considerable information regarding the workings of the stock market. The proliferation of mutual funds allowed even small investors to hold highly diversified portfolios, and gave them access to the financial expertise of professional fund managers. Increased international integration and policy developments such as EMU improved prospects for risky asset holding across international borders. Pension reforms are encouraging households to undertake portfolio investments in order to provide for their old age. Progress in telecommunications and information channels, notably internet trading of securities and mutual funds, facilitate transactions. These important 'supply-side' developments have increased the number of stockholders, but have not been sufficient to induce the majority of households to participate.
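Rank-dependent utility, departure (iii) above, replaces raw probabilities with decision weights derived from a transformed decumulative distribution. A minimal evaluation of a discrete lottery (the utility function, weighting functions, and lottery are invented for illustration):

```python
import numpy as np

def rdu(outcomes, probs, utility, weight):
    """Rank-dependent (Quiggin) utility of a discrete lottery: utilities
    are combined with decision weights obtained as differences of the
    transformed decumulative distribution, best outcome first."""
    order = np.argsort(outcomes)[::-1]            # rank outcomes, best first
    x, p = np.asarray(outcomes)[order], np.asarray(probs)[order]
    cum = np.cumsum(p)                            # decumulative probabilities
    dw = np.diff(np.concatenate([[0.0], weight(cum)]))   # decision weights
    return float(np.sum(dw * utility(x)))

u = np.sqrt                                       # concave utility
identity = lambda p: p                            # linear weighting
pessimist = lambda p: p ** 2                      # convex: underweights gains

lottery_x = [100.0, 25.0, 0.0]
lottery_p = [0.3, 0.4, 0.3]
eu = rdu(lottery_x, lottery_p, u, identity)       # reduces to expected utility
rdu_pess = rdu(lottery_x, lottery_p, u, pessimist)
```

With the identity weighting the formula collapses to expected utility (here 5.0); the convex weighting shifts decision weight towards worse outcomes and lowers the lottery's value, the kind of behaviour the paper tests against observed household portfolios.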

Journal ArticleDOI
TL;DR: In this article, an aborted attempt at reform of the Greek pension system, following a series of previous failures, is discussed, and the authors apply the framework of rational choice institutionalism to examine the strategy and setting of relevant actors.
Abstract: This paper seeks to explain an aborted attempt at reform of the Greek pension system, following a series of previous failures. It applies the framework of rational choice institutionalism in order to examine the strategy and setting of the relevant actors. The pension system had become a huge fiscal burden on the state, threatening Greece’s position in the European Union. Moreover, its gross inequalities of provision and bureaucratic inefficiency were symptoms of the embedded clientelism and ‘disjointed corporatism’ that stood in the way of the government’s self-proclaimed ‘modernization’ programme. In the event, though EMU entry requirements empowered the reform momentum, a combination of the strategic weakness of key actors and the entrenched opposition of sectoral interests dissipated the initiative. The failure suggests the relevance of the wider social setting to reform: in particular, the weakness of the technocratic community and the relative absence of a supportive ‘advocacy coalition’, beyond the dominance of the ‘party state’. Faced with criticism, the political leadership sought to protect their electoral position and postponed pressing decisions. The case study raises important questions about the scope for such reform in Greece and the future stability of the ‘Euro-zone’. The commitment to Economic and Monetary Union (EMU) contained in the Maastricht Treaty (1991) has been a major stimulus to reform in many, if not all, European Union countries. ‘Strapped to the mast’ of EMU, member states have been subject to an externally imposed discipline on government budget deficits and debt levels that have re-structured fiscal policy choices. This discipline gave domestic reformers a ‘vincolo esterno’ (external link) by which to empower their own advocacy of policy shifts across a broad public agenda (Dyson and Featherstone, 1996, 1999). Nowhere was the imperative of joining the transition to a single currency felt more strongly than in Greece. 
Despite being excluded from the first set of states entering Stage 3 of EMU in 1999, the target of Greek entry by 2001 has been accepted by the overwhelming majority of political elites and the mass public. Interestingly, neither of the two main political parties has opposed Greek entry into the single currency while both PASOK and New Democracy accepted the imperative of entry in the April 2000 parliamentary elections. This mirrors the public support evident in the opinion polls conducted for the European Commission: 65 percent of the Greek public was reported to be in favour of the

Book ChapterDOI
01 Mar 2001
TL;DR: This survey presents some of the most common models and technologies that offer coordination mechanisms for Internet agents. It argues for the need for coordination, presents some basic infrastructure technologies, and then examines particular coordination models for Internet agents in more detail.
Abstract: Agent technology has evolved rapidly over the past few years along a number of dimensions giving rise to numerous “flavours” of agents such as intelligent agents, mobile agents, etc. One of the most attractive and natural fields for the development of agent technology is the Internet with its vast quantity of available information and offered services. In fact, the term “Internet agent” is effectively an umbrella for most of the other types of agents, since Internet agents should enjoy intelligence, mobility, adaptability, etc. All these different types of agents must be able to somehow interact with each other for the purpose of exchanging information, collaborating or managing heterogeneous environments. This survey presents some of the most common models and technologies that offer coordination mechanisms for Internet agents. It argues for the need for coordination, then presents some basic infrastructure technologies before examining in more detail particular coordination models for Internet agents, themselves classified into some general categories.

Journal ArticleDOI
P. Achard, O. Adriani, M. Aguilar-Benitez, J. Alcaraz, +370 more (43 institutions)
TL;DR: In this paper, data collected with the L3 detector at LEP at centre-of-mass energies up to about 209.4 GeV were used in a search for the standard model Higgs boson.