
Showing papers published by the University of Cyprus in 2002


Journal ArticleDOI
TL;DR: In this paper, a meta-analysis of empirical studies on the export marketing strategy-performance relationship is presented, showing that although many marketing strategy variables demonstrate positive effects on overall export performance, the relationship is not always significant.

737 citations


Journal ArticleDOI
TL;DR: A snapshot of the applications of wireless telemedicine systems is provided and a review of the spectrum of these applications and the potential benefits of these efforts is presented, followed by successful case studies in electronic patient record, emergency telemedicine, teleradiology, and home monitoring.
Abstract: Rapid advances in information technology and telecommunications - and, more specifically, wireless and mobile communications - and their convergence ("telematics") are leading to the emergence of a new type of information infrastructure that has the potential of supporting an array of advanced services for healthcare. The objective of this paper is to provide a snapshot of the applications of wireless telemedicine systems. A review of the spectrum of these applications and the potential benefits of these efforts is presented, followed by successful case studies in electronic patient record, emergency telemedicine, teleradiology, and home monitoring. It is anticipated that the progress carried out in these efforts and the potential benefits of emerging mobile technologies will trigger the development of more applications, thus enabling the offering of a better service to the citizen.

412 citations


Journal ArticleDOI
TL;DR: In this article, the authors compare the pre-lawsuit profile of 209 violators to a sample of matched control firms between 1994 and 1998, and find that the likelihood of becoming a lawsuit defendant increases with board size, with the fraction of directors in industrial firms, and with the fraction of inside ownership, and decreases with the number of directorships held by outside directors.
Abstract: Each year, hundreds of firms are prosecuted for violating environmental laws and hundreds of millions of dollars in penalties are assessed. At the same time, a much larger number of firms escape the various costs associated with litigation by adhering to the provisions of the same laws and regulations. It is not a priori apparent why this dichotomy exists. In this paper we draw on corporate governance and stakeholder theories to empirically investigate environmental lawsuits. Specifically, we compare the pre-lawsuit profile of 209 violators to a sample of matched control firms between 1994 and 1998. We find that the likelihood of becoming a lawsuit defendant increases with board size, with the fraction of directors in industrial firms, and with the fraction of inside ownership, and decreases with the number of directorships held by outside directors. These findings are robust to alternative dependent variable specifications. Together, our results suggest that managers, researchers, and policy-makers need to direct their attention to the corporate board as the core decision-making unit forming corporate environmental policies. Copyright © 2002 John Wiley & Sons, Ltd.

391 citations


Journal ArticleDOI
TL;DR: In this paper, the Hofmeister series of anions and the known effects of these anions on the self-assembly and phase behavior of cationic and non-ionic surfactants are reviewed.
Abstract: Recent work on mesoporous silica formation using cationic and non-ionic templates has unveiled a large number of anion effects. Anions are seen to change the hydrolysis rates of the silicate precursors, affecting the surface properties and morphologies of the final products after calcination, and they often improve the hydrothermal stability of the silica materials. These advances are reviewed in connection with the Hofmeister series of anions and the known effects of anions on the self-assembly and phase behavior of cationic and non-ionic surfactants.

380 citations


Journal ArticleDOI
TL;DR: This work illustrates both molecular dynamics and Poisson-Boltzmann methods with a detailed study of amino acid recognition by aspartyl-tRNA synthetase, whose specificity is important for maintaining the integrity of the genetic code.
Abstract: In recent years, molecular dynamics simulations of biomolecular free energy differences have benefited from significant methodological advances and increased computer power. Applications to molecular recognition provide an understanding of the interactions involved that goes beyond, and is an important complement to, experimental studies. Poisson-Boltzmann electrostatic models provide a faster and simpler free energy method in cases where electrostatic interactions are important. We illustrate both molecular dynamics and Poisson-Boltzmann methods with a detailed study of amino acid recognition by aspartyl-tRNA synthetase, whose specificity is important for maintaining the integrity of the genetic code.

346 citations


Journal ArticleDOI
TL;DR: In this paper, the authors evaluate the performance of several recently proposed tests for structural breaks in the conditional variance dynamics of asset returns, which apply to the class of ARCH and SV type processes as well as data-driven volatility estimators using high-frequency data.
Abstract: The paper evaluates the performance of several recently proposed tests for structural breaks in the conditional variance dynamics of asset returns. The tests apply to the class of ARCH and SV type processes as well as data-driven volatility estimators using high-frequency data. In addition to testing for the presence of breaks, the statistics identify the number and location of multiple breaks. We study the size and power of the new tests for detecting breaks in the conditional variance under various realistic univariate heteroscedastic models, change-point hypotheses and sampling schemes. The paper concludes with an empirical analysis using data from the stock and FX markets for which we find multiple breaks associated with the Asian and Russian financial crises. These events resulted in changes in the dynamics of volatility of asset returns in the samples prior and post the breaks. Copyright © 2002 John Wiley & Sons, Ltd.

272 citations


Book ChapterDOI
08 Jul 2002
TL;DR: This work provides a comprehensive collection of efficient algorithms, hardness results (both as NP-hardness and #P-completeness results), and structural results for these algorithmic problems related to the computation of Nash equilibria for the selfish routing game the authors consider.
Abstract: In this work, we study the combinatorial structure and the computational complexity of Nash equilibria for a certain game that models selfish routing over a network consisting of m parallel links. We assume a collection of n users, each employing a mixed strategy, which is a probability distribution over links, to control the routing of its own assigned traffic. In a Nash equilibrium, each user selfishly routes its traffic on those links that minimize its expected latency cost, given the network congestion caused by the other users. The social cost of a Nash equilibrium is the expectation, over all random choices of the users, of the maximum, over all links, latency through a link.We embark on a systematic study of several algorithmic problems related to the computation of Nash equilibria for the selfish routing game we consider. In a nutshell, these problems relate to deciding the existence of a Nash equilibrium, constructing a Nash equilibrium with given support characteristics, constructing the worst Nash equilibrium (the one with maximum social cost), constructing the best Nash equilibrium (the one with minimum social cost), or computing the social cost of a (given) Nash equilibrium. Our work provides a comprehensive collection of efficient algorithms, hardness results (both as NP-hardness and #P-completeness results), and structural results for these algorithmic problems. Our results span and contrast a wide range of assumptions on the syntax of the Nash equilibria and on the parameters of the system.
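The social cost defined above (the expectation, over the users' random link choices, of the maximum link latency) can be computed by brute-force enumeration for tiny instances. A minimal sketch, assuming identical links whose latency equals the total traffic routed on them; all function and variable names are illustrative, not from the paper:

```python
from itertools import product

def social_cost(traffics, strategies, m):
    """Expected maximum link load under independent mixed strategies.
    strategies[i][j] = probability that user i routes its traffic on link j."""
    cost = 0.0
    # Enumerate every joint choice of links, one link per user.
    for choice in product(range(m), repeat=len(traffics)):
        prob = 1.0
        loads = [0.0] * m
        for i, j in enumerate(choice):
            prob *= strategies[i][j]
            loads[j] += traffics[i]
        cost += prob * max(loads)  # latency of a link = its total load
    return cost

# Two users with unit traffic, each mixing uniformly over two links:
# max load is 2 with probability 1/2 and 1 with probability 1/2.
print(social_cost([1.0, 1.0], [[0.5, 0.5], [0.5, 0.5]], 2))  # → 1.5
```

The exponential enumeration is exactly why the abstract's #P-completeness results for computing the social cost are interesting: no shortcut like this scales.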

271 citations


Journal ArticleDOI
TL;DR: A longitudinal study investigates the relations between processing efficiency, working memory, and problem solving from the age of 8 years to the age of 16 years, and suggests that processing efficiency is a factor closely associated with developmental differences in problem solving, whereas working memory is associated with individual differences.
Abstract: This Monograph aims to contribute to the information processing, the differential, and the developmental modeling of the mind, and to work these into an integrated theory. Toward this aim, a longitudinal study is presented that investigates the relations between processing efficiency, working memory, and problem solving from the age of 8 years to the age of 16 years. The study involved 113 participants, about equally drawn among 8-, 10-, 12-, and 14-year-olds at the first testing; these participants were tested two more times spaced one year apart. Participants were tested with a large array of tasks addressed to processing efficiency (i.e., speed of processing and inhibition), working memory (in terms of Baddeley's model, phonological storage, visual storage, and the central executive of working memory), and problem solving (quantitative, spatial, and verbal reasoning). Confirmatory factor analysis validated the presence of each of the above dimensions and indicated that they are organized in a three-stratum hierarchy. The first stratum includes all of the individual dimensions mentioned above. These dimensions are organized, at the second stratum, in three constructs: processing efficiency, working memory, and problem solving. Finally, all second-order constructs are strongly related to a third-order general factor. This structure was stable in time. Structural equation modeling indicated that the various dimensions are interrelated in a cascade fashion so that more fundamental dimensions are part of more complex dimensions. That is, speed of processing is the most important aspect of processing efficiency, and it perfectly relates to the condition of inhibition, indicating that the more efficient one is in stimulus encoding and identification, the more efficient one is in inhibition.
In turn, processing efficiency is strongly related to the condition of executive processes in working memory, which, in turn, is related to the condition of the two modality-specific stores (phonological and visual). Finally, problem solving is related to processing efficiency and working memory, the central executive in particular. All dimensions appear to change systematically with time. Growth modeling suggested that there are significant individual differences in attainment in each of the three aspects of the mind investigated. Moreover, each of the three aspects of the mind as well as their interrelations change differently during development. Mixture growth modeling suggested that there are four types of developing persons, each defined by a different combination of performance in these aspects of the mind. Some types are more efficient and stable developers than others. These analyses indicated that processing efficiency is a factor closely associated with developmental differences in problem solving, whereas working memory is associated with individual differences. Modeling by logistic equations uncovered the rates and form of change in the various dimensions and their reciprocal interactions during development. These findings are discussed from the point of view of information processing, differential, and developmental models of thinking, and an integrative model is proposed.

235 citations


Book ChapterDOI
TL;DR: This paper aims to chart out the main developments of the field over the last ten years and to take a critical view of these developments from several perspectives: logical, epistemological, computational and suitability to application.
Abstract: Abduction in Logic Programming started in the late 80s, early 90s, in an attempt to extend logic programming into a framework suitable for a variety of problems in Artificial Intelligence and other areas of Computer Science. This paper aims to chart out the main developments of the field over the last ten years and to take a critical view of these developments from several perspectives: logical, epistemological, computational and suitability to application. The paper attempts to expose some of the challenges and prospects for the further development of the field.

195 citations


Journal ArticleDOI
TL;DR: In this article, the authors propose extensions of the continuous record asymptotic analysis for rolling sample variance estimators developed for estimating the quadratic variation of asset returns, referred to as integrated or realized volatility.
Abstract: We propose extensions of the continuous record asymptotic analysis for rolling sample variance estimators developed for estimating the quadratic variation of asset returns, referred to as integrated or realized volatility. We treat integrated volatility as a continuous time stochastic process sampled at high frequencies and suggest rolling sample estimators which share many features with spot volatility estimators. We discuss asymptotically efficient window lengths and weighting schemes for estimators of the quadratic variation and establish links between various spot and integrated volatility estimators. Theoretical results are complemented with extensive Monte Carlo simulations and an empirical investigation.
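The basic object the paper extends can be sketched in a few lines: realized volatility as a rolling sum of squared high-frequency returns. This toy version uses flat weights and a fixed window; the paper's contribution is precisely the asymptotically efficient window lengths and weighting schemes that this sketch does not attempt:

```python
import numpy as np

def realized_variance(returns, window):
    """Rolling sum of squared intraday returns: a simple estimator of the
    quadratic variation over each trailing window (flat-weight version)."""
    r2 = np.asarray(returns) ** 2
    kernel = np.ones(window)          # equal weights; the paper studies better schemes
    return np.convolve(r2, kernel, mode="valid")

rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(1000)   # simulated 1-minute returns, sd = 1%
rv = realized_variance(r, window=390)  # one trading day of 1-minute bars
```

With i.i.d. returns of standard deviation 1%, each window's realized variance should hover around 390 × 10⁻⁴.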

181 citations


Journal ArticleDOI
TL;DR: In this paper, it was shown that the nuclear matrix elements can be reduced by about 25% in the case of light neutrinos by modifying the axial-vector part of the nucleon current.

Journal ArticleDOI
01 May 2002
TL;DR: It is shown that the Alpha EV8 branch predictor achieves prediction accuracy in the same range as the state-of-the-art academic global history branch predictors that do not consider implementation constraints in great detail.
Abstract: This paper presents the Alpha EV8 conditional branch predictor. The Alpha EV8 microprocessor project, canceled in June 2001 in a late phase of development, envisioned an aggressive 8-wide issue out-of-order superscalar microarchitecture featuring a very deep pipeline and simultaneous multithreading. Performance of such a processor is highly dependent on the accuracy of its branch predictor, and consequently a very large silicon area was devoted to branch prediction on EV8. The Alpha EV8 branch predictor relies on global history and features a total of 352 Kbits. The focus of this paper is on the different trade-offs performed to overcome various implementation constraints for the EV8 branch predictor. One such instance is the pipelining of the predictor over two cycles to facilitate the prediction of up to 16 branches per cycle from any two dynamically successive, 8-instruction fetch blocks. This resulted in the use of three-fetch-block-old compressed branch history information for accessing the predictor. Implementation constraints also restricted the composition of the index functions for the predictor and forced the usage of only single-ported memory cells. Nevertheless, we show that the Alpha EV8 branch predictor achieves prediction accuracy in the same range as the state-of-the-art academic global history branch predictors that do not consider implementation constraints in great detail.

Journal ArticleDOI
TL;DR: In this paper, the $\theta$ dependence of the ground-state energy of four-dimensional SU($N$) gauge theories was studied in the large-N limit, with results in substantial agreement with the Witten-Veneziano formula.
Abstract: We study the $\theta$ dependence of four-dimensional SU($N$) gauge theories, for $N\geq 3$ and in the large-N limit. We use numerical simulations of the Wilson lattice formulation of gauge theories to compute the first few terms of the expansion of the ground-state energy $F(\theta)$ around $\theta=0$, $F(\theta)-F(0) = A_2 \theta^2 (1 + b_2 \theta^2 + ...)$. Our results support Witten's conjecture: $F(\theta)-F(0) = {\cal A} \theta^2 + O(1/N)$ for sufficiently small values of $\theta$, $\theta < \pi$. Indeed we verify that the topological susceptibility has a nonzero large-N limit $\chi_\infty=2 {\cal A}$ with corrections of $O(1/N^2)$, in substantial agreement with the Witten-Veneziano formula which relates $\chi_\infty$ to the $\eta^\prime$ mass. Furthermore, higher order terms in $\theta$ are suppressed; in particular, the $O(\theta^4)$ term $b_2$ (related to the $\eta^\prime - \eta^\prime$ elastic scattering amplitude) turns out to be quite small: $b_2=-0.023(7)$ for N=3, and its absolute value decreases with increasing $N$, consistently with the expectation $b_2=O(1/N^2)$.

Posted Content
TL;DR: The gamma radiation in samples of a variety of natural tiling rocks (granites) imported in Cyprus for use in the building industry was measured, employing high-resolution gamma-ray spectroscopy, finding that 25 samples meet the exemption dose limit, two meet the upper dose limit and only one clearly exceeds this limit.
Abstract: The gamma radiation in samples of a variety of natural tiling rocks (granites) imported in Cyprus for use in the building industry was measured, employing high-resolution gamma-ray spectroscopy. The rock samples were pulverized, sealed in 1 litre plastic Marinelli beakers, and measured in the laboratory with an accumulating time between 10 and 14 hours each. From the measured gamma-ray spectra, activity concentrations were determined for Th-232 (range from 1 to 906 Bq/kg), U-238 (from 1 to 588 Bq/kg) and K-40 (from 50 to 1606 Bq/kg). The total absorbed dose rates in air calculated from the concentrations of the three radionuclides, Th-232 and U-238 series and K-40, ranged from 7 to 1209 nGy/h for full utilization of the materials, from 4 to 605 nGy/h for half utilization and from 2 to 302 nGy/h for one quarter utilization. The total effective dose rates per person indoors were determined to be between 0.02 and 2.97 mSv/y for half utilization of the materials. Applying dose criteria recently recommended by the EU for superficial materials, 25 of the samples meet the exemption dose limit of 0.3 mSv/y, two meet the upper dose limit of 1 mSv/y, and only one clearly exceeds this limit.
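The dose arithmetic behind figures like these can be sketched as follows. The conversion coefficients and occupancy/conversion factors below are assumed, UNSCEAR-style values for building materials, not necessarily the ones used in the paper, and the example combines upper-range concentrations from different samples purely for illustration:

```python
def absorbed_dose_rate(c_th232, c_u238, c_k40, utilization=1.0):
    """Absorbed dose rate in air (nGy/h) from activity concentrations (Bq/kg),
    scaled by a utilization factor (1.0 full, 0.5 half, 0.25 quarter).
    Coefficients are assumed UNSCEAR-style values, in nGy/h per Bq/kg."""
    return utilization * (0.604 * c_th232 + 0.462 * c_u238 + 0.0417 * c_k40)

def annual_effective_dose(dose_rate_ngy_h, occupancy=0.8):
    """Indoor annual effective dose (mSv/y), assuming a 0.7 Sv/Gy conversion
    factor and 80% indoor occupancy over 8760 hours per year."""
    return dose_rate_ngy_h * 0.7 * occupancy * 8760 * 1e-6

# Illustrative upper-range concentrations at half utilization
rate = absorbed_dose_rate(906, 588, 1606, utilization=0.5)
dose = annual_effective_dose(rate)
```

The resulting dose lands in the same few-mSv/y range as the paper's half-utilization maximum, which is the kind of value the EU dose criteria are compared against.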

Journal ArticleDOI
TL;DR: In this paper, the authors draw a comparison between harmonious and problematic foreign business relationships between U.S. exporting manufacturers and find that firms with harmonious relationships are more experienced, employ more people, and exhibit more active behavior toward conducting their foreign business.
Abstract: Although the United States has been one of the leading actors in international trade, limited evidence exists as to individual U.S. company relationships with overseas customers. On the basis of a sample of 201 U.S. exporting manufacturers, the authors draw a comparison between harmonious and problematic foreign business relationships. The findings reveal that, as opposed to problematic cases, firms with harmonious relationships are more experienced, employ more people, and exhibit more active behavior toward conducting their foreign business. Such firms sell to a greater number of export markets, deal with more foreign customers, and obtain more orders. The study also shows that harmonious relationships with overseas customers are distinguished by greater dependence, trust, understanding, commitment, communication, and cooperation but less distance, uncertainty, and conflict between the parties. The article provides export management with a set of guidelines for establishing, developing, and sustaining harmonious relationships with overseas customers.

Journal ArticleDOI
TL;DR: In this article, a 0.1 wt% Pt supported on La 0.7Sr0.2Ce0.1FeO3 solid was studied for the NO/H2/O2 reaction in the 100-400°C range.

Journal ArticleDOI
TL;DR: This article examined the relation between teacher attributions of student school failure and teacher behavior toward the failing student and found that the presence of anger was associated with a teacher tendency to give up efforts to help the student improve.
Abstract: The aim of this study was to examine the relation between teacher attributions of student school failure and teacher behavior toward the failing student. A structural equation model was proposed and its ability to fit the data was tested. It was found that teachers tend to behave in ways that indicate more pity and less anger when they attribute a student's low achievement to her or his low abilities, whereas they express more anger when attributing low achievement to the student's low effort. In contrast to previous research that argues in favor of anger as a high ability cue, this study has found that the presence of anger was associated with a teacher tendency to give up efforts to help the student improve. This giving-up behavior was negatively related to the tendency of the teachers to accept some responsibility for the student failure. © 2002 Wiley Periodicals, Inc.

Journal ArticleDOI
TL;DR: In this paper, an integrated simulation and optimization framework for multicurrency asset allocation problems is developed, and models that optimize the conditional value-at-risk (CVaR) metric are implemented.
Abstract: We develop an integrated simulation and optimization framework for multicurrency asset allocation problems. The simulation applies principal component analysis to generate scenarios depicting the discrete joint distributions of uncertain asset returns and exchange rates. We then develop and implement models that optimize the conditional-value-at-risk (CVaR) metric. The scenario-based optimization models encompass alternative hedging strategies, including selective hedging that incorporates currency hedging decisions within the portfolio selection problem. Thus, the selective hedging model determines jointly the portfolio composition and the level of currency hedging for each market via forward exchanges. We examine empirically the benefits of international diversification and the impact of hedging policies on risk–return profiles of portfolios. We assess the effectiveness of the scenario generation procedure and the stability of the model's results by means of out-of-sample simulations. We also compare the performance of the CVaR model against that of a model that employs the mean absolute deviation (MAD) risk measure. We investigate empirically the ex post performance of the models on international portfolios of stock and bond indices using historical market data. Selective hedging proves to be the superior hedging strategy that improves the risk–return profile of portfolios regardless of the risk measurement metric. Although in static tests the MAD and CVaR models often select portfolios that trace practically indistinguishable ex ante risk–return efficient frontiers, in successive applications over several consecutive time periods the CVaR model attains superior ex post results in terms of both higher returns and lower volatility.
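The risk metric at the core of these models can be illustrated on equiprobable scenario losses. This is a hedged sketch of the CVaR computation only; the paper's actual scenario generation, hedging decisions, and optimization models are far richer:

```python
import numpy as np

def cvar(losses, beta=0.95):
    """Conditional value-at-risk of equiprobable scenario losses:
    the expected loss in the worst (1 - beta) tail of the distribution."""
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, beta)   # value-at-risk threshold at level beta
    tail = losses[losses >= var]      # scenarios at or beyond the threshold
    return tail.mean()

# Hypothetical portfolio losses over five scenarios
losses = [-0.02, 0.01, 0.03, 0.05, 0.10]
print(cvar(losses, beta=0.8))
```

Because CVaR averages the tail rather than reading off a single quantile, it is a coherent risk measure, which is one reason scenario-based models like the paper's optimize it directly.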

01 Jan 2002
TL;DR: In this paper, the authors consider selfish routing over a network consisting of m parallel bottleneck links through which n selfish users route their traffic trying to minimize their own expected latency.
Abstract: We consider selfish routing over a network consisting of m parallel links through which n selfish users route their traffic trying to minimize their own expected latency. We study the class of mixed strategies in which the expected latency through each link is at most a constant multiple of the optimum maximum latency had global regulation been available. For the case of uniform links it is known that all Nash equilibria belong to this class of strategies. We are interested in bounding the coordination ratio (or price of anarchy) of these strategies, defined as the worst-case ratio of the maximum (over all links) expected latency over the optimum maximum latency. The load balancing aspect of the problem immediately implies a lower bound Ω(ln m / ln ln m) on the coordination ratio. We give a tight (up to a multiplicative constant) upper bound. To show the upper bound, we analyze a variant of the classical balls and bins problem, in which balls with arbitrary weights are placed into bins according to arbitrary probability distributions. At the heart of our approach is a new probabilistic tool that we call ball fusion; this tool is used to reduce the variant of the problem where balls bear weights to the classical version (with no weights). Ball fusion applies to more general settings such as links with arbitrary capacities and other latency functions.
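The weighted balls-and-bins experiment the analysis builds on is easy to simulate: balls with arbitrary weights are thrown into bins according to arbitrary per-ball distributions, and the quantity of interest is the maximum bin load (the analogue of the maximum link latency). A toy sketch, with illustrative names only:

```python
import random

def max_load(weights, probs, m, rng):
    """Throw each weighted ball into one of m bins according to its own
    probability distribution; return the heaviest resulting bin load."""
    loads = [0.0] * m
    for w, p in zip(weights, probs):
        bin_idx = rng.choices(range(m), weights=p)[0]
        loads[bin_idx] += w
    return max(loads)

rng = random.Random(0)
m, n = 8, 64
weights = [1.0] * n                      # unit weights recover the classical version
probs = [[1 / m] * m for _ in range(n)]  # uniform strategies for every ball
print(max_load(weights, probs, m, rng))
```

In the classical unit-weight, uniform case the maximum load exceeds the average n/m by roughly a ln m / ln ln m factor, which is where the coordination ratio bound comes from; the paper's ball-fusion tool reduces the weighted case to this one.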

Journal ArticleDOI
TL;DR: This work developed an image-based rendering approach for displaying multiple avatars that takes advantage of the properties of the urban environment and the way a viewer and the avatars move within it to produce fast rendering, based on positional and directional discretization.
Abstract: Populated virtual urban environments are important in many applications, from urban planning to entertainment. At the current stage of technology, users can interactively navigate through complex, polygon-based scenes rendered with sophisticated lighting effects and high-quality antialiasing techniques. As a result, animated characters (or agents) that users can interact with are also becoming increasingly common. However, rendering crowded scenes with thousands of different animated virtual people in real time is still challenging. To address this, we developed an image-based rendering approach for displaying multiple avatars. We take advantage of the properties of the urban environment and the way a viewer and the avatars move within it to produce fast rendering, based on positional and directional discretization. To display many different individual people at interactive frame rates, we combined texture compression with multipass rendering.


Journal ArticleDOI
TL;DR: This paper looks at methods to deal with various aspects of crowd visualization, ranging from collision detection and behaviour modeling to fast rendering with shadows and quality shading, and suggests that simulations of reasonably complex environments populated with thousands of animated characters are possible in real time.
Abstract: Real-time crowd visualization has recently attracted quite an interest from the graphics community and, as interactive applications become even more complex, there is a natural demand for new and unexplored application scenarios. However, the interactive simulation of complex environments populated by large numbers of virtual characters is a composite problem which poses serious difficulties even on modern computer hardware. In this paper we look at methods to deal with various aspects of crowd visualization, ranging from collision detection and behaviour modeling to fast rendering with shadows and quality shading. These methods make extensive use of current graphics hardware capabilities with the aim of providing scalability without compromising run-time speed. Results from a system employing these techniques seem to suggest that simulations of reasonably complex environments populated with thousands of animated characters are possible in real-time.

Journal ArticleDOI
TL;DR: This article found that price-cost differentials cannot be explained by cost differences, making this an example of quality discrimination; market introduction time strongly affects sales, suggesting that time is the crucial dimension of discrimination; and there is substantial price rigidity across books and over time.

Journal ArticleDOI
02 Apr 2002-Langmuir
TL;DR: In this paper, the mechanism of formation of gold particles by reduction of AuIII in solutions of alkyltrimethylammonium chloride surfactants was studied in the absence and in the presence of NaCl.
Abstract: The mechanism of formation of gold particles by reduction of AuIII in solutions of alkyltrimethylammonium chloride surfactants was studied in the absence and in the presence of NaCl. AuIII anions interact strongly with trimethylammonium cations forming insoluble ion pairs (Torigoe et al. Langmuir 1992, 8, 59). Above the surfactant critical micelle concentration, the ion pairs are solubilized in the micelles, returning to the solution. Gold particles were produced by photochemical reduction of the clear micellar solutions. The coupling between surfactant aggregation and inorganic crystallization phenomena in these systems was investigated using transmission electron microscopy (TEM), UV−vis, and time-resolved fluorescence spectroscopy. At concentrations close to the phase boundary of the L1 phase with the lyotropic liquid crystalline phases, many gold particles have a threadlike morphology, as previously noted by Esumi et al. (Langmuir 1995, 11, 3285). The presence of NaCl modifies the micellar size and affe...

Journal ArticleDOI
TL;DR: This paper analyzes the computational complexity of credulous and sceptical reasoning under the semantics of admissible and preferred arguments for (the propositional variant of) the instances of the abstract framework capturing theorist, circumscription, logic programming, default logic, and autoepistemic logic.

Journal ArticleDOI
TL;DR: In this article, a nonparametric bootstrap procedure is proposed for stochastic processes which follow a general autoregressive structure; the procedure generates bootstrap replicates by locally resampling the original set of observations, automatically reproducing its dependence properties.

Journal ArticleDOI
TL;DR: A discrete-time (Markov-chain) methodology (implemented within a finite-difference scheme) is proposed for the valuation of American as well as European options and is also applicable to financial options with multiple types of rare events.

Journal ArticleDOI
TL;DR: Certain aspects of a particular version of the MFS, also known as the Charge Simulation Method, are investigated when it is applied to the Dirichlet problem for Laplace's equation in a disk.
Abstract: The Method of Fundamental Solutions (MFS) is a boundary-type method for the solution of certain elliptic boundary value problems. The basic ideas of the MFS were introduced by Kupradze and Alexidze and its modern form was proposed by Mathon and Johnston. In this work, we investigate certain aspects of a particular version of the MFS, also known as the Charge Simulation Method, when it is applied to the Dirichlet problem for Laplace's equation in a disk.
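The setting studied here admits a very compact sketch: approximate the harmonic solution on the unit disk as a linear combination of 2-D fundamental solutions log|x − y_j|, with the singularities y_j placed on a larger circle of radius R, and collocate the Dirichlet data on the boundary. The choice of R and the source placement below are assumptions for illustration; how such choices affect the method is exactly what papers like this investigate:

```python
import numpy as np

def mfs_solve(boundary_data, n=32, R=2.0):
    """Charge Simulation Method / MFS sketch for Laplace's equation on the
    unit disk with Dirichlet data; sources sit on a circle of radius R."""
    t = 2 * np.pi * np.arange(n) / n
    bx = np.c_[np.cos(t), np.sin(t)]       # collocation points on |x| = 1
    src = R * np.c_[np.cos(t), np.sin(t)]  # singularities on |y| = R > 1
    # Collocation matrix of fundamental solutions log|x_i - y_j|
    G = np.log(np.linalg.norm(bx[:, None] - src[None, :], axis=2))
    coeffs = np.linalg.solve(G, boundary_data(bx))
    def u(x):
        return np.log(np.linalg.norm(x - src, axis=1)) @ coeffs
    return u

# Boundary data g(x, y) = x, whose harmonic extension into the disk is u = x
u = mfs_solve(lambda p: p[:, 0])
print(u(np.array([0.5, 0.0])))  # should be close to 0.5
```

For analytic boundary data the error decays rapidly in n, though the collocation matrix grows ill-conditioned, a trade-off central to analyses of this method.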

Posted Content
TL;DR: In this paper, a general equilibrium model of a two-class source or host country was constructed to examine the effects of permanent migration on class and national welfare, showing that marginal immigration hurts people already in the country regardless of whether or not non-traded goods exist.
Abstract: An important issue in public policy debates is the effect of international migration on welfare in source and host countries. We address this issue by constructing a general equilibrium model of a two-class source or host country. Each country produces many traded and non-traded goods, uses income taxes and distributes the tax receipts equally to all individuals. The analysis examines the effects of permanent migration on class and national welfare. We show, among other things, that marginal immigration hurts people already in the country regardless of whether or not non-traded goods exist. The presence of international capital mobility, however, may reverse the above result.

Journal ArticleDOI
TL;DR: In this article, the spectrum of the confining strings in four-dimensional SU(N) gauge theories is studied, and the results are consistent with the sine formula $\sigma_k/\sigma = \sin(k\pi/N)/\sin(\pi/N)$ for the ratio between $\sigma_k$ and the standard string tension $\sigma$.
Abstract: We study the spectrum of the confining strings in four-dimensional SU($N$) gauge theories. We compute, for the SU(4) and SU(6) gauge theories formulated on a lattice, the string tensions $\sigma_k$ related to sources with $Z_N$ charge $k$, using Monte Carlo simulations. Our results are consistent with the sine formula $\sigma_k/\sigma = \sin(k\pi/N)/\sin(\pi/N)$ for the ratio between $\sigma_k$ and the standard string tension $\sigma$. For the SU(4) and SU(6) cases the accuracy is approximately 1% and 2%, respectively. The sine formula is known to emerge in various realizations of supersymmetric SU($N$) gauge theories. On the other hand, our results show deviations from Casimir scaling. We also discuss an analogous behavior exhibited by two-dimensional SU($N$)×SU($N$) chiral models.