
Showing papers by "Instituto Tecnológico Autónomo de México published in 2000"


Journal ArticleDOI
TL;DR: In this article, a multivariate two-step estimator of the memory parameters of a nonstationary vector process was proposed to analyze the long-memory properties of trading volume for the 30 stocks in the Dow Jones Industrial Average index.
Abstract: This article examines consistent estimation of the long-memory parameters of stock-market trading volume and volatility. The analysis is carried out in the frequency domain by tapering the data instead of detrending them. The main theoretical contribution of the article is to prove a central limit theorem for a multivariate two-step estimator of the memory parameters of a nonstationary vector process. Using robust semiparametric procedures, the long-memory properties of trading volume for the 30 stocks in the Dow Jones Industrial Average index are analyzed. Two empirical results are found. First, there is strong evidence that stock-market trading volume exhibits long memory. Second, although it is found that volatility and volume exhibit the same degree of long memory for most of the stocks, there is no evidence that both processes share the same long-memory component.
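As a rough illustration of the kind of semiparametric memory estimation involved (not the article's tapered multivariate two-step estimator), here is a minimal univariate log-periodogram (GPH-type) sketch; the bandwidth choice m = sqrt(n) is an assumption made for the example.

```python
# Minimal sketch: log-periodogram (GPH-type) estimate of the memory parameter d.
import numpy as np

def gph_estimate(x, m=None):
    """Estimate the long-memory parameter d of series x from the first m
    Fourier frequencies of its periodogram via log-periodogram regression."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                       # common (assumed) bandwidth choice
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = (np.abs(dft) ** 2) / (2.0 * np.pi * n)
    # log I(lambda_j) ~ const - 2 d log(2 sin(lambda_j / 2))
    regressor = -2.0 * np.log(2.0 * np.sin(freqs / 2.0))
    X = np.column_stack([np.ones(m), regressor])
    beta, *_ = np.linalg.lstsq(X, np.log(periodogram), rcond=None)
    return beta[1]                               # estimate of d

# White noise should give d near 0; a long-memory series would give d > 0.
print(gph_estimate(np.random.default_rng(0).standard_normal(2048)))
```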

203 citations


Journal ArticleDOI
TL;DR: The authors use a calibrated life cycle model to evaluate why, in US cross-section data, high-income households as a group save a much higher fraction of income than low-income households, and find that age and relatively permanent earnings differences across households, together with the structure of the US social security system, are sufficient to replicate this fact.

135 citations


Journal ArticleDOI
TL;DR: The authors develop the logic for identifying the much larger market failures attributable to smokers' failure to fully internalize the costs of their addictive behavior, focusing on teen addiction as a form of "intrapersonal" externality.

98 citations


Journal ArticleDOI
TL;DR: This paper analyzes several control charts suitable for monitoring process dispersion when subgrouping is not possible or not desirable and compares the performances of a moving range chart, a cumulative sum (CUSUM) chart based on moving ranges, a CUSUMChart based on an approximate normalizing transformation, a self-starting C USUM chart, and a change-point CUSum chart.
Abstract: In this paper we analyze several control charts suitable for monitoring process dispersion when subgrouping is not possible or not desirable. We compare the performances of a moving range chart, a cumulative sum (CUSUM) chart based on moving ranges, a CUSUM chart based on an approximate normalizing transformation, a self-starting CUSUM chart, and a change-point CUSUM chart.
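A minimal sketch of the simplest of these schemes, an individuals moving range chart, assuming the standard n = 2 range constant D4 = 3.267; the data are simulated stand-ins, not from the paper.

```python
# Minimal moving range chart for monitoring dispersion without subgroups.
import numpy as np

def moving_range_chart(x):
    """Return moving ranges, the MR-chart center line, and the upper control limit."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))          # moving ranges of consecutive observations
    mr_bar = mr.mean()               # center line
    ucl = 3.267 * mr_bar             # D4 * MR-bar for ranges of size 2 (LCL = 0)
    return mr, mr_bar, ucl

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(10, 1, 50), rng.normal(10, 3, 20)])  # dispersion shift
mr, center, ucl = moving_range_chart(data)
print("signals at observations:", np.where(mr > ucl)[0] + 1)
```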

52 citations


Journal ArticleDOI
TL;DR: In this paper, the authors estimate an identified vector autoregression for the Mexican economy using monthly data from 1976 to 1997, taking into account the changes in the monetary policy regime which occurred during this period.
Abstract: Motivated by the dollarization debate in Mexico, we estimate an identified vector autoregression for the Mexican economy using monthly data from 1976 to 1997, taking into account the changes in the monetary policy regime which occurred during this period. We find that 1) exogenous shocks to monetary policy have had no impact on output and prices, 2) most of the shocks originated in the foreign sector, 3) disturbances originating in the U.S. economy have been a more important source of fluctuations for Mexico than shocks to oil prices. We also study the endogenous response of domestic monetary policy by means of a counterfactual experiment. The results indicate that the response of monetary policy to foreign shocks played an important part in the 1994 crisis.
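For readers unfamiliar with the tooling, a hedged sketch of a recursively identified VAR with impulse responses on simulated data (not the authors' specification, identification scheme, or Mexican data):

```python
# Illustrative sketch only: a small VAR with Cholesky-ordered impulse responses.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
# Stand-in monthly series (e.g., output, prices, interest rate); real data assumed elsewhere.
data = np.cumsum(rng.standard_normal((240, 3)), axis=0)

model = VAR(data)
results = model.fit(maxlags=12, ic="aic")       # lag length chosen by AIC
irf = results.irf(24)                           # impulse responses, recursive ordering
print(irf.irfs.shape)                           # (horizon+1, n_vars, n_vars)
```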

44 citations


Journal ArticleDOI
28 Feb 2000-Chaos
TL;DR: It is shown that the "primary intersection" of the stable and unstable manifolds is generically a neat submanifold of a "fundamental domain", and the intersections are computed perturbatively using a codimension-one Melnikov function.
Abstract: We study families of volume preserving diffeomorphisms in R3 that have a pair of hyperbolic fixed points with intersecting codimension one stable and unstable manifolds. Our goal is to elucidate the topology of the intersections and how it changes with the parameters of the system. We show that the “primary intersection” of the stable and unstable manifolds is generically a neat submanifold of a “fundamental domain.” We compute the intersections perturbatively using a codimension one Melnikov function. Numerical experiments show various bifurcations in the homotopy class of the primary intersections.

35 citations


Journal ArticleDOI
TL;DR: In this paper, the authors prove the large deviation principle for non-degenerate small noise diffusions with discontinuous drift and a state-dependent diffusion matrix based on a variational representation for functionals of strong solutions of stochastic differential equations and on weak convergence methods.
Abstract: This paper proves the large deviation principle for a class of non-degenerate small noise diffusions with discontinuous drift and with state-dependent diffusion matrix. The proof is based on a variational representation for functionals of strong solutions of stochastic differential equations and on weak convergence methods.

28 citations


Journal ArticleDOI
TL;DR: In this paper, the authors describe a methodology for evaluating information technology investments using the real options approach, which is suited for the evaluation of IT projects in which a firm invests an uncertain amount of money over an uncertain period of time to develop an IT asset that can be sold to third parties or used for its own purposes.
Abstract: This article describes a methodology for evaluating information technology investments using the real options approach. IT investment projects are categorized into development and acquisition projects depending upon the time it takes to start benefiting from the IT asset once the decision to invest has been taken. A couple of models that account for uncertainty both in the costs and benefits associated with the investment opportunity are proposed for each of these project types:

a) The first model is suited for the evaluation of IT projects in which a firm invests an uncertain amount of money over an uncertain period of time to develop an IT asset that can be sold to third parties or used for its own purposes. In this model, the stochastic cost function incorporates the technical and input cost uncertainties of Pindyck's formulation for investments of uncertain cost, the uncertainty in the time required for developing the IT asset and the possibility that a catastrophic event causes the permanent abandonment of the development effort. Benefits are summarized in the value of an underlying asset that also evolves stochastically over time.

b) The second model is suited for the valuation of investments in which a firm acquires an IT asset for its own use. In this model, investment is assumed to be instantaneous and the benefits associated with the investment are represented as a stream of differential cash flows over a period of time in which the technology is considered to be useful. This type of project is similar to an exchange option in which the exercise price (the cost) and the asset received are both uncertain.

In contrast with previous work, both models take into consideration the particular decay in costs experienced by some IT assets (e.g., hardware) over time even if no investment takes place. The paper also illustrates the application of the IT acquisition model for the valuation of a real world example involving the deployment of point-of-sale debit services by a banking network in New England.
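As a hedged illustration of the exchange-option analogy in the acquisition model (a simplified Margrabe-style stand-in with made-up parameters, not the paper's cost-decay dynamics), a short Monte Carlo sketch:

```python
# Monte Carlo value of exchanging an uncertain cost K_T for an uncertain benefit V_T.
import numpy as np

def exchange_option_mc(v0, k0, sigma_v, sigma_k, rho, T, n_paths=200_000, seed=0):
    """Risk-neutral Monte Carlo value of max(V_T - K_T, 0) with correlated GBMs."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
    # With no dividends both assets drift at the risk-free rate, and that drift
    # cancels against discounting for an exchange payoff, so r drops out.
    vT = v0 * np.exp(-0.5 * sigma_v ** 2 * T + sigma_v * np.sqrt(T) * z1)
    kT = k0 * np.exp(-0.5 * sigma_k ** 2 * T + sigma_k * np.sqrt(T) * z2)
    return np.maximum(vT - kT, 0.0).mean()

# Example: acquiring an IT asset worth ~120 at an uncertain cost near 100.
print(exchange_option_mc(v0=120.0, k0=100.0, sigma_v=0.4, sigma_k=0.2, rho=0.3, T=2.0))
```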

25 citations


Journal ArticleDOI
TL;DR: A basic combining rule of linear predictors is established and it is shown that such problems as forecast updating, missing value estimation, restricted forecasting with binding constraints, analysis of outliers and temporal disaggregation can be viewed as problems of optimal linear combination of restrictions and forecasts.
Abstract: An important tool in time series analysis is that of combining information in an optimal way. Here we establish a basic combining rule of linear predictors and show that such problems as forecast updating, missing value estimation, restricted forecasting with binding constraints, analysis of outliers and temporal disaggregation can be viewed as problems of optimal linear combination of restrictions and forecasts. A compatibility test statistic is also provided as a companion tool to check that the linear restrictions are compatible with the forecasts generated from the historical data. Copyright © 2000 John Wiley & Sons, Ltd.
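A minimal numerical sketch of the combining idea, assuming exact (binding) restrictions and illustrative matrices rather than anything from the paper: the forecast is adjusted toward the restriction and a chi-square statistic checks compatibility.

```python
# Combine a forecast with binding linear restrictions and test their compatibility.
import numpy as np
from scipy import stats

def combine_forecast(y_hat, Sigma, C, r):
    """Optimally adjust forecast y_hat (covariance Sigma) to satisfy C y = r,
    and return a chi-square compatibility statistic with its p-value."""
    resid = r - C @ y_hat
    M = C @ Sigma @ C.T
    gain = Sigma @ C.T @ np.linalg.inv(M)
    y_star = y_hat + gain @ resid                 # combined (restricted) forecast
    K = resid @ np.linalg.solve(M, resid)         # compatibility test statistic
    p_value = 1 - stats.chi2.cdf(K, df=len(r))
    return y_star, K, p_value

y_hat = np.array([10.0, 11.0, 12.0])              # three-step-ahead forecasts
Sigma = np.diag([1.0, 2.0, 3.0])                  # forecast error covariance
C = np.array([[1.0, 1.0, 1.0]])                   # restriction: the sum is known
r = np.array([36.0])
print(combine_forecast(y_hat, Sigma, C, r))
```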

24 citations


Journal ArticleDOI
01 Jun 2000-Test
TL;DR: In this article, the authors consider the Bayesian analysis of a simple Galton-Watson process and propose two simple analytic approximations to the posterior marginal distribution of the reproduction mean.
Abstract: In this article we consider the Bayesian statistical analysis of a simple Galton-Watson process. Problems of interest include estimation of the offspring distribution, classification of the process, and prediction. We propose two simple analytic approximations to the posterior marginal distribution of the reproduction mean. This posterior distribution suffices to classify the process. In order to assess the accuracy of these approximations, a comparison is provided with a computationally more expensive approximation obtained via standard Monte Carlo techniques. Similarly, a fully analytic approximation to the predictive distribution of the future size of the population is discussed. Sampling-based and hybrid approximations to this distribution are also considered. Finally, we present some illustrative examples.
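As a hedged sketch of one convenient special case (conjugate Poisson offspring with a Gamma prior, my choice rather than the article's approximations), the posterior of the reproduction mean and the subcriticality probability can be computed directly:

```python
# Conjugate Bayesian inference for the reproduction mean of a Galton-Watson process,
# assuming Poisson(m) offspring and a Gamma(a, b) prior on m.
import numpy as np
from scipy import stats

generations = np.array([1, 2, 3, 5, 4, 6])        # observed generation sizes z_0,...,z_n
parents = generations[:-1].sum()                  # total individuals that reproduced
offspring = generations[1:].sum()                 # total offspring they produced

a, b = 1.0, 1.0                                   # Gamma prior hyperparameters (assumed)
posterior = stats.gamma(a + offspring, scale=1.0 / (b + parents))

print("posterior mean of m:", posterior.mean())
print("P(subcritical, m < 1):", posterior.cdf(1.0))  # classification of the process
```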

21 citations


Posted Content
TL;DR: In this article, the authors describe a methodology for evaluating information technology investments using the real options approach, where IT investment projects are categorized into development and acquisition projects depending upon the time it takes to start benefiting from the IT asset once the decision to invest has been taken.
Abstract: This article describes a methodology for evaluating information technology investments using the real options approach. IT investment projects are categorized into development and acquisition projects depending upon the time it takes to start benefiting from the IT asset once the decision to invest has been taken. A couple of models that account for uncertainty both in the costs and benefits associated with the investment opportunity are proposed for these project types. Our stochastic cost function for IT development projects incorporates the technical and input cost uncertainties of Pindyck’s model (1993) but also considers the fact that the investment costs of some IT projects might change even if no investment takes place. In contrast to other models in the real options literature in which benefits are summarized in the underlying asset value, our model for IT acquisition projects represents these benefits as a stream of stochastic cash flows.

Journal ArticleDOI
TL;DR: In this paper, the authors estimate the aggregate import demand function for Greece using annual data for the period 1951-92 and find that the variables used in the aggregate import demand function are not stationary but are cointegrated.
Abstract: This study estimates the aggregate import demand function for Greece using annual data for the period 1951–92. There are two methodological novelties in this paper. The authors find that the variables used in the aggregate import demand function are not stationary but are cointegrated. Thus, a long-run equilibrium relationship exists among these variables during the period under study. The price elasticity is found to be close to unity in the long run. The cross-price elasticity is also found to be close to unity. Import demand is found to be highly income elastic in the long run. This implies that with economic growth, ceteris paribus, the trade deficit for Greece is likely to get worse.
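Purely as an illustration of the kind of cointegration check involved (on simulated placeholder series, not the Greek data), an Engle-Granger style sketch:

```python
# Unit-root and cointegration checks of the type underlying the import demand estimation.
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(0)
income = np.cumsum(rng.standard_normal(200))            # I(1) activity variable
imports = 1.2 * income + rng.standard_normal(200)       # cointegrated with income

print("ADF p-value, imports level:", adfuller(imports)[1])   # typically non-stationary
print("Engle-Granger p-value:", coint(imports, income)[1])   # small => cointegration
```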

Journal ArticleDOI
TL;DR: In this article, the effect of the term structure of interest rates on the exchange rate for Mexico is investigated, and the authors show that the relationship between the currency exchange rate and interest rates can be complicated and counterintuitive when investors are risk averse.

Journal ArticleDOI
TL;DR: This paper provides a retrospective view of the adoption of CASE tools in organizations, using empirical data from various research studies in the field, and offers some explanations of why some organizations are successful in adopting CASE tools.
Abstract: This paper provides a retrospective view of the adoption of CASE tools in organizations using some empirical data from various research studies in this field. First, relevant factors that influence the decision to adopt such a tool are discussed. Such factors include elements related to the organization adopting such a technology, as well as other characteristics associated with the application environment and the alternative development methods being used. Then, the advantages and disadvantages of using CASE tools are discussed and some critical success factors are identified. Finally, a taxonomy of CASE tools in the 90's is presented. The paper provides some explanations of why some organizations are successful in adopting CASE tools and gives recommendations for making better use of such technology.

Journal ArticleDOI
TL;DR: In this paper, the performance of three different methods of calculating VaR in the context of volatile markets is compared, and weaknesses of these methods are examined by using five different tests, including time until first failure, failure rate, expected value, autocorrelation, and rolling mean absolute percentage error.
Abstract: We compare the performance of three different methods of calculating VaR in the context of volatile markets. Some of these methods are routinely used for banks, pension funds and mutual funds without critical valuation of their efficiency. We examine weaknesses of these methods by using five different tests. They are (1) test based on the time until first failure, (2) test based on failure rate, (3) test based on expected value, (4) test based on autocorrelation, and (5) test based on (rolling) mean absolute percentage error.
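A minimal sketch of the failure-rate idea, implemented as a Kupiec-style proportion-of-failures test on simulated returns with a static stand-in VaR (the paper's tests and data are not reproduced here):

```python
# Kupiec proportion-of-failures test: does the VaR exceedance rate match its target?
import numpy as np
from scipy import stats
from scipy.special import xlogy

def kupiec_pof(returns, var_forecasts, alpha=0.01):
    """Likelihood-ratio test that the VaR exceedance rate equals alpha."""
    exceptions = returns < -var_forecasts          # losses beyond the VaR forecast
    n, x = len(returns), int(exceptions.sum())
    p_hat = x / n
    lr = -2.0 * (xlogy(x, alpha) + xlogy(n - x, 1 - alpha)
                 - xlogy(x, p_hat) - xlogy(n - x, 1 - p_hat))
    return x, 1 - stats.chi2.cdf(lr, df=1)         # number of exceedances and p-value

rng = np.random.default_rng(0)
rets = rng.normal(0.0, 0.02, 500)                  # simulated daily returns
var99 = np.full(500, 2.326 * 0.02)                 # static normal 99% VaR as a stand-in
print(kupiec_pof(rets, var99))
```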

Journal ArticleDOI
TL;DR: In this paper, an explicit formula for the saddle connection of an integrable family of standard maps studied by Y. Suris is given and compared with computations of the lobe area.

Book ChapterDOI
01 Jan 2000
TL;DR: Social security has been a phenomenon of the twentieth century, and the number of countries with old age, disability and death programs has increased steadily (see table 6.1), as mentioned in this paper.
Abstract: Social security has been a phenomenon of the twentieth century. The number of countries with old age, disability and death programs has increased steadily (see table 6.1).

01 Aug 2000
TL;DR: In this paper, the authors present results from a study conducted with 98 students between 12 and 19 years of age, in which they investigated the understanding of the concept of variable across the different grades of secondary education.
Abstract: It has been shown that even after studying algebra for several years, university students have serious difficulties understanding the elementary uses of the variable. This paper presents and analyzes the results of a study that investigated the understanding of the concept of variable at the different grades of secondary education. The study was carried out with 98 students between 12 and 19 years of age. The results show that the conceptions of the variable held by students in the different courses do not reflect a substantial difference in the understanding of this concept. We consider that the difficulties shown by the students are strongly influenced by teaching practices and by the content of algebra courses.

Posted Content
TL;DR: In this paper, the authors evaluate the desirability of having an elastic currency generated by a lender of last resort that prints money and lends it to banks in distress, and explore two alternate policies aimed at eliminating such monetary instability while preserving the steady-state benefits of the elastic currency.
Abstract: We evaluate the desirability of having an elastic currency generated by a lender of last resort that prints money and lends it to banks in distress. When banks cannot borrow, the economy has a unique equilibrium that is not Pareto optimal. The introduction of unlimited borrowing at a zero nominal interest rate generates a steady state equilibrium that is Pareto optimal. However, this policy is destabilizing in the sense that it also introduces a continuum of non-optimal inflationary equilibria. We explore two alternate policies aimed at eliminating such monetary instability while preserving the steady-state benefits of an elastic currency. If the lender of last resort imposes an upper bound on borrowing that is low enough, no inflationary equilibria can arise. For some (but not all) economies, the unique equilibrium under this policy is Pareto optimal. If the lender of last resort instead charges a zero real interest rate, no inflationary equilibria can arise. The unique equilibrium in this case is always Pareto optimal.

01 Sep 2000
TL;DR: This paper proposes an algorithmic methodology that obtains a series of segmentations of human head tomographies, produces a set of unstructured points in 3D space, and then automatically reconstructs a surface from that set of unstructured 3D points about which the authors have no topological knowledge.
Abstract: Reconstructing the surface from a set of unstructured points to build a 3D model is a problem that arises in many scientific and industrial fields as new 3D scanning technology is able to produce large databases of full 3D information. 3D surface reconstruction is also important after segmenting sets of 2D images to visualise the 3D surface represented by the segmentation. In this paper we propose an algorithmic methodology that obtains a series of segmentations of human head tomographies, produces a set of unstructured points in the 3D space, and then automatically produces a surface from the set of unstructured 3D points about which we have no topological knowledge. The methodology can be divided into two stages. First, tomographic images are segmented with a Neural Network algorithm based on Kohonen's Self-Organising Maps (SOM). The output neurones that have adapted to the image are a series of 3D points that will be fed to the second stage. Next, our method uses a spatial decomposition and surface tracking algorithm to produce a rough approximation S' of the unknown manifold S. The produced surface S' serves as initialisation for a dynamic mesh model that yields the details of S to improve the quality of the reconstruction.
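A hedged sketch of the first stage only: a tiny Kohonen SOM whose output neurones adapt to 2D points standing in for segmented contour pixels; the grid size, learning rate and neighbourhood schedule are assumptions, not the paper's settings.

```python
# Minimal 1D Kohonen SOM adapting a chain of neurones to ring-like "contour" data.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
points = np.column_stack([np.cos(theta), np.sin(theta)])      # stand-in contour points

n_units = 40
weights = rng.uniform(-1, 1, (n_units, 2))                    # chain of output neurones
for t in range(2000):
    x = points[rng.integers(len(points))]
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))      # best matching unit
    lr = 0.5 * (1 - t / 2000)                                 # decaying learning rate
    radius = max(1.0, n_units / 4 * (1 - t / 2000))           # shrinking neighbourhood
    h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * radius ** 2))
    weights += lr * h[:, None] * (x - weights)                # neighbourhood update

# The adapted neurones approximate the contour; in the paper the adapted (3D) points
# are then fed to the surface tracking and dynamic mesh stage.
print(weights[:5])
```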

Journal Article
TL;DR: In this article, the authors present an expected utility framework for the study of multiparty elections and for the first time estimate the magnitude and direction of strategic voting for a Mexican federal election, showing that strategic voting is not only exchanged between the two main opposition parties, but that the PRI is also amongst its beneficiaries.
Abstract: This article presents an expected utility framework for the study of multiparty elections and for the first time estimates the magnitude and direction of strategic voting for a Mexican federal election. The study provides a methodology for measuring the opportunity cost of voting sincerely and shows how its use improves the predictive efficacy of the electoral choice model. Its findings challenge plebiscitarian interpretations of Mexican elections by clearly showing that strategic voting is not only exchanged between the two main opposition parties, but that the PRI is also amongst its beneficiaries. It shows that no particular demographic characteristic is associated with the likelihood of voting strategically, and illustrates the expected utility calculations driving this behavior.

Book ChapterDOI
11 Apr 2000
TL;DR: An overview of modeling and simulation in NSL together with a depth perception model example and current and future work with the NSL/ASL system in the development and simulation of modular neural systems executed in a single computer or distributed computer network are discussed.
Abstract: As neural systems become large and complex, sophisticated tools are needed to support effective model development and efficient simulation processing. Initially, during model development, rich graphical interfaces linked to powerful programming languages and component libraries are the primary requirement. Later, during model simulation, processing efficiency is the primary concern. Workstations and personal computers are quite effective during model development, while parallel and distributed computation become necessary during simulation processing. We give first an overview of modeling and simulation in NSL together with a depth perception model example. We then discuss current and future work with the NSL/ASL system in the development and simulation of modular neural systems executed in a single computer or distributed computer network.

Proceedings ArticleDOI
19 Mar 2000
TL;DR: Any estimation procedure should reconcile the semantics of a fuzzy set with the experimental evidence conveyed by numeric data; this motivates a hybrid two-phase approach that starts from a rough specification of the support of the fuzzy sets and is followed by detailed computations involving a specific type of membership function and an estimation of its parameters.
Abstract: In this study, we elaborate on the important issue of membership function determination. The main point is that any estimation procedure should reconcile the semantics of a fuzzy set (regarded as an information granule of some level of abstraction) with the experimental evidence conveyed by numeric data. This, in the sequel, calls for the development of the hybrid two-phase approach that starts from a rough specification of the support of the fuzzy sets that is followed by detailed computations involving a specific type of membership function and an estimation of its parameters. The role of robust statistics in this setting is also raised. Finally, experimental results are presented.
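A small sketch of the two-phase idea under stated assumptions: the support is first bounded roughly from robust quantiles of the data, then a Gaussian membership function is fitted inside it. The quantile levels and the Gaussian family are my choices, not necessarily the authors'.

```python
# Two-phase membership function determination: rough support, then parameter estimation.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(5.0, 1.5, 300)                   # numeric evidence for the concept

# Phase 1: rough support specification (robust against outliers).
lo, hi = np.quantile(data, [0.01, 0.99])
core = data[(data >= lo) & (data <= hi)]

# Phase 2: detailed parameter estimation for a Gaussian membership function.
m, s = core.mean(), core.std()
membership = lambda x: np.exp(-0.5 * ((x - m) / s) ** 2)

print("support:", (round(lo, 2), round(hi, 2)))
print("membership at 5 and at 9:", membership(5.0), membership(9.0))
```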

Book ChapterDOI
11 Apr 2000
TL;DR: This work aims to express some general conclusions for risk assessment of software projects, particularly, but not limited to, those involving knowledge acquisition (KA).
Abstract: Software project mistakes represent a loss of millions of dollars to thousands of companies all around the world. These software projects that somehow ran off course share a common problem: risks became unmanageable. There are a number of conjectures we can draw from the high failure rate: bad management procedures, an inept manager in charge, managers not assessing risks, poor or inadequate methodologies, etc. Some of them might apply to some cases, to all, or to none; it is almost impossible to think in absolute terms when a software project is an ad hoc solution to a given problem. Nevertheless, there is an ongoing effort in the knowledge engineering (KE) community to isolate risk factors and provide remedies for runaway projects; unfortunately, we are not there yet. This work aims to express some general conclusions for risk assessment of software projects, particularly, but not limited to, those involving knowledge acquisition (KA).

Journal ArticleDOI
TL;DR: In this article, the basic norm and its legal functions are analyzed and a possible solution to the problem of irregular norms is offered through new definitions of the existence, validity and legitimacy of norms.
Abstract: `Authority', `competence' and other related concepts are determined on the basis of the concept of law as a dynamic order of norms. The norms which regulate the processes of norm creation establish empowerments (Ermächtigungen). The material domain of validity of the empowering norm is called `competence'. The concept of `person' in relation to empowering norms yields the concepts of `organ' and `authority'. The spatial domain of the validity of these norms is the spatial or territorial jurisdiction. This paper analyses the basic norm and its legal functions; it considers the irregularity of legal acts and norms, as well as the legal consequences thereof, namely nullity and annulment. Additionally, the Kelsenian `Tacit Alternative Clause' is criticized and a possible solution to the problem of irregular norms is offered through new definitions of the existence, validity and legitimacy of norms.

10 Jan 2000
TL;DR: As discussed by the authors, legislative creation is regarded as a process of interaction among different elements (editors, addressees, legal system, ends and values), and this brief working document articulates the Theory of Legislation with New Institutionalism and restates the process of legislative production.
Abstract: The difficulties we face in Mexico in producing laws are not due exclusively to political problems arising from the different political platforms represented in the Legislative and Executive branches, but also to problems inherent to a complex legal system, characterized by legislative designs in which vagueness, ambiguity, redundancy, contradiction and inconsistency often prevail. These characteristics of the legal system originate to a large extent in the application of poor legislative technique and in the lack of a legislative policy. Taking seriously the problems inherent to law-making and their usefulness in the construction of public policy means accepting an invitation to debate the process of legislative decision-making and the quality of laws. In these pages, which constitute a brief working document in both breadth and depth, the reader will find some ideas for articulating the Theory of Legislation with New Institutionalism, together with a restatement of the process of legislative production. The ideas build on what Atienza calls the internal analysis of legislation, which consists, as Gema Marcilla points out, in regarding legislative creation as "a process of interaction among different elements (editors, addressees, legal system, ends and values) which, moreover, can be examined from different perspectives or ideas of rationality (linguistic, logical-formal, pragmatic, teleological and ethical)". We focus on that process of interaction, on its definition, on its different stages and on the activities that can be considered within each stage.


Book ChapterDOI
01 Jan 2000
TL;DR: In this paper, the authors survey the literature on the effects of pay-as-you-go systems on labor markets and examine the privatization of social security and its impact on the labor market.
Abstract: Changes in the social security system have an impact on the rest of the economy. The most direct effects would be in the labor market. Increases in social security benefits usually involve raising taxes. Taxes take the form of a payroll tax. Therefore, the most common method of assessing the (first round) effects of social security is to analyze the impact on the labor market. In the first section, we study two aspects of the effects of social security on labor markets. First, we survey the literature on the effects of pay-as-you-go systems on labor markets, and second, the privatization of social security and its impact on the labor market.


Book ChapterDOI
01 Jan 2000
TL;DR: This chapter develops models of social security that have been used for analyzing social security systems and introduces two basic models: models of population and models of economics.
Abstract: In this chapter, we will develop models of social security that have been used for analyzing social security systems. There are two basic models: models of population and models of economics.