
Showing papers by "Government of Canada" published in 2008


Journal ArticleDOI
TL;DR: The viability of corresponding processes for electrosynthesis of formate salts and/or formic acid from CO2 is examined here through conceptual flowsheets for two process options, each converting CO2 at the rate of 100 tonnes per day.
Abstract: With respect to the negative role of carbon dioxide on our climate, it is clear that the time is ripe for the development of processes that convert CO2 into useful products. The electroreduction of CO2 is a prime candidate here, as the reaction at near-ambient conditions can yield organics such as formic acid, methanol, and methane. Recent laboratory work on the 100 A scale has shown that reduction of CO2 to formate (HCO2−) may be carried out in a trickle-bed continuous electrochemical reactor under industrially viable conditions. Presuming the problems of cathode stability and formate crossover can be overcome, this type of reactor is proposed as the basis for a commercial operation. The viability of corresponding processes for electrosynthesis of formate salts and/or formic acid from CO2 is examined here through conceptual flowsheets for two process options, each converting CO2 at the rate of 100 tonnes per day.
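The 100-tonne-per-day figure implies a very large current draw, which a back-of-envelope Faraday's-law calculation makes concrete. The sketch below is an illustration, not from the paper; it assumes the two-electron reduction of CO2 to formate and 100% faradaic efficiency, and uses only textbook constants:

```python
# Back-of-envelope current required to reduce 100 tonnes of CO2 per day
# to formate, assuming CO2 + 2 H+ + 2 e- -> HCOOH at 100% efficiency.
F = 96485.0          # Faraday constant, C per mol of electrons
M_CO2 = 44.01        # molar mass of CO2, g/mol
ELECTRONS = 2        # electrons transferred per CO2 molecule

tonnes_per_day = 100.0
grams_per_day = tonnes_per_day * 1e6
mol_per_day = grams_per_day / M_CO2               # ~2.27e6 mol/day

charge_per_day = mol_per_day * ELECTRONS * F      # coulombs per day
current_amperes = charge_per_day / 86400.0        # C/s = amperes

print(f"{mol_per_day:.3e} mol CO2/day")
print(f"required current: {current_amperes / 1e6:.1f} MA")  # ~5.1 MA
```

At roughly 5 megaamperes of total current, the connection to the paper's 100 A laboratory scale is clear: a commercial plant would need on the order of tens of thousands of such cells in parallel.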

264 citations


Journal ArticleDOI
TL;DR: By directly dating the residues with accelerator mass spectrometry (AMS) radiocarbon measurement, the authors obtain the earliest direct dates for maize in Early Formative Ecuadorian sites and provide further support that, once domesticated ≈9000 calendar years ago, maize spread rapidly from southwestern Mexico to northwestern South America.
Abstract: The study of maize (Zea mays L.) domestication has advanced from questions of its origins to the study—and debate—of its dietary role and the timing of its dispersal from Mexico. Because the investigation of maize's spread is hampered by poor preservation of macrobotanical remains in the Neotropics, research has focused on microbotanical remains whose contexts are often dated by association, leading some to question the dates assigned. Furthermore, some scholars have argued that maize was not introduced to southwestern Ecuador until ≈4150–3850 calendar years before the present (cal B.P.), that it was used first and foremost as a fermented beverage in ceremonial contexts, and that it was not important in everyday subsistence, challenging previous studies based on maize starch and phytoliths. To further investigate these questions, we analyzed everyday cooking vessels, food-processing implements, and sediments for starch and phytoliths from an archaeological site in southwestern Ecuador constituting a small Early Formative village. Employing a new technique to recover starch granules from charred cooking-pot residues, we show that maize was present, cultivated, and consumed here in domestic contexts by at least 5300–4950 cal B.P. Directly dating the residues by accelerator mass spectrometry (AMS) radiocarbon measurement, our results represent the earliest direct dates for maize in Early Formative Ecuadorian sites and provide further support that, once domesticated ≈9000 calendar years ago, maize spread rapidly from southwestern Mexico to northwestern South America.
Keywords: ceramic residues; microfossil analysis; AMS radiocarbon measurement; crop dispersals; agricultural origins

125 citations


ReportDOI
TL;DR: This paper analyzed the long-term effects of graduation in a recession on earnings, job mobility, and employer characteristics for a large sample of Canadian college graduates using matched university-employer-employee data from 1982 to 1999.
Abstract: This paper analyzes the long-term effects of graduating in a recession on earnings, job mobility, and employer characteristics for a large sample of Canadian college graduates using matched university-employer-employee data from 1982 to 1999. The results are used to assess the role of job mobility and firm quality in the propagation of shocks for different groups in the labor market. We find that young graduates entering the labor market in a recession suffer significant initial earnings losses that, on average, eventually fade after 8 to 10 years. Labor market conditions at graduation affect firm quality and job mobility, which can account for 40-50% of losses and catch-up in our sample. We also document that higher skilled graduates suffer less from entry in a recession because they switch to better firms quickly. Lower skilled graduates are permanently affected by being down ranked to low-wage firms. These adjustment patterns are consistent with differential choices of intensity of search for better employers arising from comparative advantage and time-increasing search costs. All results are robust to an extensive sensitivity analysis including controls for correlated business cycle shocks after labor market entry, endogenous timing of graduation, permanent cohort differences, and selective labor force participation.

109 citations


Journal ArticleDOI
TL;DR: Twenty-one narwhals tagged in 2003 and 2004 in Admiralty Inlet showed a different summer distributional pattern than previous narwhal-tracking studies from Somerset Island, Eclipse Sound and Melville Bay; size of distributional range and population size did not appear to be related.
Abstract: Twenty-one narwhals tagged in 2003 and 2004 in Admiralty Inlet showed a different summer distributional pattern than previous narwhal-tracking studies from Somerset Island, Eclipse Sound and Melville Bay. On migration, the narwhals tracked from Admiralty Inlet moved out through Lancaster Sound 15 days earlier (P < 0.0001) than the narwhals summering around Eclipse Sound, whereas the Admiralty Inlet narwhals reached the mouth of Eclipse Sound 18 days later (P < 0.0001) than the Eclipse Sound summering population. The winter range of the Admiralty Inlet narwhals overlapped with the winter range of narwhals from Melville Bay and Eclipse Sound in central southern Baffin Bay and northern Davis Strait, but not with the winter range of narwhals from Somerset Island that wintered further north. Size of distributional range and population size did not appear to be related. Considerable year-to-year variation between areas of summer and winter distribution in the 2 years was believed to be related to the sample size and number of pods of whales tagged, rather than to differences in sex or age classes.

60 citations


ReportDOI
TL;DR: In this paper, the authors present comparative evidence from a sample of OECD countries and find that the average long run effect of an increase in imports on domestic productivity is close to zero.
Abstract: In the wake of falling trade costs, two central consequences in the importing economy are, first, that stronger competition through increased imports can lead to market share reallocations among domestic firms with different productivity levels (selection). Second, the increase in imports might improve domestic technologies through learning externalities (spillovers). Each of these channels may have a major impact on aggregate productivity. This paper presents comparative evidence from a sample of OECD countries. We find that the average long run effect of an increase in imports on domestic productivity is close to zero. If the scope for technological learning is limited, the selection effect dominates and imports lead to lower productivity. If, however, imports are relatively technology-intensive, imports also generate learning that can on net raise domestic productivity. Moreover, there is somewhat less selection when the typical domestic firm is large. The results support models in which trade triggers both substantial selection and technological learning.

51 citations


Journal ArticleDOI
TL;DR: Results indicate that SES inequities in utilization are apparent, appearing to be more relevant in initial contact with the system than in the number of visits.
Abstract: A plethora of literature links socioeconomic status (SES) to health and health care utilization. Recent anecdotal evidence indicates that Canadians believe their access to health care is diminishing. This study describes health care utilization patterns for services provided under public health insurance (physicians, specialists, and hospitals) in Canada between 1978 and 2003. The relationship between SES and utilization, controlling for health and demographic characteristics, is examined to investigate whether changes in the equity of utilization have occurred over time. Results indicate that SES inequities in utilization are apparent, appearing to be more relevant in initial contact with the system than in the number of visits. Specialists' services are particularly problematic and becoming more so over time.

40 citations


Journal ArticleDOI
TL;DR: There is a need to develop validated methods of amino acid analysis in foods using liquid chromatographic techniques, which have replaced ion-exchange methods for quantifying amino acids in most laboratories.
Abstract: Accurate standardized methods for the determination of amino acids in foods are required to assess the nutritional safety and compositional adequacy of sole-source foods such as infant formulas and enteral nutritionals, and of protein and amino acid supplements and their hydrolysates, and to assess protein claims of foods. Protein digestibility-corrected amino acid score (PDCAAS), which requires information on amino acid composition, is the official method for assessing protein claims of foods and supplements sold in the United States. PDCAAS has also been adopted internationally as the most suitable method for routine evaluation of protein quality of foods by the Food and Agriculture Organization/World Health Organization. Standardized methods for analysis of amino acids by ion-exchange chromatography have been developed. However, there is a need to develop validated methods of amino acid analysis in foods using liquid chromatographic techniques, which have replaced ion-exchange methods for quantifying amino acids in most laboratories. Bioactive peptides from animal and plant proteins have been found to potentially impact human health. A wide range of physiological effects, including blood pressure-lowering effects, cholesterol-lowering ability, antithrombotic effects, enhancement of mineral absorption, and immunomodulatory effects, have been described for bioactive peptides. There is considerable commercial interest in developing functional foods containing bioactive peptides. There is also a need to develop accurate standardized methods for the characterization (amino acid sequencing) and quantification of bioactive peptides, and to carry out dose-response studies in animal models and clinical trials to assess safety, potential allergenicity, potential intolerance, and efficacy of bioactive peptides. Information from these studies is needed for determining the upper safe levels of bioactive peptides and as the basis for developing potential health claims for bioactive peptides. This information is, in turn, needed by regulatory agencies for developing appropriate policy and regulations on adding these substances to foods and for determining if health claims are scientifically substantiated.

37 citations


Journal ArticleDOI
TL;DR: Empirical estimates of smoker preferences for increased efficacy and other attributes of smoking cessation therapies (SCTs) are provided and systematic preference heterogeneity for therapy types and SCT attributes between light and heavy smokers is found, as well as random heterogeneity using random parameters logit models.
Abstract: Promoting cessation is a cornerstone of tobacco control efforts by public-health agencies. Economic information to support cessation programs has generally emphasized cost-effectiveness or the impact of cigarette pricing and smoking restrictions on quit rates. In contrast, this study provides empirical estimates of smoker preferences for increased efficacy and other attributes of smoking cessation therapies (SCTs). Choice data were collected through a national survey of Canadian smokers. We find systematic preference heterogeneity for therapy types and SCT attributes between light and heavy smokers, as well as random heterogeneity using random parameters logit models. Preference heterogeneity is greatest between length of use and types of SCTs. We estimate that light smokers would be willing to pay nearly $500 ($CAN) to increase success rates to 40% with the comparable figure for heavy smokers being near $300 ($CAN). Results from this study can be used to inform research and development for smoking cessation products and programs and suggest important areas of future inquiry regarding heterogeneity of smoker preferences and preferences for other health programs. Copyright © 2008 John Wiley & Sons, Ltd.

29 citations


Journal ArticleDOI
TL;DR: The authors study the role of market structure in the diffusion of electronic banking services; these services (and electronic commerce more broadly) allow firms to economize on many transactions.
Abstract: The authors study the role of market structure in the diffusion of electronic banking services. These services (and electronic commerce more broadly) allow firms to economize on many transactions.

29 citations


Posted Content
TL;DR: This paper assesses the economic impacts of a free trade agreement between Canada and Korea, using the Global Trade Analysis Project (GTAP) CGE model and version 6 of its database (base year 2001).
Abstract: This document assesses the economic impacts of a free trade agreement between Canada and Korea, using the Global Trade Analysis Project (GTAP) CGE model and version 6 of its database (base year 2001). Five alternative scenarios are simulated based on a range of assumptions concerning the supply-side responses of the two economies to expanded bilateral trade. We find that choice of closure assumptions heavily influences the scale (and even the sign) of the welfare impacts and the breakdown of the economic impacts between output and terms of trade. The default closure within the GTAP modelling framework of fixed labour and capital supply, which is equivalent to assuming zero supply elasticities for factor inputs, results in terms of trade gains dominating output gains, an outcome that in our view is unlikely for small, open economies like Canada and Korea. Our central scenario incorporates the assumptions which in our view are best suited for Canada and Korea respectively; this yields larger economic gains for both countries, with a more reasonable breakdown between output and terms of trade impacts.

28 citations


Journal ArticleDOI
TL;DR: In this paper, a distinguishing feature of macro stress testing exercises is the use of macroeconomic models in scenario design and implementation, and it is widely agreed that scenarios should be based on "rare but plausible" events that have either resulted in vulnerabilities in the past or could do so in the future.
Abstract: A distinguishing feature of macro stress testing exercises is the use of macroeconomic models in scenario design and implementation. It is widely agreed that scenarios should be based on "rare but plausible" events that have either resulted in vulnerabilities in the past or could do so in the future.

Posted Content
TL;DR: In this paper, the Triple Helix model is adapted to the South African context, and facilities and impediments for working according to the Triple Helix in South Africa are identified; the focus is entrepreneurship development.
Abstract: This article deals with the Triple Helix (university, industry and government cooperation) from an institutional theory perspective. The empirical context is the Western Cape Region in South Africa and the focus is entrepreneurship development. The purpose is two-fold: first, the existing Triple Helix model is adapted to the South African context; and second, facilities and impediments for working according to the Triple Helix in South Africa are identified. The empirical material consists of a survey and three longitudinal case studies illustrating the degree of cooperation between the three parties. The article contributes to knowledge about how the Triple Helix model works on a regional level in a developing country. The study draws the following conclusions: when cooperation is identified between the three actors, only two of the three are involved; one missing link in the Triple Helix model is the focus on the entrepreneur; and cooperation between the three parties is incidental rather than planned, with a lack of structure. In turn, some of these conclusions may be an effect of institutional changes on a national level. As a normative contribution, the article proposes a set of suggestions for incorporating all relevant parties on a practical level.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the level of multifactor productivity (MFP) in Canada relative to that of the United States for the 1994-to-2003 period and examined the relative importance of differences in capital intensity and MFP in accounting for the labour productivity differences between the two countries.
Abstract: This paper has three main objectives. First, it examines the level of multifactor productivity (MFP) in Canada relative to that of the United States for the 1994-to-2003 period. Second, it examines the relative importance of differences in capital intensity and MFP in accounting for the labour productivity differences between the two countries. Third, it traces the overall MFP difference between Canada and the United States to its industry origins and estimates the contributions of the goods, services and engineering sectors to the overall MFP gap. Our main findings are as follows. First, the overall capital intensity is as high in Canada as in the United States; but there are considerable differences in Canada’s capital intensity across asset classes. Canada has considerably less machinery and equipment, about the same amount of buildings and considerably more engineering construction. Second, most of the differences in labour productivity between Canada and the United States are due to the differences in MFP. Third, our industry results show that the levels of labour productivity and MFP in the goods and the engineering sectors are closer to those of the United States. But the level of labour and multifactor productivity in the services sector is much lower in Canada. The lower levels of labour productivity and MFP in the Canadian services sector account for most of the overall productivity level difference between the two countries.

Journal ArticleDOI
TL;DR: The authors apply a method similar to Google's PageRank algorithm to the Canadian Large Value Transfer System, obtaining an estimate of the speed of payment processing at each bank.
Abstract: The authors apply a method similar to Google's PageRank algorithm to the Canadian Large Value Transfer System. They thereby obtain an estimate of the speed of payment processing at each bank.
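As a rough illustration of the PageRank analogy (a generic sketch, not the authors' actual method), the snippet below ranks banks in a hypothetical value-weighted payment-flow network by power iteration; the 4×4 flow matrix is invented for the example:

```python
import numpy as np

# Hypothetical payment flows: flows[i][j] = value sent from bank i to bank j.
flows = np.array([
    [ 0., 30., 10., 10.],
    [20.,  0., 20., 10.],
    [10., 25.,  0., 15.],
    [ 5., 10., 20.,  0.],
])

# Column-stochastic transition matrix: each bank's outflows sum to 1,
# so P[i, j] is the share of bank j's outgoing value that goes to bank i.
P = flows.T / flows.sum(axis=1)

d, n = 0.85, flows.shape[0]          # damping factor, number of banks
rank = np.full(n, 1.0 / n)
for _ in range(100):                 # power iteration to the fixed point
    rank = (1 - d) / n + d * P @ rank

print(rank.round(3))  # higher rank = more central in the payment network
```

The resulting vector sums to one and weights each bank by the (recursively weighted) flows it receives, which is the intuition behind using a PageRank-style score on interbank payments.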

Posted Content
TL;DR: In this article, a small quarterly projection model of the U.S. economy is estimated with Bayesian techniques, which provide a very efficient way of imposing restrictions to produce both plausible dynamics and sensible forecasting properties.
Abstract: This is the first of a series of papers that are being written as part of a project to estimate a small quarterly Global Projection Model (GPM). The GPM project is designed to improve the toolkit for studying both own-country and cross-country linkages. In this paper, we estimate a small quarterly projection model of the U.S. economy. The model is estimated with Bayesian techniques, which provide a very efficient way of imposing restrictions to produce both plausible dynamics and sensible forecasting properties. After developing a benchmark model without financial-real linkages, we introduce such linkages into the model and compare the results with and without linkages.

Posted Content
TL;DR: In this article, a small quarterly projection model of the US, Euro Area, and Japanese economies is estimated with Bayesian techniques, which provide a very efficient way of imposing restrictions to produce both plausible dynamics and sensible forecasting properties.
Abstract: This is the second of a series of papers that are being written as part of a larger project to estimate a small quarterly Global Projection Model (GPM). The GPM project is designed to improve the toolkit for studying both own-country and cross-country linkages. In this paper, we estimate a small quarterly projection model of the US, Euro Area, and Japanese economies. The model is estimated with Bayesian techniques, which provide a very efficient way of imposing restrictions to produce both plausible dynamics and sensible forecasting properties. We show how the model can be used to construct efficient baseline forecasts that incorporate judgment imposed on the near-term outlook.

Posted Content
TL;DR: In this paper, the authors estimate a small quarterly projection model of the US, Euro Area, and Japanese economies that incorporates oil prices and allows them to trace out the effects of shocks to oil prices.
Abstract: This is the third of a series of papers that are being written as part of a larger project to estimate a small quarterly Global Projection Model (GPM). The GPM project is designed to improve the toolkit for studying both own-country and cross-country linkages. In this paper, we estimate a small quarterly projection model of the US, Euro Area, and Japanese economies that incorporates oil prices and allows us to trace out the effects of shocks to oil prices. The model is estimated with Bayesian techniques. We show how the model can be used to construct efficient baseline forecasts that incorporate judgment imposed on the near-term outlook.

Journal ArticleDOI
TL;DR: In this paper, the authors developed comparable capital stock estimates to examine the relative capital intensity of Canada and the United States by applying common depreciation rates to Canadian and U.S. assets.
Abstract: Official data from statistical agencies are not always ideal for cross-country comparisons because of differences in data sources and methodology. Analysts who engage in cross-country comparisons need to carefully choose among alternatives and sometimes adapt data especially for their purposes. This paper develops comparable capital stock estimates to examine the relative capital intensity of Canada and the United States. To do so, the paper applies common depreciation rates to Canadian and U.S. assets to come up with comparable capital stock estimates by assets and by industry between the two countries. Based on common depreciation rates, it finds that capital intensity is higher in the Canadian business sector than in the U.S. business sector. This is the net result of quite different ratios at the individual asset level. Canada has a higher intensity of engineering infrastructure assets per dollar of gross domestic product produced. Canada has a lower intensity of information and communications technology (ICT) machinery and equipment (M&E); however, it has a deficit in M&E that goes beyond ICT assets.

Journal ArticleDOI
TL;DR: Ova metal concentrations observed in walleye and lake whitefish do not appear to be related to the ontogenetic or nutritional state of the female, and were generally lower than those reported in earlier studies.

Journal ArticleDOI
TL;DR: In this article, the authors present the long-term trends in outsourcing and offshoring across Canadian industries.
Abstract: The paper presents the long-term trends in outsourcing and offshoring across Canadian industries.

Posted Content
TL;DR: Policy research provided a means to shape international thinking around long‐term solutions for problems that seem insoluble in the present, and enabled experts and practitioners to set their sights farther ahead.
Abstract: This article discusses a policy research project on possible future economic, governance and constitutional arrangements for Iraq that brought together a number of Iraqi and international practitioners and scholars at the Canadian Ministry of Foreign Affairs and International Trade in 2006, highlighting some conclusions eventually also reflected in a volume, Iraq: Preventing a New Generation of Conflict, edited by Markus E. Bouillon, David M. Malone and Ben Rowswell (London and Boulder CA: Lynne Rienner, 2007). It also discusses policy research sometimes incubated within and sometimes supported beyond Canada's foreign ministry (and other such institutions).

Journal ArticleDOI
TL;DR: In this article, the effects of implicit guarantees from Fannie Mae and Freddie Mac on the macroeconomy of the US housing market are investigated; the authors focus on Fannie and Freddie providing an interest rate subsidy for mortgages.
Abstract: Working Paper 2008-3, February 2008. This working paper comments on Karsten Jeske and Dirk Krueger's "Housing and the Macroeconomy: The Role of Implicit Guarantees for Government Sponsored Enterprises," delivered at the Fiscal Policy and Monetary/Fiscal Policy Interactions conference held on April 19-20, 2007. Key words: housing, mortgage market, default risk

Jeske and Krueger attempt to quantify the effects of implicit guarantees from Fannie Mae and Freddie Mac on the macroeconomy. Modelling the guarantees is an incredibly daunting and complex task. Do the guarantees matter in times of aggregate turmoil, and/or do they have an effect due to the riskiness of individual mortgages? Perhaps the guarantee is simply a straight subsidy? The potential analysis is almost unlimited. I commend the authors for taking up such a task. They choose to focus on Fannie and Freddie providing an interest rate subsidy for mortgages. This comes from the fact that debt issued by Fannie Mae and Freddie Mac is 40 basis points lower than comparable debt. This lower rate can be interpreted as a consequence of Fannie and Freddie having an implicit guarantee from the federal government that lowers the risk premium they face. Jeske and Krueger evaluate the effects of a 40 basis point subsidy on mortgages in a fully specified general equilibrium model of heterogeneous agents who face individual risk and a complex asset choice of bonds, mortgages and housing. An important aspect of their model is that mortgages are defaultable. However, they do not include any aggregate risk. They find that removing a 40 basis point subsidy to mortgages results in:
* substantially smaller mortgages, with mortgages falling by 91% in the aggregate,
* a halving of default,
* a 0.72% decrease in the aggregate stock of housing,
* and higher welfare.
In the rest of this discussion I will slowly build up the main features of the model, starting from a representative agent environment and ending with their environment, to illustrate the effects that a mortgage subsidy plays in the economy, with an emphasis on welfare.

1 Model

1.1 Main Features

The main features of their model are: (a) agents are heterogeneous due to idiosyncratic income shocks; (b) agents derive utility from a consumption good and housing services; (c) agents can hold two assets: a risk-free bond, and a combination of risky housing and a (continuum of) mortgages; (d) owning and renting are disconnected; (e) there is a competitive banking sector that issues mortgages; (f) there is a construction sector, but it is normalized away; (g) last, the government subsidizes mortgages, financed via a proportional earnings tax.

1.2 Representative Agent Model

To examine the consequences of a subsidy, let's first consider a representative agent model where safe housing is the only asset. The representative agent's problem is

V(g) = max_{c, h, g'} u(c, h) + βV(g')

subject to

c + P_l h + g' = y + (1 − δ)g + P_l g'

where V is the value function, y is income, c is consumption, h is housing rented, g is housing owned and P_l is the rental price of housing. Note that I follow the timing in the paper that in the period that a house is purchased it can be rented out to receive P_l g'. The price of a house has also been normalized to one as in the paper. The market clearing condition is h = g'. In such a representative agent model there are no inefficiencies, so we get the efficient outcome. Therefore there is no role for a subsidy. However, to create a benchmark to examine the effects of a subsidy, modify the budget constraint by the introduction of a subsidy s financed by a tax τ: c + P_l h + g' = y(1 − τ) + (1 − δ)g + (P_l …

Journal ArticleDOI
TL;DR: In this paper, the authors compared the productivity growth of a set of Canadian and U.S. regulated industries over the period from 1977 to 2003, and found that many of the Canadian industries that underwent deregulation experienced faster labour productivity growth and multifactor productivity growth than did the aggregate Canadian business sector.
Abstract: This paper compares the productivity growth of a set of Canadian and U.S. regulated industries. Using data from Statistics Canada’s KLEMS database and the U.S. Bureau of Economic Analysis, the paper examines productivity growth in transportation services (which includes air and rail), broadcasting and telecommunications, and financial services (which includes financial intermediation and insurance), over the period from 1977 to 2003. The majority of these industries provide the foundational networks on which other industries rely. These sectors were quite heavily regulated in Canada at the beginning of the period of study (1977), experienced partial deregulation during the period and still faced various types of regulation at the end (2003). Deregulation also occurred in the United States, but regulation has generally been less restrictive there over most of the period. The evidence shows that many of the Canadian industries that underwent deregulation experienced faster labour productivity growth and multifactor productivity growth than did the aggregate Canadian business sector and had similar or higher productivity growth than did their counterparts in the United States over the 1977-to-2003 period. Those industries include rail transportation, broadcasting and telecommunications, financial intermediation and insurance carriers. The airline industry had slower productivity growth in Canada than in the United States over the 1977-to-2003 period.

Journal ArticleDOI
TL;DR: In this paper, the Profit Elasticity Measure (PEM) is proposed as a new measure of competition intensity, which has the advantage of being theoretically monotonic in competition intensity and relatively parsimonious in its data requirements.
Abstract: Numerous policy issues in Canada implicitly or explicitly rely on vigorous competition intensity. Surprisingly, as little is known about competition intensity in Canada today as was known at the time of the Macdonald Commission. In this paper, we describe and estimate a new measure of competition intensity, the Profit Elasticity Measure (PEM) proposed by Boone (2008), which has the advantage of being theoretically monotonic in competition intensity and relatively parsimonious in its data requirements. We then use these estimates to benchmark competition intensity in a number of industries in Canada and the U.S. for the period 1985-2005. The comparative analysis of these measures for selected Canadian and U.S. industries is consistent with popular qualitative perceptions about the relative intensity of competition in Canada and the U.S.
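Boone's measure is, in essence, the elasticity of firms' variable profits with respect to marginal cost: the more negative the elasticity, the more harshly the market punishes inefficiency, and hence the more intense competition is taken to be. A minimal sketch of the estimation idea on simulated (entirely hypothetical) firm data:

```python
import numpy as np

# Sketch of the profit-elasticity idea: regress log variable profit on
# log marginal cost across firms; a more negative slope (beta) is read
# as more intense competition. All firm data below are simulated.
rng = np.random.default_rng(0)
n = 200
log_mc = rng.normal(0.0, 0.3, n)      # firms' log marginal costs
true_beta = -3.0                       # assumed "competition" parameter
log_profit = 1.0 + true_beta * log_mc + rng.normal(0.0, 0.2, n)

# OLS of log profit on a constant and log marginal cost
X = np.column_stack([np.ones(n), log_mc])
alpha_hat, beta_hat = np.linalg.lstsq(X, log_profit, rcond=None)[0]

print(f"estimated profit elasticity: {beta_hat:.2f}")  # close to -3
```

In applied work the regression would of course use real firm-level profit and cost data, with industry-year variation in beta tracked over time; the toy regression above only shows the shape of the estimator.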

Journal ArticleDOI
TL;DR: In this article, the authors examined firm entry and exit patterns in the Canadian business sector by using the Longitudinal Employment Analysis Program database developed by Statistics Canada and presented stylized facts and provided descriptive analysis of the entry/exit patterns in order to form a solid foundation for future in-depth theoretical and empirical studies of firm dynamics.
Abstract: This paper examines firm entry and exit patterns in the Canadian business sector by using the Longitudinal Employment Analysis Program database developed by Statistics Canada. Our primary purpose is to present stylized facts and provide descriptive analysis of the entry and exit patterns in the Canadian economy in order to form a solid foundation for future in-depth theoretical and empirical studies of firm dynamics. In particular, this paper focuses on the relative importance of entrants and exiters in terms of both number and employment, the persistence of entry and exit patterns over time, and the correlation between industry entry and exit rates.

Journal ArticleDOI
TL;DR: The authors developed and estimated a DSGE model which realistically assumes that many goods in the economy are produced through more than one stage of production and showed that intermediate-stage technology shocks explain most of short-run output fluctuations, whereas final-stage technological shocks only have a small impact.
Abstract: We develop and estimate a DSGE model which realistically assumes that many goods in the economy are produced through more than one stage of production. Firms produce differentiated goods at an intermediate stage and a final stage, post different prices at both stages, and face stage-specific technological change. Wage-setting households are imperfectly competitive with respect to labor skills. Intermediate-stage technology shocks explain most of short-run output fluctuations, whereas final-stage technology shocks only have a small impact. Despite the dominance of technology shocks, the model predicts a near-zero correlation between hours worked and the return to work and mildly procyclical real wages. The factors mainly responsible for these findings are an input-output linkage between firms operating at the different stages and movements in the relative price of goods. We show that, depending on the source, a technology improvement may have either a contractionary or an expansionary impact on employment.

Journal ArticleDOI
TL;DR: In this paper, the National Statistics Council, which advises the chief statistician of Canada on the full range of Statistics Canada's activities, particularly on overall program priorities, was consulted.
Abstract: This paper was prepared for the National Statistics Council, which advises the Chief Statistician of Canada on the full range of Statistics Canada’s activities, particularly on overall program priorities.

Journal ArticleDOI
TL;DR: In this article, the authors examine the nature of Speaker impartiality within the British Parliament system by examining the political involvement and the casting vote of Speakers, and argue that the source of this irregularity was the close seat count between opposition and government members.
Abstract: This paper examines the nature of Speaker impartiality within the British parliamentary system by examining the political involvement and the casting vote of Speakers. It places the role of the Speaker in the province of New Brunswick in historical context and explains the institutional circumstances behind the unusual conduct of Speakers Bev Harrison and Michael ‘Tanker’ Malley during the third session of the 55th Legislative Assembly. The paper argues that the source of this irregularity was the close seat count between opposition and government members. The episode illustrates the difficulty smaller legislative assemblies face in meeting the British parliamentary convention of an impartial Speaker, one who is apolitical and thereby divorced from partisan politics. Under that convention, which serves as the fount of parliamentary practice, the Speaker refrains from any active political involvement and, in return, runs uncontested in subsequent parliamentary elections.