Book

Intermediate Microeconomics: A Modern Approach

01 Jan 2006
TL;DR: The Varian approach gives students tools they can use on exams, in the rest of their classes, and in their careers after graduation; the text remains the most modern presentation of the subject.
Abstract: This best-selling text is still the most modern presentation of the subject. The Varian approach gives students tools they can use on exams, in the rest of their classes, and in their careers after graduation.
Citations
Journal ArticleDOI
TL;DR: In this article, the authors propose a framework to estimate the indirect economic loss due to damaged bridges within the highway system of a major metropolitan area, using a simulated earthquake in the St. Louis metropolitan region as a case study.
Abstract: Destruction from natural and man-made disasters can result in extensive damage to the affected area's infrastructure. While the destruction results in costs that are necessary to restore the physical destruction and repair of existing infrastructure, a wider economic impact is often indirectly measured and felt. Policy-makers generally focus only on losses that are directly caused by the destruction, such as the replacement of roads and bridges, yet tend to overlook the consequences from indirect economic losses. This study proposes a framework to estimate the indirect economic loss due to damaged bridges within the highway system of a major metropolitan area. For the research, a simulated earthquake within the St. Louis metropolitan region is selected as a case study. The computable general equilibrium (CGE) model is applied as the loss estimation tool for modeling the indirect cost. The study results show that the indirect loss is significant when compared to the direct loss and should therefore be considered.

37 citations


Cites background from "Intermediate microeconomics : A mod..."

  • ...The equilibrium state of the CGE system requires that all markets satisfy Walras's law ( Francois and Reinert, 1997 ; Shoven and Whalley, 1992 ; Nicholson, 1994 ; Varian, 1993 ; Partridge and Rickman, 1998 )....

    [...]
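Walras's law, invoked in the excerpt above, states that the value of aggregate excess demand is zero at any positive price vector, not just at equilibrium. A minimal numerical sketch with a single Cobb-Douglas consumer (a toy illustration, not the paper's CGE model; all parameter values are made up):

```python
# Toy exchange economy: one consumer with Cobb-Douglas preferences
# (expenditure shares `alphas`) and an initial endowment. Walras's law
# implies p . z(p) = 0 for ANY positive price vector p.
def excess_demand(prices, alphas, endowment):
    wealth = sum(p * w for p, w in zip(prices, endowment))
    demand = [a * wealth / p for a, p in zip(alphas, prices)]
    return [d - w for d, w in zip(demand, endowment)]

prices = [2.0, 5.0]                      # deliberately NOT equilibrium prices
z = excess_demand(prices, alphas=[0.3, 0.7], endowment=[4.0, 1.0])
value = sum(p * zi for p, zi in zip(prices, z))
print(abs(value) < 1e-9)  # True: the value of excess demand vanishes
```

Individual markets need not clear here (the excess demands are nonzero); only their price-weighted sum is zero, which is exactly the identity a CGE solver exploits.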

Proceedings ArticleDOI
Fan Zhang, Yiqun Liu, Xin Li, Min Zhang, Yinghui Xu, Shaoping Ma
07 Aug 2017
TL;DR: A new evaluation framework based on upper limits (either fixed or changing as the search proceeds) for both benefit and cost is proposed; the authors show how to derive a new metric from the framework and demonstrate that it can be adopted to revise traditional metrics such as Discounted Cumulative Gain, Expected Reciprocal Rank and Average Precision.
Abstract: The design of a Web search evaluation metric is closely related with how the user's interaction process is modeled. Each behavioral model results in a different metric used to evaluate search performance. In these models and the user behavior assumptions behind them, when a user ends a search session is one of the prime concerns because it is highly related to both benefit and cost estimation. Existing metric design usually adopts some simplified criteria to decide the stopping time point: (1) upper limit for benefit (e.g. RR, AP); (2) upper limit for cost (e.g. Precision@N, DCG@N). However, in many practical search sessions (e.g. exploratory search), the stopping criterion is more complex than the simplified case. Analyzing benefit and cost of actual users' search sessions, we find that the stopping criteria vary with search tasks and are usually combination effects of both benefit and cost factors. Inspired by a popular computer game named Bejeweled, we propose a Bejeweled Player Model (BPM) to simulate users' search interaction processes and evaluate their search performances. In the BPM, a user stops when he/she either has found sufficient useful information or has no more patience to continue. Given this assumption, a new evaluation framework based on upper limits (either fixed or changeable as search proceeds) for both benefit and cost is proposed. We show how to derive a new metric from the framework and demonstrate that it can be adopted to revise traditional metrics like Discounted Cumulative Gain (DCG), Expected Reciprocal Rank (ERR) and Average Precision (AP). To show effectiveness of the proposed framework, we compare it with a number of existing metrics in terms of correlation between user satisfaction and the metrics based on a dataset that collects users' explicit satisfaction feedbacks and assessors' relevance judgements. Experiment results show that the framework is better correlated with user satisfaction feedbacks.
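The stopping rule at the heart of the abstract's Bejeweled Player Model can be sketched in a few lines: the simulated user scans results top-down and stops once cumulative benefit reaches an upper limit or cumulative cost exceeds a patience budget. This is a hedged sketch of the idea only; the names `benefit_limit`, `cost_limit`, the unit cost per result, and the DCG-style discounting are illustrative assumptions, not the paper's exact formulation:

```python
import math

def bpm_style_score(gains, benefit_limit, cost_limit, cost_per_result=1.0):
    """Score a ranked list under a dual benefit/cost stopping rule."""
    benefit = cost = score = 0.0
    for rank, g in enumerate(gains, start=1):
        cost += cost_per_result
        if cost > cost_limit:                 # patience exhausted -> stop
            break
        benefit += g
        score += g / math.log2(rank + 1)      # DCG-style rank discount
        if benefit >= benefit_limit:          # found enough -> stop
            break
    return score

# User needs 2 units of gain: stops at rank 3, never sees rank 4.
print(bpm_style_score([1.0, 0.0, 1.0, 1.0], benefit_limit=2.0, cost_limit=10.0))  # 1.5
```

Setting `benefit_limit` alone recovers RR/AP-style behaviour, while `cost_limit` alone recovers Precision@N/DCG@N-style truncation, which is the unification the framework is after.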

37 citations


Additional excerpts

  • ...Then Azzopardi suggests that Production Theory [36] could be used to model the search process instead and proposes Search Economic Theory (SET) [3] to model ad-hoc topic retrieval....

    [...]

Journal ArticleDOI
TL;DR: In this paper, the authors used a latent stochastic frontier model to estimate the technical efficiency of UK airports and rank them accordingly for the period 2000-06.
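In a stochastic frontier model of the kind the TL;DR refers to, log output is decomposed as ln y = x'β + v − u, where v is symmetric noise and u ≥ 0 is inefficiency, so technical efficiency is TE = exp(−u) ∈ (0, 1]. A minimal simulation sketch (illustrative only, not the paper's estimator; the half-normal inefficiency distribution and all parameter values are assumptions):

```python
import math
import random

random.seed(0)

def simulate_unit(log_inputs, beta, sigma_v=0.1, sigma_u=0.3):
    """Draw one production unit from a stochastic frontier: ln y = x'b + v - u."""
    frontier = sum(b * x for b, x in zip(beta, log_inputs))
    v = random.gauss(0.0, sigma_v)        # symmetric measurement noise
    u = abs(random.gauss(0.0, sigma_u))   # half-normal inefficiency, u >= 0
    return frontier + v - u, math.exp(-u)

log_y, te = simulate_unit([1.0, 2.0], beta=[0.5, 0.4])
print(0.0 < te <= 1.0)  # True: efficiency scores always lie in (0, 1]
```

Estimation then runs in the other direction: given observed (y, x) pairs, maximum likelihood separates v from u, and units are ranked by their estimated TE, as the airports are in the paper.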

37 citations


Cites background from "Intermediate microeconomics : A mod..."

  • ...The specification of the cost function follows microeconomic theory (Varian, 1987 )....

    [...]

Proceedings ArticleDOI
Chen Zhu, Hengshu Zhu, Hui Xiong, Pengliang Ding, Fang Xie
13 Aug 2016
TL;DR: The authors propose a new research paradigm for recruitment market analysis, leveraging unsupervised learning techniques to automatically discover recruitment market trends from large-scale recruitment data.
Abstract: Recruitment market analysis provides valuable understanding of industry-specific economic growth and plays an important role for both employers and job seekers. With the rapid development of online recruitment services, massive recruitment data have been accumulated and enable a new paradigm for recruitment market analysis. However, traditional methods for recruitment market analysis largely rely on the knowledge of domain experts and classic statistical models, which are usually too general to model large-scale dynamic recruitment data and have difficulty capturing fine-grained market trends. To this end, in this paper, we propose a new research paradigm for recruitment market analysis by leveraging unsupervised learning techniques for automatically discovering recruitment market trends based on large-scale recruitment data. Specifically, we develop a novel sequential latent variable model, named MTLVM, which is designed for capturing the sequential dependencies of corporate recruitment states and is able to automatically learn the latent recruitment topics within a Bayesian generative framework. In particular, to capture the variability of recruitment topics over time, we design hierarchical Dirichlet processes for MTLVM. These processes allow the model to dynamically generate evolving recruitment topics. Finally, we implement a prototype system to empirically evaluate our approach based on real-world recruitment data in China. Indeed, by visualizing the results from MTLVM, we can successfully reveal many interesting findings, such as that the popularity of LBS-related jobs peaked in the second half of 2014 and declined in 2015.
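The hierarchical Dirichlet processes mentioned in the abstract build on the ordinary Dirichlet process, whose topic weights are commonly generated by stick-breaking: repeatedly break off a Beta-distributed fraction of the remaining probability mass. A toy truncated sketch of that building block (MTLVM itself is far more elaborate; `alpha` and the truncation level are illustrative):

```python
import random

random.seed(1)

def stick_breaking(alpha, n_topics):
    """Truncated stick-breaking construction of Dirichlet-process weights."""
    weights, remaining = [], 1.0
    for _ in range(n_topics):
        b = random.betavariate(1.0, alpha)  # break off a fraction of the stick
        weights.append(remaining * b)
        remaining *= 1.0 - b                # shrink what is left
    return weights

w = stick_breaking(alpha=2.0, n_topics=20)
print(0.0 < sum(w) < 1.0)  # True: truncated weights sum to just under 1
```

Smaller `alpha` concentrates mass on a few topics; larger `alpha` spreads it out, which is what lets the number of effective recruitment topics grow with the data rather than being fixed in advance.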

37 citations

Journal ArticleDOI
TL;DR: In this paper, a methodology is presented for the optimal design of incentive schemes based on minimising deadweight loss under different policy goals and policy restrictions; it is illustrated by designing optimal combinations of taxes and subsidies in Spain for three types of appliances: dishwashers, refrigerators and washing machines.
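For a per-unit tax with linear demand Qd = a − b·p and supply Qs = c + d·p, the deadweight loss is the familiar triangle ½ · t · (Q₀ − Qt), where Q₀ and Qt are the quantities traded without and with the tax. A back-of-the-envelope sketch (illustrative only, not the paper's methodology; all curve parameters are made up):

```python
def deadweight_loss(a, b, c, d, t):
    """DWL triangle of a per-unit tax t with linear demand/supply."""
    q_free = a - b * (a - c) / (b + d)     # quantity at the no-tax equilibrium
    p_seller = (a - c - b * t) / (b + d)   # seller's price under the tax wedge
    q_tax = c + d * p_seller               # quantity traded with the tax
    return 0.5 * t * (q_free - q_tax)

# Demand Q = 100 - p, supply Q = 20 + p, tax of 10 per unit:
print(deadweight_loss(a=100, b=1, c=20, d=1, t=10))  # 25.0
```

Because DWL grows roughly with the square of the tax, minimising total deadweight loss across several appliance markets tends to favour many small wedges over one large one, which is the kind of trade-off the paper's optimisation captures.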

37 citations