
Showing papers by "DePaul University" published in 2006


Journal ArticleDOI
01 Sep 2006
TL;DR: Five software platforms for scientific agent-based models (ABMs) were reviewed by implementing example models in each, finding MASON and Repast usually fastest, Swarm fastest for simple models but slowest for complex ones, and NetLogo intermediate.
Abstract: Five software platforms for scientific agent-based models (ABMs) were reviewed by implementing example models in each. NetLogo is the highest-level platform, providing a simple yet powerful programming language, built-in graphical interfaces, and comprehensive documentation. It is designed primarily for ABMs of mobile individuals with local interactions in a grid space, but not necessarily clumsy for others. NetLogo is highly recommended, even for prototyping complex models. MASON, Repast, and Swarm are "framework and library" platforms, providing a conceptual framework for organizing and designing ABMs and corresponding software libraries. MASON is least mature and designed with execution speed a high priority. The Objective-C version of Swarm is the most mature library platform and is stable and well organized. Objective-C seems more natural than Java for ABMs but weak error-handling and the lack of developer tools are drawbacks. Java Swarm allows Swarm's Objective-C libraries to be called from Java; it does not seem to combine the advantages of the two languages well. Repast provides Swarm-like functions in a Java library and is a good choice for many, but parts of its organization and design could be improved. A rough comparison of execution speed found MASON and Repast usually fastest (MASON 1-35% faster than Repast), Swarm (including Objective-C) fastest for simple models but slowest for complex ones, and NetLogo intermediate. Recommendations include completing the documentation (for all platforms except NetLogo), strengthening conceptual frameworks, providing better tools for statistical output and automating simulation experiments, simplifying common tasks, and researching technologies for understanding how simulation results arise.
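
For orientation, the following minimal Python sketch (not drawn from the paper or from any of the reviewed platforms) illustrates the kind of model these toolkits target: mobile agents on a grid with purely local interactions. The grid size, agent count, and energy rule are invented; platforms like NetLogo, Repast, MASON, and Swarm supply the scheduling, space, and observation tools that this toy loop hand-codes.

```python
import random

GRID = 20          # the world is a GRID x GRID torus (edges wrap around)
N_AGENTS = 50
STEPS = 100

class Agent:
    def __init__(self):
        self.x = random.randrange(GRID)
        self.y = random.randrange(GRID)
        self.energy = 10

    def step(self, occupancy):
        # Move one cell in a random direction (von Neumann neighborhood).
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        self.x = (self.x + dx) % GRID
        self.y = (self.y + dy) % GRID
        # Local interaction: gain energy if the destination cell was empty
        # at the start of the tick, lose energy otherwise.
        self.energy += 1 if occupancy.get((self.x, self.y), 0) == 0 else -1

agents = [Agent() for _ in range(N_AGENTS)]
for t in range(STEPS):
    occupancy = {}
    for a in agents:
        occupancy[(a.x, a.y)] = occupancy.get((a.x, a.y), 0) + 1
    for a in agents:
        a.step(occupancy)

print("mean energy after", STEPS, "steps:",
      sum(a.energy for a in agents) / N_AGENTS)
```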

741 citations


Book ChapterDOI
01 Jan 2006
TL;DR: A few of the many successful applications of LSA to text-processing problems are described and a number of current research directions are presented, which show how it matches human behavior.
Abstract: Latent semantic analysis (LSA) is a technique for comparing texts using a vector-based representation that is learned from a corpus. This article begins with a description of the history of LSA and its basic functionality. LSA enjoys both theoretical support and empirical results that show how it matches human behavior. A number of the experiments that compare LSA with humans are described here. The article also describes a few of the many successful applications of LSA to text-processing problems and finishes by presenting a number of current research directions.
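
As a rough illustration of the LSA pipeline the article describes (term-document matrix, dimensionality reduction, vector comparison), here is a minimal sketch using scikit-learn. The toy corpus, the choice of two latent dimensions, and the library are assumptions for the example, not part of the article.

```python
# Minimal LSA sketch: build a term-document matrix, apply truncated SVD to get
# a low-rank "semantic" space, and compare documents by cosine similarity there.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the patient was given a dose of aspirin",
    "aspirin reduces fever and mild pain",
    "the spacecraft entered orbit around mars",
    "nasa launched the mars orbiter last year",
]

tfidf = TfidfVectorizer().fit_transform(corpus)      # term-document matrix
lsa = TruncatedSVD(n_components=2, random_state=0)   # keep 2 latent dimensions
doc_vectors = lsa.fit_transform(tfidf)

# Same-topic pairs tend to score higher than cross-topic pairs in LSA space.
print(cosine_similarity(doc_vectors[0:1], doc_vectors[1:2]))  # aspirin vs aspirin
print(cosine_similarity(doc_vectors[0:1], doc_vectors[2:3]))  # aspirin vs mars
```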

519 citations


Journal ArticleDOI
TL;DR: It is suggested that MTF transgender youth of color have many unmet needs and are at extreme risk of acquiring HIV, and future research is needed to better understand this adolescent subgroup and to develop targeted broad-based interventions that reduce risky behaviors.

511 citations


Journal ArticleDOI
TL;DR: There is substantial evidence for the mediating role of family relationship in the relation between stressors and child and adolescent psychological symptoms and future studies should integrate moderator and mediator research by testing for specific mediators in relation to particular moderating contexts, to better understand the complex ways in which stressful life experiences affect the well-being of children and adolescents.

490 citations


Book
25 Sep 2006
TL;DR: In this book, the authors present a theory of social justice grounded in a sufficiency of well-being and the avoidance of systematic disadvantage, and apply it to public health, medical care and insurance markets, and priority setting.
Abstract:
CHAPTER 1: THE JOB OF JUSTICE
1.1 Which Inequalities Matter Most; 1.2 Justice and Well-Being; 1.3 Justice, Sufficiency, and Systematic Disadvantage; 1.4 Foundations of Public Health; 1.5 Medical Care and Insurance Markets; 1.6 Setting Priorities; 1.7 Justice, Democracy, and Social Values
CHAPTER 2
2.1 Introduction; 2.2 Essential Dimensions of Well-Being; 2.3 A Moderate Essentialism; 2.4 Well-Being and Nonideal Theory; 2.5 The Main Alternatives; 2.6 Capabilities, Functioning, and Well-Being; 2.7 Relativism, Moral Imperialism, and Political Neutrality; 2.8 Justice and Basic Human Rights
CHAPTER 3: JUSTICE, SUFFICIENCY, AND SYSTEMATIC DISADVANTAGE
3.1 Varieties of Egalitarianism; 3.2 The Leveling-Down Objection; 3.3 The Strict Egalitarian's Pluralist Defense; 3.4 Is the Appeal to Equality Unavoidable; 3.5 A Sufficiency of Well-Being Approach; 3.6 Toward a Unified Theory of Social Determinants and Well-Being; 3.7 Densely Woven, Systematic Patterns of Disadvantage; 3.8 Conclusion
CHAPTER 4: SOCIAL JUSTICE AND PUBLIC HEALTH
4.1 Introduction; 4.2 Moral Justification for Public Health; 4.3 Public Health, the Negative Point of Justice, and Systematic Disadvantage; 4.4 Public Health, the Positive Point of Justice, and Health Inequalities
CHAPTER 5: MEDICAL CARE AND INSURANCE MARKETS
5.1 The Moral Foundations of Markets; 5.2 Sources of Market Failure; 5.3 Responses to Market Failure: Some Examples from the U.S. Experience; 5.4 Making Matters Worse: Employer-Based Insurance in the United States; 5.5 Private Markets and Public Safety Nets
CHAPTER 6: SETTING PRIORITIES
6.1 Introduction; 6.2 Mimicking Markets; 6.3 Cost-Effectiveness and Cost-Utility Alternatives; 6.4 Systematic Disadvantage; 6.5 The Relevance of Childhood, Old Age, and Human Development; 6.6 Beyond Separate Spheres of Justice; 6.7 Trade-Offs within Health; 6.8 Conclusion
CHAPTER 7: JUSTICE, DEMOCRACY, AND SOCIAL VALUES
7.1 Lost on the Oregon Trail; 7.2 From Substantive Justice; 7.3 Mimicking Majorities: Moralizing Preferences and Empiricizing Equity; 7.4 Theory, After All?; 7.5 DALYs, Deliberation, and Empirical Ethics
CHAPTER 8: FACTS AND THEORY
References; Author Index; Subject Index

412 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine the valuation effects of the interaction between differences of opinion and short sale constraints and find robust evidence of significant overvaluation for stocks that are subject to both conditions simultaneously.
Abstract: Miller (1977) hypothesizes that dispersion of investor opinion in the presence of short-sale constraints leads to stock price overvaluation. However, previous empirical tests of Miller's hypothesis examine the valuation effects of only one of these two necessary conditions. We examine the valuation effects of the interaction between differences of opinion and short sale constraints. We find robust evidence of significant overvaluation for stocks that are subject to both conditions simultaneously. Stocks are not systematically overvalued when either one of these two conditions is not met.

366 citations


Journal ArticleDOI
TL;DR: While the topics of business ethics and social responsibility education have received much attention in scholarly and pedagogical literature as discussed by the authors, the authors of this paper focus on business ethics education.
Abstract: While the topics of business ethics and social responsibility education have received much attention in scholarly and pedagogical literature (although less in the pedagogical literature), the autho...

354 citations


Journal ArticleDOI
TL;DR: This study proposes a unified conceptual model for wireless technology adoption and postulates that, under the mobile context, user intention to perform general tasks that do not involve transactions and gaming is influenced by perceived usefulness and perceived ease of use.
Abstract: The technology acceptance model (TAM) is one of the most widely used models of information technology (IT) adoption. According to TAM, IT adoption is influenced by two perceptions: usefulness and ease of use. In this study, we extend TAM to the mobile commerce context. We categorize the tasks performed on wireless handheld devices into three categories: (1) general tasks that do not involve transactions and gaming, (2) gaming tasks, and (3) transactional tasks. We propose a unified conceptual model for wireless technology adoption. In this model, task type moderates the effects of four possible determinants: perceived usefulness, perceived ease of use, perceived playfulness, and perceived security. We postulate that, under the mobile context, user intention to perform general tasks that do not involve transactions and gaming is influenced by perceived usefulness and perceived ease of use, user intention to play games is affected by perceived playfulness, and user intention to transact is influenced by perceived usefulness and perceived security. A survey was conducted to collect data about user perception of 12 tasks that could be performed on wireless handheld devices and user intention to use wireless technology. Multiple regression analyses supported the proposed research model.
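
A hedged sketch of the kind of analysis the abstract describes: regressing usage intention on the four perception constructs for one task type. The data below are simulated and the coefficients are invented; only the construct names come from the abstract.

```python
# Illustrative only: simulated survey data and a per-task-type regression of
# usage intention on the four perception constructs named in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
usefulness  = rng.normal(size=n)
ease_of_use = rng.normal(size=n)
playfulness = rng.normal(size=n)
security    = rng.normal(size=n)

# Simulated "general task" intention: driven by usefulness and ease of use.
intention = 0.6 * usefulness + 0.4 * ease_of_use + rng.normal(scale=0.5, size=n)

X = sm.add_constant(np.column_stack([usefulness, ease_of_use, playfulness, security]))
model = sm.OLS(intention, X).fit()
print(model.params)   # in this setup, only the first two betas should be sizable
```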

320 citations


Journal ArticleDOI
TL;DR: It is proved that the NP-hard distinguishing substring selection problem has no polynomial-time approximation scheme with running time f(1/ε)·n^{o(1/ε)} for any function f unless an unlikely collapse occurs in parameterized complexity theory.

313 citations


Journal ArticleDOI
TL;DR: The invasion of European earthworms into previously earthworm-free temperate and boreal forests of North America dominated by Acer, Quercus, Betula, Pinus and Populus has provided ample opportunity to observe how earthworms engineer ecosystems.
Abstract: Earthworms are keystone detritivores that can influence primary producers by changing seedbed conditions, soil characteristics, flow of water, nutrients and carbon, and plant-herbivore interactions. The invasion of European earthworms into previously earthworm-free temperate and boreal forests of North America dominated by Acer, Quercus, Betula, Pinus and Populus has provided ample opportunity to observe how earthworms engineer ecosystems.

310 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate the effects of expansionary and contractionary policy separately on the loan behavior of low-capital and high-capital banks, and between pre-Basel/FDICIA and post-Basel/FDICIA periods.
Abstract: Utilizing a bank-lending channel framework, we investigate the effects of expansionary and contractionary policy separately on the loan behavior of low-capital and high-capital banks, and between pre-Basel/FDICIA and post-Basel/FDICIA periods. Our results show that low-capital banks are adversely affected by contractionary policy. Expansionary policy, however, is not effective in stimulating the loan growth of low-capital banks. These results are consistent with lending channel predictions, but only hold in the post-Basel/FDICIA period when the capital constraint is stringent relative to the pre-Basel/FDICIA period. These asymmetric policy results have implications for the interaction of monetary and capital regulatory policies.

Journal ArticleDOI
TL;DR: In this article, the authors compare the effect of traditional manufacturing-oriented supply chain strategies on the operational and financial performance of firms in both service and manufacturing sectors, and highlight similarities and differences between the two sectors.
Abstract: SUMMARY As the economy evolves from manufacturing to services, it is important to understand whether the lessons learned in the manufacturing sector can be directly extrapolated to service supply chains. Unfortunately, the majority of existing supply chain research focuses exclusively on the manufacturing sector. To address this deficiency, this article compares the effect of traditional manufacturing-oriented supply chain strategies on the operational and financial performance of firms in both service and manufacturing sectors. The results highlight similarities and differences between the two sectors — demonstrating that effective supply chain strategies in one sector may not be appropriate in the other sector. This suggests that practicing managers should identify appropriate benchmarks and competitive priorities before pursuing specific supply chain strategies. The insights provided by this research should help guide companies toward strategies that may positively affect their specific organization's operational and financial performance.

Journal ArticleDOI
TL;DR: In a recent experiment, 150 individuals in Illinois were randomly assigned to either an Oxford House or usual-care condition after substance abuse treatment discharge, and at the 24-month follow-up, those in the Oxford House condition had significantly lower substance use, significantly higher monthly income, and significantly lower incarceration rates.
Abstract: Oxford Houses are democratic, mutual help-oriented recovery homes for individuals with substance abuse histories. There are more than 1200 of these houses in the United States, and each home is operated independently by its residents, without help from professional staff. In a recent experiment, 150 individuals in Illinois were randomly assigned to either an Oxford House or usual-care condition (i.e., outpatient treatment or self-help groups) after substance abuse treatment discharge. At the 24-month follow-up, those in the Oxford House condition compared with the usual-care condition had significantly lower substance use, significantly higher monthly income, and significantly lower incarceration rates.

Proceedings ArticleDOI
20 Aug 2006
TL;DR: This paper proposes and studies different attributes derived from user profiles for their utility in attack detection and shows that a machine learning classification approach that includes attributesderived from attack models is more successful than more generalized detection algorithms previously studied.
Abstract: Collaborative recommender systems are highly vulnerable to attack. Attackers can use automated means to inject a large number of biased profiles into such a system, resulting in recommendations that favor or disfavor given items. Since collaborative recommender systems must be open to user input, it is difficult to design a system that cannot be so attacked. Researchers studying robust recommendation have therefore begun to identify types of attacks and study mechanisms for recognizing and defeating them. In this paper, we propose and study different attributes derived from user profiles for their utility in attack detection. We show that a machine learning classification approach that includes attributes derived from attack models is more successful than more generalized detection algorithms previously studied.
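
As a rough, hedged illustration of the general idea (profile-level attributes feeding a supervised detector), the sketch below derives two generic attributes from synthetic rating profiles and trains a random-forest classifier. The attribute set, attack model, and classifier are stand-ins, not the authors' experimental setup.

```python
# Illustrative sketch: flag injected attack profiles in a rating matrix using
# simple per-profile attributes and a supervised classifier. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_users, n_items = 200, 50
ratings = rng.integers(1, 6, size=(n_users, n_items)).astype(float)

# Make the last 20 profiles a crude "push" attack: max rating on a target item,
# mid ratings elsewhere (one common attack model).
labels = np.zeros(n_users, dtype=int)
labels[-20:] = 1
ratings[-20:, :] = 3.0
ratings[-20:, 0] = 5.0

item_means = ratings.mean(axis=0)

def profile_attributes(row):
    # Two generic attributes: mean absolute deviation from item averages
    # (attack profiles tend to disagree with consensus) and rating variance.
    return [np.mean(np.abs(row - item_means)), np.var(row)]

X = np.array([profile_attributes(r) for r in ratings])
clf = RandomForestClassifier(random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```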

Journal ArticleDOI
TL;DR: In this article, the authors investigate the effect of outsourcing on firm level performance metrics, providing evidence about outsourcing influences on a firm's cost efficiency, productivity, and profitability, using publicly available accounting data.
Abstract: Purpose – This research aims to empirically investigate the effect of outsourcing on firm level performance metrics, providing evidence about outsourcing influences on a firm's cost-efficiency, productivity and profitability. Design/methodology/approach – This study is concerned with empirically examining the impact of outsourcing on a firm's performance. The results are based on a sample of 51 publicly traded firms that outsourced parts of their operations between 1990 and 2002. Publicly available accounting data are used to test for changes in operating performances that result from outsourcing decisions. Operating performances are examined over a four-quarter period after the outsourcing announcement. Findings – This research provides evidence that outsourcing can improve a firm's cost-efficiency. While existing literature on outsourcing has also sought to draw anecdotal and conceptual evidence that highly visible companies have improved their productivity and profitability as well through outsourcing, t...

Journal ArticleDOI
TL;DR: A comprehensive classification of security policy conflicts that might potentially exist in a single security device or between different network devices in enterprise networks is presented and the high probability of creating such conflicts even by expert system administrators and network practitioners is shown.
Abstract: Network security policies are essential elements in Internet security devices that provide traffic filtering, integrity, confidentiality, and authentication. Network security perimeter devices such as firewalls, IPSec, and IDS/IPS devices operate based on locally configured policies. However, configuring network security policies remains a complex and error-prone task due to rule dependency semantics and the interaction between policies in the network. This complexity is likely to increase as the network size increases. A successful deployment of a network security system requires global analysis of policy configurations of all network security devices in order to avoid policy conflicts and inconsistency. Policy conflicts may cause serious security breaches and network vulnerability such as blocking legitimate traffic, permitting unwanted traffic, and insecure data transmission. This article presents a comprehensive classification of security policy conflicts that might potentially exist in a single security device (intrapolicy conflicts) or between different network devices (interpolicy conflicts) in enterprise networks. We also show the high probability of creating such conflicts even by expert system administrators and network practitioners.
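
As a hedged illustration of one intrapolicy conflict class such a taxonomy covers, the sketch below detects simple rule shadowing between filtering rules. The rule format and matching logic are invented for the example and are far simpler than real device policies.

```python
# Toy example of rule shadowing: an earlier rule matches a superset of a later
# rule's traffic but takes the opposite action, so the later rule never fires.
from ipaddress import ip_network

rules = [
    # (order, source network, destination port, action)
    (1, ip_network("10.0.0.0/8"),     80, "deny"),
    (2, ip_network("10.1.2.0/24"),    80, "accept"),   # shadowed by rule 1
    (3, ip_network("192.168.0.0/16"), 22, "accept"),
]

def find_shadowing(rules):
    conflicts = []
    for i, (oi, src_i, port_i, act_i) in enumerate(rules):
        for oj, src_j, port_j, act_j in rules[i + 1:]:
            superset = src_j.subnet_of(src_i) and port_j == port_i
            if superset and act_i != act_j:
                conflicts.append((oj, oi))   # later rule shadowed by earlier rule
    return conflicts

for later, earlier in find_shadowing(rules):
    print(f"rule {later} is shadowed by rule {earlier}")
```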

Posted Content
TL;DR: This paper presented a dynamic equilibrium model of bond markets in which two groups of agents hold heterogeneous expectations about future economic conditions, which cause agents to take speculative positions against each other and therefore generate endogenous relative wealth fluctuation.
Abstract: This paper presents a dynamic equilibrium model of bond markets in which two groups of agents hold heterogeneous expectations about future economic conditions. The heterogeneous expectations cause agents to take speculative positions against each other and therefore generate endogenous relative wealth fluctuation. The relative wealth fluctuation amplifies asset price volatility and contributes to the time variation in bond premia. Our model shows that a modest amount of heterogeneous expectation can help explain several puzzling phenomena, including the "excessive volatility" of bond yields, the failure of the expectations hypothesis, and the ability of a tent-shaped linear combination of forward rates to predict bond returns.

Proceedings ArticleDOI
11 Sep 2006
TL;DR: An information retrieval based approach for automating the detection and classification of non-functional requirements (NFRs) and then evaluates its effectiveness in an experiment based on fifteen requirements specifications developed as term projects by MS students at DePaul University.
Abstract: This paper introduces an information retrieval based approach for automating the detection and classification of non-functional requirements (NFRs). Early detection of NFRs is useful because it enables system level constraints to be considered and incorporated into early architectural designs as opposed to being refactored in at a later time. Candidate NFRs can be detected in both structured and unstructured documents, including requirements specifications that contain scattered and non-categorized NFRs, and freeform documents such as meeting minutes, interview notes, and memos containing stakeholder comments documenting their NFR related needs. This paper describes the classification algorithm and then evaluates its effectiveness in an experiment based on fifteen requirements specifications developed as term projects by MS students at DePaul University. An additional case study is also described in which the approach is used to classify NFRs from a large free form requirements document obtained from Siemens Logistics and Automotive Organization.
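
To give the flavor of this kind of approach (it is not the authors' algorithm, term weights, or categories), the toy classifier below scores requirement sentences against small, hand-picked indicator-term sets for a few NFR types.

```python
# Illustrative sketch: classify a requirement sentence as the NFR category whose
# indicator terms it overlaps most, or leave it unclassified below a threshold.
INDICATORS = {
    "security":    {"authenticate", "authorize", "encrypt", "password", "access"},
    "performance": {"response", "seconds", "throughput", "latency", "load"},
    "usability":   {"easy", "intuitive", "learn", "user-friendly", "help"},
}

def classify_nfr(sentence, threshold=1):
    words = set(sentence.lower().replace(",", " ").replace(".", " ").split())
    scores = {cat: len(words & terms) for cat, terms in INDICATORS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else "functional / unclassified"

requirements = [
    "The system shall encrypt all stored passwords.",
    "Search results shall be returned within two seconds under peak load.",
    "The administrator shall be able to add new products to the catalog.",
]
for r in requirements:
    print(classify_nfr(r), "<-", r)
```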

Journal ArticleDOI
TL;DR: In this paper, the authors point out gaps in the current literature and examine the link between outsourcing implementation and firms' performance metrics by analysing hard data, and point out three main gaps: lack of objective metrics for outsourcing results evaluation, lack of research on the relationship between outsourcing implementations and companies' value, and lack of work on the outsourcing contract itself.
Abstract: Purpose – Outsourcing emerged as a popular operational strategy in the 1990s, and most of the current literature was established at the same time. However, the outcomes of outsourcing remain unclear. The purpose of this article is to point out gaps in the current literature and examine the link between outsourcing implementation and firms' performance metrics by analysing hard data. Design/methodology/approach – In this research, methodologies in the outsourcing literature (from 1990 to 2003) are grouped into five categories: case study, survey, conceptual framework, mathematical modeling, and financial data analyses; research scope is classified into three areas: outsourcing determinants, outsourcing process, and outsourcing results. Findings – This article identifies three main gaps in the current literature: lack of objective metrics for evaluating outsourcing results, lack of research on the relationship between outsourcing implementation and firms' value, and lack of research on the outsourcing contract itself. Rese...

Journal ArticleDOI
TL;DR: A comparative review of research on how writers learn genres suggests future directions for the interdisciplinary study of genre learning.

Journal ArticleDOI
TL;DR: Positive and negative changes in health risks are associated with same-direction changes in presenteeism.
Abstract: Objective:This prospective study investigates whether changes in health risks are associated with changes in presenteeism (on-the-job productivity loss).Method:A total of 7026 employees of a national financial services company responded to a health risk appraisal (HRA), which included a modi

Journal ArticleDOI
TL;DR: In this article, the authors examined the perceptions children had of their relationships with parents, peers, and teachers; their bonds with schools and neighborhoods; and their social, behavioral, and emotional adjustment.
Abstract: In this investigation, the authors examined the perceptions children had of their relationships with parents, peers, and teachers; their bonds with schools and neighborhoods; and their social, behavioral, and emotional adjustment. Participants were 96 students in the fifth and sixth grades who were receiving special education services for learning disabilities (n = 40), emotional and behavioral disorders (n = 18), mild mental retardation (n = 18), and other health impairments (n = 20). Findings indicated that both positive and negative aspects of these children's relationships and bonds were associated with social, behavioral, and emotional adjustment. Furthermore, different aspects of these relationships and bonds were differentially associated with adjustment variables. These findings suggest that it is important to consider how social relationships and social contexts relate to the adjustment and functioning of children with high-incidence disabilities.

Journal ArticleDOI
TL;DR: This work presents a matching algorithm that establishes many-to-many correspondences between the nodes of two noisy, vertex-labeled weighted graphs using a novel embedding technique based on a spherical encoding of graph structure.
Abstract: Object recognition can be formulated as matching image features to model features. When recognition is exemplar-based, feature correspondence is one-to-one. However, segmentation errors, articulation, scale difference, and within-class deformation can yield image and model features which don't match one-to-one but rather many-to-many. Adopting a graph-based representation of a set of features, we present a matching algorithm that establishes many-to-many correspondences between the nodes of two noisy, vertex-labeled weighted graphs. Our approach reduces the problem of many-to-many matching of weighted graphs to that of many-to-many matching of weighted point sets in a normed vector space. This is accomplished by embedding the initial weighted graphs into a normed vector space with low distortion using a novel embedding technique based on a spherical encoding of graph structure. Many-to-many vector correspondences established by the Earth Mover's Distance framework are mapped back into many-to-many correspondences between graph nodes. Empirical evaluation of the algorithm on an extensive set of recognition trials, including a comparison with two competing graph matching approaches, demonstrates both the robustness and efficacy of the overall approach.
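
A hedged sketch of the overall pipeline (embed each graph's nodes as weighted points, then compute many-to-many flows with the Earth Mover's Distance). The naive degree/average-distance embedding below stands in for the paper's spherical encoding, and the toy graphs and uniform node weights are invented for illustration.

```python
# Illustrative sketch: nodes of two graphs are mapped to points in a common 2-D
# space, and the EMD transportation problem between the weighted point sets
# yields many-to-many flows that play the role of node correspondences.
import numpy as np
from scipy.optimize import linprog
from scipy.sparse.csgraph import shortest_path

def embed(adjacency):
    # Naive structural embedding (NOT the paper's spherical encoding):
    # each node becomes (degree, mean shortest-path distance to other nodes).
    D = shortest_path(adjacency, directed=False)
    return np.column_stack([adjacency.sum(axis=1), D.mean(axis=1)])

def emd_flows(P, Q, w, v):
    # Minimize sum_ij f_ij * ||P_i - Q_j|| with row sums w and column sums v.
    n, m = len(P), len(Q)
    cost = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2).ravel()
    A_eq, b_eq = [], []
    for i in range(n):                          # supply constraints
        row = np.zeros(n * m); row[i * m:(i + 1) * m] = 1
        A_eq.append(row); b_eq.append(w[i])
    for j in range(m):                          # demand constraints
        col = np.zeros(n * m); col[j::m] = 1
        A_eq.append(col); b_eq.append(v[j])
    res = linprog(cost, A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=(0, None))
    return res.x.reshape(n, m)

# Two toy graphs: a 3-node path and a 4-node path, with uniform node weights.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
B = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
flows = emd_flows(embed(A), embed(B), np.full(3, 1 / 3), np.full(4, 1 / 4))
for i, j in zip(*np.nonzero(flows > 1e-9)):
    print(f"graph-A node {i} <-> graph-B node {j}, mass {flows[i, j]:.2f}")
```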

Journal ArticleDOI
TL;DR: In this article, a conceptual approach is taken to interpret and explain evolution in entrepreneurial thought, using the application of history to unify the extant and wide-ranging concepts underlying the field to detect a conceptual foundation.
Abstract: Purpose – To interpret and explain evolution in entrepreneurial thought, using the application of history to unify the extant and wide‐ranging concepts underlying the field to detect a conceptual foundation.Design/methodology/approach – A conceptual approach is taken, the paper undertaking a delineation of how past theory has brought about the field's current state and an identification of some conceptual areas for future advancement.Findings – The importance and impact of the entrepreneurship field is increasing in academic and practical settings. A historical view on the conceptual development of entrepreneurial thought provides a lens for scholars as well as practitioners to interpret and explain their own entrepreneurial activity or research and formulate new questions.Originality/value – The paper aids scholars and researchers to interpret and explain entrepreneurial activity.

Journal ArticleDOI
TL;DR: A recognizable pattern is identified that the authors propose to call Devastating Epileptic Encephalopathy in School-age Children (DESC); it begins with prolonged status epilepticus (SE) triggered by fever of unknown cause, and persists as intractable perisylvian epilepsy with severe cognitive deterioration.

Journal ArticleDOI
Alec Brownlow
01 Mar 2006-Geoforum
TL;DR: In this article, the authors examine how mechanisms of social control function to mediate human-environment relations and processes of environmental change in the city of Philadelphia using the Fairmount Park System of Philadelphia as a case study.

Book ChapterDOI
28 Aug 2006
TL;DR: This paper presents an O(1.2738^k + kn)-time polynomial-space parameterized algorithm for Vertex Cover, improving the previous O(1.286^k + kn)-time polynomial-space upper bound by Chen, Kanj, and Jia.
Abstract: This paper presents an O(1.2738^k + kn)-time polynomial-space parameterized algorithm for Vertex Cover improving the previous O(1.286^k + kn)-time polynomial-space upper bound by Chen, Kanj, and Jia. The algorithm also improves the O(1.2745^k·k^4 + kn)-time exponential-space upper bound for the problem by Chandran and Grandoni.
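
For orientation, the sketch below shows the classical bounded-search-tree idea for parameterized vertex cover (branch on either endpoint of an uncovered edge), which gives only the simple O(2^k·n) bound; the paper's 1.2738^k bound comes from much more involved branching rules and case analysis that are not reproduced here.

```python
# Illustrative O(2^k) search-tree sketch for Vertex Cover on a simple graph.
def vertex_cover_at_most_k(edges, k):
    """Return a vertex cover of size <= k, or None if none exists."""
    if not edges:
        return set()
    if k == 0:
        return None
    u, v = edges[0]
    # Any cover must contain u or v; branch on both choices.
    for chosen in (u, v):
        rest = [e for e in edges if chosen not in e]
        sub = vertex_cover_at_most_k(rest, k - 1)
        if sub is not None:
            return sub | {chosen}
    return None

# A 5-cycle needs 3 vertices to cover all of its edges.
cycle = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(vertex_cover_at_most_k(cycle, 2))   # None
print(vertex_cover_at_most_k(cycle, 3))   # a cover such as {0, 1, 3}
```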

Journal ArticleDOI
Woods Bowman
TL;DR: In this paper, the authors report on a theory-based experiment to determine whether there is an observable relationship between changes in charitable giving to an organization and changes in the proportion of r...
Abstract: This article reports on a theory-based experiment to determine whether there is an observable relationship between changes in charitable giving to an organization and changes in the proportion of r...

Proceedings Article
16 Jul 2006
TL;DR: This paper considers two recommendation algorithms, one based on k-means clustering and the other based on Probabilistic Latent Semantic Analysis (PLSA), and shows that, particularly, the PLSA-based approach can achieve comparable recommendation accuracy.
Abstract: The open nature of collaborative recommender systems allows attackers who inject biased profile data to have a significant impact on the recommendations produced. Standard memory-based collaborative filtering algorithms, such as k-nearest neighbor, have been shown to be quite vulnerable to such attacks. In this paper, we examine the robustness of model-based recommendation algorithms in the face of profile injection attacks. In particular, we consider two recommendation algorithms, one based on k-means clustering and the other based on Probabilistic Latent Semantic Analysis (PLSA). These algorithms aggregate similar users into user segments that are compared to the profile of an active user to generate recommendations. Traditionally, model-based algorithms have been used to alleviate the scalability problems associated with memory-based recommender systems. We show, empirically, that these algorithms also offer significant improvements in stability and robustness over the standard k-nearest neighbor approach when attacked. Furthermore, our results show that, particularly, the PLSA-based approach can achieve comparable recommendation accuracy.
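
As a hedged illustration of the segment-based idea (not the paper's datasets, attack models, or PLSA variant), the sketch below clusters synthetic rating profiles with k-means and predicts an active user's unrated items from the nearest segment's centroid.

```python
# Illustrative sketch of a segment-based recommender: cluster user rating
# profiles with k-means, then rank the active user's unrated items by the
# closest segment's centroid ratings. Ratings are synthetic; 0 = "not rated".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
ratings = rng.integers(0, 6, size=(100, 20)).astype(float)   # 100 users, 20 items

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(ratings)

def recommend(active_profile, top_n=3):
    # Assign the active user to a segment and rank their unrated items by the
    # segment centroid's rating for each item.
    segment = kmeans.predict(active_profile.reshape(1, -1))[0]
    centroid = kmeans.cluster_centers_[segment]
    unrated = np.where(active_profile == 0)[0]
    return unrated[np.argsort(centroid[unrated])[::-1][:top_n]]

active = ratings[0].copy()
active[:5] = 0                      # pretend the first five items are unrated
print("recommended items:", recommend(active))
```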

Journal ArticleDOI
TL;DR: It is shown that conservatives oppose affirmative action more for Blacks than for other groups, in this case women, and that the relationship between conservatism and affirmative action attitudes is mediated best by group-based stereotypes that offer deservingness information.
Abstract: Why do educated conservatives oppose affirmative action? Those in the “principled conservatism” camp say opposition is based on principled judgments of fairness about the policies. Others, however, argue that opposition is based on racism. The present article offers an alternative perspective that may reconcile these contradictory points of view. In 2 studies, the authors show 2 major findings: (a) that conservatives oppose affirmative action more for Blacks than for other groups, in this case women, and (b) that the relationship between conservatism and affirmative action attitudes is mediated best by group-based stereotypes that offer deservingness information and not by other potential mediators like old-fashioned racism or the perceived threat that affirmative action poses to oneself. The authors conclude that educated conservatives are indeed principled in their opposition to affirmative action, but those principles are group based not policy based.